You are using current AI as your baseline. There will come a point where writing code means zero bugs or vulnerabilities. Humans cannot do that. AI, whether we want it or not, will one day be able to. Idk if we're talking 10 years or 40, but it will happen.
In order to have no bugs at all, and for anything to produce perfect software, you need to define perfect business rules, and if managers could do that, they wouldn't have needed developers for decades.
If we ever get AI that can produce perfect code, you won't have access to it. Why give everyone something so powerful when you can use it to run circles around everyone instead?
LOL at that.
LLMs need to disappear before that happens.