“Revolution”? More like devolution.
Just look at some large-format advertisements at the local level: some print shops have started using AI slop, eliminating the need to hire an experienced illustrator.
Fortunately, some people are already fighting back against the devolution, committing themselves to the Butlerian Jihad.
I’ve still not heard a convincing argument explaining how these companies are going to make enough money to offset the billions they’ve spent on R&D and hardware.
It’s strange, really. If I were an investor, that would be the first question I’d ask, but I guess VCs are smarter than I am.
The same way they do in every other bubble.
- the bubble pops and most companies fail: mostly bankruptcies and massive layoffs, but also huge tax write-offs
- of the surviving companies, a couple strike the jackpot.
Most of that huge overall investment is lost, but everyone wants to be in on the one or two that succeed, and those specific investments could have huge returns
How do they succeed though?
I’m not seeing the market for LLMs in any meaningful roles, given that they are prone to saying things that aren’t true. Would you hire someone who does good work 90% of the time and, for the rest, tells you the work is done when it isn’t, or worse?
LLM vendors are starting to charge money. I’m sure it’s not even close to profitable, but it’s a start. Perhaps when the bubble pops and the market consolidates, fewer vendors with more paying customers each …
Using an LLM is a skill just like any other. If you just take what it gives you, you can’t expect good results. If you evaluate what it gives you and prompt it to improve, the results aren’t as bad.
I use an LLM for coding and am definitely a skeptic, but I do find it a useful tool and am really interested in seeing if I can make it work.
Initially I found some amount of success at lower levels, saving me some time:
- it could auto complete entire lines of code (and that’s trivial to evaluate and correct if necessary)
- it was pretty good at generating unit tests, since they tend to be simple and repetitive. In general, my corrections are about smarter coverage: tweaking the tests to cover more functionality with fewer tests
- it’s pretty good with utility scripts. For example, today I had a decision to make and wanted supporting data: in minutes it generated a script to call APIs in my SCM and generate some stats for 4,000 code repos … and it worked
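Not the generated script itself, but a minimal sketch of the shape such a repo-stats utility typically takes; the `/repos` endpoint, the paging scheme, and the metadata field names here are all assumptions, so adapt them to whatever REST API your SCM exposes.

```python
from collections import Counter

def summarize_repos(repos):
    """Aggregate simple stats over a list of repo-metadata dicts."""
    return {
        "total": len(repos),
        # Group by primary language; treat missing/None as "unknown".
        "by_language": Counter(r.get("language") or "unknown" for r in repos),
        "archived": sum(1 for r in repos if r.get("archived")),
    }

def fetch_all_repos(session, base_url):
    """Page through a hypothetical /repos endpoint until it returns an empty batch.

    `session` is expected to behave like a requests.Session with auth already set.
    """
    repos, page = [], 1
    while True:
        resp = session.get(f"{base_url}/repos",
                           params={"page": page, "per_page": 100})
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            return repos
        repos.extend(batch)
        page += 1
```

Splitting the (SCM-specific) fetching from the (pure) aggregation also makes the interesting half trivially testable without network access.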
Currently I’ve created rulesets and project context so
- it’s been quite successful at code reviews (it finds things I miss, and has resulted in my human reviewers finding fewer issues)
- I’m proud of one for identifying refactoring opportunities. It finds good spots and makes good suggestions, but so far I have to implement them myself: its code hasn’t been usable. I can also verify the improvement objectively via reduced cyclomatic complexity.
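On the "objectively verify" point: cyclomatic complexity is roughly 1 plus the number of branch points, so a refactor that flattens nested conditionals should show a measurable drop. This is not the author's actual tooling (real tools like radon or lizard are more precise), just a crude sketch of the idea using Python's `ast` module:

```python
import ast

def cyclomatic_complexity(source):
    """Rough cyclomatic complexity: 1 + number of branching constructs.

    Approximate: counts each if/for/while/except and each boolean
    expression once; real tools count per boolean operator, etc.
    """
    tree = ast.parse(source)
    branches = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)
    return 1 + sum(isinstance(node, branches) for node in ast.walk(tree))
```

For example, replacing an if/elif chain with a dict lookup drops the score from 4 to 1, giving an objective before/after number for a suggested refactor.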
Trying to find other scenarios where it can be successful, it’s clear that insufficient context is a limiting factor. The fun challenge is to see whether there are more successful scenarios if you can give it enough context. I’ve gone past rulesets and project context, to connecting relevant services and metadata about our product set and environment. They want a team to try vibe coding, and I’m still very skeptical, but my part of the effort is a real, solvable problem and a fun challenge whether they succeed or not
So far, there hasn’t been an AI revolution, any more than there was a Segway or an NFT revolution.
Right? Like, if all the hype were true, I’d expect a golden age of software at the drop of a hat.
Instead, all we still get are minor inconveniences and dark patterns that no one wants.
But they promised me that the Regurgitation Engine would solve all problems!
There has been a computer revolution for sure
There has been a mobile phone revolution, absolutely
There was even a social media revolution; it changed the way we interact
AI, though, has so far been just “the next it fad” in the 2–5 year cycle, like NFTs before it, like cryptocurrencies before that, and what was it before that? Web 3.0, and before that there was… trying to remember… cloud computing? Each of these fads had minor to no influence on how we did things, and with AI we’ve only just added the stupidity of writing documents with AI, which completely misses the point of why we write those documents to begin with
I was with you up to “cloud computing”. That bubble was a huge success that has really revolutionized how software is provided
- well-known winners include AWS, Google, and Microsoft, but there are many more depending on how you define cloud computing
- also some huge flops
AI has a lot of mindshare and has demonstrated contributions in several areas. For example, the AI slop you see on YouTube is making some people money. As a coder I do find it a sometimes-useful tool, and I can definitely see the near future where it’s a required skill (and no, if you just ask it to spit out slop, you’re not getting anything but slop). I don’t see how it’s going away. However, it doesn’t (yet?) live up to its hype, nor is there (yet?) a profitable business for providers.
Meanwhile, the crypto and NFT bubbles were pyramid schemes that only ever made money from themselves. Web 3.0 probably looks useful to its proponents, but it was only ever a niche that no one else cared about
The AI fad is a lot more like the Internet fad than the crypto fad. I think once the bubble pops (like the .com bubble 25 years ago) some use-cases will definitely remain. But yeah, we definitely are in a bubble.
AI is a lot closer to a revolution than to a bust. It’s already likely going to remain an established tool for software development and process automation.
It still remains to be seen whether a single person managing an army of agents can actually become a sustainable company. That would be an industrial-revolution-on-steroids type of change, and it’s honestly terrifying.
An equally or even more likely scenario is that we get most of the way there, but it only reduces the need for developer-type jobs by 20–50%. From there, lots of things could happen. The job market could stay somewhat stable: while companies hire fewer people, there are more, smaller companies hiring directly because the barrier to entry is massively reduced. Or the job market drastically shrinks and software becomes a less attractive discipline compared to other types of engineering or office work. Or an industry-wide COBOL-type situation develops, as those who survive the job losses retire, laid-off workers move on to other industries, and no junior positions exist.
I’m iffy on the social media thing. I would call it a type of stagnation; it’s not really improving anything. Honestly, the enshittification is kinda worse than stagnation. Smartphones seemed incredible when the iPhone and Android first came out.
Software engineers use AI. Software engineers make software. Software runs all the shit. Generative AI has definitely led to a revolution, regardless of how we feel about it. It’s not revolutionary like the internet was, nor like the steam press, nor like toast, but… maybe it’s revolutionary like… I don’t know… browsers in phones?
Not to mention AI is helping us understand how consciousness works better. Not because it actually resembles consciousness, but because it doesn’t while many people thought it would. This realization helps us develop our language and understanding better… now we distinguish between different kinds of intelligence more, and certainly understand better that you can have intelligence without consciousness. That’s a philosophical revolution.
Honestly, the technology is cool. The clout around it is ass. How the technology is being used is ass. What behaviors the technology is incentivizing is ass. But as a matter of fact, that it’s possible at all to exploit natural redundancy in language to the point of producing generative machines — that by itself is quite cool.
AI revolution? Not likely. More like an AI age of degeneracy.
Speedrunning the Idiocracy future while we offload cognition to a black box.
In the movie, the AI ends up being the “CEO” and controlling humans.
How can it usher in a thing that’s been going on for at LEAST the last 15 years?