When you throw more hardware at them, they are supposed to develop new abilities.
I don’t think you understand how it works at all.
Data is collected. Training is done on the data. Training can even be done on the output of an earlier model, i.e. distillation (DeepSeek). You now have a model. That model is a static software program. It requires 700 GB of RAM to run (DeepSeek). Throwing more hardware at the model does nothing but give you a quicker response.
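A toy sketch of that point, using a random matrix as a stand-in for a trained model’s frozen weights (pure illustration, not any real inference stack):

```python
# Toy illustration: a trained model is just frozen weights.
# More hardware parallelizes the same computation; the answer doesn't change.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((1024, 1024))  # stand-in for a model's fixed weights

def forward(x, workers=1):
    # Split the matmul across "workers" (more hardware):
    # each worker computes a slice of the output rows.
    chunks = np.array_split(W, workers, axis=0)
    return np.concatenate([chunk @ x for chunk in chunks])

x = rng.standard_normal(1024)
assert np.allclose(forward(x, workers=1), forward(x, workers=8))
# Same output either way; only wall-clock time changes.
```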
If everyone pays you to use your model, you have no reason to develop a new one. Like Skyrim.
I’m supposed to be able to take a model architecture from today, scale it up 100x and get an improvement.
You can make Crysis run at higher fps. You can add polygons (remember the ATI clown feet?). You can add details to textures. https://research.nvidia.com/publication/2016-06_infinite-resolution-textures
But really the “game” is the model. Throwing more hardware at the same model is like throwing more hardware at the same game.
Which part of diminishing returns not offering as much profit did you not understand?
Current models give MS an extra 30% revenue. If they spend billions on a new model, will customers pay even more? How much more would you pay for a marginally better AI?
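Back-of-envelope, with entirely made-up numbers, just to show the shape of the tradeoff:

```python
# Hypothetical payback math; every number here is an assumption.
train_cost = 5e9               # say the next model costs $5B to train
extra_revenue_per_year = 1e9   # say customers pay $1B/yr more for it
payback_years = train_cost / extra_revenue_per_year
print(payback_years)           # 5.0 years before the new model breaks even
```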
Current games have a limit. Current models have a limit. New games could scale until people don’t see a quality improvement. New models can scale until people don’t see a quality improvement.
Just as games hit diminishing returns on better graphics (they’re already photorealistic; few pay $2k for a GPU to render a few more hairs), AI hits a plateau where it gives good-enough answers that people will pay for the service.
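You can see that plateau in the scaling laws themselves. A rough sketch using a Chinchilla-style power law (the constants are the commonly cited fits from Hoffmann et al. 2022; the model and token counts are illustrative):

```python
# Chinchilla-style loss curve: returns shrink toward an irreducible floor E.
A, B, E = 406.4, 410.7, 1.69   # commonly cited fits (Hoffmann et al. 2022)
alpha, beta = 0.34, 0.28

def loss(params, tokens):
    return A / params**alpha + B / tokens**beta + E

print(loss(7e9, 1.4e12))    # ~2.04 for a 7B model on 1.4T tokens
print(loss(700e9, 140e12))  # ~1.77 at 100x params and data: better, but barely
```

Going 100x bigger moves the loss only a fraction of the remaining distance to the floor, which is exactly the “more hairs on a $2k GPU” problem.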
If people are paying you money and the next level of performance is not appreciated by the general consumer, why spend billions that will take longer to recoup?
And again data centers aren’t just used for AI.
If buying a new video card made me money, yes
But that rests on the supposition that not buying a video card makes you the same money. You’re forecasting free performance upgrades, so there’s no need to spend money now when you can wait and upgrade the hardware once the software improvements stop.
And that’s assuming it has anything to do with AI at all, rather than long-term macroeconomics: Trump destroying the economy, so MS is putting off spending while businesses slow down because of the tariff war.
More efficient hardware use should be amazing for AI since it allows you to scale even further.
If you can achieve scaling with software, you can delay current plans for expensive hardware. If a new driver came out that gave Nvidia 5090 performance to GTX 1080-equivalent hardware, would you still buy a new video card this year?
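Rough math on that, with assumed numbers: a one-time 5x software efficiency win buys you about four years of 50%-per-year demand growth before you need new hardware anyway.

```python
import math

# All assumptions: a one-time 5x software efficiency gain,
# and inference demand growing 50% per year.
efficiency_gain = 5.0
annual_demand_growth = 1.5
years_deferred = math.log(efficiency_gain) / math.log(annual_demand_growth)
print(f"{years_deferred:.1f} years of hardware spend deferred")  # ~4.0
```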
When all the telcos scaled back on building fiber in 2000, was that because they didn’t have a positive outlook for the Internet?
Or when video game companies went bankrupt in the 1980s, was it because video games were over as entertainment?
There’s a huge leap between not spending billions on new data centers (which are used for more than just AI) and claiming that’s the reason AI is over.
Cancelling new data centers because DeepSeek has shown a more efficient path isn’t proof that AI is dead, as the author claims.
Fiber buildouts were cancelled back in 2000 because DWDM made existing fiber more efficient. The Internet investment bubble popped. That didn’t mean the Internet was dead.
If it wasn’t for his claim that he was using AI instead of people, you’d have a point.
His success is making more money for himself, not creating a better product.
Aren’t layoffs his point?
That’s because you posted it to Lemmy. Like Twitter, Facebook is not getting rid of fact checkers across the board. Facebook is only getting rid of the fact checkers that Zuckerberg either doesn’t like or doesn’t care about.
I assure you, “Zuck is a pedo” would be immediately deleted from Facebook if it got any views.
“training a new model”
is equivalent to “making a new game” with better graphics.
I’ve already explained that analogy several times.
If people pay you for the existing model, you have no reason to immediately train a better one.