The narrative in AI infrastructure over the last two years has been dominated by the enormous and growing demand for compute capacity and its economic consequences: the buildout of data centers and the resulting shortages of key resources such as land, water, power, and copper.
But of all these bottlenecks, memory is by far the most significant. The demand for memory is now outpacing the demand for other drivers of compute capacity. The implications of this will ripple through not just the economics of data centers, but the cost of every single consumer and enterprise hardware device.
In this piece, we unpack the market action around memory prices, its ripple effects across the consumer and industrial electronics market, and the supply and demand curve that is emerging around AI. Critically, we explain why the amount of memory being purchased by AI companies like OpenAI seems to be more than what they need, and how the threat of on-device inference might actually be incentivizing an engineered memory shortage.
It’s not a shortage! The supply of RAM has been essentially flat forever. It’s dumbass AI companies slurping up all the supply. That’s why there is a RAM “shortage.”
I think that’s what we call shortage.
Better way to phrase it: it’s a shortage caused by a huge increase in demand, not by a reduction in supply
The goal is to desensitize the general population into accepting thin clients instead of actual computers so that they can rent their OS as a cloud subscription.
This gives companies more control over content you consume plus your behaviours. It also gives governments more granular control over their citizens which is in vogue considering democracy is out and barbarism and force are back in.
Enshittification of computers
Asus and Dell are already launching thin clients to save us.
The goal is to desensitize the general population into accepting thin clients
Shame on you, because I've been on the thin-client hype for 2 years now
My backup PC and media center both run on 2015 thin clients running Debian; they're really cheap, don't have any moving parts, draw 15 W at most, and are really space-efficient
Yeah, PXE boot has been a thing for decades, and with network speeds going from 10/100 to gigabit, and now potentially 5-10 Gbit, it's pretty viable for home environments. It's great for common libraries, and mine has an emulator plus a bunch of GoG etc. games which I've been tinkering with to make run nicely.
My preference though is still “thick” clients which use the network for boot and OS/storage, but still have their own CPU and RAM.
Other than storage and networking, the server side requirements aren’t huge. My plan is to make a portable environment which people can patch into and play classic games together.
(If anyone has experience with OpenSpy and getting it to work sans-Internet I’d love to pick your brain, as I really want to get BF2142 and other classics running fully without internet)
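For anyone who wants to try the netboot side of this at home, the usual minimal setup is dnsmasq in proxy-DHCP mode next to your existing router, roughly like this (the subnet, paths, and the pxelinux.0 loader here are placeholder choices, not anything from the setup described above):

```ini
# /etc/dnsmasq.d/pxe.conf -- proxy-DHCP: the existing router keeps handing out IPs,
# dnsmasq only answers the PXE side of the conversation
dhcp-range=192.168.1.0,proxy
enable-tftp
tftp-root=/srv/tftp
dhcp-boot=pxelinux.0
pxe-service=x86PC,"Network boot",pxelinux
```

From there the client pulls its bootloader over TFTP and can mount the OS and game libraries over NFS or iSCSI, which is what makes the "thick client with network storage" arrangement above workable.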
Define thin client please
Fujitsu Futro S920
Well, as I get older, my body is breaking down. If it wasn't for my shortage of memory, I would be even more miserable. So it must be intentional. Oh… wait, not that memory…
Whether it’s intentional or just fraudulent, it’s malicious either way. This whole datacenter “investment” situation is thoroughly fucked up
It could just be irrational exuberance, and the perverse incentive to keep it going by the people running those companies.
In the dotcom bubble I'm sure there were shortages, same as the housing bubble, which destroyed people's ability to buy a home.
It's really the Fed who's at fault; with cheap debt and QE, they now exist to create as much misallocated capital as possible. The entire financial system is fraudulent, to quote Burry in The Big Short.
If CXMT tries to fix the memory shortage and the US responds by threatening to sanction them, then the shortage is intentional.
Get ready to only own screens. And everything will be processed via the cloud in data centers, destroying your community and livelihood.
And of course you will be paying for every minute of it.
You won't own the content, and you'll have absolutely 0% privacy because all of your processing and data will be on somebody else's system.
AI will go from making cute GIFs to fully automated surveillance and ensuring nobody uses those systems for anything not approved by the regime.
Currently contracting in an "automated" manufacturing center in the US, and all of our SCADA traffic is on the global network, currently being routed through the UK because oops, and it's hilariously bad. Systems designed for millisecond polling taking 1-30 seconds to react.
SCADA systems are universally terrible. Clouding them does not resolve that. Fuck whoever decided to do this.
Can't have competition if you buy resources out from under them with made-up money from NVIDIA and Microsoft. I hope Iran truly developed hypersonic nuclear weapons.
Yes. This is the first step to doing with technology what they have done with housing, transport, media, and agriculture. The noose has nearly closed.
Us humans really are good at making shit worse, eh?
us humans are pretty good at not chopping annoying people's heads
> Will lower memory availability to consumers increase reliance on cloud-based storage and demand for data centers?
No, because you need more RAM to run a smart terminal than a standalone micro, because there’s no secondary memory available to rely on.
> Therefore, according to our best estimates, OpenAI likely needs less than 30% of the 10.8 million wafers it's planning to buy
OpenAI hasn't actually paid for any of that; it's sold on credit with a 6-year repayment period on hardware that will only last 2-3 years at most. That's why no memory manufacturers are increasing capacity, as they would if they thought there was any long-term increase in demand.
So a 3 years thirst phase and then you get memory and storage for cheap?
For those who don’t want to read several pages of unnecessary text telling you what you probably already know:
The math, while pretty involved, may tell a straightforward story (if you're interested in the details of our analysis, see the Appendix). OpenAI has contracted 900K memory wafers per month from Samsung and SK Hynix. Partner commentary seems to indicate that's a monthly number, so that represents 10.8 million wafers over 12 months. In terms of demand, a fully built-out 10GW Stargate cluster would require ~3 million GB200 Bianca Boards. Each board requires ~50% of a memory wafer in total, split between the HBM3e stacks embedded into its two B200 GPUs (~30%) and its 480 GB of LPDDR5X system memory (~20%). That puts total wafer demand for the entire cluster at ~3 million wafers.
Therefore, according to our best estimates, OpenAI likely needs less than 30% of the 10.8 million wafers it’s planning to buy
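The headline ratio is easy to sanity-check from the two stated totals (this takes the article's ~3M-wafer demand estimate at face value rather than re-deriving it from the per-board figures):

```python
# Figures quoted above: 900K wafers/month contracted, ~3M wafers of estimated demand
contracted = 900_000 * 12   # 10.8 million wafers over 12 months
demand = 3_000_000          # article's estimate for a full 10GW Stargate cluster

share = demand / contracted
print(f"{share:.0%} of contracted supply")  # ~28%, i.e. "less than 30%"
```

So the roughly 70% of contracted capacity beyond the cluster's own needs is the surplus the rest of the thread is arguing about.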
So this is just putting some numbers to what a lot of people already guessed. The AI companies are not just buying a ton of RAM to build out their data centers. They aren’t buying enough other components to even use that RAM. They’re buying it so that no one else can.
Fuck, RIP to an entire generation of kids and teens who won’t get to experience the joy of building out their own personal machines
And market supervision is nonexistent.
I’m just not connecting the dots. The amount of money they’re spending on this is astronomical, and they are burning through the cash they have at a rate they can’t sustain, while they’re fighting for their future against Google, Anthropic, plus xAI and Perplexity and others, and maybe foreign competition like Deepseek that the government can’t fully shield them from. While also competing with major data center companies themselves, who may want to build data centers for other non-AI purposes, too. And those competitors have deep, deep pockets.
If they don’t have a revenue model that actually keeps them afloat, then all their capital expenditures will end up going to benefit someone else.
In other words, the central thesis that they want to choke out competition from on-device models kinda ignores that they’re facing a much more immediate, much more pressing threat from their data center competition. It’s like trying to corner the market on snow shovels when a hurricane is bearing down.
Plus, one important thing worth noting is that OpenAI purchased an option to buy that much memory, enough to persuade the memory manufacturers to change their own investment decisions for the next 5 years. They're not necessarily going to actually buy that much, and in theory could sell that option to others. 40% of the market is enough to really move prices, but not enough to actually corner it and exclude others from buying memory. They'll just make it more expensive for themselves at the same time that they make it more expensive, but not impossible, for their true competitors also outfitting data centers.
It’s OpenAI in particular trying to screw everyone else. The wafers they contracted from Samsung and SK Hynix are something like 40% of those companies’ production. There isn’t enough production volume for the other AI companies to over order like that.
So this is the strategy of putting 4 houses on your properties in Monopoly and never upgrading them to hotels because that way there are no houses for your opponents to buy
Hopefully this accelerates their crash.
Not if they sell it on the surge.
They may have millions extra, and that just means they've now become a shitty version of Best Buy as they schlep it at surge pricing to make back the bank.
No, clearly it was accidental. 🙄 🙄 🙄
Haven't read the article just yet, but these companies have been known for price fixing and collusion for decades now
deleted by creator
Oligarchs trying to buy up all the digital real estate so they can be the digital landlords of computerland
I guess that’s the point when I’ll just go offline for good and learn a solid, old-fashioned trade.
Why sell you a computer, when they can rent you one for $$ per query you do
deleted by creator
deleted by creator
deleted by creator
It's a scary world out there, I absolutely understand the feeling of reading too far into a slop piece and feeling like my time was stolen from me. We have to support good human content, even if it's from some hedge fund think tank thing. Thank you for being honest, sorry for calling it bullshit. It was bullshit, but, like, I'm sorry.
No hard feelings. I screwed up, and you called me on it, fair and square.
Chad mod action nuking whatever this exchange was but leaving the bit at the end where the guy was humble and respectable
(We’ve actually been deleting our own comments, which is a real shame because the one who called me out was also being a chad tbh)
I guess you can just restore things, this is news to me, but pretty neat.