Do these companies that are buying up all the supply actually need all that RAM right now? Or are they buying it all up in anticipation of future need? If the latter, honestly this might be a case where some kind of regulation really ought to step in.
To the extent that most of this is going into AI and people are having their ChatGPT and Gemini requests throttled because of lack of capacity, they need it now.
AI is dramatically more compute- and memory-hungry than past computing models, so if that's what people are using, it's going to require a large build-out of computing capacity to support the requests that are being made right now.
Some of the spike is speculation, and the overshoot seems to be correcting itself now. But the deal that sparked it was a contract promising to buy future capacity, not a one-off block order of existing stock 'just in case' (and forward contracts aren't unusual: if you're a big buyer, you will almost certainly buy most things this way).
That's because you're not paying attention to the wider world. Every inference provider has been maxed out for years now. Nvidia shifted from mainly selling training servers to mainly selling inference servers about two years ago.
People complain about LLM hallucinations, but humans making up conspiracies to shore up their own ignorance is a much worse problem.
It is truly unbelievable. A 2x32GB DDR5 kit I paid $150 for last July is listed for $885 today. Even DDR4 is getting hit: a 2x16GB kit I paid $105 for a year ago is $230 today.
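For anyone who wants to put numbers on that, here's a quick sketch of the price multiples implied by those figures (the prices are just the ones quoted above, not market data):

```python
# Price multiples implied by the quoted figures (commenter's anecdotes, not market data)
kits = [
    ("DDR5 2x32GB", 150, 885),  # paid last July vs listed today
    ("DDR4 2x16GB", 105, 230),  # paid a year ago vs today
]

for name, old, new in kits:
    print(f"{name}: {new / old:.1f}x increase")
# DDR5 2x32GB: 5.9x increase
# DDR4 2x16GB: 2.2x increase
```

So the DDR5 kit is nearly 6x its price from a few months ago, and even the older DDR4 kit has more than doubled.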