The RAM Crisis Keeps Getting Worse
FULL TRANSCRIPT
Hi, welcome to another episode of ColdFusion. Imagine walking into an electronics store a few years from now. The shelves are stocked, the displays still look familiar, but the prices make you pause. A laptop costs
more than you expected. The release of a
gaming console is postponed again. A
phone upgrade quietly disappears from
the lineup. And as a consumer, you don't
immediately know why. There are no
dramatic headlines, no single event you
can point to, just a feeling that buying
technology has become more complicated
than it used to be. This future scenario is already beginning to unfold, and computer enthusiasts around the world were the first to feel it. But why is all of this happening? Well, behind the scenes, a critical component of modern computing is under pressure. It's called random access memory, or RAM.
It's the kind of hardware most people
never think about, but it's vital to
almost every electronic product we use.
For the past few years, trillions of
dollars have been flowing into AI data
centers across the world. And for those
data centers to run smoothly, one of the
most valuable components is memory or
RAM. In this new AI gold rush, RAM has become the shovel. It's the one component that every single serious player needs to keep digging. According to multiple industry reports, OpenAI has already secured an estimated 40% of global high-bandwidth memory. Right
now, the shift in global RAM production
is affecting computing. But if you're
not a gamer and you think you're safe,
think again. The price hikes will expand into a large array of electronic devices. The cause? Artificial intelligence, and specifically the absolutely insatiable demand for HBM, or high-bandwidth memory, used in AI training hardware. Now, to be clear, HBM is not the same stuff that's in your laptop. It's a far different and more expensive product, but all memory shares the same wafer fabrication process, and it's a zero-sum game. Every wafer allocated to an HBM stack for an Nvidia GPU is a wafer not making it into the LPDDR on your next laptop.
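That zero-sum trade-off can be sketched as a toy allocation model. This is a minimal sketch using made-up capacity and yield figures, not real industry data, just to show why every wafer diverted to HBM is consumer supply that vanishes:

```python
# Toy zero-sum model of a memory fab's monthly wafer output.
# All numbers are illustrative, not real industry figures.

TOTAL_WAFERS = 100_000           # hypothetical monthly wafer capacity
LAPTOP_CHIPS_PER_WAFER = 1_500   # hypothetical LPDDR dies per wafer

def consumer_output(hbm_share: float) -> int:
    """Consumer LPDDR dies produced when a fraction of wafers goes to HBM."""
    if not 0.0 <= hbm_share <= 1.0:
        raise ValueError("hbm_share must be between 0 and 1")
    consumer_wafers = TOTAL_WAFERS * (1.0 - hbm_share)
    return int(consumer_wafers * LAPTOP_CHIPS_PER_WAFER)

# Shifting wafers to HBM directly removes consumer supply:
for share in (0.0, 0.25, 0.5):
    print(f"{share:.0%} of wafers to HBM -> {consumer_output(share):,} consumer dies")
```

With these toy numbers, moving half the wafers to HBM cuts consumer output from 150 million dies to 75 million; the supply doesn't shrink gracefully, it is simply reallocated.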
Meanwhile, Nvidia's CEO Jensen Huang was in Seoul recently, reportedly chugging
soju and demolishing Korean fried
chicken with Samsung memory executives.
The purpose of his visit was very clear.
Secure the RAM, lock it in, and make
sure competitors don't get to it first.
But why meet Samsung executives specifically? The South Korean company
now makes more money selling RAM to data
centers than it does selling phones.
Other RAM manufacturers have decided to
stop selling to consumers altogether. AI data centers are just that much more profitable. That detail matters. It
shows where priorities are shifting. If
that shift continues, every consumer
product that depends on memory will feel
the pressure. Phones, gaming consoles,
TVs, laptops, you name it. On the other
hand, the country that's best positioned
to take advantage of this memory issue
may not be the United States or South
Korea, but China. In today's episode,
we're going to take a look at how memory became the choke point of the AI era, the companies behind it, and why
the next technology shock may arrive
without a single line of code changing.
Let's get into it.
>> You are watching ColdFusion TV.
>> We see data centers as the most exciting
asset class. Meta is building a 2-gigawatt-plus data center that is so large it
would cover a significant part of
Manhattan.
>> Take a look at this chart. It shows the
price of DDR5 RAM over recent months.
For a long time, prices barely moved.
They even fell slightly. Then very
suddenly at the start of last year, it
went parabolic. If you've been trying to
build a PC recently, you already know
what's going on.
One single 256 GB RAM kit can now cost
more than a flagship GPU. In some cases,
more than an RTX 5090. And that
parabolic move is where the story
begins. We'll come back to what really
caused it later. But for now, let's look
at what RAM really is. If you're
watching this episode, you already know
the basics, but here's a quick
refresher. RAM is short-term working
memory. It's where data sits while a
computer is actively doing something.
Opening apps, loading game assets,
editing video, even switching between
browser tabs. All of that happens on
RAM. As computers do more, RAM matters
more. In AI data centers, however, RAM
is mission critical. The memory used to
train and run large AI models operates
under very different conditions from the
RAM inside your laptop or gaming PC.
These systems run continuously, sometimes for weeks at a time. A small
error at home might freeze an app, but
in a data center, that same error can
crash the entire training run and waste
millions of dollars in compute time.
That's why servers rely on ECC memory, or error-correcting code RAM. Stability
comes first. Slightly higher latency is
fine, but crashing is not. As AI demand
surged, more manufacturing capacity
began flowing towards server grade
memory. Who cares about consumer RAM?
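The ECC idea mentioned above can be illustrated with a classic Hamming(7,4) code, a minimal sketch of single-bit error correction. Real server ECC DIMMs use wider SECDED codes over 64-bit words, but the principle, adding parity bits so a flipped bit can be located and repaired, is the same:

```python
# Sketch of the idea behind ECC RAM: a Hamming(7,4) code that corrects
# any single flipped bit in a 4-bit data word.

def encode(d: list[int]) -> list[int]:
    """Encode 4 data bits into 7 bits by adding 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Bit positions 1..7 hold: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c: list[int]) -> list[int]:
    """Locate and fix a single flipped bit, then return the 4 data bits."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity check over positions 4,5,6,7
    error_pos = s1 * 1 + s2 * 2 + s3 * 4   # 0 means no error detected
    if error_pos:
        c[error_pos - 1] ^= 1   # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
stored = encode(word)
stored[5] ^= 1                  # simulate a cosmic-ray bit flip in memory
assert decode(stored) == word   # the error is corrected transparently
```

That silent repair is exactly why a week-long training run on ECC memory survives the occasional bit flip that would corrupt data on a consumer machine.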
But there's a massive problem. Only a
few companies can make that sort of
thing. At this point, you might be
wondering something. There are so many brands of RAM you can buy that it doesn't look like a market constrained by scarcity.
Even though RAM sticks are sold under dozens of names, the most important parts of the module are the memory chips themselves, those small black rectangles soldered onto the stick. And here's the key point. Around 93% of those chips come from just three companies: Samsung, SK Hynix, and Micron.
>> I don't remember anybody forecasting
that all of a sudden AI data centers
will be buying up so much of the world's
RAM that the rest of us would be scrounging for leftovers. These three companies
control 93% of the world's supply of
RAM. Of these three, one of them just
said, "We're done with consumer
business. We're just going to focus on
enterprise now. We're going to focus on
data centers because that's where the
money is." And of the other two, Samsung and SK Hynix may have
contributed as much as 40% of the
world's entire supply of memory to a
single project at OpenAI going on right
now to create a massive set of AI
infrastructure there. When the world's electronics depend on just three
companies, the system becomes fragile.
One disruption, one miscalculation, and
the whole thing starts to wobble. A
tightly balanced supply meets a sudden
series of shocks.
As we mentioned earlier, in October
2025, OpenAI quietly locked up an
estimated 40% of global DRAM production
for its long-term AI infrastructure.
Then in late 2025, Micron announced that
it was stepping away from its consumer Crucial brand of RAM and SSDs. Its focus was going to be AI and enterprise
buyers. The leftover consumer stock was
expected to sell out by early 2026. And
with that single move, the entire market
shook. What followed was a scramble.
Other tech giants realized that the
window was closing. In January, Korean
media reported that US big tech
companies were staying in long-term
hotels around Pangyo and Pyeongtaek, desperately begging Samsung and SK Hynix for DRAM allocations. The
situation was reportedly so dire that
industry insiders were even calling them
DRAM beggars. Google reportedly tried to
secure additional high bandwidth memory
or HBM. This is a specialized form of
RAM designed to sit right next to AI
accelerators. In Google's case, those
accelerators are TPUs or tensor
processing units. Google's custom chips
that power much of its AI training. The
response they received was blunt.
[music]
Supply simply wasn't available.
According to reports, the executives
responsible for securing that memory
were later fired. Microsoft didn't fare
much better. Also in a panic, executives
flew to South Korea to negotiate
directly with SK Hynix, and the talks
went badly. One executive reportedly
stormed out of the meeting. This was the
environment during the price spike that
you saw earlier. So given everything
happening across the supply chain, it
leads to an obvious question. If demand
is so high and prices are rising, why
don't memory manufacturers simply make
more chips?
Why don't they just make more? Well, for a couple of reasons. First, as I've alluded to before, the current fabs that are used to make cutting-edge chips have a finite capacity. They already run around the clock, with some of the world's most finely tuned supply management and highly trained engineers overseeing their operation. It's not as simple as just adding a production shift or turning the machines up to 200%. These are delicate operations, and if something goes wrong in the middle of a batch, it could take weeks or even months to rebuild it from scratch. So, they're not going to change things all willy-nilly.
And second, even if they could boost production to meet demand, both SK Hynix and Samsung have publicly stated they don't plan to, because they don't want to. That clip explains the core of the problem well. Once you accept that today's factories are already running flat out, the obvious follow-up is: why not just build new ones? This is where things slow down. People inside the
industry are very clear about the
timeline. One memory chip executive told
Reuters that even after a company
decides to expand, it still takes at
least 2 years for new capacity to
actually start producing chips. And
that's the optimistic version where
everything goes according to plan. In
tech, 2 years is equivalent to the time
between now and the Egyptian pyramids.
Building a fab means committing billions
of dollars today based on what you think demand will look like several years from now. And who knows what AI demand will look like then. Everything could
pan out or it could be a bubble. Even
Sam Altman, OpenAI's CEO, has openly
acknowledged that the current AI frenzy
may turn out to be a bubble.
>> It's an interesting time for the world's most prominent AI CEO to say this. Late last week, Sam Altman had a dinner with various journalists. It was on the record, and The Verge published this quote. Sam Altman said: when bubbles happen, smart people get over-excited about a kernel of truth. If you look at most of the bubbles in history, like the tech bubble, there was a real thing. Tech was really important. The internet was a really big deal. People got over-excited. Are we in a phase where investors as a whole are over-excited about AI? My opinion is yes. Is AI the most important thing to happen in a very long time? My opinion is also yes. So that was Sam Altman saying we are in a bubble and investors are over-excited. He said elsewhere that he thinks some people will probably lose a lot of money.
>> There's also plenty of early evidence to
suggest that OpenAI itself will struggle financially in the future, and
that uncertainty matters. If AI demand
cools faster than expected in the next 2
years, it makes little sense for memory
manufacturers to pour billions of
dollars into equipment. Now, they're being extra cautious because they've seen this movie before. In the mid-2010s, memory demand surged as smartphones went mainstream. This was especially true when cheap mobile data exploded in fast-growing markets like India. DRAM manufacturers ramped up production. They were convinced that the growth would last, but it didn't. Demand cooled, the market flipped into oversupply, and prices collapsed. It's a painful memory that still hangs over the industry.
So instead of a sudden flood of new
factories to meet demand, what you get
is hesitation, careful commitments, and
a lot of waiting. Under current
conditions of increased demand, the
impact is massive, and we're already starting to see it.
By this point, the effects are no longer
subtle. Inside the industry, executives
have stopped speaking in hypotheticals.
SK Group chairman Chey Tae-won said, quote, "These days, we're receiving many requests for memory supply from so
many companies that we're worried about
how we'll be able to handle them all. If
we fail to supply them, they could face
a situation where they can't do business
at all." End quote. All of that pressure is already spilling over into the real world. According to Reuters, Japanese electronics retailers have begun limiting how many hard drives their customers can buy. Chinese smartphone manufacturers are openly warning of price increases. Apple is already deep in it as well, reportedly paying a 230% premium for the 12 GB LPDDR5X memory used in its iPhone 17 Pro models. Chips that once cost $25 to $29 are now closer to $70 each on every phone. PC makers are
facing the same reality. Lenovo and HP
are scrambling to secure memory supply
as shortages are expected to last until
2027. Dell, Lenovo, and Framework have also announced price increases. The RAM apocalypse continues. As RAM prices continue to rise, Dell, Lenovo, and Framework have all announced price increases and other changes due to the DRAM shortage, with TrendForce even predicting that Dell and Lenovo may go backwards and limit devices to only 8 gigs of RAM. According to
IDC research, in 2026 the whole PC market could decline by 4.9 to 8.9% and smartphones by 2.9 to 5.2%. Consoles are under pressure, too. The PlayStation 6 and the next-generation Xbox could face delays. Nintendo has already lost around $14 billion in market value amid concerns over memory costs affecting the next Switch. Among consumers, gamers are feeling it the most. They rely on high-performance GPUs, but those GPUs are rumored to push towards $5,000 price tags. A recent report indicates that Nvidia is going to be pushing the price of the RTX 5090 up to 5K.
Uh, this is still an unconfirmed rumor
from an insider source in South Korea.
From what I can understand, that's kind
of like, you know, boots on the ground
that's talking about it right now, which
is where this headline is stemming from.
Previous price was 2K. This is over
double.
That's an awfully hard bargain there. 5,000 clams. I would imagine most
sensible sane human beings on the planet
would be pretty upset about that.
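To put those jumps in perspective, here's the percentage-increase arithmetic on the figures quoted in this episode. Note that the 230% premium reported for Apple is presumably measured against a different baseline than the simple per-chip calculation below:

```python
# Quick percentage-increase arithmetic for the price jumps quoted above.

def pct_increase(old: float, new: float) -> float:
    """Percentage increase from the old price to the new price."""
    return (new - old) / old * 100

# iPhone memory chip: roughly $25-$29 before, now closer to $70 each.
print(f"{pct_increase(25, 70):.0f}%")      # prints 180%
print(f"{pct_increase(29, 70):.0f}%")      # prints 141%

# Rumored RTX 5090: from about $2,000 to $5,000, "over double" indeed.
print(f"{pct_increase(2000, 5000):.0f}%")  # prints 150%
```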
Meanwhile, there are some dire warnings out there. The CEO of the fabless memory company Phison states that many consumer electronics manufacturers quote "will go bankrupt or exit product lines" end quote by the end of 2026 due to the AI memory crisis. He claims that mobile phone production will be reduced by 200 to 250 million units, and PC and TV production will also be significantly reduced. We'll see if that pans out, but it does indicate some serious supply disruption.
At this point, we need to talk about
Nvidia. The company built its rise on
gaming GPUs, but now sits at the center
of the AI boom, and they're one of the
biggest beneficiaries. According to the publication The Information, Nvidia
will pause new gaming GPU releases for
consumers in 2026 due to the shortage.
It's a slap in the face for gamers.
Meanwhile, their latest Blackwell
systems are designed for data centers,
not desktops. Each rack carries enormous
amounts of memory, up to 864 GB, and
that's because modern AI models demand
it. Multiply that across hundreds of
systems that companies like Anthropic
and Microsoft want to deploy, and the
effect is obvious. Vast chunks of global
memory supply disappear into data
centers long before consumer hardware
even gets a sniff. All of this leaves
the market in an unusual place. The
companies that built the modern tech
stack are quietly reshuffling their
priorities. To consumers, Nvidia feels
like a modern Judas, but that's just the
way it is. But there's also a glaring
problem with the data center buildout
itself. What happens in two to four years
when the very chips that have ruined the
global RAM supply are hopelessly
outdated for their original purpose?
It's a question worth asking. And then
on top of this, you have all the
clueless investors jumping in on the
hype. Frankly, it's insane. So the
problem with data centers is everyone
thinks that data centers are real estate. And a lot of people do
real estate. Data centers are not real
estate. The common joke in the industry
now is someone says, "I'm going to have
100 megawatts of capacity for you and
I'm going to have it in three months.
Are you willing to sign?" And then you
ask a question like, "Well, um, what's
your uptime?" And they're like, "I don't
know, whatever the power grid is."
You're like, "Wait, what? Where are your
generators?" Oh, I haven't ordered
those. I'll order them now. You know
that there's a 90-month lead time on
generators right now? Oh, really?
>> 90.
And then the next question is, where are
you getting the water from? Wait, data
centers need water? I thought it was a
bunch of chips. What? What do you mean
water? So, there's a bunch of people who have no idea what they're doing going into it because they think it's real estate. And so, those people are now building an oversupply of data centers, but they're not really building them. So, they're fake data centers that people think are real.
For now, there's no obvious release
valve, but there is a dark horse in this
story. China. Just as pressure mounts on
traditional centers of chip production,
China is only a few years behind the
cutting edge. Could the giant from the
east swoop in and change the RAM
apocalypse story? Possibly, but not
overnight. China's leading DRAM
challenger, CXMT, has recently announced
that it's able to manufacture DDR5
memory. On paper, that matters. But
there is a catch. Timing. Most analysts
believe that CXMT is still 2 or 3 years
away from reaching the scale, yields,
and consistency needed to meaningfully
shift global supply. By then, today's
contracts will already be locked in.
Those contracts matter more than most
people realize. SK Hynix has
reportedly sold through much of its
production well into 2026. That means
that even if AI demand cools or the
bubble deflates, the memory is already
spoken for. Buyers who locked in early
will still be expected to take delivery,
often at prices set during today's peak.
So even with new players entering the
field, relief arrives slowly, and when
it does, it's not going to feel evenly
distributed.
So I've been covering neural networks on
this channel for over a decade. Before
ChatGPT and before the hype, I was really fascinated by the raw potential of this technology. But what we're seeing now just feels different: louder, messier, harder to justify. Between the water usage, the electricity demand, AI-induced psychosis, non-consensual generated images, and the growing flood of low-effort AI content, it's fair to
pause and ask a difficult question. Are
the benefits of consumer generated AI
still worth the cost, especially when
the fallout isn't abstract anymore? So,
what's your take on all of this? Do you
think that the RAM shortage is just a
necessary step to get to where we need
to be? Or have tech CEOs become a bit
overzealous and this is all a big
mistake? Let me know your thoughts in
the comment section below. It's
abundantly clear that AI is everywhere
these days, whether we like it or not.
But have you ever thought about how
artificial intelligence works in the
first place? Well, if you have,
Brilliant is for you. I found it a great
way to learn or build on the skills you
already have. Brilliant is a learning
platform designed to help you master
both math and coding through interactive
step-by-step lessons and personalized
practice. With Brilliant, you're not
only learning by doing, you're also
solving problems visually and
interactively. I love the way it adjusts
to how you learn, so you're always
progressing at the right pace. Whether
you're 10 years old or 110, it's
designed for everyone from curious kids
to adults. I love Brilliant's Scientific Thinking course. When you're working
through these science lessons, you're
actively solving problems step by step
until the ideas genuinely make sense.
That interactive approach makes a huge
difference when compared to just
passively watching a lecture or a video.
Everything is carefully crafted by world-class educators from places like MIT, Harvard, Stanford, Caltech, and leading tech companies. With their expanded 2025 content library, there's more depth than ever, from everyday data reasoning and probability to advanced problem solving and the mathematics that underpins modern AI. So
if you want to learn more or simply just
brush up on your knowledge base, look no
further than Brilliant. Start building the habit of learning today, not for grades or credentials, but for the way it sharpens how you think, reason, and approach challenges. To learn for free on Brilliant for a full 30 days, head to brilliant.org/coldfusion.
Scan the QR code on the screen or click
the link in the description. Brilliant's
also offering our viewers 20% off an
annual premium subscription, which gives
you unlimited daily access to everything
on Brilliant. Thanks to Brilliant for
supporting ColdFusion. Anyway, that's about it from me. My name is Dagogo and you've been watching ColdFusion, and I'll catch you again soon for the next episode. Cheers, guys. Have a good one.
ColdFusion. It's new thinking.