Inside AI’s Memory Crisis

Plus: Agentic workflows in Windows, Poe chats, AI’s global ignorance.

Here’s what’s on our plate today:

  • 🧪 The chip wars heat up with memory as AI’s next bottleneck.

  • 🗞️ Agents in Windows, group AI chats, and global data gaps.

  • 🧰 ColdFusion’s chip explainer, TrendForce tracker, and SemiAnalysis deep dive.

  • 🗳️ Are memory bottlenecks the real AI chokepoint?

Let’s dive in. No floaties needed…

Launch fast. Design beautifully. Build your startup on Framer—free for your first year.

First impressions matter. With Framer, early-stage founders can launch a beautiful, production-ready site in hours. No dev team, no hassle. Join hundreds of YC-backed startups that launched here and never looked back.

  • One year free: Save $360 with a full year of Framer Pro, free for early-stage startups.

  • No code, no delays: Launch a polished site in hours, not weeks, without hiring developers.

  • Built to grow: Scale your site from MVP to full product with CMS, analytics, and AI localization.

  • Join YC-backed founders: Hundreds of top startups are already building on Framer.

Eligibility: Pre-seed and seed-stage startups, new to Framer.

*This is sponsored content

The Laboratory

How AI is powering a new memory chip arms race

SK Hynix HBM: the high-bandwidth ‘data skyscraper’ powering today’s AI boom amid record demand and global shortages. Photo credit: Reuters.

When end users visualize artificial intelligence, the images that flash before their eyes may include robots like the T-800 from The Terminator or TARS from Interstellar. While such machines may become reality someday, today’s AI models require far more space and power to run.

Some AI models can run on edge devices like laptops and mobile phones; however, for the more powerful models that generate text, images, and video, companies have to rely on large data centers.

These data centers range in size, some covering millions of square feet, and are filled with high-end chips that run the neural networks. But those networks are only useful if they are fed massive amounts of data, during training and afterward, and that data is typically stored on memory chips.

So while the T-800, in the shape of Arnold Schwarzenegger, may provide the screen presence, current AI systems need chips both to run neural networks and to store the data fed to them.

This dependence of AI models on memory chips shows that the industry relies on more than companies like OpenAI, which develop the models, and Nvidia, which makes the chips that run them.

That reliance has had a serious impact on the memory chip market, which is predominantly controlled by SK Hynix and Samsung.

According to a Reuters report, Samsung raised prices of certain memory chips, which are in short supply due to the global race to build AI data centers.

The shortage has also driven a sharp increase in the market valuations of both Samsung and SK Hynix, underscoring how the AI boom has stoked intense demand for memory chips designed specifically for AI workloads.

AI’s memory problem

Memory chips have been the unsung workhorses of computing for decades. Dynamic random-access memory (DRAM) temporarily stores data for active processing, while NAND flash provides persistent storage.

These technologies powered the personal computing revolution, the smartphone era, and cloud computing's rise. However, for most of their history, memory chips were commoditized products subject to brutal boom-bust cycles, with manufacturers like Samsung, SK Hynix, and Micron competing primarily on price and volume.

With the advent of AI, the market dynamic began shifting, and analysts realized that traditional memory chips would not be able to keep up with the needs of AI data centers.

According to the Potomac Institute, the memory industry accounted for $154 billion in sales in 2021, comprising 28% of the $556 billion global semiconductor market, roughly equivalent in size to the entire logic category (CPUs, accelerators, FPGAs, etc.).

A primary reason behind the market’s sustained growth was researchers’ ability to keep increasing density, the amount of data stored per chip, for both DRAM and NAND.

Eventually, though, the industry hit a wall: traditional DRAM ran into fundamental limits from power constraints and from the latency caused by the physical distance between DRAM and processors.

Graphics processing units (GPUs) from Nvidia, AMD, and others could perform trillions of operations per second, but they were starved for data if memory couldn't keep pace.
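
A quick back-of-envelope sketch shows why; the numbers below are illustrative assumptions, not specs for any real part:

```python
# Back-of-envelope "memory wall" math. Every number here is an
# illustrative assumption, not a spec for any real chip.

PEAK_FLOPS = 1e15  # hypothetical accelerator peak: 1,000 TFLOP/s

def achievable_flops(mem_bandwidth: float, flops_per_byte: float) -> float:
    """FLOP/s the chip can sustain if every operand comes from memory."""
    return min(PEAK_FLOPS, mem_bandwidth * flops_per_byte)

# Arithmetic intensity = FLOPs performed per byte moved. Large matrix
# multiplies reuse data heavily (high intensity); steps like reading an
# attention KV cache reuse almost nothing (low intensity).
for name, bw in [("DDR-class, 100 GB/s", 100e9),
                 ("HBM-class, 2 TB/s", 2e12)]:
    for intensity in (1, 10, 100):
        share = achievable_flops(bw, intensity) / PEAK_FLOPS
        print(f"{name} | {intensity:>3} FLOP/byte -> {share:.2%} of peak")
```

With slow memory and low data reuse, a hypothetical thousand-teraflop chip sustains a fraction of a percent of its peak, which is the starvation problem HBM was built to solve.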

HBM: AI’s critical enabler

SK Hynix HBM: vertically stacked ‘data skyscraper’ memory delivering higher efficiency, lower power, and massive bandwidth for next-gen AI. Photo credit: Bloomberg.

To address the problem of feeding processors with data fast enough, SK Hynix developed High Bandwidth Memory (HBM), which stacks layers of memory vertically, akin to a ‘data skyscraper’. This architecture enables much faster communication with the processors driving ChatGPT and other leading AI models.

The technology was a remarkable jump over earlier designs. In its current iteration, it delivers roughly a 60% overall advantage in cost per unit of bandwidth, draws less power, runs cooler, and offers up to 36 GB of capacity and more than 2 TB/s of bandwidth.
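
To make those figures concrete, here is a rough sketch of why bandwidth, more than raw compute, bounds how quickly a large model can generate text. The model size is hypothetical and the serving assumptions are deliberately simplified:

```python
# Rough sketch: why memory bandwidth caps how fast a large model can
# generate text. The 70B-parameter model is hypothetical, and we assume
# each generated token streams all weights from memory once (ignoring
# batching, caching, and multi-device tricks).

params = 70e9                            # hypothetical parameter count
bytes_per_param = 2                      # FP16 weights
model_bytes = params * bytes_per_param   # ~140 GB

for name, bandwidth in [("DDR-class (100 GB/s)", 100e9),
                        ("HBM-class (2 TB/s)", 2e12),
                        ("8x HBM devices (16 TB/s)", 16e12)]:
    tokens_per_second = bandwidth / model_bytes
    print(f"{name}: ~{tokens_per_second:.1f} tokens/s upper bound")
```

Under these assumptions, conventional memory yields under one token per second, while HBM-class bandwidth makes interactive speeds possible.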

However, the manufacturing complexity is staggering. Producing HBM requires roughly seven major steps across front-end and back-end processes. The complexity of the technology forced many existing players to exit the market, creating clear winners.

In the HBM market, SK Hynix and Samsung are the undisputed leaders, estimated to jointly hold more than 90% of global share in 2024, while Micron plays the role of challenger with a 7% share.

Even with those players at full tilt, the market has struggled with shortages since the AI boom took off.

In May 2024, Reuters reported that SK Hynix’s high-bandwidth memory chips used in AI chipsets were sold out for 2024 and almost sold out for 2025. The company is one of Nvidia’s major suppliers.

Additionally, CNBC reported that high-performance memory chips were likely to remain in tight supply through 2024.

Demand outpaces supply

Since HBM became the backbone of AI development, hyperscalers have been scrambling to stock up on it.

Big Tech’s infrastructure spending has exploded: in October 2025, The New York Times reported that four of the tech industry’s wealthiest companies were showing no signs of slowing their spending on AI.

Google, Meta, Microsoft, and Amazon have invested so much in developing and deploying AI systems that they have sparked fears that the tech industry is heading toward a dangerous bubble.

This surge is driving unprecedented demand for advanced memory. According to Bloomberg, the HBM market is set to grow at an average of 42% a year between 2025 and 2033, by which point it would account for more than 50% of the overall dynamic random-access memory (DRAM) market and about 10% of industry bit shipments, all driven by demand for AI infrastructure.
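
For a sense of how dramatic that compounding is, here is a minimal sketch of the arithmetic; the base-year market size is a placeholder, not a number from the Bloomberg report:

```python
# Compounding sketch: what 42% annual growth from 2025 to 2033 implies.
# The 2025 base value is a placeholder, not a figure from the report.

growth_rate = 0.42
years = 2033 - 2025                      # 8 compounding periods
multiple = (1 + growth_rate) ** years
print(f"Growth multiple over {years} years: ~{multiple:.1f}x")  # ~16.5x

value = 1.0                              # placeholder 2025 market size
for year in range(2025, 2034):
    print(year, f"{value:.2f}")
    value *= 1 + growth_rate
```

In other words, a market compounding at 42% annually is roughly sixteen times larger after eight years.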

Geopolitics tightens supply further

Meanwhile, geopolitics is intensifying supply pressures. The U.S. Bureau of Industry and Security’s December 2024 rules imposed the first-ever country-wide ban on exporting advanced HBM, including HBM2E and above, to China.

With no Chinese firm capable of producing HBM2E, companies like Huawei must rely on stockpiled HBM unless domestic breakthroughs occur, tightening the market even further.

However, not everyone believes AI demand will sustain the current memory market euphoria.

Analysts at MKW Ventures Consulting have warned that when enthusiasm around a new technology runs high, expectations are often overstated; added capacity from the major suppliers could tip the market into oversupply and trigger a correction.

Meanwhile, intensifying talk of an AI bubble has also raised doubts about whether the memory chip market will keep growing. Critics question whether cloud providers’ massive infrastructure investments will generate returns sufficient to justify spending at current levels.

Add startups like DeepSeek, which have trained and released models capable of running on less compute, and the future of HBM looks far less certain.

For now, though, the memory chip market is experiencing a boom that is bound to reshape the industry’s fortunes, for better or for worse.

As for those still waiting for the T-800 or TARS: AI models may have become smarter, but the chips that run them, the memory that stores their data, and the bodies to house them have a long way to go before they can ride Harleys or ask us to adjust their humor settings.

Quick Bits, No Fluff

  • Windows 11 gets agentic: Microsoft is integrating AI agents directly into the taskbar, making agentic workflows a native feature of the OS.

  • Poe launches group chats: Users can now create multi-user chats with different AI models in Poe’s app, enabling collaborative prompting.

  • AI’s global blind spot: Experts warn that AI models are collapsing under their Western-centric training, missing vast chunks of global knowledge.

Visa costs are up. Growth can’t wait.

Now, new H-1B petitions come with a $100K price tag. That’s pushing enterprises to rethink how they hire.

The H-1B Talent Crunch report explores how U.S. companies are turning to Latin America for elite tech talent—AI, data, and engineering pros ready to work in sync with your HQ.

Discover the future of hiring beyond borders.

*This is sponsored content

Thursday Poll

🗳️ Are memory bottlenecks the real AI chokepoint?


3 Things Worth Trying

  • Chip Wars Explained → Watch ColdFusion’s explainer on the global chip race, how it started, and why memory matters now more than ever.

  • HBM 101 Visualized → Explore SemiAnalysis’s free HBM architecture visual guide to understand why stacking matters.

  • Track Global Supply Chains → Dive into TrendForce’s free memory market tracker to keep tabs on shortages, price hikes, and big supplier moves.

Rate This Edition

What did you think of today's email?
