Who’s Really Winning The AI Race?

Plus: chip wars, hardware bottlenecks, robot takeovers, and a scary-good lawnmower.

Here’s what’s on our plate today:

  • 🧱 Nvidia—not OpenAI—is quietly deciding the future of AI.

  • 🤖 Aggressive lawnbots, agile humanoids, and OpenAI’s hallucinations.

  • 💬 Prompt of the Day imagines a world where Nvidia suddenly vanishes.

  • 🗳️ Is Nvidia’s dominance dangerous or just smart business?

Let’s dive in. No floaties needed…

Guidde—Create how-to video guides fast and easy with AI.

Tired of explaining the same thing over and over again to your colleagues?

It’s time to delegate that work to AI. Guidde is a GPT-powered tool that helps you explain the most complex tasks in seconds with AI-generated documentation.

  • Share or embed your guide anywhere

  • Turn boring documentation into stunning visual guides

  • Save valuable time by creating video documentation 11x faster

Simply click capture on the browser extension, and the app will automatically generate step-by-step video guides complete with visuals, voiceover, and calls to action.

*This is sponsored content

The Laboratory

Why Nvidia, not OpenAI, is setting the pace in the AI race

In June 2025, when news broke that Microsoft and OpenAI were quarreling over the definition of Artificial General Intelligence (AGI), the concept was still poorly understood outside the industry. Until then, it had mostly been discussed in specialized publications, and the wider public had little sense of what AGI meant or why it figured so heavily in AI companies’ ambitions. Now the term is everywhere, and the CEOs of major tech companies are drumming up support as they compete to reach what they frame as the end goal of AI: AGI.

In addition to hiring top talent, signing nuclear power deals, and lobbying governments for funding, AI model companies are spending billions on infrastructure. In this process, they are fueling the growth of chipmakers and enterprises that run data centers. And one of the major beneficiaries of this race is Nvidia.

Nvidia makes the chips used to train and run generative AI programs. Its graphics processing units (GPUs), which once powered high-res gaming, are now the favored chips for AI, bought in bulk by the likes of Microsoft, ByteDance, and Tesla.

According to CNBC, Nvidia currently holds between 70% and 95% of the market for artificial intelligence chips, making it one of the most valuable corporations in the world. The company’s market capitalization briefly reached $4 trillion in July 2025, underscoring the role the AI boom has played in its success.

While Nvidia continues to work on more powerful chips, its success is also shaping the future of AI.

One company’s grip on AI development

Back in 2024, when companies were racing to release powerful AI models and secure funding, progress across the industry hit a lag. The cause was not a lack of algorithmic capability but the fact that Nvidia’s AI chips were in heavy demand and short supply.

Media reports suggest Nvidia’s next-gen Blackwell chips, which are key to more capable AI, were delayed due to design flaws. According to a Reuters report, delays were thought to affect customers such as Meta Platforms, Alphabet's Google, and Microsoft, which collectively ordered tens of billions of dollars' worth of chips.

The setbacks, estimated at up to three months, were expected to delay data center build-outs and slow model deployment timelines. Analysts at the time, while not alarmed, flagged the episode as a bottleneck.

Around the same time, Nvidia also struggled to meet the rising demand for its AI GPUs. The impact of these shortages was felt as far back as 2023. Things at one point were so dire that OpenAI CEO Sam Altman said it would be better if fewer people used ChatGPT because of the processor bottleneck. “GPUs at this point are considerably harder to get than drugs,” Elon Musk told The Wall Street Journal at the CEO Council Summit in May 2023.

Though the shortages seem to have eased off since then, the problem of overreliance on one company continues to threaten future developments. This reliance impacts not just big tech companies; the concentration of computing power within one supplier also makes it difficult for small enterprises and startups to scale.

The impact on small businesses

In August 2024, a report from Wired highlighted the impact of GPU shortages on AI startups. According to the report, companies had to get creative just to stay competitive: some pooled cash to secure compute, others slimmed down their model sizes, and some rewrote their code to squeeze more performance out of less hardware.
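One common flavor of that code-level optimization is mixed-precision training, which cuts GPU memory use and leans on the faster low-precision math modern GPUs offer. The Wired report doesn’t name specific techniques, so the PyTorch sketch below is only an illustrative example; the model, data, and training loop are placeholders.

```python
# Minimal sketch of one efficiency technique: mixed-precision training.
# The model, optimizer, and data are placeholders for illustration only.
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(1024, 1024).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

for _ in range(10):  # stand-in training loop
    x = torch.randn(32, 1024, device=device)
    target = torch.randn(32, 1024, device=device)
    optimizer.zero_grad()
    # Run the forward pass in float16 where it is safe to do so.
    with torch.autocast(device_type=device, dtype=torch.float16,
                        enabled=(device == "cuda")):
        loss = nn.functional.mse_loss(model(x), target)
    scaler.scale(loss).backward()  # scale the loss to avoid fp16 underflow
    scaler.step(optimizer)
    scaler.update()
```

Teams under the same constraints also reach for gradient checkpointing, quantization, and smaller batch sizes, all in the same spirit of doing more with fewer GPUs.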

That scarcity pushed investors to pour hundreds of millions into startups building software designed to wring efficiency out of scarce GPUs. Even cloud providers struggled to provide adequate GPU allocations on short notice.

Even when companies struggled to get their hands on Nvidia chips, they preferred to wait or find workarounds rather than turn to the chipmaker’s rivals. Which raises the question: where is the competition?

The state of Nvidia’s rivals

Nvidia remains the dominant player in AI training GPUs, but competition is intensifying on several fronts: established chipmakers AMD and Intel are working to claw back market share.

In the cloud, Nvidia faces competition from providers building custom silicon. According to a report from The Information, Google is intensifying its efforts in the AI chip space with its Tensor Processing Units (TPUs) to challenge Nvidia’s long-standing dominance in AI hardware, leveraging its position as a cloud provider to push TPUs not just for internal workloads but increasingly for external cloud customers.

Geopolitical tensions have also forced countries like China to pursue self-sufficiency in advanced chip manufacturing. According to the Financial Times, the country is looking to triple its total output of artificial intelligence processors by 2026.

These efforts are being spearheaded by Huawei, Semiconductor Manufacturing International Corporation (SMIC), China’s leading fab, and Chinese chip designers such as Cambricon, MetaX, and Biren.

Even OpenAI is working on its first AI chip in partnership with U.S. semiconductor giant Broadcom. While the company plans to use the chip internally rather than sell it to external customers, the move hints at a larger shift in thinking aimed at reducing reliance on Nvidia.

However, Nvidia’s dominance in the AI chip market goes beyond its ability to design the most powerful GPUs. The company’s portfolio of products forms an entire ecosystem resembling that of Apple, only this time the walled garden is for enterprises, not end consumers.

Inside Nvidia’s walled garden

Nvidia dominates the AI chip market not only because of its powerful hardware but also through its integrated ecosystem that competitors struggle to replicate.

At the core of this ecosystem is CUDA, Nvidia’s proprietary parallel computing platform, which has become the industry standard for AI development. CUDA and its supporting libraries have created deep developer lock-in, as most leading AI frameworks are optimized for them, leaving researchers and companies that consider alternative hardware facing high switching costs.
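That lock-in is visible even in everyday framework code. As a hedged illustration (not drawn from any specific codebase), the typical PyTorch script below assumes a CUDA device as its default accelerator; moving it to non-Nvidia hardware means swapping in a different backend and re-validating the CUDA-tuned libraries underneath it.

```python
# Illustrative only: how CUDA assumptions creep into ordinary training code.
import torch

# The idiomatic device check most tutorials and codebases default to:
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(512, 512).to(device)   # weights land on the Nvidia GPU
x = torch.randn(8, 512, device=device)         # inputs allocated on the same device
y = model(x)
print(y.shape, device)

# Porting to other hardware means a different backend string ("mps", "xpu",
# or a vendor plugin) and fresh performance tuning, since the fast paths here
# rely on CUDA-optimized libraries such as cuDNN, cuBLAS, and NCCL.
```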

Nvidia also leads in performance with chips like the H100 and Blackwell series, which set benchmarks for training and inference efficiency. But the company goes further, delivering end-to-end AI systems such as DGX servers, Grace CPUs, and high-performance networking through its Mellanox acquisition. This vertical integration means customers get a fully optimized AI infrastructure from one provider. Nvidia has also used its collaboration with hyperscalers like AWS, Google Cloud, and Microsoft’s Azure to cement its ecosystem.

So while Nvidia’s ecosystem makes it comfortable and easy for enterprises to stay inside the walled garden, it also locks them in with high switching costs.

That trade-off suits companies with no desire to break free, but it becomes a real constraint for those that want to explore alternative hardware and newer technologies.

The shadow Nvidia casts on AI’s future

The current AI boom is often narrated through the lens of model makers. OpenAI, Anthropic, Google DeepMind, and Meta are all racing to scale algorithms that promise AGI. But behind the scenes, Nvidia has become both the backbone and the gatekeeper of AI progress. Its GPUs form the invisible infrastructure upon which breakthroughs depend. This dominance, while impressive, raises important questions about resilience, competition, and the future trajectory of AI.

Nvidia was the proverbial early bird: it built an ecosystem that it now uses to lock in customers. But the very strength of that ecosystem exposes a vulnerability for the wider industry.

Overreliance on a single company creates systemic risks. When Nvidia’s Blackwell chips were delayed in 2024, the ripple effects stalled data center buildouts and slowed AI deployment across tech giants. For startups, the stakes were even higher. And though competition is trying to catch up, designing powerful chips is one challenge; replicating the developer mindshare and platform lock-in Nvidia has cultivated for nearly two decades is another entirely.

In the end, Nvidia may not build AGI itself, but it has become the arbiter of who gets to try, when, and at what scale. As long as the AI race runs on GPUs, Nvidia will remain the quiet pacemaker. The future of AI depends less on who reaches AGI first and more on whether the industry can escape Nvidia’s walled garden.

Roko Pro Tip

💡 Know your dependencies.

When building with AI, don’t just optimize your models; track your infrastructure bottlenecks too. Whether you’re running open weights or calling APIs, understanding the chip, memory, and latency constraints underneath your stack is key to building something that scales.
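As a small, hedged example of what that can look like in practice, the PyTorch snippet below reports which GPU a stack is actually running on and how much memory headroom is left; the numbers that matter will depend on your own workload.

```python
# Quick dependency check: what hardware is this stack actually running on,
# and how much memory headroom is left? (Illustrative sketch, PyTorch only.)
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1e9
    allocated_gb = torch.cuda.memory_allocated(0) / 1e9
    reserved_gb = torch.cuda.memory_reserved(0) / 1e9
    print(f"GPU: {props.name}, {total_gb:.1f} GB total")
    print(f"Allocated: {allocated_gb:.2f} GB, cached: {reserved_gb:.2f} GB")
else:
    print("No CUDA device visible; running on CPU, expect very different latency.")
```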

161,596 founders, investors and leaders read this.

Founders and leaders who read Open Source CEO end up with 2.3x the cerebral horsepower of those who don’t.

Ok, we cannot actually prove that, but we think it’s about right. What we do know is that 160k+ readers from Google, TikTok, OpenAI and Deel love our deep dive business content. Subscribe here to see what it’s all about.

*This is sponsored content

Prompt Of The Day

“How would the AI race unfold if Nvidia suddenly disappeared?”

Use this counterfactual to explore chip dependence, geopolitics, and open hardware futures. Great prompt for macro/strategy modeling or sci-fi prototyping.

Bite-Sized Brains

Tuesday Poll

🗳️ Is Nvidia’s dominance good or bad for AI?


Rate This Edition

What did you think of today's email?
