The Real Economics Of AI

Plus: California’s AI bill, the rise of agent startups, and how the money really flows.

Here’s what’s on our plate today:

  • 🧪 How AI companies are turning foundational models into real business models.

  • 🧠 Humanoid robots, California’s AI safety bill, and OpenAI’s AI bubble.

  • 📊 Roko’s Pro Tip keeps your business grounded with a CFO mindset.

  • 🔮 Roko’s Prompt and poll explore AI pricing, and what SMBs really want from AI.

Let’s dive in. No floaties needed…

Guidde—Create how-to video guides fast and easy with AI.

Tired of explaining the same thing over and over again to your colleagues? It’s time to delegate that work to AI. Guidde is a GPT-powered tool that helps you explain the most complex tasks in seconds with AI-generated documentation.

Simply click capture on the browser extension and the app will automatically generate step-by-step video guides complete with visuals, voiceover and call to action.

The best part? The extension is 100% free.

*This is sponsored content

The Laboratory

How AI companies make money

In a world shaped by economic structures, scientific discoveries have to prove their commercial viability, even as they add to the collective human knowledge base. History is full of discoveries that had to pass through the crucible of commercialization before they became accessible to the wider public.

One example is electricity. When Benjamin Franklin conducted his famous kite experiment in a thunderstorm in the 18th century, people grasped the power of electricity. But it would take another century before Thomas Edison and Nikola Tesla turned that knowledge into practical applications.

In contemporary times, we appear to be at a similar stage. The underlying technology that powers AI is here, but many of its applications are yet to be understood, developed, and streamlined.

The first artificial neural network machine was built in the 1950s by Marvin Minsky and Dean Edmonds. Their Stochastic Neural Analog Reinforcement Calculator (SNARC) was an attempt to model learning processes in the human brain, specifically reinforcement learning. Since then, the technology has evolved into chatbots and AI agents that automate tasks, which is where its true commercial value, at least for now, lies.

The building blocks of AI products

Modern Large Language Models (LLMs) are mathematical systems trained to predict the next piece of text given what came before. That may sound trivial, but scale, data, and compute turn it into something powerful. At their core, LLMs are raw technology, and for that technology to become a viable business, companies like OpenAI, Anthropic, Google, and Meta wrap them into products: chatbots, copilots in office tools, coding assistants, and customer-service bots. That’s how predicting the next word becomes a service users pay for.
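The core objective can be shown with a toy sketch. This is not how a real LLM works internally (those learn billions of parameters over tokens, not a bigram count table), but the training goal, predicting the next piece of text from context, is the same. The corpus here is invented for illustration:

```python
# A toy "language model": bigram counts learned from a tiny corpus.
# Real LLMs learn billions of parameters, but the objective is the same:
# predict the next token given the context.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
bigrams = {}
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams.setdefault(prev, {}).setdefault(nxt, 0)
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most likely next word after `word` (greedy decoding)."""
    followers = bigrams.get(word, {})
    if not followers:
        return None
    return max(followers, key=followers.get)

def generate(start, length=5):
    """Chain greedy predictions into text, like an LLM's sampling loop."""
    out = [start]
    for _ in range(length):
        nxt = predict_next(out[-1])
        if nxt is None:
            break
        out.append(nxt)
    return " ".join(out)
```

Here `predict_next("the")` returns `"cat"`, because "cat" follows "the" most often in the corpus. Scale that same idea up by many orders of magnitude and you get the raw technology the product layer is built on.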

However, it is not easy to build a business model based on a technology that is undergoing rapid advancements while still being in the early stages of deployment.

The sheer investment needed to train and run an LLM means companies need a solid business plan: one that solves a real problem for customers and, in turn, supports a sustainable business model. So while companies like Google, Meta, and OpenAI develop the underlying technology, others like Stripe, Duolingo, Snap, and Grammarly license it and package it into user-facing products.

A closer look at their differing business models brings to light how companies, whether or not they are AI-native, are working to monetize the technology.

Turning LLMs into services

Just as electricity is monetized both directly (as supply) and indirectly (through the gadgets that run on it), LLMs are monetized through products wrapped in a business model. These products are where the raw power of LLMs gets packaged into something people or companies actually buy. Even ChatGPT is a product built on top of LLMs: it uses an OpenAI LLM (such as GPT-4 or GPT-5) but adds a chat interface, instruction tuning, and reinforcement learning from human feedback (RLHF) to make the raw model safer and more useful in dialogue.

For companies like OpenAI that develop the underlying technology, the business model is to provide model APIs for developers and enterprises, along with ChatGPT for individual and enterprise applications.

Another way to make money from growing AI adoption is by providing the underlying infrastructure, like chips and data centers. Nvidia has been the clear winner here: it sells the hardware everyone else needs. Its data center business has exploded as AI demand surged, and Wall Street’s expectations explicitly tie Nvidia’s revenue to generative‑AI spending.

The third way is through cloud AI platforms, which package models and tooling behind pay‑as‑you‑go APIs. Cloud providers buy computing power from companies like Nvidia and build data centers; that compute is then rented out to companies, such as Snap, to build their products.

Google Cloud Vertex AI lists per‑request pricing for Gemini and other models. Similarly, AWS Bedrock sells multi‑model access with cost‑optimization guidance, with users paying for tokens, fine‑tuning, and throughput.
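Token-based pricing like this is easy to model. Below is a minimal sketch with hypothetical per-token rates; real Vertex AI and Bedrock prices differ by model, change often, and should be checked on each provider’s price sheet:

```python
# Rough cost model for pay-as-you-go LLM APIs: you pay separately for
# input and output tokens. These rates are ILLUSTRATIVE ASSUMPTIONS,
# not actual Vertex AI or Bedrock pricing.
PRICE_PER_1K_INPUT = 0.0025   # dollars per 1,000 input tokens (assumed)
PRICE_PER_1K_OUTPUT = 0.0100  # dollars per 1,000 output tokens (assumed)

def request_cost(input_tokens, output_tokens):
    """Cost in dollars of a single API call."""
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT \
         + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT

def monthly_cost(requests_per_day, avg_in, avg_out, days=30):
    """Projected monthly bill for a product making many such calls."""
    return requests_per_day * days * request_cost(avg_in, avg_out)
```

Under these assumed rates, a product making 10,000 calls a day at 800 input and 300 output tokens each would run about $1,500 a month, which is exactly why providers sell cost-optimization guidance alongside the tokens.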

While these business models have made the underlying technology accessible to end users, they alone cannot cover the immense cost of running and deploying AI models.

AI companies like OpenAI, Google, and Microsoft need enterprise clients and better distribution channels to ensure the adoption of LLMs continues to grow, which in turn helps them sustain and expand their operations.

The enterprise question

To ensure enterprise sales, AI companies are relying on bundling LLM-powered products into existing products. Microsoft priced Microsoft 365 Copilot at $30 per user per month for business plans and later brought Copilot to consumers (with a price bump), leaning on distribution in Office to convert that technology into predictable subscription revenue.

Similarly, Google rebranded Duet AI to Gemini for Workspace at $20 and $30 per user per month, again bundling into tools people already use daily. Adobe also integrated Firefly into Creative Cloud with generative credits, a meter designed to match real compute costs to value, which is another way to turn usage into dollars.

Finally, companies like Apple are leveraging AI by using it to sell devices and to promise privacy: Apple Intelligence runs a lot on‑device and uses Private Cloud Compute for heavier tasks, explicitly marketed as a privacy‑preserving design. It’s a distinct business model: AI drives device upgrades and services loyalty rather than per‑token fees.

SMEs are the last mile of the distribution channel

As of now, the commercial pipeline of AI is simple. The technology is developed by companies like OpenAI, Google, and Microsoft. It is then either packaged by them or licensed to other companies like Snap and Duolingo. The last mile of the distribution channel, however, includes not only established corporations but also small and medium-sized enterprises (SMEs).

According to a report from the Small Business and Entrepreneurship Council (SBEC), small businesses are using AI tools to improve efficiency and cut costs. As of 2023, 48% of small businesses had been using AI tools for more than a year, and 93% of small business owners agreed that AI tools offer cost-effective solutions that drive savings and improve profitability.

While some businesses opted to automate tasks and backend work, like staff scheduling in restaurants, others are using AI to automate repetitive tasks like email drafting, report summarization, and meeting minute taking, freeing up employees to do higher-value work.

Beyond integrating AI into their workflows, some startups are also building add-on services using LLMs. These include custom chatbots, document summarization tools, and localization (translation). If an SME has domain expertise, it can build an AI service tailored to its market (legal, medical, local language, etc.); LLMs lower the barrier to entry for doing so.
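The shape of such an add-on service is simple: the domain expertise lives in the prompt, and the LLM is licensed from a provider. A minimal sketch follows; `call_llm`, the prompt wording, and the legal example are all illustrative assumptions, with the provider call stubbed out so the structure runs without credentials:

```python
# Sketch of an SME add-on service: wrap a licensed LLM behind a
# domain-specific prompt. `call_llm` is a stand-in for whichever
# provider API the business licenses (OpenAI, Bedrock, Vertex AI, ...).
def call_llm(prompt):
    # Placeholder: a real implementation would call a provider SDK here.
    return f"[summary of {len(prompt)} chars of input]"

def summarize_contract(text, jurisdiction="UK"):
    """The domain expertise lives in the prompt, not the model."""
    prompt = (
        f"You are a paralegal familiar with {jurisdiction} law. "
        f"Summarize the key obligations in this contract:\n\n{text}"
    )
    return call_llm(prompt)
```

The value the SME adds is the wrapper, the prompt design, and the distribution into a niche market, not the model itself.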

The business of AI

AI is not just a technology; it is also the building block of products. How those products operate, and whether they succeed, will determine the commercial viability of investing billions into the tech.

And while companies chase ever-bigger models in the race to build the first AGI (both OpenAI and Meta have stated that goal), the technology’s future viability depends on a solid business plan. For that business model to succeed, companies will have to ensure their models not only save time but also reduce errors, because no CFO will keep paying $20–$30 per seat per month for an unreliable tool.
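That CFO math can be sketched in a few lines. Every input below is an illustrative assumption, not vendor data:

```python
# Back-of-the-envelope seat ROI check, in the spirit of "no CFO keeps
# paying $30/seat if the tool doesn't pay for itself". All numbers are
# illustrative assumptions.
def seat_roi(price_per_seat, hours_saved_per_month, loaded_hourly_rate,
             error_cost_per_month=0.0):
    """Return (net_value, roi_multiple) for one seat per month."""
    value = hours_saved_per_month * loaded_hourly_rate - error_cost_per_month
    net = value - price_per_seat
    return net, value / price_per_seat

# A $30 seat that saves 2 hours at a $45 loaded rate, minus $10/month
# of error cleanup, nets $50 per seat per month.
net, multiple = seat_roi(price_per_seat=30, hours_saved_per_month=2,
                         loaded_hourly_rate=45, error_cost_per_month=10)
```

Notice how sensitive the result is to `error_cost_per_month`: if unreliability pushes cleanup costs above the time saved, the multiple drops below 1 and the seat gets cut at renewal.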

Klarna’s story should be taken as a lesson for any business: build something that sustains over time rather than chasing unrealistic growth. Klarna tried replacing hundreds of customer-service jobs with AI to cut costs, but after quality slipped, it began rehiring human agents.

So, while AI companies continue to innovate and strengthen the building blocks, businesses that serve end consumers will have to ensure their products make a meaningful difference to users’ lives. In the end, just as electricity needed gadgets that solve real-world problems, LLMs need viable applications and products to be more than a computer crunching permutations and combinations.

Roko Pro Tip

💡 Think like a CFO. 

If you’re building or buying AI tools, follow the money.

Ask what pain they solve, what revenue they drive, and whether the ROI justifies the price.

Don’t just get wowed by features—get real about outcomes.

161,596 founders, investors and leaders read this.

Founders and leaders who read Open Source CEO end up with 2.3x the cerebral horsepower of those who don’t.

Ok, we cannot actually prove that, but we think it’s about right. What we do know is that 160k+ readers from Google, TikTok, OpenAI and Deel love our deep dive business content. Subscribe here to see what it’s all about.

*This is sponsored content

Prompt Of The Day

Prompt: “Design a freemium AI product for SMBs that doesn’t rely on a chatbot interface. What does it do, and how does it make money?”

Bite-Sized Brains

Tuesday Poll

🗳️ What’s the most sustainable AI business model?


Rate This Edition

What did you think of today's email?
