Creativity Meets The Model
Plus: OpenAI’s profit test, Microsoft’s buzzwords, and Wikipedia fights back.
Here’s what’s on our plate today:
🧠 Adobe rewires creative tools for the AI era.
📰 Altman’s margins, Microsoft’s pitch, Wikipedia’s warning.
🧰 Remix with Firefly, sharpen prompts with PromptHero, build your own agent.
🗳️ Poll: What defines true creativity in the AI era?
Let’s dive in. No floaties needed…

Want to get the most out of ChatGPT?
ChatGPT is a superpower if you know how to use it correctly.
Discover how HubSpot's guide to AI can elevate both your productivity and creativity to get more things done.
Learn to automate tasks, enhance decision-making, and foster innovation with the power of AI.
*This is sponsored content

The Laboratory
How AI is redefining creativity tools
When the foundations of modern capitalist society were being laid, marketing emerged as a crucial method for businesses to differentiate their products and attract consumers.
In the early days, companies reached potential customers mainly through newspaper ads and printed flyers. As technology progressed, both the tools and the nature of this communication evolved dramatically before it eventually became a full-fledged industry in itself.
The digital revolution played a pivotal role in this transformation, turning marketing from a one-size-fits-all exercise into an industry built on data, personalization, and context.
Today, marketing is seen through the lens of creativity, where companies compete to make their products stand out and capture their customers' imagination. This creativity comes to life through a variety of tools that help enterprises communicate with consumers. And, as with most other aspects of human enterprise, here too, artificial intelligence is redefining the meaning of creative tools.
Adobe’s role in the creative revolution
For decades, Adobe has been at the forefront of the relationship between creative minds and technological advancements. The company’s actions and decisions over the past few years are reflective of wider changes in the market for creative tools.
Adobe recently announced its new AI assistants for Express and Photoshop that it says can help users with image creation and editing.
The announcement marks the company’s expansion of AI across its creative tools, along with assistants and features aimed at making design more intuitive and personalized.
In Adobe Express, the company is testing a dedicated AI mode that lets users generate images and designs through text prompts, while still allowing them to switch back to the traditional editing interface.
Meanwhile, Photoshop’s new assistant, currently in closed beta, sits in the sidebar and can understand layers, automatically select objects, create masks, and handle repetitive tasks like background removal.
Adobe is also experimenting with Project Moonlight, an AI assistant that could coordinate across different Adobe apps and connect with creators’ social profiles to better learn their style. It also plans to integrate Adobe Express with ChatGPT through OpenAI’s API so users can design directly within ChatGPT.
Beyond assistants, Adobe is adding AI-powered features across Creative Cloud, such as allowing Photoshop users to choose third-party models like Google’s Gemini or Black Forest Labs’ FLUX for generative fill, and adding object masking powered by AI in Premiere Pro for easier editing and color adjustments.
A legacy of reinvention
The shift in Adobe’s strategy didn’t happen overnight; it reflects the broader changes that made the company the tool of choice for most creators.
Adobe was founded in 1982 by John Warnock and Charles Geschke to revolutionize printing and publishing with an all-digital approach. Over the years, the company developed products like PostScript, a language that described how to render text and graphics on a page independently of the printer device.
That product helped spark part of the desktop publishing revolution, because suddenly, software and personal computers could drive high-quality print output.
In the 1990s, Adobe expanded from printing technology into creative software and document management, launching products like Photoshop and the PDF format, which became global standards for design and digital documents.
Through the 2000s and 2010s, the company transitioned from boxed software to cloud-based platforms, Creative Cloud, Document Cloud, and Experience Cloud, connecting creative, business, and marketing workflows.
By the late 2010s, Adobe’s Creative Cloud subscription had consolidated market leadership in professional editing (Photoshop, Illustrator, Premiere, Lightroom, Acrobat), a position reflected in steady double-digit Digital Media ARR growth into 2025. Even as competitors like Figma, Affinity, and Canva nibbled at the edges, Adobe remained the enterprise standard for high-end creative workflows.
Generative AI rewrites the playbook
All this changed when generative AI scrambled that order, making high-quality images, video, and audio available at prompt speed and inviting cheaper rivals and new workflows. Text-to-image and text-to-video systems threatened to unbundle the painstaking craft inside Adobe’s tools.
In March 2023, Adobe launched Firefly, an in-house family of image, vector, text-effects, and video models. Because Firefly is trained primarily on data Adobe owns or licenses, or that is in the public domain, the company says its outputs are commercially safe, meaning enterprises can use them without fear of infringing copyrights. This became a major selling point; corporate buyers want legal clarity, not creative chaos.
Lawsuits brought by creators against the companies building and training foundation models also made Firefly look like the safer choice.
Additionally, Adobe’s internal models run on its own infrastructure and serve as the backbone of AI features embedded throughout Creative Cloud, which means that when users employ these generative features, they’re not connecting to an OpenAI, Google, or Stability API; they’re hitting Adobe’s own model endpoints hosted within its ecosystem. For enterprise customers, Adobe extends this control further: it allows companies to train custom Firefly models on their own brand imagery and product libraries.
Those models remain isolated to that customer, and Adobe says the data used for training does not flow back into its main Firefly foundation models. In essence, enterprises can build brand-specific versions of Firefly without relinquishing data sovereignty.
Adobe's hybrid approach
However, despite having its own in-house suite of AI models and tools, Adobe’s strategy is not entirely isolationist.
As of mid-2024, the company opened Firefly to third-party model integration, beginning with systems from OpenAI, Google, and others. This doesn’t mean Adobe has become a renter in the conventional sense; it’s more like it built a gated neighborhood where other model developers can park their engines under Adobe’s rules.
You can stay inside the Firefly interface, use its credit system, rights management, and asset libraries, but choose which model you want to generate with. For creative teams, this means they can experiment with multiple aesthetic engines without leaving Adobe’s controlled environment.
That approach also reveals Adobe’s realism. The company recognizes that no single model will always be the best for every visual task. OpenAI’s DALL-E variants might outperform Firefly for surreal concepts. Google’s Imagen or Veo might excel at cinematic imagery or text-to-video.
Rather than pretending its own model can do it all, Adobe now positions itself as an orchestrator of models: a platform that keeps creative work compliant, brand-safe, and organized while letting users tap into other engines. It’s similar to how an operating system manages different apps without writing them all.
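To make the "orchestrator" pattern concrete, here is a minimal sketch of the idea: one interface routes a generation request to whichever model backend the user selects, while platform-level policy stays in a shared layer. All names and backends here are illustrative, not Adobe's actual API.

```python
# Hypothetical sketch of a multi-model orchestrator: the platform owns the
# routing and policy layer; the model engines are swappable backends.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class GenerationRequest:
    prompt: str
    model: str  # e.g. "firefly" or "partner-imagen" (illustrative names)


def firefly_backend(prompt: str) -> str:
    # Stand-in for a first-party model endpoint.
    return f"[firefly image for: {prompt}]"


def partner_backend(prompt: str) -> str:
    # Stand-in for a hosted third-party model endpoint.
    return f"[partner-model image for: {prompt}]"


BACKENDS: Dict[str, Callable[[str], str]] = {
    "firefly": firefly_backend,
    "partner-imagen": partner_backend,
}


def generate(req: GenerationRequest) -> str:
    # Platform-level checks (credits, rights management, audit logging)
    # would run here, regardless of which engine is chosen.
    if req.model not in BACKENDS:
        raise ValueError(f"unknown model: {req.model}")
    return BACKENDS[req.model](req.prompt)
```

The point of the pattern is that the user-facing interface, asset libraries, and governance never change; only the engine behind `generate` does.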
What it means for enterprises
For enterprise users, this dual model, both proprietary and partner-based, has tangible consequences. Using Adobe’s own Firefly models offers a clearer legal chain of custody for generated images and videos. Adobe explicitly indemnifies enterprise users of Firefly outputs, which means it assumes liability for copyright challenges if its model training turns out to have infringed someone’s work.
This is something neither OpenAI nor Midjourney currently promises at the same scale. But the moment an organization chooses a third-party model within Firefly, those guarantees weaken. Adobe notes that commercial-safety and indemnity apply only to its own models, not to the partner systems it hosts.
The legal exposure, therefore, shifts back to the enterprise’s internal governance, review processes, disclaimers, and brand compliance checks.
There’s also a deeper economic layer. Firefly’s credit system is how Adobe monetizes AI use. Every generated image, extended scene, or AI edit consumes credits, similar to how cloud computing works.
When you generate content through an integrated partner model, those credits may be priced differently or shared between Adobe and the partner. This architecture positions Adobe less as a software company and more as a content infrastructure provider, a subtle but important shift.
The model itself becomes just another service endpoint that Adobe can manage, meter, and monetize.
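The metering idea described above can be sketched in a few lines: each generative action debits a per-action cost from an account balance, much as cloud providers meter compute. The action names and costs below are invented for illustration and are not Adobe's actual pricing.

```python
# Illustrative credit-metering sketch: generative actions consume credits
# from a balance, so the platform can meter and monetize any model endpoint.
ACTION_COSTS = {"generate_image": 1, "extend_scene": 2, "ai_edit": 1}


class CreditAccount:
    def __init__(self, balance: int):
        self.balance = balance

    def charge(self, action: str) -> None:
        cost = ACTION_COSTS[action]  # hypothetical per-action pricing
        if cost > self.balance:
            raise RuntimeError("insufficient credits")
        self.balance -= cost


acct = CreditAccount(balance=5)
acct.charge("generate_image")  # balance is now 4
acct.charge("extend_scene")    # balance is now 2
```

Because every engine sits behind the same metering layer, a partner model is billed through the identical mechanism; only the cost table changes.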
Platform, not product: Adobe’s strategic shift
From a strategic point of view, this hybrid structure is defensive and opportunistic at once. Building in-house models secures Adobe’s long-term autonomy, making sure it does not have to rely entirely on someone else’s model infrastructure for core product functions.
But opening up to external models lets Adobe keep creative professionals inside its ecosystem even when they chase other engines’ strengths. It’s a “don’t lose them to another tab” philosophy. The user stays in Photoshop or Firefly, not in Midjourney’s Discord.
To the broader creative industry, the move signals something larger. Adobe is essentially turning Creative Cloud into a multi-model AI platform. The company’s historical strength has always been tool integration, linking photography, design, video, and document workflows.
Now it’s applying that logic to generative AI, giving enterprises a controlled, auditable, and extensible way to deploy AI imagery and video at scale. The result is that Adobe remains both gatekeeper and guide: owning the rails, but letting others run trains on them.
The future of creativity
In many ways, Adobe’s story mirrors the evolution of creativity itself, from manual craftsmanship to digital artistry, and now to algorithmic collaboration.
As generative AI blurs the line between tool and partner, Adobe’s challenge is to preserve the integrity of human imagination while expanding its possibilities. The company that once helped define desktop publishing is now trying to define the boundaries of machine-assisted creation. Whether it succeeds will shape not just the future of design software, but the meaning of creativity in an age when ideas can be generated as quickly as they are imagined.


3 Things Worth Trying
Remix with Firefly: Try Adobe’s text-to-image Firefly tool now with multi-model support inside Photoshop and Express.
Prompt like a pro: Test PromptHero’s free prompt builder to sharpen your generative art results across platforms.
Build your creative agent: Spin up a GPT with image editing skills to help with mockups, thumbnails, or branding tasks — faster than Slack back-and-forth.

Dive into the 100 Greatest Business Books Database!
If you're a leader hungry for wisdom, a manager eager to sharpen your edge, or a tech enthusiast aiming to stand on the shoulders of giants, prepare to be enlightened.
We've curated an exhaustive database of the top 100 business books of all time, as voted by legends like Steve Jobs, Warren Buffett, Jeff Bezos, and more.
And guess what? We've seamlessly integrated it into a trackable Notion checklist.
Carve out moments in your schedule to tackle these masterpieces one by one, and soon you'll be strategizing and executing like the pros.
*This is sponsored content

Quick Bits, No Fluff
Altman’s margin problem: Can OpenAI scale profits as fast as hype?
Microsoft’s AI pitch: “Humanist superintelligence” is the new buzzword.
Wikipedia draws the line: Pay for the API, stop scraping.

Meme Of The Day
Wednesday Poll
🗳️ What defines true creativity in the AI era?

Rate This Edition
What did you think of today's email?






