The race to zero: how AI will become a zero-cost commodity
Roughly a week ago, Meta released LLaMA (Large Language Model Meta AI), an open-source large language model intended for research use.
Let me translate that: Meta built a better copy of OpenAI’s models and gave it away for free. It beats GPT-3 (OpenAI’s flagship model, the basis for ChatGPT) on several benchmarks, and it’s much smaller, which makes it easier and cheaper to use. It can even run on a single GPU.
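To make the “single GPU” point concrete, here is a minimal sketch of what loading the smallest LLaMA variant could look like, assuming you have obtained the weights through Meta’s research access program and converted them to the Hugging Face format; the local path below is a placeholder, not an official download.

```python
# Minimal sketch: running a ~7B-parameter LLaMA checkpoint on one GPU.
# Assumes the weights were obtained via Meta's research release and
# converted to the Hugging Face format; the path is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./llama-7b-hf"  # placeholder: local, converted checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # roughly 14 GB in half precision
    device_map="auto",          # requires the `accelerate` package
)

prompt = "AI is the new electricity because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```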
Why did Meta do that?
- To attract new talent. AI talent is scarce, and being seen as “cool” matters a lot when you are trying to hire it. OpenAI has poached hundreds of people from other tech companies.
- To speed up research. Instead of finding issues on its own, Meta can now count on the entire AI research community.
- Because they can. Meta has to develop this kind of technology anyway and doesn’t have a cloud business to protect like Microsoft or Google.
The last point is the most interesting to me. AI is the next technological revolution and there’s obviously a ton of money to be made from building the “foundational models” that will power it.
But…
These models are… foundational. They’re like electricity or steam power: general-purpose technologies that can power countless products. Can you patent and protect electricity? I don’t think so. This means two things:
- Companies that want to make money by commercializing foundational AI models will have little differentiation. At some point the models will all be so good that no one will be able to claim “buy mine, it’s better”, just as no one sells “better electricity”. The only claim left will be “buy mine, it’s cheaper”.
- Companies that don’t want to make money selling foundational AI models will release them for free, because they can. Others will take those models and turn them into products and services that are easy to use.
The only possible outcome of these two forces is that competition plus open source will turn AI into an almost-free commodity.
We are already seeing this phenomenon. Last week’s “AI news” section of my Tech Pizza newsletter featured:
- 3 new major products leveraging conversational AI (Spotify, Snapchat, Notion)
- A new generative AI product team at Meta (expect Instagram, Facebook, Messenger, and WhatsApp to join the list above)
- Coca-Cola integrating generative AI, through the Bain x OpenAI partnership, both for internal use and for external customer engagement
Bear in mind: Tech Pizza is a weekly newsletter. All of this happened in one week.
Andrew Ng famously said some years ago that “AI is the new electricity”. It was, and still is, an exciting message that inspired many (me included). Let’s remember that electricity is a commodity we take for granted today. My bet is that we’ll take generative AI for granted in less time than we think.