Marc Andreessen: AI Is Bigger Than the Internet
The a16z co-founder explains why AI revenue is growing faster than any tech wave in history, why tokens will get radically cheaper, and why small models chasing big models is the future.
How Marc Andreessen Views AI Economics and Market Dynamics
Marc Andreessen rarely does deep-dive AMAs on AI economics. This session is essential viewing for anyone building or investing in AI companies. His core thesis: we're three years into an 80-year revolution that finally delivered on neural networks, and the underlying economics—despite current high costs—are about to get dramatically better. The revenue growth he's seeing is unprecedented, and the cost deflation on tokens is faster than Moore's Law.
On the magnitude of the shift: "This is the biggest technological revolution of my life. The comps on this are things like the microprocessor and the steam engine and electricity... the wheel." Andreessen argues this isn't just another tech wave—it's a fundamental change in how intelligence gets deployed in the economy.
On the 80-year overnight success: "The first neural network paper was published in 1943. There's an interview with McCulloch on YouTube from like 1946—him in his beach house, not wearing a shirt, talking about computers built on the model of a human brain. That was the path not taken." The ChatGPT moment wasn't invention—it was the crystallization of 80 years of dormant research.
On revealed preferences: "If you run a survey of what American voters think about AI, they're all in a total panic. 'Oh my god, it's going to kill all the jobs.' If you watch the revealed preferences, they're all using AI." The gap between what people say and what they do tells the real adoption story.
On revenue velocity: "This new wave of AI companies is growing revenue—actual customer revenue, actual demand translated through to dollars showing up in bank accounts—at an absolutely unprecedented takeoff rate." And critically, companies are charging $200-300/month tiers for consumer AI—price points consumer subscription software has rarely commanded.
On cost deflation: "The price of AI is falling much faster than Moore's law. All of the inputs into AI on a per-unit basis, the costs are collapsing." This hyperdeflation drives elastic demand—cheaper tokens mean exponentially more usage.
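The deflation-plus-elasticity argument can be sketched with a toy calculation. The decline rates and the elasticity value below are illustrative assumptions, not figures from the interview; the point is only the shape of the argument: if unit token prices fall faster than Moore's Law and demand is price-elastic, usage (and even total spend) can grow while prices collapse.

```python
# Toy sketch of "hyperdeflation drives elastic demand".
# All numbers here are assumptions for illustration, not data from the talk.

def price_after(start_price: float, annual_decline: float, years: int) -> float:
    """Unit price after compounding an annual fractional decline."""
    return start_price * (1 - annual_decline) ** years

# Moore's Law read as a cost curve: cost roughly halves every 2 years (~29%/year).
moore = price_after(1.0, 0.29, 4)

# Assumed token hyperdeflation: a 75%/year drop in price per million tokens.
tokens = price_after(1.0, 0.75, 4)

print(f"Moore's-Law-style cost after 4 years: {moore:.3f}x")
print(f"Token cost after 4 years:             {tokens:.5f}x")

# Elastic demand: if a 1% price cut drives more than 1% extra usage
# (elasticity > 1), total spend rises even as unit prices collapse.
elasticity = 1.5  # assumed
usage_growth = (1.0 / tokens) ** elasticity  # stylized constant-elasticity demand
spend = tokens * usage_growth
print(f"Stylized usage multiple: {usage_growth:.0f}x; spend multiple: {spend:.1f}x")
```

Under these made-up inputs, unit cost falls ~250x in four years (versus ~4x for a Moore's-Law curve), yet stylized spend still grows, which is the mechanism behind "cheaper tokens mean exponentially more usage."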
On small models catching big models: "There's this Chinese company that produces Kimi—a reasoning model that basically replicates GPT-5 capabilities. It either runs on one MacBook or two MacBooks. That's another Tuesday, another huge advance." The pattern: frontier capability gets shrunk down to run locally within 6-12 months.
On the chip glut coming: "If you look at the entire history of the chip industry, shortages become gluts. Nvidia fully deserves the profits they're generating, but they're now so valuable it's the bat signal of all time to the rest of the chip industry." AMD, hyperscaler custom chips, Chinese chips—everyone is coming. AI chips will be cheap and plentiful in 5 years.
6 Insights From Marc Andreessen on AI Revenue and Costs
- Bigger than internet - AI deploys on the internet's carrier wave; you can download AI but couldn't download electricity or indoor plumbing
- 3 years into 80-year revolution - The neural network theory existed since 1943; ChatGPT was crystallization, not invention
- Revenue growth unprecedented - AI companies growing faster than any wave Andreessen has seen; $200-300/month consumer tiers working
- Tokens will get radically cheaper - Cost deflation faster than Moore's Law; elastic demand means cheaper drives more usage
- Small models chase big models - Frontier capability shrinks to laptop-scale in 6-12 months (see: Kimi replicating GPT-5)
- Chip glut coming - AMD, hyperscalers, and China all building; Nvidia's profits are the "bat signal" to compete
What This Means for AI Startups and Investors
Andreessen's framework is clarifying: the question isn't whether AI economics work—it's whether you understand the timeline. High costs today are temporary; token prices are in hyperdeflation. The real insight is structural: AI will likely mirror the computer industry with a small number of "god models" in data centers plus a cascade of smaller models running everywhere, including embedded systems. For organizations adopting AI, the implication is clear: the capabilities are real, the revenue is real, and the costs are about to collapse. The debate about whether this is a bubble misses the point—the question is positioning for a market that's still in its first inning.


