The history and future of AI at Google, with Sundar Pichai
By Stripe
Categories: Product, Startup
Summary
Google invented Transformers to solve concrete product problems (improving translation, scaling speech-recognition inference), not as pure research, and immediately deployed them in Search via BERT and MUM for major quality gains. The real lesson: ChatGPT's dominance came from execution speed and luck with coding use cases, not technological superiority. Google had similar technology but faced higher product-quality bars and slower shipping cycles.
Key Takeaways
- Transformers were invented at Google to solve specific product constraints (scaling speech recognition inference for 2 billion users, improving translation), not as standalone research—this product-first approach enabled immediate deployment in Search.
- BERT and MUM drove some of the biggest search quality jumps in Google's history, yet remain underestimated because Google measures search quality 'religiously'—companies should instrument metrics obsessively to capture impact.
- Google had LaMDA, an early ChatGPT equivalent, ready roughly nine months before it shipped, but constrained it due to toxicity issues and internal RLHF limitations. Higher quality bars can be a competitive disadvantage in consumer internet, where speed and iteration matter.
- Consumer internet success requires accepting constant surprises from small teams prototyping rapidly (YouTube, Instagram, ChatGPT). Building organizational agility matters more than predicting winners, as evidenced by Google Video Search losing to YouTube.
- Latency has been Google's core product differentiator across Search, Gmail, Chrome, and Gemini. The challenge is balancing low latency with expanding capabilities as the frontier progresses, which requires constant architectural trade-offs.
Topics
- Transformer Architecture Product Strategy
- Search Quality Metrics and BERT/MUM
- Consumer Internet Speed vs. Quality Trade-offs
- AI Product Launch Velocity
- TPU Infrastructure and Inference Scaling
Transcript Excerpt
Sundar Pichai just passed a decade as CEO of Google. Alphabet is now not only one of the world's biggest tech companies, but a leader in the AI race, with plans to spend $175 billion in CapEx in 2026. Cheers. A bit of history that people talk about a lot in the context of Google and AI is the fact that Transformers were invented at Google, but then productized outside of Google with mostly ChatGPT and that style of product. How do you reflect on that now? I think it's actually worth talking abou...