OpenAI's Greg Brockman: Why Human Attention Is the New Bottleneck

By Sequoia Capital

Categories: VC, Startup

Summary

OpenAI's Greg Brockman argues that human attention, not compute, is becoming the critical bottleneck in AI. Even so, OpenAI remains compute-constrained and is racing to build capacity faster than demand grows, forcing founders to rethink how they compete beyond raw infrastructure. (The 1.6%-of-global-GDP figure cited in the conversation refers to Stripe's payment volume, not OpenAI.)

Key Takeaways

  1. OpenAI operates on a simple margin-based model: buy/rent compute, resell at margin. Scale aggressively as long as margins stay positive since demand for AI solutions is effectively unlimited.
  2. Compute scarcity is real and persistent. When ChatGPT launched, Brockman told his team to buy "all of it"—and they have still never had enough. Spare GPU availability is projected to round to zero by 2026.
  3. Scaling laws remain empirically true with no visible wall—neural networks from the 1940s paired with modern compute continue showing predictable capability gains, but innovation happens through architectural tweaks and paradigm shifts, not just adding compute.
  4. OpenAI claims to be ~80% toward AGI functionally, with models now outperforming humans at code writing. GPT-5.4 is already better than most developers at software engineering tasks, signaling the speed of capability gains.
  5. Constant micro-innovations matter as much as major breakthroughs—even small changes to data formatting can be 'a very big deal.' This suggests founders should focus on systems-level optimization, not just algorithmic leaps.

Transcript Excerpt

So, Greg, uh, thank you for coming back here. Um, I don't think we ever charge you rent. So, uh, maybe we'll send you an invoice later. But Greg, you've been part of, like, two really spectacular companies: Stripe, as employee number four and then the first CTO. I just recently heard that they process 1.6 billion—sorry, 1.6%—of the global GDP. You must be proud of that.

>> That's amazing.

You must be even more proud of the fact that OpenAI has almost a billion, or maybe more than a billion, uh, i...