How AI Transforms Different Corners of Earth

Most AI investment flows to infrastructure giants while application-layer innovation gets starved. George Lee, co-head of Goldman Sachs Global Institute and former head of Technology M&A, breaks down why energy bottlenecks—not compute—now constrain AI scaling and how geopolitical fragmentation creates regulatory arbitrage opportunities. He reveals Goldman's framework for managing probabilistic systems alongside deterministic enterprise workflows, explains why acqui-hire deals exceeding traditional M&A valuations signal a fundamental shift in Silicon Valley's social contract, and identifies which global markets are positioning for AI sovereignty through abundant energy and capital resources.
About the speaker

George Lee

Goldman Sachs Global Institute

George Lee is Co-Head of the Goldman Sachs Global Institute.

Episode Chapters

  • 01:32: China's AI transparency problem

    Investment numbers from China are murky at best, likely excluding massive prior investments in robotics, data infrastructure, and energy that now power their AI advantage.

  • 02:54: Middle East rulers aren't tech tourists

    Dismissing Middle Eastern AI players as "long on money, short on knowledge" is a mistake—they combine abundant capital, energy, and decisive organizational speed that Western markets can't match.

  • 05:14: AI's canal and locks reality

    AI development isn't about the next model release—it's navigating sequential bottlenecks from silicon to data centers to energy to supply chains, with energy now the critical constraint.

  • 07:19: The ratepayer rebellion is coming

    Data centers will consume 8-12% of electricity by 2030, with wholesale prices already up 22% year-over-year—public utilities answering to ratepayers will become AI's biggest constraint.

  • 09:13: Flex demand, unlock capacity

    The US energy grid has 75-125 gigawatts of slack capacity during non-peak periods—smart AI workload management could tap this abundance without building new infrastructure.

  • 10:18: Latency tolerance changed everything

    Users demand sub-second search results but happily wait for AI responses—this tolerance for inference delay unlocks flexible energy consumption models that traditional computing never allowed.

  • 14:18: Probabilistic meets deterministic chaos

    Regulators built for deterministic systems now face probabilistic AI that gives different answers to identical inputs—a paradigm shift that requires entirely new oversight frameworks.

  • 17:57: Enterprise AI needs continuous sampling

    Unlike deterministic systems you test once and deploy, probabilistic AI requires continuous monitoring and tuning—think discrete manufacturing versus continuous oil refining.

  • 20:51: Global compliance fragmentation accelerating

    The dream of harmonized global AI standards is dead—builders must now navigate multiplying data sovereignty requirements that make global deployment increasingly complex.

  • 23:19: Acqui-hires exceed traditional M&A

    Individual researcher contracts now cost more than most M&A deals—when you're spending $100 billion in capex, paying millions for 1% optimization improvements actually makes financial sense.

  • 25:31: Silicon Valley's social contract breaks

    Multibillion-dollar acqui-hires that reward 40 people while abandoning 200+ employees destroy the risk-taking culture that built the Valley—innovation suffers when only the magic few get paid.

Episode Summary

  • The $109 Billion AI Reality Check: Why Your Infrastructure Play Is Already Dead

    The Brutal Math of AI Economics

    $109 billion. That's what US VCs dumped into private AI investment in 2024—12 times more than China, 24 times more than the UK. But here's what the pitch decks won't tell you: the real bottleneck isn't capital or compute anymore. George Lee, co-head of Goldman Sachs Global Institute and former head of Technology M&A, drops the uncomfortable truth: "The pinch point is really around energy and whether we, as a country, can meet the moment to provide sufficient power to enliven these advanced AI data centers."
    The numbers are staggering. Data center power consumption will grow 165% by 2030, jumping from 4% to potentially 12% of total electricity demand. One grid is already seeing wholesale electricity prices spike 22% year-over-year. While founders obsess over model releases and wrapper plays, the infrastructure reality is about to steamroll anyone who hasn't secured their energy strategy.
  • The Post-Check Infrastructure Playbook

    The Canal and Locks Reality

    Forget the linear scaling assumptions in your investor deck. Lee reveals the actual progression: "It went from a singular bottleneck vision to a canal where you're traversing a series of locks." First silicon shortage, then data center capacity, now energy, next supply chain, then regulation. Each solved bottleneck reveals the next constraint. Smart founders are mapping this entire canal system, not just celebrating their Series A.
    The Flex Computing Arbitrage

    Here's the unfair advantage nobody's talking about: US energy infrastructure is built for peak demand—those 95-degree days in Dallas. The rest of the time? Massive slack capacity. Lee points to 75-125 gigawatts of potential capacity through demand-flexible AI workloads. Google just inked deals with utilities to flex their compute demand in exchange for cheaper, more abundant electrons during off-peak hours.
    The operational insight: Build your AI systems for curtailment from day one. Training runs, rejection sampling, reinforcement learning—all can be snapshotted and shifted. Your users already tolerate latency in AI that would kill them in search. Lee notes we're "not as latency sensitive" with chatbots—leverage that tolerance for infrastructure arbitrage.
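The snapshot-and-shift idea can be sketched as a toy checkpoint-and-resume loop. Everything here is a hypothetical stand-in: `is_curtailed` for a real utility demand-response signal, the pickle file for a real training checkpoint, and the loss decrement for an actual optimizer step.

```python
import pickle

def train(state, total_steps, is_curtailed, ckpt="ckpt.pkl"):
    """Toy training loop that snapshots and pauses when the grid signals curtailment."""
    for step in range(state["step"], total_steps):
        if is_curtailed(step):
            with open(ckpt, "wb") as f:
                pickle.dump(state, f)   # snapshot so the run can shift off-peak
            return state, False          # paused, not finished
        state["loss"] -= 0.1             # stand-in for one optimizer step
        state["step"] = step + 1
    return state, True                   # finished

# Peak hours: the (hypothetical) utility signal asks us to pause at step 3.
state, done = train({"step": 0, "loss": 1.0}, 6, lambda s: s == 3)
# Off-peak: reload the snapshot and finish the remaining steps cheaply.
with open("ckpt.pkl", "rb") as f:
    state, done = train(pickle.load(f), 6, lambda s: False)
```

The point of the sketch is only that pausable, resumable workloads turn energy timing into a free parameter—the part of the stack Lee argues users' latency tolerance already permits.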
  • The Geopolitical Chess Game Most Founders Ignore

    The China Paradox

    That 12-to-1 spending advantage over China? Lee calls BS on the transparency: "One wonders whether that includes prior investments in robotics, data, and drones." The real shock: authoritarian China is leading in open-source AI. They're using open models to project power globally, especially in the Global South, offering an alternative to Western tech stacks.
    The Middle East Dark Horse

    Dismiss Middle Eastern players as "long on money, short on knowledge" at your peril. Lee spent serious time there and found "conviction, expertise, urgency, and ambition" backed by three unfair advantages: abundant capital, abundant energy, and societies that can organize rapidly around priorities. While you're pitching Sand Hill Road, they're building the infrastructure for the next industrial revolution.
  • The Enterprise Reality Check

    Non-Determinism Isn't a Bug—It's Your Business Model

    Stop trying to force probabilistic systems into deterministic workflows. Lee's insight: "These machines are kind of the most human of our creations. They inherit many of our foibles." The winners will build systems that embrace this reality, not fight it. Financial institutions already get this—they've been wrestling with probabilistic models for years.
    The operational framework: Think continuous manufacturing, not discrete. You're not shipping code; you're managing a living system that requires constant sampling, testing, and tuning. Build your entire operational stack around this assumption or get steamrolled by those who do.
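The continuous-sampling framing can be sketched in a few lines. `make_flaky` is a hypothetical stand-in for an LLM-backed step scored against an output spec; the shape of the check—sample repeatedly, track a pass rate, alert on a threshold—is the contrast with test-once-and-deploy.

```python
import random

def sample_and_monitor(model, prompts, n=200, pass_threshold=0.9, seed=0):
    """Continuously sample the probabilistic system and track its pass rate,
    rather than testing once at deploy time like deterministic code."""
    rng = random.Random(seed)
    passes = sum(model(rng.choice(prompts)) for _ in range(n))
    rate = passes / n
    return rate, rate >= pass_threshold

def make_flaky(fail_every=20):
    """Hypothetical stand-in: meets its output spec on all but every
    `fail_every`-th call, mimicking a model that is right most of the time."""
    calls = {"n": 0}
    def model(prompt):
        calls["n"] += 1
        return 0 if calls["n"] % fail_every == 0 else 1  # 1 = output met spec
    return model

rate, ok = sample_and_monitor(make_flaky(), ["summarize", "classify"])
```

Run on a schedule against production traffic, this is the "continuous oil refining" posture: the check never finishes, it just keeps sampling.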
  • The M&A Endgame Nobody Wants to Discuss

    The acqui-hire deals exceeding traditional M&A valuations? That's just the opening move. Lee sees the real action coming in the application layer consolidation. But here's the kicker: platform shifts historically don't favor incumbents. This one's different because of resource intensity—advantage Microsoft, Oracle, Amazon.
    The strategic play: Pick your geography carefully. Lee warns that harmonized global AI standards are rolling back. You'll face the same data locality nightmares, multiplied. Some giants are already pulling out of entire geographies rather than comply. Build for global scale from day one or accept you're building a regional player.
  • The Unfair Advantage Synthesis

    The $109 billion flooding into AI is chasing the wrong constraints. While everyone fights over compute and models, the real moats are forming around energy access, flexible infrastructure, and geopolitical positioning. Your AI strategy needs three pillars: demand-flexible architecture that can arbitrage energy markets, systems built for probabilistic reality not deterministic fantasy, and a clear thesis on which geography you'll dominate—because global harmony is dead.
    The brutal truth? Most AI startups are building on quicksand, ignoring the infrastructure realities that will determine winners. The founders who survive won't be the ones with the best models—they'll be the ones who understood the canal and locks journey and positioned accordingly. Stop obsessing over wrapper features and start securing your energy strategy. The infrastructure wars have already begun.