Technology and Innovation Community

  • 1.  AI and Data Centers (Goldman Sachs Insight)

    Posted 14-09-2025 23:02

    Latest insight from the Team @ $GS available through the link below:

    AI and Data Centers

    Includes brief overviews of:

    The rise in power demand from data centers
    What's in an AI data center?
    Chinese semiconductor investment to rise more than expected
    Construction of data centers likely to exceed office construction
    How much did US "hyperscale" tech companies invest in 2023?

    Data Center Demand


    ------------------------------
    Todor Kostov
    Director
    ------------------------------


  • 2.  RE: AI and Data Centers (Goldman Sachs Insight)

    Posted 07-10-2025 09:07

    Thanks Todor

    Another Bloomberg NEF article complementing the previous GS report:

    Power Demand Breakdown

    Key Insights:

    • Electricity demand from AI training and services is expected to quadruple by 2035, reaching about 1,600 TWh - roughly 4.4% of global power use.

    • If counted as a country, AI-focused data centers would rank fourth in electricity use, behind China, the US, and India.

    • The United States remains the largest market, with data-center demand in key hubs growing faster than that of electric vehicles, hydrogen, and other new technologies.

    • AI workloads are far more energy-intensive than traditional computing.

    • Data centers are scaling up: in 2010 most used under 5 MW; new sites now plan for hundreds of megawatts, with some approaching gigawatt scale.

    • Efficiency gains can't keep up: PUE (Power Usage Effectiveness) is expected to improve from 1.4 to 1.2 by 2030, but the growth in computing power outweighs these savings.

    • Short-term: growth is largely met by natural-gas-fired power, which is abundant and flexible in major US markets.

    • Long-term: operators face pressure from policymakers, net-zero targets, and customers to expand using cleaner energy sources.
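
    The "efficiency gains can't keep up" point can be checked with a quick back-of-the-envelope sketch (illustrative figures only): total facility energy scales with IT load times PUE, so even with the projected PUE improvement, a quadrupling of IT load still yields roughly a 3.4x rise in total demand.

    ```python
    # Facility energy = IT-load energy x PUE (Power Usage Effectiveness).
    pue_now, pue_2030 = 1.4, 1.2   # projected PUE improvement
    it_growth = 4.0                # IT-load energy quadrupling (illustrative)

    # Net growth in total facility energy despite the better PUE:
    facility_growth = it_growth * (pue_2030 / pue_now)
    print(f"Net facility-energy growth: {facility_growth:.2f}x")  # ~3.43x
    ```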



    ------------------------------
    Carlos Salas
    Portfolio Manager & Freelance Investment Research Consultant
    ------------------------------



  • 3.  RE: AI and Data Centers (Goldman Sachs Insight)

    Posted 07-10-2025 20:58

    Great info. Thanks Carlos. The focus going forward will be on ramping up energy capacity and compute. This on top of everything else.

    -Todor



    ------------------------------
    Todor Kostov
    Director
    ------------------------------



  • 4.  RE: AI and Data Centers (Goldman Sachs Insight)

    Posted 14-10-2025 09:47

    New article on this topic.

    Exec Summary:

    AI's energy intensity and data-center expansion are transforming electricity demand, with nuclear power emerging as a critical, premium-priced, low-carbon baseload option. Limited new nuclear capacity and persistent hyperscaler demand could sustain elevated contract premiums and drive multi-billion-dollar EBITDA growth for leading operators.

    Key Insights:

    • AI-driven power demand:
      Growth in artificial intelligence, particularly large language models (LLMs), could significantly increase U.S. electricity consumption by 11–26% by 2030, requiring 131–310 GW of new generation capacity to support an additional 345–815 TWh of demand.

    • Energy intensity:

      • Generative-AI queries consume up to 10× more energy than standard internet searches.

      • Nvidia Blackwell GPU: 1,200 watts vs. ~150 watts for a CPU.

      • Major investors include OpenAI, Amazon, Google, and Meta, each planning multi-billion-dollar IT infrastructure outlays.

    • AI Capex forecasts (BI Technology team):

      • Training scenario: Spending peaks in 2026 after a rapid 2023–26 phase.

      • Inference scenario: Capex grows through 2032, with accelerated growth from 2023–26 and slower gains thereafter.

    • Energy source trade-offs:

      • Nuclear: Highest capacity factor (90%) but most expensive ($12,500/kW).

      • Coal: 60%, $5,000/kW.

      • Gas (CCGT): 60%, $2,500/kW.

      • Wind: 35%, $1,500/kW.

      • Solar: 25%, $1,500/kW.

      • Renewables' intermittency requires storage or backup; fossil fuels face emissions constraints.

    • Nuclear premium pricing:

      • Data centers are paying $15–$25/MWh above market.

      • Example: Constellation Energy's $840 million 10-year VPPA with the U.S. General Services Administration covers 1 million MWh/year (~$80/MWh).

      • EBITDA uplift potential: Constellation +$1.8B/year, Vistra +$500M/year if half of nuclear capacity is contracted at midpoint premium.

    • Nuclear capacity ownership (MW):

      • Constellation 22,000, Vistra 6,500, PSEG 3,800, NextEra 2,300, Talen 2,200, Dominion 2,000.

    • Contract structures:

      • Shift from behind-the-meter (BTM) to front-of-the-meter (FTM) VPPAs, avoiding regulatory hurdles but adding ~$7/MWh in transmission costs.

      • BTM offers better load control, avoids interconnection delays, and may lower capex.

      • Example projects:

        • Microsoft–Constellation (835 MW), Crane restart in 2027.

        • Meta–Constellation (Clinton plant) post-subsidy in 2027.

        • Talen–AWS (Susquehanna) expanded from 960 MW → 1.9 GW, including 300 MW BTM.
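
    The headline numbers above hang together; here is a quick sanity check in Python (all inputs taken from the bullets above, outputs rounded):

    ```python
    HOURS = 8760  # hours in a year

    # Implied blended capacity factor of the 131-310 GW of new generation
    # needed to serve 345-815 TWh of extra demand:
    cf_low  = 345_000 / (131 * HOURS)   # TWh -> GWh over GW*h, ~0.30
    cf_high = 815_000 / (310 * HOURS)   # ~0.30

    # Capacity-adjusted build cost, $/kW of *delivered* (firm) power:
    nuclear = 12_500 / 0.90   # ~$13,900 per delivered kW
    gas     = 2_500 / 0.60    # ~$4,200 per delivered kW

    # Constellation-GSA VPPA: $840m over 10 years for 1m MWh/year:
    vppa_price = 840e6 / (10 * 1e6)     # $84/MWh, consistent with ~$80/MWh

    # EBITDA uplift: half of Constellation's 22,000 MW nuclear fleet
    # contracted at the $20/MWh midpoint premium, 90% capacity factor:
    mwh_per_year = 0.5 * 22_000 * HOURS * 0.90   # ~86.7m MWh/year
    uplift_bn = mwh_per_year * 20 / 1e9          # ~$1.7bn, vs the quoted +$1.8B
    print(f"CF {cf_low:.2f}-{cf_high:.2f}, VPPA ${vppa_price:.0f}/MWh, "
          f"uplift ${uplift_bn:.2f}bn/yr")
    ```

    The ~30% implied capacity factor suggests the forecast assumes a mix heavy in renewables and gas rather than pure baseload, which is consistent with the trade-off bullets above.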



    ------------------------------
    Carlos Salas
    Portfolio Manager & Freelance Investment Research Consultant
    ------------------------------



  • 5.  RE: AI and Data Centers (Goldman Sachs Insight)

    Posted 19-10-2025 21:28

    Great update Carlos.

    Below is a quick graph of even more optimistic CapEx forecasts for the Big 5 hyperscalers (source: GS, ZeroHedge).

    Hyperscaler CapEx



    ------------------------------
    Todor Kostov
    Director
    ------------------------------



  • 6.  RE: AI and Data Centers (Goldman Sachs Insight)

    Posted 24-11-2025 09:51

    AI power hunger drives record private equity deals, November 2025

    KEY TAKEAWAYS
    • The surging electricity demand from AI is leading to record private equity investments in utilities and renewable energy, with deals on track to exceed $150 billion.
    • Data center operators are moving towards clean energy due to net-zero commitments and new regulations.
    • The Bloomberg Clean Energy Index has returned 31% this year, while the Bloomberg 500 and the Bloomberg Magnificent indices have seen returns of 15% and 20% respectively.

    Original Article 



    ------------------------------
    Carlos Salas
    Portfolio Manager & Freelance Investment Research Consultant
    ------------------------------



  • 7.  RE: AI and Data Centers (Goldman Sachs Insight)

    Posted 6 days ago

    Todor and Carlos, you have a great discussion thread going here. I have a couple of maybe devil's-advocate questions to challenge the exponential projection of energy demand.

    1. Supply bottlenecks, i.e. the ability to build out supporting infrastructure: not just generating capacity or grid connection/balancing, but also the capacity of the telecoms network to carry what might be an exploding flow of data. I haven't seen much discussion of potential network bottlenecks.
    2. Do Moore's law and disruptive innovation upset what seem, at least superficially, to be linear progressions? If the power-to-compute ratio dramatically reduces power for the same AI activity, has this been factored in, i.e. the cost of power per unit of compute falling through chip innovation?
    3. Gen AI is focused on building the biggest AI sledgehammer, which favours a small number of well-funded majors. But what are the risks that smarter hardware and software architectures collapse the physical material and power needs of AI?
    4. I recall my first summer job, between school and uni, in a university computer services department: huge rooms of compute and tape decks whirring 24/7 for a then state-of-the-art computer that now likely has less capability than my slightly ageing iPhone.

    Any thoughts on whether the current scenarios have wargamed some of these historic IT lessons?



    ------------------------------
    David Manuel
    ------------------------------



  • 8.  RE: AI and Data Centers (Goldman Sachs Insight)

    Posted 3 days ago
    Edited by Todor Kostov 3 days ago

    Thanks David. 

    You've raised some very good and valid points to look at.

    I think all these forecasts for future energy consumption based on AI usage will change in real time as we go along since technology evolves 24/7.

    Firstly, each of the supply bottlenecks is being addressed by the big hyperscalers individually. As things stand today, there is no shortage of energy (speaking about the US solely), but there is a mismatch between where energy is produced and where it is consumed, i.e. a geographical and structural mismatch. One example is the grid interconnection queues: in many regions there is enough energy being produced, but new data centers are stuck in "waiting rooms" for years because the local utility hasn't built the substations or high-voltage lines needed to plug them in. Another is stranded power: there are places where renewable energy is actually "curtailed" (wasted) because the transmission lines are too narrow to carry it all to market. AI companies are now looking to build data centers directly at the source (near nuclear plants or hydroelectric dams, for example) to bypass the grid entirely. The shift toward "behind-the-meter" power reframes AI as not solely a software race but also a civil engineering race.

    Regarding telecoms network capacity, there is currently not enough "ready-to-use" dark fiber to meet projected AI demand. Much of the existing dark fiber, laid during the late 90s and early 2000s, does not provide the high-density, low-latency infrastructure that the AI "data explosion" requires. Infrastructure providers estimate that the US alone needs to nearly double its fiber route miles, from roughly 95,000 to 187,000, by 2029. Because they can't find enough high-quality dark fiber to lease, the hyperscalers have pivoted from being customers to being infrastructure owners, e.g. Project Waterworth, Meta's massive subsea cable project, a "build-to-own" strategy to ensure they aren't throttled by third-party capacity.

    Moreover, software and chip innovation continues as we speak. Just last week Google introduced TurboQuant, a set of advanced, theoretically grounded quantisation algorithms that enable massive compression for large language models and vector search engines (see also the post from Carlos here). While this is not a physical memory chip itself, it is a breakthrough quantisation algorithm that significantly reduces the energy and memory footprint of AI systems compared to traditional memory management solutions. By compressing the data stored in a computer's RAM, it allows existing chips to perform much more efficiently. Because TurboQuant compresses data by up to 6x (down to just 3 bits per value), there is up to 83% less data to move back and forth, and less data movement translates directly into lower electrical power consumption.
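
    To make the idea concrete, here is a minimal sketch of why storing 3-bit codes instead of 16-bit floats cuts data movement. This is a generic uniform quantiser, NOT the actual TurboQuant algorithm (which uses more sophisticated rounding schemes); it only illustrates the storage arithmetic behind the "up to 6x" figure.

    ```python
    import numpy as np

    # Generic 3-bit uniform quantisation of a weight vector (illustrative;
    # not TurboQuant itself, just the basic store-codes-not-floats idea).
    rng = np.random.default_rng(0)
    w = rng.standard_normal(4096).astype(np.float32)

    bits = 3
    levels = 2 ** bits                          # 8 representable values
    lo, hi = float(w.min()), float(w.max())
    scale = (hi - lo) / (levels - 1)
    codes = np.round((w - lo) / scale).astype(np.uint8)   # values 0..7
    w_hat = codes * scale + lo                            # dequantised weights

    compression = 16 / bits     # vs fp16/bf16 storage, ~5.3x ("up to 6x" quoted)
    max_err = float(np.abs(w - w_hat).max())    # bounded by scale / 2
    print(f"~{compression:.1f}x smaller, max abs error {max_err:.3f}")
    ```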

    Thirdly, it will be very hard to compete with the biggest 'majors' (hyperscalers, chip companies, LLM providers) going forward, due to the massive capital requirements and the intensifying race for top human talent. However, there are good examples of challengers designing smarter hardware chip architectures, e.g. Groq and Cerebras. In Groq's case (already acquired by NVIDIA through a $20B strategic 'licensing' agreement), the LPU (Language Processing Unit) they introduced delivers a significant reduction in energy consumption for AI inference compared to traditional GPUs and CPUs. The primary energy savings come from Groq's deterministic architecture and its reliance on on-chip SRAM rather than off-chip High Bandwidth Memory (HBM): the LPU is roughly 10x more efficient per token than a standard NVIDIA H100 GPU, though there is a density trade-off due to its low memory capacity (you need a rack of hundreds of LPUs, which are most energy-efficient in high-traffic scenarios where the chips are constantly processing tokens).

    Finally, it's always good to take lessons from the past and compare them with where we are today. I am pretty sure the innovative mind of the human being has virtually no limits, and the generations after ours will enjoy technological capabilities way beyond our imagination. As we all follow with excitement NASA's Artemis II mission, the first crewed flight to the Moon in more than 50 years, let's remind ourselves that there is more computing power in the smartwatches on our wrists today than in the computer that flew to the Moon aboard the Apollo spacecraft in 1969, the Apollo Guidance Computer (AGC).

    I hope this helps.

    - Todor



    ------------------------------
    Todor Kostov
    Director
    ------------------------------