Thanks David.
You've raised some very good and valid points.
I think all these forecasts of future energy consumption driven by AI usage will keep being revised in real time as we go along, since the technology evolves 24/7.
Firstly, the supply bottlenecks are being addressed by the big hyperscalers individually. As we stand today there is no shortage of energy (speaking about the US solely), but there is a mismatch between where energy is produced and where it is consumed, i.e. a geographical and structural mismatch. One example is the grid interconnection queues: in many regions there is enough energy being produced, but new data centers are stuck in "waiting rooms" for years because the local utility hasn't built the substations or high-voltage lines needed to plug them in. Another is stranded power: there are places where renewable energy is actually "curtailed" (wasted) because the transmission lines are too narrow to carry it all to market. AI companies are now looking to build data centers directly at the source (near nuclear plants or hydroelectric dams) to bypass the grid entirely. This shift toward "behind-the-meter" power changes the narrative: AI is not solely a software race but also a civil engineering race.

Regarding telecoms network capacity, there is currently not enough "ready-to-use" dark fiber to meet projected AI demand. Much of the existing dark fiber, laid in the late 90s and early 2000s, does not provide the high-density, low-latency infrastructure that the AI "data explosion" requires. Infrastructure providers estimate that the US alone needs to nearly double its fiber route miles, from roughly 95,000 to 187,000 miles, by 2029. Because they can't find enough high-quality dark fiber to lease, the hyperscalers have pivoted from being customers to being infrastructure owners, e.g. Project Waterworth: Meta's massive subsea cable project is a "build-to-own" strategy to ensure they aren't throttled by third-party capacity.
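Just to put a number on how aggressive that fiber build-out would be, here is a quick sanity check of the implied annual growth rate. The 2025 baseline year is my assumption; the source only gives the 2029 target.

```python
# Back-of-envelope check on the cited fiber build-out figures.
# Assumption (not in the source): the ~95,000-mile figure is a 2025 baseline.
current_miles = 95_000
target_miles = 187_000
years = 2029 - 2025

# Implied compound annual growth rate to reach the 2029 target
cagr = (target_miles / current_miles) ** (1 / years) - 1
print(f"Implied annual growth in fiber route miles: {cagr:.1%}")
```

That works out to roughly 18% per year, i.e. sustained build-out at a pace the industry has not seen since the dot-com era.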
Moreover, software and chip innovation continues as we speak. Just last week Google introduced a set of advanced, theoretically grounded quantisation algorithms that enable massive compression for large language models and vector search engines: TurboQuant (see also post from Carlos here). While this is not a physical memory chip, it is a breakthrough quantisation algorithm that significantly reduces the energy and memory footprint of AI systems compared with traditional memory management solutions. By compressing the data held in a computer's RAM, it allows existing chips to work far more efficiently. Because TurboQuant compresses data by up to 6x (down to just 3 bits per value), there is roughly 83% less data to move back and forth, and less data movement translates directly into lower electrical power consumption.
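To make the compression arithmetic concrete, here is a minimal sketch of plain uniform scalar quantisation to 3 bits per value. This illustrates the general idea only; it is not Google's actual TurboQuant algorithm, which uses far more sophisticated machinery.

```python
import numpy as np

# Toy "model weights": one million standard-normal float32 values
rng = np.random.default_rng(0)
weights = rng.standard_normal(1_000_000).astype(np.float32)

bits = 3
half_levels = 2 ** (bits - 1)                       # 4 -> integer codes in [-4, 3]
scale = float(np.abs(weights).max()) / half_levels  # map the value range onto the codes

# Uniform symmetric quantisation: each float becomes a tiny integer code
codes = np.clip(np.round(weights / scale), -half_levels, half_levels - 1).astype(np.int8)
dequant = codes.astype(np.float32) * scale          # approximate reconstruction

# Memory/traffic saving relative to a 16-bit (half-precision) baseline
saving_vs_fp16 = 1 - bits / 16
print(f"bits per value: {bits}, data reduction vs 16-bit: {saving_vs_fp16:.0%}")
```

From a 16-bit baseline, 3 bits per value is an ~81% reduction in data moved; the 6x / 83% figures cited above presumably reflect a slightly different baseline or per-block overhead.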
Thirdly, it will be very hard to compete with the biggest 'majors' (hyperscalers, chip companies, LLM providers) going forward, due to the massive capital requirements and the intensifying race for top human talent. However, there are good examples of challengers designing smarter hardware chip architectures, e.g. Groq and Cerebras. In Groq's case (already acquired by NVIDIA through a $20B strategic 'licensing' agreement), the LPU (Language Processing Unit) they introduced delivers a significant reduction in energy consumption for AI inference compared with traditional GPUs and CPUs. The primary savings come from Groq's deterministic architecture and its reliance on on-chip SRAM rather than off-chip High Bandwidth Memory (HBM): the LPU is roughly 10x more efficient per token than a standard NVIDIA H100 GPU, but there is a density trade-off due to its low memory capacity (you need a rack of hundreds of LPUs, which are most energy-efficient in high-traffic scenarios where the chips are constantly processing tokens).
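To see where the SRAM-vs-HBM savings come from, a back-of-envelope calculation on data movement alone helps. The energy-per-bit figures and the model size below are my own illustrative, order-of-magnitude assumptions, not Groq or NVIDIA specifications.

```python
# Assumed (illustrative) energy costs of moving one bit of data.
# Off-chip memory access is far more expensive than on-chip access.
HBM_PJ_PER_BIT = 4.0    # off-chip HBM access, picojoules per bit (assumption)
SRAM_PJ_PER_BIT = 0.1   # on-chip SRAM access, picojoules per bit (assumption)

# Simplifying assumption: generating one token streams the full weights
# of a 7B-parameter model stored at 8 bits per weight.
bits_moved = 7e9 * 8

hbm_joules = bits_moved * HBM_PJ_PER_BIT * 1e-12
sram_joules = bits_moved * SRAM_PJ_PER_BIT * 1e-12
print(f"HBM:  {hbm_joules:.4f} J/token of memory traffic")
print(f"SRAM: {sram_joules:.4f} J/token ({hbm_joules / sram_joules:.0f}x less)")
```

This isolates data movement only; the ratio comes out larger than the ~10x per-token figure cited above because in a real system compute and other overheads take a substantial share of the energy budget.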
Finally, it's always good to take lessons from the past and compare them with where we are today in technology and its solutions. I am quite sure the innovative human mind has virtually no limits, and the generations after ours will enjoy technological capabilities far beyond our imagination. As we all follow with excitement NASA's Artemis II mission to the Moon, the first in more than 50 years, let's remind ourselves that there is more computing power in the smartwatches on our wrists today than in the computer that flew to the Moon in the Apollo spacecraft (the Apollo Guidance Computer, AGC) in 1969.
I hope this helps.
- Todor
------------------------------
Todor Kostov
Director
------------------------------
Original Message:
Sent: 03-04-2026 16:03
From: David Manuel
Subject: AI and Data Centers (Goldman Sachs Insight)
Todor and Carlos, you have a great discussion thread going here. I have a couple of perhaps devil's advocate questions to challenge the exponential projection of energy demand.
- Supply bottlenecks, i.e. the ability to build out supporting infrastructure: not just generating capacity or grid connection/balancing, but also the ability of the telecoms network to carry what might be an exploding flow of data. I haven't seen much discussion of potential network bottlenecks.
- Do Moore's law and innovation disruption upset what seem, at least superficially, to be linear progressions? If the power:compute ratio dramatically reduces power for the same AI activity, has this been factored in, i.e. power cost per unit of compute falling through chip innovation?
- Gen AI is focused on building the biggest AI sledgehammer, which favours a small number of well-funded majors, but what are the risks that smarter hardware and software architectures collapse the physical material and power needs of AI?
- I recall my first summer job between school and uni, working in a university computer services department with huge rooms of compute and tape decks whirring 24/7 for a then state-of-the-art computer that now likely has less capability than my slightly ageing iPhone.
Any thoughts on whether the current scenarios have wargamed some of these historic IT lessons?
------------------------------
David Manuel
Original Message:
Sent: 24-11-2025 09:51
From: Carlos Salas
Subject: AI and Data Centers (Goldman Sachs Insight)
AI power hunger drives record private equity deals, November 2025
KEY TAKEAWAYS
- The surging electricity demand from AI is leading to record private equity investments in utilities and renewable energy, with deals on track to exceed $150 billion.
- Data center operators are moving towards clean energy due to net-zero commitments and new regulations.
- The Bloomberg Clean Energy Index has returned 31% this year, while the Bloomberg 500 and the Bloomberg Magnificent indices have seen returns of 15% and 20% respectively.
Original Article
------------------------------
Carlos Salas
Portfolio Manager & Freelance Investment Research Consultant
Original Message:
Sent: 14-09-2025 23:02
From: Todor Kostov
Subject: AI and Data Centers (Goldman Sachs Insight)
Latest insight from the Team @ $GS available through the link below:
AI and Data Centers
Includes brief overviews of:
▪ The rise in power demand from data centers
▪ What's in an AI data center?
▪ Chinese semiconductor investment to rise more than expected
▪ Construction of data centers likely to exceed office construction
▪ How much did US "hyperscale" tech companies invest in 2023?
------------------------------
Todor Kostov
Director
------------------------------