AI Explained

What Are Data Centres and Why Is Everyone Spending Trillions Building Them?

Jennifer T.R. · Editor in Chief, Stronk Blog · 8 April 2026 · 10 min read

You have probably seen the headlines: billions of dollars pouring into data centre construction, governments scrambling to build power plants, and tech companies buying up land at a pace not seen since the railway boom. Here is what is actually happening and why it matters.

What is a data centre?

A data centre is a building filled with computers. That is the simplest explanation, and it is accurate.

More specifically, a data centre is a purpose-built facility that houses thousands (sometimes hundreds of thousands) of servers — powerful computers that store data and run applications. When you send an email through Gmail, stream a video on YouTube, or ask ChatGPT a question, the actual processing happens on servers in a data centre somewhere in the world.

What is inside a data centre?

Servers — rack-mounted computers, stacked in rows, running 24/7. A large data centre might contain 100,000 or more servers.
Networking equipment — the switches and routers that connect everything and move data in and out at enormous speeds.
Storage systems — hard drives and solid-state drives that hold the actual data.
Cooling systems — a surprisingly large part of the operation. Servers generate enormous heat, and cooling them requires industrial air conditioning, liquid cooling systems, or in some cases submerging servers in specialised cooling fluid.
Power infrastructure — uninterruptible power supplies (UPS), backup generators, and connections to the electrical grid. A large data centre can consume as much electricity as a small city.
Security — physical security (fences, cameras, biometric access) and cybersecurity systems.
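The "small city" comparison can be sanity-checked with rough arithmetic. The server count, per-server power draw, PUE (power usage effectiveness, the ratio of total facility power to IT power), and per-home figure below are all illustrative assumptions, not measurements from any operator:

```python
# Rough sizing of a large data centre's power draw.
# All inputs are illustrative assumptions.
servers = 100_000
watts_per_server = 800   # assumed average draw per server, in watts
pue = 1.3                # power usage effectiveness: total power / IT power

it_power_mw = servers * watts_per_server / 1e6   # IT load in megawatts
total_power_mw = it_power_mw * pue               # including cooling and overhead

homes_equiv = total_power_mw * 1e6 / 1_200       # assumed ~1.2 kW average per home
print(f"{total_power_mw:.0f} MW, roughly {homes_equiv:,.0f} homes")
# prints "104 MW, roughly 86,667 homes"
```

On these assumptions, a single large facility draws about as much power as the homes of a mid-sized city — which is why cooling and power infrastructure dominate the facility design.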

How big are they?

A small data centre might occupy a single room. A "hyperscale" data centre — the kind built by Google, Microsoft, Amazon, and Meta — can cover the area of several football fields and cost billions of dollars to construct. The largest facilities under construction today are measured in gigawatts of power capacity, which is the same unit used to describe power plants.

Why is everyone building them?

The short answer: AI.

The longer answer involves understanding why AI requires so much more computing power than traditional internet services.

The old model: serving web pages

For the past 25 years, data centres primarily served web pages, stored files, and processed transactions. These tasks are computationally light. A single server can handle thousands of web page requests per second. The infrastructure scaled gradually alongside internet adoption.

The new model: running AI

AI — specifically large language models like GPT-4o, Claude, and Gemini — requires a fundamentally different kind of computing. Training a large AI model requires thousands of specialised GPUs running simultaneously for weeks or months. Running that model (called "inference") requires significant GPU capacity for every single query.

When you ask ChatGPT a question, it does not look up an answer in a database. It performs billions of mathematical calculations in real time to generate a response. Multiply that by hundreds of millions of users, and you begin to understand the scale of compute required.
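The scale of that real-time arithmetic can be sketched with a back-of-envelope estimate. A common rule of thumb puts inference cost at roughly two floating-point operations per model parameter per generated token; the model size, response length, user count, and query rate below are illustrative assumptions, not figures published by any provider:

```python
# Back-of-envelope estimate of daily inference compute.
# All inputs are illustrative assumptions, not published figures.
params = 200e9               # assumed model size: 200 billion parameters
flops_per_token = 2 * params # ~2 FLOPs per parameter per token (rule of thumb)
tokens_per_reply = 500       # assumed average response length
daily_users = 100e6          # "hundreds of millions of users"
queries_per_user = 2         # assumed queries per user per day

daily_flops = flops_per_token * tokens_per_reply * daily_users * queries_per_user
print(f"{daily_flops:.1e} FLOPs per day")  # prints "4.0e+22 FLOPs per day"
```

Tens of sextillions of operations per day is the kind of workload that only warehouse-scale GPU clusters can serve, which is the direct link between chatbots and construction cranes.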

The numbers are staggering

According to McKinsey, an estimated $7 trillion in capital expenditure will go toward global data centre infrastructure by 2030, with over 40% of that spending in the United States.

BloombergNEF reports that the AI data centre buildout is advancing at full speed, with the 14 largest publicly traded data centre operators planning to spend close to $750 billion this year alone, up from $450 billion last year.

The American Edge Project found that the United States currently has 4,149 active data centres, with 2,788 more announced or under construction. Building those facilities will create roughly 4.7 million temporary construction jobs.

Who is building them?

The biggest spenders are the "hyperscalers" — companies that operate at massive scale:

Microsoft — building data centres globally to support Azure cloud services and its partnership with OpenAI
Google — expanding capacity for its AI services, Google Cloud, and the Gemini model family
Amazon (AWS) — the world's largest cloud provider, investing heavily in AI-optimised infrastructure
Meta — building AI infrastructure to support its social media platforms, AI research, and upcoming products
xAI (Elon Musk) — constructing what it calls the world's largest AI training facility

In January 2026, a consortium of tech companies committed up to $500 billion to build data centres across the US through the Stargate initiative.

The energy problem

This is where the data centre story becomes an energy story — and potentially a crisis.

How much power does a data centre use?

A traditional data centre might use 10 to 50 megawatts of electricity. A hyperscale AI data centre can use 1,000 megawatts (1 gigawatt) or more — the same output as a large power station.

According to Allianz Commercial's research, the global data centre industry's power demand is growing so fast that it is straining electrical grids in multiple countries. Over 100 gigawatts of data centre capacity is currently planned globally.

To put 100 gigawatts in perspective: that is roughly equivalent to the entire electricity consumption of the United Kingdom.

Where does the power come from?

This is the multi-trillion dollar question. The options being pursued:

Natural gas — the fastest to deploy, but produces carbon emissions.
Solar and wind — many data centres are signing long-term renewable energy contracts. Microsoft, Google, and Amazon are among the largest corporate buyers of renewable energy globally.
Nuclear power — both traditional plants and emerging small modular reactors (SMRs). Microsoft signed a deal to restart the Three Mile Island nuclear plant specifically to power its data centres, and Google and Amazon have signed nuclear power agreements as well.
Grid upgrades — in many regions, the electrical grid itself needs billions of dollars in upgrades to handle the new load. Transmission lines built decades ago were not designed for this level of demand.

According to CSG Talent's data centre report, the energy question is now the single biggest constraint on data centre construction. It is no longer a question of whether companies want to build — it is whether the electrical infrastructure exists to power what they are building.

The Australian picture

Australia is part of this global buildout. Major data centre projects are underway or planned in Sydney, Melbourne, and other cities, driven by:

Growing cloud adoption among Australian businesses
Data sovereignty requirements — Australian privacy regulations and government contracts often require data to be stored on Australian soil
Asia-Pacific demand — Australia is a regional hub for cloud services serving Southeast Asia and the Pacific

The Australian data centre market is valued in the billions and growing rapidly, with both international operators (AWS, Microsoft Azure, Google Cloud) and domestic players expanding capacity.

What does this mean for business owners?

The data centre boom affects your business in several concrete ways:

1. Cloud services will keep getting better. More data centre capacity means more computing power available, which translates to faster, more capable AI models and cloud services. The AI agents you use will improve over time as the infrastructure supporting them expands.

2. Pricing dynamics are complex. More capacity should eventually drive prices down through competition. But the enormous energy costs of AI computing create a floor. The net effect so far has been falling AI inference costs — the price of each API call has dropped significantly over the past two years — but energy costs could slow that trend.
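For budgeting purposes, the arithmetic behind API pricing is simple. The per-token prices and usage figures below are hypothetical placeholders, not any provider's actual rates; the point is the shape of the calculation, which you can rerun with your own vendor's price sheet:

```python
# Illustrative monthly API cost at hypothetical per-token prices.
# None of these numbers are real provider rates.
price_per_1m_input = 2.50    # assumed $ per million input tokens
price_per_1m_output = 10.00  # assumed $ per million output tokens

calls_per_month = 50_000     # assumed usage of an AI agent
input_tokens = 1_000         # assumed prompt length per call
output_tokens = 400          # assumed response length per call

monthly_cost = calls_per_month * (
    input_tokens * price_per_1m_input / 1e6
    + output_tokens * price_per_1m_output / 1e6
)
print(f"${monthly_cost:,.2f} per month")  # prints "$325.00 per month"
```

Because cost scales linearly with both call volume and token counts, trimming prompt length or caching repeated answers flows straight through to the bill.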

3. Local computing has strategic value. Running an AI agent on a Mac Mini in your office means you are not competing with millions of other users for data centre capacity. Your agent's performance is not affected by demand spikes, outages, or pricing changes in the cloud market.

4. The environmental dimension. If your business has sustainability commitments, it is worth understanding the energy footprint of the AI services you use. Providers that run on renewable energy have a different environmental profile than those running on natural gas.

The data centre construction boom is the physical manifestation of the AI revolution. Every chatbot response, every automated email, every AI-generated document exists because somewhere in the world, a server in a data centre is doing the work. Understanding that infrastructure helps you make smarter decisions about how you use AI in your business.

If you want to explore how AI agents can work for your business — whether cloud-based, locally deployed, or a hybrid of both — we are happy to talk through the options.


Ready to put this into practice?

Book a free consultation and we will show you exactly how an AI agent applies to your business.

Book free consultation