This Canada West Foundation brief is a special mash-up of our Energy Innovation Brief and the AI&U portion of our Future of Work and Learning Brief. We hope you enjoy it.


AI data centres – taking a byte out of the environment

With the growing demand for computing power to support artificial intelligence (AI), data centres are being built at a furious pace across Western Canada and around the world. Many regional and provincial governments are courting data centres as a source of revenue, but their construction and operation come with impacts. In this brief, we look at the implications of data centres for electricity, water and land use. The discussion is particularly relevant as the federal government is currently consulting with the provinces on how to spend the $2 billion it has allocated for AI infrastructure.

Electricity needs and GHG emissions

What’s the problem?

AI is incredibly energy intensive. Training a large language model (LLM) such as GPT-4 can use more than 50 gigawatt-hours of electricity, almost as much as 3,500 Canadian homes use in a year. When users, like you or me, run AI queries, there are additional energy costs: an AI search can use five to ten times the energy of a conventional web search. Although still relatively small compared to many other industries, “A.I. is having a profound impact on energy demand around the world, it’s often leading to an uptick in planet-warming emissions, and there’s no end in sight.” The International Energy Agency projects that by 2026 the AI industry will consume at least ten times its total 2023 electricity demand.
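
For a sense of how the homes comparison works out, here is a rough back-of-the-envelope check in Python (a sketch only; the 14 megawatt-hours-per-year figure for an average Canadian household is an assumption, and actual use varies widely by province and heating type):

    # Rough check of the comparison: 50 GWh of training vs. annual use of Canadian homes
    training_energy_gwh = 50                 # reported electricity to train a model like GPT-4
    household_use_mwh_per_year = 14          # assumed annual use of one average Canadian home
    homes_equivalent = training_energy_gwh * 1000 / household_use_mwh_per_year
    print(round(homes_equivalent))           # about 3,570 homes' worth of a year's electricity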

Google, for example, saw its company-wide emissions increase by 13 per cent in 2023 over 2022 because of demand from data centres. Microsoft similarly saw an increase of 29 per cent in 2023 over 2020. While these companies continue to invest in renewable energy development, the extra load poses “a grave threat” to efforts to transition energy supply to renewables. In West Virginia, for example, coal-fired power plants slated for retirement remain in operation to meet this additional demand.

Some jurisdictions have enacted moratoriums on data centre development due to electricity pressures. From 2019 to 2022, Singapore halted the development of new data centres to meet its target of net zero by 2050. The utility Dominion Energy in northern Virginia paused new power connections for a few months in 2022 to ensure the system could support growth. EirGrid, Ireland’s grid operator, has stopped issuing connections to new data centres until 2028 because of concerns about capacity.

What are some solutions?

In a recent statement, Alberta Premier Danielle Smith suggested that AI companies “bring [their] own electricity,” drawing on the model of cogeneration used in the oilsands, where producers meet their own electricity needs and sell excess power back to the grid. Major companies such as Microsoft, Google, Amazon and Meta have entered power purchase agreements with renewable energy companies. The data centre industry is expected to add 74 gigawatts of renewable energy capacity globally through these agreements.

Similar to smart charging of electric vehicles – where car batteries are charged when energy availability is highest, or demand is lowest – major AI workloads can be scheduled for the times and places where there is a surplus of clean energy. AI technology has itself helped improve this scheduling by effectively predicting when renewable energy will be available in surplus. Doing so can reduce strain on the grid and improve the integration of renewable energy by mitigating fluctuations in energy supply.
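
As a rough illustration of how this kind of clean-energy-aware scheduling can work, the Python sketch below picks the start hour of a deferrable training job based on a forecast of surplus clean energy. The forecast numbers, the one-day window and the three-hour job length are all hypothetical, not drawn from any particular grid operator:

    # Minimal sketch of clean-energy-aware workload scheduling (hypothetical forecast data)
    from datetime import datetime, timedelta

    # Hypothetical hourly forecast of surplus clean energy on the grid, in megawatts
    surplus_forecast_mw = [120, 80, 60, 40, 30, 55, 90, 150, 310, 420, 480, 510,
                           530, 495, 460, 400, 320, 210, 150, 130, 125, 118, 110, 105]

    def best_start_hour(forecast, job_hours=3):
        """Return the start hour whose job window has the highest average surplus."""
        best_hour, best_avg = 0, float("-inf")
        for start in range(len(forecast) - job_hours + 1):
            window_avg = sum(forecast[start:start + job_hours]) / job_hours
            if window_avg > best_avg:
                best_hour, best_avg = start, window_avg
        return best_hour

    start = best_start_hour(surplus_forecast_mw)
    run_at = datetime.now().replace(minute=0, second=0, microsecond=0) + timedelta(hours=start)
    print(f"Run the 3-hour training job starting at forecast hour {start} ({run_at})")

Real scheduling systems apply the same idea at a much larger scale, combining carbon-intensity forecasts with the flexibility to shift jobs between data centres in different regions.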

Alongside sourcing reliable clean energy, researchers are finding ways to increase the energy efficiency of AI models:

  • A group at the Lincoln Laboratory Supercomputing Center can predict when a training configuration will perform poorly so it can be stopped early, resulting in an 80 per cent reduction in the energy used to train AI models.
  • MIT researchers reduced energy consumption by 12 to 15 per cent and computer temperatures by 16°C when they operated computing chips (GPUs) with limited power. The change increased training time by only three per cent. (A minimal sketch of this kind of GPU power capping follows this list.)
  • An Ottawa-based company re-networks data centres to spread out GPUs, reducing electricity and cooling needs by up to 50 per cent.
  • Columbia researchers can transmit multiple signals simultaneously over the same fibre-optic cable, decreasing computational bottlenecks and increasing energy efficiency.
  • A group at MIT designed a chip that uses a strategy known as “conservative computing” which does not convert energy to heat as frequently.
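
As a concrete illustration of the GPU power capping mentioned in the second bullet above, the Python sketch below lowers a GPU’s power limit through NVIDIA’s management library. The 60 per cent cap is an arbitrary illustrative choice, not the setting used in the MIT work, and changing the limit normally requires administrator privileges:

    # Sketch: cap a GPU's power limit to reduce energy use and heat (illustrative only)
    # Requires the nvidia-ml-py package (pynvml) and, typically, root/administrator rights.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)          # first GPU in the system

    # Query the allowed range and the current limit; values are reported in milliwatts
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)

    # Cap the card at 60 per cent of its maximum, but never below the hardware minimum
    target_mw = max(min_mw, int(max_mw * 0.6))
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

    print(f"Power limit changed from {current_mw / 1000:.0f} W to {target_mw / 1000:.0f} W")
    pynvml.nvmlShutdown()

The MIT result cited above suggests that a modest cap like this costs little in training time while noticeably cutting power draw and heat.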

Such efficiency-based solutions risk increasing overall demand, because cheaper, more efficient computation invites more use, offsetting the expected energy and water savings: a version of the Jevons paradox, but for AI.

Water use and waste heat

What’s the problem?

Data centres generate a huge amount of heat from their processing units. Commonly, data centres dissipate this heat through water-based evaporative cooling systems, and freshwater is used to prevent degradation and clogging of those systems, which can strain limited freshwater resources. Some researchers have estimated that by 2027 the global demand for water for AI could be half the annual water withdrawal of the United Kingdom. One study estimates that training an LLM in Microsoft’s state-of-the-art U.S. data centres can directly evaporate 700,000 litres of clean freshwater.

Significant water use pushed residents in West Des Moines, Iowa, to file a lawsuit because the data centres training GPT-4 used about six per cent of the district’s water in July 2022. In other areas of the U.S., communities at risk of drought are raising concerns that the development of data centres nearby will strain their already limited water resources.

What are some solutions?

Air conditioning systems could be used instead of evaporative cooling, increasing energy demand but requiring much less water. Some data centres in the U.S., Iceland and Norway use geothermal cooling technology – the constant cool temperature below the earth’s surface is used to cool water in a closed-loop pipe system, which means no water is consumed in the process. Newer methods such as immersion cooling and liquid cooling are also being considered. Calgary-based Denvr Dataworks immerses computing chips in a hydrocarbon-based liquid that transfers heat away from the chips. A dry cooling method then exchanges that heat with the outside air – all without consuming water.

The waste heat from data centres is also being used to provide heating elsewhere. Massive data centres in places like Ireland, Finland and Denmark redirect their waste heat into local heating systems, partially warming homes. The PA10 data centre in Paris is being used to heat buildings in the Seine-Saint-Denis area, including the new Olympic Aquatics Centre.

Siting issues

What’s the problem?

Data centres, like many other large, semi-industrial facilities, can annoy or anger communities depending on where and how they are situated. Here are some of the issues that emerge when a data centre is your neighbour:

  • Data centres average 100,000 square feet – a bit larger than a professional soccer field – but can range anywhere from 10,000 to more than four million square feet. Not all locations are equally suitable, and not all municipalities are ready to zone a facility of this size.
  • Data centres can be very noisy. Communities in Virginia protested the presence of data centres near their neighborhoods because of noise pollution.
  • Locations that are ideal for renewable energy are not always ideal for cooling, and vice versa. Solar and wind projects are often sited in sunnier, drier areas, which increases the need for cooling, while cooler northern locations are often not well suited to renewable energy projects.

What are some solutions?

  • Loudoun County in Virginia has defined data centres as a distinct land use, allowing clearer regulation and management of data centre development. This has helped the region become what is considered the “data centre capital of the world.”

As planning occurs for the development of data centres in Western Canada, the electricity, water and land use of this infrastructure must remain top of mind. “Canada could be a leader in responsible AI”, but that responsibility will require managing environmental impacts as well.


Canada West Foundation’s summer intern Shreya Shah, a Loran Scholar entering her second year of Applied Science in Engineering at UBC, authored this mash-up edition of the brief.

We would like to take this opportunity to thank Shreya for her work at CWF over the summer. Her curiosity and her adaptability in taking on a wide range of questions have been incredible, and we will be cheering on her future success. CWF thanks the Loran Scholars Foundation and the Canada Summer Jobs program for their continued support of our summer internship program.