By Ocient Staff
It’s no secret that data is proliferating across the globe at unprecedented rates. With the rise of the Internet of Things (IoT) and the digital transformation of nearly every industry, from farming to finance, the amount of data being generated is staggering.
At the same time, artificial intelligence (AI) is gaining ground as a powerful way to develop new insights, with many organizations eagerly looking for ways to bring AI into their daily operations, strategic planning processes, product lines, and more.
These are exciting times in the technology industry. With so much data and so many innovations in AI, the possibilities seem nearly endless. But even as we embrace AI as the next big thing, recent headlines suggest growing concerns around data center sustainability, the carbon footprint of data storage infrastructures, and the environmental impact of data.
The Environmental Impact of Data Centers
As we discussed in our Earth Day 2024 blog, data centers are expected to consume as much as 10% of the world’s electricity supply by 2030. When you bring AI into the equation, energy consumption skyrockets even further, with the average ChatGPT query consuming nearly 10x more energy than a typical Google search.
In this blog, we’ll examine what’s driving energy usage in tech, some steps being taken by governing bodies to reduce usage all over the globe, and ways in which businesses can reduce their environmental impact while still fully leveraging their data to drive business value.
More Data, More Energy
Let’s begin with the simplest of data use cases. In order to be useful, data must be stored and made available for analysis. That means servers. Lots and lots of servers, typically housed in data centers. Worldwide, the number of data centers in operation exceeds 11,000, with more being built all the time. Each of these facilities consumes 10x–50x as much energy as a typical office building, so their carbon footprints are substantial.
Experts believe increased demand from AI workloads will drive up data center energy consumption even further. Just training new models demands an enormous amount of energy. One recent study estimates that training a large language model like the one behind ChatGPT consumes about 1,300 megawatt-hours (MWh) of electricity – roughly as much as it takes to power 130 US homes for a year. Forbes cites another study that estimates that the total energy needed to power AI will increase more than 80x between 2024 and 2030.
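As a quick sanity check on that comparison, here’s a back-of-envelope calculation. The per-home figure is an assumption on our part (roughly 10,000 kWh per year for an average US household; published averages vary by source):

```python
# Back-of-envelope check: how many homes could the energy used to
# train a large language model power for a year?
TRAINING_ENERGY_MWH = 1_300    # estimated training energy from the study cited above
HOME_ANNUAL_KWH = 10_000       # assumed annual usage of one average US household

training_energy_kwh = TRAINING_ENERGY_MWH * 1_000   # convert MWh to kWh
homes_powered_for_a_year = training_energy_kwh / HOME_ANNUAL_KWH
print(f"~{homes_powered_for_a_year:.0f} homes")     # prints "~130 homes"
```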
The Cost of Keeping Cool
Approximately 40% of a data center’s energy expenditure goes to cooling, via either air conditioners or evaporative coolers. Unfortunately, both cooling options have environmental drawbacks.
Air conditioners draw a lot of energy, and the refrigerants they leak contribute to scope 1 greenhouse gas emissions while the electricity they consume adds to scope 2, trapping heat in the Earth’s atmosphere and elevating global temperatures. Of course, as outdoor temperatures increase, the need for cooling rises, too, creating a vicious cycle.
Many data centers leverage water cooling—e.g., evaporative coolers, heat exchangers, heat pumps—to keep their servers from overheating. While these systems use less power than air conditioners and don’t emit greenhouse gases directly, they do require a considerable amount of water. In fact, an average-sized data center can require 300,000 gallons of water per day to keep its servers functioning using these water-based methods of cooling. To compare, the average household in the US uses about 300 gallons of water per day. In regions where water is already scarce, this can have a devastating impact.
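Putting those two figures side by side makes the scale clear. A minimal sketch of the arithmetic, using only the numbers cited above:

```python
# One average-sized data center's daily water use expressed in
# average US households, using the figures cited above.
DATA_CENTER_GALLONS_PER_DAY = 300_000   # average-sized data center, evaporative cooling
HOUSEHOLD_GALLONS_PER_DAY = 300         # average US household

equivalent_households = DATA_CENTER_GALLONS_PER_DAY // HOUSEHOLD_GALLONS_PER_DAY
print(f"{equivalent_households:,} households")   # prints "1,000 households"
```

In other words, a single data center can consume as much water each day as a thousand homes.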
A Strategic Imperative
In September 2024, the European Commission began requiring organizations that operate data centers to file detailed reports on their energy and water consumption. Companies in the EU will also be required to report on their efforts to reduce energy and water consumption. This is the first step in a larger effort to reduce overall energy consumption in the EU by more than 11%.
This approach is expected to spread around the globe, with other governments following suit as they look for new ways to combat climate change and hold organizations accountable for their energy and resource consumption. As more and more organizations begin scrutinizing their carbon footprints, data centers present an excellent opportunity to reduce waste.
A Golden Opportunity to Go Green
As we mentioned above, data generation and demand for computing is growing rapidly and unlikely to slow down. So how on Earth can we reduce the amount of energy data centers draw?
Here at Ocient, we’ve got a powerful solution (pun intended). Back in 2016 when we set out to create a platform that can continually ingest, store, and analyze massive amounts of complex data, we did so with a view to using commercially available hardware as efficiently as possible.
As a result, the Ocient Hyperscale Data Warehouse™ is packed full of innovations like Compute Adjacent Storage Architecture™ and Zero Copy Reliability™ that don’t just boost performance – they minimize the number of servers and racks, and the energy, cooling, and water, required to support always-on, compute-intensive workloads. By using Ocient to increase the amount of compute they get per watt, our customers can conquer their biggest data challenges while decreasing cost, complexity, and their carbon footprints.
With Ocient, organizations can reduce the environmental impact of data without compromising performance, capabilities, or data granularity. Want to learn more about Ocient’s energy reduction capabilities? Contact us.