How Machine Learning is Advancing Data Centers

The Big Data revolution has led to an explosion of data centers, which are consuming energy at an increasingly high rate. This blog reviews two standard methods for improving data center efficiency and argues that a third method, machine learning, is the best solution.



By Andrew Deen.

Skynet, V.I.K.I, and H.A.L. 9000 are a few examples of AIs imbued with the power of machine learning. They were created to solve problems that had become too complex for humans and were given control of everything through a neural network to increase efficiency, safety, and success. Granted, movies need an antagonist, so these AI marvels were given an unfavorable dark side – but such complex machine learning is real and has been successfully implemented.

Data centers have exploded into existence since the 2000s. Originating from small servers in local offices, they have grown into hyperscale facilities to support the Internet of Things and all of its associated data. The amount of data, processing, connectivity, and storage Americans use requires a substantial amount of power, making data centers major energy consumers. With data centers accounting for roughly 2% of the country's annual electricity usage, industry and government experts are working to increase efficiency, as the need for data centers is projected to grow.

The solution is two-pronged:

  • They must make older data centers as efficient as possible
  • They must build more efficient, larger data centers

However, it's not identifying the solution that is difficult; it's the implementation. The true answer lies in machine learning systems, which can reduce data center energy consumption further than was previously possible.


Data Center Energy Usage

Large, unappealing buildings that house massive server farms are cropping up in more than just America. This is a global phenomenon: as data and information are digitally stored, space is needed. Thanks to the volume being saved, we are gaining more complex insight into humanity's patterns and tendencies. Unfortunately, storing data in servers isn't like shelving books in a library. Servers need constant power, which generates heat, and that heat must be removed by cooling systems, which costs still more energy.

In 2014, US data centers alone used roughly 70 billion kWh of electricity. For perspective, a single smartphone needs only a few kWh per year to stay charged. The energy is required to keep data centers running 24/7, cooled by extensive cooling systems, and backed by redundant power supplies in case of an emergency. At roughly $7 billion per year, the cost of this energy affects everyone. Consumers, database providers, and the environment all feel the consequences of such high energy consumption, yet our utilization shows no signs of slowing. Hence, utilization and energy efficiency are prime targets for improvement.
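As a quick sanity check on the figures above, the implied average electricity price works out to about $0.10/kWh, in line with typical US commercial rates:

```python
# Back-of-the-envelope check using the figures quoted above.
annual_energy_kwh = 70e9   # US data center electricity use in 2014, kWh
annual_cost_usd = 7e9      # quoted annual energy cost, USD

implied_price = annual_cost_usd / annual_energy_kwh  # USD per kWh
print(f"Implied electricity price: ${implied_price:.2f}/kWh")  # → $0.10/kWh
```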


Fig. 1: Data Center Electricity Use (billions of kWh/y).
Source: US Department of Energy, Lawrence Berkeley National Laboratory

There are at least 3 million data centers in the US, or roughly one for every 100 people in the country. The growth of data centers is complex, with most servers and related equipment purchased between 2000 and 2010. As server space is used more efficiently and reliance on large data centers increases, however, statistics indicate that total storage capacity will adapt rather than simply grow.


Energy Efficiency, Better Buildings Initiative, and Machine Learning

Some may think that office servers would be more efficient than new hyperscale data centers, or that older, in-place data centers should be fully utilized before breaking ground on new ones and unnecessarily retiring them. These ideas have logical backing (independent servers are managed by their owners, and recycling beats replacing), but they don't hold up in practice. On-site servers require everything data centers do, so companies end up spending more on their own server farms than they would using exactly the space they need in a remote center. And older data centers were built without energy efficiency in mind, so even seemingly new ones can be obsolete.


Fig. 2: Total Data Center Electricity Consumption (billions of kWh/y).
Source: US Department of Energy, Lawrence Berkeley National Laboratory

The U.S. Department of Energy has incentivized implementing energy efficiency on a large scale. Its Better Buildings Initiative invites businesses and data centers to reduce energy consumption or adopt renewable alternatives. Major tech companies such as Google have pioneered the way, reducing energy usage in their data centers through machine learning systems.


DeepMind – Machine Learning for Data Centers and the Future

Machine learning is a branch of AI in which a system learns from data and reacts to new scenarios rather than following pre-programmed options. Machine learning systems are given historical data and parameters (a goal), and operate with a neural network that loosely mimics our own brain functions. They are a natural fit for data centers, which have become too complex for previous systems and human operators to manage efficiently. Data centers experience nearly a billion events daily, which can only be handled by adaptive systems for maximum operability. For example:

  • Equipment-to-equipment and equipment-to-personnel interactions are unique to each center and occur daily
  • Traditional systems and people cannot quickly adapt to minute internal/external environmental changes that cause major energy loss

Enter Google's DeepMind. Capable of reducing the energy used for cooling by 40%, it is a general-purpose system that Google hopes to deploy broadly to cut energy usage. The Google data center team trained DeepMind on certain operating scenarios, created adaptive parameters, fed in historical data such as temperatures and pump speeds, and oriented the goal toward future Power Usage Effectiveness (PUE). PUE is the ratio of total facility energy to IT equipment energy, and is the standard measure of data center energy efficiency.
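To make the PUE metric concrete, here is a minimal sketch in Python. The meter readings are hypothetical, and this is only the metric itself, not DeepMind's system, which trains deep neural networks over many more sensor inputs:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    A perfectly efficient facility would have PUE = 1.0, meaning all
    energy goes to the IT load and none to cooling, lighting, or
    power-distribution losses.
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical meter readings for one day of operation
print(pue(total_facility_kwh=1500.0, it_equipment_kwh=1200.0))  # 1.25
```

Lowering PUE toward 1.0 is exactly the optimization target DeepMind was given: its recommendations shift cooling and power overhead down while the IT load stays the same.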

Fig. 3: Data center optimized by DeepMind.

When deployed at Google's data centers, it successfully reduced energy usage by analyzing unfathomably large data sets and recommending actions. This doesn't sound far from what the AIs mentioned at the beginning were originally tasked with. Fortunately, there is a clear line between science fiction and science. Systems like DeepMind are a benefit, capable of reducing energy consumption on a scale no longer within human grasp. In time, more systems will implement machine learning to maximize operability, saving people money and, more importantly, the environment.

Bio: Andrew Deen (Twitter: @AndrewDeen14) is a consultant, speaker, and writer who discovers new stories in business, health, criminal justice, and sports.
