Data centers, which run the apps, websites, and services that billions of people use every day, can be dangerous places for the workers who build and maintain them. Workers sometimes have to service the electrical equipment of a data center while it is energized. They can be exposed to chemicals such as chlorine, which is used as a disinfecting agent for water circulating through liquid cooling systems for computers and servers. In June 2015, five people had to be hospitalized after a chlorine gas leak at the Apple data center in Maiden, North Carolina.
Data centers are safer than they used to be. But in search of forward-looking solutions, some tech giants say they are exploring how artificial intelligence can be applied to prevent safety issues. For example, Microsoft is developing an AI system that analyzes data from a range of sources and generates alerts for data center construction and operations teams to “prevent or mitigate the impact of safety incidents.” A related system, also in development, attempts to detect and predict impacts on data center construction schedules.
“These initiatives are in the early testing stages and are expected to begin scaling into our production environments later this year,” a Microsoft spokesperson told TechCrunch via email.
Meta also claims to be looking at ways that AI can predict how its data centers will operate under “extreme environmental conditions” that could lead to unsafe working environments. The company says it is developing physical models to simulate extreme conditions and feeding that data to AI models responsible for optimizing power consumption, cooling, and airflow across its servers.
“We have critical operational data from our data centers, in some areas at high frequency with sensors built into servers, racks, and in our data halls,” a Meta spokesperson told TechCrunch. “Each server and network device, with different workloads, will consume different amounts of power, generate different amounts of heat, and generate different amounts of airflow in data centers. Our [infrastructure] team collects all the data from each server and then develops AI models that can optimize the placement of our servers and racks in data centers and send workloads to these servers to optimize [for] performance and efficiency.”
Of course, companies have motives other than safety to ensure that data centers remain in peak condition. Outages are expensive – and they are becoming more frequent. According to a 2020 survey by the Uptime Institute, an IT consultancy, a third of data center owners and operators admitted to experiencing a major outage in the past 12 months. One in six said the outage cost them more than $1 million – up from one in ten in 2019.
Meta has more than 20 data centers in operation around the world, including new projects in Texas and Missouri that have a combined cost of $1.6 billion. Meanwhile, Microsoft operates more than 200 data centers, and says it’s on track to build 50 to 100 new data centers each year for the foreseeable future.
AI also promises to create opportunities for energy – and therefore cost – savings in the typically under-the-radar data center, which is another attractive aspect for businesses. In 2018, Google claimed that AI systems developed by its subsidiary DeepMind were able to deliver energy savings of 30% on average compared to the historical energy usage of its data centers.
When reached for comment, DeepMind said it had no updates to share beyond its initial announcement. IBM and Amazon did not respond to inquiries. But both Meta and Microsoft say they are now using AI for similar power-tuning purposes.
In late 2021, Microsoft launched AI “anomaly detection methods” to measure and mitigate unusual power and water usage events within data centers, using telemetry data from electrical and mechanical devices. The company also uses AI-based approaches to identify and fix problems with power meters in data centers, and to identify the ideal placement of servers to minimize wasted power, network, and cooling capacity.
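The general idea behind this kind of anomaly detection is straightforward. The sketch below, which is purely illustrative and not Microsoft's system, flags power readings that deviate sharply from a rolling baseline of recent telemetry; all names and parameter values are invented for the example:

```python
import statistics

def detect_anomalies(readings, window=24, threshold=3.0):
    """Flag readings that deviate sharply from the recent baseline.

    readings: power draws (e.g. in kW) sampled at a fixed interval.
    Returns indices of readings more than `threshold` standard
    deviations from the mean of the preceding `window` samples.
    """
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev > 0 and abs(readings[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# Steady draw around 120 kW with one simulated fault.
power_kw = [120.0 + (i % 3) * 0.5 for i in range(48)]
power_kw[40] = 165.0  # spike
print(detect_anomalies(power_kw))  # → [40]
```

A production system would replace the simple rolling z-score with learned models and fuse many telemetry streams, but the input-output shape – time-stamped device readings in, alert indices out – is the same.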
For its part, Meta says it is taking advantage of reinforcement learning to reduce the amount of air it pumps into data centers for cooling purposes. (At a high level, reinforcement learning is a type of AI system that learns to solve a problem by trial and error.) Most of the company's data centers use outdoor air and evaporative cooling systems, making airflow optimization a top priority.
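To make the trial-and-error idea concrete, here is a minimal tabular Q-learning sketch, with entirely invented parameters and no relation to Meta's actual system. An agent learns when to run a fan so that a toy "data hall" stays inside a safe temperature band while using as little fan energy as possible:

```python
import random

random.seed(0)

# Toy model: server heat adds +1 °C per step; the fan removes 2 °C when
# on. All numbers are illustrative, not real data-center parameters.
T_MIN, T_MAX = 20, 30          # modeled temperature range (°C)
BAND = range(23, 28)           # safe operating band: 23-27 °C
FAN_COST = 1.0                 # energy cost of running the fan one step
PENALTY = 10.0                 # cost of drifting outside the band

def step(temp, fan_on):
    temp = max(T_MIN, min(T_MAX, temp + 1 - (2 if fan_on else 0)))
    reward = -(FAN_COST if fan_on else 0.0) - (0.0 if temp in BAND else PENALTY)
    return temp, reward

# Q[temp][action] estimates long-run reward; action 0 = fan off, 1 = on.
Q = {t: [0.0, 0.0] for t in range(T_MIN, T_MAX + 1)}
alpha, gamma, epsilon = 0.2, 0.9, 0.2

for _ in range(500):                # episodes of trial and error
    temp = 25
    for _ in range(50):
        # Epsilon-greedy: mostly exploit the best-known action,
        # occasionally explore a random one.
        a = random.randrange(2) if random.random() < epsilon else Q[temp].index(max(Q[temp]))
        nxt, r = step(temp, a == 1)
        Q[temp][a] += alpha * (r + gamma * max(Q[nxt]) - Q[temp][a])
        temp = nxt

policy = {t: Q[t].index(max(Q[t])) for t in Q}
print([policy[t] for t in (28, 29, 30)])  # learned: fan on in the hot states
```

Real systems work with far richer state (many sensors, continuous setpoints) and neural networks instead of a lookup table, but the learning loop – act, observe reward, update value estimates – is the same.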
Reducing the environmental footprint is an added advantage of energy-regulating AI systems. Data centers consumed about 1% of global electricity demand and contributed 0.3% of total carbon dioxide emissions in 2020, according to a report from the Environmental Investigation Agency. A typical data center uses 3 to 5 million gallons of water per day – as much as a city of 30,000 to 50,000 people.
Microsoft previously stated that it plans to operate all of its data centers with 100% renewable energy by 2025. Meta claimed to have achieved this milestone in 2020.