Although often hidden from view, our reliance on data centers is unquestionably spiraling. In its report 'Data Age 2025', analyst firm IDC forecasts that within four years society will generate 175 zettabytes (ZB) of data each year – somewhere between two and three times the current rate. The report goes on to suggest that there will be 6 billion connected users by that time, each of whom will interact with a data center almost 5,000 times per day – triple today's rate and ten times the 2015 figure.
Cloud-based services are expanding, especially via mobile devices and billions of IoT nodes, which further heightens the reliance on data centers. In the past, data centers were used primarily for storing data, and access time was not critical. Today, however, low-latency operation has become the norm, supporting demanding applications and capabilities such as voice assistants and video streaming. These rapid changes in the data landscape are redefining the key attributes of the modern data center, with flexibility, adaptability, scalability, and reliability all rising to the top – and every one of them puts a spotlight on the need for efficiency.
As the energy consumed by data centers inevitably rises, efficiency has become not just desirable but critical. In November 2018, the International Energy Agency estimated in its report 'Digitalization and Energy' that data centers consume 1-1.5% of the world's energy. While efficiency improvements mean that energy use does not rise in direct proportion to data volumes, data centers clearly have the potential to become even more significant energy users in the near future.
Organizations building their own data centers (or relying on a partner to do so) face unprecedented energy-related challenges. Many adopt a range of measures to monitor their performance regularly, with PUE (Power Usage Effectiveness) being the most common. PUE compares the total power a facility consumes with the power actually delivered to IT equipment; the remainder goes to supporting infrastructure, including battery management, UPSs, and cooling. Essentially, PUE challenges data centers to use as little energy as possible on activities that don't add value.
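As an illustration, PUE is simply the ratio of total facility power to the power delivered to IT equipment, so a perfectly efficient facility would score 1.0. The following minimal sketch uses assumed, illustrative figures rather than measured data:

```python
# Minimal sketch of a PUE (Power Usage Effectiveness) calculation.
# The power figures below are illustrative assumptions, not measurements.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power (ideal value is 1.0)."""
    return total_facility_kw / it_equipment_kw

# Example: 1,200 kW drawn by the whole facility, of which 800 kW reaches IT loads.
total_kw = 1200.0   # IT load plus UPS losses, battery management, cooling, lighting
it_kw = 800.0       # servers, storage, and network equipment
print(f"PUE = {pue(total_kw, it_kw):.2f}")  # -> PUE = 1.50
```

In this assumed example, a third of the facility's power goes to activities that add no direct value, which is exactly the overhead PUE is designed to expose.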
Data Center Location – A Primary Concern
Designing and managing a data center that meets customers' growing data requirements while remaining environmentally aware and returning a profit is a challenging, multi-faceted exercise. One of the key decisions is location, which must weigh the local economy, proximity to customers, the availability of power that is affordable, reliable, and sufficient, network connectivity, and the taxation landscape. From an environmental and PUE perspective, geographical location ranks higher on that list than many might expect.
Location largely determines energy costs – which are often a very significant, and rising, proportion of a data center's lifetime costs – and, to an extent, whether renewable energy is available. Given the benefit a data center brings to the local economy, astute operators will often weigh one location against another to obtain the best terms.
However, location can also be leveraged to reduce running costs. The heat generated by servers is a good measure of losses: it represents energy that did no useful work and must then be removed at additional cost. In a cool climate, cooling can be achieved for much of the year by filtering ambient air from outside and passing it through heat exchangers, saving the not inconsequential cost of running air conditioning. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) publishes thermal guidelines showing where in the world this 'free cooling' is available and for how long each year. Another source of 'free' cooling is seawater for coastal data centers.
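The value of a cool climate can be roughed out by counting how many hours per year the outside air is cold enough to do the job. The sketch below assumes a single dry-bulb temperature threshold and a fabricated temperature series; neither reflects ASHRAE guideline values or any particular site.

```python
# Rough sketch: estimate annual 'free cooling' hours from hourly ambient
# temperatures, assuming one dry-bulb threshold below which filtered outside
# air, run through heat exchangers, can cool the data hall on its own.
# Threshold and temperature series are illustrative assumptions only.
import math
from typing import Iterable


def free_cooling_hours(hourly_temps_c: Iterable[float], threshold_c: float = 18.0) -> int:
    """Count the hours in which the ambient temperature permits free cooling."""
    return sum(1 for t in hourly_temps_c if t <= threshold_c)


# Fabricated year of hourly temperatures: a seasonal plus a daily sinusoid.
temps = [
    10.0 + 12.0 * math.sin(2 * math.pi * h / 8760) + 5.0 * math.sin(2 * math.pi * h / 24)
    for h in range(8760)
]
hours = free_cooling_hours(temps)
print(f"Estimated free-cooling hours: {hours} of 8760 ({hours / 8760:.0%})")
```

A real assessment would use local weather records and account for humidity and air quality, but even this simple count shows why a temperate site can avoid running chillers for most of the year.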
Using environmental elements to partially or wholly cool a data center is very attractive, but the moisture and other pollutants that inevitably get inside can degrade lower-quality components. The many connectors carrying data and power in a data center with ambient-assisted cooling therefore require the highest levels of plating, along with full testing and qualification, to ensure they continue to operate efficiently under such conditions.
In Europe, London, UK is the largest data center hub, with 495MW of data center power capacity, surpassing Paris, Frankfurt, and Amsterdam. London also boasts a well-established and abundant fiber network, and many colocation and cloud providers have established themselves there to be near the financial center. However, rising real estate prices and high energy costs mean that the hyperscale operators (Facebook, Microsoft, Amazon) have favored Sweden, Denmark, and Ireland, where taxes, land, and energy are cheaper. But London's cool, temperate climate, which enables outside-air cooling for most of the year, may prompt these organizations to think again.
Contrast this with Singapore, which views data centers as key to its status as an Asian business hub and financial center. The city-state provides 290MW of power for data centers – a figure that is growing rapidly. However, land is scarce (and therefore expensive), and the year-round hot and humid climate precludes any form of evaporative cooling, requiring permanent use of chillers. The cost implications of forced cooling are so great that the Singaporean government issued a moratorium throughout 2021 on the building of new data centers while more efficient power solutions are explored.
Incremental Improvements Contribute
The heat generated in a data center is an excellent guide to its efficiency. Engineers have long understood that the best way to eliminate waste heat is not to generate it in the first place. To that end, substantial work has gone into increasing the efficiency of server power supplies, particularly through advanced topologies and new semiconductor materials. However, the opportunities for significant further gains within power systems are now few and far between, so designers often turn their attention to other aspects of the design.
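The diminishing returns can be seen with some simple arithmetic on power supply waste heat. The load and efficiency figures below are assumptions chosen to illustrate the trend, not data for any particular supply.

```python
# Illustrative arithmetic: heat dissipated in a server power supply at
# different conversion efficiencies. All figures are assumptions.

def psu_waste_heat_w(load_w: float, efficiency: float) -> float:
    """Input power minus delivered power, i.e. heat dissipated in the supply."""
    return load_w / efficiency - load_w

for eff in (0.90, 0.94, 0.96):
    print(f"{eff:.0%} efficient PSU at a 1 kW load wastes {psu_waste_heat_w(1000, eff):.0f} W")
# 90% -> ~111 W, 94% -> ~64 W, 96% -> ~42 W: each additional point of
# efficiency recovers less heat, which is why designers look beyond the
# power system for further savings.
```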
Incorporating high-quality, efficient interconnect components is another way to reduce energy consumption. While the savings per server may be relatively small, they add up across all the servers in a data center to appreciable gains. For example, Molex's BiPass solution allows designers to bypass the lossy printed circuit board by using Temp-Flex high-speed twinax cable, achieving lower insertion loss when connecting an ASIC in a switch or router to another server within a rack. Advances in heat sink technologies are enabling highly efficient, reliable, and resilient thermal management strategies to support higher density in both copper and optical connectivity. Looking ahead, this signal integrity performance and low insertion loss allow designers to eliminate retimers, saving the energy they would otherwise have consumed.
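The reasoning behind eliminating retimers can be illustrated with a simple channel loss budget: if the end-to-end insertion loss of a link stays within what the SerDes can equalize, no retimer is needed. The dB-per-inch figures, connector losses, and SerDes budget below are assumptions for illustration only, not Molex or BiPass specifications.

```python
# Hedged sketch of a channel insertion-loss budget. All loss figures and the
# SerDes budget are illustrative assumptions, not product specifications.

def channel_loss_db(length_in: float, loss_per_in_db: float, connector_loss_db: float) -> float:
    """Total insertion loss = distributed trace/cable loss plus connector losses."""
    return length_in * loss_per_in_db + connector_loss_db

SERDES_BUDGET_DB = 28.0  # assumed equalization capability at the Nyquist frequency

pcb = channel_loss_db(length_in=12.0, loss_per_in_db=2.5, connector_loss_db=3.0)     # routed over lossy PCB
twinax = channel_loss_db(length_in=12.0, loss_per_in_db=0.8, connector_loss_db=3.0)  # twinax bypass

for name, loss in (("PCB route", pcb), ("Twinax bypass", twinax)):
    verdict = "within budget, no retimer" if loss <= SERDES_BUDGET_DB else "exceeds budget, retimer needed"
    print(f"{name}: {loss:.1f} dB ({verdict})")
```

Under these assumed numbers, the PCB route exceeds the budget and would need a retimer drawing several watts per lane, while the twinax bypass stays comfortably within it.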
For more than two decades, Molex has used advanced thermal simulation, integrating various engineering technologies and advanced software to provide innovative and reliable designs for its global customer base. Because thermal simulation gives a detailed picture of airflow and temperature, engineers can use this data to design better cooling systems. Recognizing the challenges that data center customers face, Molex has again thought outside the box and now offers this capability as a service to help customers solve their thermal challenges.
Commitment To Environmental Stewardship
In our data-hungry world, the shift to digitization will only accelerate. As a result, flexibility, adaptability, scalability, and reliability are the key attributes of the modern data center, but efficient energy use remains a primary concern. Geographic location is one of the most significant single decisions to be made for a data center. Once that is settled, partnering with innovators committed to environmental stewardship helps ensure that energy-efficient design is at the heart of the data center, driving system efficiency and sustainability.