Key Takeaways From The Uptime Institute Global Data Center Survey

24 Aug 2018 by Technology

During the first half of 2018, Uptime Institute surveyed nearly 900 IT practitioners and data center operators from around the globe for its Global Data Center Survey. The goal of the survey was to better understand data center IT infrastructure delivery and strategy, and the main trends affecting them. In particular, the survey dove into the complexities, changes, and constraints that data center operators currently face. The following overview of the survey findings pinpoints the key takeaways that will help prospective enterprise clients, data center providers, channel partners, and industry leaders gain a more comprehensive understanding of the industry and develop more effective strategies. 

1. Increased Efficiency And Complexity

Since 2007, the industry has seen steady improvement in power usage effectiveness (PUE), the metric that has become the standard method for measuring data center efficiency over time. In 2007, the Uptime survey revealed an industry-average PUE of 2.5. This improved to 1.98 in 2011, 1.65 in 2013, and 1.58 in 2018. These numbers indicate that while efficiency has improved, the gains are slowing.
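PUE itself is a simple ratio: total facility energy divided by the energy delivered to the IT equipment, where 1.0 would mean zero overhead. A minimal Python sketch (the function name and sample figures are illustrative, using the survey's 2018 average for the example):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT load.

    A PUE of 1.0 would mean every watt goes to IT equipment; real
    facilities add cooling, power-distribution, and lighting overhead.
    """
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# A facility drawing 1,580 kW in total to support a 1,000 kW IT load
# has a PUE of 1.58 -- the 2018 industry average reported in the survey.
print(round(pue(1580, 1000), 2))  # 1.58
```

This framing also shows why further PUE gains are hard: once cooling and power-distribution losses are mostly squeezed out, the remaining energy is the IT load itself, which PUE does not measure.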

The main reason for this lackluster improvement is that enterprises face different challenges than they did from 2007 to 2013. IT architectures are growing increasingly complex, mainly due to hybrid approaches that mix off-premises resources with on-premises data center capacity.

To jumpstart further efficiency gains, organizations will need to invest more. They will see the most success by investing in digital transformations that concentrate on the total energy consumption of the IT load itself, rather than on electrical and mechanical losses. 

In short, data centers are becoming more efficient, as the PUE improvements show. Unfortunately, new challenges are starting to surface, such as the cost and business case associated with hybrid IT configurations. 


2. The Rise Of Edge Computing

More than 40% of survey respondents believe they will need to begin processing data closer to its source, which will require edge computing capacity. This move is likely to fragment enterprise data center capacity into smaller fleets of data centers, because only 37% of respondents reported that they will have the capacity to fully support added edge computing on their own. Instead, many will use a public cloud service provider or a mix of their own data centers and colocation.

In short, operators face added complexity in the near future, especially in operations and management. Some of it will be created by the increased edge computing capacity that many operators are planning to deploy. 


3. Rack Density Is A Growing Problem

Across the board, average data center rack density is still relatively low: in the 2017 survey, over two-thirds of respondents reported average densities below 6 kilowatts per rack. However, this average can be deceptive, especially as more enterprises move workloads to the public cloud and consolidate. When the survey dug deeper into rack density, it found that roughly 20% of respondents reported a highest-density rack of 30 kW or above. This is up from the single highest-density rack reported in 2012, which was 26 kW. 
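The gap between a low fleet average and a handful of very hot racks is exactly where cooling problems hide. A small sketch makes the point; the rack IDs, load figures, and 6 kW design limit below are hypothetical, not survey data:

```python
# Hypothetical rack power readings (kW) for a small data hall.
DESIGN_LIMIT_KW = 6.0  # assumed per-rack cooling design limit

rack_loads_kw = {f"A{i:02d}": 3.0 for i in range(1, 20)}  # 19 low-density racks
rack_loads_kw["B01"] = 30.0  # one modern high-density rack

def over_limit(loads: dict[str, float], limit: float) -> list[str]:
    """Return rack IDs whose measured draw exceeds the cooling design limit."""
    return [rack for rack, kw in loads.items() if kw > limit]

average = sum(rack_loads_kw.values()) / len(rack_loads_kw)
print(round(average, 2))                           # 4.35 -- looks comfortably "low density"
print(over_limit(rack_loads_kw, DESIGN_LIMIT_KW))  # ['B01'] -- but this rack overloads its cooling
```

The fleet average sits well under the design limit, yet one rack draws five times the cooling capacity provisioned for it, which is why averages alone understate the problem.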

Many operators are struggling because most data centers were not designed for such high densities and lack sufficient cooling capacity. The steady rise indicates that the problem is not being addressed, and operators will need to act unless they want to experience more severe issues.

In short, operators are struggling with cooling challenges: both service provider data centers and enterprises are reporting higher rack density issues.


4. DCIM Is Industry-Standard

Data center infrastructure management, or DCIM, is software that enables data center leadership to get the most up-to-date and accurate information on everything from operational status to resource use. It is essentially a bird's-eye view of everything going on within the data center. 

To some extent, the technology is complicated and still in the early stages of deployment. However, the top data centers are having significant success with it: 54% of respondents stated that they have purchased the technology, and another 11% built a version of it in-house. Three-quarters of DCIM users are pleased with the results it is producing. 

In short, while some reports have claimed that DCIM falls short, its implementation has largely been successful, and many data centers now use DCIM in some form.


5. Climate Change Is Not Being Addressed

46% of survey respondents stated that they are not currently preparing for the potential disruption that climate change could bring. Other respondents said they are assessing the technologies that could be impacted, especially their cooling methods. Some are reporting carbon and energy data to corporate sustainability programs, developing flood risk procedures, assessing data center site selection more closely, or preparing for inclement weather.

However, there is a clear lack of concern about climate change that the industry will need to address. Operators will need to have contingency plans in place, implement emergency drills, and look at alternative technologies that work more effectively in droughts and at higher temperatures. 

In short, data centers are susceptible to the effects of climate change, including extremely inclement weather events, water shortages, and temperature spikes. Throughout the industry, management is ignoring these issues or denying the fact that they will be affected by them. 


6. The Current Issue Of Skill Shortages Will Worsen

With the rapidly evolving technology that data centers employ, it can be difficult to find skilled staff, particularly in operations and management. 38% of respondents state that they are struggling to find qualified candidates, and 17% report retention issues. These problems are largely driven by internet, hyperscale, and colocation providers, which have pushed up salaries and hire aggressively. 

One promising solution is increased training, which will help qualify more of the workforce and improve retention through greater job security.

In short, the data center workforce is both male-dominated and aging. This lack of diversity and skill shortage needs to be confronted. 

The data center industry is facing a number of hurdles. The good news is that solutions are available. By better understanding the obstacles, operators and enterprises can prevent or minimize negative effects. 

Technology is the fastest and easiest way for businesses to find and compare solutions from the world's leading providers of Cloud, Bare Metal, and Colocation. We offer customizable RFPs, instant multicloud and bare metal deployments, and free consultations from our team of technology experts. With over 10 years of experience in the industry, we are committed to helping businesses find the right provider for their unique needs. 
