Low Latency a High Priority When Choosing a Data Center Location

3 Apr 2014 by Datacenters.com Technology

"Conventional wisdom about network technology dictates that data centers have to be located close to whatever application or user base they're serving to achieve low latency and ensure applications work correctly. Based on the simple realities of physics - information that travels along fiber optic cables can still travel no faster than the speed of light - distance between a data center and the end user does have a tangible effect on latency. As a result, companies have sought to overcome the challenges such realities pose by locating theirservers as close to their users as possible.

Low Latency Data Centers, Like This One, Are Used Heavily in Finance

The Low Latency Scramble
The trend has been particularly visible in the financial sector, where a few milliseconds can make a giant difference in the outcomes of high-frequency trading algorithms in particular. Firms pay high premiums to build data centers in northern New Jersey near the servers of exchanges like the New York Stock Exchange and Nasdaq. The industry has also worked to pioneer lower-latency technologies, such as laser transmission between the exchanges, in a race to close the gap between trade times and the speed of light, the Wall Street Journal recently reported.

One of the most controversial aspects of this race to cut latency has been the practice of firms hosting their servers in the same colocation facilities as the exchanges' own servers, effectively eliminating latency based on distance. New York attorney general Eric T. Schneiderman recently launched an investigation into the practice, suggesting that it helps foster "insider trading 2.0," the New York Times reported.

Nonetheless, the takeaway from a data center operations perspective has seemed clear: A shorter distance from the data center to the end user means less latency, which in turn means better performance. As a result, as of May 2013 data center real estate in northern New Jersey commanded up to four times the price per square foot of commercial real estate in the most expensive Madison Park and Fifth Avenue high rises in New York, according to a separate New York Times article.

Given the high cost premium attached to desirable locations, companies have a financial motivation to look to new areas when choosing a data center location, but they have to weigh that factor against the need for rapid data transmission. The good news for many firms is that the latency differences between data center locations are effectively negligible in a large range of use cases, experts have found.

Choosing A Data Center Location While Addressing Latency Needs

Distance does affect latency, but until the distances involved reach the intercontinental level, the effects may in many cases be negligible, network technology expert Philip Carden explained in a recent column for Light Reading. According to Verizon's latest IP latency statistics, for instance, data travels round trip across the Atlantic in just under 80 milliseconds and across the Pacific in just over 110 ms. Packets delivered across Europe make the trip there and back in an average of around 14 ms, while a round trip across North America takes around 40 ms.

Whether these speeds make any perceptible difference to application performance may depend on what the application is, Carden noted, explaining that it takes the human brain around 80 ms to process and synchronize sensory inputs. This processing window is why, for instance, the sight and sound of someone clapping their hands register simultaneously even though the sound takes longer to arrive than the sight. Only once the delay between the two exceeds 80 ms does it become perceptible, as when the sound in a video doesn't sync up with the visual component. For this reason, small latency delays are generally imperceptible to human users. Minor contributors such as network packet processing time can therefore be dismissed, leaving server response time, network queuing and the speed of fiber optic transmission as the main latency factors for an application.
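
As an aside to make the 80 ms threshold tangible, the following sketch (textbook physics values, not figures from the article) computes how far away a clap can be before the sound lags the sight perceptibly:

```python
# Illustrating the ~80 ms synchronization threshold with the clapping
# example. Sound travels at ~343 m/s in air; light arrives effectively
# instantly at everyday distances. (Standard physics values, not taken
# from the article.)

SPEED_OF_SOUND_M_S = 343
THRESHOLD_MS = 80  # roughly where the brain starts noticing the lag

def sound_lag_ms(distance_m: float) -> float:
    """Delay between seeing and hearing an event at this distance."""
    return distance_m / SPEED_OF_SOUND_M_S * 1000

for d in (10, 27, 50, 100):
    lag = sound_lag_ms(d)
    verdict = "perceptible" if lag > THRESHOLD_MS else "imperceptible"
    print(f"{d:>4} m: {lag:6.1f} ms lag ({verdict})")
# Past roughly 27 m the lag exceeds 80 ms and the clap looks out of sync.
```

At typical conversational distances the lag is a small fraction of the threshold, which is exactly why network delays of a similar size go unnoticed.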

In many cases, then, latency caused by distance - an added 10 ms or 20 ms - is completely imperceptible to users. Even with an application making several calls to the server, a data center can be relatively far away before the customer starts seeing any effect. As a result, for a customer-facing application such as cloud software or a social network, latency should not have any major bearing on data center location.
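
To see when "several calls to the server" starts to matter, here is a hypothetical sketch; the 15 ms distance penalty and the call counts are illustrative assumptions, not figures from the article:

```python
# Sketch of how added distance compounds when an application makes
# sequential round trips. The 15 ms penalty and the call counts are
# hypothetical values chosen for illustration.

def added_delay_ms(extra_rtt_ms: float, sequential_calls: int) -> float:
    """Total extra delay when each call must finish before the next starts."""
    return extra_rtt_ms * sequential_calls

for calls in (1, 3, 5, 10):
    total = added_delay_ms(15, calls)
    print(f"{calls:>2} sequential calls: +{total:.0f} ms")
# A page that needs 5 dependent round trips turns a 15 ms penalty into
# 75 ms, close to the ~80 ms perception threshold discussed above.
```

The compounding only bites when calls are sequential; requests issued in parallel pay the distance penalty roughly once.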

Instead, most companies should look for locations where power is cheap and reliable, cooling is simplified by the climate, network capabilities are strong and disaster risk is low. Ironically, a 2013 study by GigaOM found that the states with the most data centers tended to be the most disaster-prone. In response to that study, Mark Thiele, executive VP of data center tech at the Las Vegas-based Switch, suggested that around 95 percent of companies could afford to let latency slide slightly by using data centers in different locations rather than concentrating infrastructure in states where disasters were common. GigaOM noted that many companies, including Google and Facebook, have been able to cut costs by relocating data centers to places with cheaper real estate like North Carolina and Oregon.

Working With Low Latency Needs

The rush to build extremely low-latency solutions in the trading world isn't completely unfounded, however. In addition to high-speed trading, there is an expanding range of latency-sensitive machine-to-machine services such as car controls and virtual networking functions, Carden noted. These M2M categories are growing as more connected devices come online in the burgeoning Internet of Things sector. As a result, there will be a growing need for data centers clustered near or in the same city as their endpoints to serve these applications.

""[S]erver location will become increasingly important for a sub-set of applications, while for the majority of cloud applications, the location of data centers is already very flexible, and will become much more so over time, as protocols improve,"" Carden wrote.

Given these trends, companies will likely want to respond by examining their data center footprints and considering where they might want to locate their data center infrastructure. If their needs are trending toward M2M applications, they may need to invest in more data centers in certain regions to cut latency. However, if they are spending large sums to maintain a data center infrastructure with low latency for customer-facing applications, it may be possible to switch to cheaper solutions without any perceptible difference to customers.

Author

Datacenters.com Technology

Datacenters.com is the fastest and easiest way for businesses to find and compare solutions from the world's leading providers of Cloud, Bare Metal, and Colocation. We offer customizable RFPs, instant multicloud and bare metal deployments, and free consultations from our team of technology experts. With over 10 years of experience in the industry, we are committed to helping businesses find the right provider for their unique needs. 
