In 2010, while working at Dell DCS, Dave McCrory recognized that the creation and collection of data was a trend worth investigating. As that research progressed, it raised a larger question: what happens when the sources that create data and the collections of data themselves begin to grow and interact?
The answer begins with viewing data as a planet: an object with sufficient mass. As a body of data gains mass, it is likely to attract additional services and applications, just as a planet's gravity attracts the objects around it.
A decade later, McCrory is vice president of growth at Digital Realty, where he has been integral in developing the inaugural Data Gravity Index study. “For the last several years, I have continuously researched, tested, and tweaked a Data Gravity formula and methodology that would help explain the intensity of Data Gravity,” he explains. “Once I recognized the foundation of the formula (mass times activity), I knew I could finally start quantifying things that I could not before this realization.”
Since joining Digital Realty, McCrory has been working on what was, until now, a secret project called The Data Gravity Index DGx 1.0. “Now data gravity has a methodology, a patent-pending formula, and a decipherable pattern,” he says. “It is the first time data gravity has been measured and quantified for the Global 2000 Enterprises.”
Implications of data gravity
The Data Gravity Index DGx gives insight into just how rapidly data will be growing and consumed over the next five years and what all of this means to the Forbes Global 2000. It provides a data gravity score across 21 global metros to help enterprises understand the implications of the creation, aggregation, and private exchange of enterprise data.
“To effectively provide the first measurement and quantification of data gravity for the Global 2000 Enterprises, we spent 12 months pulling research from multiple authoritative sources,” McCrory adds. “Additionally, we conducted a comprehensive cause and effect study. In the end, we were able to develop a formula to measure, quantify, and determine the implications of the explosion of enterprise data growth.”
Calculating data gravity requires a methodology that accounts for four inputs: data mass, data activity, bandwidth, and latency. The Data Gravity Index DGx applies a patent-pending formula that quantifies and predicts the continuous creation of data across 21 metros globally. Expressed in gigabytes of data per second, the score provides a relative proxy for measuring data creation, aggregation, and processing.
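The patent-pending DGx formula itself is not published, but the article names its four inputs and McCrory's "mass times activity" foundation. Purely as an illustration of how such a gravity-style score might combine those inputs, here is a hypothetical sketch that rewards mass, activity, and bandwidth and penalizes latency by analogy with Newtonian gravity; the function, its form, and all the numbers are assumptions, not the actual DGx methodology:

```python
def data_gravity_score(data_mass_gb: float,
                       data_activity_gbps: float,
                       bandwidth_gbps: float,
                       latency_s: float) -> float:
    """Illustrative score combining the four inputs named in the article.

    Assumes a gravity-like form, (mass * activity * bandwidth) / latency^2.
    This is NOT the patented DGx formula, which has not been published.
    """
    return (data_mass_gb * data_activity_gbps * bandwidth_gbps) / latency_s ** 2


# Two hypothetical metros with identical data but different network latency
# (all figures invented for illustration):
metro_low_latency = data_gravity_score(5_000, 2.0, 100.0, 0.010)   # 10 ms
metro_high_latency = data_gravity_score(5_000, 2.0, 100.0, 0.050)  # 50 ms
```

Under this toy model, the quadratic latency penalty means the 10 ms metro scores 25 times higher than the 50 ms metro, which captures the article's point that proximity compounds the pull of data.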
The attraction of data
But what exactly is data gravity? McCrory defines it as the effect that attracts large sets of data, or highly active applications and services, to other large sets of data or highly active applications and services, much as gravity attracts planets and stars. “Smart companies try to leverage the effect of data growth and attraction in the cloud,” he adds. “However, most large-scale companies do not fully appreciate how quickly their data is growing. Nor do they recognize how many things generate data creation activity, such as sensors, systems, employees, customers and the applications/services that interface with them.”
All of these elements must operate as fast, efficiently, and securely as possible. Doing this requires being close to sources of data creation, processing, aggregation and exchange, and enrichment. Enterprises cannot expect optimal results from large volumes of data existing within slow networks that are located far away.
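The cost of distance in the paragraph above can be made concrete with a simplified transfer-time model: total time is serialization time (payload divided by bandwidth) plus the latency paid on each protocol round trip. The function, the round-trip count, and all figures below are invented for illustration, and the model ignores real-world effects such as congestion and TCP windowing:

```python
def transfer_time_s(payload_gb: float,
                    bandwidth_gbps: float,
                    rtt_s: float,
                    round_trips: int = 1) -> float:
    """Simplified model: serialization time plus cumulative round-trip latency."""
    return payload_gb / bandwidth_gbps + rtt_s * round_trips


# Moving 100 GB over a 10 Gbps link with a chatty workload (1,000 round trips):
same_metro = transfer_time_s(100, 10, 0.002, round_trips=1_000)       # 2 ms RTT
intercontinental = transfer_time_s(100, 10, 0.150, round_trips=1_000)  # 150 ms RTT
```

Even with identical bandwidth, the distant path takes roughly 160 seconds against 12 for the local one in this toy example, because accumulated latency, not raw throughput, dominates a chatty workload. That is the physics behind bringing processing close to where data is created.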
“If you are working with large data quantities and/or high levels of processing activity, then the other elements of the system must be brought closer to them,” McCrory says.
Software is not eating the world anymore; data is eating the world, and it will continue to do so for the foreseeable future. Specifically, five macro trends amplify data gravity: enterprise data stewardship, mergers and acquisitions, digitally enabled interactions, data localization, and cyber-physical systems.
“These factors make up the perfect storm for this explosion of data,” McCrory explains. “Data gravity inhibits enterprise workflow performance, raises security concerns, and increases costs, all complicated by regulatory requirements and other artificial constraints. With data gravity, the laws of physics and IT intersect to provide a proxy for a new age of business architectures that enterprises will be driven to adopt, and service providers will be pressed to support.”
Data Gravity is forcing a new data-centric architecture which inverts traffic flow to bring users, networks, and clouds to privately hosted enterprise data. With this new architecture, data gravity barriers are removed, and new capabilities are unlocked.
For the Global 2000 Enterprises, data gravity means the multi-tenant datacentre has to elevate to a whole new level. It must be a secure, neutral meeting place to colocate private hosting of data that is interconnected to public services, users, partners, employees, and things, globally.
“No industry is immune from the disruptions and barriers created by data gravity,” McCrory says. “For example, when you think about what security and latency issues can do to the healthcare and financial industries, it is not hard to imagine the amount of damage it causes. Our initiative to address data gravity allows us to help place service providers in an opportune position to gain competitive advantages. The key is positioning their capabilities adjacent to enterprise data at points of business presence. By partnering with a multi-tenant datacentre provider, they will be able to gain the direct interconnections that bring their capabilities to the data.”