Quantifying the explosion of enterprise data growth


In 2010, while at Dell DCS, Dave McCrory recognized that the creation and collection of data was a trend worth investigating. As that research progressed, new curiosities led to a larger question: what happens when the sources that create data and the collections of data begin to grow and interact?

The answer begins with viewing data as a planet or an object with sufficient mass. As data builds mass, it will likely attract additional services and applications. This is the same effect gravity has on objects around a planet.

A decade later, Dave McCrory is vice president of growth at Digital Realty and has been integral in developing the inaugural Data Gravity Index study. “For the last several years, I have continuously researched, tested, and tweaked a Data Gravity formula and methodology that would help explain the intensity of Data Gravity,” he explains. “Once I recognized the foundation of the formula (mass times activity), I knew I could finally start quantifying things that I could not before this realization.”

Since joining Digital Realty, McCrory has been working on what was, until now, a secret project called The Data Gravity Index DGx 1.0. “Now data gravity has a methodology, a patent-pending formula, and a decipherable pattern,” he says. “It is the first time data gravity has been measured and quantified for the Global 2000 Enterprises.”

Implications of data gravity

The Data Gravity Index DGx gives insight into just how rapidly data will be created and consumed over the next five years and what this means for the Forbes Global 2000. It provides a data gravity score across 21 global metros to help enterprises understand the implications of the creation, aggregation, and private exchange of enterprise data.

“To effectively provide the first measurement and quantification of data gravity for the Global 2000 Enterprises, we spent 12 months pulling research from multiple authoritative sources,” McCrory adds. “Additionally, we conducted a comprehensive cause and effect study. In the end, we were able to develop a formula to measure, quantify, and determine the implications of the explosion of enterprise data growth.”

Calculating Data Gravity requires a methodology that includes data mass, data activity, bandwidth, and latency. The Data Gravity Index DGx implements a patent-pending formula which quantifies and predicts the continuous creation of data across 21 metros globally. Calculated in gigabytes of data per second, the score provides a relative proxy for measuring data creation, aggregation, and processing.
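
The full patent-pending formula is not published in the article, but the stated inputs hint at the general shape of the calculation. The short Python sketch below is purely illustrative and rests on assumptions: a relative score that grows with data mass, data activity, and bandwidth and falls as latency rises, with the function name, units, and exponents all hypothetical rather than taken from the DGx methodology.

def data_gravity_score(data_mass_gb: float,
                       data_activity_gb_per_s: float,
                       bandwidth_gb_per_s: float,
                       latency_s: float) -> float:
    """Hypothetical relative data gravity score for a single metro.

    data_mass_gb           -- volume of enterprise data stored (GB)
    data_activity_gb_per_s -- rate at which that data is created and processed (GB/s)
    bandwidth_gb_per_s     -- network bandwidth available to the data (GB/s)
    latency_s              -- network latency to the data (seconds)
    """
    if latency_s <= 0:
        raise ValueError("latency must be positive")
    # "Mass times activity" is the stated foundation of the formula; here
    # bandwidth is assumed to amplify the effect and latency to dampen it.
    return (data_mass_gb * data_activity_gb_per_s * bandwidth_gb_per_s) / (latency_s ** 2)

# Example: the same data and network capacity score far higher when latency is low.
print(data_gravity_score(1_000_000, 50, 10, 0.005))
print(data_gravity_score(1_000_000, 50, 10, 0.050))

Under these assumed exponents, halving the latency quadruples the score, which is consistent with the article's emphasis on keeping applications and services close to the data.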

The attraction of data

But what exactly is data gravity? McCrory defines it as the effect that attracts large sets of data or highly active applications/services to other large sets of data or highly active applications/services, the same way gravity attracts planets or stars. “Smart companies would try to leverage the effect of data growth and attraction in the cloud,” he adds. “However, most large-scale companies do not fully appreciate how quickly their data is growing. They also do not recognize how many things generate data creation activity, such as sensors, systems, employees, customers and the applications/services that interface with them.”

All of these elements must operate as quickly, efficiently, and securely as possible. Doing so requires being close to the sources of data creation, processing, aggregation and exchange, and enrichment. Enterprises cannot expect optimal results when large volumes of data sit on slow networks located far away.

“If you are working with large data quantities and/or high levels of processing activity, then the other elements of the system must be brought closer to them,” McCrory says.

Software is not eating the world anymore; data is eating the world, and it will continue to do so for the foreseeable future. Specifically, five macro trends are amplifying data gravity: enterprise data stewardship, mergers and acquisitions, digitally enabled interactions, data localization, and cyber-physical systems.

“These factors make up the perfect storm for this explosion of data,” McCrory explains. “Data gravity inhibits enterprise workflow performance, raises security concerns, and increases costs, all complicated by regulatory requirements and other artificial constraints. With data gravity, the laws of physics and IT intersect to provide a proxy for a new age of business architectures that enterprises will be driven to adopt, and service providers will be pressed to support.”

Data-centric architecture

Data Gravity is forcing a new data-centric architecture which inverts traffic flow to bring users, networks, and clouds to privately hosted enterprise data. With this new architecture, data gravity barriers are removed, and new capabilities are unlocked.

For the Global 2000 Enterprises, data gravity means the multi-tenant datacentre must be elevated to a whole new level. It must be a secure, neutral meeting place for the private hosting of data that is interconnected to public services, users, partners, employees, and things, globally.

“No industry is immune from the disruptions and barriers created by data gravity,” McCrory says. “For example, when you think about what security and latency issues can do to the healthcare and financial industries, it is not hard to imagine the amount of damage it causes. Our initiative to address data gravity allows us to help place service providers in an opportune position to gain competitive advantages. The key is positioning their capabilities adjacent to enterprise data at points of business presence. By partnering with a multi-tenant datacentre provider, they will be able to gain the direct interconnections that bring their capabilities to the data.”
