Equinix: Data is pouring over the edge
Thu, 19th Oct 2017

The center of data gravity is moving.

Enterprise data (structured or unstructured, at rest or in transit) no longer sits only within your centralized corporate data center. It also lives at the furthest edges of your enterprise, where a growing number of interconnected things, digital platforms and real-time user interfaces generate ever more data traffic.

A new Gartner research note, “Maverick Research: The Edge Will Eat the Cloud,” describes this phenomenon in the following way: “The architecture of IT will flip upside down, as data and content move from centralized cloud and data centers to the edge, pulling compute and storage with it.”

Data is increasingly becoming the currency of the global digital economy.

And, it is the direct, private data exchange between and among businesses that is driving global economic growth.

For enterprises to monetize their data and extract the most value from it, they must re-think their data architectures to find the best ways to aggregate, exchange and manage data at the edge - and at scale.

This search ultimately creates a new demand for direct and secure Interconnection to business partners, value chains and digital ecosystems.

As a result, the amount of Interconnection Bandwidth capacity digital businesses require is increasing exponentially, well beyond what traditional connectivity approaches can achieve.

The “Global Interconnection Index,” published by Equinix, projects that Interconnection Bandwidth capacity will grow at a 45% CAGR, reaching over 5,000 Tbps by 2020 - nearly twice the growth rate, and six times the volume, of global IP traffic.

That is enough Interconnection Bandwidth to process nearly 550,000 electronic payments per minute. Assuming an average value of $50 per payment, that's $27.5M per minute, or $1.65B per hour.

In response to this enormous pull of data gravity at the edge, many enterprises are adopting distributed, edge IT architectures, where they can deliver high-bandwidth, low-latency data Interconnection closer to users, partners and customers for a better end-user experience.

They are also leveraging Interconnection at the edge to integrate complex ecosystems of multiple vendors (cloud, content and network providers, and systems integrators) and digital technologies (e.g., IoT, cloud, big data, analytics), achieving greater data automation for enhanced security and control.

Here's how they're doing it.

Architecting for the digital edge

To architect for the digital edge, you need to localize some data requirements in a digital edge node, balance protection with accessibility, and govern data movement and placement.

Each digital edge node (see the diagram below) is tailored for the local or shared data services at that geographic location, placing you in control of your data and performance.

[Diagram: Digital Edge Node]

Move processing to where the data is

Putting your data proximate to users, applications, analytics, clouds and security controls reduces the latency that delays real-time data collection, access, processing, analysis and protection.

By keeping data local, you can improve performance, insight and value while strengthening protection, compliance and data sovereignty for company and customer data. Many of our customers are leveraging an Interconnection Oriented Architecture (IOA) strategy to realize greater performance, security and control of their data.

They are also extracting greater value and monetizing digital assets as a result of real-time data exchanges between themselves and their business partners and customers.

An IOA framework allows you to better manage and secure data at the digital edge, where commerce, population centers and digital ecosystems meet. It enables you to localize all of your data capabilities in a digital edge node to balance protection with accessibility, and govern data movement and placement.

After you have re-architected your network and security infrastructures as prescribed in the IOA Knowledge Base, you can begin to plan your data implementation.

How you deploy a digital edge node is based on the amount of data processing you need and the types of capabilities you desire. These can include:

  • Securing data traffic with traceability and maintaining control of data at all times
  • Optimizing data placement for performance and governing data by policy
  • Providing multicloud application workloads with secure, low-latency access to data
  • Governing data by policy to stay in compliance with corporate guidelines and government regulations, and to adapt easily to company or regulatory changes (a minimal policy sketch follows this list)
  • Minimizing the risk of data loss, leakage and theft without compromising accessibility
  • Cost-effectively storing petabytes of data across multiple locations, with a single global namespace
  • Leveraging cloud-agnostic data services, with secure multicloud access and reduced cloud data ingress/egress costs
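
To make the policy idea concrete, the sketch below shows one way a placement policy might be expressed and enforced in code. The dataset labels, region names and fields are hypothetical, not part of any Equinix API:

```python
# Hypothetical placement policy for datasets in a digital edge node.
# Dataset labels, region names and fields are illustrative, not an Equinix API.

EDGE_POLICIES = {
    "customer-pii": {"allowed_regions": ["ap-tokyo", "ap-singapore"]},  # sovereignty
    "clickstream": {"allowed_regions": ["ap-tokyo", "ap-singapore", "us-east"]},
}

def placement_allowed(dataset: str, target_region: str) -> bool:
    """Return True only if policy permits placing `dataset` in `target_region`."""
    policy = EDGE_POLICIES.get(dataset)
    if policy is None:
        return False  # default-deny: unclassified data does not move
    return target_region in policy["allowed_regions"]

assert placement_allowed("clickstream", "us-east")
assert not placement_allowed("customer-pii", "us-east")
```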

Within the digital edge node, you can start with data accessibility, security and control - without compromising the critical value propositions clouds and digital ecosystems offer.

For example, running multicloud application workloads doesn't require moving data - just accessing it locally in the edge node over direct, secure, low-latency connections.

This tactic also minimizes the risks of data loss, leakage and theft. In addition, it optimizes business value, enforces regulatory compliance and helps you maintain control of your data at all times.
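
As a rough sketch of that pattern: if the edge node exposes shared data through an S3-compatible object store, a workload running in any cloud can read the data in place over its direct connection instead of copying it into that cloud. The endpoint, bucket and key below are placeholders:

```python
# Read shared data in place from an S3-compatible store at the edge node,
# rather than replicating it into each cloud. Endpoint/bucket are placeholders.
import boto3

edge_store = boto3.client(
    "s3",
    endpoint_url="https://edge-node.example.internal:9000",  # hypothetical edge endpoint
)

# Each cloud workload issues the same call over its direct, private link;
# the data itself never leaves the edge node.
obj = edge_store.get_object(Bucket="shared-datasets", Key="orders/2017-10.parquet")
payload = obj["Body"].read()
```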

Data Blueprint Design Patterns

The data blueprint, which is one of the four IOA blueprints (network, security, data and application), is supported by five data design patterns.

The data blueprint illustrates how to set up a data fabric at the edge.

Together, the following data design patterns map a strategy for planning your distributed, interconnected-data deployments:

Data Cache and Edge Placement

Addresses data performance and sovereignty by placing data caches/copies proximate to the users, systems, applications, analytics or clouds that need them (unless the decision is made to keep data local).

This design pattern also solves for multicloud access by placing data at network intersection points, allowing multiple clouds to operate on the data (e.g., processing, analytics, etc.), while keeping the data in your control and protected by your security policies.
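
A minimal read-through cache makes the pattern concrete, assuming a local edge store, a remote origin and a "keep data local" flag; all of the names are illustrative:

```python
# Minimal read-through cache at an edge node. Stores, keys and the origin call
# are illustrative; KEEP_AT_ORIGIN mirrors the "keep data local" exception above.
local_cache = {}                          # stands in for the edge node's data store
KEEP_AT_ORIGIN = {"regulated/eu-pii"}     # datasets that must never be copied out

def fetch_from_origin(key):
    """Placeholder for a read against the central repository or a cloud."""
    return f"origin-data-for:{key}".encode()

def read(key):
    if key in local_cache:                # cache hit: served at the edge, low latency
        return local_cache[key]
    data = fetch_from_origin(key)
    if key not in KEEP_AT_ORIGIN:         # sovereignty exception: leave no edge copy
        local_cache[key] = data
    return data

read("catalog/products")                  # first read populates the edge cache
read("regulated/eu-pii")                  # readable, but never cached at the edge
assert "catalog/products" in local_cache
assert "regulated/eu-pii" not in local_cache
```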

Distributed Data Repository

Solves data availability issues by placing a global data namespace over the edge nodes and extending that into the various clouds, as appropriate (i.e., distributed data lakes).
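
One way to picture a global data namespace is as a thin routing layer that maps each logical path to whichever edge node or cloud actually holds the data. The node names and mapping scheme below are hypothetical:

```python
# Toy global-namespace router: one logical path space over many locations.
# Node names and the prefix-mapping scheme are hypothetical.

NAMESPACE_MAP = {
    "/datalake/apac/": "edge-node-tokyo",
    "/datalake/emea/": "edge-node-frankfurt",
    "/datalake/cloud/": "cloud-object-store",   # namespace extended into a cloud
}

def resolve(logical_path):
    """Map a single global path to the location that stores it."""
    for prefix, location in NAMESPACE_MAP.items():
        if logical_path.startswith(prefix):
            return location
    raise KeyError(f"no location registered for {logical_path}")

print(resolve("/datalake/apac/media/2017/master.mp4"))  # -> edge-node-tokyo
```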

Edge Analytics and Streaming Flows

Leverages edge nodes for data collection and aggregation, with local event processing to optimize streaming for rapid time to insight.
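
A compact sketch of that local processing: raw readings are reduced to per-window summaries at the edge before anything is forwarded upstream, which is what shrinks the stream. The event shape and the 60-second window are assumptions:

```python
# Aggregate raw events locally per time window; forward only the summaries.
# The event shape and the 60-second window are assumptions for illustration.
from collections import defaultdict

WINDOW_SECONDS = 60

def aggregate(events):
    """events: iterable of (timestamp, sensor_id, value) tuples."""
    windows = defaultdict(list)
    for ts, sensor_id, value in events:
        windows[(int(ts) // WINDOW_SECONDS, sensor_id)].append(value)
    # One summary record per (window, sensor) replaces many raw readings.
    return {key: (min(vals), max(vals), sum(vals) / len(vals))
            for key, vals in windows.items()}

raw = [(0, "s1", 1.0), (10, "s1", 3.0), (70, "s1", 5.0)]
print(aggregate(raw))  # {(0, 's1'): (1.0, 3.0, 2.0), (1, 's1'): (5.0, 5.0, 5.0)}
```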

Data Exchanges and Data Integration

Combines data from disparate sources into meaningful information and delivers it as trusted data, which can be monetized and exchanged with multiple partners. Data exchanges support multiple users, processes and applications that want access to the same data (multiple in and multiple out).
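
"Multiple in and multiple out" is essentially a publish/subscribe arrangement over curated datasets. A toy sketch, with hypothetical topic and partner names:

```python
# Toy data exchange: many producers publish into a topic, many partners consume.
# The topic and partner names are hypothetical.
from collections import defaultdict

subscribers = defaultdict(list)       # topic -> list of consumer callbacks

def subscribe(topic, callback):
    subscribers[topic].append(callback)

def publish(topic, record):
    for deliver in subscribers[topic]:    # multiple out: every partner gets a copy
        deliver(record)

subscribe("fx-rates", lambda r: print("bank-a received", r))
subscribe("fx-rates", lambda r: print("retailer-b received", r))
publish("fx-rates", {"pair": "NZD/USD", "rate": 0.71})  # any producer can publish
```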

Data Pipelines and Provenance

Deploys data orchestration and data provenance to facilitate and track data flows and consumption from disparate sources across the data fabric.
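
A minimal provenance sketch: each pipeline step records what it consumed and what it produced, so any output can be traced back to its sources across the data fabric. The record fields and placeholder transforms are assumptions:

```python
# Minimal provenance log: each step records its inputs, output and timestamp,
# so consumption can be traced end to end. Field names are assumptions.
import time

provenance_log = []

def run_step(name, inputs, transform):
    """Apply `transform` to `inputs` and record the lineage of the result."""
    output = transform(inputs)
    provenance_log.append({"step": name, "inputs": list(inputs),
                           "output": output, "at": time.time()})
    return output

# Placeholder transforms that just name their outputs.
cleaned = run_step("clean", ["raw/orders.csv"], lambda i: "clean/orders.parquet")
curated = run_step("join", [cleaned, "ref/customers.parquet"],
                   lambda i: "curated/orders_enriched.parquet")
print(provenance_log)  # full lineage from raw sources to the curated output
```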

We will discuss each of these design patterns, and how to achieve them, in upcoming blogs in this series on deploying data at your digital edge.

In the meantime, visit the IOA Knowledge Base for vendor-neutral blueprints that take you step-by-step through the right patterns for your architecture.

Data at the edge use case: A distributed data repository

As part of its digital transformation roadmap, a global digital media company sought to leverage a multicloud strategy, starting with the IBM Bluemix PaaS, where its global workforce could harness a more agile, customizable application platform.

Control, security, open architecture and performance were critical requirements. The company also wanted to understand consumption trends for cost allocations back to its business units.

The media multinational understood that a high-performance, distributed, data-driven architecture would require a hybrid IT platform, as new applications from its digital supply chain partners were either hosted across multiple clouds or required custom integration.

Fast, secure, direct access to public cloud providers helps the media company manage the end-user experience as more cloud-based applications and workloads are rolled out to its global workforce, while also catering to seasonal high-traffic volumes - something the “best efforts” of the public internet could not deliver.

The company started by placing initial network and cloud aggregation nodes in an Equinix Performance Hub, with direct access to cloud edge nodes via the Equinix Cloud Exchange, in Tokyo and Singapore.

Once stabilized, the team migrated additional workloads to this hybrid IT architecture, and eventually expanded into a distributed data repository within an Equinix Data Hub.

The media company is now actively working on the next phase of its transformation journey - replicating this distributed architecture at additional global sites.

Article by Hisham Muhammad, Equinix blog network