Edge, Data Centers, and Cloud


Twenty-nine billion. That’s the latest projection for the number of connected devices on the global network by 2022, and roughly 18 billion of those devices will be related to the Internet of Things.

That means 18 billion nodes on the network, all vying for critical air time to communicate the valuable data being captured. For some, the distance that data will need to traverse the network will be much shorter than for others, thanks to edge computing. Envision a mesh network of micro data centers with the ability to process and store data collected from intelligent devices and then push it straight to a central data center or a cloud repository, thus reducing backhaul traffic. Edge computing moves computation out of the centralized cloud and closer to where data is generated.
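The backhaul-reduction idea above can be sketched in a few lines. In this minimal, hypothetical example (the function name and window size are illustrative, not from any specific edge platform), an edge node collapses raw sensor samples into per-window summaries and forwards only those, rather than streaming every reading to a central data center:

```python
from statistics import mean

def aggregate_readings(readings, window=5):
    """Collapse raw sensor readings into per-window summaries.

    Instead of backhauling every sample to a central data center or
    cloud repository, the edge node forwards one summary per window.
    """
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "count": len(chunk),
            "mean": mean(chunk),
            "max": max(chunk),
        })
    return summaries

# 20 raw temperature samples become 4 summaries: a 5x reduction
# in the number of messages sent upstream.
raw = [20.1, 20.3, 20.2, 20.5, 20.4,
       21.0, 21.2, 21.1, 21.3, 21.5,
       22.0, 22.1, 22.3, 22.2, 22.4,
       23.0, 23.1, 23.2, 23.4, 23.3]
print(len(aggregate_readings(raw)))  # 4
```

Real edge stacks apply the same pattern with richer processing (filtering, inference, compression), but the economics are the same: local computation trades cheap cycles at the edge for expensive bandwidth to the core.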

The true appeal of edge intelligence comes in moments when the idea of “real-time” isn’t just hyperbole. Think telehealth applications communicating life-or-death data, or remote diagnostics on an oil rig during an emergency. And 5G accelerates the business case for edge computing as telecom providers look to add micro data centers located either in or adjacent to their 5G towers.

Edge computing is a natural solution for mitigating latency, simply because data does not need to travel across a network to a distant data center or cloud. In fact, some in the industry are even touting "single-digit millisecond latency" with new edge computing networks.
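A back-of-the-envelope calculation shows why the single-digit-millisecond claim is plausible. This sketch considers only propagation delay over fiber (signals travel at roughly two-thirds the speed of light in glass, about 200 km/ms); the distances are hypothetical, and real round trips add queuing, switching, and processing time on top:

```python
FIBER_SPEED_KM_PER_MS = 200.0  # ~2/3 the speed of light, in glass fiber

def propagation_rtt_ms(distance_km):
    """Best-case round-trip propagation delay over fiber,
    ignoring queuing, switching, and processing delays."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# Hypothetical distances: an edge PoP 50 km away vs. a regional
# cloud data center 2,000 km away.
print(propagation_rtt_ms(50))    # 0.5 ms -- single-digit territory
print(propagation_rtt_ms(2000))  # 20.0 ms -- before any other overhead
```

Propagation delay is a hard physical floor: no amount of server-side optimization recovers it, which is why shortening the distance itself is the only lever left.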

Bringing such public cloud capabilities to the edge of the network can be achieved in multiple ways. One is a custom software stack emulating the cloud services running on existing hardware, often referred to as the device edge. Another, referred to as the cloud edge, involves extending the public cloud to multiple point-of-presence (PoP) locations. The method chosen simply comes down to the use case.

Major cloud providers like AWS and Microsoft are launching products aimed at the edge. And these companies are spending big money on data centers across the globe to expand cloud capacity. Some reports estimate that cloud companies and data center providers together spent $20 billion in 2017 to purchase properties to house their computer servers, a record that exceeds their spending over the previous three years combined.

Data centers come in multiple forms, from fully distributed facilities down to micro edge data centers, which pull data further out to the edge where it is being produced and consumed. This is vital as we look at new markets like autonomous vehicles, where a bevy of computational, regulatory, and compliance data needs to be processed at a moment’s notice, at all times.

Within those data centers, the total cost of ownership continues to come down, with the footprint of equipment continuing to shrink thanks to virtualized infrastructure. Virtualized high-performance compute and storage platforms, coupled with standardized software and processors, not only reduce the physical infrastructure needed to operate the network but also lower IT management costs. Even satellite providers are moving toward a virtualized hub infrastructure with the ability to scale up and down on demand, with some even running intelligent gateways that can support a range of hub-side applications.

Overall, strategies that fuel the way in which data is stored and transmitted will remain in constant evolution. TIA’s working groups, training opportunities, business networking, videos and services are ready to help you navigate the developing data center, edge computing and cloud arena.

If It’s Edge, It’s Containers. But Not Exclusively!

New applications, services and workloads increasingly demand a different kind of network architecture. What are the key components of that network? TIA NOW’s Clarence Reynolds discusses networking at the edge with Ildiko Vancsa, Ecosystem Technical Lead at the OpenStack Foundation; Chris Price, President of Ericsson Software Technology; and Beth Cohen, Cloud Networking Product Manager at Verizon.

The Road to 1 Trillion IoT Devices Begins at the Edge

Today’s cloud is not built to support the demands of tomorrow’s network infrastructure where the edge of the network will be lined with 1 trillion intelligent devices. So what do we need to begin building tomorrow’s network today? Bob Monkman, Director of Networking Software Strategy at Arm, joins TIA NOW to discuss the path forward.

Brain Meets Brawn in the Edge vs. Cloud Debate

What are the fundamental challenges of edge computing vs. cloud computing? Are there benefits to having real-time apps at the edge? Said Ouissal, CEO and Founder of ZEDEDA, joins TIA NOW to discuss the mutually beneficial coexistence of both edge and cloud.

An Awakening for ONAP

The Open Network Automation Platform (ONAP) is on a journey to maturity, seeking to fortify its resiliency, scalability and security. What challenges will ONAP encounter? Helen Chen, Principal Architect at Huawei, and Jason Hunt, Executive Software Architect at IBM, speak with TIA NOW about ONAP’s path forward.

Opening Up About Open Source

What can you expect when you allow vendors to leverage your external networking assets? What are the benefits of open specs in a network environment? Cliff Grossner, Senior Research Director & Advisor for the Cloud and Data Center Research Practice at IHS Markit, joins us with answers, and the significant breakthroughs that have happened so…

Getting Real About AI in Virtual Networks

How will network transformation impact advances in AI? Is the industry prepared or unprepared to take on these challenges? TIA NOW speaks with Arpit Joshipura, GM of Networking and Orchestration at the Linux Foundation and Manish Vyas, President of Communications Business & Chief Executive of Network Services at Tech Mahindra about preparing virtual networks for…

Data Centers: Ribbon Technology

Vernon Yow of Sumitomo Electric Lightwave talks about ribbon technology and how it enables improved cabling. Yow speaks further about the applications of ribbon technology for data centers and how ultra-high-fiber-count ribbon cables meet the growing need for high fiber density in data centers.

Compute Power at the Edge of the Network

Scott Armul, VP and GM of DC Power and Outside Plant Products at Vertiv, spoke with TIA NOW about the balance of using legacy networks and new infrastructures. Armul goes on to say that the trend of compute power being pushed to the edge of the network allows Vertiv to reduce latency and improve the…

Supporting 5G: The Evolving Service Environment

Analysts from IDC spoke with TIA NOW about the tough issues that the communications technology industry is currently tackling, which were covered at the TIA Connectivity Jam in Dallas, TX. These issues span data management, edge computing, connected devices, network benchmarking, and artificial intelligence.

Connecting Core to the Powerful Edge

On this segment of TIA NOW from TIA’s Connectivity Jam in Dallas, TX, we are joined by industry experts on the subject of connecting the core to the powerful edge, reflecting on industry use cases. Joining the discussion are Michael Hites, CIO, University of Illinois System; Mark Pyatt, Senior Director, Operational Integrity, Global Oil…

Data Centers Redefined: Virtual and Physical

From service operators to the enterprise to application providers, data centers are evolving as business models shift, customer demands grow and technology capabilities develop. On this segment to tell us more about the changing role of data centers is Hugh Carspecken, CEO of DartPoints.

SDN in Data Centers

Leveraging more programmable networks based on SDN principles that support an operator's need to grow their business is what these panelists discuss with TIA NOW from TIA's Connectivity Jam.