Comparing Cloud vs Edge efficiencies using context-aware, location-optimized workload scheduling

Edge computing delivers performance benefits by minimizing latency between client and server, and the dynamic scheduling and routing it requires also offers substantial opportunities to conserve energy. As we move from a few huge datacenters to a diverse population of numerous, distributed points of presence, we can weigh the trade-offs and benefits of sustainable, efficient deployments more concretely. In this talk, we will discuss how the mechanisms required to deliver a cost-efficient edge compute experience also support energy-efficiency and conservation objectives. We present a series of experiments illustrating the performance and efficiency gains achieved by using context-aware, location-optimized workload scheduling across a distributed edge versus a single cloud instance.
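To make the idea concrete, here is a minimal sketch of what context-aware, location-optimized scheduling can look like: route each workload to the lowest-latency edge node that still has capacity, falling back to a central cloud instance otherwise. The node names, latency figures, and `pick_node` helper are hypothetical illustrations, not the scheduler presented in the talk.

```python
# Toy latency-aware scheduler. All names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class EdgeNode:
    name: str
    rtt_ms: float        # measured round-trip time from the client
    free_capacity: int   # available workload slots

def pick_node(nodes, needed_slots=1):
    """Route the workload to the lowest-latency node with enough capacity."""
    candidates = [n for n in nodes if n.free_capacity >= needed_slots]
    if not candidates:
        return None  # nothing available anywhere
    return min(candidates, key=lambda n: n.rtt_ms)

nodes = [
    EdgeNode("edge-nyc", rtt_ms=8.0, free_capacity=0),
    EdgeNode("edge-chi", rtt_ms=21.0, free_capacity=4),
    EdgeNode("cloud-central", rtt_ms=95.0, free_capacity=100),
]
print(pick_node(nodes).name)  # edge-chi: nearest node with spare capacity
```

A real scheduler would weigh more context than round-trip time alone, e.g. current utilization, energy mix, and data-locality constraints, but the selection step follows this same shape.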