Comparing Cloud vs Edge efficiencies using context-aware, location-optimized workload scheduling

Edge computing aims to improve performance by minimizing latency between client and server, and the dynamic scheduling and routing this requires also creates substantial opportunities to conserve energy. As workloads move from a handful of huge datacenters to many distributed points of presence, we can weigh the trade-offs and benefits of sustainable, efficient deployments far more directly. In this talk, we discuss how the mechanisms required to deliver a cost-efficient edge compute experience also support energy-efficiency and conservation objectives. We present a series of experiments illustrating the performance and efficiency gains achieved by context-aware, location-optimized workload scheduling across a distributed edge versus a single cloud instance.
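
To make the scheduling idea concrete, the sketch below shows one hypothetical form a context-aware, location-optimized placement decision could take: each candidate site (a central cloud region or an edge point of presence) is scored by blending an estimated client latency with an energy penalty, and the workload is placed at the lowest-scoring site. All site names, weights, and numbers here are illustrative assumptions, not the scheduler or figures from the talk's experiments.

```python
# Hypothetical sketch of context-aware, location-optimized placement.
# Every name, weight, and number below is illustrative only.
from dataclasses import dataclass
import math


@dataclass
class Site:
    name: str                      # an edge PoP or a central cloud region
    lat: float                     # site latitude in degrees
    lon: float                     # site longitude in degrees
    grid_carbon_g_per_kwh: float   # assumed carbon intensity of the local grid
    load: float                    # current utilization in [0, 1]


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance, used as a crude proxy for network latency."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def score(site, client_lat, client_lon, latency_weight=1.0, energy_weight=1.0):
    """Lower is better: blend estimated latency with an energy/carbon penalty."""
    distance = haversine_km(client_lat, client_lon, site.lat, site.lon)
    est_latency_ms = distance / 100.0              # rough ~100 km per ms proxy
    energy_penalty = site.grid_carbon_g_per_kwh * (1.0 + site.load)
    return latency_weight * est_latency_ms + energy_weight * energy_penalty / 100.0


def place(client_location, sites):
    """Pick the candidate site with the lowest blended score for this client."""
    client_lat, client_lon = client_location
    return min(sites, key=lambda s: score(s, client_lat, client_lon))


if __name__ == "__main__":
    sites = [
        Site("cloud-central", 39.0, -77.5, grid_carbon_g_per_kwh=420, load=0.3),
        Site("edge-pop-ams", 52.4, 4.9, grid_carbon_g_per_kwh=330, load=0.6),
        Site("edge-pop-osl", 59.9, 10.8, grid_carbon_g_per_kwh=30, load=0.2),
    ]
    # A client in Berlin: nearby, low-carbon sites score best.
    print(place((52.5, 13.4), sites).name)
```

The single weighted score is only one possible policy; a real scheduler could instead use measured round-trip times, per-site power models, or hard latency constraints with energy used as a tie-breaker.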
