Comparing Cloud vs Edge efficiencies using context-aware, location-optimized workload scheduling

Edge computing delivers its performance gains by minimizing latency between client and server, and the dynamic scheduling and routing this requires also offers significant opportunities to conserve energy. As we move from relatively few, huge datacenters to numerous, widely distributed points of presence, we can weigh the trade-offs and benefits of sustainable, efficient deployments far more precisely. In this talk, Kurt will discuss how the mechanisms required to deliver a cost-efficient edge compute experience also support energy-efficiency and conservation objectives. He will present a series of experiments illustrating the performance and efficiency gains achieved with context-aware, location-optimized workload scheduling across a distributed edge versus a single cloud instance.
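
To make the comparison concrete, here is a minimal sketch of what context-aware, location-optimized scheduling could look like: each candidate point of presence (PoP) is scored on estimated client latency and local grid carbon intensity, and the workload is routed to the lowest-cost site. The PoP names, reference values, and weighting below are illustrative assumptions, not figures from Kurt's experiments.

```python
from dataclasses import dataclass

# Illustrative sketch: score each candidate point of presence (PoP) on
# estimated client round-trip latency and grid carbon intensity, then
# route the workload to the lowest-scoring site. All names and numbers
# here are hypothetical.

@dataclass
class PointOfPresence:
    name: str
    rtt_ms: float            # estimated round-trip latency to the client
    carbon_g_per_kwh: float  # grid carbon intensity at the site


def score(pop: PointOfPresence, latency_weight: float = 0.7) -> float:
    """Blend normalized latency and carbon intensity into one cost value."""
    # Normalize against rough reference values so the two terms are comparable.
    latency_term = pop.rtt_ms / 100.0           # ~100 ms: "distant cloud" RTT
    carbon_term = pop.carbon_g_per_kwh / 500.0  # ~500 g/kWh: high-carbon grid
    return latency_weight * latency_term + (1 - latency_weight) * carbon_term


def schedule(candidates: list[PointOfPresence]) -> PointOfPresence:
    """Pick the candidate PoP with the lowest combined latency/carbon cost."""
    return min(candidates, key=score)


if __name__ == "__main__":
    candidates = [
        PointOfPresence("central-cloud", rtt_ms=95.0, carbon_g_per_kwh=420.0),
        PointOfPresence("edge-metro-a", rtt_ms=12.0, carbon_g_per_kwh=300.0),
        PointOfPresence("edge-metro-b", rtt_ms=18.0, carbon_g_per_kwh=90.0),
    ]
    chosen = schedule(candidates)
    print(f"Routing workload to {chosen.name} (score={score(chosen):.3f})")
```

In practice the weighting between latency and carbon intensity would be tuned per workload: latency-sensitive requests favor the nearest PoP, while batch jobs can shift toward lower-carbon sites.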