Featuring:
Brian Hopkins, VP, Principal Analyst
Show Notes:
Emerging technologies, like deep learning and neural nets, have revolutionary potential. But they’re hampered by lumbering runtimes and massive power requirements. With connected devices, latency piles up through endless round trips back to the cloud.
Edge computing seeks to make these technologies more efficient by bringing intelligence closer to the place where intelligence is needed. On this episode of What It Means, VP and Principal Analyst Brian Hopkins discusses what edge computing is and how it can unleash the collective power of intelligent devices.
Despite the digital revolution, humans still live in the real world; only now, the real world is rife with computing devices such as smartphones, laptops, and internet of things (IoT) technologies. Currently, if two computing devices need to communicate, they typically take an input, route it back to a cloud or data center for processing, and then return the result to the device.
This is fine for simple tasks, such as sharing a photo via AirDrop. But for complex operations, the processing must happen at the “edge” of the network rather than in a centralized cloud or data center. For example, one day your smartphone may recognize that you need to cross a street and trigger the pedestrian signal for you, but only if your smartphone and the traffic system can address each other directly, avoiding that round-trip latency.
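To make the latency point concrete, here is a minimal Python sketch (not from the episode) that simulates the two paths described above. The function names and latency figures are hypothetical, chosen only to illustrate that every trip back to a data center adds network delay that on-device processing avoids.

```python
import time

# Hypothetical latency figures, for illustration only.
CLOUD_ROUND_TRIP_SECONDS = 0.120   # stand-in for a network hop to a data center and back
EDGE_INFERENCE_SECONDS = 0.005     # stand-in for processing on or near the device


def process_via_cloud(sensor_reading: float) -> float:
    """Simulate routing an input to a cloud or data center and back."""
    time.sleep(CLOUD_ROUND_TRIP_SECONDS)  # simulated network round trip
    return sensor_reading * 2             # simulated processing step


def process_at_edge(sensor_reading: float) -> float:
    """Simulate processing the same input on a nearby edge device."""
    time.sleep(EDGE_INFERENCE_SECONDS)
    return sensor_reading * 2


if __name__ == "__main__":
    reading = 42.0
    for label, handler in [("cloud", process_via_cloud), ("edge", process_at_edge)]:
        start = time.perf_counter()
        handler(reading)
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{label:>5}: {elapsed_ms:.1f} ms per request")
```

Multiply that per-request difference across thousands of interactions between connected devices, and the case for moving intelligence closer to where it is needed becomes clear.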