Like many IT innovations, edge computing began with engineers extending existing technology to address a growing need. The concept isn’t new; distributed computing has been around for decades. But as standards began to converge and edge hardware started making the rounds at trade shows, the hype machine saw an opportunity. It amplified edge’s considerable promise in reducing latency, offering software-defined deployment, decreasing cloud networking costs and more. As is too often the case, though, the bold feature bullets ignored the production concerns businesses must address, including edge computing’s rough spots and the additional operations complexity it introduces.
Of course, edge computing will survive a little overexcited promotion, just like many of the once-improbable technologies before it. People used to say, “What? Abstract all my data center applications away from the hardware as virtual servers? Impossible!” A decade later, we can’t imagine how we’d deliver traditional enterprise services, cloud computing, online retail, media streaming and everything in between without exactly this. Virtualization survived its awkward hype adolescence, and edge computing will, too. The needs edge computing addresses are only growing.
Thanks to engineers and operations teams, the edge distributed model is moving toward practical use. It’s proving itself capable of meeting requirements for new levels of network performance through reduced latency, scalability and, most importantly, manageability. For some businesses, it’s even reducing costs over the long haul.
Not All Data Is Created Equal
With the proliferation of connected devices and a growing focus on 5G-enabled technology, tech pros should set aside their natural reluctance, wade through the edge hype and consider edge computing a genuine possibility. Its adoption is following the rise of the emerging technologies and applications that take best advantage of it: 5G, augmented reality, autonomous vehicles, IoT and smart manufacturing. These environments require not only low upstream latency but also high-performance compute and timely delivery of results. Light only travels so fast, and that hard limit is pushing infrastructure closer and closer to consumers for faster, more seamless processing in the form of brand-differentiating user experiences.
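To make the physics concrete, here is a minimal back-of-envelope sketch; the distances are illustrative assumptions, not vendor figures. Light in optical fiber covers roughly 200 kilometers per millisecond, so a regional cloud site 1,500 kilometers away can never respond in less than about 15 milliseconds round trip, while an edge node 50 kilometers away keeps that floor under a millisecond, before routing, queuing or processing are even counted.

# Propagation delay alone, ignoring routing, queuing and processing time.
# Assumes light travels ~200,000 km/s in optical fiber (illustrative figure).
FIBER_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

print(round_trip_ms(1500))  # regional cloud site: ~15.0 ms
print(round_trip_ms(50))    # nearby edge node: ~0.5 ms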