At this point you’ve certainly heard rumblings of the promises of edge computing and may wonder whether it’s right for your organization and how to assess its viability. We think of edge as a decentralized extension, or evolution, of cloud computing, enabling many of the same technologies and applications to be used in locations where that wasn’t possible before.
I believe that the promise of edge computing and the opportunity it unlocks is valuable enough to be a major driver in our industry over the next decade. Here’s how to think about implementing edge computing in your organization.
Since edge is an evolving space, exploration of edge computing should start as soon as possible, and we recommend a phased implementation approach to help deal with change as it comes.
Many of the companies we work with that delayed their cloud initiatives found that the delay contributed to their falling behind from a technology standpoint. Given the choice, it’s better to deal with some sharp edges — no pun intended — than to fall behind your competitors in reaping the benefits.
Edge requires careful life cycle analysis at the beginning of the journey, rather than as an afterthought. The decentralized nature of edge computing raises the stakes and amplifies any potential problems you would have in other computing scenarios. So as you begin, keep in mind issues such as adaptable infrastructure, security and how hardware and software life cycles align.
The Value of Adaptability, Linux and Open Source
Twenty-five years ago, innovation was a slow, laborious, bug-filled crawl — with any advances crippled by proprietary technology. The breakthrough that paved the way for the continuous, rapid pace of innovation possible today was Linux.
We need to heed the lessons on adaptability from the early days of private cloud adoption and, more recently, Linux containers. Countless engineering hours were lost in slash-and-burn-style rebuilds of infrastructure. The logistics at the edge make it impractical to have the same learn-as-we-go freedom we had inside the datacenter.
As edge computing is likely to evolve quickly, the architectures that endure will be the ones that can adapt without a total tear-down. Adaptable architectures will also be well-positioned to take advantage of newer capabilities as they become available.
Remember, Linux has a strong history in the embedded world and is adept at tackling a number of the historic challenges in this space. The more immediate impact is being felt at the distribution level. Our distributions, for example, have objectives around minimization, and we expect to see more low-level enablement work on the security side for edge happening in the near future.
For edge computing to reach its full potential, we need it to be completely driven by open source and open standards. What we’ve learned from previous technology waves is that while most proprietary technologies have their merits, they miss out on the multipliers that come from a thriving community.
Security is a longer topic on its own, but given that we’re giving up many of the security barriers that are easy to take for granted in the datacenter and cloud (physical access, networking perimeter, security groups/policy, etc.), we are going to rely more and more on lower-level hardware and operating system security.
No company wants to risk a common vulnerability across a hundred thousand nodes. This makes it our job to keep enterprise deployments from following the pattern of some consumer-grade devices that are long out of software maintenance.
For edge computing innovation, we need to think more about how we create sustainable solutions and technologies, given that many deployments will require a longer life cycle and are more tightly bound to hardware and equipment refreshes. The path of innovation leads from Linux to and through the network edge. Companies that follow this approach will be better positioned to leverage the promise and power of the edge while avoiding fragmentation and lock-in.