Fog computing is more than just cloud computing meeting edge networks, but for practitioners more interested in results than in new buzzwords, the question is whether fog can deliver on its IoT promise of real-time data storage, processing and analysis.
One thing all new technologies share is a lack of consensus about what exactly defines them. Ask some folks what “fog computing” is, and they’ll tell you it’s synonymous with edge networks. That’s the claim made in a July 29 post on Nanalyze, which traces fog computing concepts back to the days of mainframes and dumb terminals.
By contrast, ReadWrite’s Ryan Matthew Pierson explained in an August 5, 2016 article the many ways fog computing differs from edge networks. The primary distinction, according to Pierson, is that edge networks connect sensors to programmable automation controllers (PACs) for data storage and processing, while fog computing requires that the sensor data be transferred to a local gateway at the end of the cloud “chain.”
Pierson claims the fog computing model is more scalable than edge networks because each gateway has more power to “make more complex and dynamic decisions.” The gateway approach also accommodates a range of data at various flow rates. However, he said, this design creates more potential failure points.
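To picture what a gateway-level decision looks like in practice, consider a minimal sketch (all names, rates and thresholds here are hypothetical illustrations, not anything Pierson specifies) of a fog gateway correlating two sensor streams that arrive at very different rates:

```python
from collections import deque

# Hypothetical gateway state: recent samples from two streams arriving at
# very different rates (say, vibration at 10 Hz, temperature once a minute).
vibration_window = deque(maxlen=50)
latest_temperature = None

def on_vibration(sample):
    vibration_window.append(sample)
    check_cross_stream()

def on_temperature(sample):
    global latest_temperature
    latest_temperature = sample
    check_cross_stream()

def check_cross_stream():
    """A decision that needs both streams at once: something a per-sensor
    PAC, seeing only its own feed, could not make on its own."""
    if latest_temperature is None or len(vibration_window) < 10:
        return
    avg_vib = sum(vibration_window) / len(vibration_window)
    if avg_vib > 5.0 and latest_temperature > 80.0:
        print("gateway decision: shut down pump (heat and vibration together)")

for v in [1.0] * 9 + [9.0] * 10:
    on_vibration(v)
on_temperature(85.0)   # prints the shutdown decision
```

A single PAC wired to one sensor sees only its own feed; the gateway, sitting one hop up, can weigh both streams before acting.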
Fog Computing = Edge Networks + Analytics?
As you might expect, there’s a third, middle-ground approach that defines fog computing as edge networks to which real-time analytics have been added at the node level. That’s how FogHorn Systems CEO David King explained the technology in a Sept. 13 article by CIO’s Stephen Lawson. FogHorn’s Lightning software platform is designed to deliver real-time analytics to IoT gateways, as well as to existing industrial products, including hard-wired controllers deeply embedded in equipment such as locomotives. Some of these embedded controllers are based on proprietary OSes that have been around for decades.
The key feature of Lightning’s design, according to King, is its close ties to cloud servers, which ensure that the data collected at the edge of the network is available to analytics tools that reside in the cloud. CIO’s Lawson quoted Machina Research analyst Aapo Markkanen as stating that linking IoT data sources to cloud analytics is the only way to get the “big picture” view required to ensure accurate analyses.
In an Aug. 19 interview with App Developer Magazine’s Richard Harris, King explained how fog computing addresses the shortcomings of earlier distributed edge networks in terms of bandwidth use and the resulting “performance-killing application latency.” The FogHorn system is designed to move the “intelligence layer” closer to the machine data, resulting in faster and more cost-effective analysis. King claims the company’s software makes possible “real-time remote monitoring, asset optimization, proactive maintenance and operational intelligence applications.”
Fog computing adds a new layer that connects mobile, IoT (sensors, embedded controllers), and vehicle control systems to the depth and breadth of data resources in the cloud. Source: SpinalCom via SlideShare
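As a rough sketch of what moving the intelligence layer toward the machine data means for bandwidth, here is a toy gateway loop (the sensor, threshold and publishing function are invented for illustration and have nothing to do with FogHorn’s actual Lightning API) that evaluates every reading locally and sends only out-of-range events upstream:

```python
import json
import random
import time

VIBRATION_LIMIT = 7.0  # hypothetical alert threshold, in mm/s

def read_sensor():
    """Stand-in for a real machine-data source, e.g. a vibration probe."""
    return {"ts": time.time(), "vibration_mm_s": random.uniform(0.0, 10.0)}

def forward_to_cloud(event):
    """Stand-in for an upstream publish, e.g. MQTT or HTTPS to a broker."""
    print("-> cloud:", json.dumps(event))

def gateway_loop(iterations=100):
    forwarded = 0
    for _ in range(iterations):
        reading = read_sensor()
        # The "intelligence layer" runs here, next to the machine data:
        # only readings that breach the limit ever cross the WAN link.
        if reading["vibration_mm_s"] > VIBRATION_LIMIT:
            forward_to_cloud({"alert": "vibration_high", **reading})
            forwarded += 1
    print(f"forwarded {forwarded} of {iterations} readings upstream")

gateway_loop()
```

Everything below the threshold dies at the edge, which is precisely where the bandwidth savings, and the latency savings, come from.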
Many of the features promised in fog computing systems, including remote monitoring, asset optimization and proactive maintenance, are already available in best-in-class monitoring tools, which let you track SSH and agent-based connectivity to apps running in private, public and hybrid clouds.
Fog Computing Delivers Analytics for Time-Critical Applications
The cloud’s centralized nature has never been a good match for the short response times required by data processing and analytics in IoT environments. Ben Dickson wrote in an Aug. 2 TechCrunch article that fog’s decentralized architecture moves analysis tools and other resources closer to the edge, where data is collected and acted on. Telemedicine, autonomous vehicle control and remote security systems are examples of time-critical applications requiring near-real-time analytics.
Another advantage of keeping data at the edge of a widely distributed network is privacy and security: sensitive data stays off the public internet, where it would be exposed and subject to strict privacy regulations. Dickson presented the “fog layer” of the network as offering the compute, storage and network resources required to “mimic cloud capabilities.” By ingesting data locally, you can analyze it and convert it into action almost instantly.
The complete IoT model extends from the data source (human or machine) to the data center in the cloud, with much analytics-driven decision making occurring in the “fog layer.” Source: Moor Insights & Strategy, via Forbes
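Dickson’s ingest-analyze-act loop can be sketched in a few lines; assume a hypothetical patient-monitoring fog node (none of these names or limits come from his article):

```python
PULSE_LOW, PULSE_HIGH = 50, 120  # hypothetical safe range, beats per minute

def act_locally(bed, pulse):
    """Raise the bedside alarm immediately; no internet round trip required."""
    print(f"ALARM at {bed}: pulse={pulse}")

def publish_summary(summary):
    """Only a de-identified summary ever leaves the fog layer."""
    print("-> cloud:", summary)

def handle_window(bed, pulses):
    alerts = 0
    for pulse in pulses:
        if not PULSE_LOW <= pulse <= PULSE_HIGH:
            act_locally(bed, pulse)  # action happens on-site, in milliseconds
            alerts += 1
    # Raw, identifiable vitals never leave the building; the cloud sees counts.
    publish_summary({"ward": "ICU-2", "window_s": 60, "alert_count": alerts})

handle_window("bed-7", [72, 68, 131, 75, 44])
```

The alarm fires in the time a local function call takes; the cloud eventually receives a de-identified count, never the raw vitals.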
Dickson pointed out that the fog layer serves as a complement to cloud computing rather than as a replacement. In fact, fog computing couldn’t exist without all the data, tools and other resources now residing in the cloud. The larger data sets and deeper analyses being done in the cloud serve as the foundation for the fog layer’s quick-turnaround analytics. The fog’s capabilities are constantly being updated and enhanced based on the big-picture view the cloud affords.
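One way to picture that feedback loop is a sketch in which the slow, deep analysis in the cloud periodically refreshes the thresholds the fog layer applies in real time (the model structure and version scheme here are invented for illustration):

```python
# Hypothetical: the cloud recomputes fleet-wide thresholds from large
# historical data sets; the fog node simply caches the latest result.
cloud_model = {"vibration_limit": 7.0, "version": 1}

def fetch_model_from_cloud():
    """Stand-in for an HTTPS pull of the latest cloud-trained parameters."""
    return dict(cloud_model)

class FogNode:
    def __init__(self):
        self.model = fetch_model_from_cloud()

    def refresh(self):
        latest = fetch_model_from_cloud()
        if latest["version"] > self.model["version"]:
            self.model = latest  # fog analytics improve as the cloud learns

    def is_anomalous(self, vibration):
        # Fast, local decision using the most recent cloud-derived limit.
        return vibration > self.model["vibration_limit"]

node = FogNode()
print(node.is_anomalous(7.5))                        # True under version 1
cloud_model.update(vibration_limit=8.0, version=2)   # cloud's deeper retrain
node.refresh()
print(node.is_anomalous(7.5))                        # False once the node syncs
```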
Fog Computing Standards Take Shape
Today’s mobile networks simply can’t meet the needs of tomorrow’s data operations. Dropped connections, bandwidth bottlenecks and latency—the time it takes for data to travel from point to point on the network—are all too common. Upgrading the “infrastructure” at the edge of the network requires a concerted effort to ensure open access for diverse tools and resources—some of which we can’t even imagine, let alone prepare for.
Princeton University’s Edge Lab is taking the lead in bringing together tech vendors, organizations and customers of all stripes to create the OpenFog Consortium, which is intended to design an architecture for fog computing. The group was officially announced at the Internet of Things World Forum in Dubai on Dec. 6, 2015; John Sullivan covered the news in a Dec. 16 post on the Princeton University blog.
Open systems have proven their worth many times over, and for IoT networks an open architecture is even more important. By the end of 2018, an estimated 22 billion connected devices will be generating an unending stream of unstructured and semistructured data, according to IDC research cited by Forbes’ Gil Press in a Nov. 10, 2015 article.
The flood of data would overwhelm the bandwidth and network capacity organizations currently have, as Jelani Harper explains in a March 10 article on Data Informed. Only an open, decentralized model that keeps the bulk of data and its supporting resources at the edge of the network will prevent a “datapocalypse.”
For many network requirements, cloud computing and fog computing complement each other. Source: Cisco Systems
How much of a company’s data can be maintained in these edge-network nodes? Harper projects that as much as 90 percent of the data that will be collected by IoT devices will never reach the data center. Not only does this reduce strain on the network, it also speeds up “time to action” because analyses are performed where the data resides: on the edge.
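The arithmetic behind figures like Harper’s is easy to sketch. Assuming a hypothetical gateway that rolls one-second readings into one summary record per minute, the reduction is even steeper than 90 percent:

```python
import random
import statistics

def summarize(window):
    """Collapse a window of raw readings into one record for the cloud."""
    return {
        "count": len(window),
        "mean": round(statistics.mean(window), 2),
        "max": max(window),
        "min": min(window),
    }

raw = [round(random.uniform(20.0, 30.0), 2) for _ in range(600)]  # 10 min @ 1 Hz
WINDOW = 60  # one summary record per minute

summaries = [summarize(raw[i:i + WINDOW]) for i in range(0, len(raw), WINDOW)]
print(f"{len(raw)} raw readings reduced to {len(summaries)} cloud records "
      f"({100 * (1 - len(summaries) / len(raw)):.0f}% never leave the edge)")
```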
The advent of fog computing can be seen as an expansion of the cloud via the creation of a “fog layer” in which many data assets and resources are reallocated. That layer is where IoT’s incredible data reach meets the cloud’s infinitely scalable compute, storage and big data analytics. What is fog, after all, but cloud you can feel?