The limitations of Cloud and why Fog Computing will play a key role in the Internet of Things

04 Nov 2015

We now mostly have a firm grasp on the business and economic value that a well-structured cloud strategy can bring to an organisation.  Few would argue with the benefits, as many companies are already reaping the rewards.

One of these benefits is the ability to centralise vast amounts of data in the cloud.  Aside from the cost reduction of running distributed networks, and the increased stability, it also allows organisations to analyse their data from a single location. By allowing data from disparate systems to interact, it creates context, which in data terms can be invaluable.

So what do you do when you have vast amounts of data emanating from a huge variety of end nodes that needs to be stored, processed and actioned in milliseconds, perhaps even across many different countries?  The problem becomes exponentially worse when the need arises to analyse data streams such as traffic data, web searches or sensor data.  This type of data has a very limited life span and diminishes significantly in value by the minute, even by the second.  Network congestion or latency can easily render the information useless before it has even reached the cloud.

Enter “Fog Computing”!  Rather than have data analysis occur at the other end of the network in the cloud, Fog Computing resides at the ground level.  Data is still collected in much the same way as at any other node; the difference is that time-sensitive, real-time streaming data is processed and actioned on-site almost instantaneously.
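As a rough illustration of the idea (the names, threshold and data shapes here are invented for this sketch, not a real fog API), a fog node might act on time-critical readings locally while queuing everything for a later bulk upload to the cloud:

```python
# Hypothetical sketch of a fog node: act on time-critical readings
# locally, and batch everything else up to the cloud.

ALERT_THRESHOLD = 90.0  # assumed sensor threshold for illustration


def process_reading(reading, cloud_batch, local_actions):
    """Handle one sensor reading at the fog layer.

    Time-sensitive values trigger an immediate local action;
    all readings are queued for later upload to the cloud.
    """
    if reading["value"] > ALERT_THRESHOLD:
        # Real-time decision made on-site, no round-trip to the cloud.
        local_actions.append(("trigger_alarm", reading["sensor_id"]))
    cloud_batch.append(reading)  # non-urgent: uploaded in bulk later


# Example: two readings, one of which needs an instant local response.
batch, actions = [], []
process_reading({"sensor_id": "s1", "value": 95.2}, batch, actions)
process_reading({"sensor_id": "s2", "value": 40.1}, batch, actions)
```

The point of the split is that the alarm fires without waiting on a network round-trip, while the cloud still receives the full picture for later analysis.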

One comparison might be to say that while the human brain (the cloud) acts as the central decision maker in the human body, your hand is the equivalent of an end node residing at the “Fog” layer. Your hand can tell you if something is hot or cold, rough or smooth, light or heavy.  However, if you require the assistance of your forearm, your other hand or even your leg, you will need to adjust your body position, still relying on your brain to analyse and orchestrate those actions.

Now imagine the hand is instead a traffic sensor within a small Fog network at a busy city intersection that has just experienced a major accident.  Immediate decisions need to be made on the scene to divert traffic. Information is then shared within the local Fog network so that changes can be made at the surrounding intersections, and possibly on interconnecting freeways, to avoid the gridlock set to occur as the afternoon peak hour begins.

The cloud can then start dispatching emergency services to the scene and, more importantly, collecting valuable data on how and why the crash occurred so that improvements can be made in future.

Another benefit of localised Fog Computing and analytics is the node’s ability to collect data throughout the day but transmit only anomalies as they occur.  When connected to thousands of intersections across a city, this can prevent network clogging and the potential failures caused by unnecessary data duplication.
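A minimal sketch of that filtering idea (the tolerance and sample data are invented for illustration): the node tracks a running mean of what it has seen locally and forwards only readings that deviate sharply from it:

```python
# Hypothetical anomaly filter at a fog node: every reading is kept
# locally, but only values that deviate from the running mean by more
# than a given fraction are transmitted to the cloud.


def anomalous_readings(readings, tolerance=0.5):
    """Return only the readings worth sending upstream.

    A reading counts as an anomaly when it differs from the running
    mean of all prior readings by more than `tolerance` (a fraction
    of that mean).
    """
    sent, total = [], 0.0
    for i, value in enumerate(readings):
        if i > 0:
            mean = total / i
            if abs(value - mean) > tolerance * mean:
                sent.append(value)  # anomaly: transmit to the cloud
        total += value  # every reading still updates the local baseline
    return sent


# Steady traffic counts with one spike: only the spike is transmitted.
counts = [100, 102, 98, 101, 250, 99]
anomalous_readings(counts)  # → [250]
```

Six readings are collected, but only one crosses the network, which is the whole appeal when thousands of intersections report at once.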

The above example illustrates how Fog Computing can work in one type of application. The Internet of Things (IoT) will bring many benefits to corporations and consumers alike; however, with this technology comes greater complexity, and this is where Fog Computing will really begin to shine.

Peter Sondergaard, Senior Vice President at Gartner and global head of research, predicts that over 30 billion “things”, or nodes, will be connected in the next five years.  He also states that one million new devices will come online every hour in 2020.

The technical logistics of managing everything in the cloud are immense, so some of the collection and analysis of data will need to be offloaded to the ground and handled in real time.

Fogging will become more commonplace and embed itself within our daily lives, so expect to see more of this technology in the coming years as our hunger for real-time information grows.