Elasticity, scalability, security: moving to the cloud has allowed organizations across all sectors to improve the efficiency and effectiveness of their IT systems in unprecedented ways.
But as IT consultant Andrew Froehlich observed recently in a commentary at InformationWeek, most of those advances have come inside the cloud, leaving stakeholders with what he considers "one final problem -- connectivity to the cloud."
As he sees things, “While bandwidth and latency can be managed with relative ease when working with private clouds and leased lines, most companies are moving toward the use of public clouds and the use of Internet broadband connections to access cloud content. . . . What this means is that cloud data must be accessible no matter who the user is, where they are located, or what device is used. This issue becomes incredibly complex for IT administrators because the use of the public cloud and Internet broadband technologies to access the cloud means they lack the ability to control the network between the end user and that cloud. The further away a user is from the cloud, the more likely they are to run into connectivity and latency problems that can impact the usefulness of a real-time application.”
What’s the solution? Simply put, move cloud resources closer to the end user. One way of doing that, Froehlich says, is by using “fog computing,” which he defines as “an architectural design that uses a three-tiered model between the end device and cloud resources the device is interacting with. With this method, large cloud data centers still exist and manage all the data and much of the data computation. But at the same time, a portion of the CPU- and bandwidth-intensive processing is moved much closer to the edge of the network, and ultimately closer to the end device.”
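To make the three-tier idea concrete, here is a minimal sketch of the placement decision a fog architecture implies: latency-sensitive work stays near the device, while bulk computation is pushed as far back toward the data center as its latency budget allows. The tier names, latency figures, and the `place` function are illustrative assumptions, not part of any standard fog API.

```python
# Illustrative sketch of three-tier workload placement (device / fog / cloud).
# All names and numbers below are assumptions made up for this example.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    max_latency_ms: float   # tightest response time the task can tolerate
    cpu_heavy: bool         # whether the task needs bulk compute

# Assumed round-trip latency from the end device to each tier.
TIER_LATENCY_MS = {"device": 1, "fog": 20, "cloud": 120}

def place(task: Task) -> str:
    """Pick a tier for the task: CPU-heavy work goes to the most capable
    (farthest) tier that still meets the latency budget; light, real-time
    work stays as close to the device as possible."""
    tiers = ("device", "fog", "cloud")
    feasible = [t for t in tiers if TIER_LATENCY_MS[t] <= task.max_latency_ms]
    if not feasible:
        return "device"  # nothing meets the budget: stay local anyway
    return feasible[-1] if task.cpu_heavy else feasible[0]

print(place(Task("video-analytics", max_latency_ms=50, cpu_heavy=True)))      # fog
print(place(Task("nightly-report", max_latency_ms=10_000, cpu_heavy=True)))   # cloud
print(place(Task("sensor-read", max_latency_ms=5, cpu_heavy=False)))          # device
```

The point of the sketch is the middle case: a job too latency-sensitive for the distant cloud but too heavy for the device lands on the fog tier, which is exactly the layer Froehlich describes.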
Rapidly emerging “edge computing” is another option, he suggests, noting that whereas “fog computing moves a portion of the data processing to a layer between the edge device and the cloud, edge computing occurs on the end device itself. ... So instead of centralizing processing in the cloud and waiting for the results across a slow connection, edge computing apps harness the power of the end device system on chip (SoC) itself to perform real-time calculations and processing.”
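The edge pattern Froehlich describes can be sketched just as simply: the device does the real-time number crunching locally and ships only a compact summary across the slow link. The sensor readings, threshold, and `upload_to_cloud` stub below are illustrative assumptions, not a real API.

```python
# Illustrative sketch of edge computing: real-time processing on the device,
# with only a small summary sent to the cloud. Values are made up.

from statistics import mean

def detect_anomalies(samples, threshold=5.0):
    """Flag readings that deviate strongly from the local mean.
    Runs entirely on the device, so no network round trip is needed
    for the real-time decision."""
    mu = mean(samples)
    return [s for s in samples if abs(s - mu) > threshold]

def upload_to_cloud(summary):
    # Stub: in a real system this would be a batched, asynchronous upload.
    print("uploading summary:", summary)

readings = [20.1, 20.3, 19.9, 35.0, 20.2]   # one spike in otherwise steady data
spikes = detect_anomalies(readings)
# Only the compact result crosses the slow link, not the raw sample stream.
upload_to_cloud({"count": len(readings), "anomalies": spikes})
```

The contrast with the cloud-centric model is the payload: five raw samples stay on the device, and a single small dictionary travels upstream.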
In the end, he says, there’s no right or wrong architecture when it comes to eliminating cloud connectivity bottlenecks; each has its own pros and cons. “The bottom line, however, is that we will soon reach the point where connectivity to cloud resources is going to become a significant issue in terms of real-time computing functions. Thus, the solution to the problem likely won't be to eliminate cloud computing altogether. Rather, it will be to bring the cloud closer to us.”