The Internet of Things (IoT) has been a hot area over the last few years. The number of connected devices has been growing steadily, with Gartner forecasting that connected things would outnumber the world's population in 2017, reaching 8.4 billion devices that year and 50 billion by 2020. These connected devices generate massive amounts of data. Today, devices and appliances that were never previously connected (fridges, cars, watches, etc.) are equipped with sensors and peripherals that generate data.
Alongside IoT, enterprises are betting hard on big data. Data is the most precious resource of our digital economy, and many enterprises are applying big data analytics to harness this vast amount of data and take advantage of the insights it provides: identifying trends and patterns to deliver better services and experiences to customers, helping companies monitor and streamline their operations, or performing preventive maintenance on machinery and infrastructure.
The business process is similar across many applications: IoT devices provide the data, and big data analytics extracts the insights. However, a monumental challenge arises: Where will all this data be processed and stored?
The rapid growth of computing devices is not the only driver of the data explosion challenging the central cloud computing model. Another important trend has shifted where data is produced and consumed: user-generated content at the edge of the network.
Mobile internet and social media have empowered ordinary people to become producers of data. Today, nearly 500 million photos are uploaded to Facebook and Instagram and roughly 500 thousand hours of video are uploaded to YouTube every day. More video is uploaded to YouTube in one month than the three major US networks created in over 60 years. These figures give a sense of the astonishing amount of data that users generate on a regular basis. Machine applications show a similar trend: edge devices carry many embedded sensors, or even cameras, generating massive amounts of data.
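To put these figures in perspective, here is a quick back-of-the-envelope estimate; the per-item sizes are my own illustrative assumptions, not measured figures:

```python
# Back-of-the-envelope estimate of daily user-generated upload volume.
# The per-item sizes below are illustrative assumptions, not measured figures.

PHOTOS_PER_DAY = 500e6          # photos uploaded to Facebook and Instagram daily
AVG_PHOTO_MB = 2.0              # assumed average photo size (MB)
VIDEO_HOURS_PER_DAY = 500e3     # hours of video uploaded to YouTube daily
AVG_VIDEO_MB_PER_HOUR = 1500.0  # assumed ~1.5 GB per hour of uploaded video

photo_tb = PHOTOS_PER_DAY * AVG_PHOTO_MB / 1e6          # MB -> TB
video_tb = VIDEO_HOURS_PER_DAY * AVG_VIDEO_MB_PER_HOUR / 1e6

print(f"Photos: ~{photo_tb:,.0f} TB/day")                    # ~1,000 TB/day
print(f"Video:  ~{video_tb:,.0f} TB/day")                    # ~750 TB/day
print(f"Total:  ~{(photo_tb + video_tb) / 1000:.2f} PB/day") # ~1.75 PB/day
```

Even with conservative assumptions, user uploads alone run to petabytes per day, and all of it has to travel over the network before the central cloud can touch it.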
Transporting all the data generated at the edge to the central cloud, processing and analyzing it on servers in remote data centers, and then transporting the results back to edge devices (whether a smartphone, a fridge, a car, or a robot) is neither feasible nor scalable. Centralized cloud computing has two big limitations when it comes to meeting the demands of a connected world: bandwidth and latency.
With a central cloud, bandwidth becomes the bottleneck for the growth of IoT. And even if network capacity were miraculously increased to cope with the data, the laws of physics stand in the way: long-haul transmission to remote data centers imposes large latencies. It is clear that we need a new computing model to cope with a hyper-connected world.
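To see why latency is a physical limit rather than a mere engineering problem, consider the lower bound that the speed of light in optical fiber puts on round-trip time. The sketch below uses illustrative distances; real networks add routing, queuing, and processing delays on top:

```python
# Lower bound on round-trip latency imposed by the speed of light in fiber.
# Real networks add routing, queuing, and processing delays on top of this.

C_FIBER_KM_PER_MS = 200.0  # light travels ~200 km/ms in optical fiber (~2/3 c)

def min_rtt_ms(distance_km: float) -> float:
    """Physical lower bound on round-trip time to a server distance_km away."""
    return 2 * distance_km / C_FIBER_KM_PER_MS

# Illustrative distances from an edge device to its compute resource.
for label, km in [("Same-city edge node", 50),
                  ("Regional data center", 1_000),
                  ("Cross-continent cloud region", 8_000)]:
    print(f"{label:29} >= {min_rtt_ms(km):5.1f} ms round trip")
```

No amount of engineering brings the cross-continent case under the single-digit milliseconds that applications like autonomous driving or industrial control demand; only moving the computation closer does.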
Decentralization: The Future of Computing
Computing started with a centralized architecture of mainframes, then evolved to a distributed computing model in the 1980s as personal computers came into play. The Internet era began with a centralized client-server architecture that grew into the current central cloud computing model. The question is: where are we going next?
We clearly need a paradigm shift to turn tens of billions of devices from a challenge into an opportunity by unleashing the power of computing devices at the edge. A pragmatic solution is to build a fully decentralized architecture in which every computing device is a cloud server: edge devices process data locally, communicate with other devices directly, and share resources with one another to unburden central cloud computing resources. This architecture is faster, more efficient, and more scalable. It also has significant social and economic implications: a decentralized architecture is more private by nature, since it minimizes reliance on central trusted entities, and more cost-efficient, since it leverages otherwise idle computing resources at the edge.
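The core idea can be sketched as a simple placement policy: process data locally when possible, offload to a nearby peer when not, and fall back to the central cloud only as a last resort. This is a minimal illustrative sketch, not a real edge framework; the node names and capacity units are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A compute node in the edge cloud: a phone, a fridge, a car, or a server."""
    name: str
    free_capacity: int  # work units the node can still accept (hypothetical)
    peers: list["Node"] = field(default_factory=list)

    def process(self, task_cost: int) -> str:
        # 1. Prefer local processing: no network hop, lowest latency.
        if self.free_capacity >= task_cost:
            self.free_capacity -= task_cost
            return f"processed locally on {self.name}"
        # 2. Otherwise share the work with a nearby peer that has spare capacity.
        for peer in self.peers:
            if peer.free_capacity >= task_cost:
                peer.free_capacity -= task_cost
                return f"offloaded to peer {peer.name}"
        # 3. Fall back to the central cloud only as a last resort.
        return "sent to central cloud"

phone = Node("phone", free_capacity=6)
car = Node("car", free_capacity=10, peers=[phone])
print(car.process(7))  # processed locally on car
print(car.process(5))  # car lacks capacity -> offloaded to peer phone
print(car.process(4))  # car and phone both busy -> sent to central cloud
```

Even this toy policy captures the essential property of the architecture: the central cloud is demoted from the default destination for every byte to a fallback tier.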
Does this mean central cloud computing is dead? I believe not. Edge cloud will not replace central cloud, and some applications may be better suited to centralized resources. Rather, the central cloud (servers in data centers) should be treated as one more set of computing nodes working alongside all the edge devices in a distributed edge cloud architecture. Is your business ready to harness computing resources at the edge to achieve better efficiency and privacy, and to create opportunities for new applications?