Cloud computing has become mainstream and is still growing, as more businesses shift to ecommerce and harness every available cloud solution. Yet an emerging technology known as Edge Computing has recently been trending, prompting many to ask whether it is a better solution than Cloud Computing.
According to Nima Negahban, CTO of Kinetica, Edge Computing is, in simple terms, data analysis that happens on a device, live and in real time. The difference between Cloud Computing and Edge Computing is that the latter focuses on processing data locally, while the former processes data in a public cloud or a data center.
Red Hat chief technology strategist E.G. Nadhan described Edge Computing as the science of letting edge devices do all the work without the need to transfer data to a server environment.
Main Distinction of Edge Computing From Cloud Computing
Ryan Martin, principal analyst with ABI Research, compares cloud and edge computing to a hub-and-spoke model: the cloud is the hub, and everything not inside the hub is the edge. This arrangement enables organizations to move decision-making and analytics closer to where data is produced.
Edge Computing was developed to address a weakness of Cloud Computing, which suffers from latency when data must travel to a data center for processing. With the increasing use of devices connected to the Internet of Things (IoT), Edge Computing is also projected to grow: market analysts project it will be worth $6.72 billion by 2022.
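To make the distinction concrete, here is a minimal Python sketch of the idea described above. All names are hypothetical and for illustration only: an edge device analyzes sensor readings locally and sends only a small summary to the cloud, rather than shipping every raw reading to a data center for processing.

```python
# Hypothetical illustration of edge vs. cloud processing.
# Edge model: analyze readings on the device, forward only a summary.
# Cloud model: the entire raw stream must travel to a data center first.

def edge_process(readings, threshold=75.0):
    """Run analysis locally on the device: aggregate and flag anomalies."""
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "anomalies": anomalies,  # only exceptional data points leave the device
    }

def cloud_process(readings):
    """Cloud model: every raw reading is transmitted for remote analysis."""
    return {"payload_size": len(readings)}  # the full stream incurs latency

sensor_readings = [70.1, 71.4, 80.2, 69.9, 70.5]

edge_payload = edge_process(sensor_readings)
print(edge_payload["count"])      # 5 readings analyzed on-device
print(edge_payload["anomalies"])  # [80.2] is the only value sent for attention
```

The edge payload here is a handful of numbers regardless of how many readings the sensor produced, which is the bandwidth and latency advantage the analysts above describe.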
After all, this new technology trend is already in wide use, from optimizing streaming video to powering smart watches, analyzing traffic flow, managing drone-enabled crop management, and monitoring the safety of oil rigs. The trend will not only grow but will also create new jobs for people in the industry, particularly software engineers.