Edge computing has become a fashionable concept in the technology world, but what is at its core, and how does it differ from other buzzwords - like cloud computing?
At the beginning of the computer age, computing machines were large devices that occupied entire rooms. For reasons of convenience and ergonomics, engineers aimed to centralize computation in large data centers, and users had access only to simple data-entry terminals. In addition, because of the high price, the use of computers was limited to research centers and large companies.
This trend reversed with the advent of the PC era, initiated by the first Apple and IBM PC models. Users finally got machines that could perform fairly complex operations. This state of affairs lasted until the beginning of the mobile era and cloud computing. The premiere of Apple's iPhone can be considered the symbolic date of the change in the data-processing paradigm. Mobile devices, with their lower processing power, in a way gave the impulse to move that processing to the cloud. In addition, large-scale data analytics - so-called Big Data - changed the direction of technological progress. Such processing required a lot of data and high computing power in one place: a powerful server room, which marketing calls the cloud.
However, today we are witnessing another revolution: the Internet of Things (IoT) is expanding. Processors and sensors are getting cheaper, so they appear in a growing number of devices, and the amount of data produced by these machines is huge. Video-on-demand and even game streaming are becoming more and more popular, and in the future autonomous cars will also generate large amounts of data. This changes the approach to designing data systems. As it turns out, sending all data to one (or several) central servers is no longer optimal. Modern, efficient processors and smaller, decentralized data warehouses located closer to the customer unload the network and provide lower transmission delays. And that is edge computing: a kind of return to processing data not in data centers, but at the edge of the network.
A great example of where edge computing can show its strength is video monitoring using artificial intelligence. Advances in AI make it possible to detect objects in video recordings, and security services have started using these capabilities. The system detects people in the footage and notifies the operator only when an anomaly is detected.
Until now, the standard practice was to send an HD stream from the camera to central servers, where dedicated computers classified the objects in the image and processed the information. As a result, using such a feature required broadband Internet access and generated high data-processing fees.
Currently, technology makes it possible to run the neural network directly on the camera. It constantly analyzes the image and detects anomalies locally; the server receives a fragment of the recording only when a disturbing situation is detected. This significantly reduces bandwidth consumption and the long-term cost of the solution.
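The filtering logic described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual firmware: the `Frame` type and the threshold-based `detect_anomaly` stub are hypothetical stand-ins for a real on-device neural network, but the control flow - analyze every frame locally, upload only the anomalous ones - is the essence of the edge approach.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    timestamp: float
    person_score: float  # hypothetical on-device classifier confidence, 0..1

def detect_anomaly(frame: Frame, threshold: float = 0.8) -> bool:
    # On a real camera this would be a neural network running on-device;
    # a simple confidence threshold stands in for it here.
    return frame.person_score >= threshold

def process_stream(frames: List[Frame]) -> List[Frame]:
    """Analyze each frame locally; return only frames worth uploading."""
    uploaded = []
    for frame in frames:
        if detect_anomaly(frame):
            uploaded.append(frame)  # only this fraction leaves the device
    return uploaded
```

With the old architecture every frame would cross the network; here, if one frame in a thousand triggers the detector, transmission drops by roughly the same factor.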
Such solutions are already in use - for example, in Nest monitoring cameras.
The development of edge computing is inextricably linked to the growth of the IoT market, which is expanding at a rapid pace, as we have written in previous articles. Microsoft alone has declared investments in this segment at the level of 3 billion dollars. As you can see, edge computing is the future of IoT development.