The diverse and successful implementations of Artificial Intelligence (AI) across domains have enabled the delivery of augmented and personalised experiences to users at work as well as in their everyday lives. However, in recent times, edge-based AI is upping the ante and offering an enhanced experience by bringing the data and the compute closer to the point of interaction. Most of us are already consuming it in our daily lives. The autocorrect suggestions on our smartphone keyboards or several capabilities of the voice assistant that we use countless times throughout the day are examples of consumer-facing edge-based AI solutions.
What is Edge-based Artificial Intelligence?
Edge-based AI uses machine learning algorithms to process data generated by a hardware device (Internet of Things endpoints, gateways, and other devices at the point of use) at the local level instead of sending data to remote servers or the cloud.
When an endpoint device interacts with the cloud, the information it captures is sent to the cloud, where all the processing is done, and the results are then sent back from the cloud to the device, which in turn communicates with the end user or machine. Edge-based AI, however, allows the data to be processed on the device itself. This not only reduces the amount of network traffic flowing back to the cloud and shortens the response time for IoT devices, but also improves overall performance.
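As a rough illustration of what "processing on the device itself" can look like, the sketch below runs a pre-trained model locally with TensorFlow Lite, so only the result, not the raw sensor data, would ever need to leave the device. The model file name, tensor shapes, and the idea of sending just the scores upstream are assumptions made purely for this example, not a description of any specific product.

# Illustrative sketch: on-device inference with TensorFlow Lite.
# The model file "object_detector.tflite" is a hypothetical, pre-trained
# model that was built and optimised in the cloud, then deployed to the edge.
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path="object_detector.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def classify_locally(frame: np.ndarray) -> np.ndarray:
    """Run the model on the device and return raw scores; only this small
    result (not the captured frame) needs to be sent to the cloud, if at all."""
    interpreter.set_tensor(input_details[0]["index"], frame)
    interpreter.invoke()
    return interpreter.get_tensor(output_details[0]["index"])

# A dummy frame standing in for camera or sensor input captured at the edge.
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
print("on-device inference result:", classify_locally(frame))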
Why is AI on Edge relevant?
Edge-based AI is the only answer in business situations that demand near real-time responses. In other instances, it is highly desirable for its ability to read and analyze data in near real-time and thereby tailor experiences that are hyper-contextualized.
The reduction in bandwidth and response time it offers is significant, particularly when thousands or millions of connected devices are communicating over any medium, whether Wi-Fi, cellular, or RF. It is equally valuable where connectivity is expensive or unavailable, especially in remote areas.
For instance, in remote locations such as mining fields or oil rigs, data from IoT applications, sensors, and devices can deliver real-time operational intelligence. In medical science, AI on the edge can be critical where a split-second delay in data availability or transmission could mean the difference between life and death.
AI on the edge also brings the advantage of perfecting AI models in the cloud before deploying them at the edge. The cloud can serve as a centralized learning center, training and building models based on updates received from different remote edge-based AI models. Local, context-based knowledge can be shared with the cloud by sending only the model parameters, not the actual data itself, which also helps manage data privacy. Once updated, the central model is shared back with all edge-based devices or clients. This method of distributed learning, also known as federated learning, is becoming very popular as it addresses the challenges of data privacy, shareability, and network transport limitations.
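The sketch below is a minimal, self-contained illustration of this federated-averaging idea, using only NumPy and a toy linear model. The data, model, and hyperparameters are assumptions made for illustration; the point is simply that each client shares parameters, never its raw data, and the server averages those parameters into a new global model.

# Minimal sketch of federated averaging: clients train locally on private
# data, share only their updated parameters, and the server averages them.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training on its private data (toy linear model,
    squared-error gradient descent). Only the resulting weights leave the device."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_weights, clients):
    """Server-side step: collect each client's locally updated parameters and
    average them, weighted by client data size, into a new global model."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_weights, X, y))
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    return np.average(updates, axis=0, weights=sizes / sizes.sum())

# Toy setup: three "edge clients", each holding private data that never leaves it.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, clients)   # updated global model is shared back
print("global model after federated rounds:", w)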
Success Stories of AI on the Edge
Take the case of autonomous buggies used in controlled campus environments such as industrial and educational campuses and airports. The buggies sense objects in real time via advanced LiDAR (Light Detection and Ranging) and vision technologies and have to decide whether to stop or deviate based on the conditions. If the vehicle sent that information to the cloud and waited for a processed response, it might not be quick enough. Such autonomous vehicles, therefore, have an on-board computer equipped with an AI and deep learning engine to detect objects, lanes, and curbs and to control the steering and brake systems.
Then there are advanced security cameras with AI capabilities such as face recognition or motion detection or, in the current times, even mask detection. On these cameras, all the image recognition and object identification is done locally on the edge device for a seamless user experience. Even when using drones for surveillance, it is not ideal to send data to the ground station, and onboard processing is preferred. In the last year, many organizations have adopted edge-based AI applications to ensure the safety of employees who have to travel for work during the pandemic.
A federated model is the way forward
For long, organizations handled data in silos across domains. Then came the evolution towards data lakes, a single repository of data. However, a centralized setup is not tenable in many scenarios and poses privacy and security challenges. Therefore, as discussed, organizations are now moving to a federated model, which offers a harmonized, decentralized architecture with comprehensive edge analytics and edge processing.
Edge-based AI creates a new realm of possibilities in terms of increasing operational efficiency and unlocking innovation opportunities. It allows AI models to be leveraged in remote locations with limited or no network connectivity, processing data and taking decisions in real-time.
By Balakrishna DR, Senior Vice President, Service Offering Head - Energy, Communications and Services, AI and Automation, Infosys