
Human-centric artificial intelligence: Applications in autonomous driving

Artificial intelligence plays a critical role in current applications, particularly in self-driving cars, conversational systems, medical devices, and robotics

DQINDIA Online

As one of the world's fastest-growing economies, and home to the world's second-largest population, India has a significant stake in the automotive Artificial Intelligence (AI) revolution. Originally conceptualized as a system that would mimic human intelligence, AI has developed well beyond its original vision. Thanks to remarkable breakthroughs in data collection, processing, and computational power, intelligent systems can now perform many tasks, enable communication, and boost productivity.


The potential applications of AI across industries have proliferated as the technology's capabilities have grown. According to Accenture's 2021 AI research report, AI has the potential to add USD 957 billion, or 15 percent of India's current gross value added, to the economy by 2035. AI plays a critical role in current applications, particularly in self-driving cars, conversational systems, medical devices, and robotics. Globally, AI is helping build systems that contribute to safety and convenience. For instance, AI/ML plays a significant role in technologies that contribute to Vision Zero, such as ADAS systems and eHorizon, and in technologies built on the idea of Connect.Inform.Integrate, such as natural 3D displays and the Digital Companion. AI is also critical for the autonomous driving of the future.

Artificial Intelligence-powered Autonomous Driving

Autonomous fleets for ride-sharing, semi-autonomous features such as driver assistance, and predictive engine monitoring and maintenance are all possible automotive use cases for AI. The technology can also shape autonomous driving and delivery, as well as improve traffic management. Moreover, AI contributes to how the advanced systems used in these autonomous driving technologies are manufactured and tested.


For autonomous driving in an urban environment, reliable traffic light detection and classification are essential. Along with detecting objects and pedestrians, AI systems are trained to detect traffic signals, recognize the color of the light, and let the vehicle respond accordingly. Furthermore, autonomous vehicles can detect lanes and lane markings while tracking the lane ahead.
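In production systems, light-state classification is done by a trained CNN on the detected lamp region; as a minimal sketch of the idea, the toy heuristic below classifies a detected crop by comparing its mean red and green channel intensities (the function name and thresholds are illustrative assumptions, not a real pipeline):

```python
import numpy as np

def classify_light(crop: np.ndarray) -> str:
    """Toy traffic-light state classifier.

    crop: H x W x 3 RGB array of the detected lamp region.
    Compares mean red vs. green intensity; a real system would
    run a trained CNN classifier on the detected region instead.
    """
    red = crop[..., 0].mean()
    green = crop[..., 1].mean()
    if red > 1.2 * green:
        return "red"
    if green > 1.2 * red:
        return "green"
    return "yellow"  # red and green both strong -> amber-ish

# Example: a synthetic all-red patch
red_patch = np.zeros((8, 8, 3), dtype=np.uint8)
red_patch[..., 0] = 255
print(classify_light(red_patch))  # -> red
```

The downstream planner would then map the returned state to a maneuver (stop, proceed, or prepare to stop).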

Making sure that autonomous cars stay on the road and in the right lane until they arrive at their destination can be difficult in some situations, especially at high speeds. The first and most crucial task is reliable navigation, which is frequently accomplished using computer vision to gather RGB images of the road for further processing. Long Short-Term Memory (LSTM) networks, a variant of recurrent neural networks (RNNs), are widely used for lane tracking in autonomous cars and have proven highly effective in such practical applications.
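The LSTM recurrence that makes this frame-to-frame tracking possible can be sketched in a few lines of NumPy; the step below implements the standard LSTM cell equations (gates, cell state, hidden state), with random weights standing in for learned parameters and random vectors standing in for per-frame lane features:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step (the recurrence used for lane tracking).

    x: input features for this frame (e.g. lane-edge measurements)
    h, c: hidden and cell state carried across frames
    W, U, b: stacked gate parameters, rows ordered
             [input, forget, output, candidate]; learned in practice.
    """
    z = W @ x + U @ h + b
    n = h.size
    i = sigmoid(z[:n])        # input gate
    f = sigmoid(z[n:2*n])     # forget gate
    o = sigmoid(z[2*n:3*n])   # output gate
    g = np.tanh(z[3*n:])      # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Run a toy sequence of per-frame lane measurements
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W = rng.standard_normal((4 * n_hid, n_in)) * 0.1
U = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(10):
    x = rng.standard_normal(n_in)  # stand-in for frame features
    h, c = lstm_step(x, h, c, W, U, b)
```

Because the cell state `c` persists across frames, the network can smooth over frames where lane markings are briefly occluded, which is what makes the recurrence suited to tracking.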

Furthermore, AI boosts the detection capabilities of the autonomous vehicle. Object and pedestrian detection are enhanced by employing algorithms such as Region-based Convolutional Neural Networks (R-CNN), Fast R-CNN, and the Single Shot Detector (SSD). These are deep-learning detectors, and pairing them with a state-of-the-art backbone architecture such as ResNet-50 improves the quality of the results, reportedly by a factor of five.
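Detectors in this family emit many overlapping candidate boxes per object, which are pruned with non-maximum suppression (NMS), a standard post-processing step. The sketch below is a plain NumPy version of greedy NMS (box layout and threshold are conventional choices, not tied to any specific detector):

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, thresh=0.5):
    """Greedy non-maximum suppression: keep the best-scoring box,
    drop any candidate overlapping it above `thresh`, repeat."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size:
        best = int(order[0])
        keep.append(best)
        order = np.array([j for j in order[1:]
                          if iou(boxes[best], boxes[j]) <= thresh])
    return keep

# Two near-duplicate detections of one object, plus a distinct one
boxes = np.array([[0, 0, 10, 10], [1, 1, 10, 10], [20, 20, 30, 30]], float)
scores = np.array([0.9, 0.8, 0.7])
print(nms(boxes, scores))  # -> [0, 2]
```

The two heavily overlapping boxes collapse to the higher-scoring one, while the distinct detection survives; this is why the planner sees one box per obstacle rather than a cluster.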


Object detection also helps avoid accidents caused by lost cargo or debris lying on the road, one of the essential safety factors in autonomous driving. For example, a large tire lying on the road could cause a severe accident and trigger a chain of collisions. Such pileups can range from minor to major, causing a great deal of uncertainty as well as multiple injuries and fatalities. To prevent this, AI in autonomous driving detects objects on the road and instructs the car to maneuver so as to avoid contact with them.

In addition, under the pedestrian detection segment, AI carries out functions such as pedestrian pose estimation, where algorithms infer a pedestrian's gestures and intention by comparison with a holistic human pose. Pose estimation algorithms have a wide range of applications, including gesture control, action recognition, and augmented reality. PoseNet regresses pedestrian pose from a single RGB image using a Convolutional Neural Network (CNN). DeepPose, for example, formulates pose estimation as a DNN-based regression problem over body joints.
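A DNN-based pose regressor of this kind typically outputs a flat vector of normalized joint coordinates that must be decoded back to image pixels. The sketch below shows that decoding step only, with a hand-written vector standing in for the network output (the function name and the three-joint example are illustrative assumptions):

```python
import numpy as np

def decode_pose(raw, img_w, img_h):
    """Decode a DeepPose-style regression output.

    raw: flat vector of 2*K normalized joint coordinates in [0, 1],
         as a pose-regression CNN's final layer might emit
         (faked here with fixed numbers).
    Returns K (x, y) joint positions in pixel coordinates.
    """
    joints = np.asarray(raw, float).reshape(-1, 2)  # K rows of (x, y)
    return joints * np.array([img_w, img_h])        # scale to pixels

# Toy output for 3 joints (e.g. head, left hand, right hand)
raw = [0.5, 0.1, 0.2, 0.6, 0.8, 0.6]
print(decode_pose(raw, img_w=640, img_h=480))
```

Downstream intention estimation then reasons over these pixel-space joints, for instance flagging a raised arm as a possible crossing gesture.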

Challenges


As is clear from the above, employing AI in autonomous driving is a complicated process. The DNN models used cannot explain their own decisions; these neural networks are essentially black boxes.

Deep neural networks approach the modeling of reality differently. They stitch together thousands of linear functions into a high-dimensional manifold that fits the training set, adjusting each one slightly for every training example. Their predictions are inductive pattern-matches onto what they have observed before. They mirror the chaos and complexity of the phenomena they witness rather than explaining them, as people do. Explainability and interpretability tooling, such as libraries built on frameworks like TensorFlow 2.0, is needed to overcome this challenge.

In reality, today's neural networks are excellent interpolators but poor extrapolators, and deep learning does not handle edge cases well. Neural networks are powerful pattern-matchers that can bend themselves to fit practically any dataset, but their fitting ignores the mechanics that generate the data in the first place. Unlike people, they do not yet engage in a creative search for explanatory theories of why the data is the way it is, nor do they try to disprove each candidate theory until one emerges as the best explanation for the observed data. This poses a considerable risk for autonomous driving: these networks arrive at their results by induction, not deduction.


Human-Centric Artificial Intelligence

With a high intelligence of its own, AI can stir up many challenges. AI applications are engineered to optimize a particular set of loss functions, which can produce undesirable side effects on other objectives. AI also brings many unintended consequences in ethics, bias, privacy, and related areas, calling for scrutiny and goal-directed action. This is where human-centricity comes into the picture. Put simply, human-centered AI is defined by systems that continuously improve because of human input.

Human beings will feel much safer in a vehicle that thinks and drives like they do. Currently, AI is being made smarter through better machine learning (optimizing learning algorithms). The human-centered approach, by contrast, relies on better machine teaching (optimizing data annotation): humans teach the AI, and the machine learns from them and improves itself. Human supervision will also play an essential role in maintaining ethics and safety; without it, AI will probably be neither safe and fair nor fully explainable. Human-centered AI is the solution to this issue.
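One concrete way to "optimize data annotation" is to route only the samples the model is least sure about to human annotators, a simple least-confidence active-learning heuristic. The sketch below (function name and numbers are illustrative assumptions) picks the k least-confident predictions from a batch:

```python
import numpy as np

def select_for_annotation(probs, k=2):
    """Pick the k samples the model is least sure about, to route
    to a human annotator (least-confidence active learning).

    probs: N x C array of predicted class probabilities.
    """
    confidence = probs.max(axis=1)     # model's top-class confidence
    return np.argsort(confidence)[:k]  # least confident first

probs = np.array([
    [0.98, 0.01, 0.01],  # confident
    [0.40, 0.35, 0.25],  # uncertain -> ask a human
    [0.55, 0.30, 0.15],  # uncertain -> ask a human
    [0.90, 0.05, 0.05],  # confident
])
picked = select_for_annotation(probs, k=2)
print(sorted(picked.tolist()))  # -> [1, 2]
```

Human labels on exactly these hard cases feed back into training, which is the "machine learns from humans and improves itself" loop in miniature.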


AI systems need to be transformed by integrating human intelligence into the loop of training and real-world operations. Deep learning can be improved with humans in the loop by encoding human values into the learning process. During real-world operations, human sensing helps perceive people's physical and mental states, such as their emotions. Human-robot interaction experience will prove significant for immersive and meaningful interactions, such as medical diagnosis and ethical decisions.

For example, the world's largest mainstream electric-car manufacturer has employed "Shadow Mode", a feature tested using the learning-from-humans approach. In shadow testing, the car is driven by a human, or by a human with the autopilot engaged. The vehicle also has a new version of the autopilot software installed, which receives data from the sensors but does not control the vehicle in any way. Instead, it uses the sensor data to make its own driving decisions, which can then be compared with those made by the human driver or by an older version of the autopilot.
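The core of such a comparison can be sketched very simply: log every frame where the shadowed candidate policy would have acted differently from the driver, and send those frames for later review. This is an illustrative sketch only, not any manufacturer's actual pipeline; all names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    human_action: str   # what the human (or current autopilot) did
    shadow_action: str  # what the candidate software would have done

def shadow_disagreements(frames):
    """Shadow-testing sketch: the candidate policy sees the same
    sensor data but never controls the car; we only log the frame
    indices where its decision differs from the driver's."""
    return [i for i, f in enumerate(frames)
            if f.shadow_action != f.human_action]

log = [
    Frame("keep_lane", "keep_lane"),
    Frame("brake", "keep_lane"),       # candidate would NOT have braked
    Frame("keep_lane", "change_left"), # candidate would have changed lane
]
print(shadow_disagreements(log))  # -> [1, 2]
```

Reviewing exactly these disagreement frames tells engineers whether the candidate software is catching mistakes the driver made, or making new ones of its own.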

When it comes to autonomous driving, the intelligent, autonomous vehicle is not yet trained to perform functions of very high complexity. As a result, to err on the side of safety, highly autonomous vehicles hand control back to human drivers on high-traffic streets.


Finally, specific road regulations cannot yet be handled by autonomous vehicles and will require a human presence to overcome these challenges. Human-centric AI has tremendous growth potential across industries, including the automotive and manufacturing sectors.

Conclusion

Artificial Intelligence is going to transform existing processes and will soon dominate real-world applications. However, AI in its nascent stage can produce unintended consequences in terms of bias, ethics, explainability, and privacy. Human-centric AI is a new concept being developed to overcome these side effects. It applies to autonomous driving, manufacturing, and every other field where AI plays a significant role in day-to-day activities. Autonomous driving has so far been about keeping the human out of the loop of the driving equation to increase road safety; human-centric AI is set to transform that view of autonomous vehicles.

India is spearheading Human-centric AI in a new direction, and Continental is playing a major role in it.

Bhanu Prakash Padiri

The article has been written by Bhanu Prakash Padiri, Head of Advanced Engineering - Sensorics, Advanced Driver Assistance Systems (ADAS) business unit, Continental Automotive India
