
The Role of AI in Preventing Pedestrian Accidents: Where We Are and What's Next

In recent years, the rise of autonomous driving technology has sparked incredible advances in road safety. AI-driven systems can now help vehicles detect and avoid obstacles, pedestrians, and other road users. However, the complexity of real-world environments still poses significant challenges. My personal experience of being hit by a car got me thinking about how AI systems in vehicles could evolve further to prevent such incidents.

Andrea Rosales, Lead Data Scientist

With a PhD in Computer Science, Andrea Rosales specialises in domain adaptation, transfer learning, continual learning, and generative AI. Andrea is passionate about developing innovative data science models that deliver impactful solutions. She has a proven track record of creating novel deep-learning models to address real-world problems in both industry and academia, and she is recognised as a Global UK Talent.

State of the Art in Autonomous Driving

Automatic emergency braking (AEB) leverages a combination of computer vision, machine learning, and sensor fusion to detect objects, people, and other vehicles. Key components of these systems include:

LIDAR (Light Detection and Ranging): Measures distances using laser beams to create a 3D map of the surroundings.

Cameras: Capture real-time footage to classify objects like pedestrians, cyclists, and cars.

Radar: Detects the speed and distance of nearby objects, crucial for collision avoidance.

Currently, advanced driver assistance systems (ADAS) offer features such as AEB and pedestrian detection. AEB can warn the driver of an upcoming danger and apply the brakes automatically if the driver doesn't respond in time.
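
To make the warn-then-brake logic concrete, below is a minimal sketch of a time-to-collision check driven by a fused radar/camera track. The thresholds, field names and structure are illustrative assumptions, not how any production AEB controller is implemented.

```python
from dataclasses import dataclass

# Hypothetical fused track produced by a radar/camera fusion stage.
@dataclass
class Track:
    distance_m: float         # distance to the object ahead
    closing_speed_mps: float  # positive when the gap is shrinking

WARN_TTC_S = 2.5   # illustrative threshold: warn the driver
BRAKE_TTC_S = 1.0  # illustrative threshold: brake autonomously

def aeb_decision(track: Track) -> str:
    """Return 'none', 'warn' or 'brake' based on time-to-collision."""
    if track.closing_speed_mps <= 0:
        return "none"                      # the object is not getting closer
    ttc = track.distance_m / track.closing_speed_mps
    if ttc < BRAKE_TTC_S:
        return "brake"                     # the driver did not react in time
    if ttc < WARN_TTC_S:
        return "warn"                      # alert the driver first
    return "none"

if __name__ == "__main__":
    print(aeb_decision(Track(distance_m=20.0, closing_speed_mps=10.0)))  # 'warn'
    print(aeb_decision(Track(distance_m=8.0, closing_speed_mps=10.0)))   # 'brake'
```

Real systems typically also weigh driver inputs, road conditions and object classification confidence before committing to an emergency stop, but the staged warn-then-brake escalation follows the same basic idea.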

Leading safety experts consider AEB to be one of the most important recent advances in road safety. A 2015 study by the European New Car Assessment Programme (Euro NCAP) and the Australasian NCAP found that AEB led to a 38% reduction in real-world rear-end crashes.

AEB may not be effective in all situations, for example when the system mistakes shadows or steep driveways for obstacles. Weather conditions like rain, fog, and snow can also negatively impact AEB performance. Some edge cases, like someone trapped under a moving vehicle, still challenge current AI models.

Detecting Pedestrians

Detecting pedestrians accurately is one of the most crucial aspects of autonomous driving. AI algorithms, particularly those based on computer vision, are designed to identify human forms using training data. However, the challenge lies in dealing with a variety of environments — weather conditions, lighting, and occlusions can affect the system's ability to detect people.
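
As a rough illustration of the computer-vision component, the snippet below runs an off-the-shelf, COCO-pretrained detector and keeps only the "person" detections. Production perception stacks use specialised, heavily optimised models, so treat this purely as a sketch.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Off-the-shelf COCO-pretrained detector; real ADAS perception models are
# specialised and heavily optimised, so this is only illustrative.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

PERSON_CLASS_ID = 1  # 'person' in the COCO label set

def detect_pedestrians(frame: torch.Tensor, score_threshold: float = 0.7):
    """Return bounding boxes of likely pedestrians in a [3, H, W] RGB frame scaled to [0, 1]."""
    with torch.no_grad():
        output = model([frame])[0]
    keep = (output["labels"] == PERSON_CLASS_ID) & (output["scores"] >= score_threshold)
    return output["boxes"][keep]

if __name__ == "__main__":
    dummy_frame = torch.rand(3, 480, 640)  # stand-in for a camera frame
    print(detect_pedestrians(dummy_frame).shape)
```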

For example, a stop light at a busy junction, combined with a low sun at sunset and heavy traffic, is just one combination that makes it difficult to assure safety. The image below shows a situation where a green traffic light is occluded by a red ball. Automated systems must be able to correctly detect such unusual situations and take the appropriate action.

Source: Identifying real-world problems with automated vehicles by detecting behavioral differences in steering movements between the human driver and machine.

Beyond that, there are two further layers of complexity. The first is repeatability: it is not possible to recreate a real scenario exactly, because there are too many influencing factors. This generates an effectively infinite number of test cases, which would take an endless amount of time to validate.

The second is detecting human activities. Autonomous systems must distinguish between people performing different activities. For over 7 years, I've been focused on sensor-based human activity recognition (HAR), both during my PhD and as a research fellow. My work revolves around addressing the key challenges in pattern recognition for HAR systems. Below I describe those most relevant to this topic:

  • Intraclass Variability. There are many ways to perform a simple activity; for example, people may walk at different paces. Recognition systems should be able to recognise the same activity performed differently by different individuals. Intraclass variability can also occur when an activity is performed differently by the same individual: for example, walking on a treadmill can differ from walking outdoors.
  • Interclass Similarity. Interclass similarity occurs when activities have similar sensor data characteristics. To deal with this problem, accurate and distinctive features need to be designed and extracted from sensor readings. A human activity recognition system must be general enough to model all possible changes in a particular activity and distinguish between them.
  • Multi-subject Interactions. In many real-world applications, activities are performed with the interaction of other persons and objects. Therefore, it is challenging to track multiple subjects or to recognise group activities. To recognise group-based human activities, a higher level representation must be introduced, which can model the activity as a composition of simpler activities.
  • Data Collection. There is never enough training data, as users might behave differently or perform the same activity in different manners.
  • Class Imbalance. A major challenge is distinguishing activities with subtle differences and imbalanced distributions, which can have significant implications in real-world applications. For example, life-threatening situations like falls or heart attacks are infrequent and may differ only subtly from other daily activities. Recognising them effectively will enhance the robustness of an activity recognition system (a common mitigation is sketched after this list).
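
To give one concrete example for the class-imbalance point above, the sketch below computes balanced class weights on a hypothetical HAR label distribution; the counts and class names are made up for illustration.

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# Hypothetical label distribution for a HAR dataset: falls are rare
# compared with everyday activities such as walking or sitting.
labels = np.array(["walking"] * 900 + ["sitting"] * 950 + ["fall"] * 15)

classes = np.unique(labels)
weights = compute_class_weight(class_weight="balanced", classes=classes, y=labels)
print(dict(zip(classes, np.round(weights, 2))))
# Rare classes receive larger weights, so a classifier (or a loss such as a
# weighted cross-entropy) penalises missing a fall more heavily than missing
# a common activity.
```
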
Sudden and Unexpected Movements

All these challenges translate into real-world challenges for AEB — how well can AEB detect someone crossing the street while running versus walking? Will it react quickly enough in more dynamic scenarios?

If someone runs across the street, a vehicle's AI system must respond faster than it would to a person walking. Sudden and unexpected movements, like a child running into the road, can confuse systems that aren't well-trained to recognise rapid changes in human behaviour. Currently, there's no guarantee that AEB systems will stop in time. This is where real-world data and reinforcement learning come in to help improve systems that anticipate and react to fast-changing activities.
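
A back-of-the-envelope calculation shows why detection latency matters so much for running pedestrians. The speeds, deceleration and latency values below are illustrative assumptions, not measured figures.

```python
def stopping_distance_m(speed_kmh: float, system_latency_s: float, decel_mps2: float = 8.0) -> float:
    """Distance travelled while the system reacts plus the braking distance (v^2 / 2a)."""
    v = speed_kmh / 3.6                      # convert km/h to m/s
    return v * system_latency_s + v * v / (2 * decel_mps2)

# A pedestrian walking at 5 km/h covers about 1.4 m/s; running at 15 km/h,
# about 4.2 m/s, so they enter the vehicle's path roughly three times sooner.
for latency in (0.2, 0.5, 1.0):              # illustrative detection-to-brake latencies
    print(f"latency {latency:.1f}s -> stop from 50 km/h in "
          f"{stopping_distance_m(50, latency):.1f} m")
```

At 50 km/h, each additional half second of latency adds roughly seven metres of travel before braking even begins, which is exactly the margin a running child removes.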

Moreover, systems today often struggle with activities that are uncommon in training datasets. In many real-world cases, there have been accidents where the AI system didn't stop because it failed to recognise the risk in time.

Thatcham Research says AEB is “probably the most significant development in car safety since the seat belt and could save an astonishing 1,100 lives and 122,860 casualties in the UK over the next ten years.”

Today, around 21% of new cars have AEB fitted as standard, while it is optional on 27% of vehicles.

Opportunities to Improve: GenAI and Reinforcement Learning

Autonomous braking systems should offer safe and comfortable brake control, braking neither too early nor too late. Most conventional autonomous braking systems are rule-based, designating a specific brake control protocol for each situation. Unfortunately, this approach is limited in its ability to handle all the scenarios that can occur on real roads.

Generative Artificial Intelligence (GenAI) has emerged as a powerful tool in the development and enhancement of Advanced Driver Assistance Systems (ADAS). By generating diverse, synthetic training data, GenAI could create realistic simulations of rare, dangerous situations, including low-probability but high-risk pedestrian accidents. This could accelerate AI learning without requiring dangerous real-world data collection.

Additionally, GenAI enhances ADAS by interpreting road signs and adapting to dynamic environments, ultimately improving both safety and efficiency in transportation. By expanding the diversity and volume of datasets, GenAI helps models learn a wider array of lane markings, road conditions, and driving scenarios, leading to more reliable object detection and lane boundary recognition.
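
A full GenAI pipeline is beyond a short snippet, but the underlying idea of widening a training set with perturbed variants of existing scenes can be sketched with classic augmentation. The specific transforms below are illustrative stand-ins, not a recommended recipe, and real generative approaches would synthesise entirely new scenes instead.

```python
import torch
from torchvision import transforms

# Illustrative augmentations that crudely mimic harder conditions
# (low light, haze, mirrored road layouts). A generative pipeline would
# go further and synthesise entirely new, rare pedestrian scenarios.
augment = transforms.Compose([
    transforms.ColorJitter(brightness=0.5, contrast=0.4),  # dusk / glare
    transforms.GaussianBlur(kernel_size=5),                # light fog or rain on the lens
    transforms.RandomHorizontalFlip(p=0.5),                # mirrored junctions
])

frame = torch.rand(3, 480, 640)                 # stand-in for a camera frame
augmented_batch = [augment(frame) for _ in range(8)]
print(len(augmented_batch), augmented_batch[0].shape)
```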

Reinforcement Learning (RL) is emerging as a key technique for advancing the capabilities of AEB systems. Unlike traditional machine learning approaches that rely solely on pre-labelled data, RL enables AI systems to learn through trial and error, continuously improving their decision-making processes in complex, real-world driving environments.

DRL-based autonomous braking systems

Deep Reinforcement Learning

Autonomous braking systems utilising Deep Reinforcement Learning (DRL) can intelligently manage the vehicle's speed in situations where a collision is imminent without intervention. The agent (the vehicle) interacts with an uncertain environment in which the position of the obstacle can change over time, so the risk of collision varies at each time step. The agent receives information about the obstacle's position from its sensors and adapts the brake control to the changing state so that the chance of an accident is minimised.
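
To illustrate how such an agent can be framed as states, actions and rewards, here is a deliberately toy braking environment trained with tabular Q-learning standing in for a deep network. Every parameter, the discretisation and the reward shaping are assumptions made purely for the sketch.

```python
import numpy as np

ACTIONS = [0.0, 2.0, 6.0]   # deceleration options in m/s^2: none, mild, hard
DT = 0.1                    # simulation step in seconds

def step(dist, speed, action):
    """Advance the toy environment by one step; return (dist, speed, reward, done)."""
    speed = max(0.0, speed - ACTIONS[action] * DT)
    dist -= speed * DT
    if dist <= 0 and speed > 0:
        return dist, speed, -100.0, True                  # collision with the obstacle
    if speed == 0:
        return dist, speed, 10.0 - ACTIONS[action], True  # stopped; gentler braking scores higher
    return dist, speed, -0.1, False                       # small per-step cost

n_dist, n_speed, n_actions = 30, 15, len(ACTIONS)
q = np.zeros((n_dist, n_speed, n_actions))

def discretise(dist, speed):
    return (min(int(max(dist, 0.0)), n_dist - 1), min(int(speed), n_speed - 1))

rng = np.random.default_rng(0)
for _ in range(5000):                                     # training episodes
    dist, speed = rng.uniform(10, 29), rng.uniform(5, 14)
    done = False
    while not done:
        s = discretise(dist, speed)
        a = int(rng.integers(n_actions)) if rng.random() < 0.1 else int(np.argmax(q[s]))
        dist, speed, reward, done = step(dist, speed, a)
        target = reward + (0.0 if done else 0.99 * np.max(q[discretise(dist, speed)]))
        q[s][a] += 0.1 * (target - q[s][a])               # tabular Q-learning update

print("greedy brake level from 15 m at 10 m/s:",
      ACTIONS[int(np.argmax(q[discretise(15, 10)]))], "m/s^2")
```

In a genuine DRL setup, the Q-table would be replaced by a neural network fed with continuous sensor-derived state, and the hand-written dynamics by a high-fidelity driving simulator.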

Additionally, reinforcement learning could be used to teach autonomous systems how to improve their detection and reaction strategies over time. For example, AI could simulate a scenario like my accident — where a foot is caught by the wheel — and learn the best course of action before and after the accident. By applying RL in virtual simulations, the system would be able to make rapid improvements to prevent similar accidents.

Challenges Ahead

While the potential for AI to improve pedestrian safety is vast, challenges remain:

1. Edge Cases and Anomalies: Training AI models to account for every possible scenario is nearly impossible. AI systems need to learn to handle rare but catastrophic situations, which often remain underrepresented in training data.

2. Ethics and Regulation: Who is responsible if the AI makes a mistake? Determining liability in cases involving AI-driven vehicles remains a significant legal challenge. Ensuring these systems are ethical and transparent in their decision-making will be crucial.

3. Sensor Limitations: While AI models can improve, they are only as good as the data provided by sensors. Environmental factors like fog, snow, or heavy rain can still impede even the most sophisticated systems.

AI has already made vehicles smarter and safer, but there's still much to be done. By advancing detection systems, refining reinforcement learning techniques, and leveraging generative AI, we can push the boundaries of what's possible in road safety. Preventing accidents like mine is not just a technological problem — it's an opportunity to rethink how AI interacts with our everyday lives.

Resources

https://www.rac.co.uk/drive/advice/road-safety/autonomous-emergency-braking-what-you-need-to-know/

https://www.rac.co.uk/drive/advice/road-safety/top-10-safety-features/

D. M. Schwarz, L. Rolland and J. B. Johnston, “Identifying real-world problems with automated vehicles by detecting behavioral differences in steering movements between the human driver and machine,” 2022 IEEE 28th International Conference on Engineering, Technology and Innovation (ICE/ITMC) & 31st International Association For Management of Technology (IAMOT) Joint Conference, Nancy, France, 2022, pp. 1-9, doi: 10.1109/ICE/ITMC-IAMOT55089.2022.10033188.

J. Wang, Y. Chen, S. Hao, X. Peng, and L. Hu. Deep learning for sensor-based activity recognition: A survey. CoRR, abs/1707.03502, 2017.

S. Zhang, Z. Wei, J. Nie, L. Huang, S. Wang, and Z. Li. A review on human activity recognition using vision-based method. Journal of Healthcare Engineering, 2017:1-31, 07 2017.

https://arxiv.org/pdf/1702.02302

https://conti-engineering.com/areas-of-expertise/systems-functions/aeb-system/