
Passion for Technology
Discover the latest technology trends, meet technology enthusiasts, understand what’s behind much-discussed technological terms and get inspired by our passion for technology.
Can AI in cars reduce accidents caused by human error?
This article readout is part of The Quintessence magazine. The current issue explores the latest trends in technology and offers valuable insights into the fascinating world of Human Machine Interfaces. Access it free of charge here: https://library.ebv.com/link/140915/
In this episode, we explore how advanced driver monitoring systems are making cars safer by keeping drivers alert and aware. From infrared cameras that track eye movements to AI-driven analysis of heart and respiratory rates, these systems are designed to prevent accidents caused by fatigue and distraction. Discover how multi-sensor solutions, combining radar and imaging technologies, are providing deeper insights into driver health and readiness.
Cars that Take Care of Their Passengers
By 2030, the EU aims to halve the number of traffic deaths and injuries. This ambitious endeavour encompasses everything from mandating state-of-the-art vehicle technologies to modernising infrastructure. However, one factor plays a particularly significant role: the human. More than 90 percent of all accidents are caused by human error. In addition to violations such as speeding and driving under the influence of alcohol, it matters greatly whether the driver is tired or distracted. According to the European Commission, 10 to 20 percent of accidents and near-accidents occur as a result of fatigue.
To address this problem, the European Commission published a regulation in August 2021 that, effective in July 2022, mandates the use of Driver Drowsiness and Attention Warning (DDAW) systems. These systems assess the driver’s vigilance by analysing signals from other vehicle systems, such as steering and lane keeping, and warn the driver if necessary.
However, relying solely on data from other vehicle systems is not necessarily sufficient to assess a driver’s condition. Therefore, from mid-2024 in the EU, new vehicles must be equipped with an Advanced Driver Distraction Warning (ADDW) system. The first generation of ADDW solutions primarily relied on the driver’s eye movements: a camera with a CMOS image sensor monitors the driver using invisible infrared light. “The infrared light generates a reflection on the cornea of the eye, which is captured by the camera,” explains Martin Wittmann, marketing director for the sensor division at OSRAM Opto Semiconductors. “By tracking the direction of gaze, we can see whether the driver is looking at the road. The size of the pupil also indicates how awake the driver is. Finally, we can also recognise when the driver becomes tired by the movements of the eyelids.” When this is the case, the system warns the driver and redirects their attention to the road.
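The eyelid-based approach described above is often quantified with PERCLOS (percentage of eyelid closure over time), a standard drowsiness metric. The sketch below illustrates the idea on synthetic per-frame eye-openness values; the threshold and the sample data are illustrative assumptions, not OSRAM’s implementation.

```python
def perclos(eye_openness, threshold=0.2):
    """Fraction of frames in which the eyes are mostly closed (PERCLOS).

    eye_openness: per-frame openness ratios in [0, 1], e.g. derived from
    eyelid landmarks in the infrared camera image. Frames below
    `threshold` count as "closed"; a sustained high PERCLOS value is a
    common indicator of drowsiness.
    """
    closed = sum(1 for v in eye_openness if v < threshold)
    return closed / len(eye_openness)

# An alert driver shows only brief blinks; a drowsy driver shows long,
# slow eyelid closures (values are synthetic for illustration).
alert  = [0.8, 0.7, 0.1, 0.8, 0.9, 0.8, 0.7, 0.8, 0.9, 0.8]
drowsy = [0.4, 0.1, 0.1, 0.1, 0.3, 0.1, 0.1, 0.2, 0.1, 0.1]
print(perclos(alert))   # low: only one blink frame
print(perclos(drowsy))  # high: eyes closed most of the window
```

In a real system this fraction would be computed over a sliding time window and combined with gaze direction and pupil size before triggering a warning.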
Fatigue and lack of attention are highly complex states, so the latest generation of solutions captures additional parameters beyond eye movement. The company Smart Eye has integrated the capture of vital signs into its driver monitoring software.
Using AI methods, the new function analyses several physiological signals to accurately determine the driver’s heart and respiratory rate. Smart Eye, in particular, uses remote photoplethysmography (rPPG), a contactless, camera-based method that measures fluctuations in light reflection from the skin to estimate heart rate. Another method is micro-movement analysis, which allows the software to detect subtle changes in movements associated with breathing or pulse that are not visible to the human eye. “By integrating heart and respiration rate detection into the driver monitoring system software, we provide an even deeper layer of insight into the driver’s state of health,” says Henrik Lind, Chief Research Officer at Smart Eye. This can be lifesaving if, for example, a driver suffers a heart attack or seizure.
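The rPPG principle described above can be sketched in a few lines: average the skin’s green-channel brightness per video frame, then find the dominant frequency in the physiologically plausible pulse band. The function and the synthetic trace below are a simplified illustration of the general technique, not Smart Eye’s algorithm.

```python
import numpy as np

def estimate_heart_rate(green_means, fps, lo_hz=0.7, hi_hz=3.0):
    """Estimate heart rate in bpm from per-frame mean green-channel values.

    rPPG idea: the pulse causes tiny periodic changes in skin reflectance.
    We remove the DC offset and pick the strongest frequency in the
    plausible band (0.7-3 Hz, i.e. roughly 42-180 bpm).
    """
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()                 # remove constant brightness
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)      # plausible pulse band only
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                           # Hz -> beats per minute

# Synthetic demo: a 1.2 Hz (72 bpm) pulse buried in noise, 30 fps camera
np.random.seed(0)
fps, seconds = 30, 20
t = np.arange(fps * seconds) / fps
trace = 120.0 + 0.05 * np.sin(2 * np.pi * 1.2 * t) \
              + 0.02 * np.random.randn(t.size)
print(round(estimate_heart_rate(trace, fps)))  # → 72
```

Production systems add face tracking, motion compensation and bandpass filtering on top of this basic spectral estimate; respiratory rate can be recovered analogously from a lower frequency band.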
Multi-sensor systems, such as those being developed jointly by emotion3D, Chuhang Tech and SAT, enable an even more accurate assessment of the driver’s condition. The “human analysis” software from emotion3D derives information about the driver from camera images, while Chuhang Tech’s radar solutions analyse the driver’s vital parameters. These two measurement methods are combined with SAT’s algorithms for predicting sleep onset. Wogong Zhang, CTO and co-founder of Chuhang Tech, says: “We believe that our combined solution, which combines radar technology with advanced imaging algorithms, will revolutionise fatigue detection.”
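One simple way to combine such heterogeneous estimates is a confidence-weighted average, so that a channel with a poor reading (for example, the camera in bad lighting) contributes less. The sketch below is a generic fusion illustration under that assumption, not the partners’ actual algorithm.

```python
def fuse_fatigue_estimates(estimates):
    """Confidence-weighted fusion of (fatigue_score, confidence) pairs.

    Each channel (camera analysis, radar vitals, sleep-onset model)
    reports a fatigue score in [0, 1] plus its confidence in that
    reading. Returns the fused score, or None if no channel is usable.
    """
    total_weight = sum(conf for _, conf in estimates)
    if total_weight == 0:
        return None  # no sensor produced a usable reading
    weighted = sum(score * conf for score, conf in estimates)
    return weighted / total_weight

# Camera sees droopy eyelids but lighting is poor (low confidence);
# radar vitals and the sleep-onset model are more reliable here.
readings = [(0.8, 0.3), (0.6, 0.9), (0.5, 0.6)]
print(round(fuse_fatigue_estimates(readings), 3))  # → 0.6
```

Real fusion stacks are considerably more sophisticated (temporal filtering, learned models), but the principle of down-weighting unreliable channels is the same.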
Driver monitoring systems are becoming increasingly important in view of the growing automation of driving. The more autonomous a vehicle becomes, the better its safety systems need to be – for example, to monitor whether the driver is ready to take back control of the car in a difficult situation. “Particularly well-functioning systems, especially in areas such as adaptive cruise control and lane keeping, tempt many road users to turn to tasks other than driving,” said Jann Fehlauer, Managing Director of DEKRA Automobil, at the presentation of the DEKRA Road Safety Report 2023. Several serious accidents have already resulted from such misjudgements.