Active Auto Safety Gets in Your Face



Cars are getting smarter and beginning to react on their own, but the gray matter manning the helm is still the vehicle’s Achilles’ heel. So to really get inside a driver’s head, automakers are going through the face, analyzing expressions and muscle movements to determine whether the person at the wheel is too distracted, too tired or even too angry to safely control their ride.


In conjunction with PSA Peugeot Citroën, scientists at the Transportation Center and Signal Processing 5 Laboratory of Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland are developing a technology that uses a camera to capture facial expressions and software to look for telltale signs of distraction as well as emotions that could indicate that the driver is not up to the task at hand. Think of it as a concerned co-pilot who can not only read your mood, but also take action before your mental condition clouds your driving decisions.
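Neither EPFL nor PSA has published the underlying software, but the general shape of such a pipeline is familiar from computer vision: detect the face in each camera frame, crop it, and pass it to a classifier that estimates the driver’s state. Below is a minimal sketch of that front end, assuming OpenCV and its bundled Haar-cascade face detector; the classify_driver_state function is purely a placeholder for illustration, not EPFL’s actual model.

```python
import cv2

# Frontal-face Haar cascade that ships with OpenCV.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_driver_state(face_roi):
    """Placeholder for an expression/state classifier (distracted,
    drowsy, angry, ...). EPFL's real model is not public, so this
    sketch simply returns a dummy label."""
    return "attentive"

cap = cv2.VideoCapture(0)  # stands in for the in-cabin camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1,
                                           minNeighbors=5)
    for (x, y, w, h) in faces:
        state = classify_driver_state(gray[y:y + h, x:x + w])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, state, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("driver monitor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

A production system would presumably swap the webcam for an infrared in-cabin camera and the placeholder for a trained expression model, but the detect-crop-classify loop is the common skeleton.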


“We aim at optimally exploiting computer vision technologies to improve safety and comfort in cars through more natural human-machine interfaces,” Jean-Philippe Thiran, the director of EPFL’s Signal Processing 5 Laboratory, told Wired.


Olivier Pajot, PSA Peugeot Citroën’s EPFL representative, said in a statement that the automaker is using the research to “make the interface between the car and the driver more intuitive,” and that reading intentions from a driver’s facial expression “is a very natural interactive mode.”


While facial-recognition technology has become commonplace for everything from surveillance to social media data mining, applying it in the car presents a unique set of challenges, starting with where to place the camera so that it doesn’t obstruct the driver’s view. “One of the possible options is to place it behind the steering wheel, which would require a system that’s quick enough to recapture the face detection after the arms of the steering wheel interrupt it,” Thiran said.
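The article doesn’t say how the EPFL system recovers from those momentary interruptions, but one simple and common strategy is to coast on the last confirmed face position for a handful of frames before declaring the face lost. The sketch below illustrates the idea; the FaceTracker class and the injected detect_faces function are assumptions of this example, not part of EPFL’s system.

```python
class FaceTracker:
    """Tolerate brief occlusions, e.g. a steering-wheel spoke sweeping
    past the camera, by reusing the last known face box for a few frames."""

    def __init__(self, detect_faces, max_misses=10):
        self.detect_faces = detect_faces  # any detector returning face boxes
        self.max_misses = max_misses      # frames tolerated with no detection
        self.last_box = None
        self.misses = 0

    def update(self, gray_frame):
        boxes = self.detect_faces(gray_frame)
        if len(boxes) > 0:
            self.last_box = tuple(boxes[0])  # keep the most prominent face
            self.misses = 0
        else:
            self.misses += 1
            if self.misses > self.max_misses:
                self.last_box = None         # occlusion too long: face lost
        return self.last_box
```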


Another issue is adapting to changing lighting conditions, such as when a car enters a tunnel or drives at night, which is when active safety systems are most beneficial and when nodding off at the wheel usually happens. EPFL also acknowledges that the technology has to perform as well when drivers turn their heads as it does when they face the camera.
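How the researchers cope with those lighting swings isn’t detailed, but a standard first step is to normalize contrast before detection, for example with contrast-limited adaptive histogram equalization, as in this OpenCV sketch:

```python
import cv2

# CLAHE evens out harsh lighting changes (tunnel entrances, oncoming
# headlights, nighttime frames) before the face detector sees the image.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))

def normalize_lighting(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return clahe.apply(gray)
```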


The next step is to test the facial-recognition technology in real-world conditions. It’s currently fitted to a prototype vehicle, and EPFL is refining the technology by increasing the number of images processed.


Other automakers and suppliers are also exploring driver-facing cameras for safety research, pointing a lens at the driver’s mug behind the wheel and capturing and analyzing the data.


Toshiba showed a facial-recognition system that searches for distraction and also allows tuning the radio with a blink of an eye. BMW’s “pupilometry” research focuses on tracking drivers’ eyeballs to better understand how much visual stimuli can be absorbed before distraction ensues, while the Swedish company Tobii Technology has developed a system that watches a driver’s eyes – even behind sunglasses – to tell if they’re glued to their smartphone or showing signs of fatigue.
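None of these companies disclose their algorithms, but one widely used building block for blink and drowsiness detection is the eye aspect ratio, computed from six landmark points around each eye: it stays roughly constant while the eye is open and collapses toward zero during a blink. The sketch below shows only that metric and assumes a separate landmark detector supplies the points.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """Eye aspect ratio from six (x, y) landmarks ordered around the eye.
    A low ratio sustained over many frames (a PERCLOS-style measure) is a
    common drowsiness cue."""
    eye = np.asarray(eye, dtype=float)
    vertical = (np.linalg.norm(eye[1] - eye[5]) +
                np.linalg.norm(eye[2] - eye[4]))
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

# Example with made-up landmark coordinates for an open eye:
open_eye = [(0, 0), (1, -1), (2, -1), (3, 0), (2, 1), (1, 1)]
print(eye_aspect_ratio(open_eye))  # ~0.67 here; a blink drops it sharply
```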


“Our goal is to build the technological base to detect and situate a driver’s face at any moment in time,” Thiran said. “Using this, it will then be possible to build and test various driver assistance applications such as eye tracking, fatigue detection, lip reading and so on,” although it’s doubtful that your car could automatically book you into an anger-management program. For now, at least.

