New driver attention study “could help turn the corner on road safety”

A new study is underway to test how drivers can be overwhelmed by information at intersections, and to work out ways to prevent that overload from causing deadly crashes on roads.

The University of Toronto in Canada is evaluating driver attention and gaze towards pedestrians and cyclists at junctions, using eye-tracking equipment to better understand the interplay between driver attention, infrastructure design and collisions.

The study brings together the University's department of mechanical and industrial engineering with its departments of psychology and of geography and planning, along with transportation safety specialists.

Around 50 participants will drive a pre-assigned route while wearing eye-tracking glasses in a vehicle fitted with cameras capturing internal and external scenes. Participants will drive for around 30 to 40 minutes each, with a break in between to prevent fatigue.

“We have a camera that faces the driver, which gives us information about their body position, head movement, emotional reactions and so on, and we have a road-facing camera which gives us a dashboard view to see objectively what the scene is in front of the vehicle as the person drives,” says Joelle Girgis, a second-year master’s student in Birsen Donmez’s lab who will be leading the data collection and analysis. “The most critical equipment would be the eye-tracking glasses. These glasses will give us a view of what drivers are looking at, even as they move their heads.

“This way, we know both what the objective road scene is in front of them, as well as where they’re gazing specifically.”

Once the data is collected, drivers’ turns at intersections will be coded according to whether they gazed at areas previously identified as important.

“The question we’re asking is: Did they or did they not look at certain critical areas where a pedestrian or cyclist may appear?” says Girgis, whose master’s thesis will focus on the study. “So, we’ll view the videos and decide on whether the driver did or did not pay attention or directly gaze at a pre-determined area of importance.

“That gives us information that we can then turn into trends and statistics to see gaps. For example, drivers might not be checking their blind spot or right mirror when they’re stopped at a red light and need to turn right; maybe they’re overwhelmed; maybe there’s a lot of traffic coming from the left side; they’re also trying not to hit a pedestrian in front of them.

“So, there are all these things related to cognitive load that we might be able to infer based on the specific circumstance of where they are – and are not – looking.”
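For illustration, gaze codes like those Girgis describes could be turned into per-area miss rates. The sketch below assumes a hypothetical coding scheme; the field names and values are invented for the example and are not the study's actual data.

```python
# Illustrative sketch only: summarize hypothetical gaze codes into
# miss rates per pre-determined area of importance. The coding
# scheme here is an assumption, not the study's real one.
from collections import defaultdict

# Each record is one coded turn: True means the driver gazed at
# that critical area during the turn, False means they did not.
coded_turns = [
    {"turn": "right_on_red", "right_mirror": False, "blind_spot": False, "crosswalk": True},
    {"turn": "right_on_red", "right_mirror": True, "blind_spot": False, "crosswalk": True},
    {"turn": "left_unprotected", "right_mirror": True, "blind_spot": True, "crosswalk": False},
]

def miss_rates(turns):
    """Fraction of turns in which each critical area was NOT checked."""
    misses, totals = defaultdict(int), defaultdict(int)
    for record in turns:
        for area, gazed in record.items():
            if area == "turn":  # skip the turn-type label
                continue
            totals[area] += 1
            if not gazed:
                misses[area] += 1
    return {area: misses[area] / totals[area] for area in totals}

rates = miss_rates(coded_turns)
# e.g. rates["blind_spot"] is the share of turns where the blind
# spot was never checked, the kind of gap the researchers look for.
```

Aggregates like these are what would then feed the "trends and statistics" Girgis mentions, highlighting which checks drivers most often skip in which situations.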

Girgis says driving routes and intersections will be chosen based on data about problem areas, as well as to cover different kinds of infrastructure and turn situations.

Donmez notes that driving data collected by the researchers still represents a “best-case scenario” since it won’t be able to take into account common in-car distractions such as cell phones and conversations with passengers.

“The participants aren’t doing anything else. They’re just focusing on their driving. Although it is an unfamiliar vehicle – so there’s that caveat there – I would expect that they have a lot less failures in our study compared to how they normally drive,” she says, adding that future studies may drill down on particular problem areas with the ultimate goal of informing policy and infrastructure design.

She says her research shouldn’t be interpreted as blaming drivers. Rather, its focus is on finding ways to reduce the number of things competing for drivers’ attention – with the result being a safer environment for everyone.

(Picture – University of Toronto, Birsen Donmez)
