Monday, April 25, 2022
April is National Distracted Driving Awareness Month. Throughout the month, Marc Gentzler, professor, psychology, will share his expertise on driver safety discovered through years of research. Marc, a professor at Valencia since 2013, obtained his doctoral degree in human factors psychology in 2014 from the University of Central Florida, with a focus on the neuroscientific aspects involved in driving. His dissertation was titled “Driving performance adaptation through practice with and without distracters in a simulated environment.” He has 14 peer-reviewed publications, 15 conference presentations and 14 conference poster presentations. Further, Marc reviews papers in his field and has previously consulted on the perceptual and cognitive factors involved in real car accident cases.
By Marc Gentzler, Professor, Psychology
One type of automation that has existed for a long time is cruise control. In new cars today, we have many different modes of automation. We now have the possibility of fully autonomous cars that can literally drive themselves without any driver input (in theory). But before I get more into that, I want to explain how over-reliance on automation in aviation can be deadly.
Possibly the most famous crash involving automation was Eastern Airlines Flight 401 in the Miami Everglades back in the early 70s. The L-1011 was a very technologically advanced jet for its time, and, in fact, Lockheed loved touting that the plane could land itself. One night in late December, as Flight 401 was getting ready to land in Miami, the crew noticed that the nose gear light indicated the gear was not down. The crew then put the plane on autopilot while trying to figure out what the problem was and what to do about it.
At some point, the autopilot got shut off and no one noticed. The plane began slowly descending into the Everglades, and by the time the crew realized what was happening, it was too late. The saddest part is that the gear was down the whole time, and the problem was just a bad lightbulb in the cockpit.
So what can we learn from aviation accidents such as this one regarding automation in driving? Many cars today have automatic braking that engages when the car detects a hazard ahead, lane monitoring (usually a small light in the side rear-view mirrors that indicates when there is a hazard to the side) and even technology to keep the car centered in its lane. These technologies can certainly help reduce the chance of an accident.
But what are the potential cons? Could the driver become over-reliant on the automation, much like in some aviation accidents? Could the driver not bother looking before changing lanes because they have lane assistance, or pay less attention and decide to text while driving because they know they have automatic braking? Could this automation lead to a degradation in driving skills because those skills are not relied on as much?
Drivers must always be ready to take over if the automation fails; thus, they must pay attention regardless of how much automation their vehicle has. I’m not arguing that these examples of vehicle automation mentioned above are bad ideas, but I am suggesting that we be cautious when implementing these technologies in terms of the possible disadvantages.
Another potential con is that drivers might lose situational awareness (a sense of where they and other objects are in relation to one another) if they rely on automation. Take the crash of Air France 447, for example. The pilots were caught off guard during cruise flight over the Atlantic when the autopilot suddenly shut off. They seemed disoriented, not knowing what was happening or how to correct the issue.
The situation is analogous to a driver who fails to pay attention to the road, gets startled when the automation disengages and then does not have enough time to get oriented to the current driving situation.
There have been several crashes involving Tesla’s semi-autonomous vehicles over the years. In one example, a Tesla driver appeared to have their hands off the wheel while driving and was focusing on something like a cellphone just before the crash. The National Transportation Safety Board (NTSB) concluded that the crash was “due to inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla Autopilot design, which permitted the driver to disengage from the driving task.” Tesla has stated that those driving their semi-autonomous vehicles must pay attention and be prepared to take over.
I will say that I am against fully autonomous cars. Many people assume that with a fully autonomous vehicle, they could just read a book or take a nap while the car drives them to their destination. But that is not really the case. Again, the automation could always fail, and the driver needs to be ready to take over at any second. The more distracted they are, the less situational awareness and longer reaction time they will have.
The problem with vigilance tasks (monitoring the automation) is that they can lead to boredom and fatigue — both of which are bad for driving. Many claim that driving is boring, especially when traveling long distances. But one can imagine that boredom will only increase if the driver is not actually controlling the vehicle and is merely monitoring the automation.
Having to drive the vehicle should enhance alertness. Many suggest that fully autonomous cars will help elderly individuals with transportation if they are no longer safe to drive. But I would argue that if they are not safe to drive, then they may not be ready to take over when needed.
In the end, I think it is better to have a human driving with the automation helping the driver, as opposed to the other way around, where the automation drives and the driver is just there to aid the automation. Just because we have the technology to do something doesn’t mean that we should.
We are actually pretty good drivers when we are focused. Many accidents occur daily, but you need to compare the accident rate to the number of cars on the road and miles driven.
Can we create a machine that drives as well as a human? Probably not — replicating the human brain will likely always be an impossible task. Our brains are just too complex.
Going back to aviation, modern airliners can fly themselves. Nevertheless, a human needs to monitor the system.
I will leave you with this question: Would you feel comfortable if there were no pilots aboard an airliner on which you were traveling?
I would like to thank you all for reading this article series on driving safety. I hope that it was both interesting and informative, and maybe made you think about driving in some different ways.