When you listen to the radio in your car, you’re listening, but mostly driving. Your hands are on the wheel, your eyes are on the road, and you’re aware of the cars around you, your speed, and your environment. But it’s really easy to take our eyes off the windshield, even just for a second.
Despite strict laws and high fines that punish texting while driving, we still do it, a lot. In 2011, more than three thousand people were killed in crashes involving a distracted driver, and more than three hundred thousand were injured in those crashes.
“People love to multitask and the windshield becomes just one more screen among many screens,” Clifford Nass explains.
Nass is an expert on distraction and multitasking, and he’s one of the people trying to engineer us out of this problem. He directs Stanford University’s automotive program and works at the university’s Volkswagen Automotive Innovation Lab (VAIL). He says our cars may soon conform to reality and let us glance at that text without risking other people’s lives, because they’ll be driving themselves.
You may have already seen some of these driverless cars zipping around Bay Area roads. Automotive scientists predict similar cars will be available for all of us within the next ten years.
However, that doesn’t mean our problems will disappear. Self-driving cars come with their own challenges. To find out what they are, researchers use a high-tech, million dollar, life-like car simulator.
This simulator is almost indistinguishable from the cars in the parking lot outside. It’s a real Toyota Avalon outfitted with a small computer under the hood, surrounded by three giant, movie theater-style screens.
To my surprise, once I start driving, it actually feels just like a real car. Of course, I’m not going anywhere. Within seconds of putting my foot on the gas, I’m cruising around some generic-looking city streets. When I look out the front, toward my side mirror, or in my rearview mirror, I see realistic images of the road and my surroundings, and they move as I drive. The whole experience looks and feels real.
“Your brain isn’t built to think about simulators. Your brain is built so that when it sees something, it thinks of it as real life. So the fact that you are in a car, you’re hitting the gas and brake, the car’s responding totally appropriately, makes your brain say ‘oh, this must be real life,’” Nass explains.
When Nass tests the simulator on research volunteers, they’re usually covered in patches and wires, and hooked up to an EEG that tracks brain activity.
“Because we have the ability to measure every part of what your brain is doing, your heart is doing, your body is doing, we can get extremely accurate measurements,” Nass explains.
There are three types of distraction he’s trying to measure: manual, visual and cognitive. Manually distracted means you’re taking your hands off the wheel. Visually distracted means you’re taking your eyes off the road. Cognitively distracted means you’re taking your mind off the road. Texting hits all three categories.
“We determine what's going on in the brain when you're not just looking at the screen, but when you're thinking about the road as opposed to something else,” Nass says.
Studying our concentration and awareness is going to get a whole lot more complicated when driverless cars hit the road.
After I get the hang of driving the car simulator through an easy course, Nass tells me I can move on to the next level: the semi-autonomous course. Half of the time, the car will completely take over. I won’t have to worry about squirrels, stoplights, or stop signs.
“It will know everything it needs to know that is important because obviously we wouldn't allow cars on the road that couldn't tell about stop signs or stoplights or whatever, and those [obstacles] turn out to be challenging,” he tells me.
All of a sudden, an animated white Mercedes Benz appears in front of me. I don’t have my hands on the wheel. I grab for it, and push the brake. Before I have time to slow down, I realize the car has already done it for me.
“When that happens we are very interested in your ability to react, your reaction time. Later when you have switched from autonomous mode to manual mode, how will you adjust?” he asks.
There’s only ever been one accident in an autonomous car, and it was when a human was driving. Nass says that’s actually a problem because they don’t know what would cause the cars to crash.
“If an autonomous car drives perfectly safely in autonomous mode, but drivers become incredibly unsafe, will autonomous cars that are perfectly safe nonetheless become unsafe? It’s a weird paradox,” Nass says.
After the Benz incident, the car asks me to take over. I try to be on my best driving behavior. But it doesn’t take long before I let my mind wander again. While I am tuned out, a pedestrian runs into the simulated street. This time, the car doesn’t catch my mistake. If this were real life, I would have hit a real person. It’s scary.
“At least for the next 20 years we’re going to have a mix of hybrid, autonomous and non-autonomous vehicles on the road,” says Nass. “And we have to figure out how to deal with that.”
Nass says one way to do that is through smart design. The Lab is working with car companies like Toyota and Nissan to experiment with different ideas.
“We actually have a camera pointed out toward the windshield, and if the driver starts looking at, let’s say, the center panel in the middle of the car for more than two seconds, a picture of the road pops up,” Nass explains. “So you see what you see on the windshield. You don’t want to look where you should be looking? We’ll put the road wherever you’re looking!”
Nass and the Lab plan to run the simulator thousands of times, on all sorts of drivers, each time getting a better look at how we interact with our cars. The hope is to make us better drivers when we take the wheel, and when we’re away from it.