When people are actually comfortable enough to take an automated car out for a spin, who exactly is in charge? And, if you get into an accident, who’s held responsible?
Earlier this year, the National Highway Traffic Safety Administration decided that under federal law, the computer in Google’s self-driving car could be deemed the “driver.” It’s a big step toward getting autonomous vehicles like the Google car on the road. And it means that if a driverless car were to get into an accident, hit someone or run a red light, the computer — and not the human — would be at fault.
Stephen Wu is a lawyer with the Silicon Valley Law Group, and he specializes in subjects like robots and driverless cars. KALW’s Angela Johnston asked him to explain the legal and ethical implications of letting the computer take the wheel.
STEPHEN WU: We would be putting a lot of responsibility in the hands of the manufacturers. That's why having people on the staff of manufacturers thinking about ethical issues is critical.
Click the audio player above to listen to the interview.