Who's to blame when an autonomous car hits a pedestrian? Car companies are already trying to settle that question in their own favor.
In the months since an Arizona pedestrian was killed by a driverless car, the companies developing the technology have been trying to shift blame onto people on foot, as it becomes increasingly clear that self-driving cars have trouble detecting pedestrians, Jeremy Kahn reports at Bloomberg:
Driverless proponents ... say there’s one surefire shortcut to getting self-driving cars on the streets sooner: persuade pedestrians to behave less erratically. If they use crosswalks, where there are contextual clues—pavement markings and stop lights—the software is more likely to identify them.
“What we tell people is, ‘Please be lawful and please be considerate,’” Andrew Ng, a machine learning researcher whose venture fund invests in driverless startups, told Bloomberg.
In other words, as the Bloomberg piece concluded: "no jaywalking."
Elaine Herzberg was, technically, crossing illegally when she became the first pedestrian killed by a self-driving car this spring, in a crash that revealed major problems with autonomous tech. The Uber Volvo SUV that hit her as she walked her bike in a pedestrian-heavy area had a hard time identifying her, and the car was programmed not to brake if it believed it had detected a "false positive." The National Transportation Safety Board has not yet issued its final report on that crash, including a determination of who was at fault.
But there is more than enough evidence in the NTSB's preliminary report to show that Uber made a lot of dangerous mistakes. For example, the backup driver was watching a television show on her phone at the time of the crash. And Uber had programmed the car to avoid braking, in part because the company made an active decision to deprioritize safety in its rush to get to market.
But jaywalking is something that people do, and, indeed, until fairly recently, did with impunity. Laws criminalizing jaywalking were initially promoted by car companies about a century ago to shift the blame for traffic injuries and deaths onto the pedestrians their vehicles were killing.
“If you ask people today what a street is for, they will say cars,” Peter Norton, an assistant professor at the University of Virginia and the author of Fighting Traffic: The Dawn of the Motor Age in the American City, told CityLab. “That’s practically the opposite of what they would have said 100 years ago.”
And now we have Andrew Ng and his high-tech cohort who want to go further than merely summonsing pedestrians: they want to turn pedestrians into robots so that the robot cars can avoid them. (As a point of information, Ng's quote itself suggests he doesn't understand crosswalk laws. Every intersection is technically an unmarked crosswalk, a legal crossing zone, even without "contextual clues" like stripes and traffic lights that could help alert computer systems.)
Beyond that, self-driving cars are billed as a major safety breakthrough that will rid us of the single biggest adverse effect of human drivers: how frequently and violently they kill their fellow road users. Driverless proponents have argued that the cars shouldn't be subject to existing safety regulations because anything that delays the widespread adoption of autonomous vehicles could cost lives.
So it's disturbing to see the technology's promoters now placing responsibility for safety on external actors, especially the people most vulnerable to its shortcomings.
A potentially larger question — who will be liable in crashes between driverless cars and pedestrians — remains largely open.
The Governors Highway Safety Association recently discussed the situation in a report sponsored by State Farm Insurance. The organization laid out a scenario in which a pedestrian signals to a car that he plans to cross mid-block, but is struck anyway. Who's at fault in such situations, the GHSA wonders?
It is sad that this is even a question open for debate. "Jaywalking" shouldn't be punishable by death. And pedestrians should not be re-educated into robotic machines that move in predictable ways to meet the demands of programmers of robotic cars. It's supposed to be the other way around.
But not to the GHSA, which suggests that "new public outreach or even enforcement efforts" might be needed to make sure pedestrians stay in line.
More disturbingly, other tech industry insiders are eager to unleash their programming against "repeat offender" jaywalkers who mess with their precious code.
"Pedestrians and pranksters, knowing that the cars are programmed to yield to any in their path, could bring traffic to a halt," CNN Tech reported, apparently giving voice to the techies. "Outfitting the cars with facial recognition technology could help identify violators, but that raises its own tricky issues."
For now, the Twittersphere is still treating this as a joke ...
Autonomus car: "I ran over that person in the road because my driving circuits didn't see her"
Same autonomous car: "My jaywalking detection circuits got a picture of her face though" https://t.co/MEfZkenHh2
— Matt Carphree (@MattyCiii) August 15, 2018
But the future has the potential to be scary — and not just because there's no one behind the wheel of that 2,500-pound metal cage speeding down the roadway. What's truly scary is that driverless car makers seem to want to shift the blame for crashes onto pedestrians who "misbehave" or, in other words, resist the coders' re-education campaign.
And that debate is not about safety, but about accountability. If humans have to alter their behavior to accommodate the machines they create, we're one step closer to a sci-fi dystopia.