This is a Link Fu post: I'll be finding news articles and opinions etc. about the first pedestrian fatality involving a self-driving car while self-driving (safety driver did not engage manual mode). There are people who are pretty committed to the idea that this could be a moment that changes the energy / politics of self-driving cars. I'm skeptical, but open to the possibility.
Here is an example of an opinion piece speculating / pushing the idea that a pedestrian death by self-driving car is an argument for slowing down / stopping self-driving car development / testing on public streets.
https://slate.com/technology/2018/03/uber-crash-kills-woman-in-first-pedestrian-death-caused-by-a-self-driving-car.html

First: Who, What, Where, When
I would personally suggest you do some hard thinking before you click on this link, maybe turn off auto-play if you still have it turned on. Police have released the car video from the event; it is embedded in this coverage.
The basics: at 10 p.m. on a clear, dry roadway in Tempe, AZ, an Uber in autonomous mode struck a woman who later died of her injuries. She was walking her bike across the road (so that's why it is a pedestrian fatality, even tho a bike is present) at a location without a crosswalk or a crossing road. The event occurred after she stepped off / through the median in the middle of a four-lane road, in the far right lane.
https://www.reuters.com/article/us-autos-selfdriving-uber/arizona-police-release-video-of-fatal-collision-with-uber-self-driving-suv-idUSKBN1GX39A

Unstated anywhere in this particular coverage is whether this is an event which would have occurred if the car had been driven by a human.
"“The sensors should have detected the pedestrian in this case; the cameras were likely useless but both the radars and the Lidar must have picked up the pedestrian,” said Raj Rajkumar, a professor at Carnegie Mellon."
Wired coverage, also with the car video:
https://www.wired.com/story/uber-self-driving-crash-video-arizona/

This one is even more focused on why the _non_-camera sensors (the lidar and radar) failed to detect the pedestrian / accurately anticipate what the pedestrian was about to do. It discusses in some detail whether perfect attention is a realistic expectation of safety drivers / people charged with monitoring.
Neither of these two articles gets into street design issues.
This article pre-dates the video release:
https://www.sfchronicle.com/business/article/Exclusive-Tempe-police-chief-says-early-probe-12765481.php

The police are much more concerned with assessing this event against what an ordinary driver in an ordinary car might have done. Some coverage has said the car was going 40. This coverage indicates 38 in a 35. Earlier coverage of self-driving cars commonly included a lot of speculation about whether their strict adherence to posted speed limits might cause accidents as human drivers went around them.
"The incident happened within perhaps 100 yards of a crosswalk, Moir said. “It is dangerous to cross roadways in the evening hour when well-illuminated, managed crosswalks are available,” she said."
Moir is Tempe's police chief, according to this article. This is an indication of how street design plays a role in accidents. 100 yards is a long walk (and twice that, if your destination is directly across the road, even longer) while pushing a bike laden with bags. Most adults drive, and can become highly insensitive to how distance and incline are experienced by people who are walking or using human-powered vehicles. The pictures and the video also show that this area is poorly illuminated.
https://www.azcentral.com/story/news/local/tempe-breaking/2018/03/19/woman-dies-fatal-hit-strikes-self-driving-uber-crossing-road-tempe/438256002/

"A large median at the site of the crash has signs warning people not to cross mid-block and to instead use the crosswalk to the north at Curry. But the median also has a brick pathway that cuts through the desert landscaping that accommodates people who do cross at that site."
The physical hardscape strongly suggests that if we were to go digging through accident data for this location, we would find a lot of people crossing illegally and potentially unsafely at this location. Rather than light, signal and crosswalk the location, Tempe has chosen to try to "sign" the problem out of existence.
As a variety of sources (above and elsewhere) have noted, the driver has a criminal record and was not looking at the road for seconds at a time, and the deceased woman was homeless, had substance abuse issues and mental health challenges, and was crossing the road at a location where pedestrians are not supposed to cross.
I'm trying to figure out how much coverage there would have been if this had not involved a self-driving car. I'm also curious how _often_ some variation on this accident has occurred at this _exact same location_, but with a human driven car.
Lidar and other laser sensing equipment is supposed to help with exactly this kind of situation in darkness. But I'm a little skeptical of those solutions, because I've lived with cars that have parking sensors, and I'll just straight up tell you that every little bit of dirty snow that gets thrown up on the sensors wreaks havoc with the information coming from them. If self-driving cars require those kinds of sensors to do their thing, self-driving cars are not going to happen in large parts of our country. Ever. And while we could do all kinds of things to the vehicles on the road, I personally think that if a trip through the Tempe police records turns up a lot of accidents at this location, the money would be much better spent lighting the location and putting up a crosswalk control so peds could request a red. Pretty sure that Uber car would have stopped for a red light, altho of course we can't be sure.
But if a self-driving car ran a red and killed a pedestrian, we would DEFINITELY then have a moment where we could draw back from self-driving cars on public roads and go, we have more than enough human drivers doing _that_ already. We don't need to automate the process.