a bit more about self-driving cars
Feb. 6th, 2012 03:18 pm
Linkage free. Probably.
I've been grilling people in my life who don't have a lot of opportunity to run away from my questions (husband, sister, that type of thing) about the circumstances under which they would be willing to ride in a self-driving car and/or buy one for themselves. I'm bugging people about this because I'm having a failure of imagination: I can't figure out what would convince _me_ to get into and/or purchase a self-driving car. Partly, this is because there are inadequate constraints on the problem: you can't buy one of these things, much less choose between multiple makes/models (that BMW 5 series won't be out for at least a decade, and the whole thing could turn out to be like jetpacks).
But if the NYT is out there asking about legal issues, I figure I might as well contemplate marketing and policy.
(1) I'm not running across a lot of I'll-never-give-up-driving sentiment, whether because people lluuuuurrvvvee driving or just plain don't trust the technology. I'm sure those people exist; I don't think they are so common as to preclude the deployment of self-driving vehicles, assuming the technology can be made viable.
(2) A series of large studies by reputable people that produced a result roughly comparable to the second-hand-smoke findings would convince a lot of people to switch, even if the majority had not yet switched (that is, if self-driving cars could be demonstrated to have a safety benefit over human-operated cars analogous to living in a smoke-free environment vs. being around smokers).
(3) An "emergency" version might be widely accepted by people who are right on the edge of being able to drive at all. That is, a button that invoked self-drive when the operator feels they cannot safely operate the vehicle but also feels like they can't safely stop driving (this came up in a medical context, but I'd call this roughly the "I'm about to fall asleep but I'm still a few miles from where I'm going to get into a bed" button). There's probably a population of people with license suspensions and/or breath interlocks that might be a market for self-drive. R. agrees with me that an aging population that doesn't feel like they can give up their cars might go for this, too.
The obvious early-adopter crowd might well find this whole category of technology every bit as snooze-worthy as a lot of people found e-readers. However, (3) suggests there's a market to seed early adoption. And (2) suggests a way to transition from early adoption to mass adoption. And (1) suggests mass adoption is viable.
Let's see if the technical issues are, er, superable.
no subject
Date: 2012-02-06 09:17 pm (UTC)
One issue, maybe not exactly legal, I think would be the general class of issues you run into with a GPS that has maps that are not 100% accurate due to new construction. My house was built in 2007-ish, and if you plug my address into most map systems, it can't find it. And there have been at least a few times my car's GPS has told me to hang a right turn through a traffic wall. I'm sure that the google car won't merrily drive into a lake where there was once a bridge, but if it did, who's responsible? The owner? Google? The map company? The state that tore down the bridge?
I also wonder what would happen if a person broke a restraining order by going to someone's address in a self-driving car. Is the person responsible because they programmed the car to go there? If I punch in the address within a restricted military base, does the car say "I can't do that", or could I blame the trespassing on the car? Or what if I punch in the address to the White Sands K-Mart and the car routes me through the middle of an active missile range? Google Maps seems to have some takedown policy where certain things aren't listed. Would the car system have a similar restricted list? Who maintains that? If I work at google and I'm pissed off at my ex-whatever, could I make it so nobody could ever drive to their house again?
fantastic questions!
Date: 2012-02-06 10:03 pm (UTC)
From the HuffPo coverage of the BMW 5 series:
"In principle, the system works on all freeways that we have mapped out beforehand with [a] centimeter accuracy," Nico Kaempchen, project manager of Highly Automated Driving at BMW Group Research and Technology says in the video.
http://www.huffingtonpost.com/2012/01/26/bmw-self-driving-car_n_1234362.html
There are obviously some ecosystem issues, and it's easy to spend a lot of time imagining ways to carve off parts of the problem as transition features ("smart" cruise control that senses distance to the car in front/behind is already out there, as is please-parallel-park-me). If one of the early markets is long-commute types (which I agree with), it's fair to assume that people who can persist in long commutes over time are people with spare money (or they couldn't afford the fuel price spike risks). It's also fair to assume those long commutes are largely on freeways. It might be possible to carve out a self-drive feature just for the time-consuming freeway portion of the commute (including handling well-structured junctions between freeways/highways) that costs a big premium and requires sticking to freeways that are mapped. That may, in fact, be what BMW is attempting to do. I don't know.
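A back-of-the-envelope sketch of what "requires sticking to freeways that are mapped" might look like in code. The segment IDs, the mapped set, and the warn-before-handoff rule are all my own invention, not BMW's actual system.

    # Hypothetical sketch: only allow self-drive on freeway segments that have
    # been pre-mapped to high accuracy. Names and structure are invented.

    MAPPED_FREEWAY_SEGMENTS = {"I-90:exit17-exit18", "I-90:exit18-exit19"}

    HANDOFF_WARNING_SEGMENTS = 1  # warn the driver this many segments early

    def self_drive_allowed(route_segments, position):
        """Return True if the car may stay in self-drive at this position."""
        return route_segments[position] in MAPPED_FREEWAY_SEGMENTS

    def should_warn_driver(route_segments, position):
        """Warn before the route leaves the mapped set, so handoff isn't abrupt."""
        upcoming = route_segments[position + 1 : position + 1 + HANDOFF_WARNING_SEGMENTS]
        return any(seg not in MAPPED_FREEWAY_SEGMENTS for seg in upcoming)

    route = ["I-90:exit17-exit18", "I-90:exit18-exit19", "local:main-st"]
    print(self_drive_allowed(route, 0))   # True: mapped freeway
    print(should_warn_driver(route, 1))   # True: next segment is unmapped surface road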
I love the restraining order examples, because they expose so much connectivity between technology and psychology and laws and so forth. In practice, however, if you worked at google and did something along those lines, you'd probably get caught and there'd be some kind of consequences. Preventing access to certain locations by limiting the "map" that self-drive features use seems like it'd be done at some point after the early adopters but before mass adoption (possibly after the 3rd or 4th prank); a sketch of what such a check might look like follows.
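Purely illustrative: one way a navigation layer could enforce a restricted-destination list, with an audit trail so a rogue employee's edits are at least traceable. Every name here is invented, and who actually maintains and reviews such a list is exactly the open policy question the code can't answer.

    # Hypothetical restricted-destination check. The list, the audit log, and
    # the API are all invented for illustration; no real product works this
    # way that I know of.

    import time

    RESTRICTED_DESTINATIONS = {
        "white-sands-test-range": "active military range",
    }

    audit_log = []  # every lookup is recorded, so list abuse is traceable

    def route_permitted(destination_id, requested_by):
        """Return (permitted, reason). All lookups are logged for later review."""
        reason = RESTRICTED_DESTINATIONS.get(destination_id)
        permitted = reason is None
        audit_log.append((time.time(), requested_by, destination_id, permitted))
        return permitted, reason or "ok"

    ok, why = route_permitted("white-sands-test-range", "driver-1234")
    print(ok, why)  # False, "active military range"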
People are really concerned about workers (flaggers, cops, emergency people, etc.) whose instructions self-drive features can't perceive or comply with the way a human driver is supposed to. Of course, flaggers get run down by human drivers all the time (which is a tragedy), and it's pretty easy to imagine a technological solution that overall might do better than human drivers. Road workers with a schedule could pre-schedule changes to the map, say, and emergency workers could carry beacons that cars would be mandated to pay attention to (a sketch of that idea follows). Would there be failures? Yes. The idea would be to produce fewer failures than human drivers, or at least less fatal ones.
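Again, invented for illustration: a car's control loop treating a mandated worker beacon as an override, alongside pre-scheduled map closures. There is no such beacon standard; the message format and the slow-and-yield response are assumptions.

    # Hypothetical worker-beacon handling. The beacon format, the closure
    # schedule, and the yield behavior are all assumptions.

    from datetime import datetime

    SCHEDULED_CLOSURES = {
        # segment id -> (start, end) of a pre-scheduled work zone
        "local:main-st": (datetime(2012, 2, 7, 9), datetime(2012, 2, 7, 17)),
    }

    def segment_closed(segment_id, now):
        """True if road workers pre-scheduled a closure covering this time."""
        window = SCHEDULED_CLOSURES.get(segment_id)
        return window is not None and window[0] <= now <= window[1]

    def control_step(beacons_in_range, segment_id, now):
        """One tick of the (imaginary) control loop: beacons override everything."""
        if beacons_in_range:
            # A mandated beacon (flagger, emergency worker) is nearby:
            # slow down and yield to human direction.
            return "slow to 5 mph, yield to worker direction"
        if segment_closed(segment_id, now):
            return "reroute around scheduled work zone"
        return "proceed normally"

    print(control_step(["flagger-beacon-77"], "local:main-st", datetime(2012, 2, 7, 10)))
    # -> slow to 5 mph, yield to worker direction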
But I especially love the driving-into-a-lake-where-there-was-once-a-bridge example, because it's _guaranteed_ to happen, simply because human drivers pull that kind of drive-off-the-broken-bridge-into-water-past-multiple-warning-signs stunt with notable regularity (it seems to happen at least once every few years in the US alone).
My _guess_ would be that the responsibility would depend entirely on the specifics of the case. Here are some possibilities:
The car picked the route, did not provide an override option (or ignored it when the driver attempted one), and ignored beacons it was required by law to attend to. Manufacturer pays up.
The driver did an override to pick the route, an override to ignore the beacon, and possibly more overrides on the spot. Driver goes to jail and/or heirs don't get much from the life insurance company because it is treated as a suicide.
Early-adopter driver let the car pick the route, fell asleep, and ignored low-key beeps trying to get a response to override options. Car kept going in the absence of instructions to stop. Beacons have not been adopted anywhere. Case grinds on forever; multiple laws passed with the name of the baby in the backseat to make sure this kind of crap never happens again. If the driver survives and someone else dies and the driver is annoying, driver gets crucified for being drunk/being a crappy parent/being annoying. If the driver dies and/or is likable/disabled/old, everyone else gets stuck with massive judgments. If the locality was slow to mandate/deploy beacons, etc., ad nauseam.
Fun! Well, for someone who likes this kind of stuff. Not fun in real life.
But self-driving cars just don't feel like jetpacks to me. I feel like these might actually happen.