GPS Makes People Drive Off the Road
Jan. 25th, 2018 10:56 am
Some years ago, when GPS was comparatively new, we had a spate of stories about people ignoring brightly lit sign after sign -- "Road Closed," "Bridge Out," "Under Construction" -- to their car's demise and their own possible injury (I don't recall any deaths; those would not likely have been presented in the snidely contemptuous tone that most of these stories had).
There are some recent ones, tho. Let's have a look at GPS Made Me Drive Off the Road stories in their current incarnation and contemplate what this might mean for the future of autonomous vehicles.
https://www.usatoday.com/story/tech/nation-now/2018/01/23/waze-unable-explain-how-car-ended-up-lake-champlain/1060504001/
In this incident, the vehicle was borrowed, the area rural, and the weather foggy, and rather than a road under construction, it was a boat launch. It wasn't built-in car GPS, either; it was Waze, which lets users modify the map data. If this turns out to be a prank gone awry, things could become downright fascinating legally.
https://www.theridgefieldpress.com/101443/family-of-girl-in-train-accident-considers-legal-action/
In this incident, a young driver on a foggy night taking unfamiliar back roads to avoid the highway turned incorrectly and wound up on the train tracks instead of a nearly parallel road. The intersection has apparently had at least three related incidents recently, according to the above article.
I will note at this point that fog is, ahem, clearly a factor. While the local resident thinks signage should be improved, I'll just tell a brief story here to illustrate how that might not help. Early in 2002, I visited the cathedral in Chartres, France. Visitors to this cathedral -- really, anyone who has ever ridden the train through this station -- know that the cathedral is _right_ next to the train station. A quarter mile away, maybe. Because of this, there is little to no signage indicating how to get from the train station to the cathedral. I got off the train on a -- bet you know what I'm about to write next -- extremely foggy day, and looked around, baffled as to where to go. I asked someone (in French) and they looked at me like I was a crazy person and pointed into the fog (they had clearly been there before). So off I went, and it was _right fucking there_, but of course with the fog, you could barely see six feet in front of yourself. So, signs are not necessarily guaranteed to help.
http://www.philly.com/philly/news/pennsylvania/king-of-prussia-road-overpass-radnor-20171229.html
In this entry, drivers of large vehicles (trucks and other commercial or construction vehicles) crash into low-clearance overpasses or otherwise get stuck in neighborhoods where they should not be -- and which are signed accordingly -- but the drivers are not looking at the signs; they are following turn-by-turn directions on apps. No fog here. I would argue that this is a situation where the apps should include clearance information, and a person driving a vehicle that requires unusually high clearance (higher than a large SUV, say) should enter that height before routing. But there are other solutions as well.
http://newyork.cbslocal.com/2018/01/05/leonia-streets-off-navigational-apps/
Leonia, NJ has passed a law (ordinance, whatever) saying you can't cut through there -- you can only use those streets if you live there or have business there. The article spells out that this kind of rule has gone through the courts before and been upheld, and Waze has not opposed this kind of rule making in the past. The idea is to get the area out of the routing algorithms based on cost (sort of like the toll / no toll routing choice, but more so presumably).
So rather than including detailed clearance information, towns could designate roads or parts of roads with low clearances as "car only," and that would presumably wind up embedded in the routing apps (totally do-able -- there are entire highways here on the east coast that are closed to commercial vehicles). Altho again, users of the apps would have to specify whether they could use those routes or not -- and when they got it wrong, journalists and readers could once again happily mock them without guilt.
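For the curious, the clearance-aware routing idea is conceptually simple: an ordinary shortest-path search that prunes any edge posted below the vehicle's height. Here's a minimal sketch -- the road names, distances, and clearance values are all invented for illustration, and real apps would of course work from actual map data rather than a hand-built dictionary.

```python
import heapq

# Hypothetical road graph: each edge is (neighbor, miles, max_height_ft).
# max_height_ft = None means no posted clearance restriction.
ROADS = {
    "depot":     [("overpass", 1.0, 10.5), ("bypass", 3.0, None)],
    "overpass":  [("warehouse", 1.0, None)],
    "bypass":    [("warehouse", 1.5, None)],
    "warehouse": [],
}

def route(graph, start, goal, vehicle_height_ft):
    """Dijkstra's shortest path, skipping edges the vehicle can't clear."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            # Reconstruct the path back to the start.
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, miles, max_h in graph[node]:
            if max_h is not None and vehicle_height_ft > max_h:
                continue  # low clearance: prune this edge for tall vehicles
            nd = d + miles
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return None  # no legal route at all

# A car fits under the 10.5 ft overpass; a 13.5 ft box truck does not.
print(route(ROADS, "depot", "warehouse", 6.0))   # shorter route, via overpass
print(route(ROADS, "depot", "warehouse", 13.5))  # longer route, via bypass
```

The "car only" designation is just the degenerate version of the same filter: instead of a numeric clearance, each edge carries a vehicle-class flag and the search skips edges whose class excludes the vehicle.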
Single-vehicle accident, not related to any app error or roadway issue -- purely a distracted-driving problem:
https://www.centralmaine.com/2018/01/11/driver-charged-after-truck-flips-in-new-vineyard/
There might be something else to worry about with GPS:
https://www.popsci.com/space-weather-woman
This starts as another "My GPS Told Me To Drive Off the Road" story (onto a railroad track, in this case), but quickly heads in the direction of another explanation. The idea here is that solar flares, plus the usual atmospheric iffiness at dawn and dusk, might cause your GPS to give you bad information. So this isn't necessarily just an app limitation, or a prankster on Waze, or people being foolish young drivers or distracted or unable to see what they are doing because of fog. Rather, the GPS co-ordinates they are receiving might be wrong, and everything else is just more wrong built on top of that.
I periodically blog very skeptically about the promises of autonomous driving. In some of the above cases, you could imagine a smart cruise control system really helping out (especially with the single-vehicle truck flip), altho on the other hand, if the driver had just stopped and done a little navigating _before_ starting up the truck, that might have helped a lot, too -- as would a built-in navigation system that wasn't as distracting to interact with.
But autonomous systems -- whether full or partial -- rely upon sensor data to know what to do. I've been driving down my street on a clear (no precipitation or fog) day (not twilight on either side) and had the little red car icon that indicates I am following too closely to the vehicle in front of me pop up briefly. Under at least some scenarios, in some vehicles, this might result in the car hitting the brakes for me, to avoid striking that car. I mention this, because on these specific occasions to which I am referring, there were no cars in front, behind, or to either side of me on the road for as far as the eye could see (which is maybe a quarter mile before the road ends in intersections). Sensors can be those kinds of sensors; they can also be cameras. And cameras are subject to all the fog problems that bedeviled the poor drivers above. And the idea that, well, fine, you can't see, but you can drive off of instrument data, well, between the app problems and the space weather, that clearly has some issues, too.
None of this should be construed as a reason to avoid further improving safety systems and even autonomy in cars. But if we are already seeing this kind of problem crop up long before full autonomy, we'd probably better start working on these things before they get a lot worse.