Driverless cars not smart enough to handle dumb humans

Every day, starting around 4 p.m., this is my view out the window at the World Headquarters of NewsCut.

[Photo: traffic backed up at the intersection outside the NewsCut window]

Note that the light is green and the traffic isn’t going anywhere. Why? Let’s count the ways.

First, there’s that car sitting in the middle of the intersection, whose driver clearly tried to get through before the light changed so that he (I’m assuming it’s a he) didn’t have to stop 8 feet earlier. Much better to stop in the intersection, which keeps everyone all gridlocked up.

Swell job, sir.

The driver of the black car, turning left, has also blocked the intersection for the white car heading in the opposite direction from Mr. I Own This Intersection.

This has been the daily disaster downtown since river crossings were restricted months ago. The traffic, in this case on Robert Street, is stalled because Interstate 94 (Spaghetti Junction, for you old-timers) has been turned into an off-ramp for the Lafayette Bridge, the bridge project that never ends. And they’re still working on Interstate 35E, which gums up northbound traffic.

Robert Street in the other direction is jammed because (a) St. Paul’s traffic lights have all the timing of a bad comedian and (b) cars are allowed to turn left at Kellogg Boulevard (part of which is also closed), backing up traffic all the way to the above location.

The shortcut on Warner Road is backed up to Kellogg because the Childs Road bridge over the railroad is still out.

Even the cops at the raucous Republican National Convention in 2008 didn’t seal off the downtown as effectively as these drivers did.

This picture is why the productivity of people who work in downtown Saint Paul has risen dramatically; nobody wants to go out and mix with these people.

That’s the problem with traffic: humans doing stupid human things. And it’s a problem that has, apparently, befuddled the smart people trying to build a transportation system that cuts humans out of the equation, the New York Times reports today.

Google, which is trying to develop driverless cars, is finding that its cars are getting hammered by other cars, driven by humans, because the Google cars are programmed to follow traffic regulations and obey the law.

It is not just a Google issue. Researchers in the fledgling field of autonomous vehicles say that one of the biggest challenges facing automated cars is blending them into a world in which humans don’t behave by the book. “The real problem is that the car is too safe,” said Donald Norman, director of the Design Lab at the University of California, San Diego, who studies autonomous vehicles.
…
Google cars regularly take quick, evasive maneuvers or exercise caution in ways that are at once the most cautious approach, but also out of step with the other vehicles on the road.

“It’s always going to follow the rules, I mean, almost to a point where human drivers who get in the car and are like ‘Why is the car doing that?’” said Tom Supple, a Google safety driver, during a recent test drive on the streets near Google’s Silicon Valley headquarters.

Since 2009, Google cars have been in 16 crashes, mostly fender-benders, and in every single case, the company says, a human was at fault. That includes the rear-end crash on Aug. 20, which Google reported Tuesday. The Google car slowed for a pedestrian, then the Google employee behind the wheel manually applied the brakes. The car was hit from behind, sending the employee to the emergency room for mild whiplash.

One of the problems Google is having is that its cars leave a safe stopping distance between themselves and the car in front, a gap that human drivers insist on trying to squeeze into.
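To put a rough number on that gap, here’s a quick back-of-the-envelope sketch in Python using a common three-second following rule from driver’s ed. The rule, the assumed car length, and the speeds are my own illustrations, not anything from Google or the Times.

```python
# Back-of-the-envelope illustration (not Google's numbers): how big a
# "safe" following gap gets at city speeds, and why it looks like an
# invitation to other drivers.

CAR_LENGTH_FT = 15.0  # assumed average car length, in feet

def following_gap_ft(speed_mph: float, gap_seconds: float = 3.0) -> float:
    """Gap, in feet, left by staying `gap_seconds` behind the car ahead."""
    speed_ftps = speed_mph * 5280 / 3600  # convert mph to feet per second
    return speed_ftps * gap_seconds

for mph in (25, 30, 40):
    gap = following_gap_ft(mph)
    print(f"{mph} mph: {gap:.0f} ft gap, roughly {gap / CAR_LENGTH_FT:.0f} car lengths")
# 25 mph: 110 ft gap, roughly 7 car lengths
# 30 mph: 132 ft gap, roughly 9 car lengths
# 40 mph: 176 ft gap, roughly 12 car lengths
```

At downtown speeds, that is most of a city block of open pavement, which to a human driver reads less like a safety margin and more like an opening.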

More jarring for human passengers was a maneuver that the Google car took as it approached a red light in moderate traffic. The laser system mounted on top of the driverless car sensed that a vehicle coming the other direction was approaching the red light at higher-than-safe speeds. The Google car immediately jerked to the right in case it had to avoid a collision. In the end, the oncoming car was just doing what human drivers so often do: not approach a red light cautiously enough, though the driver did stop well in time.
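For a sense of what “higher-than-safe speeds” means at a red light, here’s another illustrative Python sketch, and only that: a crude check of whether an oncoming car can stop in time given its speed and distance to the stop line. The reaction time and braking rate are assumptions for the example, not figures from Google or the Times.

```python
# Illustrative only (not Google's actual logic): could an oncoming car,
# at this speed and distance, stop for the light with ordinary braking
# after a normal human reaction time?

def can_stop_in_time(speed_mph: float, distance_ft: float,
                     reaction_s: float = 1.5, decel_ftps2: float = 15.0) -> bool:
    """True if the car can stop before covering `distance_ft`."""
    speed_ftps = speed_mph * 5280 / 3600                # mph to feet per second
    reaction_dist = speed_ftps * reaction_s             # distance covered before braking starts
    braking_dist = speed_ftps ** 2 / (2 * decel_ftps2)  # v^2 / (2a)
    return reaction_dist + braking_dist <= distance_ft

# A car 150 feet from the stop line:
print(can_stop_in_time(30, 150))  # True: a comfortable stop at 30 mph
print(can_stop_in_time(45, 150))  # False: looks like a potential red-light runner
```

A human behind the wheel shrugs off the second case because experience says the other driver will probably brake late and hard. A machine programmed for caution flinches.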

John Lee, a professor of industrial and systems engineering and expert in driver safety and automation at the University of Wisconsin, says humans often resolve these problems by making eye contact. Driverless cars can’t make eye contact.

“Google got it wrong,” one commenter on the New York Times story observed. “They should give more control to the driver … not take it away from them.”

Yes, that’s working well.