General Discussion
Uber's Robotaxis Still Struggle With Double Parked Cars, The 'Pittsburgh Left'
Now hiring: Driverless car drivers.
Ready or not, self-driving cars are coming. But will they REALLY be ready when the check writers (and their bought-and-paid-for politicians) say they are?
[hr]
Uber's Robotaxis Still Struggle With Double Parked Cars, The 'Pittsburgh Left'
http://www.digitaltrends.com/cars/uber-pittsburgh-robo-taxi-experiment/
By Jeff Zurschmeide October 18, 2016 5:00 PM
~ snip ~
Not quite driverless
Right now, Uber has a small fleet of self-driving cars plying the streets of Pittsburgh. The robo-cabs are Ford Fusion Hybrids, and they're fitted with a special roof-mounted array of cameras, GPS receivers, and a LIDAR (laser radar) system that collectively generate over a million data points every second. The autonomous system is designed to handle acceleration, braking, steering, and point-to-point navigation.
For all that, the Uber robo-cabs still have a human being in the driver's seat. In part, that's because Pennsylvania law requires a human being to be able to take control of the vehicle, but riders have reported that the self-driving system will drop the car into human control if it gets confused and doesn't know what to do. For example, riders reported that if the robo-taxi encounters a double-parked car in its lane or a stalled car, it tends to wait for the car ahead to move. When that happens, the human driver can take control and get around the unpredictable obstacle.
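The "drops the car into human control if it gets confused" behavior can be pictured as a confidence-gated handover. This is a minimal illustrative sketch only, not Uber's actual logic; the mode names, the threshold, and the `perception_confidence` input are all my own assumptions.

```python
# Hypothetical sketch of a confidence-gated handover -- NOT Uber's real logic.
# The autonomous stack keeps driving while its perception confidence is high;
# when it gets "confused" (e.g. a double-parked car it can't reason about),
# it hands control to the human safety driver.

AUTONOMOUS, HUMAN = "autonomous", "human"
CONFIDENCE_FLOOR = 0.7  # assumed threshold, purely illustrative


def next_mode(current_mode, perception_confidence, driver_ready=True):
    """Return who should be driving for the next control cycle."""
    if current_mode == AUTONOMOUS and perception_confidence < CONFIDENCE_FLOOR:
        # Drop to human control only if someone is there to take over.
        return HUMAN if driver_ready else AUTONOMOUS
    if current_mode == HUMAN:
        # Re-engaging autonomy is a deliberate human action, never automatic.
        return HUMAN
    return current_mode


# A confused system (confidence 0.4) drops the car into human control:
print(next_mode(AUTONOMOUS, 0.4))  # human
print(next_mode(AUTONOMOUS, 0.9))  # autonomous
```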
~ snip ~
"It was a little different than what I thought initially," Butler tells Digital Trends. "I thought it would be like The Magic School Bus, or that I'd go in and it would be like KITT from Knight Rider. But it's not The Magic School Bus. You have a human driver and he's driving at least 30 percent of the time."
~ snip ~
Right now, the Uber robo-cabs (and their human babysitters) are limited to certain neighborhoods in Pittsburgh. By restricting the operating area to a known map, Uber is able to make fast iterations on software without accounting for too many outlier cases while they're perfecting the basics.
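Restricting service to a known map boils down to a geofence check before every trip. Here's a toy sketch of the idea; the coordinates are a made-up rectangle loosely around downtown Pittsburgh, and a real system would use proper GIS tooling rather than this hand-rolled test.

```python
# Toy geofence: is a pickup point inside the approved operating area?
# The polygon coordinates are invented for illustration.

def point_in_polygon(lat, lon, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (lat, lon)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Does this edge cross the point's line of longitude?
        if (lon1 > lon) != (lon2 > lon):
            crossing = lat1 + (lon - lon1) * (lat2 - lat1) / (lon2 - lon1)
            if lat < crossing:
                inside = not inside
    return inside


# Made-up service rectangle, roughly downtown Pittsburgh:
SERVICE_AREA = [(40.43, -80.01), (40.45, -80.01), (40.45, -79.98), (40.43, -79.98)]

print(point_in_polygon(40.44, -79.99, SERVICE_AREA))  # True  (inside)
print(point_in_polygon(40.50, -79.99, SERVICE_AREA))  # False (outside)
```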
~ snip ~
FrodosPet
(5,169 posts)
This winter is going to be interesting.
The Polack MSgt
(13,189 posts)
Flatlanders or rude big-city easterners.
Snark...
Seriously though, as a native of the area, I found it strange that Uber chose Pittsburgh as the test city. The issues to overcome (geography, street size, tunnels, bridges, and random one-way street placement) make it a unique challenge.
longship
(40,416 posts)
It's not easterners humping self-driving cars; it's the CA urban folk, where it hardly ever even rains.
FrodosPet
(5,169 posts)
The harder the test city, the closer to ready the cars will be when the companies say they are. Google is softballing its development compared to Uber.
Computers are great at some parts of driving, and suck at others. For most driving tasks, computers are better than we can ever hope to be. They can have omnidirectional, unblinking awareness: they can maintain safe distances with pitch-perfect lane placement, will (almost) never dangerously cut in front of an overtaking vehicle, and can pick the best lane for navigation. But outside of some formulas, algorithms, and sensor inputs, they have NO clue, no understanding of what it means to be "in motion".
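The "maintain safe distances" part really is the kind of arithmetic a computer never gets tired of. A sketch of the common time-gap rule; the two-second figure is the familiar driver's-ed heuristic, not anything from the article.

```python
# Time-gap following rule: keep (gap_seconds * speed) of space ahead.
# The two-second default is the standard driver's-ed heuristic,
# used here purely for illustration.

MPH_TO_M_PER_S = 0.44704  # exact mph -> m/s conversion


def safe_following_distance_m(speed_mph, gap_seconds=2.0):
    """Distance (meters) to hold behind the car ahead at a given speed."""
    return gap_seconds * speed_mph * MPH_TO_M_PER_S


# At 25 mph city speed, hold back about 22 m; at 60 mph, about 54 m.
print(round(safe_following_distance_m(25), 1))  # 22.4
print(round(safe_following_distance_m(60), 1))  # 53.6
```

A computer recomputes this every control cycle without drifting, daydreaming, or tailgating out of impatience, which is exactly the strength being described above.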
Meanwhile, we can look at the world around us and usually know what things are and how they affect what we are doing. From the very earliest stages of biological life, our species has evolved to be good at moving: not only through vision, which is easily tricked or in some cases unavailable, but also through our sense of equilibrium. Combined with reasons to relocate and raw survival instinct, even the most developmentally disadvantaged of us understand what it means to be "in motion".
It is the edge cases that make designing self-driving cars such a challenge. Take a simple example: time of day. Changing light means changing shadows and changing colors, which makes quick, simple object recognition MUCH more difficult. Heading east at sunrise, or west at sunset? The cameras may be even more blind than you are. Rain, snow, and plain old wear take away lane lines. Is that a shadow or a pothole? Hurry up; you only have 20 milliseconds to figure it out and decide what to do. Do you brake hard, steer a little, steer a lot? Can you straddle it? Are you in traffic where a hard maneuver would create a disaster?
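To put that 20-millisecond window in perspective, here's how far the car travels while the software is still deciding. The 20 ms figure comes from the post above; the speed choices are my own, for illustration.

```python
# Distance covered during a 20 ms decision window at various speeds.
# 20 ms is the figure from the post; the speeds are illustrative picks.

MPH_TO_M_PER_S = 0.44704  # exact mph -> m/s conversion


def distance_during_decision_m(speed_mph, decision_s=0.020):
    """Meters traveled while the classifier is still making up its mind."""
    return speed_mph * MPH_TO_M_PER_S * decision_s


for mph in (25, 45, 70):
    print(f"{mph} mph: {distance_during_decision_m(mph):.2f} m")
# 25 mph: 0.22 m
# 45 mph: 0.40 m
# 70 mph: 0.63 m
```

Even a "fast" 20 ms decision eats well over half a meter of pavement at highway speed, and that is before any braking or steering actually begins.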
Can the SDC see cars braking or maneuvering a half mile or more ahead, and react long before they even become an issue? What exactly is a particular pattern of speeds, brake lights, and lane changes telling us? Even poor drivers are better than computers at that.
I believe the BEST application of this emerging technology would be to help alert drivers be more alert, rather than trying to replace them.