California to allow self-driving cars on road without human backup by April
Source: UPI
By Ray Downs | Feb. 26, 2018 at 11:01 PM
Feb. 26 (UPI) -- California will allow testing of self-driving vehicles on the road without a human backup driver by April 2, state officials announced Monday.
"This is a major step forward for autonomous technology in California," California Department of Motor Vehicles Director Jean Shiomoto said , according to the Mercury News. "Safety is our top concern and we are ready to begin working with manufacturers that are prepared to test fully driverless vehicles in California."
Self-driving vehicles have been allowed on California roads since 2014 as long as a human driver was able to take the wheel. But the emerging technology is still technically in the testing phase, and while Monday's announcement does away with the human requirement inside the vehicle, California will still require that a remote human operator monitor the vehicle, Recode reported.
John M. Simpson, a director for Consumer Watchdog, told The New York Times that highway safety could be threatened when remote operators try "to control the robot car from afar" like in a video game "except lives will be at stake."
Read more: https://www.upi.com/Top_News/US/2018/02/26/California-to-allow-self-driving-cars-on-road-without-human-backup-by-April/9711519702082/
still_one
(92,450 posts)

BigmanPigman
(51,638 posts)
their devices while driving. It is scary driving three blocks! Everyone does it despite it being against the law. No one can signal to turn since their hands are busy. I try to drive between 9 AM and 3 PM since I suck at driving even when both hands are on the wheel and I am not distracted. This is why I like being a pedestrian and living in cities.
Hermit-The-Prog
(33,480 posts)
Next step, the car maker, the government, a script kiddie, or the guy who's bored with swatting decides your car should suddenly turn, accelerate, stop, or about-face.
diva77
(7,664 posts)
there if they do.
infullview
(982 posts)
Self-driving cars are a bad idea.
1) Sensors can malfunction, causing the vehicle to misinterpret conditions and make unsafe movements.
2) People who write the software for these vehicles are NOT IEEE certified and have no criminal liability for bad software.
3) No matter how good your AI is, it absolutely cannot come up with a creative solution in a crisis.
4) Software can be hacked.
5) Self driving cars don't have nearly as many "sensors" as a human being. Drivers hear things like unusual noises that tip them off that they might have a tire problem, or that excessive splashing may indicate hydroplaning, etc.. Drivers can also smell which might alert them to a component malfunction
I can think of a lot more reasons.
Red Mountain
(1,739 posts)
not to let people drive...
1) substance abuse
2) texting
3) sleep deprivation
4) aggressive driving
5) speeding
6) failure to adjust driving to road conditions
7) inexperience
8) old age
We need data to assess self-driving cars. They might not be ready for prime time yet, but I'd bet they'll be safer than human drivers before long (if not now).
People aren't going to improve.
infullview
(982 posts)
that these cars are nowhere near ready to be tested without a human being to intercede if something goes wrong. Humans are bound by laws; computers aren't. None of these cars should be on the road until laws are made that include accountability. If your child is killed by a computer, who is to blame? There should be laws on the books NOW that spell out who goes to jail for manslaughter by machine.
Owl
(3,644 posts)eppur_se_muova
(36,304 posts)What could possibly go wrong ?