California DMV backs allowing self-driving cars with no human on board

“Self-driving cars with no human behind the wheel — or, for that matter, any steering wheel at all — may soon appear on California’s public roads, under regulations state officials proposed Friday,” David R. Baker and Carolyn Said report for The San Francisco Chronicle. “The rules represent a delicate balance, trying to ensure the safety of a new technology many people don’t trust while avoiding tough restrictions that could send car companies fleeing to other states.”

“Until now, California has required all 27 companies testing autonomous cars in the state to have someone in the driver’s seat, ready to take over, when testing on public roads. And those vehicles needed to have steering wheels and brake pedals, even if some self-driving car engineers didn’t consider them necessary,” Baker and Said report. “Both of those requirements would disappear under the new regulations proposed by the California Department of Motor Vehicles.”

“Instead, automakers would need to certify to the state that their own testing — either on closed tracks or through computer modeling — shows the cars are ready to operate on public roads with no one behind the wheel,” Baker and Said report. “And if automakers want to deploy cars without such standard controls as a steering wheel and pedals, they would first need the approval of federal highway safety officials.”

Read more in the full article here.

MacDailyNews Take: Driving at or under the speed limit is going to suck.

We’ll await the “carOS” jailbreaks with bated breath!

SEE ALSO:
Alphabet Inc’s Google says it bears ‘some responsibility’ after self-driving car hit bus – February 29, 2016
U.S. tells Google computers can qualify as drivers – February 10, 2016
Why driverless cars will screech to a halt – February 9, 2016
Google’s self-driving cars hit the road this summer – May 15, 2015
Google acknowledges 11 accidents with its self-driving cars – May 11, 2015
Apple vs. Google in self-driving cars: To map or not to map? – March 6, 2015

25 Comments

    1. If cars are autonomous (no pun intended) and have no online connection that might let hacking in, I don’t see a problem. Once cars start “talking” to other cars, etc., that could be where the trouble starts. Hackers are the modern disruptive plague on technology, which is probably why a Star Trekian future will never happen. There’ll always be someone attempting to pervert existing technology and use it for evil purposes other than its intended one.

      1. I had assumed that an online connection was part and parcel of it, actually, be it direct or indirect. Even if there weren’t one, I’m not sure it would be totally invulnerable to electronic interference, especially as it’s an ever-changing landscape. Only time will tell whether, in operation, such interference can be prevented when those wishing to cause havoc are intent on it and the motivation is great enough.

        As a side note, North Korean assassinations could take a new twist, though I suppose it’s a little less dangerous than the Russians’ tendency to use radioactive material on public streets for the same purpose.

    2. This legislation is political and meaningless. It effectively says that a company can make and market a car that drives itself with nobody behind the wheel, but that the company takes on responsibility for its safety and operation.

      I don’t see any company doing that for a long time since the technology is a ways off.

      I always use the airline industry as an example. We can make unmanned aerial vehicles, but nobody is allowed to fly people in them commercially.

      This legislation may simply create a way for utility use (deliveries, moving, etc.) but not for consumers.

      And I don’t want a car to control me, but thanks anyway.

      1. As usual, dswe, your observations and conclusions are highly flawed. This represents the next logical step towards a fully autonomous transportation network. Being who you are, I would have expected you to applaud California’s choice to relax restrictions and let the free market take a bigger role. But no one ever accused you of being logical.

        1. KingMel:

          I want innovation. But I’m realistic. The road is much more dynamic (human pedestrians) compared to the sky, and we still don’t have unmanned aerial flights for the public. I don’t see software being able to truly automate driving with no need for humans for a very long time.

          I also do not want to cede control of my driving to government and big business, who want to see everyone stuck in ridesharing programs ($$$ subscription per month) instead of having the freedom to own and control your own vehicle.

          Self-driving vehicles are about government and big business control. The right to drive yourself will slowly be stripped away as government forces self-driving legislation on the industry.

          Another big issue is safety when it comes to hacking. Beyond the software needing much more innovation, hacking and controlling people’s vehicles is the other elephant in the room.

          This latter point cannot be overstated, and there is no real solution to it at present. Massive damage and death can result from hacking autonomous vehicle software that updates over the air.

          So a realist I am. Self-driving vehicles are not yet safe, nor are they something the market has demonstrated it even wants.

  1. Probably still a while off. It would be good to be able to get to work without having to focus on driving. My new car has next-gen cruise control, so I can drive on the freeway and have the car do most of the braking and acceleration.
    Whilst it is great to drive fast, most of our experience is following behind another car (especially on the crowded freeways of northern California), often in stop/start traffic where we move down the freeway like a giant caterpillar. Imagine traffic moving at a steady pace even at peak times, because self-driving cars will maintain a healthy distance and allow travel to flow without bottlenecks.

  2. Let’s assume that they are successful in developing an affordable autonomous car. I guarantee that after the lawyers and safety nazis have their say, it will be the slowest vehicle on the road. Just like traffic lights today, implemented with sensors that force everyone to stop regardless of traffic volume to “calm” speeds rather than maximize throughput.

  3. Who cares if it’s slow, if I can go out for a nice dinner, have an extra glass of wine, and go to sleep as I’m delivered home? We’d all do well to slow down in our society anyway.

    1. Years ago I gave myself permission to drive at, or below, the speed limit. I pull into the “travel lane” and happily follow, at a safe distance, whoever else ahead of me is “cruising” the same way. It was not difficult at all, yet was so much more relaxing. I was amazed at what a huge luxury I had just given myself. Plus, I am wasting less fuel (speeding is fuel-inefficient) and contributing to safer roads and society. Check it out for yourself. Give yourself a treat.

      1. As long as you don’t hold up traffic behind you, no one has a problem. But it is definitely a problem that some human drivers cause bottlenecks, many of them driving with an iPhone in their hand…

        1. Agree. It is dangerous to travel substantially below the prevailing rate of traffic in any setting. And it is dangerous to drive distracted. But these are separate issues from my choosing not to speed. On a two-lane highway, cruising in the right-hand “travel lane” has never been an issue. On a busy one-lane “highway” (which sounds like an oxymoron, doesn’t it?) I travel at the prevailing traffic speed, even if I would choose to go slower on my own. This seems to happen primarily during rush hours on roads around metropolitan centers. Everybody’s late…

          Over distances, the best way to increase average travel speed is to reduce periods of zero speed to the minimum necessary. On occasional road trips from the East Coast to my mountain town in the Rockies, I chuckle when I see some vehicles pass me at great speed (say, at least 80 mph) multiple times over the course of many hours. They are obviously speeding to get wherever they are going, but then they end up taking long or frequent breaks (for fuel, coffee, or to relax and stretch, or whatever), which negates any time gained by speeding. It seems silly. Most cars today are like Donald Fagen’s Kamakiriad: luxury, self-sustaining travel/living pods. I gave myself permission to drive slower and relax about it. It is more enjoyable, better for the environment (most people do not realize how much fuel they waste by driving substantially above 55 mph; aerodynamic drag increases with the square of speed) and safer. Anyway, chilling out behind the wheel has been a real luxury for me for years now.
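          To put a rough number on that square-law claim (this is just the standard aerodynamic drag relation, with 55 mph and 75 mph used purely as example speeds):

          $$F_d = \tfrac{1}{2}\,\rho\, C_d A\, v^2 \quad\Rightarrow\quad \frac{F_d(75\ \text{mph})}{F_d(55\ \text{mph})} = \left(\frac{75}{55}\right)^2 \approx 1.86$$

          So the drag force to overcome at 75 mph is roughly 1.86 times that at 55 mph. Rolling resistance and drivetrain losses do not grow that fast, so the real-world fuel penalty is smaller than 86%, but still substantial.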

  4. And what happens when the first person is killed by a robot car? A pedestrian, the driver of another car, a cyclist or a passenger: someone will be killed, and who is liable? If the robot car has been poorly maintained, who holds the liability?
    If the car is hacked and hijacked, how will police disable it, and under what statutes will the perpetrators be charged? What if a robot car is programmed by hacking to kill someone intentionally? Has the body of law been updated so as not to leave gaps and to protect the public?
    I am a believer in the Law of Unintended Consequences: well-intentioned actions often produce unforeseen consequences. From what I have seen, driverless technology has not proven itself capable of handling poor roads and less-than-ideal weather. Driving does not always happen on dry, well-maintained roads in the sunshine. How about shitty, poorly maintained roads, slick with rain in the fog or with patchy black ice in the twilight? Much of driving in adverse conditions is not easily quantified so that machine learning can consistently deal with it appropriately.
    What about driving situations that present a no-win choice: run over a pedestrian or hit another car at high speed? Who cleans up the legal mess based upon the programming of the automated driving system, and who is liable?

    For all the hype, I do not see a groundswell of demand among customers for this. The real deal will be driverless big rigs. No truck drivers, no rest breaks 24/7/365. 80,000 lbs whizzing by you at 70 MPH with no driver.
    Kinda like Terminator 2

    1. You make some excellent points, DavGreg. I, too, like to remind people of the risk of unintended consequences. But not as something to be avoided (that is impossible), simply as something to be recognized and considered.

      But your tone is a bit too strident and dismissive. The progress from driver-aided to fully automatic vehicles will take some time. The laws and regulations will undoubtedly lag the progress of the technology – that is always true – but we will eventually evolve the law to address the liabilities and responsibilities of automation in everyday life. It happened with the advent of the “horseless carriage” and commercial aviation, and it will happen with respect to increased levels of automation.

      Make no mistake, this is going to happen. In truth, we have already taken the first steps with adaptive cruise control, automated parallel parking, lane-change warning and control, emergency braking, etc. Full automation will take a while, and will likely at first be limited to ideal driving conditions or certain areas. But those restrictions will be gradually relaxed as the technology improves. So you might want to change your attitude a bit.

        1. Lol. No. DavGreg raises the biggest problem. When a car manufacturer like Mercedes decides that, given a choice, the driver will survive over a bystander, they are in MAJOR murky legal trouble. They are intentionally making a choice that will result in the death of an innocent bystander. At minimum that’s manslaughter, and since the company knows that programming will kill people, you could get a murder charge going because of the premeditation.

        If I were a software engineer working on this programming, I would be very nervous, because the court doesn’t recognise the excuse “I was told to write this software” as a defense. You are a major party to decisions and actions that result in premeditated death. Hello, jail time.

        If I have a car accident that results in accidental death, I can be charged because my actions caused that death; do you really think the people who wrote the software that makes that same choice are exempt? Remember, the software is in charge of the vehicle; it’s the responsible party. And who makes that software?

        First murder/manslaughter case and this whole industry is just going to go away. Or the legal industry decides autonomous vehicles are people and starts locking them up. Good luck with that.

          1. lol. No. While the issue on which you focused is significant and must be addressed, it will not stop the progress of transportation automation. One possible outcome is that the U.S. Federal Government will decide to provide some degree of indemnification for driverless vehicles based upon verification that the hardware and software implementation is validated to meet a defined set of requirements. This type of thing has precedent. For example, the Space Launch Liability Indemnification Act.

          You might want to ponder things a little bit and perform a couple of internet searches before you decide to post drivel based upon your innate wisdom.

  5. ASK ANY AIRPLANE PILOT about autopilot malfunctions and sensor miscompare issues. The IDIOTS that think you should get in a “vehicle” with no driver controls are simply gonna be riding in a coffin if they get the dreaded “AP fail” red light annunciator……..

    Never mind that aircraft fly with thousands of feet and miles of separation…..

    WhAT couLD gO wRonG……

  6. In America, the first thought related to autonomous driving that comes to mind is product liability (i.e. who will sue whom if there is an accident). And what follows that thought is the myriad scenarios that would extend that liability to anyone in the product development chain…

    Well, luckily (for the auto industry, as well as for everyone), this isn’t the first product liability concern ever. There is a history of precedents. Many, many machines have caused loss of life over the course of their regular use, and the American judicial system has figured out ways to establish responsibility (and liability) without completely obliterating industries. We still fly in airplanes, even though they (occasionally) crash; we drive cars, even though tens of thousands of people die in them every year (over 30,000 in the U.S. alone).

    And when the overzealous litigious system threatens to snuff out an industry, common sense prevails. In the mid-80s the small airplane industry took a nosedive, with annual sales declining by over 90% in a span of a few years. The main culprit was product liability cost, which went from around $80 per airplane in the 60s to over $100,000 (yes, a hundred thousand) per airplane in the late 80s. The death rate in small plane accidents was roughly the same as for motorcycles, but the perception in people’s minds was always that an airplane is a death trap, flying is unnatural, and the victim died because of that unsafe flying contraption. In the vast majority of product liability lawsuits the pilot was at fault (flying into bad weather, loss of situational awareness, fuel starvation due to poor planning, etc.), and yet relatives of the deceased sought (and often got) large awards from sympathetic juries against aircraft manufacturers. One of the typical examples was Cleveland v. Piper Aircraft: a pilot who hasn’t had a biennial review decides to fly, and since he hasn’t flown in the last 90 days he isn’t allowed to carry passengers; yet they remove the front seat, install a camera, and jury-rig some 2×4 wood for a cameraman to sit on (facing backwards), with the pilot flying from the back seat, his view obstructed by the camera and cameraman; he knows the airport is closed, but still goes to fly, and on the takeoff roll he doesn’t see the van parked across the runway (which is CLOSED!), hits the van and dies. His wife sues… Piper Aircraft!! (for not installing a shoulder harness) and wins $2.5M from a jury verdict…

    There is no doubt that there will be jury verdicts against big companies, in favour of victims’ families for all sorts of deaths, regardless of who is really at fault. After all, lawyers everywhere need new shoes for their kids…

    All that said, I don’t see it affecting the move towards driverless cars one bit. The more of the driving we leave to computers, the safer everyone will be. Tort law will find its way around this, and I am absolutely sure, 30 years from now, nobody will bother actually driving (other than muscle car enthusiasts, for the sake of it).
