
25 Comments

  1. Myself
    Myself March 19, 2018 2:44 pm

    Over 40,000 people were killed in car crashes last year, so I’m not sure one fatality is particularly damning, especially considering the woman jaywalked after dark.

    Also since there was a human “safety” driver, why didn’t they override and stop the car?

    Perhaps the woman crossed within the car’s stopping distance.

  2. Pat
    Pat March 19, 2018 3:14 pm

    The driver may not have had time to react and correct the car before the accident occurred. Accidents happen very quickly, and even alert drivers cannot always stay in control, or know precisely what to do when multiple factors (such as other cars on the road) are present.

  3. Claire
    Claire March 19, 2018 3:31 pm

    “Over 40,000 people were killed in car crashes last year, so I’m not sure one fatality is particularly damning,”

    I’m not sure it is, either. OTOH, I’m not sure it’s NOT damning.

    At this point, it’s totally on the self-driving car industry to prove these vehicles are safe — and I don’t think experimenting with human lives on real roads and streets is the wisest way to do it when these vehicles are still so inadequately tested. We already know that neither Silicon Valley wizards nor politicians give two hoots about ordinary people.

  4. david
    david March 19, 2018 3:48 pm

    “She also said the fatality should be considered in the context of all accidents.

    More than 37,000 people, including almost 6,000 pedestrians, died in traffic accidents in the US in 2016, according to the US Department of Transportation.

    “We need to be fair and look at all the data,” she said.”

    To be fair, that is a red herring. For a fair comparison, we’d have to divide the total annual fatalities by the total number of cars on the road and the days they were driven, and compare that rate to the same figure for driverless cars and the fatalities from them. I suspect we’d find the Uber self-drivers kill WAY more people as a percentage of days driven. And gosh knows how high the percentage would be if it were based on fatalities vs. mileage.
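    For illustration, here is a minimal sketch of the rate comparison david describes. The 37,000-fatality figure is the DOT number quoted above; both mileage totals are assumptions made up for the example, not real fleet statistics.

        # Rough sketch of the rate comparison described above.
        # Both mileage totals are assumptions for illustration, not real fleet statistics.

        def deaths_per_100m_miles(fatalities, total_miles):
            """Normalize a fatality count by total miles driven."""
            return fatalities / total_miles * 100_000_000

        # Human-driven fleet: the 2016 DOT fatality figure quoted above,
        # with an assumed ~3.2 trillion vehicle-miles traveled.
        human_rate = deaths_per_100m_miles(37_000, 3_200_000_000_000)

        # Autonomous test fleet: one fatality, with a hypothetical
        # cumulative test mileage of 3 million miles.
        av_rate = deaths_per_100m_miles(1, 3_000_000)

        print(f"Human drivers:        {human_rate:.2f} deaths per 100M miles")
        print(f"Autonomous test cars: {av_rate:.2f} deaths per 100M miles")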

  5. ellendra
    ellendra March 19, 2018 4:12 pm

    “Perhaps the woman crossed within the car’s stopping distance.”

    Or blended so well into the background that they didn’t see her.

    It happens all the time where I live. People in dark clothes, jaywalking at night, in areas where there are no street lights, without checking to see if there’s a car coming because they expect all drivers to stop for them. How the hell are you supposed to see a shadow in the dark like that???

    I’m amazed that the accident rate is as low as it is.

  6. Jim Brook
    Jim Brook March 19, 2018 5:56 pm

    Imagine being in a self-driving car, and a big farm vehicle is doing 25 mph on a 50 mph road. You would just be stuck behind it until the farmer turned off, perhaps 10 miles down the road. It would not pass. What if a tumbleweed blew across the road while you were on a curve and the road was icy? The sensors would likely pick it up and brake, sliding you right out of your lane. Suppose you need to swerve to avoid something, and your “smart” car won’t let you because you did not first signal your turn; the car would just keep you in your lane. There was a test of one of these automated cars in a California city, Los Angeles I believe. It was completely defeated by a double-parked delivery truck. It just stayed there until a human took over. I want to be in control.

  7. Claire
    Claire March 19, 2018 7:01 pm

    Meanwhile, in Arizona:

    Sgt. Ronald Elcock, a Tempe police spokesman, said during a news conference that a preliminary investigation showed that the vehicle was moving around 40 miles per hour when it struck Ms. Herzberg, who was walking with her bicycle on the street. He said it did not appear as though the car had slowed down before impact and that the Uber safety driver had shown no signs of impairment. The weather was clear and dry.

    https://www.nytimes.com/2018/03/19/technology/uber-driverless-fatality.html

  8. Mike
    Mike March 19, 2018 7:34 pm

    Two things…

    1st, this technology is still in its infancy; however, like other technologies, this one too will evolve. Technology these days is evolving exponentially: just look at a vehicle you can buy today that will warn you if you’re leaving your lane versus one made a decade or two ago.

    2nd, all the stories about this accident have one thing in common: they are very light on details. While the victim wasn’t using a crosswalk, the stories don’t say anything about the conditions where the victim tried to cross the street or what she was doing. For example, did she suddenly step out from between two parked cars and get hit? Was she walking along and simply turned and stepped out onto the street with no warning? Did she stop and look before stepping off the curb? Was she texting? Did the clothing she had on blend in with the background? What were the weather and lighting conditions? Was it safer for the passenger and driver for the car to hit the woman as opposed to swerving into something else? Even computers have limits to reaction times and what they can do while a passenger is in the car.

    So before I vilify this technology, I would like to see a little more detail about what happened.

  9. MamaLiberty
    MamaLiberty March 20, 2018 3:14 am

    I can’t imagine wanting a computer to drive my car… Why do people want this thing anyway? Sort of like a robot dog. What’s the point?

  10. Claire
    Claire March 20, 2018 6:42 am

    At this point, ML, I think few actual people want driverless cars. Some do of course. But most of the push toward driverless vehicles is motivated by trucking companies and ride-hailing companies who don’t want to have to pay humans to drive. There’s also an assumption by some that computer-driven vehicles would be safer than ones driven by us quirky, distractable humans.

    Of course it will also be an authoritarian dream when every move a vehicle makes can be remotely tracked and controlled.

  11. MamaLiberty
    MamaLiberty March 20, 2018 7:06 am

    There is that, Claire. But considering the serious love affair most people have with mobility and their cars/trucks, I can’t see how this technology would be accepted outside some dense population centers. A visitor from the east coast a few years ago was shocked, SHOCKED, I tell you, to discover that there are NO taxis, buses or other “public transportation” here. Well, except the “free ride” for drunks.

    Somehow, I can’t quite visualize all the cowboys and ranch ladies in their big pick-up trucks going for a computer driver. Nope… not happening. 🙂 And not even the old lady who is too chicken to drive on the snow most of the time.

    I suspect there are one or two other things on the horizon that will challenge freedom people sooner, but you just never know.

  12. ILTim
    ILTim March 20, 2018 7:14 am

    From the link ‘Myself’ shared,

    The self-driving Volvo SUV was outfitted with at least two video cameras, one facing forward toward the street, the other focused inside the car on the driver, Moir said in an interview.

    From viewing the videos, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,”

    As others have stated, vehicular accidents are common. On public roads, in our current situation, they are in fact unavoidable. Therefore, any testing, use, or implementation of ANY vehicle, piloted by a driver of any ethnicity, age, or gender, or by an autonomous system, will, with 100% certainty, kill people. This is not air travel. Using cars on roads has always been comparable to throwing humans into a blender.

    What is actually important is the statistical occurrence of deaths. We know a bit about this already; give a 19-year-old a powerful motorcycle… what is your expectation? Some groups of humans are more likely to kill with a steering wheel than others.

    Computer driving technology is advancing rapidly and has massive potential and a huge financial incentive behind it. It WILL rapidly prove itself statistically safer than any group of human drivers, professional rally drivers included. It WILL become a better driver than any of us. It behooves us to learn to live with that fact now.

    Do I want an autonomous car? Sometimes. Other times I want to be in control. I like driving, I like to autocross, I have raced ATVs. My reflexes and abilities are pretty good. My car-control training goes well beyond average. But the software in a Subaru WRX or a BMW sedan that controls the way those cars slide and rotate while cornering is phenomenal. It’s advanced so fast during the past ten years… the stability control systems take more input and exert more control over all four brakes than any human is capable of. I drive cars like this vigorously on and off the track, with these systems on and with them disabled.

    As it stands now, a person can drive faster and better with the control aids disabled – but will have much more spectacular crash events on occasion. The risk is reduced in real terms by the control systems. They are well developed. In order to drive competitively you must take risks beyond what the software allows; to me, this signals mature technology.
    Right now driverless cars are performing similarly to 1997-2007 era stability control systems. Those cars needed the system disabled to make any progress in snow, and alert, capable drivers disabled them most of the time. The current state of the tech has levels of intervention, which drivers can match to their level of commitment to driving, mile by mile. The systems are helpful in all cases except closed-track use.

    I do not believe it is possible to be dismissive of driverless tech at this point. The computer control systems have a higher capability than humans and will evolve to take that place; humans will be removed from the control loop eventually.

  13. Mark Call
    Mark Call March 20, 2018 9:18 am

    Yeah, accidents happen. But, the Truth remains:

    I prefer to have both a STEERING WHEEL — AND a pistol.

  14. Claire
    Claire March 20, 2018 9:18 am

    I agree that all the evidence that’s appeared so far points to the accident not being the vehicle’s fault.

    There’s still plenty of reason to dislike and distrust self-driving tech. Privacy and loss of individual control over our lives being among them. Even if these vehicles prove “safe,” there’s also no doubt that they will be programmed not for the safety of the individuals riding in them, but for some mythical greater good.

  15. Mark Call
    Mark Call March 20, 2018 9:20 am

    re: Gun AND a steering wheel…

    And Claire is right. The problem is that the PTB are fully intent on removing the choice of either.

  16. Scott
    Scott March 20, 2018 3:55 pm

    While this is a tragedy, there’s no perfect system. I would rather see a drunk guy shoved into the back seat of an autonomous car than have him drive home. Still, I would want the option to drive it myself; I wouldn’t want a totally autonomous car, or one that could be switched to autonomous from a remote point by someone else… This technology is still relatively new, and will get better in time. The autonomous car has a place, and I doubt they will go away. From what I can tell, it wasn’t the car’s fault. How fast can the human driver take over? Would a human driver have hit the woman as well? If she darted out in front of the car, leaving no time to react, it’s her fault, or it was a pure accident, depending on the exact circumstances.

  17. Pat
    Pat March 20, 2018 5:04 pm

    “The autonomous car has a place, and I doubt they will go away.”

    And where is that place? We don’t know that yet. Until we do, it has no place in real life. Maybe its role is to be JUST for testing – to expand the ability of the computers onboard.

    If testing is to be done, let it be done off the streets, without human involvement. Testing each car as it comes off the assembly line is not adequate – perhaps testing on a track against other autonomous cars is a better choice. Work out the kinks under ALL conceivable conditions. This is where computers can help; if they are so great, let them do their stuff, determining the conditions that will be met in human interaction. Obviously the full extent of that knowledge has not been reached yet.

    It’s not enough to know what the car can do – it must be able to do it immediately and correctly, for the benefit of the humans involved. Cars can be replaced, humans cannot.

  18. ellendra
    ellendra March 21, 2018 8:50 am

    What puzzles me is that self-driving technology isn’t used in trains more often. It seems like that would be an easier application. A computer that recognizes the recommended speed for a given stretch of track could certainly have prevented a few derailments.

    Or maybe it is used fairly often, and nobody says so?

  19. Comrade X
    Comrade X March 21, 2018 11:23 am

    Mandate it; for the children!

    Who needs any stinkin’ freedom if you can make life easier for some or most people?

  20. Claire
    Claire March 23, 2018 3:26 pm

    https://www.nytimes.com/2018/03/23/technology/uber-self-driving-cars-arizona.html

    SAN FRANCISCO — Uber’s robotic vehicle project was not living up to expectations months before a self-driving car operated by the company struck and killed a woman in Tempe, Ariz.

    The cars were having trouble driving through construction zones and next to tall vehicles, like big rigs. And Uber’s human drivers had to intervene far more frequently than the drivers of competing autonomous car projects.

    Waymo, formerly the self-driving car project of Google, said that in tests on roads in California last year, its cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per “intervention” in Arizona, according to 100 pages of company documents obtained by The New York Times and two people familiar with the company’s operations in the Phoenix area but not permitted to speak publicly about it.
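    To put the two intervention figures quoted above side by side, here is a quick back-of-the-envelope calculation. It is a rough sketch only; the two programs tested on different roads, so miles per intervention is not a strict apples-to-apples safety metric.

        # Quick arithmetic on the intervention figures quoted in the article above.
        waymo_miles_per_intervention = 5_600  # Waymo's reported California average
        uber_miles_per_intervention = 13      # the Arizona target Uber was struggling to meet

        ratio = waymo_miles_per_intervention / uber_miles_per_intervention
        print(f"Waymo went roughly {ratio:.0f}x farther between interventions "
              f"than Uber's own target.")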

  21. Pat
    Pat March 23, 2018 4:26 pm

    One issue not yet raised is driver training. The average driver drives with their hands on the wheel; this is the way they learned, and this is the way they drive (excluding texting, of course). The autonomous car driver will have to learn a new way of driving, and will have to tune in to emergency situations.

    Most people do not expect (or prepare for) emergencies. But driving in an autonomous car could feel like working in an emergency room – ONLY in an emergency will the driver retain the hands-on driving experience – and s/he must be MORE alert than ever for the exceptional situation that may occur. For the average driver, this can be quite stressful if they are conscientious drivers. So special training, both emotional and reactive, might be required.

  22. Claire
    Claire March 23, 2018 5:05 pm

    “But driving in an autonomous car could feel like working in an emergency room – ONLY in an emergency will the driver retain the hands-on driving experience – and s/he must be MORE alert than ever for the exceptional situation that may occur.”

    Exactly. And if Silver were here, he’d point out the studies (multiple, I gather) that show it’s simply not realistic to take control away from the human driver, then expect that person to be ever-vigilant and ready to take over in an emergency — especially given the speed involved in automotive emergencies. That’s just not how the human brain and body work.
