
Uber driver was watching video before fatal self-driving crash, police say


jpd80


Oh I agree with you, I just don't understand how the police can now backtrack on this after clearing the person,

and it may even be improper for the police to make such a statement without first formally charging someone.

 

As Fuzzy said above,

Now, they have a 300 page report and all of the details, including video footage inside the vehicle. They claim the operator had her eyes off the road nearly 1/3 of the trip. Based on the distance the car was from the pedestrian when she came into view, they claim if she had her eyes on the road, she should have been able to stop about 43 feet before hitting the pedestrian. It will be interesting to see if the driver or Uber ends up being charged. Uber settled quickly, but other family members have obtained legal counsel after this report came out.


It’s not up to the cops, it’s up to the DA whether to press charges. I agree that it may have been possible for the driver to stop if she had been paying attention. Key word is possible.

 

I think what the police are doing is reinforcing that these backup drivers need to be just as attentive as a normal driver in a normal car.

 

At 40 mph you’re traveling 58 feet per second. If 43 feet was the margin of error then that’s 3/4 of a second.
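That arithmetic checks out. A quick sanity check in Python (the 43 ft figure is the one quoted from the police report above):

```python
MPH_TO_FPS = 5280 / 3600       # 1 mph = 1.4667 ft/s

speed_fps = 40 * MPH_TO_FPS    # ≈ 58.7 ft/s
margin_s = 43 / speed_fps      # time to cover the reported 43 ft margin
print(f"{speed_fps:.1f} ft/s, 43 ft ≈ {margin_s:.2f} s of travel")
```

So the "margin of error" is roughly three quarters of a second of travel time.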


Remember, the pedestrian was hit in the right lane on the right side of the car,

it takes about 4 to 5 seconds to walk a bike across two lanes of traffic.
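If that 4-5 second estimate is in the right ballpark, a car holding 40 mph (the speed discussed above) covers a lot of ground while the pedestrian is in the roadway:

```python
MPH_TO_FPS = 5280 / 3600       # 1 mph = 1.4667 ft/s

speed_fps = 40 * MPH_TO_FPS    # ≈ 58.7 ft/s
for crossing_s in (4, 5):
    # distance the car travels during the crossing
    print(f"{crossing_s} s crossing: car covers {speed_fps * crossing_s:.0f} ft")
```

That's roughly 235-293 ft, the best part of a football field.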

 

With a good set of headlights, I'm pretty sure that most drivers could see a pedestrian with a pink bike

on the far left side of the road starting to cross and take action well in advance of running into them.

 

I also agree that the DA's office has to make the call on this and yes, the point of the police revealing this

was to underscore the need for observers to be vigilant and ready to step in when needed.

Edited by jpd80

And now this,

 

Uber had disabled an emergency braking system in a self-driving vehicle that struck and killed a woman in Arizona in March even though the car had identified the need to apply the brakes, the US agency investigating the incident has found.

The modified 2017 Volvo XC90’s radar systems observed the pedestrian six seconds before impact but “the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path”, a preliminary report released by the National Transportation Safety Board showed. The incident was the first fatal crash caused by a self-driving vehicle.

The self-driving system only recognised the woman was a human 1.3 seconds before impact, when it determined emergency braking was needed.


But Uber said, according to the NTSB, that automatic emergency braking manoeuvres in the Volvo XC90 were disabled while the car was under computer control in order to “reduce the potential for erratic vehicle behaviour.”

The report gives new fuel to opponents in Congress who have stalled a bill designed to speed the deployment of self-driving cars on US roads and puts a spotlight on the fact that the National Highway Traffic Safety Administration does not test self-driving vehicles or certify them before they are deployed on US roads.

Uber, which voluntarily suspended testing after the crash in the city of Tempe, said on Wednesday it planned to end testing in Arizona and focus on limited testing in Pittsburgh and two cities in California.

Elaine Herzberg, 49, was walking her bicycle outside the crosswalk on a four-lane road when she was struck by the Uber vehicle travelling at 63 km/h. A safety operator behind the wheel appeared to be looking down, and not at the road, moments before the crash, according to video from inside the car released by police. The operator told the NTSB she was not looking at a mobile phone but monitoring the vehicle’s self-driving systems.

Tempe police said on Wednesday it had completed its investigation and turned the findings over to prosecutors to review. Police did not release the results of the probe.

Reuters
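Putting the NTSB timeline above into distance terms (simple speed × time at the reported 63 km/h; the 6.0 s and 1.3 s figures are from the article):

```python
speed_ms = 63 * 1000 / 3600          # 63 km/h = 17.5 m/s

# How far out the car was at each event in the NTSB timeline
first_detection_m = speed_ms * 6.0   # sensors first observed the pedestrian
braking_call_m = speed_ms * 1.3      # system decided emergency braking was needed
print(f"first detection ≈ {first_detection_m:.0f} m out, "
      f"braking call ≈ {braking_call_m:.0f} m out")
```

Six seconds (over 100 m) is plenty of warning; by the time the system decided to brake it was only about 23 m out, which at that speed is marginal even if the braking hadn't been disabled.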

 

Edited by jpd80

"You’re allowed to look down or look away from the road for a second or two." Is that a written law somewhere? I must have missed that in drivers ed.

 

 

Well, even that is too much - I was driving about 12 years ago and rear-ended an Expedition because of that.

 

There was a vehicle mowing the shoulder of the road about 1/2 mile ahead, with two cars in front of the Expedition (I saw this about 20 seconds before the crash). The cars started applying their brakes because of the mowing (which I couldn't see past the Expedition). I glanced down to look at my speed, and by the time I looked up the Expedition had applied his brakes. I slammed on mine, cut the wheel, and caught him with the front passenger side of my car.

 

I was probably too close for the speed I was going as well, so that had something to do with it too.


Well even that is too much

 

I was probably too close for the speed I was going as well, so that had something to do with it too.

 

Tickets in that situation are always "following too closely" (at least in GA) for that very reason. People don't account for reaction times. Even glancing down to check your speed or your rear-view mirror can take a second or two.


And now this,

 

 

 

This is exactly why none of us who work with software for a living trust autonomous cars in life-and-death situations. You have these deliberate decisions to disable safeguards on top of simple bugs or corruption in the software.

 

This also underscores why it's insane to test these vehicles on public streets without first putting them through very extensive lab testing by third parties to make sure the systems are safe before they're allowed on the road.

It's amazing that we don't allow 16-year-olds to drive by themselves without passing a thorough test, but we don't think twice about turning it over to stupid, untested software.


Now, they have a 300 page report and all of the details, including video footage inside the vehicle. They claim the operator had her eyes off the road nearly 1/3 of the trip. Based on the distance the car was from the pedestrian when she came into view, they claim if she had her eyes on the road, she should have been able to stop about 43 feet before hitting the pedestrian. It will be interesting to see if the driver or Uber ends up being charged. Uber settled quickly, but other family members have obtained legal counsel after this report came out.

 

But even if the driver had seen the person when she came into view, do we know she was facing the road (i.e. beginning the process of crossing) at that point? Or was she walking parallel to the road when first in view and then unexpectedly turned to cross? In that case, even a "driver" who was paying attention may not have had enough time to stop anyway.

 

Disregard - hadn't seen the video in a while, I was thinking a different scenario where she was on the right/sidewalk walking left across the road.

 

I love watching old movies and seeing a car's driver turn and talk to the passenger for a minute at a time. Amazingly, without looking at the road, he constantly makes corrections with the steering wheel.

 

It's a pet peeve of mine when actors are in the car "driving" and you can clearly see the steering wheel is upside down or sideways while they're going straight.

Edited by rmc523

 

But even if the driver had seen the person when she came into view, do we know she was facing the road (i.e. beginning the process of crossing) at that point? Or was she walking parallel to the road when first in view and then unexpectedly turned to cross? In that case, even a "driver" who was paying attention may not have had enough time to stop anyway.

 

 

It's a pet peeve of mine when actors are in the car "driving" and you can clearly see the steering wheel is upside down or sideways while they're going straight.

They have camera footage from the car showing the pedestrian moving across the road. At the point she was struck, she was almost to the sidewalk.


They have camera footage from the car showing the pedestrian moving across the road. At the point she was struck, she was almost to the sidewalk.

 

Whoops, it's been a while since I've seen the footage - for some reason I was thinking she had crossed from right to left, as in she was stepping off the right sidewalk and walking across. Disregard my earlier comments about her changing direction suddenly.


 

Whoops, it's been a while since I've seen the footage - for some reason I was thinking she had crossed from right to left, as in she was stepping off the right sidewalk and walking across. Disregard my earlier comments about her changing direction suddenly.

At 40 mph with good headlights, a regular driver would have had ample time to avoid this crash.

The AV detection was desensitized to avoid false brake applications, and in doing so allowed this to happen.

The observer was watching the AV readout and probably had a Sully moment, trying to see what the AV software would do....
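For what it's worth, a back-of-the-envelope stopping-distance sketch supports the "ample time" claim - note the 1.5 s perception-reaction time and 0.7 g braking below are common textbook assumptions, not figures from the report:

```python
G = 9.81                        # gravity, m/s^2

speed_ms = 40 * 0.44704         # 40 mph ≈ 17.9 m/s
reaction_s = 1.5                # assumed perception-reaction time
decel = 0.7 * G                 # assumed dry-pavement braking, ~0.7 g

reaction_m = speed_ms * reaction_s       # distance covered before braking starts
braking_m = speed_ms**2 / (2 * decel)    # v^2 / 2a kinematics
total_ft = (reaction_m + braking_m) / 0.3048
print(f"total stopping distance ≈ {total_ft:.0f} ft")
```

That works out to roughly 165 ft, which is comfortably inside high-beam headlight range on an open road.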

Edited by jpd80

This is exactly why none of us who work with software for a living trust autonomous cars in life-and-death situations. You have these deliberate decisions to disable safeguards on top of simple bugs or corruption in the software.

 

This also underscores why it's insane to test these vehicles on public streets without first putting them through very extensive lab testing by third parties to make sure the systems are safe before they're allowed on the road.

It's amazing that we don't allow 16-year-olds to drive by themselves without passing a thorough test, but we don't think twice about turning it over to stupid, untested software.

 

What it boils down to is you have people who don't understand the tech and people on the other side pushing it so they can profit off it.

 

Non-tech people expect computers/tech to work like a light switch: it works or it doesn't. We've only recently gotten to the point where PCs are almost bulletproof for day-to-day operations, outside of a catastrophic failure (HDD crash, etc.); they need less massaging to "work" properly.


Bottom line is this tech was approved for beta testing on public roads on the say-so of tech companies pushing the AV bandwagon,

and anyone who opposes them is seen as a Luddite who unnecessarily resists progress.

 

I have a big issue with techies saying certain things are "foolproof" because of multiple redundant systems; that is no measure of

system competence, and it feeds the hype that more is better when actually it's not....

 

AVs shouldn't be permitted on public roads until they can pass a simple government-approved competency test,

and since none exists yet, we are way ahead of what safe and prudent judgement would allow.

