
25,000 Tesla automated cars to be examined


rooster59

Recommended Posts


By Catherine Hardy


US experts are planning to examine 25,000 cars across the country that are equipped with automated driving systems.

It comes after a driver of one of the vehicles was killed when it was in autopilot mode.

Investigation opened

Federal safety regulator the US National Highway Traffic Safety Administration (NHTSA) says it is opening a preliminary inquiry into 25,000 Tesla Motors 2015 Model S cars with automated systems.

The NHTSA “calls for an examination of the design and performance of any driving aids in use at the time of the crash.”

What happened?

The NHTSA says:

  • The crash happened on May 7 in Williston, Florida
  • The 2015 Model S was being driven in Autopilot mode
  • A tractor-trailer reportedly made a left turn in front of the Tesla at a junction
  • The driver of the Tesla vehicle died as a result of the impact

It is the first fatality known to involve a Model S operating on Autopilot.

It comes as Tesla and other car makers are gearing up to offer systems over the next few years allowing vehicles to pilot themselves.

Will there be a recall?

It is not clear yet.

The investigation is the first step.

The NHTSA could seek to order a recall if it finds the vehicles are unsafe.

What have Tesla said?

  • Autopilot launched last October
  • Described as in “beta” (development) mode by chairman Elon Musk

The luxury electric car maker says:

  • This is the first known fatality in more than 130 million miles of travel where Autopilot was activated.
  • “Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly-lit sky, so the brake was not applied.”
  • “The high ride-height of the trailer, combined with its positioning across the road and the extremely rare circumstances of the impact, caused the Model S to pass under the trailer, with the bottom of the trailer impacting on the windshield (windscreen) of the Model S.”
  • “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically-significant improvement in safety when compared to purely manual driving.”

The safety debate

There is debate within the motor industry and legal profession over the safety of systems that take partial control of steering and braking.

Tesla updated its Autopilot driving system in Model S sedans, putting new limits on its hands-free operation.

The function has been restricted on residential roads or roads without a central divider.

Tesla has been praised for innovation but also criticised for launching the Autopilot system too early.

-- (c) Copyright Euronews 2016-07-02

Link to comment
Share on other sites


Yikes.

There have been plenty of examples where computers have failed. There have been even more examples where humans have failed.

I must admit, I like to have the levers and pedals in my own hands when it comes to driving.

I wonder if the Tesla system has been tested on Thai roads yet; if there were ever a chance of a computer meltdown, it would probably happen here. 55555555

Link to comment
Share on other sites

Pretty much the same issue that causes a big chunk of airliner crashes nowadays. Better get used to it.

I don't know what crashes you're referring to because that does not appear to be the case.

The industry has recognized, "pretty much", that pilots are not getting enough "hand-flying" experience, so when the plane is handed over to them to fly manually, they mishandle it or were not trained for the situation.

I would like to know when/where in relation to the Tesla the truck made the left turn and how fast the Tesla was going.

Also, I wonder how much the automation lulls the driver into a state where they are not alert and predicting situations before they happen - something I'm sure the automation cannot do. "What-iffing" has saved me many times, and the lack of it in this case may have been a factor in this accident.

Edited by MaxYakov
Link to comment
Share on other sites

Pretty much the same issue that causes a big chunk of airliner crashes nowadays. Better get used to it.

Also, I wonder how much the automation lulls the driver into a state where they are not alert and predicting situations before they happen - something I'm sure the automation cannot do. "What-iffing" has saved me many times and the lack of it in this case may have been a factor in this accident.

That was my point.

Link to comment
Share on other sites

[Attachment: Tesla Florida.png]

It looks to me like the Tesla had the right of way, but none of the news articles I've seen say so.

The location of the accident is not given except that it was in North Florida, but the above diagram indicates that it was on highway 27. It could have been at this intersection:

On Google Maps: 28.657200, -81.846624, https://goo.gl/J1dPyP

[Attachment: Tesla Hw 27.png]

The speed limit is 55 mph on that stretch of the highway, and the Tesla was probably going at that speed. The trailer truck must have been moving quite slowly to navigate that 90-degree left turn, and its driver must have expected the Tesla to slow down and, if necessary, stop for him.
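As a rough sanity check on that timing, here is a back-of-the-envelope sketch. Only the 55 mph figure comes from the discussion above; the sight distance, trailer length, and turning speed are pure assumptions for illustration.

```python
# Back-of-the-envelope timing for the junction scenario.
# Only the 55 mph figure comes from the discussion; the sight distance,
# trailer length, and turning speed below are assumptions for illustration.

MPH_TO_MS = 0.44704

tesla_speed = 55 * MPH_TO_MS        # ~24.6 m/s at the posted limit
sight_distance = 150.0              # assumed distance when the truck began turning (m)
trailer_length = 16.0               # typical tractor-trailer length (m)
truck_turn_speed = 3.0              # assumed crawl speed through the 90-degree turn (m/s)

time_to_junction = sight_distance / tesla_speed      # time for the Tesla to arrive
time_to_clear = trailer_length / truck_turn_speed    # time for the trailer to clear the lane

print(f"Tesla reaches the junction in {time_to_junction:.1f} s")
print(f"Trailer needs {time_to_clear:.1f} s to clear the lane")
```

With these made-up numbers the margin is well under a second, which fits the idea that the truck driver either misjudged the Tesla's speed or expected it to yield.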

Link to comment
Share on other sites

I'm sure Tesla will try to re-enact the factors of this accident - which may well be an expensive test, at that!!

If I were driving with such technology, I'd treat it as no smarter than any predictive-cruise-control-equipped vehicle.

Link to comment
Share on other sites

[Attachment: Tesla Florida.png]

It looks to me like the Tesla had the right of way, but none of the news articles I've seen say so.

The location of the accident is not given except that it was in North Florida, but the above diagram indicates that it was on highway 27. It could have been at this intersection:

On Google Maps: 28.657200, -81.846624, https://goo.gl/J1dPyP

[Attachment: Tesla Hw 27.png]

The speed limit is 55 mph on that stretch of the highway, and the Tesla was probably going at that speed. The trailer truck must have been moving quite slowly to navigate that 90-degree left turn, and its driver must have expected the Tesla to slow down and, if necessary, stop for him.

Thanks for the detailed images and analysis.

If the driver had "expected the Tesla to slow down and if necessary stop for him" - no way, IMHO, would a sane semi driver (or any sane driver) in that situation purposely use his or her rig as a traffic blockade against 55+ mph oncoming traffic.

Assuming he was sane, it is more likely that he either didn't see the Tesla or severely misjudged its speed. It was not the first time that "Tessy" (the victim's nickname for his Tesla) had apparently gone unseen in a lane-change situation:

Tesla crash: Man who died in autopilot collision filmed previous near-miss, praised car's technology - ABC News Australia - July 2, 2016

Note: the Tesla was apparently black - not the best of colors for visibility.

There is a recent report that:

Tesla driver killed in autopilot crash ‘was watching Harry Potter’ - NY Post - July 2, 2016

And another thing: Tesla's partner in the Autopilot technology, Mobileye, said that it

"... isn't meant to avoid this type of accident." - Electrek article - July 1, 2016

My humble opinion is that the semi was the primary contributor to the accident and that, had the Tesla's driver been driving the car, undistracted, he might have had time to hit the brake or, at least, to duck to avoid being decapitated.

Edited by MaxYakov
Link to comment
Share on other sites

The only place I know of where it is permissible to make a turn like that with oncoming traffic is in Thailand.

It will be interesting to see how this accident is adjudicated. Sadly, it makes little difference to the guy in the Tesla.

Link to comment
Share on other sites

...so Autopilot does not brake your vehicle, but will only:

  • audibly and visually warn you, and
  • swerve around an obstacle if necessary

It recognises imminent rear-end crashes, and it recognises pedestrians (as in those cases the target grows as it nears).

Tesla claims Autopilot does not recognise laterally moving targets, such as a truck driving across your path.

(This is strange: while the truck has yet to complete its left turn it is a small target, but once it begins to complete the turn it becomes a large target, so the condition of an approaching target has been met.)

So why didn't Autopilot assume the truck was a big pedestrian??
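The "growing target" idea in the post above is essentially a looming check: flag a threat when a target's angular size expands quickly. A toy sketch (geometry and threshold invented purely for illustration) shows why a crossing target at roughly constant range can fail such a check even though a head-on one triggers it:

```python
import math

def angular_width(width_m, distance_m):
    """Apparent angular width (radians) of an object of a given physical width."""
    return 2 * math.atan(width_m / (2 * distance_m))

def is_looming(width_m, d0, d1, dt, threshold=0.01):
    """Flag 'approaching' when angular size grows faster than threshold (rad/s)."""
    growth = (angular_width(width_m, d1) - angular_width(width_m, d0)) / dt
    return growth > threshold

# Head-on car, 1.8 m wide, closing from 50 m to 45 m in 0.2 s:
# the angular size balloons and the expansion check fires.
print(is_looming(1.8, 50.0, 45.0, 0.2))   # True

# Trailer seen side-on while crossing at a constant 30 m range:
# range barely changes, so a pure expansion check sees no growth.
print(is_looming(16.0, 30.0, 30.0, 0.2))  # False
```

A side-on trailer holds its range nearly constant while it crosses, so a pure expansion test stays quiet until very late - one plausible reading of why laterally moving targets are the hard case.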

Link to comment
Share on other sites

A witness stated the Tesla passed her at a high rate of speed.

I also read that the truck trailer was white against a bright sky. I don't know how the Tesla sensors work, but in photography, bright white on bright white can blend. Was the white trailer detected as part of the sky?

Also, the car's radar might have been looking under the trailer at the gap?

Perhaps that combination of factors let the Tesla slip under the trailer truck.
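That white-on-white intuition can be made concrete with a crude Weber-contrast threshold. Real vision pipelines are far more sophisticated; the luminance values and the threshold here are invented purely to illustrate the blending effect:

```python
def weber_contrast(object_luminance, background_luminance):
    """Weber contrast: relative luminance difference of object vs. background."""
    return abs(object_luminance - background_luminance) / background_luminance

def camera_detects(object_lum, background_lum, min_contrast=0.10):
    """A toy detector that needs some minimum contrast to 'see' an object."""
    return weber_contrast(object_lum, background_lum) >= min_contrast

# Dark trailer against a bright sky: large luminance gap, easy detection.
print(camera_detects(object_lum=40, background_lum=230))   # True

# White trailer against a bright sky: luminances nearly match, detection fails.
print(camera_detects(object_lum=220, background_lum=230))  # False
```

Under this toy model, the white trailer simply never clears the contrast threshold - the "detected as part of the sky" scenario.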

Google has tested robotic cars and logged more than a million miles. I recall the data showed 13 accidents, 12 of which were human error.

With the outstanding safety statistics of robotic vehicles, governments will soon be forcing manufacturers into the new technology.

That will be good. The computers will be getting better and I predict close to zero accidents.

Look for thousands of new creative exotic designs whizzing through intersections at high speed, missing each other by centimeters. No traffic jams, always on time.

This will free up the automotive design world. Won't need all the mandatory safety systems.

You will be able to design your own car and 3D print it.

Link to comment
Share on other sites

so, the obvious solution is to design a robot to sit in the driver's seat, as history already shows:

Computers don't make mistakes - people make mistakes

An error in any computer program, report, etc. is the result of some human error - hence the popular term 'typo'

Link to comment
Share on other sites

...so Autopilot does not brake your vehicle, but will only:

  • audibly and visually warn you, and
  • swerve around an obstacle if necessary

It recognises imminent rear-end crashes, and it recognises pedestrians (as in those cases the target grows as it nears).

Tesla claims Autopilot does not recognise laterally moving targets, such as a truck driving across your path.

(This is strange: while the truck has yet to complete its left turn it is a small target, but once it begins to complete the turn it becomes a large target, so the condition of an approaching target has been met.)

So why didn't Autopilot assume the truck was a big pedestrian??

Americans are getting bigger, but not that big (yet).

Reading Mobileye's description, such obstacles are assumed to be overhead road signs and are ignored so as not to trigger erroneous braking:

Electrek article: Update from Tesla in response to a statement by Mobileye

“Tesla’s autopilot system was designed in-house and uses a fusion of dozens of internally- and externally-developed component technologies to determine the proper course of action in a given scenario. Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature. In the case of this accident, the high, white side of the box truck, combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire.”
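Read literally, that statement describes an AND-gate between the two sensing paths: braking fires only when the camera sees an obstacle and the radar return doesn't match a known benign signature such as an overhead sign. A minimal sketch of that voting logic (the function and the classification labels are assumptions for illustration, not Tesla's actual code):

```python
def should_brake(camera_sees_obstacle: bool, radar_classification: str) -> bool:
    """Toy AND-fusion: brake only when both sensing paths vote 'threat'.

    radar_classification is an invented label set; per the quoted statement,
    returns resembling overhead signs are deliberately ignored to avoid
    false brake-firing.
    """
    radar_votes_threat = radar_classification not in ("overhead_sign", "clear")
    return camera_sees_obstacle and radar_votes_threat

# The scenario described above: the camera misses the white trailer against
# a bright sky, and the radar return looks like an overhead sign.
print(should_brake(camera_sees_obstacle=False, radar_classification="overhead_sign"))  # False

# Under AND-fusion, one sensor voting 'threat' alone is still not enough:
print(should_brake(camera_sees_obstacle=True, radar_classification="overhead_sign"))   # False
print(should_brake(camera_sees_obstacle=True, radar_classification="vehicle"))         # True
```

The design trade-off is exactly the one the statement implies: requiring agreement suppresses false braking on overpasses and signs, at the cost of missing the rare case where both sensors are fooled at once.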
Apparently, from this statement, the forward-path obstacle-detection components employ both optical and radar sensors - that is, unless a radar has been developed that can be confused by the color of an object. So both voted "no-fire" with respect to braking.

If I were a Tesla owner, I'd be running with the lights on at all times, electricity consumption be damned.

Also, I don't see myself 3D-printing one of these, or anything like it, in even the most bizarre, utopian parallel universe. I wouldn't buy one even second-hand in this universe.

But hey! Someone has to debug the systems/firmware/software, eh?
Edited by MaxYakov
Link to comment
Share on other sites

[Attachment: 020.JPG] Up to You.

ok...

umm...

  • Pre-owned, Pre-crashed on Special Demo (demolished)
  • Rock-Bottomed prices
  • Tesla doesn't recognise stationary targets
  • Did it pass the smoke test?
  • Car so smart, it conducts its own board meeting
Link to comment
Share on other sites
