
New Questions Raised about Tesla’s ‘Autopilot’ Safety After Three Fatalities This Week – Slashdot

November 30, 2019 - MorningStar


 



New Questions Raised about Tesla’s ‘Autopilot’ Safety After Three Fatalities This Week (startribune.com) 16

Posted by EditorDavid from the leave-the-driving-to-us dept.
The Associated Press looks at three new fatalities involving Teslas this week, saying the crashes have “increased scrutiny of the company’s Autopilot driving system just months before CEO Elon Musk has planned to put fully self-driving cars on the streets.” Last Sunday, a Tesla Model S sedan left a freeway in Gardena, California, at a high speed, ran a red light and struck a Honda Civic, killing two people inside, police said…. Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University, said it’s likely that the Tesla in Sunday’s California crash was operating on Autopilot, which has become confused in the past by lane lines. He speculated that the lane line was more visible for the exit ramp, so the car took the ramp because it looked like a freeway lane. He also suggested that the driver might not have been paying close attention. “No normal human being would not slow down in an exit lane,” he said…

On the same day, a Tesla Model 3 hit a parked firetruck on an Indiana freeway, killing a passenger in the Tesla… In both cases, authorities have yet to determine whether Tesla’s Autopilot system was being used… Many experts say they’re not aware of fatal crashes involving similar driver-assist systems from General Motors, Mercedes and other automakers. GM monitors drivers with cameras and will shut down the driving system if they don’t watch the road. “Tesla is nowhere close to that standard,” Rajkumar said. He predicted more deaths involving Teslas if the National Highway Traffic Safety Administration fails to take action…

And on Dec. 7, yet another Model 3 struck a police cruiser on a Connecticut highway, though no one was hurt… [T]he driver told police that the car was operating on Autopilot, a Tesla system designed to keep a car in its lane and a safe distance from other vehicles.


Comments Filter:

  • “No normal human being would not slow down in an exit lane”

    Not only do they SPEED UP in exit lanes here in California to try to beat the light, they’ll also go slow in the acceleration lane when getting on the freeway, cross over onto the shoulder and treat it as a right turn lane, and a lot of other asinine and dangerous shit.

    Totally ignorant of human selfishness, I see.

      • Retarded unrelated comment.

        Tesla’s Autopilot system isn’t perfect, and it never will be. But statistically it is safer than an unassisted human driver; the data shows this. Teslas with the V3 computer can recognize red lights, but they do not stop for them yet. They alert the driver: “hey, we are about to cross a red light.” Same thing with stop signs. Why doesn’t Tesla apply the brakes yet? Not sure. It’s not like they have no people working on the problem. I’ve argued before that there are going to be deaths.

      • “brainpower” and “treat it as theres”

        So much for your brain power.

        Here in Iraq? Is that where you are? That explains a lot.

    • More importantly, he simply doesn’t know whether autopilot was on. Instead of waiting to find out from the people who can and do check such things, he makes up an answer. The media writes about the made up answer, and consumers eagerly lap it up, spreading ignorance far and wide.

  • “The driver told police that the car was operating on Autopilot” – yeah, drivers have never tried to lie to police to get out of trouble before.

    Raj Rajkumar said “it’s likely”… Authorities “have yet to determine”

    I can certainly believe there could be some glaring issues with Tesla’s Autopilot. But good grief, how about we wait to see if these vehicles were even *in* that mode before we start debating this?

    • You seem to be conveniently ignoring the third case, where the driver stated to police that the car was in autopilot.

      • You seem to be conveniently ignoring the third case, where the driver stated to police that the car was in autopilot.

        [Area man] Yes officer, I was not in control of the car when it struck you, it’s all Elon’s fault. I am totally absolved of all blame!

        [Officer] How convenient.

    • wrong, Tesla shill-boy

      autopilot needs to be banned, it’s killing people with stupid choices

      • by nbvb ( 32836 ) writes:

        Actually, I think drivers need to be banned. They make MUCH stupider choices.

        Autopilot’s never stumbled out of a bar after drinking a fifth of vodka and gotten behind the wheel.

    • But… it’s not cars killing people, it’s people killing people in these reports? And it looks like these were conditions in which one would not, or could not, use Autopilot anyway.

      I get the point that, statistically, a Tesla on Autopilot may be safer than a human-driven car, but it still feels different. What if these Tesla accidents were caused by, say, a wheel falling off?

  • No not the Boeing MCAS, I think he’s blaming it on the “My Car Ain’t Safe” system.

  • “No normal human being would not slow down in an exit lane”

    This is simply not true. You see people exiting a freeway, not slowing down at all until they are close to a light.

    Or what if they were drunk and trying to run a light that was turning red at the end of the exit ramp? Then I could easily see them *accelerating* in an exit lane, misjudging speed and ramming the car into something. That kind of thing happens all the time. That’s one of the drawbacks of a car with really good performance, it’s very

