
Featured Tesla recalled over "full self-driving"

Discussion in 'Prius, Hybrid, EV and Alt-Fuel News' started by Gokhan, Feb 16, 2023.

  1. austingreen

    austingreen Senior Member

    Joined:
    Nov 3, 2009
    13,602
    4,136
    0
    Location:
    Austin, TX, USA
    Vehicle:
    2018 Tesla Model 3
    Model:
    N/A
    I bought mine in 2018 (EAP) and 2019 (full self-driving). At the time it was very clear that the features of both were in beta, and that the data to improve them would come from seeing what drivers did differently than the software. I spent $2,000 for the full self-driving addition not because I thought they would get it working completely, but because I expected them to slowly get it working well and to upgrade my computer when they found it didn't have enough power. They have given me one computer upgrade, and I expect at least one more at no cost to me. Autopilot (without enhancements) is now free with the car, and it is truly like an aircraft autopilot. If you are a pilot and have used autopilot, you know that the pilot in command is responsible for overseeing it; it is an aid. It can only land the plane itself at some airports, and no commercial pilot would let it unless they were impaired.

    The full self-driving beta is far from working well enough to be called that, but autopilot was pretty bad when I bought the car too. The change that made autopilot worse (and that also affects the full self-driving beta) was the switch from radar plus cameras to cameras alone in its traffic- and speed-limit-aware cruise control. There appear to have been some accidents, so they made the system more sensitive to objects in other lanes, having found that the radar would make mistakes the vision system would not. This made the phantom braking problem much worse, but it is easy to override by pressing the accelerator.


    The FSD beta is one of the most field-tested systems out there, with over 100,000 of us automatically sending information back to Tesla from our cars. There is a visualization screen to show what the system sees. If you have ever worked with AI in real-world situations, you know that it takes a long time to get a system to handle all situations.

    If you don't want to experience it, do not download the beta. For those of us who do want to, I am offended that some loud-mouthed billionaire in California is manipulating the regulatory agencies to cause me harm: the inability to use software I paid for. If you don't want to be part of the program, it's easy: just don't pay for it. Yes, I have developed AI systems and mission-critical software. The self-driving beta is in the first category, not the second. If it were developed like mission-critical software, I would not expect a test for another decade. Maybe it will get there some day, but it will take years, not months. People claim Waymo is ahead, but that system is geofenced and just stops in new situations.

    A) You would never have a tesla
    B) You would not buy the beta software
    C) You seem to want to pretend you are an expert here, but judging from the nonsense in this thread, I doubt you are.
     
    #21 austingreen, Feb 17, 2023
    Last edited: Feb 17, 2023
    hill, Trollbait and Zythryn like this.
  2. Trollbait

    Trollbait It's a D&D thing

    Joined:
    Feb 7, 2006
    22,455
    11,767
    0
    Location:
    eastern Pennsylvania
    Vehicle:
    Other Non-Hybrid
    The general public is not made up of pilots, though. Some people think aircraft and ship autopilots are more capable than they actually are, or it doesn't occur to them that keeping a craft on the same heading is a lot easier in a vast sky or open ocean. Autopilot is a technically correct name, but most people don't technically know what autopilot is. Full Self Driving implies Level 5 autonomy. I know that's the goal, but it is a ways off.

    I'm not being critical of the systems' capabilities, but I do think Tesla could have avoided some grief by using different names.
     
  3. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    1,365
    732
    0
    Location:
    Near Silicon Valley
    Vehicle:
    2024 Prius Prime
    Model:
    XSE Premium
    There is a flaw in your logic, Austingreen. You say, "If you don't want to experience it, do not download the beta." If there are 100,000 Teslas running the FSD application, there's a very good chance that at least some of them are sharing the road with me. There is no way for me to opt out of being a data point on a neighbor's crash report. Based on casual observation, more than 20% of the cars on the road here are EVs.

    Now that you mention it, maybe a local ordinance can ban the use of FSD within the city limits.

    I'm not an expert in AI training, so I will have to take your word that it's good to be training AI systems using inexperienced testers on public roads. But I don't think that Walter Huang would have agreed with you that the FSD application was not mission critical. He was the one whose car drove right into a clearly visible lane barrier on Highway 101, and he died from the fire.

    As for my expertise? 40+ years working in IT at large companies where application failures were not acceptable. I've worked in all sectors of the SDLC (software development life cycle) at one time or another.

    But you are right about the likelihood that I would not buy a Tesla.
     
  4. hill

    hill High Fiber Member

    Joined:
    Jun 23, 2005
    20,191
    8,360
    54
    Location:
    Montana & Nashville, TN
    Vehicle:
    2018 Chevy Volt
    Model:
    Premium
    This is the same kind of FUD the fear mongers began spreading when cruise control could first keep the correct distance between your car and the one in front of you. Every once in a while, though, it wouldn't. BAM. The victim mentality now wants to sue the manufacturer because THEY weren't paying attention. Sound familiar? I wonder how many of those gloom & doomers are still afraid to even use that feature..... after all, it too occasionally requires driver intervention.
    .
     
    #24 hill, Feb 17, 2023
    Last edited: Feb 18, 2023
    austingreen, Zythryn and Trollbait like this.
  5. Leadfoot J. McCoalroller

    Leadfoot J. McCoalroller Senior Member

    Joined:
    May 12, 2018
    7,438
    6,920
    1
    Location:
    Pennsylvania
    Vehicle:
    2018 Prius c
    Model:
    Two
    I put that down to personality. Tesla has one salesman in particular who seems to habitually claim things to be one thing when they are most definitely not that thing.
     
    hill, Todd Bonzalez, fuzzy1 and 2 others like this.
  6. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    1,365
    732
    0
    Location:
    Near Silicon Valley
    Vehicle:
    2024 Prius Prime
    Model:
    XSE Premium

    You can mislabel it as FUD if you like, but that does not change the reality that Tesla has deliberately unleashed artificial "student drivers" amongst us that are undetectable until they run a stop sign, drive in the wrong lane, or cut off traffic and brake hard in a tunnel, causing injuries to innocent drivers nearby.

    So why is this being allowed? At the very least the cars should have a "student driver" placard in each window like we have when training new humans to drive in traffic.

    So what is the counterpart to FUD... misinformation fits, or maybe it's just abetting a crime. Here in California they prosecute people when they do something that entices someone else into injuring themselves. It's called creating an attractive nuisance. That's what Musk and all of his supporters are doing in this instance.
     
  7. bisco

    bisco cookie crumbler

    Joined:
    May 11, 2005
    110,184
    50,069
    0
    Location:
    boston
    Vehicle:
    2012 Prius Plug-in
    Model:
    Plug-in Base
    Because that strategy works
     
  8. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    1,365
    732
    0
    Location:
    Near Silicon Valley
    Vehicle:
    2024 Prius Prime
    Model:
    XSE Premium
    How so?
     
  9. bisco

    bisco cookie crumbler

    Joined:
    May 11, 2005
    110,184
    50,069
    0
    Location:
    boston
    Vehicle:
    2012 Prius Plug-in
    Model:
    Plug-in Base
    look at sales, profits and stock price
     
  10. Trollbait

    Trollbait It's a D&D thing

    Joined:
    Feb 7, 2006
    22,455
    11,767
    0
    Location:
    eastern Pennsylvania
    Vehicle:
    Other Non-Hybrid
    Having the names line up with the salesman's claims doesn't help, though he probably named them.
     
  11. mikefocke

    mikefocke Prius v Three 2012, Avalon 2011

    Joined:
    Nov 3, 2012
    3,761
    1,682
    0
    Location:
    Sanford, NC
    Vehicle:
    Other Hybrid
    Model:
    Limited
    During my career I had sole responsibility for making the call on when a product was ready for release for three different operating systems. I can tell you the test cycles were hundreds of times as time-consuming as any fix that was ever developed. And yes, for the first two there were monthly known-error lists, from the '60s through the '90s, and even individual patches you could apply for specific errors. Source code was available to any customer. For the third OS, it was months and even years after the source code was frozen before a release could occur.
     
    dbstoo and austingreen like this.
  12. austingreen

    austingreen Senior Member

    Joined:
    Nov 3, 2009
    13,602
    4,136
    0
    Location:
    Austin, TX, USA
    Vehicle:
    2018 Tesla Model 3
    Model:
    N/A
    States and countries are in charge of licensing drivers. Calling those of us using autopilot "student drivers" and unsafe may reflect your misunderstanding of those concepts. Why apply the insulting label "student drivers" to those of us who properly drive a Tesla using autopilot? If that were the case, wouldn't Teslas show up as the most unsafe cars?

    https://www.bostonglobe.com/2022/05/25/business/tesla-owners-less-likely-crash-their-ev-than-their-other-cars/#:~:text=The%20crash%20rate%20per%20million,owned%2C%20according%20to%20the%20data.

    We do have Teslas being driven in my city with software not yet released as beta; they do test before they release beta versions. Every driver that owns a Tesla should know that they are responsible for its correct use. Somehow we have a bunch of non-Tesla drivers echoing anti-Tesla politicians, PACs, and stock market investors, having never bothered to ask, "Is it safer?"

    Certain things are mission critical, like braking. My first-year gen 3 Prius had a bug in its braking software; Toyota fixed it. Bugs happen. My Tesla Model 3 has had no bugs in mission-critical areas. Autopilot definitely has some situations where it does not operate correctly, but these are easy for a driver who is paying attention to correct. The NHTSA video from California of a Tesla phantom-braking and causing a multi-car pileup shows something that is rare but happens. Watching the video, I have no idea why the driver just sat there instead of pressing the accelerator.

    All cars have accidents. If you use statistics properly, Teslas using autopilot are less likely to cause accidents than the same drivers driving another car without it. If you are worried about being part of the beta test, you should probably be much more scared of non-Teslas hitting you. Three non-Teslas hit my Prius in the nine years I owned it. How many Teslas have hit your car? (Most of this is to realize how to use statistics. There were very few Teslas when I traded in my Prius. You need to normalize to accidents, or severe accidents, per mile, not just read some article and extrapolate wildly that the cars are unsafe.)
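    The per-mile normalization described above can be sketched in a few lines of Python. All numbers here are hypothetical, for illustration only; the point is just that raw crash counts mean nothing until you divide by exposure (miles driven), because fleets differ enormously in size and usage.

```python
def crashes_per_million_miles(crashes: int, miles_driven: float) -> float:
    """Return the crash rate normalized to one million vehicle miles."""
    return crashes / (miles_driven / 1_000_000)

# Hypothetical fleets: A is small with few crashes, B is huge with many.
fleet_a = crashes_per_million_miles(crashes=3, miles_driven=1_500_000)
fleet_b = crashes_per_million_miles(crashes=400, miles_driven=500_000_000)

print(f"Fleet A: {fleet_a:.2f} crashes per million miles")  # 2.00
print(f"Fleet B: {fleet_b:.2f} crashes per million miles")  # 0.80
```

    Note that the fleet with 100x more total crashes has the lower rate once exposure is accounted for, which is the kind of correction the post is arguing for.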
     
    Zythryn likes this.
  13. austingreen

    austingreen Senior Member

    Joined:
    Nov 3, 2009
    13,602
    4,136
    0
    Location:
    Austin, TX, USA
    Vehicle:
    2018 Tesla Model 3
    Model:
    N/A
    Totally agree, but the time period can be short when you run the software in multiple sandboxes with automated tests. This is how I have dealt with large, complicated projects. The problem is getting the test routines correct and covering all the situations. Teslas themselves, even with autopilot off, can give the company the test cases: the car can calculate what autopilot would do and compare it with what the driver actually did. Visualization allows a driver to see what the system thinks it sees. When I got my car in 2018, visualization was pretty poor in some situations; now it sees and properly identifies traffic cones, bicycles, scooters, pedestrians, traffic signs, and lights. Tesla put a more powerful computer in my car to deal with the cameras better, and I expect that if they get self-driving working, my radar, cameras, and sonars will still work, but it will need better software and a much more powerful computer. That next computer has been designed.
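    The "calculate what autopilot would do and compare with what the driver did" idea described above is often called shadow mode. A toy sketch of it, with every name and threshold an illustrative assumption rather than anything from Tesla's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    speed_mph: float          # vehicle speed at this moment
    planned_steer_deg: float  # what the software would have commanded
    driver_steer_deg: float   # what the human actually did

def disagreements(frames: list[Frame], steer_tol_deg: float = 5.0) -> list[Frame]:
    """Flag frames where the human's steering diverged from the plan;
    these become candidate test cases to upload for review."""
    return [f for f in frames
            if abs(f.planned_steer_deg - f.driver_steer_deg) > steer_tol_deg]

log = [
    Frame(45.0, 0.0, 0.5),    # agreement: tiny difference
    Frame(45.0, 0.0, 12.0),   # driver swerved -- worth uploading
    Frame(30.0, -3.0, -2.5),  # agreement
]
flagged = disagreements(log)
print(f"{len(flagged)} of {len(log)} frames flagged for upload")  # 1 of 3
```

    The appeal of this design is that the fleet generates labeled disagreement cases passively, without the assistance system ever being in control.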
     
  14. fuzzy1

    fuzzy1 Senior Member

    Joined:
    Feb 26, 2009
    17,557
    10,324
    90
    Location:
    Western Washington
    Vehicle:
    Other Hybrid
    Model:
    N/A
    The original cruise controls didn't take over enough driver function to allow or facilitate full human disengagement, prompt an increased number of drivers to doze off, or let them put all their attention into watching movies or playing video games.

     
    #34 fuzzy1, Feb 18, 2023
    Last edited: Feb 18, 2023
  15. hill

    hill High Fiber Member

    Joined:
    Jun 23, 2005
    20,191
    8,360
    54
    Location:
    Montana & Nashville, TN
    Vehicle:
    2018 Chevy Volt
    Model:
    Premium
    Um, prosecute? No - they don't.... LOL
    In fact, in many parts of California you can walk out of Walmart with a $999 big-screen TV - no criminal charges.
    Attractive nuisance, on the other hand, is a civil tort - & it's a doctrine that applies to children, because a 9-year-old is more likely than a 45-year-old to crawl under a fence to see if they can drive a tractor. Not criminal. Remember? That's why Musk was not guilty of any crime - much less of a civil tortious act that applies to kids. But thanks for the legal treatise anyway.
    .
     
    austingreen likes this.
  16. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    27,670
    15,664
    0
    Location:
    Huntsville AL
    Vehicle:
    2018 Tesla Model 3
    Model:
    Prime Plus
    Having completed a 1,500-mile round trip, I saw more than a few drivers who were not as skilled as Autopilot/FSD. Some probably should have pulled over or found a cheap motel for a nap.

    Delivering Mom’s ashes to join Dad’s in Arlington National Cemetery, left early Thursday, overnight Hilton with free charging, graveside Friday afternoon, and back home 7:30 AM Saturday morning.


    Bob Wilson
     
    Gokhan, hill and austingreen like this.
  17. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    1,365
    732
    0
    Location:
    Near Silicon Valley
    Vehicle:
    2024 Prius Prime
    Model:
    XSE Premium
    It's not the drivers that are learning. It's Tesla's neural net that is being "taught" by example. Unfortunately, the humans are the teachers, and they are amateur drivers. Either the teachers are not necessarily good drivers, or the learning algorithms are poorly implemented. Either way, the software has made a lot of poor decisions, including speeding and being programmed to run stop signs.

    Tesla has had a recurring problem for several years: Teslas traveling at freeway speeds fail to detect hazards such as huge fire trucks with flashing lights and slam into them at full speed. Today another Tesla owner died when the car plowed into fire trucks that were blocking the freeway.

    https://www.nbcbayarea.com/news/local/tesla-fire-truck-crash-i-680-east-bay

    And then Tesla owners, who have a vested interest in the value of their cars, swear up and down that it would not happen if the driver paid attention. How inhumane does one have to be to say that when yet another person died today in the exact same scenario as half a dozen others, a scenario Musk claimed to have fixed long ago?
     
    Todd Bonzalez likes this.
  18. Todd Bonzalez

    Todd Bonzalez Active Member

    Joined:
    Apr 3, 2022
    250
    160
    1
    Location:
    Ireland
    Vehicle:
    2004 Prius
    Model:
    Base
    I'd have some questions about how someone doesn't see a big red fire truck on the road in front of them. Maybe something to do with:

    Sounds like someone needs to invent a device that disables Autopilot/FSD on all Teslas in the vicinity of active emergency workers
     
  19. austingreen

    austingreen Senior Member

    Joined:
    Nov 3, 2009
    13,602
    4,136
    0
    Location:
    Austin, TX, USA
    Vehicle:
    2018 Tesla Model 3
    Model:
    N/A
    Well, if we use the scientific method, your hypothesis is that Tesla vehicles running autopilot are less safe than other vehicles not running it.

    NHTSA has gotten all the data from Tesla; other carmakers don't collect it. Some evil billionaire in California has lied with statistics, but as you saw from my link, the same drivers driving their other cars were 9x more likely to get in an accident than in a Tesla. You can of course turn off autopilot, go 116 mph, and crash and kill people, as two different Florida drivers have done. Maybe you don't need to go as fast to kill people in a Yaris or another cheap car, as statistics show. But is this the software's fault or the driver's? (In one of those instances it was partially a Tesla service center's fault: while repairing the car it removed a parental speed control, and the owner's child then decided to go as fast as possible.)

    When your hypothesis is tested by an experiment and shown to be incorrect, it is time to change the hypothesis. Unfortunately, in this tribal world, anti-Tesla people will gather together to falsify the data. Look at real data, not fudged data, then tell me why you think a Tesla is less safe.

    Those of us with experience have told you that the system, as long as there is a safety driver paying attention, is much safer than a car without it. You, on the other hand, keep repeating made-up talking points. It is not your fault. Find a friend with a Tesla, sit in the passenger seat, and see what it actually is instead of deciding that it is all bad.
     
    Zythryn and bwilson4web like this.
  20. fuzzy1

    fuzzy1 Senior Member

    Joined:
    Feb 26, 2009
    17,557
    10,324
    90
    Location:
    Western Washington
    Vehicle:
    Other Hybrid
    Model:
    N/A
    Musk's defense may be that errors by drivers of non-Teslas killed over 100 people today. And every day.

    My problem is with the drivers who think the driver assistance systems are good enough for them to mentally disengage, getting ahead of today's technical reality. And with systems that let drivers get away with that disengagement.
     
    Todd Bonzalez and Trollbait like this.