
Featured Tesla recalled over "full self-driving"

Discussion in 'Prius, Hybrid, EV and Alt-Fuel News' started by Gokhan, Feb 16, 2023.

  1. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    Location:
    Huntsville AL
    Vehicle:
    2018 Tesla Model 3
    Model:
    Prime Plus
    And does what:
    • Brings the car to an immediate stop wherever it is?
    • Sets the emergency lights flashing and honks the horn?
    • Reduces speed and changes lanes to avoid the service vehicles?
    The current version does the last of these actions.

    Bob Wilson
     
  2. Trollbait

    Trollbait It's a D&D thing

    Joined:
    Feb 7, 2006
    Location:
    eastern Pennsylvania
    Vehicle:
    Other Non-Hybrid
    With all the cars being sold with ADAS, soon there won't be a car without a student driver at the wheel.
    Which is why CR and others are putting a heavy emphasis on the driver monitoring and engagement systems that cars with ADAS have as part of their ratings.
     
  3. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    Location:
    Near Silicon Valley
    Vehicle:
    2024 Prius Prime
    Model:
    XSE Premium
    My assertion is that Tesla is deliberately using a dataset that is known to be faulty to guide the cars on public roads in an attempt to force humans to react swiftly and correctly when the car behaves abnormally, unsafely or illegally.

    It is my opinion (and that of the officer that ticketed me and the judge that upheld the charge) that driving in such a way that another car needs to make an emergency maneuver to avoid an accident falls into the category of reckless driving, a misdemeanor.

    I've seen many videos that showed the Tesla doing strange and unpredictable maneuvers under control of the FSD. I've seen them driving on the wrong side of the street in downtown San Jose. I've seen them suddenly brake for unknown reasons and suddenly accelerate. I've seen them break the law by changing lanes in the middle of an intersection. I won't further expound on the driving errors that have been caught on video. Suffice it to say that if I'd driven like that when I took mandated driving lessons 50 years ago, I'd have flunked the course.

    The proper hypothesis would be: Due to the nature of forcing errors to produce training material, the cars are deliberately driving in unpredictable ways without warning to the other cars in the vicinity. It appears that the Tesla testing group does not have a means to ensure that their video feeds are covering all the necessary scenarios.

    The proof that the training system is not working right was presented today. There were 12 known cases where Teslas were involved in crashes into emergency vehicles at the scene of a crash. Musk insisted that Tesla add a huge amount of video footage to train the neural net. In follow-up tweets he said it was not a problem anymore. And yet it happened again this morning.



    Where is the report from an independent authority to verify that Musk did not alter the data? As you state, other car manufacturers don't collect the same level of information, and there are no restraints to prevent Tesla from fudging the data. FSD beta drivers have posted that they were asked not to post critiques online. The justification was simple: "It was test software, not ready for production."

    I see a very simple explanation for that. It has to do with muscle memory and dissimilar vehicles. Consider just the differences in controls. Your primary car is the Tesla, one that is configured for one-pedal driving: you initiate braking by lifting your foot. You might do that out of habit in a 911, only to realize a second or so later that the car is not slowing quickly. I make similar mistakes in my wife's car because I only drive it a few miles every few months.

    Is it possible that the confusion caused by dissimilar controls is causing accidents when people drive their backup car occasionally? I'd say it's possible. One of the things I learned in a motorcycle safety training class was that the once-a-month weekend rider was the most likely to have an accident because their reflexes were slower.

    And then there are the changes in capabilities in recent years. A new Tesla is much more capable in most areas compared to virtually any car built 5 years ago.

    TL;DR: I don't think your proposed hypothesis matches my statements. Furthermore, today's death happened in a scenario that Musk claims was covered. That would indicate either that Tesla does not have a valid way to validate its own tests, or that Musk lied. Either fits the situation.

    Question for AustinGreen: Do you work for Tesla? You speak as if you have some insider information that's not otherwise available.
     
  4. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    Location:
    Near Silicon Valley
    Vehicle:
    2024 Prius Prime
    Model:
    XSE Premium
    Really? What version of FSD or Autopilot reliably detects service vehicles or work crews and obeys the California statute that requires that it slow or change lanes? Have you tested it, and how'd you do that?

    For others outside California: it's become common practice to protect accident victims and first responders from secondary impacts by parking a firetruck diagonally across the lane to block the road. A law was enacted to require all cars and trucks to slow and/or move over when approaching highway workers or emergency vehicles with flashing lights.
     
  5. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    Location:
    Near Silicon Valley
    Vehicle:
    2024 Prius Prime
    Model:
    XSE Premium
    I hope you are not referring to me in that last sentence. That would hurt my feelings. :) But that's OK. I'm kind of puzzled by the repeated assertions that you have data showing Tesla is safer, yet the Boston Globe link has no information about the data, nor about the parameters and adjustments that Cambridge Mobile Telematics made to come up with their report.


    Reading this paragraph, I have no idea how you know what the driver was doing with his feet as cars were piling up. What's your secret? I also can't understand how one can say their car has had no bugs and then tack on a disclaimer that Autopilot does not work properly, but that's OK because you seem to think that the system controlling your speed, braking, and steering at freeway speeds is not mission critical.

    Shirley you jest.

    I've driven close to a million miles over the last 50 years. I have not been hit by ANY cars in the last 40 years. What is your accident rate per million miles? Data talks.
     
  6. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    Location:
    Huntsville AL
    Vehicle:
    2018 Tesla Model 3
    Model:
    Prime Plus
    Eventually, NHTSA has to specify what ADAS systems must do to be acceptable. While by no means perfect, that specification should be driven by factual accident analysis, including insurance company claims. Human intervention must be minimized.

    Bob Wilson
     
    #46 bwilson4web, Feb 19, 2023
    Last edited: Feb 19, 2023
  7. Todd Bonzalez

    Todd Bonzalez Active Member

    Joined:
    Apr 3, 2022
    Location:
    Ireland
    Vehicle:
    2004 Prius
    Model:
    Base
    How about dumbing it down to say: "The misleading name (Autopilot) given to the feature gives morons the false belief that they don't need to do anything"? We've all seen the videos where people defeat the "hands on wheel" check with an orange or whatever. What's the legal term for that?

    The end of the story is that the person in the driver's seat bears responsibility for driving with the necessary care and attention to prevent accidents...whether they're driving a Tesla or a Trabant. Autopilot just makes it easier to engage in distracted driving (or whatever you want to call it). I don't believe the law as it currently stands allows drivers to delegate responsibility for driving safely to a computer...but maybe 20, 30, 40 years from now, who knows?

    Again I ask how does someone going down the road fail to see a big red fire truck with flashing lights unless they're (literally) asleep at the wheel?
     
  8. john1701a

    john1701a Prius Guru

    Joined:
    Jan 6, 2004
    Location:
    Minnesota
    Vehicle:
    2017 Prius Prime
    Model:
    Prime Advanced
    That gets to the heart of the problem... what does "minimize" mean? For those of us who live where there is snow & ice, the idea of ADAS is both laughable & scary. There's the reality of human instinct coming into play as well, where the driver simply doesn't like the choices ADAS makes.

    It is a huge mess that elevates "over promise, under deliver" to an entirely new level. To fix that, realistic expectations need to be set. But expecting anything clear & concise from Musk that does not change with the wind simply isn't going to happen... which is why NHTSA getting involved is required at this point.

    Stepping back to look at the bigger picture, Tesla has other, more important problems to address. Fallout from the price slashes could get ugly. To be a proud owner of a vehicle one day, then have it suddenly lose a large amount of resale value the next, is quite disheartening. Then to be kicked while you're down by hearing that exclusive access to an impressive national charging network will be lost too... how do you respond?

    This is textbook stuff. Tesla raced to the cliff, gambling that Innovator's Dilemma wouldn't come back to bite them. The race to drop cost in ways competition could not match, while cranking out much higher volumes of their long-in-the-tooth product, has some believers losing faith.
     
  9. bwilson4web

    bwilson4web BMW i3 and Model 3

    Joined:
    Nov 25, 2005
    Location:
    Huntsville AL
    Vehicle:
    2018 Tesla Model 3
    Model:
    Prime Plus
    Leaving at 3 AM Thursday morning, I returned at 7:30 AM Saturday morning.


    AutoPilot did pretty much all of the driving, including in rain and at night. I monitored for the 24+ hours the trip took and only had a few interventions: (1) accelerating to avoid getting blocked in, and (2) left turns and some right turns, which are best done manually. AutoPilot handled the rest of the drive, which was much better than the weaving pickups, poor drivers, and semi-trailer trucks.
     
  10. Zythryn

    Zythryn Senior Member

    Joined:
    Apr 28, 2008
    Location:
    Minnesota
    Vehicle:
    Other Electric Vehicle
    Model:
    N/A
    Perhaps you should ask this Ford Escape driver?
    Police: Car crashes into fire truck on I-91 in Windsor Locks

    While it seems unimaginable to many of us, it has been happening for a very long time. Regardless of vehicle.
     
  11. John321

    John321 Senior Member

    Joined:
    Nov 16, 2018
    Location:
    Kentucky
    Vehicle:
    2008 Prius
    Model:
    Two
    As an individual, I would start to worry about my own liability if I used Autopilot. Given the current news, ongoing investigations, and previously documented crashes/problems, an individual would do well to take into account his own liability before letting these systems take over his driving right now.

    In our litigation-happy society, I expect personal liability lawyers get a tingle all over when they hear there has been an injury/accident involving self-driving features in an auto.
     
    #51 John321, Feb 19, 2023
    Last edited: Feb 19, 2023
  12. Trollbait

    Trollbait It's a D&D thing

    Joined:
    Feb 7, 2006
    Location:
    eastern Pennsylvania
    Vehicle:
    Other Non-Hybrid
    A few years ago, a study was commissioned in the UK on how liability and regulation for these advanced ADAS and autonomous systems should be handled. (It was posted here; I don't care to look it up now.)

    The main takeaway was that the manufacturer of an autonomous, full self-driving system is the one liable in the event of a crash. The report defined such systems as ones that can fully handle all driving in snow, rain, wind, at night, etc.: a full Level 5 system. In addition, any system beneath that level of autonomy should not be named in a way that implies it is capable of such abilities.
     
  13. Todd Bonzalez

    Todd Bonzalez Active Member

    Joined:
    Apr 3, 2022
    Location:
    Ireland
    Vehicle:
    2004 Prius
    Model:
    Base
    The Highway Code - Introduction - Guidance - GOV.UK

    That last part in bold implies that a licensed, unimpaired, awake driver must be ready to take over at any time.

    The only problem with this is that no self-driving vehicles are legally recognised in the UK, so it's a moot point.

    Self-driving vehicles listed for use in Great Britain - GOV.UK

     
    Trollbait likes this.
  14. Trollbait

    Trollbait It's a D&D thing

    Joined:
    Feb 7, 2006
    Location:
    eastern Pennsylvania
    Vehicle:
    Other Non-Hybrid
    Yet they aren't responsible for how the car drives, and can divert all attention away from the road, as long as they don't fall asleep. o_O

    I see that law applies "automated" and "self-driving" to cars that aren't full Level 5. It is also from 2018.

    The report I was referring to was from last year.
    New UK plan for self-driving cars details liability protections | World Economic Forum
    This might be the actual report.
    Responsible Innovation in Self-Driving Vehicles - GOV.UK
     
  15. fuzzy1

    fuzzy1 Senior Member

    Joined:
    Feb 26, 2009
    Location:
    Western Washington
    Vehicle:
    Other Hybrid
    Model:
    N/A
    Highway hypnosis was identified more than a century ago.

    The real target now is to do better than what human drivers demonstrated just before widespread pandemic misbehavior set in: a maximum of about 1 fatality per 100 million vehicle miles traveled, over a realistic weighting of all drivers over all roads and all conditions, including lousy roads and weather that no current ADAS system will touch.

    Which individual humans have the data to document themselves to be better than that? None. It isn't humanly possible.

    Remember also that, given the way the math works with the wide distribution of human behaviors, the overall average is weighted heavily toward the bad drivers, not the mediocre, good, and great ones. One bad driver can push the fatality rate up more than 100 perfect drivers can pull it down. A toy sketch of that arithmetic appears below.
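
    Here's that sketch (invented numbers, Python purely for illustration): the pooled rate is total fatalities divided by total miles, and since perfect drivers are already at zero, a single bad driver dominates the average.

    ```python
    # Toy illustration with invented numbers, not real crash data.
    # Pooled fatality rate = total fatalities / total miles, so one
    # high-rate driver dominates: the perfect drivers' rates are
    # already floored at zero and cannot pull the average down further.

    rates = [0.0] * 100 + [100.0]  # fatalities per 100M miles: 100 perfect drivers, 1 bad one
    miles = [1.0] * 101            # each driver covers 1 million miles

    # Expected fatalities per driver (miles are in millions, i.e. m/100 units of 100M miles).
    fatalities = sum(r * (m / 100.0) for r, m in zip(rates, miles))
    pooled = fatalities / (sum(miles) / 100.0)  # rate per 100M vehicle miles

    print(f"pooled rate: {pooled:.2f} fatalities per 100M miles")  # ~0.99, vs. a median of 0.00
    ```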

    I would hope that people are not getting confused about who is the pilot, and who is the co-pilot. To my understanding, current ADAS systems are qualified for only the right-hand seat, not the left-hand.
     
    #55 fuzzy1, Feb 19, 2023
    Last edited: Feb 19, 2023
    austingreen likes this.
  16. Todd Bonzalez

    Todd Bonzalez Active Member

    Joined:
    Apr 3, 2022
    Location:
    Ireland
    Vehicle:
    2004 Prius
    Model:
    Base
    Yes, the wording does indeed suggest that the person in the driver's seat becomes liable at the moment when the car hands control back to the driver for whatever reason. I guess everything will be logged and timestamped in the case of an accident.

    The law's obviously been enacted with an eye to the future when specific models of self-driving vehicles have been certified/approved. A moot point at this time, as stated before. Tesla's current beta software obviously doesn't make the cut.

    As far as drivers are concerned, the Highway Code represents the current state of driving rules in the UK. It refers to many laws from past decades. The laws aren't invalid just because they're old.

    The WEF article has no useful information, and they don't have any legislative input anyway AFAIK.

    From a quick skim of the gov.uk page, it doesn't seem to be relevant either; it's more about welcoming responsible innovators, etc.

    Not to dismiss the consultative value of studies or reports, but anybody can commission one to support their point of view. What makes it into law is all that counts.
     
  17. austingreen

    austingreen Senior Member

    Joined:
    Nov 3, 2009
    Location:
    Austin, TX, USA
    Vehicle:
    2018 Tesla Model 3
    Model:
    N/A
    You should not be insulted. It was a simple question: have you looked at the data, or are you just repeating things you have heard?

    The data for this study is behind a paywall. It is good data, but sparse. The company provides telematics for many insurance companies. They looked at households with a Tesla or a Porsche plus another car that was supplying them with analytics. They found that more accidents per mile were committed in the other car than in the Tesla, and that more accidents were committed in the Porsche than in the other car. Same households, so same driver group.

    A different study they have done says Tesla Model 3 drivers are 5x more likely to accelerate excessively than drivers of the average car (the excessive-acceleration threshold is set by the insurance companies), yet they are less likely to have accidents caused by distracted driving. Tesla releases crash info, and Autopilot and the Self-Driving beta are much less likely to be engaged in a crash. This may simply be because drivers take over in more dangerous situations, so I would not draw any conclusions without examining the exact data sets.

    NHTSA has Teslas being involved in far fewer crashes and fatal crashes than the average car. The Camry last year was the 4th most likely car to be involved in fatal crashes. This data too is sparse: most Teslas on the road were built in the last 4 years, but there are many old Camrys on the road without the more advanced Toyota safety systems. The data for the age of the cars is not shown.
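
    To make the within-household comparison concrete, here is a minimal sketch with made-up numbers (nothing from the actual CMT dataset, which is paywalled): the point is only that comparing vehicles inside the same household holds the driver group constant.

    ```python
    # Hypothetical sketch of a within-household comparison.
    # All numbers are invented; the real CMT data is behind a paywall.
    # Keying on households controls for the driver group: the same
    # people drive both the Tesla and the "other" car.

    households = [
        # (tesla_accidents, tesla_miles, other_accidents, other_miles)
        (1, 120_000, 3,  90_000),
        (0,  80_000, 1,  60_000),
        (2, 150_000, 4, 110_000),
    ]

    tesla_rate = sum(h[0] for h in households) / sum(h[1] for h in households)
    other_rate = sum(h[2] for h in households) / sum(h[3] for h in households)

    print(f"Tesla: {tesla_rate * 1e6:.1f} accidents per million miles")
    print(f"Other: {other_rate * 1e6:.1f} accidents per million miles")
    ```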

    My simple secret is that I have driven the car and know how Autopilot and the Self-Driving beta work. If you press the accelerator while they are engaged, the car will accelerate. The car slowed, then stopped; the driver did not press the accelerator. If there had been a malfunction of the accelerator, that would already have been disclosed.

    The next thing you say is that I said the car had no bugs. I did not say that. I said it does not appear to have any mission-critical bugs.

    Mission critical is a defined term: it means the system must run correctly at all times. The Full Self-Driving beta is really Level 2 autonomous driving, and according to NHTSA such systems require a human driver to be safe. Teslas, like many cars, have mission-critical systems such as acceleration, braking, and steering. So if you believe, despite the warnings when you sign up for the beta, all the news, and the experience of drivers using it, that this beta will brake, steer, change lanes, turn on roads, etc. at all times without human intervention, then you are operating on a false assumption. I have found that it operates very well on its own on highways and is much more relaxing in stop-and-go highway traffic than driving myself. It sometimes decides to make lane changes when I don't want them, and I cancel them.

    So your answer is no, you have never been hit by a Tesla. Great. Why do you think they will cause you to be in accidents in the future?

    I am not old enough to have driven for 50 years ;) or anywhere close to 1 million miles. The 3 accidents in the Prius were: 1) early one morning, a driver exited the highway at excessive speed and came over 2 lanes on the access road to hit me at my rear driver-side passenger door; 2) at a stop light, a truck backed into me at low speed, but the trailer hitch caused lots of damage; he apologized and said his wife was yelling at him to get over to the left-turn lane, so he didn't look for cars stopped behind him and didn't hear my horn until it was too late; 3) the car was parked on the street by a bar I was in, and I came out to find someone had hit it. In that last case Sentry Mode would have gotten the plate; as it was, I had to file on uninsured motorist coverage. I freely admit that at times I am a drowsy driver or get involved in conversations with my passengers. Autopilot keeps me centered in the lane and not exceeding the speed limit in those cases, and I believe it makes me a better, safer driver.
     
    Zythryn likes this.
  18. austingreen

    austingreen Senior Member

    Joined:
    Nov 3, 2009
    Location:
    Austin, TX, USA
    Vehicle:
    2018 Tesla Model 3
    Model:
    N/A
    Agree that Full Self-Driving beta is a very bad name. It may get there, but it is Level 2 autonomy. Autopilot, I think, is not a bad name, but it really didn't live up to it until 2020 ;) and lots of idiots did bad things at the beginning. These bad drivers also crash their other cars. Many of my friends thought my car could drive itself; I showed them the cool things it can do, but also the places where it doesn't do well. I think autopilot is a misunderstood term, but I get why people are confused.

    +1
    They are allowing some autonomous cars in geofenced Phoenix and San Francisco. IMHO it is too early to let these vehicles operate without drivers. I have seen some with safety drivers in my neighborhood.

    Here's what it was like to ride in a Waymo with no driver in Phoenix

    I think it will take a long time, but it may be shorter than 20 years.


    Putting on makeup, using their phone... there have been lots of incidents of distracted drivers hitting these vehicles. Definitely the drivers' responsibility, but I hope Tesla gets its vision system to brake earlier in these situations.
     
  19. Todd Bonzalez

    Todd Bonzalez Active Member

    Joined:
    Apr 3, 2022
    Location:
    Ireland
    Vehicle:
    2004 Prius
    Model:
    Base
    Thankfully this isn't generally a defence people can use in court.

    If you're not able to pay attention to your surroundings due to "hypnosis," I believe this is usually regarded as reckless driving.
     
  20. dbstoo

    dbstoo Senior Member

    Joined:
    Oct 7, 2012
    Location:
    Near Silicon Valley
    Vehicle:
    2024 Prius Prime
    Model:
    XSE Premium
    I beg to differ. What you know is how it is supposed to work, and what it's supposed to do when Autopilot shuts down in a normal situation. But that's not the same as knowing how it works at any given moment. In this specific instance, if the software had worked as expected, it would not have caused the pileup. And that raises the question of what the software is designed to do when it tries to pull over into a nonexistent breakdown lane (actually an occupied fast lane) in a badly lit tunnel. Will it even register driver inputs such as the accelerator or steering? Without seeing the code, the design specs, and the logs, there is no way for you or me to know what Tesla engineers put in the software to handle such a situation.

    As for the assertion that "the driver did not press the accelerator; if there was a malfunction of the accelerator then that would already have been disclosed": that's an interesting position to take. From a system point of view, it does not take a malfunction of the accelerator to cause loss of control. I'm also not sure why you are sure that a car defect (such as a faulty accelerator) would have been disclosed. Just this week a guy made the news when the steering wheel came off while he was driving his Model Y. For some reason Tesla support said that was not a flaw and charged him for the repair. (Summary from memory, but the main point is Tesla's denial of the error.)

    Just looking at the data I have available (videos and accident reports) I can think of a commonality between this week's death as well as the crash in the tunnel.

    The common link is glare and hundreds of flashing lights. Glare from the flashing strobes on the emergency vehicles. Flashing lights as the sun shining in the west end of the tunnel reflects off all the cars and the attendant 100 or so tail lights suddenly in the camera's field of vision.

    If we were privy to the real data, I'd put money on system overload from trying to track too many bogeys at once. Did the system running Autopilot have enough processor cycles left to service the interrupt if the accelerator was depressed? Did it have enough cycles to determine whether the 100 or more bogeys that appeared for only a split second before disappearing were warning beacons on huge, immovable objects like a fire truck, or brake lights triggered by a mass of cars whose drivers were startled by the sun?
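
    To be explicit that this is speculation, here is a purely hypothetical toy model (invented numbers, no knowledge of Tesla's actual architecture) of the concern: a fixed per-frame perception budget being consumed by per-object tracking cost.

    ```python
    # Purely speculative toy model with invented numbers; Tesla's real
    # perception pipeline is not public. This only illustrates the idea
    # of a fixed frame budget being exhausted as tracked objects spike.

    FRAME_BUDGET_MS = 33.0    # assumed ~30 fps perception loop
    COST_PER_OBJECT_MS = 0.4  # assumed cost to track one object per frame

    for objects in (20, 50, 100, 150):
        used = objects * COST_PER_OBJECT_MS
        slack = FRAME_BUDGET_MS - used  # time left for planning, driver inputs, etc.
        status = " (OVER BUDGET)" if slack < 0 else ""
        print(f"{objects:3d} tracked objects -> {used:5.1f} ms used, {slack:6.1f} ms slack{status}")
    ```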

    Before stating the impossibility of that being the case, consider the data: The cars did NOT behave as expected. The systems were designed to use different, tested CPUs, but alternate parts were substituted sometime during the pandemic. Two of the prime collision sensors that were originally part of the design (sonar and radar) were disabled, removed, deprecated, or simply not installed in recent Tesla versions. The data reported about the crash is gathered by the same systems that are suspect in this circumstance, and it is interpreted/filtered before being made available to the public. A lot of blame is being placed on the drivers who did not respond properly, yet I've seen no proof that the drivers were given a warning of an imminent crash with enough time to react.**

    Someone in this thread stated that Tesla provided new and improved hardware more than once so that he could participate in training the neural net for the FSD product. What measures are they using to determine that today's dataset will still work with last month's hardware?



    ** The firetrucks in this week's death were visible for more than a quarter mile before the impact. I don't know what the speed was, but the front end of the car was obliterated.