• exscape@kbin.social · 1 year ago

    This kind of serious trouble (from the article):

    The Department of Justice is currently investigating Tesla for a series of accidents — some fatal — that occurred while their autonomous software was in use. In the DoJ’s eyes, Tesla’s marketing and communication departments sold their software as a fully autonomous system, which is far from the truth. As a result, some consumers used it as such, resulting in tragedy. Many of these accidents occurred after Tesla went visual-only, meaning these cars were using the allegedly less capable software.

    Consequently, Tesla faces severe ramifications if the DoJ finds them guilty.

    And of course:

    The report even found that Musk rushed the release of FSD (Full Self-Driving) before it was ready and that, according to former Tesla employees, even today, the software isn’t safe for public road use. In fact, a former test operator went on record saying that the company is “nowhere close” to having a finished product.

    So even though it seems to work for you, the people who created it don’t seem to think it’s safe enough to use.

    • obviouspornalt@lemmynsfw.com · 1 year ago

      My neighborhood has roundabouts. A couple of times when there’s not any traffic around, I’ve let autopilot attempt to navigate them. It works, mostly, but it’s quite unnerving. AP wants to go through them way faster than I would drive through them myself.

      • SirEDCaLot@lemmy.fmhy.ml · 1 year ago

        AP or FSD?
        AP is old and frankly kinda sucks at a lot of things.
        FSD Beta, if anything, I’ve found to be too cautious about such things.

    • CmdrShepard@lemmy.one · 1 year ago · edited

      I think you and the author are drawing conclusions that aren’t supported by the quote.

      The engineers stated it’s “nowhere close” to being a finished product, which is evident from the fact that it’s only L2 and in beta.

      The DOJ is investigating, but we know some of these crashes were caused by people disregarding the safety features (like keeping your hands on the wheel and eyes on the road) when they crashed. What comes of the investigation is still up in the air, and I think a lot of the motivation is driven by publicity from articles such as this one, not necessarily because the system is unsafe to use at all.

      The truth is that nobody has achieved full automation, so we don’t know what a full automation suite should look like in terms of hardware and software. The Mercedes system is a joke in that it can only be used on the highway below 40 MPH. I dunno what speed limits are where you’re located, but in my area all the highways are 55+ MPH.

      Furthermore, the robotaxis are being used in places like Vegas where they’re geofenced to premapped city streets in areas where the weather is perfect all year round. The entire industry has a long way to go before anyone reaches a finished product.

      • exscape@kbin.social · 1 year ago

        I dunno, I think “even today, the software isn’t safe for public road use” is pretty clear-cut and has nothing to do with the level of automation.

        I’m not suggesting anyone else is way ahead, though. But I do think that removing all non-visual sensors is an obvious step back, especially in poor weather where visibility may be near zero but other sensor types could be relatively unimpeded.

        • CmdrShepard@lemmy.one · 1 year ago · edited

          I think “even today, the software isn’t safe for public road use” is pretty clear-cut and has nothing to do with the level of automation.

          Keep in mind this isn’t even a quote and was attributed to someone who doesn’t even work on this tech for the company. What has you convinced it’s unsafe for use now? A few car accidents? What about all the accidents that have been prevented by this same system? You might suggest a ban, but if the crash rate or fatality rate increases, haven’t you made conditions less safe on the road?

          The problem with articles like this is that they focus on things like “Tesla has experienced 50 crashes in the last 5 years!!” but don’t include context, like the fact that cars without these systems have crash rates 10x higher or more. These systems can still be a net benefit even if they don’t work 100% of the time or prevent 100% of crashes.