OK, it's just a deer, but the future is clear. These things are going to start killing people left and right.

How many kids is Elon going to kill before we shut him down? What's the number of children we're going to allow Elon to murder every year?

  • nimble@lemmy.blahaj.zone · 7 minutes ago

    Friendly reminder that Tesla Autopilot is an AI trained on live data. If it hasn't seen something enough times, it won't know to stop. This is how you get a Tesla running full speed into an overturned semi, and many, many other accidents.

  • Gammelfisch@lemmy.world · 1 hour ago

    So, a kid on a bicycle or scooter is an edge case? Fuck the Muskrat and strip him of US citizenship for illegally working in the USA. Another question. WTF was the driver doing?

    • M600@lemmy.world · 17 minutes ago

      In regards to the deer, it looks like it might have been hard for the driver to see. I remember learning in driver's ed that it is better to hit the animal than to swerve to miss it, since swerving might send you into a car beside you, so maybe that is what they were thinking?

  • pdxfed@lemmy.world · 2 hours ago

    You just need to buy the North America Animal Recognition AI subscription and this wouldn’t be an issue plebs, it will stop for 28 out of 139 mammals!

  • w3dd1e@lemm.ee · 4 hours ago

    Deer aren’t edge cases. If you are in a rural community or the suburbs, deer are a daily way of life.

    As more and more of their forests are destroyed, deer are becoming a daily part of city life too. I live in the middle of a large midwestern city, in a neighborhood with houses crowded together, and I see deer in my lawn regularly.

    • Kecessa@sh.itjust.works · 3 hours ago

      People are acting like drivers don't hit deer at full speed while they're in control of the car. Unless we get numbers comparing self-driving vs human-driven cars, this is just a non-story whose only goal is discrediting Musk, when there's so much other shit that can be used to discredit him.

      • T156@lemmy.world · 52 minutes ago

        People are acting like drivers don't hit deer at full speed while they're in control of the car.

        I should be very surprised if people don’t generally try to brake or avoid hitting an animal (with some exceptions), if only so that they don’t break the car. Whether they succeed at that is another question entirely.

        • Kecessa@sh.itjust.works · 15 minutes ago

          People drive drunk, people drive while checking their phones, people panic and freeze, and deer often just jump in front of you out of nowhere.

          People hit fucking humans without braking because they’re not paying attention to what the fuck they’re doing!

  • Emerald@lemmy.world · 3 hours ago

    I notice nobody has commented on the fact that the driver should’ve reacted to the deer. It’s not Tesla’s responsibility to emergency brake, even if that is a feature in the system. Drivers are responsible for their vehicle’s movements at the end of the day.

    • rsuri@lemmy.world · 1 hour ago

      True but if Tesla keeps acting like they’re on the verge of an unsupervised, steering wheel-free system…this is more evidence that they’re not. I doubt we’ll see a cybercab with no controls for the next 10 years if the current tech is still ignoring large, highly predictable objects in the road.

    • inclementimmigrant@lemmy.world · 3 hours ago

      That would be lovely if it wasn’t called and marketed as Full Self-Driving.

      If you sell vaporware/incomplete software and release it into the wild, then you are responsible for all the chaos it brings.

    • chaogomu@lemmy.world · 4 hours ago

      Then it’s not “Full self driving”. It’s at best lane assistance, but I wouldn’t trust that either.

      Elon needs to shut the fuck up about self driving and maybe issue a full recall, because he’s going to get people killed.

  • MagicShel@lemmy.zip · 3 hours ago

    I hit a deer on the highway in the middle of the night going about 80mph. I smelled the failed airbag charge and proceeded to drive home without stopping. By the time I stopped, I would never have been able to find the deer. If your vehicle isn’t disabled, what’s the big deal about stopping?

    I've struck two deer and my car wasn't disabled either time. My daughter hit one and totaled our van. She stopped.

    That said, fuck Musk.

    • Sentau@discuss.tchncs.de · 44 minutes ago

      Maybe drive a little slower at night. If you can't spot and react to animals in your path, you won't be able to react when it's a human.

    • xthexder@l.sw0.com · 2 hours ago

      Whether or not a human should stop seems beside the point. Autopilot should immediately get the driver to take back control if something unexpected happens, and stop if the driver doesn’t take over. Getting into an actual collision and just continuing to drive is absolutely the wrong behavior for a self-driving car.

    • Madison420@lemmy.world · 3 hours ago

      You're supposed to stop and report it so they can come and get it, so no one else hits it and ends up more squishy than intended.

  • Kbobabob@lemmy.world · 8 hours ago

    Is there video that actually shows it "keeps going"? The way that video loops, I can't tell what happens immediately after.

      • LordKitsuna@lemmy.world · 2 hours ago

        Inb4 it actually stopped with hazards like I've seen in other videos. Fuck Elon and fuck Tesla's marketing of self driving, but I've seen people reach far for karma hate posts on Tesla sooooooo ¯\_(ツ)_/¯

  • Sam_Bass@lemmy.world · 8 hours ago

    The deer is not blameless. Those bastards will race you to try and cross in front of you.

    • WoahWoah@lemmy.world · 6 hours ago

      Finally someone else familiar with the most deadly animal in North America.

      • Buddahriffic@lemmy.world · 5 hours ago

        I’d give the moose the top spot. Maybe not in sheer numbers of deaths, but I’d much rather have an encounter with a deer than a moose.

        Though for sheer numbers I wouldn't give that to deer either; that spot would go to humans, though I admit that's a bit pedantic.

  • NutWrench@lemmy.world · 8 hours ago

    For the 1000th time Tesla: don’t call it “autopilot” when it’s nothing more than a cruise control that needs constant attention.

    • LordKitsuna@lemmy.world · 2 hours ago

      Real autopilot also needs constant attention; the term comes from aviation, and it's not fully autonomous. It maintains heading and altitude and can do minor course corrections.

      It's the "full self driving" wording they use that needs to be shit on.

    • GoodEye8@lemm.ee · 8 hours ago

      It is an autopilot (a poor one, but still one) that legally calls itself cruise control so Tesla won't have to take responsibility when it inevitably breaks the law.

  • whotookkarl@lemmy.world · 7 hours ago

    It doesn't have to never kill people to be an improvement, it just has to kill fewer people than people do

    • ano_ba_to@sopuli.xyz · 3 hours ago

      That's a low bar when you consider how stringent airline safety is in comparison, and flying kills far fewer people than driving does. If sensors can save people's lives, then knowingly not including them for profit is intentionally malicious.

    • rigatti@lemmy.world · 5 hours ago

      True in a purely logical sense, but assigning liability is a huge issue for self-driving vehicles.

      • Kecessa@sh.itjust.works · 1 hour ago

        As long as there are manual controls, the driver is responsible, since they're supposed to be ready to take over.

          • Kecessa@sh.itjust.works · 1 hour ago

            Because it’s not, it’s a car with assisted driving, like all cars you can drive at the moment and with which, surprise surprise, you are held responsible if there’s an accident while it’s in assisted mode.

  • brbposting@sh.itjust.works · 9 hours ago

    Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.

    • Demdaru@lemmy.world · 9 hours ago

      I mean, to be honest…if you are about to hit a deer on the road anyway, speed up. Higher chance the scrawny fucker will get yeeted over you after meeting your car, rather than get juuuuust perfectly booped into the air to crash through the windshield and into your face.

      Official advice I heard many times. Prolly doesn’t apply if you are going slow.

      Edit: Read further down. This advice is effing outdated, disregard. -_- God I am happy I've never had to put it to the test.

      • Buddahriffic@lemmy.world · 5 hours ago

        Haven’t read down yet, but I bet odds are a bit better if you let go of the brake just before impact, to raise the front up a bit.

  • bluGill@fedia.io · 10 hours ago

    Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.

    The real question isn't whether Tesla is better or worse in any one situation, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept it. If Tesla is overall worse, then they shouldn't be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I'll accept a few edge cases where they are worse.

    Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.

    • atempuser23@lemmy.world · 31 minutes ago

      Yes. The question is whether the Tesla is better than any driver in particular. People are given the benefit of the doubt once they pass the driver's test. Companies and AI should not get that. The AI needs to be as good as or better than a GOOD human driver. There is no valid justification for allowing a poorly driving AI because it's better than the average human. If we are going to allow these on the road, they need to be good.

      The video above is HORRID. The weather was clear, there was no opposing traffic, and the deer was standing still. The auto drive absolutely failed.

      If a human driving in these conditions plowed through a deer at 60 mph and didn't even attempt to swerve or stop, they shouldn't be driving.

    • ano_ba_to@sopuli.xyz · 3 hours ago

      Being safer than humans is a decent starting point, but safety should be maximized to the best of a machine's capability, even if it means adding a sensor or two. By that logic, leaving screws loose on a Boeing airplane still makes the plane safer than driving, so Boeing should not be made to take responsibility.

    • Semi-Hemi-Lemmygod@lemmy.world · 6 hours ago

      Humans are also bad drivers who get edge cases wrong all the time.

      It would be so awesome if humans only got the edge cases wrong.

      • xthexder@l.sw0.com · 2 hours ago

        I’ve been able to get demos of autopilot in one of my friend’s cars, and I’ll always remember autopilot correctly stopping at a red light, followed by someone in the next lane over blowing right through it several seconds later at full speed.

        Unfortunately “better than the worst human driver” is a bar we passed a long time ago. From recent demos I’d say we’re getting close to the “average driver”, at least for clear visibility conditions, but I don’t think even that’s enough to have actually driverless cars driving around.

        There were over 9M car crashes with almost 40k deaths in the US in 2020, and it would be insane to just decide that's acceptable for self-driving cars as well. No company is going to want that blood on their hands.

    • AA5B@lemmy.world · 4 hours ago

      Given that they market it as “supervised”, the question only has to be “are humans safer when using this tool than when not using it?”

      One of the cool things I've noticed since recent updates is the car giving a nudge to help me keep centered, even when I'm not using autopilot.

    • snooggums@lemmy.world · 10 hours ago

      Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.

      https://www.reuters.com/business/autos-transportation/nhtsa-opens-probe-into-24-mln-tesla-vehicles-over-full-self-driving-collisions-2024-10-18/

      The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.

      It sure seems like they aren’t being very forthcoming with their data between this and being threatened with fines last year for not providing the data. That makes me suspect they still aren’t telling the truth.

      • atempuser23@lemmy.world · 29 minutes ago

        One trick used is to disengage autopilot when it senses an imminent crash. This vastly lowers the crash count, shifting all blame to the human driver.

      • Billiam@lemmy.world · 8 hours ago

        It sure seems like they aren’t being very forthcoming with their data between this and being threatened with fines last year for not providing the data. That makes me suspect they still aren’t telling the truth.

        I think their silence is very telling, just like their alleged crash test data on Cybertrucks. If your vehicles are that safe, why wouldn’t you be shoving that into every single selling point you have? Why wouldn’t that fact be plastered across every Gigafactory and blaring from every Tesla that drives past on the road? If Tesla’s FSD is that good, and Cybertrucks are that safe, why are they hiding those facts?

        • snooggums@lemmy.world · 6 hours ago

          If the Cybertruck is so safe in crashes, they would be begging third parties to test it so they could smugly lord their third-party-verified crash test data over everyone else.

          But they don't, because they know it would be a repeat of smashing the bulletproof window on stage.

  • Hubi@feddit.org · 14 hours ago

    The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.

    How are these people always such pathetic suckers?

    • nialv7@lemmy.world · 15 minutes ago

      Yeah, this Tesla owner is dumb. Wdym "we just need to train the AI to know what deer butts look like"? Tesla had radar and sonar; it didn't need to know what a deer's butt looks like, because radar would've told it something was there! But they took them away because Musk had the genius idea of only using cameras for whatever reason.

    • AA5B@lemmy.world · 4 hours ago

      I'd go even further and say most driving is an edge case. I used the 30-day trial of Full Self-Driving and the results were eye opening. Not how it did (it was pretty much as expected) but where it went wrong.

      Full self driving did very well in “normal” cases, but I never realized just how much of driving was an “edge” case. Lane markers faded? No road edge but the ditch? Construction? Pothole? Debris? Other car does something they shouldn’t have? Traffic lights not aligned in front of you so it’s not clear what lane? Intersection not aligned so you can’t just go straight across? People intruding? Contradictory signs? Signs covered by tree branches? No sight line when turning?

      After that experiment, it seems like "edge" cases are more common than "normal" cases when driving. Humans just handle them without thinking about it, but the car needs more work here.

    • teft@lemmy.world · 14 hours ago

      I grew up in Maine. Deer in the road isn’t an edge case there. It’s more like a nightly occurrence.

      • snooggums@lemmy.world · 13 hours ago

        Same in Kansas. I was in a car that hit one in the 80s, and I see them often enough that I had to avoid one crossing a busy interstate highway last week.

        Deer are the opposite of an edge case in the majority of the US.

        • leftytighty@slrpnk.net · 12 hours ago

          Putting these valid points aside we’re also all just taking for granted that the software would have properly identified a human under the same circumstances… This could very easily have been a much more chilling outcome

          • snooggums@lemmy.world · 11 hours ago

            I'm not taking that for granted. If it can't tell a solid object is in the road, I would guess that would be true for a human that is balled up or facing away as well.

        • ArxCyberwolf@lemmy.ca · 8 hours ago

          It’s no different in Southern Ontario where I live. Saw a semi truck plow into one, it really wasn’t pretty. Another left a huge dent on my mom’s car when she hit one driving at night.

    • leftytighty@slrpnk.net · 13 hours ago

      Being a run of the mill fascist (rather than those in power) is actually an incredibly submissive position, they just want strong daddies to take care of them and make the bad people go away. It takes courage to be a “snowflake liberal” by comparison

    • NeoNachtwaechter@lemmy.world · 14 hours ago

      Edge cases (NOT features) are the thing that keeps them from reaching higher levels of autonomy. These level differences are like “most circumstances”, “nearly all circumstances”, “really all circumstances”.

      Since Tesla cares so much more about features, they will remain on level 2 for another very long time.

    • bluGill@fedia.io · 14 hours ago

      Deer on the road is an edge case that humans cannot handle well. In general every option other than hitting the deer is overall worse - which is why most insurance companies won’t increase your rates if you hit a deer and file a claim for repairs.

      The only way to not hit/kill hundreds of deer (thousands? I don't know the number) every year is to reduce rural speed limits to unreasonably slow speeds. Deer jump out of dark places right in front of cars all the time. The only options that might avoid a collision are driving into the other lane (which sometimes means into an oncoming car) or into the ditch (you have no clue what might be there; if you are lucky the car just rolls, but there could be large rocks or strong fence posts and the car stops instantly). Note that this all happens fast: you can't think, you only get to react. Drivers in rural areas are taught to hit the brakes and maintain their lane.

      • dhork@lemmy.world · 14 hours ago

        Drivers in rural areas are taught to hit the brakes and maintain their lane.

        Which the Tesla didn’t do. It plowed full speed into the deer, which arguably made the collision much much worse than it could have been. I doubt the thing was programmed to maintain speed into a deer. The more likely alternative is that the FSD couldn’t tell there was a deer there in the first place.

        • SchmidtGenetics@lemmy.world · 13 hours ago

          Braking dips the hood, making it easier for the deer to go into the windshield. You should actually speed up right before hitting it to make your hood rise and hopefully send the deer under the car, or better, keep it in the grill.

          • TimeSquirrel@kbin.melroy.org · 13 hours ago

            Doesn't this all depend on the height of your car and the condition of your shocks? It doesn't seem like a hard and fast rule. Also, you're assuming rear wheel drive. FWD does not "raise the hood" like you're playing Cruis'n USA.

          • troed@fedia.io · 13 hours ago

            Please show me that guideline, anywhere.

            /Swede living in the deer countryside

            • NABDad@lemmy.world · 11 hours ago

              Wear gloves when they hand you that guideline because they might be pulling it out of their ass.

          • dhork@lemmy.world · 13 hours ago

            Maybe, but it’s still the case that slowing down will impart less energy to the collision. Let up on the brake before impact if you want, but you should have been braking once you first saw the deer in the road.

            Sometimes those fuckers just jump out at you at the last minute. They’re not smart. But if you click the link, this one was right in the middle of the road, with that “Deer in the headlights” look. There was plenty of time to slow down before impact.
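On the point about imparting less energy: kinetic energy grows with the square of speed, so shedding even some speed before impact pays off disproportionately. A quick back-of-the-envelope sketch (the 2,000 kg car mass is an assumed round number, not a figure from the thread):

```python
# Kinetic energy of a moving car: KE = 1/2 * m * v^2
# Illustrative assumption: a 2000 kg car; speeds converted from mph to m/s.
MPH_TO_MS = 0.44704
MASS_KG = 2000

def kinetic_energy_kj(speed_mph):
    """Kinetic energy in kilojoules at the given speed."""
    v = speed_mph * MPH_TO_MS
    return 0.5 * MASS_KG * v * v / 1000

for mph in (60, 45, 30):
    print(f"{mph} mph -> {kinetic_energy_kj(mph):.0f} kJ")
```

Because the relationship is quadratic, braking from 60 mph down to 30 mph before impact leaves only a quarter of the collision energy.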

            • SchmidtGenetics@lemmy.world · 13 hours ago

              Conditions matter, and your reaction should always be for the worst possible scenario (moose and snow). Braking removes your ability to maneuver, and locking up the brakes, which will almost always happen when you panic brake, is the worst scenario. If there's snow or rain, braking again is right out.

              If it jumps out and you can't do anything but brake, you shouldn't do that; you grip the wheel and maintain speed, and if you can, punch the gas for the hood raise. But people panic and can't think. So maintain speed, don't panic, and don't lock your brakes up.

              • superkret@feddit.org · 5 hours ago

                In this case, the deer just stood there in the road.
                Any driver and any AI should be able to stop before the obstacle in that case.
                Cause it could be a human, or a fallen tree instead of a deer.
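A rough stopping-distance estimate backs this up: total distance is reaction distance plus braking distance, d = v·t + v²/(2a). The 1.5 s reaction time and 7 m/s² deceleration below are my assumed ballpark values, not numbers from the thread:

```python
# Stopping distance = reaction distance + braking distance:
#   d = v * t_react + v^2 / (2 * decel)
# Illustrative assumptions: 1.5 s reaction time, 7 m/s^2 deceleration.
MPH_TO_MS = 0.44704

def stopping_distance_m(speed_mph, t_react=1.5, decel=7.0):
    """Approximate stopping distance in meters at the given speed."""
    v = speed_mph * MPH_TO_MS
    return v * t_react + v * v / (2 * decel)

for mph in (30, 45, 60):
    print(f"{mph} mph -> {stopping_distance_m(mph):.0f} m")
```

Under these assumptions a car at 60 mph needs on the order of 90 m to stop, so a deer standing still in the open, visible well down the road, is exactly the case where stopping before the obstacle is realistic.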

              • bluGill@fedia.io · 10 hours ago

                You should know how to brake without causing maneuvering problems (including not locking up the wheels); it is a basic skill needed for many situations. Just keep slowing down. Accelerating just before impact is something that can only be done in movies; any real-world attempt will be worse. Remember, if you keep braking you lose momentum, so the acceleration needs to be perfectly timed or it makes things worse.

              • criitz@reddthat.com · 12 hours ago

                I don’t think hitting more gas is going to gently slide the 300 pound buck under my car. It’s just going to increase the impact force.

                • BakerBagel@midwest.social · 11 hours ago

                  Sliding the deer under your car is also really bad for you. It's going to do a lot of damage under there, such as ripping brake lines, destroying ball joints, or fragging your differentials. You need to safely shed as much speed as possible while maintaining your lane when about to hit a deer.

                • 0x0@programming.dev · 10 hours ago

                  Considering suspension, if you accelerate there’s a lowering of the back of the car/raising of the front.

                  Conversely, braking has the opposite effect, increasing the chances of the deer rolling over your hood and through your windshield.

                  You’ll want to minimize that, hence the acceleration.

            • SchmidtGenetics@lemmy.world · 13 hours ago

              "Right before hitting" being the key words. If you can stop before hitting it, yes, that's ideal, but there are situations where it jumps out and you can't react. Braking during impact is the worst thing you can do.

              If you think I'm saying to line it up and accelerate for 200 meters, I don't know what to say about that.

              • Aatube@kbin.melroy.org · 8 hours ago

                Dude, the article just said to hit the brakes “if you can’t avoid hitting a deer”, the exact scenario you described… Did you even open it?

            • bluGill@fedia.io · 10 hours ago

              I don't know; where I live, giraffes are only in the zoo and thus never on the road. I'm not aware of any escaping the zoo.

              I'm sure if I lived around wild deer, my training would include that, but since I don't, I was able to save some time by not learning it.

                • bluGill@fedia.io · 2 hours ago

                  I've never been in a zoo I'm allowed to drive more than a wheelchair through. They may require extra training; I wouldn't know.

            • SchmidtGenetics@lemmy.world · 13 hours ago

              Same for a moose? Speed up so you clear it before gravity caves in your car roof.

              You maintain speed; you can't maneuver well while braking, and as stated, your hood dips under braking too, which can cause worse issues.

              • GreyEyedGhost@lemmy.ca · 9 hours ago

The whole premise of ABS, which all cars made in North America since 2012 have, is specifically to allow you to maintain control when you fully apply the brakes. Unless you are a professional driver or have a car without ABS, you should just fully apply the brakes in an emergency stop. Please stop telling people that fully applying the brakes will reduce maneuverability when it won’t for the majority of drivers in the developed world.

And if someone’s vehicle doesn’t have ABS, they should know how to brake properly without locking their tires, and when fully applying them won’t be appropriate.

              • Aphelion@lemm.ee
                link
                fedilink
                English
                arrow-up
                8
                ·
                edit-2
                10 hours ago

That’s a good strategy to ensure you die: a moose’s torso is already higher than the hood of a lot of SUVs, so you’re taking a moose to the face.

              • Slowy@lemmy.world
                link
                fedilink
                English
                arrow-up
                9
                arrow-down
                1
                ·
                13 hours ago

                No, for moose you are actually supposed to swerve and risk the ditch.

      • Hubi@feddit.org
        link
        fedilink
        English
        arrow-up
        28
        arrow-down
        1
        ·
        14 hours ago

        The problem is not that the deer was hit, a human driver may have done so as well. The actual issue is that the car didn’t do anything to avoid hitting it. It didn’t even register that the deer was there and, what’s even worse, that there was an accident. It just continued on as if nothing happened.

        • snooggums@lemmy.world
          link
          fedilink
          English
          arrow-up
          11
          ·
          13 hours ago

          Yeah, the automated system should be better than a human. That is the whole point of collision detection systems!

          • AA5B@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            4 hours ago

Right. I was trying to decide whether to mention that deer can be hard to spot in time. Even in the middle of the road like this, they’re non-reflective and there may be no movement to catch the eye. It’s very possible for a human to be zoning out and not notice this deer in time.

But yeah, this is where we need the car to help. This is what the car should be better than a human at. This is what would make AI a good tool to improve safety, if it actually saw the deer.

        • snooggums@lemmy.world
          link
          fedilink
          English
          arrow-up
          11
          ·
          13 hours ago

If Tesla also used radar or other sensing systems instead of limiting itself to cameras alone, then being in the dark wouldn’t be an issue.

      • IchNichtenLichten@lemmy.world
        link
        fedilink
        English
        arrow-up
        8
        arrow-down
        1
        ·
        11 hours ago

> Deer on the road is an edge case that humans cannot handle well.

If I’m driving at dawn or dusk, when they’re moving around in low light, I’m extra careful. I’m scanning the treeline, the sides of the road, the median, etc., because I know there’s a decent chance I’ll see them, and I can slow down in case they make a run across the road. So far I’ve seen several hundred deer and I haven’t hit any of them.

        Tesla makes absolutely no provision in this regard.

This whole FSD thing is a massive failure of oversight; no car should be doing self-driving without using both cameras and radar, and Tesla should be forced to refund the ~~suckers~~ customers who paid for this feature.

        • bluGill@fedia.io
          link
          fedilink
          arrow-up
          2
          ·
          11 hours ago

Sure, I do that too. I also have had damage because a deer I didn’t see jumped out of the trees onto the road. (Though as others pointed out, in this case the deer was on the road with plenty of time to stop, or at least greatly slow down, but the Tesla did nothing.)

      • 0x0@programming.dev
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        7
        ·
        13 hours ago

> In general every option other than hitting the deer is overall worse

        You’re wrong. The clear solution here is to open suicide-prevention clinics for the depressed deer.

    • tacosanonymous@lemm.ee
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      1
      ·
      10 hours ago

      Sunk cost? Tech worship?

      I’m so jaded, I question my wife when she says the sun will rise tomorrow so I really don’t get it either.

  • Nytixus@kbin.melroy.org
    link
    fedilink
    arrow-up
    8
    arrow-down
    1
    ·
    9 hours ago

I roll my eyes at the dishonest, bad-faith takes people have in the comments about how people do the same thing behind the wheel. Like that’s going to make autopiloting self-driving cars an exception. At least a person can react, can slow down, or do anything that an unthinking, going-by-the-pixels computer can’t do at a whim.

    • Lets_Eat_Grandma@lemm.ee
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      3
      ·
      6 hours ago

      How come human drivers have more fatalities and injuries per mile driven?

      Musk can die in a fire, but self driving car tech seems to be vastly safer than human drivers when you do apples to apples comparisons. It’s like wearing a seatbelt, you certainly don’t need to have one to go from point A to point B, but you’re definitely safer with it - even if you are giving up a little control. Like a seatbelt, you can always take it off.

      • Semi-Hemi-Lemmygod@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        1
        ·
        6 hours ago

I honestly think it shouldn’t be called “self driving” or “autopilot” but should work more like the safety systems in Airbuses, by simply not allowing the human to make a decision that would create a dangerous situation.