• Chozo@fedia.io
      5 months ago

      I used to work on the software for these cars, so I can speak to this a little. For what it’s worth, I’m no longer with the project, so I have no reason to be sucking Google’s dick if these weren’t my honest opinions on the tech being used here. None of this is to excuse or defend Google, just sharing my insight on how these cars operate based on my experiences with them.

      Waymo’s cars actually do a really good job at self-navigation. Like, sometimes it’s scary how good they are when you see the conditions they can operate under. There are so many layers of redundancy that you could lose all of the camera feeds, GPS, and cellular data, and they’d still be able to navigate themselves through traffic on the LIDAR sensors alone. Hell, even if you removed the LIDAR from that scenario, those cars still accurately know their location by dead reckoning: the last known position combined with how many times each tire has revolved (it’d run into everything along the way, but at least it’d know where it was the entire time). All of the other sensors and data points collected by the cars actually end up making GPS the least accurate sensor on the car.
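To make the dead-reckoning idea concrete, here's a minimal illustrative sketch (not Waymo's actual code; the function, circumference value, and units are all hypothetical) of advancing a last-known position from a wheel revolution count:

```python
import math

# Hypothetical sketch of dead reckoning from wheel revolutions, as
# described above. The circumference value is an assumption.
WHEEL_CIRCUMFERENCE_M = 2.0  # assumed tire circumference in meters


def dead_reckon(x, y, heading_rad, wheel_revs):
    """Advance a last-known (x, y) position by the distance implied by
    a count of wheel revolutions along the current heading."""
    distance = wheel_revs * WHEEL_CIRCUMFERENCE_M
    return (x + distance * math.cos(heading_rad),
            y + distance * math.sin(heading_rad))


# Starting at the origin heading due east, 10 revolutions ~= 20 m traveled.
print(dead_reckon(0.0, 0.0, 0.0, 10))  # → (20.0, 0.0)
```

In practice this drifts over time (wheel slip, tire wear), which is why it's a last-resort fallback layered under the other sensors rather than a primary localization source.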

      That said, the article mentions that it was due to “inconsistent construction signage”, which I’d assume to be pretty accurate based on my own experience with these cars. Waymo’s cars are usually really good at detecting cone placements and determining where traffic is being rerouted to. But… that’s generally only when the cones are where they’re supposed to be. I’ve seen enough roadwork in Phoenix to know that sometimes Mad Max rules get applied, and even I wouldn’t know how to drive through some of those work zones. It was pretty rare that I’d have to remotely take over an SDC, but 9 times out of 10 when I did, it was because of construction signs/equipment being in weird places, and I’d have to K-turn the car back the way it came.

      That’s not to say that construction consistently causes the cars to get stuck, but I’d say it was one of the more common pain points. In theory, if somebody were to run over a cone and nobody picks it back up, an SDC might not interpret that obstruction properly, and can make a dumb decision like going down the wrong lane, under the incorrect assumption that traffic has been temporarily rerouted that way. It sounds scary, and probably looks scary as hell if you saw it on the street, but even then it’s going to stop itself before coming anywhere near an oncoming car, even if it thinks it has right of way, since proximity to other objects takes priority over temporary signage.

      The “driving through a red light” part I’m assuming might actually be inaccurate. Cops do lie, after all. I 100% believe in a Waymo car going down the opposing lane after some sketchy road cones, but I have a hard time buying that it ran a red light, since they will not go if they don’t detect a green light. Passing through an intersection requires a positive detection of a green light; whether or not it detects a red doesn’t matter. It has to see a green light for its intended lane, or it will assume it has to stop at the line.
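The "positive green detection" rule described above can be sketched in a few lines. This is purely illustrative (the function name and state strings are my own), but it captures the key asymmetry: an unknown or occluded light is treated the same as red.

```python
# Hypothetical sketch of the rule described above: the car proceeds
# only on a positively detected green for its lane; failing to detect
# the light at all defaults to stopping at the line.
def may_enter_intersection(light_state):
    """Return True only when a green light is positively detected."""
    return light_state == "green"


assert may_enter_intersection("green") is True
assert may_enter_intersection("red") is False
assert may_enter_intersection("unknown") is False  # no detection -> stop
```

Defaulting the unknown case to "stop" is the fail-safe choice: a missed green costs time, while a missed red costs lives.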

      In the video, the cop says he turns on his lights and the SDC blows through a red light. While I was working there, red light violations were so rare that literally 100% of the ones we received happened while a human was driving the car in manual mode. What I’d assume was likely going on is that the SDC was already in a state of “owning” the intersection for an unprotected left turn when the lights came on. When an SDC thinks it’s being pulled over, it’s going to go through its “pullover” process, which first requires exiting an intersection if it’s currently in one. So what likely happened is: the SDC was already in the intersection preparing for a left turn, the light turned red while the SDC was in the box (where it still legally had right of way), the cop turned on the sirens, and the SDC proceeded “forward” through the intersection until it was able to pull over.
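The pullover sequence described above can be summarized as a tiny ordered plan. This is a hypothetical sketch of the behavior as described, not Waymo's implementation; all names are illustrative.

```python
# Hypothetical sketch of the "pullover" sequence described above: a car
# already inside the intersection first clears the box, then pulls over.
def pullover_plan(in_intersection):
    """Return the ordered steps an SDC would take after detecting a
    police stop, per the behavior described in the comment."""
    steps = []
    if in_intersection:
        steps.append("exit_intersection")  # finish clearing the box first
    steps.append("pull_to_curb")
    return steps


print(pullover_plan(True))   # → ['exit_intersection', 'pull_to_curb']
print(pullover_plan(False))  # → ['pull_to_curb']
```

From the curb, that first step is easy to misread as "running the light," even though stopping inside the box would be the more dangerous choice.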

      But, that’s just my speculation based on my somewhat outdated understanding of the software behind these cars. I’d love to see the video of it, but I doubt Waymo will release it unless there’s a lawsuit.

      • Prison Mike@links.hackliberty.org
        5 months ago

        The red light bit seems spot on. In every article stating “it blew through a red light” there’s always the caveat that it’s just trying to clear the intersection while getting pulled over. Technically people are allowed to do that (and/or move to a safer area, such as getting into the right lane when being pulled over in the left lane).

        I think media like to add the intersection stuff to rile people up.

      • rhythmisaprancer@moist.catsweat.com
        5 months ago

        This is pretty interesting to read, thanks! I would think that Waymo employs an abundance of visual sensors that could give us an idea of what happened, if they chose to release the footage. Construction zones can be hard; maybe they need to own this one?

      • bitwaba@lemmy.world
        5 months ago

        If you listen to the video of the interaction between the police officer and the two Waymo guys, it’s clear to me he’s not making anything up about the events that took place. The car did run through the intersection when he turned on his lights. He’s not trying to issue tickets or anything - he really is interacting with the Waymo people to let them know “your car was behaving erratically. It needs to be off the road.” It’s very possible the road construction uncertainty, plus being in an oncoming traffic lane, plus being lit up by the police triggered some very specific failure of process in the code.

      • Doubletwist@lemmy.world
        5 months ago

        So I’ve been in situations where I was stopped at a red light, and emergency vehicles were coming and I was waved by a policeman to cross the intersection against the red light to clear the way.

        So what, is a self driving car going to just sit there and keep the intersection blocked?

        • Chozo@fedia.io
          5 months ago

          (I’m assuming we’re talking about unprotected left turns.)

          I don’t know if I ever saw it happen, myself, so I can’t say for certain. My understanding of the SDC’s logic is that if it was already in the intersection, it would complete the turn, and then pull off to the right shoulder to let the emergency vehicle pass. If it hasn’t yet entered the intersection and detects siren lights behind it, I believe it will turn on the hazard lights and remain stationary unless honked at (I could be mistaken, but I think it’ll recognize being honked at by emergency vehicles, and will assume it to mean “move forward and clear a path”). The SDCs have an array of microphones around the car to detect honks, sirens, nearby crashes, etc, and can tell the direction the sounds are coming from for this purpose.
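The decision logic described in this paragraph can be sketched as a small dispatch function. This is my reading of the comment, not Waymo's code; every name and state string here is hypothetical.

```python
# Hypothetical sketch of the emergency-vehicle response described above:
# finish clearing the intersection if already in it; otherwise stay put
# with hazards on unless honked at by the emergency vehicle.
def response_to_sirens(in_intersection, honked_at):
    """Return the SDC's response when siren lights are detected behind it."""
    if in_intersection:
        return "complete_turn_then_pull_right"
    if honked_at:
        return "move_forward_to_clear_path"
    return "hazards_on_remain_stationary"


print(response_to_sirens(in_intersection=False, honked_at=True))
# → move_forward_to_clear_path
```

Note the same "clear the intersection first" step shows up here as in the pullover case; staying stationary is the default, and the honk acts as an explicit override signal.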

          That said, because it’s listening for sirens, the SDC will usually be aware that there’s an emergency vehicle heading toward it well ahead of time, and if they’ve got their lights on, the SDC will usually be able to determine which vehicle, specifically, is the emergency vehicle, so it can monitor its trajectory and make sure it’s staying out of the way when possible. Typically, they will be proactive about steering clear of anything with lights/sirens running.

          This would also be considered a higher-priority event, and usually it will automatically ping a live human to remotely monitor the situation. Depending on the specific context, they may command the SDC to remain stationary, proceed forward, make a U-turn, or whatever else may be necessary. In case the emergency vehicle has a loudspeaker, we’d be able to hear any requests they’re making of us, as well.

          For what it’s worth, I know that Waymo also works pretty closely with the Phoenix PD, and provides them with updates about any significant changes to the cars’ behaviors, plus tips/tricks for dealing with a stuck car in an emergency situation, so if a situation got particularly sticky, the cops would know how to work around it. My understanding is that Phoenix PD has generally been very cooperative, though they’ve apparently had issues with state troopers who don’t seem to care to learn how to deal with the cars.

    • OsrsNeedsF2P@lemmy.ml
      5 months ago

      It did recognize the patterns, but the construction signs were (allegedly) inconsistent.

      Also, not that your comment was alluding otherwise, but self-driving cars only use AI for recognition. The decision-making is done by deterministic algorithms.
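The perception/planning split mentioned above can be illustrated with a toy example. This is not how any real autonomy stack is structured internally; the detection format and thresholds are made up purely to show the idea: the ML model only produces labeled detections, and a rule-based planner makes the actual driving decision.

```python
# Hypothetical illustration of the split described above: recognition
# (normally an ML model) yields labeled detections, and a deterministic,
# rule-based planner decides what to do with them.
def plan(detections):
    """Deterministic decision over already-recognized objects.
    Each detection is a dict like {"type": ..., "distance_m": ...}."""
    if any(d["type"] == "pedestrian" and d["distance_m"] < 10
           for d in detections):
        return "stop"
    return "proceed"


print(plan([{"type": "pedestrian", "distance_m": 5}]))  # → stop
print(plan([{"type": "cone", "distance_m": 5}]))        # → proceed
```

Keeping the decision layer deterministic means the same detections always produce the same behavior, which makes incidents like this one auditable after the fact.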