A team of researchers from prominent universities – including SUNY Buffalo, Iowa State, UNC Charlotte, and Purdue – was able to turn an autonomous vehicle (AV) running the open-source Apollo driving platform from Chinese web giant Baidu into a deadly weapon by tricking its multi-sensor fusion system, and suggests the attack could be applied to other self-driving cars.

  • @[email protected]
    4
    5 months ago

    You don’t even have to rig a bomb, a better analogy to the sensor spoofing would be to just shine a sufficiently bright light in the driver’s eyes from the opposite side of the road. Things will go sideways real quick.

    • @[email protected]
2
5 months ago

It’s not meant to be a perfect example, just a comparable principle. Subverting the self-driving system like that is more or less equivalent to any other means of attempting to kill someone with their own car.

      • @[email protected]
4
5 months ago

I don’t disagree; I’m simply trying to present a somewhat less extreme (and therefore, I think, more appealing) version of your argument.