  • NochMehrG@feddit.de
    1 year ago

And it’s basically ended already. At least for ordinary people without an IT forensics team, the best advice is to be very sceptical of images and videos, more or less so depending on the source.

    • apemint@kbin.social
      1 year ago

Photoshop has been around for over a quarter of a century, but you don’t need a forensics team to tell when something has been photoshopped.
      Tools to detect image (and video) manipulation have been around for just as long and will continue to be developed alongside these technologies. We’re simply entering a new era of media creation.

When Photoshop became mainstream, people said exactly the same thing, yet somehow the world didn’t turn on its head.

      • IWantToFuckSpez@kbin.social
        1 year ago

Photoshop still required skill to create a convincing fake. With AI, it’s far easier for anyone without artistic skill to make deepfakes. Sure, there are tools to detect these fakes, but making a deepfake will only get easier, so social media feeds will be flooded with so many fakes that the damage will be done before they can be debunked. Take the altered video of Pelosi that made her sound drunk and went viral in the right-wing sphere; that will happen exponentially more often in the future.