THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes — sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed Congress's upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), along with Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • Corkyskog@sh.itjust.works · 4 months ago

    I guess it could theoretically drive down the cost of amateur and low budget film and animation work.

    • kronisk @lemmy.world · 4 months ago

      Making it harder for animators and illustrators to make a living outweighs the reality that every woman on earth now has to fear someone making revenge porn with their likeness?

      • Corkyskog@sh.itjust.works · 4 months ago

        Just saying what would be lost, not that it outweighs it. It’s not like there aren’t other ways to keep that while still banning sexually explicit material.