cross-posted from: https://lemmy.world/post/20664372

X (formerly Twitter) claims that non-consensual nudity is not tolerated on its platform. But a recent study shows that X is more likely to quickly remove this harmful content—sometimes known as revenge porn or non-consensual intimate imagery (NCII)—if victims flag content through a Digital Millennium Copyright Act (DMCA) takedown rather than using X’s mechanism for reporting NCII.

  • Peruvian_Skies@sh.itjust.works · 2 months ago

    In other words, the consent of a corporation is more important than the consent of a human being… for the public distribution of that human being’s likeness in an intimate context. Holy dystopia, conservatives are fucked in the head.

  • Th4tGuyII@fedia.io · 2 months ago

    Xitter might as well call it the “Maybes and Conditions” with how much they cherry-pick their T&Cs nowadays.

  • roofuskit@lemmy.world · 2 months ago

    Everyone should just share the image of Elon before his hair plugs and watch how fast it gets taken down.

  • celsiustimeline@lemmy.dbzer0.com · 2 months ago

    Trash headline. The study found that MORE takedown requests were fulfilled when sent as DMCA takedowns than through Twitter’s (probably non-functioning) NCII reporting mechanism. Nowhere does it say they will only take down revenge porn if you send them a DMCA.

    • KairuByte@lemmy.dbzer0.com · 2 months ago

      I’m not sure how you can miss the point while simultaneously spelling it out. Quite the feat.