A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the difficulty of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

  • @John_McMurray@lemmy.world · 45 · 9 months ago (edited)

    Imagine how dumb, in addition to deranged, these people would have to be to look for child porn on a basically legitimate website. Misleading headline too, it didn’t stop anything, it just told them “Not here”

    • @theherk@lemmy.world · 12 · 9 months ago

      Until a few years ago, when they finally stopped allowing unmoderated, user-uploaded content, they had a ton of very problematic videos. And they were roasted about it in public for years, including by many who were the unconsenting, sometimes underage subjects of those videos, and they did nothing. Good that they finally did, but they trained users for years that it was a place to find that content.

        • @theherk@lemmy.world · 1 · 9 months ago

          You know, you could easily say some dumb shit like that to somebody whose daughter wound up fighting for a long time to get herself removed from the site. ¯\_(ツ)_/¯

            • r3df0x ✡️✝☪️A · 2 · 9 months ago

              Pornhub left up underage child rape videos until they were very publicly called out for it.

              Porn is also a method of bourgeois oppression. The corporate elites want you to be an easily controlled consoomer.

            • @VirtualOdour@sh.itjust.works · 0 · 9 months ago

              You’re wasting your time. They’re posting on Lemmy, where it’s not even possible to remove a picture you posted, let alone one of you posted by someone else. The fact that they’re still mad Pornhub had a similar problem and solved it effectively makes it pretty obvious they’re looking for an excuse for an ideological crusade against people they’ve already decided to hate.

            • @theherk@lemmy.world · 1 · 9 months ago

              What did I say that was dumb? I said “until a few years ago”, and that is true. And I have firsthand experience with the trouble they wouldn’t go through to deal with it. To imply that I’m just choking down what the government is selling is simply not reasonable.

                • @theherk@lemmy.world · 1 · 9 months ago

                  The person to whom I was responding said:

                  yeah I believe everything the government says through the media too.

                  I’m not saying you accused me of the same. I don’t know what credibility I need, nor do I fully understand how I lost it. I am happy to hear the person you know had a good experience, truly, and I hope that is the case for most.

                    • @Breezy@lemmy.world · -3 · 9 months ago

                    Your answer is so full of shit, I think my left eye rolled too far back and now it’s stuck.

    • @abhibeckert@lemmy.world · 19 · 9 months ago

      We have culturally drawn a line in the sand where one side is legal and the other side of the line is illegal.

      Of course the real world isn’t like that - there’s a range of material available and a lot of it is pretty close to being abusive material, while still being perfectly legal because it falls on the right side of someone’s date of birth.

      It sounds like this initiative by Pornhub’s chatbot successfully pushes people away from borderline content… I’m not sure I buy that… but if it’s directing some of those users to support services, then that’s a good thing. I worry, though, that some people might instead be pushed over to the dark web.

      • @John_McMurray@lemmy.world · 13 · 9 months ago

        Yeah…I forgot that the UK classifies some activities between consenting adults as “abusive”, and it seems some people are now using that definition in the real world.

        • @Scirocco@lemm.ee · 2 · 9 months ago

          Facesitting porn (of adults) is illegal in the UK on the grounds that it’s potentially dangerous.

          • Quicky · 5 · 9 months ago (edited)

            Which led to some amazing protests.

            Weirdly, watching facesitting porn in the UK is perfectly fine, as long as it wasn’t filmed in the UK.

            I can just imagine trying to defend that in court. “Your honour, it’s clear to me that the muffled moans of the face-sittee are those of a Frenchman”

    • @A_Random_Idiot@lemmy.world · 14 · 9 months ago (edited)

      I mean, is it dumb?

      Didn’t Pornhub face a massive lawsuit or something because of the amount of unmoderated child porn that was hidden in its bowels by uploaders (in addition to videos of rape victims, revenge porn, etc.), to the point that they apparently only allow verified uploaders now and purged a huge swath of their videos?

    • r3df0x ✡️✝☪️A · -2 · 9 months ago

      Pornhub also knowingly hosted child porn. The game Ready or Not put them on blast for it with a mission where you raid a company called “Mindjot” for distributing child porn.