A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the complexity of measuring impact, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

  • @Blackmist@feddit.uk · 39 points · 4 months ago

    Did it? Or did it make them look elsewhere?

    The amount of school uniform, braces, pigtails and step-sister porn on Pornhub makes me think they want the nonces to watch.

      • @PM_Your_Nudes_Please@lemmy.world · 1 point · 4 months ago

        And what days were those? Cuz you pretty much need to go all the way back to pre-internet days. Hell, even that isn’t far enough, cuz Playboy’s youngest model was like 12 at one point.

        • The Snark Urge · 1 point · 4 months ago

          Depressing, isn’t it? I was more talking about how prevalent “fauxcest” has become in porn more recently. I guess that’s just my cross to bear as an only child 💅

        • femtech · 1 point · 4 months ago

          Wtf? For real? Was CP not federally illegal when they did that?

      • @michaelmrose@lemmy.world · 1 point · 4 months ago (edited)

        Reasonable adult sites don’t return obviously sketchy things for reasonable queries. E.g. you don’t search “boobs” and get 12-year-olds.

    • @EdibleFriend@lemmy.world · 8 points · 4 months ago (edited)

      given the amount of extremely edgy content already on Pornhub, this is kinda sus

      Yeah… I am honestly curious what these search terms were and how many of them were ACTUALLY looking for CP. And of those… how many are now flagged somehow?

      • @Arsonistic@lemmy.ml · 2 points · 4 months ago

        I know I got the warning when I searched for young gymnast or something like that cuz I was trying to find a specific video I had seen before. False positives can be annoying, but that’s the only time I’ve ever encountered it.

  • @Kusimulkku@lemm.ee · 15 points · 4 months ago

    I was wondering what sort of phrases get that notification, but mentioning that might be a bit counterproductive

    • @Thorny_Insight@lemm.ee · 8 points · 4 months ago

      I’m not sure if it’s related but as a life-long miniskirt lover I’ve noticed that many sites no longer return results for the term “schoolgirl” and instead you need to search for a “student”

    • @Squire1039@lemm.ee (OP) · 3 points · 4 months ago

      ML models have been shown to be extraordinarily good at statistically predicting your words. The words covered are probably comprehensive.

      • @Kusimulkku@lemm.ee · 13 points · 4 months ago

        I think the other article talks about it being a manually curated list because while ML can get correct words it also gets random stuff, so you need to check it isn’t making spurious connections. It’s pretty interesting how it all works

    • Hyperreality · 1 point · 4 months ago

      Obviously don’t google this, but IIRC one of the terms used was lemon party.

        • Beardedsausag3 · 2 points · 4 months ago

          Lemon party was a bunch of old naked dudes sat in a group, I think… Might’ve been involving themselves with each other? It’s been a fucking loooong ass time since I got shown that and meatspin at school lol

      • ShadowRam · 1 point · 4 months ago

        hahaha… it saddens me that only those >30yrs old may get this.

        • jaycifer · 1 point · 4 months ago

          Hey now, I understood that reference and I’m… only… 27.

          30 years draws ever nearer.

    • @Bgugi@lemmy.world · 2 points · 4 months ago

      Aylo maintains a list of more than 28,000 banned terms in multiple languages, which is constantly being updated.

      I’d be very curious what these terms are, but I wouldn’t be surprised if “pizza guy” or “school uniform” would trigger a response.
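      Purely as an illustration (not Aylo’s actual implementation, which isn’t public), a manually curated term list like that could be applied with a check along these lines; the terms, normalization, and matching rules below are all made up:

      ```python
      # Hypothetical sketch of a curated term-list check. The placeholder terms,
      # normalization, and matching rules are invented for illustration; the real
      # ~28,000-term multilingual list and its matching logic are not public.

      BANNED_TERMS = {"example banned phrase", "another flagged term"}  # placeholders

      def normalize(query: str) -> str:
          """Lowercase and collapse whitespace so trivial variations still match."""
          return " ".join(query.lower().split())

      def should_intervene(query: str) -> bool:
          """True if the search should show the deterrence warning and chatbot."""
          q = normalize(query)
          # Flag banned phrases whether they are the whole query or embedded in it.
          return any(term in q for term in BANNED_TERMS)

      if should_intervene("Example   BANNED  phrase videos"):
          print("Suppress results; show warning and support-service chatbot.")
      ```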

  • @Gakomi@lemmy.world · 1 point · 4 months ago

    To be fair, people are dumb as fuck. Don’t search for illegal things on Google or any well-known site, cause that’s how you end up on some watch list.

  • @FinishingDutch@lemmy.world · 88 points · 4 months ago

    Sounds like a good feature. Anything that stops people from doing that is great.

    But I do have to wonder… were people really expecting to find that content on PornHub? That site certainly seems legit enough that I doubt they’d have that stuff on there. I’d imagine most actual content would be on the dark web and specialty groups, not on PH.

    • @silasmariner@programming.dev · 2 points · 4 months ago

      were people really expecting to find that content on PornHub?

      Welcome to the internet 😂 where people constantly disappoint/surprise you (what word is that? Dissurprise? Disurprint?)

    • @CameronDev@programming.dev · 71 points · 4 months ago

      PH had a pretty big problem with CSAM a few years ago; they ended up wiping ~2/3rds of their user-submitted content to try to fix it. (Note: they wiped all non-verified user-submitted videos; not all of it was CSAM.)

      And I’m guessing they are trying to catch users who are trending towards questionable material. "College"✅ -> "Teen"⚠️ -> "Young Teen"⚠️⚠️⚠️ -> "CSAM"🚔 etc.
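      (Purely my speculation, but here is a toy sketch of what that kind of session-level escalation tracking could look like; the terms, severity scores, and threshold are invented, not anything PH/Aylo has described:)

      ```python
      # Toy sketch of the escalation idea guessed at above. Severity ratings,
      # terms, and the intervention threshold are hypothetical placeholders.

      SEVERITY = {"college": 0, "teen": 1, "young teen": 2}  # hypothetical ratings
      INTERVENE_AT = 2  # hypothetical threshold for showing the warning/chatbot

      def should_intervene(session_queries: list[str]) -> bool:
          """True once any query in the session reaches the intervention threshold."""
          worst = 0
          for q in session_queries:
              worst = max(worst, SEVERITY.get(q.lower().strip(), 0))
              if worst >= INTERVENE_AT:
                  return True  # escalate to warning/chatbot for this session
          return False

      print(should_intervene(["College", "Teen"]))        # False
      print(should_intervene(["College", "Young Teen"]))  # True
      ```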

      • @FinishingDutch@lemmy.world · 18 points · 4 months ago

        Wow, that bad? I was aware they purged a lot of ‘amateur’ content over concerns regarding consent to upload/revenge porn, but I didn’t know it was that much.

          • @azertyfun@sh.itjust.works · 17 points · 4 months ago

            Eeeeeeeh. There’s nuance.

            IIRC there were only a handful of verified CSAM videos on the entire website. It’s inevitable, it happens everywhere with UGC, including on here. Anecdotally, in the years leading up to the purge PH had already cleaned up its act and from what I saw pirated content was rather well moderated. However this time the media made a huge stink about the alleged CSAM, payment processors threatened to pull out (they are notoriously very puritan, it’s caused a lot of trouble to lemmynsfw’s admins for instance) and so regardless of the validity of the initial claims PH had to do something to gain back the trust of payment processors, so they basically nuked every video that did not have a government ID attached.

            Now if I may speculate a little, one of the reasons it happened this way is probably that due to its industry position PH is way better moderated than most (if not all) websites of their size and already had verified a bunch of its creators. At the same time the rise of OnlyFans and similar websites means that real amateur content has all but disappeared so there was less and less reason to allow random UGC anyway. So the high moderation costs probably didn’t make much sense anymore anyway.

            • @root@precious.net · 10 points · 4 months ago

              Spot on. The availability of CSAM was overblown by a well-funded special interest group (Exodus Cry). The articles about it were pretty much ghostwritten by them.

              When you’re the biggest company in porn you’ve got a target on your back. In my opinion they removed all user content to avoid even the appearance of supporting CSAM, not because they were guilty of anything.

              PornHub has been very open about normalizing healthy sexuality for years, while also providing interesting data access for both scientists and the general public.

              “Exodus Cry is an American Christian non-profit advocacy organization seeking the abolition of the legal commercial sex industry, including pornography, strip clubs, and sex work, as well as illegal sex trafficking.[2] It has been described by the New York Daily News,[3] TheWrap,[4] and others as anti-LGBT, with ties to the anti-abortion movement.[5]”

              https://en.wikipedia.org/wiki/Exodus_Cry

              • @azertyfun@sh.itjust.works · 6 points · 4 months ago

                They’re the fuckers who almost turned OF into Pinterest as well? Not surprising in retrospect. The crazy thing is how all news outlets ran with the narrative and payment processors are so flaky with adult content. De-platforming sex work shouldn’t be this easy.

    • Ace! _SL/S · 6 points · 4 months ago

      It had all sorts of illegal things before they purged everyone unverified due to legal pressure

    • 520 · 1 point · 4 months ago

      So… Pornhub has actually had problems with CSAM. It used to be much more of a YouTube-like platform where anyone could upload.

      Even without that aspect, there are a looot of producers that don’t do their checks well and a lot of underage actresses that fall through the cracks

  • @ocassionallyaduck@lemmy.world · 25 points · 4 months ago

    This is one of the more horrifying features of the future of generative AI.

    There is literally no stopping it at this stage: AI generated CSAM will be possible soon thanks to systems like SORA.

    This is disgusting and awful. But one part of me hopes it can end the black market of real CSAM content forever. By flooding it with infinite fakes, users with that sickness can look at something that didn’t come from a real child’s suffering. It’s the darkest of silver linings I think, but I spoke with many sexual abuse survivors who feel the same about the loli hentai in Japan, in that it could be an outlet for these individuals instead of them finding their own.

    Dark topics. But I hope to see more actions like this in the future. If pedos can self isolate from IRL interactions and curb their ways with content that harms no one, then everyone wins.

    • @yamanii@lemmy.world · 6 points · 4 months ago

      What do you mean soon? Local models from Civitai have been able to generate CSAM for at least 2 years. I don’t think it’s possible to stop it unless the model creator does something to prevent it from generating naked people in general, like the neutered SDXL.

      • @ocassionallyaduck@lemmy.world · 1 point · 4 months ago

        True. For obvious reasons I haven’t looked too deeply down that rabbit hole because RIP my search history, but I kind of assumed it would be soon. I’m thinking more specifically about models like SORA though. Where you could feed it enough input, then type a sentence to get video content. That is going to be a different level of darkness.

    • Zorque · 1 point · 4 months ago

      Are… we looking at the same article? This isn’t about AI generated CSAM, it’s about redirecting those who are searching for CSAM to support services.

      • @ocassionallyaduck@lemmy.world · 1 point · 4 months ago

        Yes, but this is more about mitigating the spread of CSAM. And my feeling is it’s going to become somewhat impossible soon. AI-generated porn is starting to flood the market, and this chatbot is also one of those “smart” attempts to mitigate this behavior. I’m saying that very soon it will be something users don’t have to go anywhere to get if the model can just fabricate it out of thin air, so the chatbot mitigation is only temporary, and the dark web of actual CSAM material will become overwhelmed and swamped by artificially generated tidal waves of artificial CP. So it’s an alarming ethical dilemma we are on the horizon of that we need to think about.

    • @gapbetweenus@feddit.de · 35 points · 4 months ago (edited)

      The question is whether consuming AI CP helps to regulate a pedophile’s behavior or whether it enables a progression of the condition. As far as I know that is an unanswered question.

        • @gapbetweenus@feddit.de · 11 points · 4 months ago

          For porn in general, yes - I think the data is rather clear. But for cp or related substitute content it’s not that definitive (to my knowledge), be it just for the reason that it’s really difficult to collect data on that sensitive topic.

          • @Asafum@feddit.nl · 10 points · 4 months ago

            Why would it be any different? If it’s about sexual gratification by their chosen media then I’d imagine it wouldn’t matter what the subject was, but obviously it’s always necessary to get actual data to know for sure.

            • @gapbetweenus@feddit.de · 1 point · 4 months ago

              Why would it be any different?

              Because pedophiles display pathological deviation when it comes to sexual attraction.

            • @Baahb@lemmy.world · 4 points · 4 months ago

              I think you’re making assumptions that aren’t fair but maybe aren’t obvious either. Honestly I’m only thinking about this because I just watched the contrapoints video on twilight, and so I’ll use her example, though she’s talking about a slightly different topic. Gonna paraphrase like a mofo:

              Weird power dynamics between partners in a fantasy, like Twilight, or say porn since we are being obvious here, are normal because self-image often requires women to present one way while hiding their desires for sex. It’s absolution of a sort, and is ostensibly healthy to explore in this way. That said… some examples, such as race play in fantasies, may dehumanize the “other” in super not cool ways and reinforce negative racial stereotypes.

              If we take that and extend it to pedophiles, normalization of the thought process leading to that sort of dysfunction/disorder seems like a not-great thing, but yeah, we’d need to study it to learn more, and that seems both difficult and likely undesirable for the researchers.

            • @cumming_normi@yiffit.net · 0 points · 4 months ago

              Because “CSAM” states abuse as the third word in the acronym. Machine learning could (in theory, I lack knowledge on the current implementations) be trained without any children being abused (in any traditional sense anyway) and used to produce the content without any real children being involved (ignoring training data).

              The downvotes likely come from a difference in definition between abuse and CP, images of nonexistent people cannot realistically harm anyone.

        • @Varyk@sh.itjust.works · -13 points · 4 months ago

          Ah, one of the “using words they don’t understand” crew.

          And several hours late, too.

          Swinging for the fences, aren’t you?

      • @ocassionallyaduck@lemmy.world · 8 points · 4 months ago

        So your takeaway is I’m… Against AI generative images and thus I “protest too much”

        I can’t tell if you’re pro AI and dislike me, or pro loli hentai and thus dislike me.

        Dude, AI images and AI video are inevitable. To pretend that doesn’t have huge effects on society is stupid. It’s going to reshape all news media, very quickly. If reddit is 99% AI-generated bot spam garbage with no verification of what is authentic, reddit is functionally dead, and we are on a train with no brakes in that direction for most public forums.

          • @ocassionallyaduck@lemmy.world · 2 points · 4 months ago

            You should probably research the phrase “protest too much” and the word “schtick” then.

            I’m not trying to clutch pearls here; as another poster here commented, this isn’t a theoretical concern.

            • @Varyk@sh.itjust.works · 1 point · 4 months ago

              You aren’t trying to clutch pearls, but your pearls were just so available you felt you had to jump on the bandwagon to reply to a two-day old comment?

              Nobody said this was a theoretical concern, and it’s okay if you don’t understand the phrases “protest too much” and “schtick”, but you can ask for the definitions and relevance directly instead of fishing.

      • @Wirlocke@lemmy.blahaj.zone · 4 points · 4 months ago

        I think one of the main issues is the matter-of-fact usage of the term Minor Attracted Person. It’s a controversial term that frames pedophilia as an identity, like saying Person Of Color.

        I understand wanting a less judgemental term for those who did no wrong and are seeking help. But it should be phrased like anything else of that nature: a disorder.

        If I was making a term that fit that description I’d probably say Minor Attraction Disorder heavily implying that the person is not ok as is and needs professional help.

        In a more general sense, it feels like the similar apologetic arguments that the dark side of reddit would make. And that’s probably because Google’s officially using Reddit as training data.

      • Sandbag · -10 points · 4 months ago

        Are you defending pedophilia? This is an honest question, because you are saying it gave a nuanced answer when we all, should, know that it’s horribly wrong and awful.

        • @Gabu@lemmy.world · 1 point · 4 months ago (edited)

          when we all, should, know that it’s horribly wrong and awful. [sic, the word “should” shouldn’t be between commas]

          This assumes two things:

          1. Some kind of universal, inherent and self-evident morality; None of these things are true, as evidenced by the fact most people do believe murder is wrong, yet there are wars, events entirely dedicated to murdering people. People do need to be told something wrong is wrong in order to know so. Maybe some of these people were never exposed to the moral consensus or, worse yet, were victims themselves and as a result developed a distorted sense of morality;
          2. Not necessarily all, but some of these divergents are actually mentally ill - their “inclination” isn’t a choice any more than being schizophrenic or homosexual† would be. That isn’t a defense of their actions, but a recognition that without social backing and help, they could probably never overcome their nature.

          † This is not an implication that homosexuality is in any way, or should in any way, be classified as a mental illness. It’s an example of a primary individual characteristic not derived from choice.

        • @tjsauce@lemmy.world · 8 points · 4 months ago

          Abusing a child is wrong. Feeling the urge to do so doesn’t make someone evil, so long as they recognize it’s wrong to do so. The best way to stop kids from being abused is to teach why it is wrong and help those with the urges to manage them. Calling people evil detracts from that goal.

        • Lowlee Kun · 8 points · 4 months ago

          What you are thinking about is child abuse. A pedophile is not bound to become an abuser.

  • @interdimensionalmeme@lemmy.ml · -23 points · 4 months ago

    Incredibly stupid and obviously false “think of the children” propaganda. And you all lap it up. They’re building around you a version of the panopticon so extreme and disgusting that even people in the 1800s would have been outraged to use it against prisoners. Yet you applaud. I think this means you do deserve your coming enslavement.

    • RedFox · 10 points · 4 months ago

      I keep asking myself why I haven’t blocked lemmy.ml

      I keep telling myself I’ll lose ideas or comments from the good users there…

      At this point, I’ll have just blocked all their users individually

    • @fruitycoder@sh.itjust.works · 3 points · 4 months ago

      How is this building that?

      Like I’m a privacy nut and very against surveillance, but this doesn’t seem to be that. It is a model that seems like it could even be deployed to more privacy-friendly sites (PH is not that).

      • @interdimensionalmeme@lemmy.ml · 2 points · 4 months ago

        In context, each paver in the road to hell seems just and good-intentioned.

        But after all we’ve been through, falling for this trick again, it’s a choice. Maybe they think, this time, they’ll be the ones wearing the boots.

          • @interdimensionalmeme@lemmy.ml · 3 points · 4 months ago

            Normalizes using AI to profile users’ search history in a non-anonymous way. People used to say, if I die, delete my browser history. Now they’re glad caretaker AIs are keeping an eye on everyone’s searches. Soon we won’t be able to take a shit without AI knowing what we had for dinner. But hey, THINK OF THE FUCKING CHILDREN

            • @fruitycoder@sh.itjust.works · 1 point · 4 months ago

              They already have this data, they already used AI on it, they already sell it. That’s their business model.

              I agree that’s an issue, but it’s not specific to this, nor is this dependent on it.

              • @interdimensionalmeme@lemmy.ml · 1 point · 4 months ago

                Yes, this is not new; it is just about normalization and testing the waters for backlash before they do something less white-knightey with it that they don’t manage to keep out of the news.

    • xor · 11 points · 4 months ago (edited)

      The panopticon is… a chatbot that suggests you get help if you search for CSAM? Those bastards! /s

    • @StitchIsABitch@lemmy.world · 29 points · 4 months ago

      And, why? I mean it’s nice of you to make these claims, but what the hell does reducing CSAM searches have to do with the panopticon and us becoming enslaved?

    • @Gabu@lemmy.world · 18 points · 4 months ago

      Not since the wipe, AFAIK. Still, at the bottom of the page you can (or at least could, haven’t used their services in a while) see a list of recent searches from all users, and you’d often find some disturbing shit.

      • @KrankyKong@lemmy.world · 9 points · 4 months ago

        …That paragraph doesn’t say anything about whether or not the material is on the site though. I had the same reaction as the other person, and I didn’t misread the paragraph that’s literally right there.

  • @Mostly_Gristle@lemmy.world · 69 points · 4 months ago

    The headline is slightly misleading. 2.8 million searches were halted, but according to the article they didn’t attempt to figure out how many of those searches came from the same users. So thankfully the number of secret pedophiles in the UK is probably much lower than the headline might suggest.

        • Lemmy · 21 points · 4 months ago (edited)

          Same thing for me when I was 13. I freaked the fuck out when I saw a Wikipedia article on the right. I thought I was going to jail the next day lmfao

      • Dran · 32 points · 4 months ago

        I’d think it’s probably not a majority, but I do wonder what percentage it actually is. I do have distinct memories of being like 12 and trying to find porn of people my own age instead of “gross old people” and being confused why I couldn’t find anything. Kids are stupid lol, that’s why laws protecting them need to exist.

        Also good god when I become a parent I am going to do proper network monitoring; in hindsight I should not have been left unattended on the internet at 12.

        • @Rinox@feddit.it · 6 points · 4 months ago (edited)

          It’s not about laws, it’s about sexual education. Sexual education is a topic that can’t be left to the parents and should be explained in school, so as to give the kids a complete knowledge base.

          Most parents know about sex as much as they know about medicines. They’ve had some, but that doesn’t give them a degree for teaching that stuff.

        • @kylian0087@lemmy.world · 14 points · 4 months ago

          I was the same back then, and came across some stuff that was surprisingly easy to find. Only later did I realize how messed up that was.

          I think monitoring is good, but there’s a fine line not to cross with your child’s privacy. If they suspect anything they sure know how to work around it and you lose any insight.

        • @Piece_Maker@feddit.uk · 2 points · 4 months ago

          Sorry I know this is a serious subject and not a laughing matter but that’s a funny situation. I guess I was a MILF hunter at that age because even then I was perfectly happy to knock one out watching adult porn instead!

  • @Socsa@sh.itjust.works · 53 points · 4 months ago

    Google does this too, my wife was searching for “slutty schoolgirl” costumes and Google was like “have a seat ma’am”

      • nickwitha_k (he/him) · -10 points · 4 months ago

        I do have to agree with them on that one. Fetishizing school uniforms worn by children gives some serious Steven Tyler vibes.

        • @gapbetweenus@feddit.de · 41 points · 4 months ago

          Sexuality is tightly connected to societal taboos; as long as everyone involved is a consenting adult, it’s no one else’s business. There is no need or benefit in moralizing people’s sexuality.

          • nickwitha_k (he/him) · 3 points · 4 months ago

            To be clear, I absolutely agree. I’m not saying that people are immoral for liking some plaid. Just a kind of fetish that seems less “natural” (like spanking or bdsm) and more amplified in popular media in a parallel to sexualization of children in response to feminism (see: Brooke Shields’ experience) and that makes it one that I’m not comfortable participating in. But for those that don’t find their brains making such associations that are being safe, sane, and consensual, I wish wonderful, freaky times.

            • @gapbetweenus@feddit.de · 13 points · 4 months ago

              I’m not saying that people are immoral for liking some plaid.

              Fetishizing school uniforms worn by children gives some serious Steven Tyler vibes. fetish that seems less “natural”

              Sure sounds like you are. And you also sound rather judgy about it. Maybe it’s just a language thing - but at least that’s my impression.

              • nickwitha_k (he/him) · 5 points · 4 months ago

                It may well be my communication. The first statement was something of a half-joke at the expense of the rock singer and the normalization of predatory behavior towards minors that he and others engaged in during the height of rock’s popularity, not at the expense of people who like to engage in age-play.

                I am very accepting of others kinks and do not judge individuals for activities that are safe, sane, and consensual. Accepting the people and their ethically-sound activities does not mean that one cannot have preferences and perceptions on the activities themselves. Our preferences and perceptions are shaped to a degree (large or small) by our experiences. Mine are most definitely colored to a significant degree by my own early childhood trauma, which makes anything approaching age-play, power-play, and CNC, even just by indirect association in my own thought processes, uncomfortable and unsexy to me.

                I also find scat-play pretty disgusting (tbf, that’s probably part of the kink for some) but, I’m not going to turn someone away, unless they’ve not showered since their last session.

                • @gapbetweenus@feddit.de · 4 points · 4 months ago

                  Our preferences and perceptions are shaped to a degree (large or small) by our experiences. Mine are most definitely colored to a significant degree by my own early childhood trauma, which makes anything approaching age-play, power-play, and CNC, even just by indirect association in my own thought processes, uncomfortable and unsexy to me.

                  Even if it means nothing from an internet stranger, sorry to hear you had traumatic childhood experiences. Makes sense that you are uncomfortable with said practices.

                  I also find scat-play pretty disgusting (tbf, that’s probably part of the kink for some)

                  We can agree on something here.

          • r3df0x ✡️✝☪️A · -5 points · 4 months ago

            It’s still weird to sexualize children. It’s less weird when it’s teenagers and everyone is of age but it’s a weird thing to engage in constantly.

            • @gapbetweenus@feddit.de · 17 points · 4 months ago

              It’s sexualizing children in the same way that daddy porn sexualizes incest: you are taking fantasies at their literal face value without looking into what’s actually going on.

    • prole · 22 points · 4 months ago

      Google now gives you links to rehabs and addiction recovery centers when searching for harm reduction information about non-addictive drugs.

  • Kairos · 22 points · 4 months ago

    Oh, it’s just an experiment. The headline made me think someone was suing over this.