• @Default_Defect@midwest.social
    8 · 1 year ago

    Some tech bro will attempt to make this cake and will tell someone it was better than anything some uppity WOKE human baker could have made, regardless of how bad it turned out.

  • BananaPeal
    17 · 1 year ago

    Is that why my cakes keep falling? I thought butger was optional.

  • ah yes, using a highly specialized AI intended for image generation to create something that is usually in text form. truly a good judge of the quality of AI…

  • Xusontha
    78 · 1 year ago

    I like how it just explicitly refuses to label the butger bowls

  • OpenStars
    7 · 1 year ago

    You can use an electric beater instead of a wire whisk. It makes it extra chewy. :-P

  • Captain Aggravated
    4 · 1 year ago

    So many jokes. Including:

    …Around the corner fudge is made!

    …And sediment shaped sediment.

  • @Nacktmull@feddit.de
    32 · edited · 1 year ago

    Why is it even called artificial intelligence, when it’s obviously just mindless artificial pattern reproduction?

    • @Damdy@lemmy.world
      3 · 1 year ago

      Well, I think it comes down to a fundamental belief about consciousness. If you’re non-religious, you probably think that consciousness is a purely biological and understandable process. Something that can be understood should also be replicable. Therefore, artificial intelligence. But it’s hard as dong to do well.

      • Richard
        2 · 1 year ago

        Why the hell are you being downvoted? I thought Lemmy had no religious fundamentalists or spiritualists

    • @Honytawk@lemmy.zip
      6 · 1 year ago

      Because it is intelligent enough to find and reproduce patterns. Kind of like humans.

      But it is artificial.

    • @BluesF@feddit.uk
      23 · 1 year ago

      Machine Learning is such a better name. It describes what is happening - a machine is learning to do some specific thing. In this case to take text and output pictures… It’s limited by what it learned from. It learned from arrays of numbers representing colours of pixels, and from strings of text. It doesn’t know what that text means, it just knows how to translate it into arrays of numbers… There is no intelligence, only limited learning.
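
      To make that concrete, here’s a toy sketch of what a model actually “sees” - made-up numbers and a made-up three-word vocabulary, not any real model’s pipeline:

          import numpy as np

          # an "image" is just an array of pixel colour values
          image = np.random.randint(0, 256, size=(64, 64, 3))

          # the "text" becomes numbers too, via a (toy) vocabulary lookup
          caption = "a chocolate cake"
          vocab = {"a": 0, "chocolate": 1, "cake": 2}
          token_ids = [vocab[word] for word in caption.split()]

          print(image.shape, token_ids)  # (64, 64, 3) [0, 1, 2]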

      • DroneRights [it/its]
        -3 · 1 year ago

        Machine Learning isn’t a good name for these services because they aren’t learning. You don’t teach them by interacting with them. The developers did the teaching and the machine did the learning before you ever opened the browser window. You’re interacting with the result of learning, not with the learning.
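
        Roughly, the split looks like this - a toy scikit-learn sketch of “learn once up front, then only serve the frozen result”, not how any real service is actually built:

            import numpy as np
            from sklearn.linear_model import LogisticRegression

            # --- done once by the developers, before you ever open the browser ---
            X_train = np.random.rand(1000, 4)
            y_train = (X_train.sum(axis=1) > 2).astype(int)
            model = LogisticRegression().fit(X_train, y_train)  # this is the learning

            # --- all you ever interact with: the frozen result of that learning ---
            answer = model.predict(np.random.rand(1, 4))  # no learning happens here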

      • @Fungah@lemmy.world
        2 · 1 year ago

        Are we so different?

        Isn’t meaning just comparing and contrasting similarly learned patterns against each other and saying “this is not all of those other things”?

        The closer you scrutinize meaning the fuzzier it gets. Linguistically at least, though now that I think about it I suppose the same holds true in science as well.

        • @BluesF@feddit.uk
          -1 · 1 year ago

          Yes, we absolutely are different. Okay, maybe if you really boil down every little process our brains do there are similarities, we do also do pattern recognition, yes. But that isn’t all we do, or all ML systems do, either. I think you’re selling yourself short if you think you’re just recognising patterns!

          The simplest difference between us and ML systems was pointed out by another commenter - they are trained on a dataset and then they remain static. We constantly re-evaluate old information, take in new information, and formulate new thoughts and change our minds.

          We are able to perceive in ways that computers just can’t - they can’t understand what a smell is because they cannot smell, they can’t understand what it is to see in the way that we do because when they process images it is exactly the same to a computer as processing any other series of numbers. They do not have abstract concepts to relate recognised patterns to. Generative AI is unable to be truly creative in the way that we can, because it doesn’t have an imagination, it is replicating based on its inputs. Although, again, people on the internet love to say “that’s what artists do”, I think it’s pretty obvious that we wouldn’t have art in the way we do today if that was true… We would still be painting on the walls of caves.

    • @Sordid@lemmy.dbzer0.com
      37 · edited · 1 year ago

      Because that’s what intelligence is. There’s a very funny video floating around of a squirrel repeatedly trying to bury an acorn in a dog’s fur and completely failing to understand why it’s not working. Now sure, a squirrel is not the smartest animal in the world, but it does have some intelligence, and yet there it is just mindlessly reproducing a pattern in the wrong context. Maybe you’re thinking that humans aren’t like that, that we make decisions by actually thinking through our actions and their consequences instead of just repeating learned patterns. I put it to you that if that were the case, we wouldn’t still be dealing with the same problems that have been plaguing us for millennia.

    • K0W4L5K1
      8 · 1 year ago

      Honestly I think it’s marketing; “AI” sells better than “machine learning program”.

    • @Barack_Embalmer@lemmy.world
      8 · edited · 1 year ago

      AI is also the minimax algorithm for solving tic-tac-toe, and the ghosts that chase Pac-Man around. It’s a broad term. It doesn’t always have to mean “mindblowing super-intelligence that surpasses humans in every conceivable way”. So it makes mistakes - therefore it’s not “intelligent” in some way?
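
      For reference, that kind of classic game “AI” fits in a few lines - here’s a rough minimax sketch for tic-tac-toe (my own toy code, just to show how modest the term can be):

          # board: list of 9 cells, each 'X', 'O' or ' '
          LINES = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

          def winner(board):
              for a, b, c in LINES:
                  if board[a] != ' ' and board[a] == board[b] == board[c]:
                      return board[a]
              return None

          def minimax(board, player):
              """Best achievable score for 'X': +1 win, 0 draw, -1 loss."""
              w = winner(board)
              if w:
                  return 1 if w == 'X' else -1
              if ' ' not in board:
                  return 0  # draw
              scores = []
              for i, cell in enumerate(board):
                  if cell == ' ':
                      board[i] = player
                      scores.append(minimax(board, 'O' if player == 'X' else 'X'))
                      board[i] = ' '
              return max(scores) if player == 'X' else min(scores)

          # perfect play from both sides is a draw:
          # minimax([' '] * 9, 'X') == 0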

      A lot of the latest thought in cognitive science couches human cognition in similar terms to pattern recognition - some of the latest theories are known as “predictive processing”, “embodied cognition”, and “4E cognition” if you want to look them up.