• rottingleaf@lemmy.world
    10 days ago

    It would be the same with liberal talking points and, in general, any human talking point.

    Humans try to change reality into what they want it to be, so the things they say are never fully accurate. When they want to inflate something, they usually make it appear smaller than it is IRL. And appearances aren't universal anyway.

    Humans also simplify things in ways that are acceptable for one subject but not for another.

    Humans also don’t know what “correct information” is.

    A lot of philosophy connected to language starts to matter when your main approach to “AI” is text extrapolation.

    • dependencyinjection@discuss.tchncs.de
      10 days ago

      So you’re saying you lie to try and change reality or present it in a different way?

      That’s horrible, and I certainly don’t subscribe to that mentality. I will discuss things with people with an open mind and a willingness to change positions if presented with new information.

      We are not arguing out of some tribal belief; we have our morals, and we will constantly test them to try to be better humans for our fellow humans.

    • ayyy@sh.itjust.works
      10 days ago

      Tell me more about how your theories of gay people being abominations are backed by science.

    • tee9000@lemmy.world
      10 days ago

      I think you hurt people’s feelings lmao.

      The truth just isn’t very catchy. Thanks for trying though. I’m still on Lemmy for people like you.

    • Petter1@lemm.ee
      10 days ago

      Just because you are a liar doesn’t mean all humans are egoistic liars. Of course there are a lot of them, but it is not a universal human trait; it’s cultural and regional. Liars want you to believe that everyone is lying all the time, because that makes their lives easier. But feel free to not believe me 😇.

    • Zement@feddit.nl
      10 days ago

      Math is correct without humans. Pi is the same in the whole universe. There are scientific truths. And then there are the flat-earth, 2x2=1, QAnon, anti-vax, chemtrail loonies, who in different degrees and colours are mostly united under the conservative “anti-science” folks.

      And you want an AI that doesn’t offend these folks / is taught based on their output. What use could that be?

      • rottingleaf@lemmy.world
        10 days ago

        Ahem, well, there are obvious things: that 2x2 modulo 3 is 1, that some vaccines might be bad (which is why pharma industry regulations exist), that pi could also be an unknown p multiplied by an unknown i, or some number encoded as the string ‘pi’.

        These all matter for language models, do they not?
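        The modular-arithmetic aside can be checked directly; here is a minimal sketch in Python (my choice of language, not the commenter’s):

        ```python
        # In ordinary integer arithmetic, 2 * 2 is of course 4,
        # but reduced modulo 3 it is indeed 1 - so "2x2=1" is a
        # true statement in the right context.
        product = 2 * 2
        print(product)      # ordinary arithmetic: 4
        print(product % 3)  # arithmetic modulo 3: 1
        ```

        The point being that a bare statement like “2x2=1” is only right or wrong relative to an unstated context.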

        And you want an AI that doesn’t offend these folks / is taught based on their output. What use could that be?

        It is already taught on their output among other things.

        But I personally don’t think this leads anywhere.

        Somebody someplace decided it’s a brilliant idea to extrapolate text, because humans communicate their thoughts via text, so it’s something that can be used for machines.

        Humans don’t just communicate.