• basmati@lemmus.org

      It’s really not. It’s a fad, like Zip disks or e-waste recycling. Only it’s even more expensive, and it reduces productivity and quality of work everywhere it’s implemented, all for the vague hope that it might eventually get better.

      • dependencyinjection@discuss.tchncs.de

        Do you think AI and/or AGI is a possibility at all, given enough time?

        Because if the answer is yes, then don’t we need people working on it all the time to keep inching towards that? I’m not saying that the current implementations are anywhere close, but they do have their use cases. I’m a software developer, and my boss, the lead engineer (the smartest person I’ve ever met), has made some awesome tools that save our company of 7 people maybe 100 hours of work a month.

        People used to complain about the LHC, and that’s made countless discoveries that help in other fields.

        • Voroxpete@sh.itjust.works

          Powered flight was an important goal, but that wouldn’t have justified throwing all the world’s resources at making Da Vinci’s flying machine work. Some ideas are just dead ends.

          Transformer-based generative models do not have any demonstrable path to becoming AGI, and we’re already hitting a hard ceiling of diminishing returns on the very limited set of things that they actually can do. Developing better versions of these models requires exponentially larger amounts of data, at exponentially scaling compute costs (yes, exponentially: to the point where current estimates are that there literally isn’t enough training data in the world to get past another generation or two of development on these things).

          Whether or not AGI is possible, it has become extremely apparent that this approach is not going to be the one that gets us there. So what is the benefit of continuing to pile more and more resources into it?

      • TʜᴇʀᴀᴘʏGⒶʀʏ@lemmy.blahaj.zone

        This is such a weak take. It’s constantly getting more efficient, and it’s already extremely helpful. It’s been incorporated into countless applications. OpenAI might go away, but LLMs and GenAI won’t. I run an open source local LLM to automate most of my documentation workflow, and that’s not going away.

        • Voroxpete@sh.itjust.works

          “It’s been incorporated into countless applications”

          I think the phrasing you were looking for there was “hastily bolted onto.” Was the world actually that desperate for tools to make bad summaries of data, and sometimes write short form emails for us? Does that really justify the billions upon billions of dollars that are being thrown at this technology?

          • Zos_Kia@lemmynsfw.com

            This comment shows you have no idea of what is going on. Have fun in your little bubble, son.

    • omarfw@lemmy.world

      LLMs are not AI. They’re content-stealing blenders wearing a name tag that says AI on it.