• msage@programming.dev · 1 day ago

    LLMentalist is a mandatory read.

    Stop making LLMs happen, we don’t need energy hungry bullshit generators for anything.

    There are so many more important AIs that need attention and funding to help us with real problems.

    LLMs won’t solve anything.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 1 day ago

      There is a lot of hype around LLMs, and other forms of AI certainly should be getting more attention, but arguing that this tech has no value is simply disingenuous. People really need to stop perseverating over the fact that this tech exists, because it’s not going anywhere.

      • msage@programming.dev · 1 day ago

        Any benefits are far outweighed by the costs and dangers.

        Tell me more about the value when every LLM company is hemorrhaging money.

        • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 1 day ago

          You seem to have a very US-centric perspective on this tech; the situation in China looks quite different. Meanwhile, whether you personally think the benefits are outweighed by whatever dangers you envision, the reality is that you can’t put the toothpaste back in the tube at this point. LLMs will continue to be developed. The only question is how that’s going to be done and who will control this tech. I’d much rather see it developed in the open.

          • msage@programming.dev · 13 hours ago

            You dense motherfucker.

            No LLMs are being developed in the open.

            Even the released weights mean nothing.

            It’s not knowledge LLMs retain, just ingested text.

            LLMs should be dropped once it’s confirmed that they are indeed the dead end they always were, and the entire world should focus on anything else.

            • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 8 hours ago

              You’re such an angry little ignoramus. The GPT-NeoX repo on GitHub is the actual codebase EleutherAI used to train its models, and they also open-sourced the training data, checkpoints, and all the tooling.
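
              To make that concrete, here’s a rough sketch of what “open” buys you: pull one of those openly released checkpoints and run it locally. The specific model name and the Hugging Face transformers calls here are just my illustration, not something from the NeoX repo itself:

              ```python
              # Sketch: load an openly released EleutherAI checkpoint and generate text.
              # Assumes `pip install transformers torch`; the model name is illustrative.
              from transformers import AutoModelForCausalLM, AutoTokenizer

              model_name = "EleutherAI/pythia-160m"  # small model trained with the GPT-NeoX codebase
              tokenizer = AutoTokenizer.from_pretrained(model_name)
              model = AutoModelForCausalLM.from_pretrained(model_name)

              prompt = "Open weights can be inspected, fine-tuned, and audited."
              inputs = tokenizer(prompt, return_tensors="pt")
              outputs = model.generate(**inputs, max_new_tokens=30)
              print(tokenizer.decode(outputs[0], skip_special_tokens=True))
              ```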

              However, even if you were right that the weights were worthless (which they’re obviously not) and there were no open projects (which there are), the solution would be to develop models from scratch in the open instead of screeching at people and pretending this tech is just going to go away because it offends you personally.

              And nobody says LLMs are anything other than Markov chains at a fundamental level. However, just like Markov chains themselves, they have plenty of real-world uses. Some very obvious ones include doing translations, generating subtitles, doing text-to-speech, and describing images for the visually impaired. There are plenty of other uses for these tools.
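
              To illustrate the analogy, here’s a toy word-level Markov chain. It’s obviously a caricature (a lookup table instead of a huge neural network), but it runs the same predict-the-next-token loop. Purely illustrative code, not taken from any of the projects mentioned above:

              ```python
              # Toy Markov chain: count which word follows each context, then sample one word at a time.
              import random
              from collections import defaultdict

              def build_chain(text, order=1):
                  words = text.split()
                  chain = defaultdict(list)
                  for i in range(len(words) - order):
                      chain[tuple(words[i:i + order])].append(words[i + order])
                  return chain

              def generate(chain, order=1, length=20):
                  state = random.choice(list(chain.keys()))
                  out = list(state)
                  for _ in range(length):
                      options = chain.get(tuple(out[-order:]))
                      if not options:
                          break
                      out.append(random.choice(options))
                  return " ".join(out)

              corpus = "the model predicts the next word and the next word follows the last word"
              print(generate(build_chain(corpus)))
              ```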

              I love how you presume to know better than the entire world what technology to focus on. The megalomania is absolutely hilarious. As if all these researchers can’t see that this tech is a dead end, and it takes the brilliant mind of some lemmy troll to figure it out. I’m sure your mommy tells you you’re very special every day.

            • DigitalStefan@fosstodon.org · 12 hours ago

              @msage @yogthos I don’t know if I agree 100% with this, but I do like what you’re saying.

              It seems like all the AI companies are simply hoping AGI emerges from LLMs, and nobody is doing the actual research to make that happen.

              People were researching it when I was a child and I suspect they’ll still be researching it when I’m collecting my pension.