• tsonfeir@lemmy.world · 7 months ago

        Nope, you just gotta know what it IS, what it ISN’T, and how to write prompts so it returns data you can use to formulate your own conclusion.

        When using AI, it’s only as smart as the operator.

          • msage@programming.dev · 7 months ago

            As much as I hate to do this, it is AI: ML is a subfield of artificial intelligence.

            It isn’t AGI (some might say it is, but they’re wrong), but the model is learning.

            • SpaceNoodle@lemmy.world · edited · 7 months ago

              An LLM is not capable of learning. It won’t hallucinate less with additional training input.

              • msage@programming.dev · 7 months ago

                Just the notion of a computer having hallucinations should suggest that it’s doing more than running basic code.

                It’s not ‘intelligent’, but it has ‘learned’ enough to go beyond standard CPU instructions.

                That’s why it’s not a General AI, but it’s still an AI.

                • SpaceNoodle@lemmy.world · edited · 7 months ago

                  I also talk about gremlins inside CPUs, but that doesn’t mean I think there are magical critters turning a crank inside them.

                  It’s called a metaphor, brother.

                  Regardless, it’s all code that’s eventually run on a CPU, so there isn’t any step where magic is injected.

                  • msage@programming.dev · 7 months ago

                    Sigh.

                    There is no code for language processing; it’s just math approximating results from weights. The whole weight setup is what’s called ‘artificial intelligence’, because nobody wrote

                    if 'python' in prompt:
                        return ['large snake', 'programming language', 'Australian car company']

                    The model ‘learned’ how to mimic human speech through training, not through thousands of software engineers adding more branches to the code.

                    That technique is part of ‘artificial intelligence’: computers solving problems they were not programmed to solve. The neural network acquires its knowledge through the training code, but the code itself has no idea what is going on.
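                    To make that concrete, here's a minimal, hypothetical Python sketch (every name in it is invented for illustration): a single perceptron that learns logical OR purely from labeled examples. The programmer writes only a generic weight-update rule; the OR behavior ends up in the learned weights, never in an if/else branch.

```python
# Hypothetical sketch (all names invented for illustration): a tiny
# perceptron that learns logical OR purely from labeled examples.
# Nobody writes a rule for OR; the behavior emerges in the weights.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Fit weights from (inputs, label) pairs using the generic perceptron update."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            # The only thing the programmer encodes: nudge the weights in
            # whatever direction reduces the error. No branch mentions OR.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # matches the OR labels: [0, 1, 1, 1]
```

                    Swap in different training data and the very same code learns a different function — which is the whole point: the code stays generic, and what the system ‘knows’ lives in the weights.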

            • Zos_Kia@lemmynsfw.com · 7 months ago

              No you don’t understand. The word AI, which was invented to describe this kind of technology, should not be used to describe this technology. It should instead be reserved for some imaginary magical technology that may exist in the future.

    • capital@lemmy.world · edited · 7 months ago

      A new version of “people who know how to search the web” vs. those who don’t. Today’s shitty search results, broken by the search companies themselves, notwithstanding.