• alekwithak@lemmy.world · 3 days ago

    AI also just makes things up. Like how RFK Jr.'s “Make America Healthy Again” report cites studies that don't exist and never have, or literally a million other examples. You're not wrong about Fox News and how corporate and Russian-backed media distort the truth and push false narratives, and you're not wrong that AI isn't the problem, but it is certainly a problem, and a big one at that.

    • CrayonDevourer@lemmy.world · 3 days ago

      > AI also just makes things up. Like how RFK Jr.'s “Make America Healthy Again” report cites studies that don't exist and never have, or literally a million other examples.

      SO DO PEOPLE.

      Name one thing AI does that people themselves don't also commonly do each and every day.

      • alekwithak@lemmy.world · edited · 3 days ago

        Real researchers make up studies to cite in their reports? Real lawyers and judges cite fake cases as precedent in legal proceedings? Real doctors base treatment plans on white papers they fabricated entirely in their heads? Yeah, I don't think so, buddy.

        • Ceedoestrees@lemmy.world · edited · 3 days ago

          I think they're saying that the kind of people who take LLM-generated content as fact are the kind of people who don't know how to look up information in the first place. Blaming the LLM for it is like blaming a search engine for showing bad results.

          Of course LLMs make stuff up; they are machines that make stuff up.

          Sort of an aside, but doctors, lawyers, judges, and researchers make shit up all the time. A professional designation doesn't make someone infallible, or even smart. People should question everything they read, regardless of the source.

          • Don_alForno@feddit.org · 3 days ago

            > Blaming the LLM for it is like blaming a search engine for showing bad results.

            Except we give it the glorified title “AI”. It's supposed to be far better than a search engine; otherwise, why not stick with a search engine (which uses a tiny fraction of the power)?

            • Ceedoestrees@lemmy.world · 3 days ago

              I don't know what point you're arguing. I didn't call it AI, and even if I had, I don't know of any definition of AI that includes infallibility. I didn't claim it's better than a search engine, either, and even if I had, “better” does not equal “always correct.”