• NotANumber@lemmy.dbzer0.com · 2 days ago

      This is why safety mechanisms are being put in place, and AIs are being programmed to act less like sycophants.

      • mojofrododojo@lemmy.world · 2 days ago

        oh thank goodness, they’re gonna put in safety mechanisms after unleashing this garbage on the populace! phew, everything will be fine then, it won’t waste enormous amounts of resources to lie to people anymore? it won’t need new power stations? new water sources?

        oh wait… no, it’ll be marginally better but still use up all those resources. Oh wait, no, they won’t even fix it.

        what a stupid, silly waste of time and energy

        • NotANumber@lemmy.dbzer0.com · 1 day ago

          I don’t trust OpenAI and try to avoid using them. That being said, they have always been one of the more careful labs regarding safety and alignment.

          I also don’t need you or OpenAI to tell me that hallucinations are inevitable. Here, have a read of this:

          Xu et al., “Hallucination is Inevitable: An Innate Limitation of Large Language Models” (2025). http://arxiv.org/abs/2401.11817

          Regarding resource usage: this is why open-weight models, like those made by the Chinese labs or Mistral in Europe, are better. They are much more efficient and frankly more innovative than whatever OpenAI is doing.

          Ultimately though you can’t just blame LLMs for people committing suicide. It’s a lazy excuse to avoid addressing real problems, like how society treats neurodivergent people: the same problems that lead to radicalization, including incels and neo-Nazis. All of this was happening before LLM chatbots took off.

          • mojofrododojo@lemmy.world · 22 hours ago

            Ultimately though you can’t just blame LLMs for people committing suicide.

            well that settles it then! you’re apparently such an authority.

            pfft.

            meanwhile, here in reality, the lawsuits and the victims will continue to pile up. and those safety mechanisms you admit they’re still putting in place - maybe they’ll stop the LLM-associated tragedies.

            maybe. pfft.