A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

        • Chakravanti@sh.itjust.works · 11 months ago

          According to what logic? Like I’m ever going to trust some lying asshole to hide his instructions for fucking anything that’s MINE. News Alert: “Your” computer ain’t yours.

          • Olgratin_Magmatoe@startrek.website · 11 months ago

            People have been trying to circumvent ChatGPT's filters; they'll do the exact same with open source AI. But it'll be worse because it's open source, so any built-in feature to prevent abuse could just get removed and recompiled by whoever.

            And that’s all even assuming there ever ends up being open source AI.

            • Chakravanti@sh.itjust.works · 11 months ago

              Your logic is bass-ackwards. Having the source out in the open means the shit gets fixed faster. Closed source just doesn't get fixed 99% of the time, because there's only one motherfucker to do the fixing and he usually just doesn't do it.

              • Olgratin_Magmatoe@startrek.website · 11 months ago

                You can’t fix it with open source. All it takes is one guy making a fork and removing the safeguards because they believe in free speech or something. You can’t have safeguards against misuse of a tool in an open source environment.

                I agree that closed source AI is bad. But open source doesn’t magically solve the problem.

                • Chakravanti@sh.itjust.works · 11 months ago

                  Forks are productive. You're just wrong about it. I'll take FOSS over closed source. I'll trust the masses reviewing FOSS over the one asshole doing, or rather not doing, exactly that.

                  • Olgratin_Magmatoe@startrek.website · 11 months ago

                    The masses can review said asshole's fork all they like, but it doesn't mean anything, because nobody can stop them from removing the safeguards.

                    Then all of a sudden you have an AI that anybody can use that will happily generate even the most morally bankrupt things with ease and speed.

                    I love FOSS, but AI is inherently unethical. At best it steals people's work. At worst it makes CP/nonconsensual porn.