• FishFace@lemmy.world · 2 days ago

    Relating this to carbon emissions is absurd. Your phone’s maximum power consumption is about 25 W, of which the sensors are a minuscule fraction. Running your phone flat-out at 25 W for an entire year uses about the same energy as driving a typical 40 mpg petrol car for 250 miles.
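
    Back-of-envelope, if anyone wants to check the numbers (a sketch; the ~33.7 kWh per US gallon energy content of petrol is my assumption, not from the article):

    ```python
    # Rough energy comparison: a phone at full tilt for a year vs. petrol miles.
    PHONE_WATTS = 25          # generous peak draw for a phone
    HOURS_PER_YEAR = 24 * 365
    KWH_PER_GALLON = 33.7     # assumed energy content of a US gallon of petrol
    MPG = 40                  # the "typical petrol car" figure above

    phone_kwh = PHONE_WATTS * HOURS_PER_YEAR / 1000   # ~219 kWh
    gallons = phone_kwh / KWH_PER_GALLON              # ~6.5 gallons
    miles = gallons * MPG                             # ~260 miles
    print(f"{phone_kwh:.0f} kWh = {gallons:.1f} gal = {miles:.0f} miles")
    ```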

    Reducing sensor power usage is good, but not for this reason.

    • Frezik@lemmy.blahaj.zone · 2 days ago

      There is a connection, but I don’t think it’s a satisfying one.

      There’s some thought that neural networks would consume less power if they ran on analog chips. So yeah, it’s so LLMs can get bigger. Reducing CO2 emissions by not doing LLM slop is apparently off the table.

      • KairuByte@lemmy.dbzer0.com · 1 day ago

        > Reducing CO2 emissions by not doing LLM slop is apparently off the table.

        Not to be argumentative, but has this ever been something the consumer market has done with an emerging “core” technology? I don’t see how this was ever realistically on the table.

        AI slop is an unfortunate fact of life at this point. If it’s inevitable, we may as well make it as not terrible as possible.

        • Cethin@lemmy.zip · 16 hours ago

          That’s what regulations are for. We’ve been asking for CO2 regulations for decades, but the argument is almost always “we can’t reduce dirty energy production until we have enough power to replace it all without downscaling.” Then they invent stuff like crypto to drain any excess power. That crashed, and then AI suddenly appeared to drain it instead. I’m convinced it’s all a conspiracy to keep dirty energy companies profitable. The timing is just too convenient.

          • KairuByte@lemmy.dbzer0.com · 13 hours ago

            That’s what I mean, though. Convincing users not to use LLMs as a way to reduce CO2 is a fool’s errand. It will never work. So we should focus on something that can actually move the needle, like speeding up the move to a fully green grid.

        • Frezik@lemmy.blahaj.zone · 23 hours ago

          Nothing inevitable about it. People aren’t going to be running local models en masse; that will be about as popular as self-hosting Internet services. People are largely reliant on centralized datacenter models, and those will shut down as the bubble pops.

  • pastermil@sh.itjust.works · 2 days ago

    As cool as it sounds at a glance, I fail to see the case they’re trying to build.

    And of course, they have to sprinkle a little AI on it.

  • vacuumflower@lemmy.sdf.org · 2 days ago

    I’d rather see something like a GPU become normal, except as an analog unit for the kinds of computation that are vastly more efficient that way, where you don’t need determinism: some trigonometry and signal processing, perhaps even parts of 3D graphics.
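
    As a toy sketch of why determinism can be optional for those workloads (the 1% per-multiply noise level is purely an assumption for illustration):

    ```python
    import numpy as np

    # A dot product where every multiply picks up small "analog" noise.
    # The accumulated result still lands close to the exact answer, which
    # is why signal-processing-style workloads can tolerate it.
    rng = np.random.default_rng(0)
    x = rng.normal(size=1024)
    w = rng.normal(size=1024)

    exact = x @ w
    noisy = ((x * w) * (1 + rng.normal(scale=0.01, size=1024))).sum()
    print(f"exact={exact:.3f} noisy={noisy:.3f} diff={abs(noisy - exact):.3f}")
    ```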

  • floo@retrolemmy.com · 2 days ago

    No, it isn’t. There’s just a passing interest in retro technology.

    It’ll pass

    • sploosh@lemmy.world · 2 days ago

      New, programmable analog chips that perform basic sound processing aren’t retro. The article is worth reading.

    • floquant@lemmy.dbzer0.com · 2 days ago

      This has nothing to do with retro technology. This is about thinking “is using binary really the most efficient way to run every computation we need to do?”, which is really relevant today.

      • floo@retrolemmy.com · 2 days ago

        Is it? Binary is not an “analog” vs. “digital” thing. “Binary” existed in analog computing for at least a couple of centuries before the concept of “digital” even existed.

        It’s an abstract concept, not a specific application, and while it can be specifically applied, there is no implication that it is either analog or digital. It could be either, both, or neither.

        • floquant@lemmy.dbzer0.com · 2 days ago

          By “binary” I mean “has two states”, as in discrete, as in digital. You can represent binary bits using analog circuits, but that doesn’t make those circuits binary/digital. Likewise, you can represent continuous, analog functions using discrete logic, but it will always be an approximation. What makes these chips different is that they don’t just represent continuous functions and values, they actually model them, like physical models do.
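
          A quick numeric illustration of the “always an approximation” point (the bit depth here is arbitrary, just to make the error visible):

          ```python
          import numpy as np

          # Quantizing a continuous signal to 4 bits: the discrete version can
          # only approximate the analog one, with error bounded by the step size.
          t = np.linspace(0.0, 1.0, 1000)
          analog = np.sin(2 * np.pi * 5 * t)   # stand-in for a continuous signal

          bits = 4
          steps = 2 ** (bits - 1) - 1          # 7 signed levels per side at 4 bits
          digital = np.round(analog * steps) / steps
          print(f"max error at {bits} bits: {np.abs(analog - digital).max():.3f}")
          ```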

          • floo@retrolemmy.com · 2 days ago

            I think perhaps you might’ve misunderstood my comment, because this is exactly what I was saying (well, part of what I was saying, anyway). You’re just being a lot more specific in your explanation.

            I’ll try to be clearer in the future.

            • floquant@lemmy.dbzer0.com · 2 days ago

              No offense taken! I just believe that a subtle difference doesn’t mean an unimportant one, and I wanted to be precise. I didn’t take you for someone who doesn’t understand analog and digital, especially considering your instance :) I edited my previous comment for some additional clarity. I just think they’re neat ^^

        • Frezik@lemmy.blahaj.zone · 2 days ago

          That doesn’t contradict anything above.

          There’s a company pushing their hybrid analog/digital chip for real use cases. I dunno if it’s going to be successful, but it’s not retro.