• wonderingwanderer@sopuli.xyz · 1 day ago

      Yeah, self-hosted open-source models seem okay, as long as their training data is all from the public domain.

      Hopefully RAM becomes cheap as fuck after the bubble pops and all these data centers have to liquidate their inventory. That would be a nice consolation prize, if everything else is already fucked anyway.

      • addie@feddit.uk · 1 day ago

        Unfortunately, server RAM and GPUs aren’t compatible with desktops. Also, Nvidia have committed to releasing a new GPU every year, making the existing ones worth much less. So unless you’re planning to build your own data centre with slightly out-of-date gear (which would be folly, since the existing data centres will be desperate to recoup any investment and will be selling cheap), it’s all just destined to become a mountain of e-waste.

        • wonderingwanderer@sopuli.xyz · 1 day ago

          Maybe that surplus will lay the groundwork for a solarpunk blockchain future?

          I don’t know if I understand what blockchain is, honestly. But what if a bunch of indie co-ops created a mesh network of smaller, more sustainable server operations?

          It might not seem feasible now, but if the AI bubble pops, Nvidia crashes spectacularly, data centers all need to liquidate their stock, and server compute becomes basically viewed as junk, then it might become possible…

          I’m just trying to find a silver lining, okay?

          • MagicShel@lemmy.zip · 22 hours ago

            Like AI, blockchain is a solution in search of a problem. Both have their uses, but they’re generally part of overcomplicated, expensive solutions to problems that are better handled with more traditional techniques.

            • wonderingwanderer@sopuli.xyz · 19 hours ago

              Maybe I didn’t mean blockchain, cause I’m still not really certain what it is. I mean something like the fediverse itself, or a mesh network, where a bunch of hobbyists self-hosting their own servers can federate as a system of nodes for a more distributed model.

              Instead of all the compute being hoarded in power-hungry data centers, regular folks, hobbyists, researchers, indie devs, etc. would be able to run more powerful simulations, meta-analyses, renderings, and so on, then pool their data and collaborate on projects, ultimately creating a more efficient and intelligently guided use of the compute instead of simply “CEO says generate more profit! 24/7 overdrive!!!”

              At the very least, a surplus of cheap RAM would expand the computing capabilities of everyone who isn’t a greedy corporation with enough money to buy up all the expensive RAM.

            • wonderingwanderer@sopuli.xyz · 19 hours ago

              I would imagine any program that runs simulations, renders environments, analyzes metadata, or does similar tasks would be able to use it.

              It would be useful for academic researchers, gamers, hobbyists, and fediverse instances. Basically, whatever capabilities they have now, they’d be able to increase their computing power for dirt cheap.

              Someone could make a fediverse MMO. That could be cool, especially when indie devs start doing what zuck never could with VR.

            • addie@feddit.uk · 21 hours ago

              Google Stadia wasn’t exactly a resounding success…

              From a previous job in hydraulics, the computational fluid dynamics / finite element analysis that we used to do would eat all your compute resource and ask for more. Split your design into tiny cubes, simulate all the flow / mass balance / temperature exchange / material stress calculations for each one, gain an understanding of how the part would perform in the real world. Very easily parallelizable, a great fit for GPU calculation. However, it’s a ‘hundreds of millions of dollars’ industry, and the AI bubble is currently ‘tens of trillions’ deep.
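
              Something like this toy sketch is the general shape of it (grid size, constants, and update rule are all made up, nothing from a real solver): every cell gets the same cheap update from its neighbours each timestep, which is exactly the pattern GPUs chew through.

              ```python
              # Toy illustration of why grid-based CFD/FEA parallelizes so well:
              # split the part into a grid of cells, then apply the same cheap
              # update to every cell each timestep (here: simple heat diffusion).
              # Grid size, timestep, and diffusivity are made-up example values.
              import numpy as np

              nx = ny = nz = 64           # the "tiny cubes" the design is split into
              alpha, dt, dx = 0.01, 0.1, 1.0
              T = np.zeros((nx, ny, nz))  # temperature field
              T[nx // 2, ny // 2, nz // 2] = 100.0  # hot spot in the middle

              for step in range(100):
                  # Each interior cell only looks at its six neighbours, so every
                  # cell can be updated independently -- that's the parallel part.
                  lap = (
                      T[:-2, 1:-1, 1:-1] + T[2:, 1:-1, 1:-1] +
                      T[1:-1, :-2, 1:-1] + T[1:-1, 2:, 1:-1] +
                      T[1:-1, 1:-1, :-2] + T[1:-1, 1:-1, 2:] -
                      6.0 * T[1:-1, 1:-1, 1:-1]
                  ) / dx**2
                  T[1:-1, 1:-1, 1:-1] += alpha * dt * lap

              print("peak temperature after 100 steps:", T.max())
              ```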

              Yes, they can be used for other tasks. But we’ve just no use for the amount that’s been purchased - there’s tens of thousands of times as much as makes any sense.

              • wonderingwanderer@sopuli.xyz · 19 hours ago

                So there would be an enormous surplus and a lot of e-waste. That’s a shame, but that’s going to happen anyway. I’m only saying that the silver lining is that it means GPU and RAM would become dirt cheap (unless companies manufacture scarcity like the snakes they are).

                Industrial applications aren’t the only uses for it. Academic researchers could use it to run simulations and meta-analyses. Whatever they can do now, they could do more powerfully with cheap RAM.

                Gamers who self-host could render worlds more powerfully. Indie devs could add more complex dynamics to their games. Computer hobbyists would have more compute to tinker with. Fediverse instances would be able to handle more data. Maybe someone could even make a fediverse MMO. I wonder if that would catch on.

                Basically, whatever people can do now, more people would be able to do more powerfully and for cheaper. Computations only academia and industry can do now would become within reach of hobbyists. Hobbyists would be able to expand their capacities. People who only have computers to tinker with now would be able to afford servers to tinker with.

                “Trickle-down” is a bullshit concept, as everything gets siphoned to the top and hoarded. But when that cyst bursts, and those metaphorical towers come crashing down, there’s gonna be a lot of rubble to sift through. It’s going to enable the redistribution of RAM on a grand scale.

                I’m not pretending it’ll solve everyone’s problems, and of course it would have been better if they had left the minerals in the ground and data centers had never grown to such cancerous proportions. But when the AI bubble bursts and tech companies have to liquidate, there’s no denying that the price of RAM would plummet. It’s not a magic bullet, just a silver lining.

        • MagicShel@lemmy.zip · 22 hours ago

          I read, I think just last week but for sure within the last month, that someone has created an AI card that lowers power usage by 90%. (I know that’s really vague and leaves a lot of questions.) It seems likely that AI-specific hardware and graphics hardware will diverge — I hope.

          • wonderingwanderer@sopuli.xyz · 19 hours ago

            I think it’s called an inferencing chip. I read about it a few months ago.

            Basically, the way it was explained, the most energy-intensive part of AI is training the models. Once training is complete, it requires less energy to make inferences from the data.

            So the idea with these inferencing chips is that the AI models are already trained; all they need to do now is make inferences. So the chips are designed more specifically to do that, and they’re supposed to be way more efficient.
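
            Roughly, as a toy PyTorch sketch (made-up model and sizes, just to show the shape of it): training needs the forward pass plus gradients and weight updates, while inference is a single forward pass with nothing extra to keep around, and that forward-only part is what these chips specialize in.

            ```python
            # Rough sketch of why inference is much cheaper than training
            # (toy model; sizes and data are made up for illustration).
            import torch
            import torch.nn as nn

            model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))
            x = torch.randn(32, 512)
            target = torch.randint(0, 10, (32,))

            # Training step: forward pass, loss, backward pass, weight update --
            # gradients and optimizer state all have to be computed and stored.
            optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
            loss = nn.functional.cross_entropy(model(x), target)
            loss.backward()
            optimizer.step()

            # Inference: one forward pass, no gradients, no optimizer state.
            # Dedicated inference chips only need to do this part (often at
            # lower precision), which is where the efficiency comes from.
            with torch.no_grad():
                predictions = model(x).argmax(dim=1)
            ```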

            I kept waiting to see it in devices on the consumer market, but then it seemed to disappear and I wasn’t able to even find any articles about it for months. It was like the whole thing vanished. Maybe Nvidia wanted to suppress it, cause they were worried it would reduce demand for their GPUs.

            At one point I had seen a smaller-scale company listing laptops for sale with their own inferencing chips, but the webpage seems to have disappeared. Or at least the page where they were selling it.

    • Sabin10@lemmy.world · 1 day ago

      Agreed, AI has uses, but C-suite execs have no idea what they are and are paying millions to get their staff using AI tools in hopes of finding what those uses are. In reality they’re making things worse with no tangible benefit, because they’re all scared that someone else will find this imaginary golden goose first.