• wildncrazyguy138@fedia.io · 2 days ago (+7/−35)

    This is disconcerting. China’s not going to slow down its build-out of infrastructure.

    And for those who spin this as a positive: AI is not all LLMs. Real diseases are being cured through complex modeling, and real-world tangible products, like airplanes and ships, are being designed to be safer.

    I speculate there’s a good chance the modeling will eventually help resolve the climate issue too, rather than continue contributing to it. Physics models are becoming more robust for simulating nuclear fusion, as are logistics models for transportation and energy distribution.

    Living in a tourist town, I don’t want a large data center in my backyard either, but there are plenty of places that do and where it makes sense to do so both from a logistical and resource perspective.

    If we get behind the curve here, it’s going to be near impossible to catch up, and when the smart people can’t play ball with the newest toys, that leads to brain drain.

    • CorrectAlias@piefed.blahaj.zone · 2 days ago (+26)

      These data centers are NOT being used for research. Universities and research companies generally have their own computer clusters for that.

      These are being designed and built for LLMs and LLMs only.

      Plus, the US has already lost. We’ve cut off our allies, destroyed our trade partnerships, made the economy unsustainable, and also caused the very brain drain you mention here. We’re cooked for decades at the least.

    • ramble81@lemmy.zip · 2 days ago (+41/−1)

      Oh fuck off with your whitewashing. The majority of these AI installations are for LLMs. Look at what Oracle, Grok, OpenAI, Microsoft, and more have been trying to build out. It’s all an infrastructure race that is hurting us all (physically too, if you look at how they’re being powered).

      The sooner it collapses, the better.

      • L7HM77@sh.itjust.works · 2 days ago (+6/−1)

        I don’t think these data centers really are for LLMs. Right now, I can go to a dozen websites and use some LLM, without sitting on a wait list, for exactly $0.00 out of my pocket. So there’s obviously enough processing power to meet demand as-is, but… what? Demand will skyrocket when they crank up the fees? OpenAI ran roughly an $18 billion deficit last year; is everyone really going to pay $200–$600 per month for this? Plus, LLMs are reaching a plateau; more data doesn’t equal a more coherent model, and they’re running into a dead end.

        My local data center is steamrolling over public opinion. We’re not allowed to ask who will own it, how much power it will consume, nothing. “Officially,” the installation has stalled, but they’re still bulldozing the trees to make the lot where it’s supposed to go.

        My personal conspiracy theory is that this is coming from Palantir, laundering resources through the tech companies, using DoD money. The data centers aren’t for LLMs, but to build out a massive dragnet to track civilian travel, who goes where and when, to be used by DHS. That explains why they need to be distributed geographically per capita, the extreme secrecy around them, and the way utility companies and local politicians keep bending over despite public outcry.

        • greyscale@lemmy.grey.ooo · 1 day ago (+3)

          It’s not consumers they charge.

          They were talking about businesses budgeting about 50% of a developer’s salary, per developer, on tokens.

          A $20 consumer Claude Code account gets about $1,800 worth of tokens, IIRC.

          • ramble81@lemmy.zip · 1 day ago (+1)

            What determines the value of a token? Is it a Stanley Nickels calculation, or did they actually attempt to tie it out to infrastructure and operating costs? If the latter, then anyone using these systems needs to be prepared for a serious rug pull, as that’s squarely in “the first hit is free” territory.

              • ramble81@lemmy.zip · 1 day ago (+1)

                That doesn’t answer the question. What determines “$1,800 worth of tokens”? Is that value calculated from compute time and infrastructure cost? Is it what they think an equivalent amount of work would be worth for the time the query takes to run? Or is it an entirely arbitrary number?

                If it’s the last one, most likely they’re running at a loss, and it’s gonna bite them hard when the bill for the infrastructure comes due.

                • greyscale@lemmy.grey.ooo · 1 day ago (+2)

                  If I recall, it was something Ed Zitron said, but I can’t find the quote, so it might not be. The implication was that the cost of delivering that $20 user account was approximately $1,800 worth of compute.
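A rough sketch of how a figure like “$1,800 worth of tokens” is typically arrived at: take the tokens a subscriber actually consumes and price them at the provider’s public per-million-token API list rates. All rates and usage numbers below are hypothetical, for illustration only, not actual Anthropic pricing.

```python
# Sketch: value of a subscriber's token usage at (hypothetical) API list prices.
# Rates and usage figures below are made up for illustration.

def api_list_value(input_tokens, output_tokens,
                   usd_per_m_input=3.00, usd_per_m_output=15.00):
    """Dollar value of usage priced at per-million-token API list rates."""
    return (input_tokens / 1_000_000) * usd_per_m_input \
         + (output_tokens / 1_000_000) * usd_per_m_output

# A heavy coding-agent user might burn through, say, 400M input tokens
# and 40M output tokens in a month (made-up figures):
value = api_list_value(400_000_000, 40_000_000)
print(f"API list value: ${value:,.0f}")          # -> API list value: $1,800

subscription = 20.00
print(f"Subsidy ratio: {value / subscription:.0f}x")  # -> Subsidy ratio: 90x
```

Under these assumed numbers, the provider is delivering roughly 90x the subscription price in list-price compute, which is exactly the “first hit is free” dynamic the thread is worried about.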