• 0 Posts
  • 72 Comments
Joined 6 months ago
Cake day: June 8th, 2025


  • I generally overshoot when it comes to the hardware specs, if I can. That way you’re prepared in advance if you end up having the option to upgrade your Internet connection.

    Otherwise, you may find yourself locked in to the slower plan due to the cost of upgrading your hardware.

    But sometimes you have to choose realistic over optimal, of course. In other words, I don’t think I know what’s best for you! Just offering a perspective that has worked for me, in case it helps you in some way as you weigh your options.







  • Sorry, no LLM is ever going to spontaneously gain the ability to self-replicate. This is completely beyond the scope of generative AI.

    This whole hype around AI and LLMs is ridiculous, not to mention completely unjustified. The appearance of a vast leap forward in this field is an illusion. They’re just linking more and more processor cores together until a glorified chatbot can be made to appear intelligent. But this is stifling actual research and innovation in the field, instead turning the market into a costly, and destructive, arms race.

    The current algorithms will never “be good enough to copy themselves”. No matter what a conman like Altman says.


  • Eh, no. The ability to generate text that mimics human writing does not mean they are intelligent. And AI is a misnomer. It has been from the beginning. Now, from a technical perspective, sure, call them AI if you want. But using that as an excuse to skip right past the word “artificial” is disingenuous in the extreme.

    On the other hand, what the term AI is generally used to mean would technically be called AGI, or Artificial General Intelligence, which does not exist (and may or may not ever exist).

    Bottom line, a finely tuned statistical engine is not intelligent. And that’s all LLMs, or any other generative “AI”, are at the end of the day. The lack of actual intelligence is evidenced by the rate at which they produce statements that are factually incorrect. So, if you use the most common definition of AI, no, LLMs absolutely are not AI.


  • Heh. Yes, they’re similar, but on the technical side they differ in a very important way. It has to do with opening a file from inside another program. If you select a shortcut, the program will treat it as a separate file, so most of the time the action will fail. With a link, though, you should end up with the program opening the target of the link. In other words, a shortcut is a file that points at a different file, whereas a symlink uses filesystem trickery to accomplish almost the same thing.

    That’s a horrible, just terrible explanation, I know, but I’m pretty sure this is the gist of it.
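
    To make the distinction concrete, here’s a minimal Python sketch (my own illustration, not part of the original comment; it assumes a platform that allows unprivileged symlinks, and it stands in for a real Windows .lnk shortcut with a plain file that just contains a path):

    ```python
    import os
    import tempfile

    # Work in a throwaway directory.
    workdir = tempfile.mkdtemp()
    target = os.path.join(workdir, "notes.txt")
    with open(target, "w") as f:
        f.write("real contents\n")

    # Symlink: the filesystem resolves it, so opening the link opens the target.
    link = os.path.join(workdir, "notes-link.txt")
    os.symlink(target, link)
    print(open(link).read())       # -> "real contents"

    # "Shortcut"-style stand-in: an ordinary file whose body happens to be a path.
    # A program that opens it naively sees the path string, not the target.
    shortcut = os.path.join(workdir, "notes-shortcut.txt")
    with open(shortcut, "w") as f:
        f.write(target)
    print(open(shortcut).read())   # -> ".../notes.txt" (a path, not the contents)
    ```

    Opening the symlink transparently yields the target’s contents, while the shortcut-style file only yields a path string, which is why a program that isn’t shortcut-aware fails on it.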






  • Oh, sure, makes total sense. Sure, sure…

    Except for the fact that nature has successfully balanced itself out for, well, as long as life has existed on this planet. Including recovery and finding a new balance after extremely drastic shifts in the environment.

    Humans managed to remain a part of this for most of our existence, too. So the current trends have absolutely nothing to do with our ability to manipulate our environment.

    We’ve allowed an “elite” class of parasitic sociopaths to dictate the direction of modern society, and their influence has spread corruption to every corner of the modern world. This insatiable greed will be our downfall, and there’s nothing natural about it.



  • Whoa, no need to get defensive! That was just a guess based on my own observations, and I never said anything about whether they have the right to do whatever the hell they want with their own chips (which would be true regardless, I reckon).

    That said, given that their business model depends on their walled-garden approach, I just find myself wondering if they might’ve seen the possibility of running any old ARM executable on their silicon as a potential threat to that model. But there are all sorts of factors in play, so maybe I’m wrong and that whole possibility is irrelevant.

    Do you happen to know if they strictly extended existing APIs?