Off-and-on trying out an account over at @[email protected] due to scraping bots bogging down lemmy.today to the point of near-unusability.

  • 41 Posts
  • 2.56K Comments
Joined 2 years ago
Cake day: October 4th, 2023

  • have a gtx 6700 gpu which should be plenty overkill for cs2. But ive heard cs2 is processor heavy.

    I don’t know which game you’re playing (I’m not sure what CS2 is, though some other folks seem to be). However, if you install MangoHud and run the game via mangohud <gamename> (if this is Steam, set the game’s Launch Options to “mangohud %command%”), it’ll show you CPU and GPU load in an overlay on top of your game.

    EDIT: Example:

    EDIT2: Note that by default, it shows composite CPU load, the same as top does by default. So if you have a 32-core CPU and a game uses only a single thread, it’ll show only about 3% load, even if the game is bottlenecked on the single core that it’s using. Running MANGOHUD_CONFIG=full mangohud <gamename> will show each CPU core independently (along with some other data). E.g.:
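    To summarize the invocations above as a quick sketch (the ./my-game path is a placeholder for your game’s binary or launcher):

    ```shell
    # Default overlay: composite CPU load, GPU load, FPS.
    mangohud ./my-game

    # Full overlay: per-core CPU load plus additional stats.
    MANGOHUD_CONFIG=full mangohud ./my-game

    # For a Steam game, set the game's Launch Options to:
    #   MANGOHUD_CONFIG=full mangohud %command%
    ```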

    It sounds from other comments like you’re playing Counter-Strike 2, and CS2 apparently only really uses 1–2 cores:

    https://steamcommunity.com/app/730/discussions/0/594026537713459453/

    CS2 still heavily loads only 1–2 CPU threads, even on modern CPUs with multiple high-performance cores. Other cores remain mostly idle while one thread runs at 100%.





  • Meta’s chief AI scientist and Turing Award winner Yann LeCun plans to leave the company to launch his own startup focused on a different type of AI called “world models,” the Financial Times reported.

    World models are hypothetical AI systems that some AI engineers expect to develop an internal “understanding” of the physical world by learning from video and spatial data rather than text alone.

    Sounds reasonable.

    That being said, I am willing to believe that an LLM could be part of an AGI. It might well be an efficient way to incorporate a lot of knowledge about the world. Wikipedia helps provide me with a lot of knowledge, for example, though I don’t have a direct brain link to it. It’s just that I don’t expect an AGI to be an LLM.

    EDIT: Also, IIRC from past reading, Meta has separate groups aimed at near-term commercial products (and I can very much believe that there’s plenty of room for LLMs there) and at advanced AI. It’s not clear to me from the article whether he just wants more focus on advanced AI or whether he disagrees with an LLM focus in their advanced-AI group.

    I do think that if you’re a company building a lot of parallel compute capacity now, then to make a return on it, you need to take advantage of existing or quite near-future stuff, even if it’s not AGI. It doesn’t make sense to build a lot of compute capacity and then spend fifteen years banging on research before you have something to utilize that capacity.

    https://datacentremagazine.com/news/why-is-meta-investing-600bn-in-ai-data-centres

    Meta reveals US$600bn plan to build AI data centres, expand energy projects and fund local programmes through 2028

    So Meta probably can’t be doing only AGI work.