Off-and-on trying out an account over at @[email protected] due to scraping bots bogging down lemmy.today to the point of near-unusability.

  • 50 Posts
  • 3.05K Comments
Joined 2 years ago
Cake day: October 4th, 2023



  • I mean, it’s probably a good idea to have the listed requirements higher, since someone may want to use it with typical out-of-the-box desktop settings, and that’s not unreasonable. But while I haven’t looked at the Ubuntu installer in a while, I strongly suspect that it permits a minimal install, and that all the software in the Debian family is available there too, so you can do a lightweight desktop based on Ubuntu.

    My current desktop environment has sway, blueman-applet, waybar, and swaync-client running. I’m sure that you could replicate the same thing on an Ubuntu box. Sway is the big one there, at an RSS of 189MB (148MB of which is shared, probably essentially all shared libraries). That’s the basic “desktop graphical environment” memory cost.

    I use foot as a terminal (not in daemon mode, which would shrink memory further, though be less amenable to use of multiple cores). That presently has 40MB RSS, 33MB of which is shared. It’s running tmux, at 16MB RSS, 4MB of which is shared. GNU screen, which I’ve also used and could get by on, would be lighter, but it has an annoying patch that causes it to take a moment before terminating.

    Almost the only other graphical app I ever have active is Firefox, which is presently at an RSS of 887.1MB, of which 315MB is shared. That can change based on what Firefox has open, but I think that use of a web browser is pretty much the norm everywhere, and if anything, the Firefox family is probably on the lighter side in 2026 compared to the main alternative of the Chrome family.
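    For reference, figures like those can be read straight from /proc on Linux. A minimal sketch, using this shell’s own PID as a stand-in for sway, foot, or Firefox:

```shell
# Minimal sketch (Linux-only): resident and shared memory for one process,
# read from /proc. "$$" (this shell) stands in for any PID of interest,
# e.g. "$(pgrep -o sway)".
# RSS is roughly RssAnon + RssFile + RssShmem; the "shared" portion is
# approximately RssFile + RssShmem.
grep -E '^(VmRSS|RssFile|RssShmem):' "/proc/$$/status"
```

    Values are in kB; tools like ps and top report the same VmRSS figure.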

    I’m pretty sure that one could run that same setup pretty comfortably on a computer from the late 1990s, especially with SSD swap available to handle any spikes in memory usage. Firefox would feel sluggish, but if you’re talking memory usage…shrugs. I used an i3/Xorg-based variant of that setup on an eeePC with 2GB of memory, mostly as a web-browser-plus-terminal thin client to a “real machine”, just to see if I could, and did that for an extended period of time. The browser could feel sluggish on some websites, but other than that…shrugs.
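    On the swap point, how much is configured and free can also be read from /proc; a quick sketch on Linux:

```shell
# Sketch (Linux): total and free swap, straight from /proc/meminfo.
# "swapon --show" gives the same information broken down per swap device.
grep -E '^Swap(Total|Free):' /proc/meminfo
```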

    Now, if you want to be, I don’t know, playing some big 3D video game, then that is going to crank up the requirements on hardware. But that’s going to be imposed by the game. It’s not overhead from your basic graphical environment.

    I’d also be pretty confident that you could replicate that setup using the same packages on any Debian-family system, and probably on pretty much any major Linux distro with a bit of tweaking to the installed packages.
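    As a concrete starting point, a hypothetical package list for that stack on a Debian-family system; the names below are assumptions based on current Debian packaging (swaync in particular may be packaged under a different name), so verify with apt search first:

```shell
# Hypothetical: install the stack described above on Debian/Ubuntu.
# Package names are assumptions; check "apt search <name>" before relying
# on them, as they can differ by release.
sudo apt install sway waybar swaync blueman foot tmux firefox-esr
```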




  • I assume so. Here’s a video of someone floating a boat (apparently in air) in it, and then sinking it by pouring cups of sulfur hexafluoride over it:

    https://www.youtube.com/watch?v=ee2NaYRnRGo

    If it avoids diffusing into air to the degree that you can scoop it up and pour it, I’d imagine that it’d pour out of one’s lungs the same way.

    But if you just want to get most of it out of your lungs — like, you’ve been breathing it and don’t want to asphyxiate — I imagine that exhaling all the air you can and inhaling air and doing that a few times would probably do a pretty good job, the way the Mythbusters video above did with the helium.



  • I’d guess that most industrial users of helium don’t consume it and could theoretically recover it from whatever process it’s involved in rather than just releasing it.

    EDIT: Hard drives are an exception, as apparently some ship helium-filled; there, the helium actually is being consumed during manufacture.

    EDIT2: I’d also point out that in the long run, we probably do have to be more conservative with our helium supply. We get it from pockets in the earth. It’s actually not all that common; it just happens, though, that we go to a lot of effort to extract natural gas, and that happens to sometimes also come up with helium, so we get that supply. But because it’s not reactive, it doesn’t bond to anything — it stays in gas form. When we let it go, it heads to near the top of our atmosphere and eventually gets lost to solar wind. Many users who today just release it — because why not, as the natural gas people will be providing more, and it’s cheaper that way — probably will need to capture what they’re using if we want helium to continue to be available.




  • I doubt that there’s actually a substantial impact on battery cell production. Might be on rack-mountable batteries containing those cells. But setting that aside:

    Panasonic plans to expand lithium-ion cell

    Non-rechargeable AAA batteries are typically alkaline, and rechargeables are typically NiMH, not lithium-ion.

    EDIT: Looking at a handful of rack-mount lithium-ion batteries on Amazon price history using camelcamelcamel, prices are either unchanged or very slightly up. Could be Panasonic looking to get into the news, but it’s not clear to me that there’s a shortage of even rack-mount lithium-ion batteries.






  • I use “mono-9” in all my terminals, including for emacs. On my Debian trixie system, that maps to DejaVu Sans Mono in the fonts-dejavu-mono package.

    $ cat ~/.config/foot/foot.ini
    [main]
    font=mono-9
    $ fc-match mono-9
    DejaVuSansMono.ttf: "DejaVu Sans Mono" "Book"
    $ fc-list|grep DejaVuSansMono.ttf
    /usr/share/fonts/truetype/dejavu/DejaVuSansMono.ttf: DejaVu Sans Mono:style=Book
    $ dpkg -S /usr/share/fonts/truetype/dejavu/DejaVuSansMono.ttf
    fonts-dejavu-mono: /usr/share/fonts/truetype/dejavu/DejaVuSansMono.ttf
    $
    

    https://en.wikipedia.org/wiki/DejaVu_fonts

    The DejaVu fonts are a superfamily of fonts designed for broad coverage of the Unicode Universal Character Set. The fonts are derived from Bitstream Vera (sans-serif) and Bitstream Charter (serif), two fonts released by Bitstream under a free license that allowed derivative works based upon them; the Vera and Charter families were limited mainly to the characters in the Basic Latin and Latin-1 Supplement portions of Unicode, roughly equivalent to ISO/IEC 8859-15, and Bitstream’s licensing terms allowed the fonts to be expanded upon without explicit authorization.

    The full project incorporates the Bitstream Vera license, an extended MIT License, which restricts naming of modified distributions and prohibits individual sale of the typefaces, although they may be embedded within a larger commercial software package (terms also found in the later Open Font License); to the extent that the DejaVu fonts’ changes can be separated from the original Bitstream Vera and Charter fonts, these changes have been deeded to the public domain.[1]


  • ‘Toad-proofing’ farms could help stop the march of invasive pest

    invasive cane toads

    Australia’s most damaging invasive species

    https://en.wikipedia.org/wiki/Cane_toad

    Because of its voracious appetite, the cane toad has been introduced to many regions of the Pacific and the Caribbean islands as a method of agricultural pest control.

    https://en.wikipedia.org/wiki/Cane_toads_in_Australia

    Native to South and mainland Middle America, imported cane toads had been used in Puerto Rico to control sugar cane pests since 1920, and an influential 1932 research paper by Raquel Dexter showed that they largely ate beetle larvae that in turn ate sugar cane.[3] Based on her findings, they were introduced to Hawaii by Cyril Pemberton in the early 1930s, and then introduced to Australia from Hawaii in June 1935 by the Bureau of Sugar Experiment Stations, now Sugar Research Australia, in an attempt to control the native grey-backed cane beetle (Dermolepida albohirtum) and French’s beetle (Lepidiota frenchi).[4] Those beetles are native to Australia and they are detrimental to sugarcane crops, which are a major source of income for Australia.

    The sudden inundation of foreign species has led to severe breakdowns in Australian ecology, after overwhelming proliferation of a number of introduced species, for which the continent has no efficient natural predators or parasites, and which displace native species; in some cases, these species are physically destructive to habitat, as well.

    Sounds like all we need is to introduce something that preys on cane toads. That’ll do 'er!


  • There are some memory latency benefits to putting memory on a single chip, but to date, that’s largely been handled by adding cache memory to the CPU, and later adding multiple tiers of it, rather than eliminating discrete memory.

    The first personal computer I used had 4kB of main memory.

    My current desktop has a CPU with 1MB of L1 cache, 16MB of L2 cache, 128MB of L3 cache, and then the system as a whole has 128GB of discrete main memory.
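    Numbers like those can be queried from userspace; a sketch on Linux with glibc (reported values vary by machine, and some entries may be zero when the system doesn’t report them):

```shell
# Sketch (Linux/glibc): dump the cache hierarchy as the C library sees it.
# Sizes are in bytes; a 0 means the value is not reported on this system.
getconf -a | grep -E '^LEVEL[0-9]'
```

    lscpu reports the same hierarchy in a friendlier format.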

    Most of the time, the cache just does the right thing, and for software that is highly performance-sensitive, one might go use something like Valgrind’s cachegrind or something like that to profile and optimize the critical bits of software to minimize cache misses.

    I could believe that, say, one could provide on-core memory that the OS is more aware of, giving the OS more control over the tiered storage; maybe restructure the present system. But I’m more dubious that we’ll say “there’s no reason to have a tier of expandable, volatile storage off-CPU at all on desktops”.

    EDIT: That argument is mostly a technical one, but here’s another, this one from a business standpoint. I expect PC builders have a pretty substantial business reason to not want to move to SoCs. Right now, PC builders can, to some degree, use price discrimination to convert consumer surplus to producer surplus. A consumer will typically pay disproportionately more for a computer with more memory, for example, when they purchase from a given vendor. If the system is instead sized at the CPU vendor, then the CPU vendor is going to do the same thing, probably more effectively, as there’s less competition in the CPU market, and it’ll be the PC builder seeing money head over to the CPU vendor — they’ll pay a premium for high-end SoCs.

    In Apple’s case, that’s not a factor, because Apple has vertically-integrated production. They make their own CPUs. Apple’s PC builder guys aren’t concerned about Apple’s CPU guys extracting money from them. But Dell or HP or suchlike don’t manufacture their own CPUs, and thus have a business incentive to maintain a modular system. Unless, I guess, one thinks that the PC market as a whole is going to transition to a small number of vertically-integrated businesses that look like Apple, where you have one or two giant PC makers who basically own their supply chain; but I haven’t heard of anything like that happening.



  • Aside from them, discrete graphics cards are history, just as disk controllers were a few decades earlier. DIMM slots are going too. The primary storage will be built in. (The industry missed a great deal there.)

    Discrete disk controllers are still around.

    My last desktop had a PCI SATA card that I added after I exhausted all of the on-motherboard SATA ports.

    My current one has a JBOD SATA enclosure attached over USB Mass Storage.
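    Whether a given disk hangs off SATA directly or sits behind a USB bridge is visible from userspace; a sketch on Linux:

```shell
# Sketch (Linux): list block devices with their transport (sata, usb, nvme).
# Output depends entirely on the hardware present on the machine.
lsblk -o NAME,TRAN,SIZE,MODEL
```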