Better than anything. I run through Vulkan on LM Studio because ROCm on my RX 5600 XT is a heavy pain
I like sysadmin, scripting, manga and football.
Ollama has had an issue open about the Vulkan backend for a while, but sadly it doesn’t seem to be going anywhere.
I put up some docker stats in a reply in case you missed them on the refresh
For the whole stack in the past 16 hours
# docker-compose stats
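If you want to grab a similar snapshot yourself, a one-shot docker stats works too (the format string is optional, it just trims the columns):
# point-in-time CPU/RAM snapshot for every running container
docker stats --no-stream --format "table {{.Name}}\t{{.CPUPerc}}\t{{.MemUsage}}"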
Depends on how many communities you subscribe to and how much activity they have.
I’m running my single-user instance, subscribed to 20 communities, on a 2c/4GB VPS that also hosts my Matrix server and a bunch of other stuff, and right now I mostly see CPU peaks of 5-10% and RAM at 1.5GB
I have been running it for 15 months and the docker volumes total 1.2GB. A single pg_dump of the lemmy database in plain text is 450MB
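For reference, a plain-text dump like that can be taken straight from the compose stack with something like this (service, user and database names are the usual Lemmy defaults, adjust to your setup):
# plain-text dump of the lemmy database from the postgres service
docker compose exec -T postgres pg_dump -U lemmy lemmy > lemmy_dump.sql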
Yep I also run wildcard domains for simplicity
I do the DNS challenge with Let’s Encrypt too, but to avoid leaking local DNS names into public DNS I just run a Pi-hole locally that can resolve those domains
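Since Pi-hole is dnsmasq underneath, the local names can be pinned with a small dnsmasq snippet along these lines (hostnames and IPs are made-up examples, and the exact config location depends on your Pi-hole version):
# /etc/dnsmasq.d/99-local.conf - answer these names locally only
address=/jellyfin.example.com/192.168.1.20
address=/lemmy.example.com/192.168.1.21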
What is your budget?
i5-7200U
That is Kaby Lake and seems to support up to HEVC for decoding, which might be enough for you
https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video#Hardware_decoding_and_encoding
For a bit of future proofing you might want to check out something Tiger Lake or newer, since it seems like they support AV1 decoding in hardware.
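On a box you already have, you can check what the iGPU actually exposes through VA-API with vainfo from libva-utils, e.g.:
# list supported decode/encode profiles, filtered to the interesting codecs
vainfo | grep -iE 'hevc|av1'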
Somebody who needs the dopamine of running yay -Syyyyyuuuuuuu 4 times a day wouldn’t be running broken and outdated *-bin packages but would always target *-git alternatives /s
you could say it was very basic AI
Arch Linux, Kubuntu, Supergrub, Tails, Kali, Windows
The G in LLM stands for Girlfriend
I just stop my containers and tar gzip their compose files, their volumes and the /etc folder on the host
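For the backup part, a minimal sketch of that routine could look like this (stack name and paths are placeholders, point them at wherever your compose files and volumes actually live):
# stop the stack, archive compose file + volumes + host /etc, start it again
cd /opt/mystack
docker compose down
tar czf /backups/mystack-$(date +%F).tar.gz docker-compose.yml volumes/ /etc
docker compose up -d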
Assuming you have all of them under a folder, I just run this lol
# pull every repo (and its submodules) under the current folder
for f in */; do
  echo "$f"
  git -C "$f" pull
  git -C "$f" submodule update --recursive --remote
  echo ""
  echo "#########################################################################"
  echo ""
done
For archival purposes software encoding is generally more efficient size-wise: at the same visual quality it produces smaller files than hardware encoders.
I am also waiting for an Arc to arrive to plug into my Jellyfin box.
Hardware encoding is fast, yeah, but it won’t save me disk space.
Still not sure whether I will upgrade to a 9900X or a 9700X from my 3700X
Recently I’ve been:
So going harder on a stronger CPU rather than an expensive GPU seems to be the answer for me. If I gamble on proper ROCm support for some AI workloads and fail, at least I could run some casual stuff on the CPU.
From what I understand it’s still constrained by the server to real time, so downloading a 2-hour movie would still take two hours 🥲
I have a lot of artwork I downloaded over the years that was saved as PNG files, and after converting it losslessly to AVIF I was still able to regain some space.
For videos you can’t afford lossless if you want to recover space, but visually lossless results are usually good enough with AV1
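As a rough sketch of both steps (the flags exist, but the CRF/preset values are just starting points and the filenames are placeholders):
# lossless PNG -> AVIF with avifenc (from libavif)
avifenc --lossless artwork.png artwork.avif
# software AV1 encode with SVT-AV1; lower -crf = higher quality, bigger file
ffmpeg -i input.mkv -c:v libsvtav1 -crf 28 -preset 6 -c:a copy output.mkv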