I very reluctantly put a new Mac mini on order last Sunday. I didn’t feel great about it, but I was done with Windows for a bit, at least for home use.
Plex is excellent, and even if you prefer the features or interface of Jellyfin, you should never expose any application (Plex, Jellyfin, or otherwise) directly to the Internet. This should be non-negotiable. Plex solves for external access with the mobile/desktop apps and app.plex.tv by brokering client connections into your network without requiring a NAT/PAT rule on your router or firewall.
For a music library, even a small one, tracks should have proper metadata applied to them and be stored in directories. Plex provides guidance on this here: https://support.plex.tv/articles/200265296-adding-music-media-from-folders/
My own strategy: I deviate slightly from Plex’s file and directory naming scheme, but it works perfectly. I start with high-quality music, mostly from Bandcamp, and process it through MusicBrainz Picard into ALBUMARTIST\YYYY - ALBUMNAME\01 - TRACKNAME.FLAC. Picard sets the metadata and also ensures there is an album cover image.
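The naming scheme above can be sketched as a small path-building helper. This is a minimal sketch for illustration; the function name and arguments are my own (Picard actually does this through its file-naming script), and I've used generic placeholder metadata.

```python
from pathlib import PurePosixPath

def track_path(albumartist: str, year: int, album: str,
               tracknum: int, title: str, ext: str = "flac") -> str:
    """Build ALBUMARTIST/YYYY - ALBUMNAME/NN - TRACKNAME.EXT.
    Hypothetical helper mirroring the naming scheme described above."""
    folder = f"{albumartist}/{year} - {album}"
    filename = f"{tracknum:02d} - {title}.{ext}"
    return str(PurePosixPath(folder) / filename)

print(track_path("Some Artist", 2020, "Some Album", 1, "Opening Track"))
# → Some Artist/2020 - Some Album/01 - Opening Track.flac
```

The zero-padded track number keeps tracks sorted correctly in file browsers and players that sort alphabetically.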
Before moving the organized files to my Plex server, I run them through Mp3tag and overwrite any mismatched artist names with the album artist (getting rid of artist fields like 'feat. xxxx artist'). This matters when I sync files to my iPod with MediaMonkey, since the iPod would break apart albums with multiple artists. My preference is to keep them grouped together.
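The overwrite step can be sketched roughly like this. The tag dict and function name are hypothetical, for illustration only; on real files you'd use a tagger such as Mp3tag or a tagging library, not raw dicts.

```python
def normalize_artist(tags: dict) -> dict:
    """Copy ALBUMARTIST over ARTIST so players like the iPod don't
    split one album into several 'feat. ...' pseudo-artists.
    Hypothetical sketch operating on a plain tag dict."""
    tags = dict(tags)  # don't mutate the caller's dict
    if tags.get("albumartist"):
        tags["artist"] = tags["albumartist"]
    return tags

tags = {"albumartist": "Main Artist",
        "artist": "Main Artist feat. Guest"}
print(normalize_artist(tags)["artist"])
# → Main Artist
```

The guest credit is lost from the artist field, but it usually survives in the track title, which is the trade-off being made here.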
Hope this helps, good luck 👍. Let me know if you want a decent strategy for movie backups too.
It’s public information transmitted over airwaves and several sites exist already. Flightradar24 and adsbexchange are the two I use, though Elon and Taylor Swift are far too boring to pay attention to when you can watch refuelers and jets instead.
I used it on an Android DAP to sync my music collection from my NAS after giving up on FolderSync due to its new-file detection breaking after a daylight savings time change. Syncthing was definitely more reliable, but it takes ages to do the scan.
A major rewrite is pending. He stated this on his Twitter I think.
Suyu was mismanaged and died very quickly. Sudachi was excellent and has an Android build but the dev is taking a break.
I’m going to stand by what Retro Game Corps stated: they don’t want to advocate for piracy. Lawful backup of your own games is protected, and the aggression against Russ and others is uncalled for.
Well hang tight there, the games are dope. Just the legal dicks are the scumbags.
I’ve been gradually building up my Switch digital and physical library and stuff like this makes me want to just switch back to Steam and spend my dollar elsewhere.
The strikes are absolutely frivolous, and Russ sets a great anti-piracy example for others. Backups of your own content are protected.
At one point I had been playing GTA V online pretty consistently when a cheater started targeting me. It was pretty frustrating, and after 30 minutes of that I gave up and logged off for the day. I shifted my attention to other games after that. I definitely get that they want to stop cheaters - cheaters ruin the fun for others. It’s a shame that the new anti-cheat has left Steam Deck players unable to play online.
Yeah I suppose if the convenience is worth it. To me it’s odd to rent anything - especially retro games.
For some I’m sure it might have value but I can’t imagine subscribing to a service to play retro games. Dozens of excellent retro handhelds exist with no monthly fee to fiddle with.
Link's Awakening is great. Past that, some other GB titles:
Super Mario Land 2: 6 Golden Coins is incredible.
Tetris is great, but I prefer the Rosy Retrospection hack. Maybe wait and see if ModRetro releases the officially licensed Chromatic Tetris down the road. Chromatic Tetris is already being used in competitions and it looks incredible.
Dr. Mario
You do not need to port forward to share a Plex instance over the Internet. The app.plex.tv service manages inbound connections automatically. All you need to do is manage invites to your friends. They log in to app.plex.tv with their email/password or with Google SSO, and your content will be available over a secure connection with no port forwarding.
Plex should not be accessed externally using a port forward. Always use app.plex.tv as it prevents unauthenticated users from seeing the instance.
Play services actually works very well for containerizing work apps. Better actually than on iOS. My work can offer a set of apps that are available in this isolated container and apply policy to them that doesn’t impact other areas of the phone. I can also shut off all of them with a single button when I am on PTO. Microsoft’s apps require these services to build the container, and I believe Android phones in China do not have play services. It’s not perfect, but I personally think it works very well.
The costs are definitely a huge consideration and need to be optimized. A few years back we ran a POC of OpenShift in AWS that seemed to idle at around $3k/mo with barely anything running at all. That was a bad experiment. Compare that to our new VMware bill, which more than doubled this year following the Broadcom acquisition.
The products in AWS simplify costs into an opex model unlike anything that exists on prem and eliminate costly and time-consuming hardware replacements. We just put in new load balancers recently because our previous ones were going EoL. They were a special model that ran us about a half-mil for a few HA pairs, including the pro services for installation assistance. How long will it take us to hit that amount using ALBs in AWS? What is the cost of the months it took us to select the hardware, order, wait 90 days for delivery, rack-power-connect, configure with pro services, load hundreds of certs, gather testers, and run cutover meetings? What about the time spent patching for vulnerabilities? In 5-7 years it’ll be the same thing all over again.
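A rough back-of-envelope on that "how long to hit a half-mil with ALBs" question. Every figure below except the hardware cost is an assumption I've plugged in for illustration - check current AWS pricing for real rates, and real LCU spend depends entirely on traffic.

```python
# Back-of-envelope: time for ALB opex to reach the hardware capex.
# All rates are illustrative assumptions, not quoted AWS prices.
hardware_cost = 500_000   # HA pairs + pro services (figure from the post)
alb_hourly = 0.0225       # assumed ALB base rate, USD/hr
lcu_monthly = 50          # assumed LCU usage per ALB, USD/mo
num_albs = 4              # assumed: one ALB per replaced HA pair

monthly = num_albs * (alb_hourly * 730 + lcu_monthly)
years = hardware_cost / monthly / 12
print(f"~${monthly:,.0f}/mo, ~{years:.0f} years to reach ${hardware_cost:,}")
```

Even if the assumed rates are off by several multiples, the break-even horizon stays far beyond a 5-7 year hardware refresh cycle, which is the point being made above - and the ALB figure already includes the patching, racking, and cert-loading labor that the hardware number does not.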
Now think about having to do all of the above for routers, switches, firewalls, VM infra, storage, HVAC, carrier circuits, power, fire suppression.
The cloud today is significantly different from the 2003 cPanel LAMP server. It’s a whole new landscape. Complex, highly available architectures that cannot be replicated in an on-prem environment are easily built from code in minutes on AWS.
Those capabilities come with a steep learning curve on how to operate them in a secure and effective manner, but that’s always going to be the case in this industry. The people who can grow and learn will.
The core features of a WAF do require SSL offload, which of course means the traffic is decrypted with your certificate on their edge nodes, then re-encrypted with your origin certificates. A WAF simply cannot protect against these exploits without breaking the encryption, and WAF vendors can push protections for emerging threats much faster than developers can patch.
I had never considered that Akamai or Cloudflare would be doing any deeper analytics on our data, as it would open them up to significant liability, same as I know for certain that AWS employees cannot see the data within our buckets.
As for the captcha prompts, I can’t speak to how those work in Cloudflare, though I do know that the AWS WAF does leave the sensitivity of the captcha prompts entirely up to the website owner. For free versions of CF there might be fewer configurable options.
Hah, yes that was an odd placement. It seems like a non issue though.