I still don’t get why Strava activities are public by default and why they do not make their users aware of it. I remember having to rummage through the settings to make activities private by default.
If you want an experience similar to Arc without the AI nonsense, there is Zen Browser, a Firefox fork with vertical tabs, profiles, and a side panel.
I’m not familiar with Nextcloud, but from reading the “How to use this?” section of the README I believe you can run it behind a reverse proxy:
--publish 80:80
This means that port 80 of the container should get published on the host using port 80. It is used for getting valid certificates for the AIO interface if you want to use port 8443. It is not needed if you run AIO behind a web server or reverse proxy and can get removed in that case as you can simply use port 8080 for the AIO interface then.
(Emphasis mine, in “Explanation of the command”)
My understanding is that you only have to forward traffic from the reverse proxy to port 8080. It uses a self-signed certificate though, so you might want to check whether the reverse proxy you are using verifies certificate signatures for upstream servers.
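With Caddy, for instance, you can tell it to skip upstream certificate verification. Just a sketch, assuming the AIO interface is reachable on localhost:8080 and using a made-up hostname:

```
# Hypothetical Caddyfile snippet: proxy to the AIO interface on port 8080,
# which serves a self-signed certificate, so skip upstream verification.
nextcloud-aio.example.tld {
    reverse_proxy https://localhost:8080 {
        transport http {
            tls_insecure_skip_verify
        }
    }
}
```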
It is possible; what you’re looking for is a reverse proxy: an HTTP server that listens on the standard HTTP and HTTPS ports and forwards traffic to the right service based on the domain name or URL.
In your case, every subdomain would point to your VPS’s IP, and traffic for mastodon.example.tld will be seamlessly proxied to your Mastodon container.
Do some research on Caddy or Nginx, and I strongly recommend learning Docker Compose and Docker networking: it will make everything much easier to maintain.
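For illustration, a minimal Caddyfile sketch; container names and ports here are hypothetical, and this assumes Caddy sits on the same Docker network as the services:

```
# Route each subdomain to its own container; Caddy also handles the HTTPS certificates.
mastodon.example.tld {
    reverse_proxy mastodon-web:3000
}

blog.example.tld {
    reverse_proxy blog:8080
}
```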
PS: a CNAME pointing to an A record is the way to go. You can go one better with a wildcard CNAME entry for *.example.tld, so that you don’t have to create a new entry for every service you deploy, but make sure your reverse proxy won’t forward requests to an unexpected container when someone hits a bogus subdomain.
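Something along these lines in your DNS zone (the IP is just a placeholder):

```
example.tld.    IN  A      203.0.113.10   ; your VPS
*.example.tld.  IN  CNAME  example.tld.   ; any subdomain resolves to the same host
```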
Parents, maybe? They are usually so concerned about children’s safety, whether that’s their kids or someone else’s.
I already did back when Microsoft announced they would drop WMR, but it was (and still is) pretty experimental, with no controller support and 6DoF requiring external tracking.
to keep Copilot off your desktop or learn Linux
For me it’s one year to keep Windows Mixed Reality working. I’m still miffed that they pulled the plug with no alternative other than putting my headset in the bin and getting a new one…
It will end up being analogous to Uber and Lyft, and neither helps reduce the number of cars on the road.
Also it doesn’t respect robots.txt (the file that tells bots whether or not a given page can be accessed), unlike most AI scraping bots.
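For reference, robots.txt is just a plain-text file at the site root; a minimal example (the bot name and paths are made up):

```
# Ask one particular crawler to stay away entirely…
User-agent: ExampleBot
Disallow: /

# …and let everyone else index everything except one directory.
User-agent: *
Disallow: /private/
```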
If you go for RAID, I would advise software RAID rather than hardware (i.e. provided by your motherboard or a dedicated card). Hardware RAID locks you to that particular motherboard or RAID card, which is an additional hurdle when upgrading or replacing it.
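As a sketch of what software RAID looks like on Linux with mdadm (device names are hypothetical, double-check everything before running anything like this):

```sh
# Create a RAID 1 mirror from two spare disks and put a filesystem on it
sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc
sudo mkfs.ext4 /dev/md0
# The array is managed by the kernel, so it keeps working if you swap the motherboard
cat /proc/mdstat
```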
I see Amazon is trying something else for their 2024 attrition strategy.
RSS/Atom has to be the best thing to come out of XML.
It has for a few versions now, and even before that there was at least one extension adding this feature.
I feel like not everyone is conscious of these biases. We need to raise awareness and try to prevent, for example, HR departments from buying AI-based screening software with a strong bias that isn’t disclosed by the vendor (because why would you advertise that?)
Seems like not a bias by AI models themselves, rather a reflection of the source material.
That’s what is usually meant by AI bias: a bias in the material used to train the model that is reflected in its behavior.
Earlier this year, researchers from security firm Avast spotted a newer FudModule variant that bypassed key Windows defenses such as Endpoint Detection and Response, and Protected Process Light. Microsoft took six months after Avast privately reported the vulnerability to fix it, a delay that allowed Lazarus to continue exploiting it.
Dammit Microsoft, you only had one job!
There are multiple causes of its demise.
The big one was security (or lack thereof), as attackers would abuse plug-ins through NPAPI. I remember a time when every month brought new 0-days exploiting a vulnerability in Flash.
The second one, in my opinion, is the desire to standardize features in the browser. For example, reading DRM-protected content required Silverlight, which wasn’t supported on Linux. Most interactive games and some websites required Flash, which had terrible performance issues. So it felt natural to provide these features directly in the browser, without lock-in.
Which leads to your second question: I don’t think we will ever see a return to NPAPI or something similar. The browser ecosystem is vibrant and the W3C is keen to standardize newly needed features. The first example that comes to mind is WebAuthn: it has been integrated directly into browsers, whereas 10 years ago it would have been shipped as an NPAPI plug-in.
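To give an idea, this is roughly what registering a credential through the WebAuthn browser API looks like today, no plug-in required; in a real app the challenge and user details would come from your server (the values below are placeholders):

```js
const credential = await navigator.credentials.create({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // normally server-provided
    rp: { name: "example.tld" },                           // the relying party (your site)
    user: {
      id: new TextEncoder().encode("some-user-id"),        // placeholder identifier
      name: "user@example.tld",
      displayName: "Example User",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }],   // ES256
  },
});
```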
I forgot this gimmick… Did Google drop it on the Pixel 9?
Oh, I wasn’t aware of this since the BBC article does not mention it. Then Disney’s attempt to arbitrate based on the account terms barely holds water.
I have been contemplating moving to SearXNG for a few weeks, but I have a hard time finding out whether I can configure things like domain down-ranking/blocking or custom bangs and lenses. Does anyone know if you can do that at the user or instance level?