• Limonene@lemmy.world · 24 points · 9 hours ago

    It sucks that you can’t browse anywhere without javascript anymore. It used to be that all the open source sites, most news sites, forums running phpBB, even YouTube (aside from the actual <video> element) worked without javascript, and as a bonus there were no ads.

    Now, you can’t browse anywhere without these challenges. At least this one is noninvasive, but the Cloudflare one and Google reCAPTCHA do a ton of fingerprinting to decide whether to let you in.

    USPS has a home-rolled one that requires WebAssembly to be enabled, or it silently fails with a blank page. There’s no non-malicious excuse for that.

    If this is the future of web browsing, hopefully more sites use systems like Anubis. But I also hope that static pages, at least, remain viewable as plain HTML.

    • swelter_spark@reddthat.com · 2 points · 42 minutes ago

      Yeah. I just leave pages that use this crap, but it’s steadily becoming a larger and larger % of the sites I try to visit each day.

    • Kilgore Trout@feddit.it · 6 points · 8 hours ago

      > USPS has a home-rolled one that requires web assembly enabled, or it silently fails with a blank page

      I sadly found this out when trying to track the replacement display for my phone from iFixit through the Kobo web browser.

      • Limonene@lemmy.world · 12 points · 8 hours ago

        Web browsers have a huge attack surface, and are most people’s main exposure to potential exploits. Without javascript, 99% of the attack surface disappears, because the attacker no longer has a way to run arbitrary code.

        A lot of terminal-based browsers don’t do javascript.

        If I want to scrape a page, this makes it a pain for both parties. I’m not an AI company, so I can afford the hash tax, but it’s still a pain to spin up Firefox from a cron job instead of wget. And I’m still doing it; Anubis doesn’t stop small-time scrapers like me who aren’t running AI training and only scrape like one page per day. So now the server has to serve the original page, plus all the Anubis stuff, each time my cron job goes off.
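        Roughly, the crontab change looks like this (a sketch only: the URL, paths, and schedule are illustrative, and Firefox’s `--screenshot` captures a rendered snapshot rather than the raw HTML that wget would have fetched):

        ```shell
        # Before Anubis: a plain fetch, no javascript needed (URL illustrative)
        # 0 3 * * * wget -q -O /tmp/page.html https://example.com/status

        # After Anubis: headless Firefox runs the proof-of-work challenge in
        # javascript, then --screenshot saves a snapshot of the rendered page.
        0 3 * * * firefox --headless --screenshot /tmp/page.png https://example.com/status
        ```

        The wget line is a one-request fetch; the Firefox line pulls the page plus the whole challenge payload every night, which is the extra load on the server.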