• affenlehrer@feddit.org · 3 days ago

    We need to go back to gopher or one of those newer simple protocols. It’s just too complex to implement HTML / CSS / JavaScript and all the other stuff correctly from scratch.
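
    For reference, the whole Gopher request/response cycle (RFC 1436) fits in a few lines. A rough Python sketch (gopher.floodgap.com is a real public server, though no promises it's up):

    ```python
    import socket

    def gopher_fetch(host, selector="", port=70, timeout=10):
        # Gopher in its entirety: open a TCP connection, send the selector
        # followed by CRLF, then read until the server closes the connection.
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(selector.encode("ascii") + b"\r\n")
            chunks = []
            while data := sock.recv(4096):
                chunks.append(data)
        return b"".join(chunks)

    # Print the start of the server's root menu (empty selector = root).
    print(gopher_fetch("gopher.floodgap.com").decode("latin-1")[:400])
    ```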

    • The_Decryptor@aussie.zone · 1 day ago

      It’s just too complex to implement HTML / CSS / JavaScript and all the other stuff correctly from scratch.

      It depends on what you’re trying to do, really. If you’re trying to keep pace with Google then yeah, it’s insurmountable (Microsoft literally couldn’t do it), but if you just want basic functionality, that’s actually rather static and unchanging.

      Though it doesn’t help when sites use JS for literally everything, and the vast majority do so incorrectly.
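
      To illustrate the “basic functionality” end: a rough sketch that pulls readable text out of a page with nothing but Python’s stdlib html.parser, no layout engine involved (the skip set is just my guess at a sensible minimum):

      ```python
      from html.parser import HTMLParser

      class TextExtractor(HTMLParser):
          SKIP = {"script", "style"}  # tags whose contents aren't prose

          def __init__(self):
              super().__init__()
              self.skip_depth = 0
              self.parts = []

          def handle_starttag(self, tag, attrs):
              if tag in self.SKIP:
                  self.skip_depth += 1

          def handle_endtag(self, tag):
              if tag in self.SKIP and self.skip_depth:
                  self.skip_depth -= 1

          def handle_data(self, data):
              # Keep text only when we're not inside a skipped tag.
              if not self.skip_depth and data.strip():
                  self.parts.append(data.strip())

      p = TextExtractor()
      p.feed("<h1>Hi</h1><script>var x=1;</script><p>Plain text wins.</p>")
      print(" ".join(p.parts))  # -> "Hi Plain text wins."
      ```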

    • rmuk@feddit.uk · 3 days ago

      I think Gopher would be an even more unworkable shit show than HTTP/HTML if it had to deal with the last thirty years of changes.

      Now, Teletext on the other hand…

    • addie@feddit.uk · 2 days ago

      I had some plans to write my own e-pub reader, since all the existing ones are shite in their own way, but because e-pub files are secretly XHTML and CSS in disguise, it’s actually a hell of a job, much bigger than I’d anticipated.
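
      The “in disguise” part is literal, by the way: an e-pub is a ZIP whose META-INF/container.xml points at an OPF manifest, which in turn lists the XHTML content documents. A rough sketch of peeking inside one (assumes a well-formed file, no error handling):

      ```python
      import zipfile
      import xml.etree.ElementTree as ET

      C_NS = {"c": "urn:oasis:names:tc:opendocument:xmlns:container"}
      OPF_NS = {"opf": "http://www.idpf.org/2007/opf"}

      def list_xhtml_docs(path):
          # An e-pub is a ZIP archive with a fixed entry point.
          with zipfile.ZipFile(path) as z:
              container = ET.fromstring(z.read("META-INF/container.xml"))
              opf_path = container.find(".//c:rootfile", C_NS).attrib["full-path"]
              package = ET.fromstring(z.read(opf_path))
              # The manifest lists every resource; content docs are XHTML.
              return [item.attrib["href"]
                      for item in package.findall(".//opf:item", OPF_NS)
                      if item.attrib.get("media-type") == "application/xhtml+xml"]
      ```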

      I don’t think making network requests for files or parsing any of those formats is so difficult, and while the actual layout rules interact in complicated ways, they’re not insurmountable. However, doing it securely, and in a way that runs at an acceptable speed, is much harder. Tokenizing JS and interpreting it isn’t so bad, but that’s not going to run a modern website with tens of thousands of lines of scripts. Displaying video with hardware acceleration? Best bust out some code.
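
      To show what I mean by “tokenizing isn’t so bad”, here’s a deliberately tiny, throwaway JS-ish lexer; nowhere near spec-complete (no regex literals, template strings, ASI, etc.):

      ```python
      import re

      TOKEN_RE = re.compile(r"""
          (?P<ws>\s+)
        | (?P<comment>//[^\n]*|/\*.*?\*/)
        | (?P<number>\d+(?:\.\d+)?)
        | (?P<string>"(?:\\.|[^"\\])*"|'(?:\\.|[^'\\])*')
        | (?P<ident>[A-Za-z_$][\w$]*)
        | (?P<punct>===|!==|==|!=|<=|>=|&&|\|\||[-+*/%=<>!&|^~?:;,.(){}\[\]])
      """, re.VERBOSE | re.DOTALL)

      def tokenize(src):
          pos = 0
          while pos < len(src):
              m = TOKEN_RE.match(src, pos)
              if not m:
                  raise SyntaxError(f"unexpected character at {pos}: {src[pos]!r}")
              pos = m.end()
              # Skip whitespace and comments; yield everything else.
              if m.lastgroup not in ("ws", "comment"):
                  yield m.lastgroup, m.group()

      print(list(tokenize("let x = 1 + 2; // trivial")))
      # -> [('ident', 'let'), ('ident', 'x'), ('punct', '='), ('number', '1'),
      #     ('punct', '+'), ('number', '2'), ('punct', ';')]
      ```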

      Moving to another protocol will either need the cooperation of everyone everywhere all at once or, since that’ll never happen, convincing all the major browser manufacturers to support both for a while so that other companies can enter the market, which will also never happen. It’s going to be a tough sell.

      • affenlehrer@feddit.org · 2 days ago

        You’re correct, but beyond implementation effort, I personally think the web has become too “free” or “rich”.

        I don’t actually like that every website has a slightly, or sometimes completely, different layout, design philosophy, tech stack, etc. Often this freedom is just used to display ads everywhere, track users, or look “on brand”, while the actual content becomes hard to find (as a user, but also if you want to find it programmatically). With WebAssembly it’s become even more opaque.

        It’s also pretty difficult to do anything dynamic (i.e. not a static website) in a secure way. Most of the common frameworks and CMSes have a ton of dependencies, and almost every one of them can negatively impact security.

        Not that I want to get rid of it completely. There are certainly a lot of websites that make good use of the freedom to create a unique and worthwhile experience, but for a very, very large part of the web (company information, blogs, wikis, forums, etc.) I’d prefer something much simpler and more to the point.

        E.g. I was personally super sad when Usenet died, and thought (especially at the beginning) that web forums were a big downgrade. Same with early web chats compared to IRC.