A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)

  • brucethemoose@lemmy.world · 1 day ago (edited)

    (Most) TVs still have a long way to go on color space and brightness, aka HDR. Not to mention saner color/calibration standards to make the picture more consistent, and ‘standard’ framerates higher than 24 FPS.

    But yeah, 8K… I dunno about that. Seems like a massive waste.

    • JigglySackles@lemmy.world · 1 day ago

      For media I mostly agree; 8K doesn’t seem to add much. For computer screens I can see the purpose, though, since it adds more screen real estate, which some of us can never get enough of. I’d love to have multiple 8K screens so I can organize and spread out my work.

      • brucethemoose@lemmy.world · 1 day ago

        Are you sure about that? You’re probably already using DPI scaling at 4K, and you’re limited by physical screen size unless you already run a 50” TV (which is equivalent to four standard 25” 1080p monitors).

        8K would only help at 65”+ or so, which is kinda crazy for a monitor on a desk… Awesome if you can swing it, but most can’t. Some napkin math below.
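        A minimal Python sketch of that napkin math (the panel sizes are just the 16:9 examples from above, nothing measured):

        ```python
        import math

        def ppi(h_px, v_px, diagonal_in):
            """Pixel density (pixels per inch) of a panel with the given resolution and diagonal."""
            return math.hypot(h_px, v_px) / diagonal_in

        print(ppi(1920, 1080, 25))   # ~88 PPI: a standard 25" 1080p monitor
        print(ppi(3840, 2160, 50))   # ~88 PPI: a 50" 4K TV -- same density, i.e.
                                     # four of those monitors in a 2x2 grid
        print(ppi(7680, 4320, 100))  # ~88 PPI: 8K at that density means a ~100" panel
        ```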


        I tangentially agree, though. PCs can put “extra” resolution to use rather easily, for things like upscaling, better text rendering and such.

        • JigglySackles@lemmy.world · 1 day ago

          Truthfully, I haven’t gotten a chance to use an 8K screen, so my statement is more of a hypothetical “I can see a possible benefit.”

          • brucethemoose@lemmy.world · 24 hours ago (edited)

            I’ve used 5K some.

            IMO the only tangible benefit is for computer-type stuff. It gives you more headroom to upscale content well, to avoid the need for anti-aliasing or blurry, scaled UI rendering, things like that. 4:1 rendering (to save power) would be quite viable too, as sketched below.
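            A trivial sketch of why 4:1 rendering works so cleanly, assuming a 16:9 8K panel:

            ```python
            # 4:1 ("quarter-res") rendering on an 8K panel: draw at 4K, then map
            # each rendered pixel onto an exact 2x2 block of physical pixels.
            panel  = (7680, 4320)  # 8K physical resolution
            render = (3840, 2160)  # 4K render target
            assert panel[0] % render[0] == 0 and panel[1] % render[1] == 0
            scale = (panel[0] // render[0], panel[1] // render[1])
            print(scale)  # (2, 2): integer scaling, so no interpolation blur at all
            ```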

            Another example would be editing workflows, for 1:1 pixel mapping of content while leaving plenty of room for the UI.

            But for native content? Like movies?

            Pointless, unless you are ridiculously close to a huge display, even if your vision is 20/20. And it’s too expensive to be worth it: I’d much rather that money go into other technical aspects.
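            That’s easy to sanity-check with the usual one-arcminute rule of thumb for 20/20 acuity (a rough Python sketch; the distances are just what fall out of that assumption, not measurements):

            ```python
            import math

            ARCMINUTE = math.radians(1 / 60)  # 20/20 vision resolves roughly 1 arcminute

            def max_useful_distance_in(h_px, v_px, diagonal_in):
                """Distance (inches) beyond which a 20/20 viewer can no longer
                tell adjacent pixels apart on the given panel."""
                pixel_pitch = diagonal_in / math.hypot(h_px, v_px)  # inches per pixel
                return pixel_pitch / math.tan(ARCMINUTE)

            print(max_useful_distance_in(3840, 2160, 65))  # ~51": about 4 ft from a 65" 4K
            print(max_useful_distance_in(7680, 4320, 65))  # ~25": about 2 ft from a 65" 8K
            ```

            In other words, to get anything out of 8K over 4K on a 65” panel, you’d have to sit within about two feet of it.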

    • SpacetimeMachine@lemmy.world · 1 day ago

      The frame rate really doesn’t need to be higher. I fully understand filmmakers who balk at the idea of 48 or 60 fps movies. It really does change the feel of them, and imo not necessarily in a positive way.

      • brucethemoose@lemmy.world · 1 day ago (edited)

        I respectfully disagree. Folks’ eyes are ‘used’ to 24p, but native 48 or 60 looks infinitely better, especially when stuff is filmed/produced with that in mind.

        But at a bare minimum, baseline TVs should eliminate judder with 24p content by default, and offer better motion clarity by moving on from LCD, with black frame insertion or whatever.
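        The judder comes from uneven frame cadence. A quick sketch (plain Python, purely illustrative) of how 24p maps onto fixed refresh rates:

        ```python
        def cadence(fps, refresh_hz):
            """How many refreshes each source frame is held for on a fixed-rate panel."""
            base, leftover = divmod(refresh_hz, fps)
            # 'leftover' frames per second must be held one extra refresh,
            # and that uneven hold pattern is what viewers perceive as judder.
            return base, leftover

        print(cadence(24, 60))   # (2, 12): the classic 3:2 pulldown -> visible judder
        print(cadence(24, 120))  # (5, 0): every frame held exactly 5 refreshes -> even motion
        ```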