This is for multiplying your fps by 3x or 4x, but the added input lag, ghosting, stuttering, and other artifacts make the experience worse overall. I'd recommend sticking to Lossless Scaling at 2x, or not using frame gen at all.

  • bitjunkie@lemmy.world
    17 hours ago

    This is anecdotal, but I don't get all the frame-gen hate. I've had to tweak it a bit toward the quality setting, but everything looks normal and it's totally smooth in maxed-out Cyberpunk at 165 fps.

    • brucethemoose@lemmy.world
      16 hours ago (edited)

      If the rendered framerate is over 60fps (which I’d wager is the case for you), it probably looks great.

      Interpolation isn’t psychic; of course it’s going to look like jello when it’s “guessing” what’s between frames at a slideshow pace, especially under the constraint of low latency (no future frames to use).
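      To see why “guessing” between sparse frames smears motion, here’s a toy 1-D example (a naive per-pixel blend, purely illustrative — real frame gen uses motion estimation, not a plain blend):

      ```python
      # Toy 1-D "frames": a single bright pixel moving right.
      # Illustration only -- not how optical-flow frame gen actually works.
      prev_frame = [255, 0, 0, 0]
      next_frame = [0, 0, 0, 255]

      def lerp(a, b, t):
          # Naive per-pixel blend with no motion estimation.
          return [round((1 - t) * x + t * y) for x, y in zip(a, b)]

      mid = lerp(prev_frame, next_frame, 0.5)
      print(mid)  # [128, 0, 0, 128] -> two half-bright ghosts, not one moving pixel
      ```

      The further apart the source frames are (i.e. the lower the rendered fps), the bigger the per-frame motion and the worse this kind of ghosting gets.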

      But I do take issue with some devs (and some of Nvidia’s marketing) treating it as a crutch. It’s not going to fix 15 fps, but it’s a sane way to get from 60 to 165 smoothly. TBH it’s less wasteful than trying to hit a native 165.

      • paraphrand@lemmy.world
        15 hours ago

        Exactly. Frame gen is trash if you can only hit like 28 FPS before turning it on. But if you are already over 60, it can be fine.

        But there is always a latency penalty, which is why, if you can’t hit a playable frame rate without it, you’re just digging a deeper hole latency-wise.
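        Back-of-the-envelope version of why the hole gets deeper (my own arithmetic, not any vendor’s spec): interpolation has to hold at least one rendered frame before it can blend, so the floor on added latency is one source frame time — which grows as the base framerate drops.

        ```python
        # Minimum extra latency from buffering one source frame for interpolation.
        # Assumption: exactly one frame of buffering; real pipelines may add more.
        def added_latency_ms(base_fps: float) -> float:
            return 1000.0 / base_fps

        for fps in (28, 60, 82):
            print(f"{fps:3d} fps base -> at least {added_latency_ms(fps):.1f} ms extra")
        ```

        At a 28 fps base that floor is roughly 36 ms on top of everything else, versus about 17 ms at 60 fps — which is the whole “already over 60” argument in numbers.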