Radeon Synthetic Frames and do it driver side. That would be firing some serious shots across the bow.
Interesting idea.
The best-quality frame interpolation needs motion vectors from the game engine, so that would require game-specific integration.
However, there are VR reprojection solutions that don't require motion vectors and still get reasonably good results. Virtual Desktop's Synchronous Spacewarp doesn't use motion vectors from the game; it uses the XR2's motion-estimation hardware to generate motion vectors directly from the rendered image itself.
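To give a flavor of what image-based motion estimation looks like, here's a toy block-matching sketch in NumPy. This is not how SSW or any XR2 hardware actually works internally, just the general idea: find per-block motion vectors between two rendered frames purely from the pixels, then push each block partway along its vector to synthesize an in-between frame.

```python
import numpy as np

def estimate_motion(prev, curr, block=8, search=4):
    """Brute-force block matching: for each block of `prev`, find the
    offset (within +/-search px) that best matches `curr` by sum of
    absolute differences. Real hardware does this far more cleverly."""
    h, w = prev.shape
    vecs = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = prev[y:y + block, x:x + block].astype(int)
            # Start from "no motion" and only accept strictly better matches.
            best = np.abs(ref - curr[y:y + block, x:x + block].astype(int)).sum()
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                        continue
                    sad = np.abs(ref - curr[yy:yy + block, xx:xx + block].astype(int)).sum()
                    if sad < best:
                        best = sad
                        vecs[by, bx] = (dy, dx)
    return vecs

def synthesize(prev, vecs, t=0.5, block=8):
    """Forward-warp: push each block of `prev` a fraction t along its
    motion vector. Holes where blocks moved away stay black here;
    production reprojection blends both frames and fills those in."""
    out = np.zeros_like(prev)
    h, w = prev.shape
    for by in range(vecs.shape[0]):
        for bx in range(vecs.shape[1]):
            dy, dx = vecs[by, bx]
            y, x = by * block, bx * block
            ty, tx = y + int(round(dy * t)), x + int(round(dx * t))
            if 0 <= ty <= h - block and 0 <= tx <= w - block:
                out[ty:ty + block, tx:tx + block] = prev[y:y + block, x:x + block]
    return out

# A bright square moves 4 px right between two rendered frames;
# the synthesized midpoint frame places it 2 px right.
prev = np.zeros((32, 32), dtype=np.uint8)
curr = np.zeros((32, 32), dtype=np.uint8)
prev[8:16, 8:16] = 255
curr[8:16, 12:20] = 255
mid = synthesize(prev, estimate_motion(prev, curr), t=0.5)
```

The occlusion holes this naive forward warp leaves behind are exactly where the visual artifacts in these techniques come from.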
[yt]zH7qCsey2to[/yt]
So theoretically you could have a driver based solution. Maybe call it FSR 1.5?
Actually there are cool things you could do with something like that, if it was REALLY robust. Let's say you could create several synthetic in-between frames (instead of just one). Now you can run UT2004 at 1000 fps, and then translate that into really amazing motion blur.
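That motion-blur trick is just accumulation averaging: if you can produce many sub-frames within one display interval, averaging them approximates a real camera shutter. A toy NumPy sketch (invented numbers, no driver involved):

```python
import numpy as np

def motion_blur(subframes):
    """Average a stack of sub-frames; with enough of them this
    converges on ground-truth shutter blur."""
    return np.mean(np.stack(subframes), axis=0).astype(np.uint8)

# A square swept 16 px to the right across 8 synthetic sub-frames
# smears into a streak instead of a discrete double image.
subframes = []
for i in range(8):
    f = np.zeros((32, 64), dtype=np.uint8)
    x = 8 + 2 * i                  # square advances 2 px per sub-frame
    f[12:20, x:x + 8] = 255
    subframes.append(f)
blurred = motion_blur(subframes)
```

No column is lit in more than half the sub-frames, so the streak peaks at half brightness — the characteristic falloff of motion blur rather than a hard edge.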
If you could generate synthetic frames fast enough (say, 1ms each), you'd have a ton of flexibility to generate them at whatever intervals you want. Then no matter what cadence the game renders at, the driver could use motion estimation to synthesize frames timed exactly to the display's refresh intervals. This could accomplish a few things:
- Eliminate the need for Freesync, as you're always sending frames in lock-step with the monitor's native refresh intervals.
- Run all games at the full refresh rate of your monitor, regardless of the render framerate. 60fps? 30fps? 90fps? Doesn't matter, 'cause I'm generating synthetic frames at 144fps with perfect frame-pacing.
- Eliminate microstutters & judder.
- Reduce the perception of input latency in many circumstances.
- Eliminate tearing, as well as the need for vsync.
- Ensure perfectly smooth frame-pacing even for games that run much faster than the monitor's refresh rate, without needing a framerate limiter. Running UT2004 at 500fps is a juddery mess on a 60Hz monitor, but motion-estimated synthetic frames could solve that while yielding the absolute lowest input latency.
- Add gorgeous motion-blur to any game you want.
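The pacing side of that list is mostly arithmetic. A rough sketch (hypothetical numbers, nothing AMD-specific — and note real reprojection has to extrapolate from past frames since the next frame doesn't exist yet, whereas this interpolates for simplicity): for each display deadline, compute where it falls between the two most recent rendered frames and synthesize at that blend factor.

```python
REFRESH_HZ = 144   # assumed display cadence

def blend_factor(vsync_time, prev_render, curr_render):
    """Where does this display deadline fall between the last two
    rendered frames? 0.0 = show prev as-is, 1.0 = show curr as-is,
    anything in between = synthesize an interpolated frame."""
    span = curr_render - prev_render
    if span <= 0:
        return 1.0
    return min(1.0, max(0.0, (vsync_time - prev_render) / span))

# Game renders at 60 fps, display refreshes at 144 Hz: every single
# vsync gets a frame, perfectly paced, regardless of render cadence.
factors = [blend_factor(n / REFRESH_HZ, 0.0, 1 / 60) for n in range(3)]
```

The first three vsyncs land at blend factors 0, 5/12, and 5/6 — evenly spaced along the rendered frames' motion, which is exactly the "render cadence doesn't matter" property.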
So we're talking about totally divorcing render intervals from delivery intervals, similar to how advanced upscaling is starting to divorce render resolution from display resolution.
Of course, the more powerful your GPU, the more frames you render, and the more accurate your synthetic frames get, since the motion estimation is working across smaller and smaller intervals.
But a weaker GPU would still produce 100% smooth gameplay... perhaps with more visual artifacts and input latency... but it would still be perfectly smooth on any monitor you ran it on.