The FSR Thread (Announcements & Discussion)

Those are some very impressive numbers they're marketing.

I hope it's true... does anyone think it will be that close?

I would assume the example they give (Godfall) is their best one; anyone have ideas or thoughts on what the average increase will be?

I'm not familiar enough with DLSS numbers; how do these compare?

In terms of image quality, the link also claims an 'almost identical' image to the native 4K one (I assume at the ultra setting... not sure about the other options). Is that similar for DLSS?

On a scale of 1 to 10, how true do you feel this announcement is?
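For a rough upper bound on what those marketing numbers could mean: with a spatial upscaler the saving is basically just the pixel count. The mode names and per-axis scale factors below are my own guesses for illustration, not anything AMD has confirmed.

[code]
# Back-of-envelope: how many fewer pixels get shaded at each internal
# resolution when the output target is 4K. The scale factors are assumed
# per-axis values for illustration, not confirmed FSR settings.
target_w, target_h = 3840, 2160  # 4K output

assumed_modes = {
    "1.3x per axis": 1.3,
    "1.5x per axis": 1.5,
    "1.7x per axis": 1.7,
    "2.0x per axis": 2.0,
}

for name, scale in assumed_modes.items():
    render_w, render_h = target_w / scale, target_h / scale
    pixel_ratio = (target_w * target_h) / (render_w * render_h)  # equals scale**2
    print(f"{name}: render at {render_w:.0f}x{render_h:.0f}, "
          f"~{pixel_ratio:.2f}x fewer pixels shaded")
[/code]

So a 1.5x-per-axis mode shades ~2.25x fewer pixels, but the real FPS gain will be lower than that, since CPU time, fixed-cost passes, and the upscale pass itself don't shrink with resolution.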
 
[yt]eHPmkJzwOFc[/yt]

I remain skeptical. But the wide support going way back is uber sweet.
 
Just to get this straight: this is to be baked into games and will not be a "Radeon Settings" option... right?
 

I haven't read anything yet on where the control over the quality settings will be located, but given that it supports GTX and RTX cards it's a safe bet it won't be a "Radeon Settings" option.
 
[yt]9UoghWAZ_L0[/yt]


HU seems a bit sceptical that it will actually be competitive with DLSS 2.0.


So far, from AMD's own slide, it looks more like DLSS 1.0.


It's welcome, especially for non-RTX owners, but a software scaler likely won't ever give the same kind of qualitative output as an AI-trained hardware scaler can.
 

Yeah but at this point it makes RT feasible on Radeon cards without the huge FPS hit.
 
From what people are saying via leaks, it's in between DLSS 1 and DLSS 2, but way easier to implement, and with such broad support it seems like it will be a way easier sell for implementation.

I view it as the new FreeSync vs GSync, and we all saw how that went.

Sent from my GM1917 using Tapatalk
 
F.

How has that gone? GSync hardware modules are still superior to FreeSync and monitors with those modules still sell extremely well with a price mark-up.
 
Given that at this time there are around 91 actual GSync monitors vs the hundreds of FreeSync monitors, that Nvidia has had to target its marketing at "GSync Compatible" to actually stay relevant, and that HDMI 2.1 is now starting to pick up steam and pushing it further out of that market... sure, it may be the "better" solution, but it lost the war.

Sent from my GM1917 using Tapatalk
 
Many of those hundreds of monitors are literally trash though.

FreeSync Premium / Premium Pro monitors are fine, but anything predating that isn't even remotely comparable to GSync monitors in terms of quality standards.
 
Drop the analogy, and if you want to have a FreeSync/GSync debate, please create a new thread.


**Further comments on the FreeSync/GSync debate will be deleted from this thread**
 
What about Vsync, Adaptive Sync?

 
Anyway, the other news going around is that it looks better than TAA, performs better than TAA, is easier to implement than DLSS, and works on the last 5 years of GPUs from AMD and Nvidia if developers decide to use it. AND it supports both PS5 and Xbox Series S/X.


I have a feeling the new UE5 temporal reconstruction stuff is actually based on, or the foundation for, FSR.
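On the "software scaler" / "easier to implement" angle, here's a toy of what a purely spatial pass is: it only needs the finished colour buffer, no motion vectors and no per-game training, which is why it can be a single post-process step. This is just plain bilinear resampling plus a cheap sharpen to illustrate the category; it is definitely NOT AMD's actual algorithm.

[code]
# Toy spatial upscale: low-res colour buffer in, higher-res frame out, in one
# post-process step. Bilinear + a cheap unsharp mask, purely for illustration.
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Resample an (H, W, 3) float image to (out_h, out_w, 3) bilinearly."""
    h, w, _ = img.shape
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img, amount=0.25):
    """Cheap unsharp mask so edges don't look quite as soft after the resample."""
    blur = (np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0) +
            np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)) / 4.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# 720p buffer in, 1080p frame out (same idea at 1440p -> 4K), no other inputs needed.
low_res = np.random.rand(720, 1280, 3).astype(np.float32)
frame = sharpen(bilinear_upscale(low_res, 1080, 1920))
print(frame.shape)  # (1080, 1920, 3)
[/code]

The AI-trained approach (DLSS 2.x) also feeds in motion vectors and previous frames, which is where the extra integration work, and the quality gap people mention, come from.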
 
I'm pretty excited it works on GCN, because my R9 Fury is GCN and this could potentially help me wait out the current supply/demand and pricing issues with GPUs. Assuming it will work on my R9 Fury, that is. At the moment they just say as far back as the RX 500 series, but GCN never really changed much feature-wise... going to be interesting :)

*late edits!*
Oh, and if it gets implemented in anything I play, of course.
 