RX 5700 XT vs 2060 Super: 6 months later.

But DLSS isn't always going to be there and likely won't be unless nVidia sponsors the game.

p.s. Unless a non-proprietary version of DLSS pops into existence.

I'm sure AMD will have its own version of DLSS when Big Navi launches.

Right now nv RTX/DLSS is the only game in town.
 
but where are all the 100's of games they promised :hmm:

we got less than a dozen for 1350 bucks after 18 months, only 3 or 4 of them AAA, and most are crap games with RT tacked on to sell a crap game :(



and we all know it's what delayed Cyberpunk 2077; we would be playing it in 2 days if not for RTX :p


Regardless of your thoughts on RTX, where did your $1350 end up going? :lol:




But DLSS isn't always going to be there and likely won't be unless nVidia sponsors the game.


Doesn't have to be there for every game, just the ones people would be most looking forward to. So far, it looks like a large number of AAA games due to release in 2020 that are worth the title will have DLSS 2.0 support:


Atomic Heart
Serious Sam 4: Planet Badass
Vampire: The Masquerade – Bloodlines 2
Cyberpunk 2077
Watch Dogs Legion


Those are the ones I'm personally interested in, but there are more. This week even Minecraft is getting a ray tracing and DLSS 2.0 update.


Some of those games probably won't need DLSS once the 3080 Ti drops, but for people who buy the lower-end cards like the 2060, it would be foolish to dismiss the benefits DLSS 2.0 can offer these gamers. We already know 4K DLSS 2.0 performance mode is superior to native 1440p in terms of quality; a 5700 XT simply wouldn't be able to compete performance- or IQ-wise in this area.
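For rough numbers (treating DLSS 2.0 performance mode as a half-resolution-per-axis internal render, which is my reading of the coverage so far, not an official figure), the pixel-count math looks like this:

```python
# Rough pixel-count comparison. Assumption (not an official figure):
# DLSS "performance" mode renders internally at half the output
# resolution per axis, so 4K output means a 1920x1080 internal render.
def pixels(width, height):
    return width * height

native_1440p = pixels(2560, 1440)             # what a 5700 XT shades at 1440p
dlss_internal = pixels(3840 // 2, 2160 // 2)  # 1920x1080 internal render
native_4k = pixels(3840, 2160)                # pixels actually on screen

# The card shades fewer pixels than native 1440p yet presents a 4K image.
assert dlss_internal < native_1440p < native_4k
```

That gap (about 2.1M shaded pixels vs 3.7M) is where the performance headroom comes from, with the network filling in the rest.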



p.s. Unless a non-proprietary version of DLSS pops into existence.


And who would front that non-proprietary version of DLSS so it runs on platforms other than their own? Do you think AMD is going to dedicate AI server farms out of its own pocket to release a non-proprietary version of DLSS that will work on its competitors' hardware? FYI, AMD already explored this as a concept and didn't find it worth the time and cost to implement. Unlike Nvidia, which has AI server farms to spare since AI is one of their core businesses, so they can set aside a dedicated neural network for DLSS training.
 
Regardless of your thoughts on RTX, where did your $1350 end up going? :lol:







And who would front that non-proprietary version of DLSS so it runs on platforms other than their own? Do you think AMD is going to dedicate AI server farms out of its own pocket to release a non-proprietary version of DLSS that will work on its competitors' hardware? FYI, AMD already explored this as a concept and didn't find it worth the time and cost to implement. Unlike Nvidia, which has AI server farms to spare since AI is one of their core businesses, so they can set aside a dedicated neural network for DLSS training.

Newegg & Asus and Taiwan most likely :p

and NV already dumped most of the DLSS training with 2.0

Responding to both competitive pressure and the realization of their own technology's limitations, the latest iteration of NVIDIA's upscaling technology is a rather significant overhaul of the technique. While NVIDIA is still doing AI upscaling at a basic level, DLSS 2.0 is no longer a pure upscaler; NVIDIA is now essentially combining it with temporal anti-aliasing. The results, NVIDIA is promising, are both better image quality than DLSS 1.0 and faster integration within individual games, by doing away with per-game training.

https://www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors
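To make the "combining it with temporal anti-aliasing" idea concrete, here's a toy sketch of the non-AI skeleton: jitter the low-res sample grid each frame and accumulate the samples into a high-res history buffer. Everything here (the blend factor, the static scene with no motion vectors) is my own simplification; DLSS 2.0 replaces the naive blend below with a neural network.

```python
import numpy as np

SCALE = 2  # 2x upscale per axis, e.g. 1080p -> 4K

def accumulate(history, low_res, jitter, alpha=0.5):
    """Scatter one jittered low-res frame into the high-res history."""
    h, w = low_res.shape
    jy, jx = jitter  # sub-pixel offset, expressed in high-res texels
    ys = np.arange(h) * SCALE + jy  # high-res rows these samples land on
    xs = np.arange(w) * SCALE + jx
    grid = np.ix_(ys, xs)
    # Exponential blend of the new samples into the accumulated history.
    history[grid] = (1 - alpha) * history[grid] + alpha * low_res
    return history

history = np.zeros((8, 8))   # high-res accumulation buffer
frame = np.ones((4, 4))      # one flat low-res frame (static scene)
for jitter in [(0, 0), (0, 1), (1, 0), (1, 1)]:  # cycle sub-pixel offsets
    history = accumulate(history, frame, jitter)
# After one full jitter cycle, every high-res texel has received a sample.
```

After a few frames every high-res texel has been visited, which is how detail beyond the render resolution accumulates; motion vectors (omitted here) are what let that history survive a moving camera.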
 
But it does matter, because it's a boost for specific games only, with zero guarantee it's going to be there, so it can't be relied on.

Getting a generic version of a DLSS-like feature would probably take Microsoft implementing a spec, or implementing it on the Xbox. I merely mentioned this because it's the only way you would see widespread support for such a feature.
 
Newegg & Asus and Taiwan most likely :p

and NV already dumped most of the DLSS training with 2.0



https://www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors

I hope you read that article bill:

NVIDIA is taking a different tack. Instead of relying on individual, per-game neural networks, NVIDIA has built a single generic neural network that they are optimizing the hell out of.

The single biggest change here is of course the new generic neural network. Looking to remove the expensive per-game training and the many (many) problems that non-deterministic games presented in training, NVIDIA has moved to a single generic network for all games. This newer neural network is based on a fully synthetic training set rather than individual games, which in turn is fully deterministic, allowing NVIDIA to extensively train the new network in exactly the fashion they need for it to iterate and improve over generations.



I don't know how you can interpret that as "dumping DLSS training"; in fact, it's essentially Nvidia doubling down on DLSS training, which is a good thing, because DLSS 1.0 was for all intents and purposes an overall failure.
 
But it does matter, because it's a boost for specific games only, with zero guarantee it's going to be there, so it can't be relied on.

For the games that will have it, it will extend the life of the card significantly. It can essentially be equated to a GPU upgrade if DLSS 2.0 allows 2060 owners to run games at resolutions and settings they could not otherwise run natively. It will even allow you to run RTX games with acceptable performance, which isn't even an option for 5700 XT owners. Considering the similar price of both cards, you simply cannot say a 5700 XT offers better long-term value than a 2060S when DLSS 2.0 is on the table.

Getting a generic version of a DLSS-like feature would probably take Microsoft implementing a spec, or implementing it on the Xbox. I merely mentioned this because it's the only way you would see widespread support for such a feature.

They need to do more than implement a spec. Again, who's going to run those server farms? Microsoft? AMD? Who's going to pay who to do it? I'd like to listen in on that conversation.
 
I hope you read that article bill:


i did
and putting out articles is great

NVIDIA is now essentially combining it with temporal anti-aliasing
and looking at 2.0 it is a lot more sharpening than training



but we have yet to see whether DLSS 2.0 will still be arriving in games months after everyone is done playing them
they need to fix that :hmm:
 

They need to do more than implement a spec. Again, who's going to run those server farms? Microsoft? AMD? Who's going to pay who to do it? I'd like to listen in on that conversation.

we will probably find out one day nvidia has been using people's RTX cards in the background to do it, like folding :p
 
For the games that will have it, it will extend the life of the card significantly. It can essentially be equated to a GPU upgrade if DLSS 2.0 allows 2060 owners to run games at resolutions and settings they could not otherwise run natively. It will even allow you to run RTX games with acceptable performance, which isn't even an option for 5700 XT owners. Considering the similar price of both cards, you simply cannot say a 5700 XT offers better long-term value than a 2060S when DLSS 2.0 is on the table.

and yet, the generally accepted advice going around is a 5700XT over a 2060 Super *shrug*

They need to do more than implement a spec. Again, who's going to run those server farms? Microsoft? AMD? Who's going to pay who to do it? I'd like to listen in on that conversation.


:nuts: who would invest in AI server farms... wth type of question is that? Microsoft... Amazon... Google... it's just a server farm set up to train an AI model. That doesn't require nVidia at all. If there were a spec, anyone could do it and run the AI model in shaders.

At the moment nVidia is just throwing money at it and implementing it for the devs. It's PhysX and GameWorks all over again.
 
i did
and putting out articles is great


and looking at 2.0 it is a lot more sharpening than training



but we have yet to see whether DLSS 2.0 will still be arriving in games months after everyone is done playing them
they need to fix that :hmm:


The sharpening component is only there to offset the temporal element. The AI upscaling is still there, which is evident in all the DLSS 2.0 overviews so far. You can't sharpen a 1440p image up to 4K; the AI upscaler is still doing the bulk of the work. You can see this in the examples where 4K DLSS quality mode often gives better texture detail than native 4K. This is because the AI neural network is still being fed 16K images to ensure little to no loss during the upscaling process. (Note, this was also explained in the article you referenced.)
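To put the sharpening-vs-upscaling distinction in concrete terms, here's a toy comparison (nothing DLSS-specific; both filters are my own minimal stand-ins for the two categories being argued about):

```python
import numpy as np

def unsharp_mask(img, amount=1.0):
    """Sharpen: add back the difference from a 3x3 box blur.
    Same resolution in, same resolution out -- no new pixels."""
    padded = np.pad(img, 1, mode="edge")
    blur = sum(padded[y:y + img.shape[0], x:x + img.shape[1]]
               for y in range(3) for x in range(3)) / 9.0
    return img + amount * (img - blur)

def nearest_upscale(img, scale=2):
    """Upscale: repeat pixels. New pixels exist, but no new detail --
    an AI upscaler's job is to predict plausible detail instead."""
    return np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

img = np.random.rand(4, 4)
assert unsharp_mask(img).shape == (4, 4)     # sharpening stays 4x4
assert nearest_upscale(img).shape == (8, 8)  # upscaling produces 8x8
```

Sharpening alone can never turn the 4x4 into an 8x8; some upscaler has to invent the missing pixels, and the argument is only about how smart that upscaler is.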
 
and yet, the generally accepted advice going around is a 5700XT over a 2060 Super *shrug*

Those same sources giving that advice are also saying
"Many of you probably think long term the 5700XT will age much better than the 2060 Super, that might not be the case if DLSS 2.0 game support dramatically improves in the future"

You can't take one quote and disregard the other. So now a 5700XT is a better value than a 2060S, but in the future that will not be the case if DLSS 2.0 support in games becomes more widespread. See the difference?

:nuts: who would invest in AI server farms... wth type of question is that? Microsoft... Amazon... Google.... it's just a server farm setup to train an AI model. That doesn't require nVidia at all. If there was a spec, anyone could do it and run the AI model in shaders.
If it's so easy, why did AMD abandon it in favor of RIS? Running a supercomputing cluster isn't cheap or easy. Are Amazon and Google going to offer DLSS services now? :lol:

No, you mentioned consoles. It's going to be either Microsoft or AMD that has to front the cost and services. If you think Google or Amazon is somehow going to get involved just because they have server farms, then you do not understand the technology. Nvidia already has a strong foothold in the AI neural network market, which is why they can throw their weight around with pet projects like DLSS. You know, there is talk of Nvidia bringing this to Nintendo for the Switch Pro. There is nothing from either AMD or Microsoft, not even an interest in bringing anything similar to the PS5/XboxX; instead they are relying on existing brute force or inferior methods like checkerboard rendering.

At the moment nVidia is just throwing money at it to let them implement it for the devs. It's PhysX and GameWorks all over again.
PhysX is integrated in nearly every game engine today, in software. That's not a good reference. GameWorks did require sponsorship from Nvidia and dev work; the difference here is that with DLSS 2.0 the bulk of the work is being done by Nvidia, not the devs, so in theory DLSS 2.0 should be easier to implement and thus build a larger library of support (which we're seeing now).
 
Those same sources giving that advice are also saying
"Many of you probably think long term the 5700XT will age much better than the 2060 Super, that might not be the case if DLSS 2.0 game support dramatically improves in the future"

You can't take one quote and disregard the other. So now a 5700XT is a better value than a 2060S, but in the future that will not be the case if DLSS 2.0 support in games becomes more widespread. See the difference?

I'm not saying the feature is useless. But you can't say a card is more future-proof because a handful of games will run faster if it has a vendor-locked feature.

Maybe Mantle would be a better comparison? Till DLSS can be enabled without the need for a game to implement it, its usefulness is limited.

If it's so easy, why did AMD abandon it in favor of RIS? Running a supercomputing cluster isn't cheap or easy. Are Amazon and Google going to offer DLSS services now? :lol:

No, you mentioned consoles. It's going to be either Microsoft or AMD that has to front the cost and services. If you think Google or Amazon is somehow going to get involved just because they have server farms, then you do not understand the technology. Nvidia already has a strong foothold in the AI neural network market, which is why they can throw their weight around with pet projects like DLSS. You know, there is talk of Nvidia bringing this to Nintendo for the Switch Pro. There is nothing from either AMD or Microsoft, not even an interest in bringing anything similar to the PS5/XboxX; instead they are relying on existing brute force or inferior methods like checkerboard rendering.

Who said AMD did RIS because they abandoned a method like DLSS?

It's an AI model trained to reconstruct the image based on a higher-quality reference. It's not easy, but it doesn't require anything nVidia-only. It just made a lot of sense for nVidia to leverage a tech they are already investing in.

PhysX is integrated in nearly every game engine today, in software. That's not a good reference. GameWorks did require sponsorship from Nvidia and dev work; the difference here is that with DLSS 2.0 the bulk of the work is being done by Nvidia, not the devs, so in theory DLSS 2.0 should be easier to implement and thus build a larger library of support (which we're seeing now).

Yes... because its software-based physics engine is pretty good. But the hardware-accelerated implementation died because nVidia locked it down.
 
I'm not saying the feature is useless. But you can't say a card is more future-proof because a handful of games will run faster if it has a vendor-locked feature.

Until AMD comes out with an RT card this is the only game in town, so for the foreseeable future (since we have no idea when Big Navi will come out, thanks to Covid) this is what we have.
 
Until AMD comes out with an RT card this is the only game in town, so for the foreseeable future (since we have no idea when Big Navi will come out, thanks to Covid) this is what we have.

RT is not DLSS. And if the 2060 series struggles to do RT without DLSS now, then holding up DLSS as the saving grace for the 2060's future is foolish, as it requires a game to implement DLSS.
 
RT is not DLSS. And if the 2060 series struggles to do RT without DLSS now, then holding up DLSS as the saving grace for the 2060's future is foolish, as it requires a game to implement DLSS.

Ok, we're going around in circles.
 
Sure are a lot of IFs being argued about here. Right now, the only arguments that hold any real meaning are those not based on IFs, which means that right now the 5700 XT is the best buy, because it doesn't rely on IFs to achieve that.


Now if the IFs materialize 6 months or a year from now, then there might be something to argue about, and the 5700 XT might not be the best buy at that time. However, by the time any of those IFs materialize, which depends heavily on enough games being released that support these features, there will be new cards out that have replaced both the 5700 XT and the 2060 Super, so in all reality there is no argument, even based on IFs.
 
Until AMD comes out with an RT card this is the only game in town, so for the foreseeable future (since we have no idea when Big Navi will come out, thanks to Covid) this is what we have.

till a new AAA game like Cyberpunk 2077 comes out there is not one :p

and is anyone taking bets it hits without DLSS 2.0 :hmm:
 
nVidia sponsored title, so I bet it will.

if so i think it will be the first to release with both RT and DLSS working day one



but i pre ordered it when the release date was April 16

then they said they would do RTX and a few weeks later pushed it to September 17

i'm now waiting on the virus delay till next year


i would rather have had it this Thursday without RTX
 