Nvidia RTX DLSS/Ray Tracing Discussion

It's fun to watch all the RTX 2080ti owners fall for a placebo effect to substantiate their expensive purchase.

First, Battlefield V and Metro do not fully implement ray tracing. They only use it partially, as neither game would be playable at 4k if it were fully implemented; that is why it is only viable at 1080p when fully utilized. Removing and/or reducing what the game is ray tracing doesn't change that fact.

Second, DLSS plus ray tracing at 4k is not 4k; it is a lower resolution (1440p) upscaled. The ray tracing (only partially implemented) and AA are applied to the lower-resolution image, which is then upscaled (I originally thought ray tracing was done after the DLSS upscaling, but realized that to gain frame rate with DLSS, the ray tracing must also be performed at the lower resolution before the upscale). That means it will never look as good as a true native 4k implementation. It's basically the same as a 4k Blu-ray player that upscales standard Blu-ray movies to 4k: they may look a bit better than the original resolution, but they will never be true 4k. (In fact, if you run a game at 1440p on a 4k monitor, doesn't the monitor already upscale it to fill the screen? I know my 1440p monitor does it for 1080p... does that mean all games are now 4k... oh wait.)
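For illustration, here is a minimal sketch (assuming the Pillow imaging library; the filenames are just examples) of what plain, non-AI upscaling does, the Blu-ray-player case above. Interpolation fills in the extra pixels by blending neighbours and creates no new detail, which is exactly the difference from rendering natively at 4k:

from PIL import Image

# Hypothetical captured 1440p frame; the filename is only an example.
frame_1440p = Image.open("frame_2560x1440.png")

# Bicubic resampling fills the larger frame by blending existing pixels.
# No new information is created, so fine detail stays at 1440p quality.
frame_4k = frame_1440p.resize((3840, 2160), Image.BICUBIC)
frame_4k.save("frame_upscaled_3840x2160.png")

DLSS swaps the fixed bicubic filter for a trained network, but the input it works from is still a 1440p render.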

As for the side-by-side comparison videos being only 1080p, that doesn't change anything. Going from 4k to 1080p in a video affects the 4k DLSS example and the native 4k example equally, so if the videos were encoded and watchable at 4k, the clarity differences would be the same as what we see in the 1080p videos; the downscaling doesn't just affect the DLSS examples.

But, hey, keep believing in the placebo effect, if it makes you happy and feel better for spending $1300 on a GPU. :D

But right now, this is just smoke and mirrors in its current form. Will it get better? Yes, at least where ray tracing is concerned. But DLSS, in its current form, is really just a fancy name for upscaling more so than AA.

Only DLSS is at lower res;

you can have RT at full 4k at 30 to 45 FPS.


It's fun to watch all the non-RTX owners with their sour grapes.

………...
Is it perfect? No. Is it better than anything AMD has now? Oh hell yes.


And my 2080 Ti runs the Division 2 beta cranked up to max at 4k well, which is the main reason I got it (to run games at 4k), not for DLSS or RTX; they are just a bonus.

Sorry, AMD has no card that will run games at 4k well. When they do, I will buy one, but until then Vega 2 is a joke, so NV is the only game in town.
 
Doesn't look worse either; it's comparable according to this video. Which means DLSS is actually doing its job: giving you a better image than 1440p by upscaling to 4k so you don't have to run RTX at 1440p natively. Why mess with a custom resolution if you can just tick a box?

Also, a big point is that you cannot use RTX with an 1800p custom resolution in Metro Exodus.

Or you know, just run it at the resolution you want to claim you do.

The entire DLSS thing, to me, is just one amazing example of how people still haven't mastered the idea of garbage in, garbage out. There is no way you are ever going to have parity between an upscaled resolution and the native resolution. There may be some trickery to go with it; you'll see some canned examples of how it is near perfect in the form of synthetic demos, but it will not be quick enough in any actual "gameplay" to be on the same level as simply running at the resolution you want to claim it is.

4k DLSS isn't 4k. It is a marketing gimmick, plain and simple.
 
I tend to look at it as another welcome choice for the gamer. 4k may be out of reach in some titles, where 4k DLSS may be appreciated by some.
 
I tend to look at it as another welcome choice for the gamer. 4k may be out of reach in some titles, where 4k DLSS may be appreciated by some.

It is like Jensen standing up there saying that a perpetual motion machine "just works" and the wallets opening up in agreement. Because when you can convince someone to consider something a new feature when it is literally worse than a feature that has existed for a while, I guess you really do have a perpetual motion machine of a company that can just keep printing money...
 
Or you know, just run it at the resolution you want to claim you do.

The entire DLSS thing, to me, is just one amazing example of how people still haven't mastered the idea of garbage in, garbage out. There is no way you are ever going to have parity between an upscaled resolution and the native resolution. There may be some trickery to go with it; you'll see some canned examples of how it is near perfect in the form of synthetic demos, but it will not be quick enough in any actual "gameplay" to be on the same level as simply running at the resolution you want to claim it is.

4k DLSS isn't 4k. It is a marketing gimmick, plain and simple.

It amazes me how people who don't own the card want to make decisions for people who do own them.

I have a choice between native 4k 30-40fps, 1440p native at 60+, or 4k DLSS at 60+. 4k DLSS looks better than 1440p to me so yeah, that is the choice EYE make.

Also, I like how we're constantly reminded that 4K DLSS isn't native 4k, as if it's some unknown revelation only now revealed by the last person who said it.
 
It is like Jensen standing up there saying that a perpetual motion machine "just works" and the wallets opening up in agreement. Because when you can convince someone to consider something a new feature when it is literally worse than a feature that has existed for a while, I guess you really do have a perpetual motion machine of a company that can just keep printing money...

Not all of us agree blindly; we want more maturity, content, quality, and compatibility from DLSS, but we do appreciate the engineering potential of the feature moving forward.
 
Not all of us agree blindly; we want more maturity, content, quality, and compatibility from DLSS, but we do appreciate the engineering potential of the feature moving forward.

It amazes me how people who don't own the card want to make decisions for people who do own them.

I have a choice between native 4k 30-40fps, 1440p native at 60+, or 4k DLSS at 60+. 4k DLSS looks better than 1440p to me so yeah, that is the choice EYE make.

Also, I like how we're constantly reminded that 4K DLSS isn't native 4k, as if it's some unknown revelation only now revealed by the last person who said it.

Therein lies the rub (for both of your points): you actually don't get to make the choice; Nvidia and the game developer decide whether you can use it or not, remember? They've got to feed it to their AI to learn and adapt, so what if the game you need it for and want to use it with isn't supported? What if the card you have isn't supported either? Is it still really a choice?

I also find it troubling to have them tout it as being faster than the native resolution, for the simple fact that they are apples and oranges. You cannot have an honest comparison between native and DLSS, imply that one is 40% faster than the other like some sites claim, and seemingly ignore all of the weird artifacts, blurry textures, odd DoF, and extra aliasing present in one but not the other. It is just another thing we're going to have to look out for when someone makes any sort of performance claim.

I'm not telling you not to use the fancy new "feature"; on the contrary, you do you (if you're allowed to)... I just worry about it further muddying the already brackish waters of fair and accurate comparisons. Also, excuse my reluctance to herald Nvidia for adding more proprietary 'features' that never seem to work out for the industry or the consumer.
 
Therein lies the rub (for both of your points): you actually don't get to make the choice; Nvidia and the game developer decide whether you can use it or not, remember? They've got to feed it to their AI to learn and adapt, so what if the game you need it for and want to use it with isn't supported? What if the card you have isn't supported either? Is it still really a choice?

Yes, it's still a choice. It's as easy as choosing a different AA method, turning it off completely, or running whatever resolution/scaling/AA combo existed in the decades prior to DLSS.



I also find it troubling to have them tout it as being faster than the native resolution, for the simple fact that they are apples and oranges. You cannot have an honest comparison between native and DLSS, imply that one is 40% faster than the other like some sites claim, and seemingly ignore all of the weird artifacts, blurry textures, odd DoF, and extra aliasing present in one but not the other. It is just another thing we're going to have to look out for when someone makes any sort of performance claim.
On the contrary, you can have an actual comparison, because the entire point of deep learning super resolution is to take a lower-resolution image and use AI to extrapolate a near-perfect higher-resolution sample. This existed long before Nvidia had anything to do with it and is used in various fields, including medical and satellite imagery. As far as your laundry list of "weird artifacts" and "DOF" etc. goes, you're only clinging to the worst-case scenario (Battlefield 5) and ignoring the success scenario (Port Royal), including implementations that have improved to the point where there is only a minor loss of detail from native (Metro Exodus).

I'm not telling you not to use the fancy new "feature"; on the contrary, you do you (if you're allowed to)... I just worry about it further muddying the already brackish waters of fair and accurate comparisons. Also, excuse my reluctance to herald Nvidia for adding more proprietary 'features' that never seem to work out for the industry or the consumer.
Then just say your pessimism is due to your negativity towards Nvidia rather than beating around the bush. You're in the same boat as NWR, who long dismissed Turing/Nvidia even before the cards were released because of a preexisting bias.

The problem is, in this case AMD is pursuing the same technology as well, so this isn't a bush you can beat around much longer. They too are working on a DLSS-style implementation that takes a lower-resolution image and reconstructs it to a higher resolution via AI. And their results will likely have the same "weird artifacts" and "odd DOF" you speak of, probably worse if they do not have the same supercomputing resources as Nvidia to dedicate to training their AI networks.
 
We already know no game is "fully ray traced"; that's impossible with current hardware. "Fully ray traced" would mean near-unlimited rays with near-unlimited light bounces.
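Some rough back-of-the-envelope arithmetic makes the point; it assumes Nvidia's marketing figure of roughly 10 billion rays per second for the 2080 Ti, which is a claimed peak rather than a measured number:

pixels_4k = 3840 * 2160      # 8,294,400 pixels per frame
fps = 60
rays_per_second = 10e9       # ~10 "gigarays"/s, Nvidia's claimed peak for the 2080 Ti (assumption)

rays_per_pixel_per_frame = rays_per_second / (pixels_4k * fps)
print(f"~{rays_per_pixel_per_frame:.0f} rays per pixel per frame")  # roughly 20

A couple dozen rays per pixel per frame is a long way from unlimited rays with unlimited bounces, which is why current games only ray trace selected effects.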

You must be referring to the performance gains in Battlefield 5. If so, the only thing they removed was the back-and-forth light bouncing from objects that didn't need to be ray traced, like leaves. That, plus further bug fixes and optimizations, resulted in much better performance without sacrificing image quality. This is old news and has been discussed in the Battlefield 5 thread numerous times already.

Also I'm glad you pointed out 4k DLSS is not native 4k. None of us knew that before.

Also, keep up that tone; I'm sure a vacation will be in your future if you do!

Why would I get a vacation? I have done nothing wrong except state a different opinion that is not in agreement with yours, so please stop with your "vacation" threats.

I was countering your "what happened to the 'it won't be playable above 1080p'" comment; you just admitted that it can't be done at 4k, as well as admitting it isn't true 4k, so my comment stands that ray tracing is only viable at 1080p when fully utilized, not in the watered-down versions that we see in Battlefield V and Metro. Also, in Battlefield 5, they reduced the amount of ray tracing being done per frame (if I remember correctly, it was more than a 50% reduction in what is being ray traced), not just the back-and-forth lighting, etc. That was their answer to optimization.
 
Only DLSS is at lower res;

you can have RT at full 4k at 30 to 45 FPS.


It's fun to watch all the non-RTX owners with their sour grapes.

………...
Is it perfect? No. Is it better than anything AMD has now? Oh hell yes.


And my 2080 Ti runs the Division 2 beta cranked up to max at 4k well, which is the main reason I got it (to run games at 4k), not for DLSS or RTX; they are just a bonus.

Sorry, AMD has no card that will run games at 4k well. When they do, I will buy one, but until then Vega 2 is a joke, so NV is the only game in town.

If it were only DLSS being done at low res and then upscaled, why has it been demonstrated that 1800p can be upscaled to 4k without DLSS, giving exactly the same performance and a better picture? There is no way you can take ray tracing with no DLSS that results in low fps, magically turn on DLSS, and magically increase the frame rate substantially. The only reasonable explanation is that both the ray tracing and DLSS are being performed at a lower resolution and then upscaled. Even in one of your past examples, when the 3DMark DLSS benchmark was released, you showed your fps without DLSS being 17 fps if I remember correctly, yet it magically increased substantially with DLSS.

Now, I may be wrong, but that is my take on it.
 
I have a choice between native 4k 30-40fps, 1440p native at 60+, or 4k DLSS at 60+. 4k DLSS looks better than 1440p to me so yeah, that is the choice EYE make.

This example right here pretty much demonstrates that 4k DLSS is just 1440p being upscaled, with both the ray tracing and DLSS being performed at 1440p before the upscale. Otherwise, you wouldn't be able to go from 30-40 fps at native 4k to 60 fps at 4k with DLSS (identical to the performance you get at 1440p native resolution). The math just doesn't add up any other way. Sorry!
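The pixel-count arithmetic behind that reasoning, as a rough sketch (it assumes per-frame cost scales with the number of pixels shaded, which real games only approximate):

native_4k = 3840 * 2160       # 8,294,400 pixels
native_1440p = 2560 * 1440    # 3,686,400 pixels

ratio = native_4k / native_1440p
print(f"4k has {ratio:.2f}x the pixels of 1440p")                        # 2.25x
print(f"35 fps at native 4k -> ~{35 * ratio:.0f} fps if shading 1440p")  # ~79 fps under ideal scaling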
 
Yes, it's still a choice. It's as easy as choosing a different AA method, turning it off completely, or running whatever resolution/scaling/AA combo existed in the decades prior to DLSS.

You also have the choice to run at 640x480.

On the contrary, you can have an actual comparison, because the entire point of deep learning super resolution is to take a lower-resolution image and use AI to extrapolate a near-perfect higher-resolution sample. This existed long before Nvidia had anything to do with it and is used in various fields, including medical and satellite imagery. As far as your laundry list of "weird artifacts" and "DOF" etc. goes, you're only clinging to the worst-case scenario (Battlefield 5) and ignoring the success scenario (Port Royal), including implementations that have improved to the point where there is only a minor loss of detail from native (Metro Exodus).

Do you understand what I meant by the term garbage in, garbage out? If a source is crappy, the output from that source is going to be crappy. No matter how much tomfoolery you want to believe can happen via the modern magic of "AI", it will never be the same as the native resolution. It is like you believe this is the same technique used on CSI and other crime shows to infinitely zoom into a digital picture and magically get a crystal-clear capture of a license plate that was only a few pixels wide in the original; you cannot simply create pixel data from nothing.

As for the demos, there are two titles that actually use this right now, plus a canned benchmark. In the two games, where you choose whatever path you want, look wherever you want, and shoot whatever you want, we've seen that DLSS results in a blurrier picture with shimmering artifacts, textures that lack detail, and aliasing. In the canned benchmark, where the developers choose the path you take, choose what you look at, choose what happens in the scene, and make sure it happens the same way every time, you see "better" performance from the AI. It is like congratulating the person who studied for a test by having a copy of the questions beforehand. If nothing changes from run to run, why would it not get better? Yet how is this in any way an accurate reflection of the technology for you, a gamer? What you want to prop up as amazing is just the masturbatory product of letting a computer look at the same sequence of images for hours upon hours and then letting it give you a result that is still inferior to the native resolution. No matter how "close" it is, it is still an unachievable result for any game where the player has any agency over the course of its strung-together images.


Then just say your pessimism is due to your negativity towards Nvidia rather than beating around the bush. You're in the same boat as NWR, who long dismissed Turing/Nvidia even before the cards were released because of a preexisting bias.

I've dismissed it as an overpriced novelty from the get-go because that is entirely what it is. There is no redeeming value in spending another $400 on top of the cost of my 1080ti to achieve marginally faster results without any of the RTX-specific stuff. Furthermore, none of the games I play need it. I'm quite happy to sit at my 4K 60FPS for all of the PC games I play. There is no need for me to upgrade. That money is better saved for Ryzen 2 in the next few months.

As far as bias goes, some of it is well earned. The whole Port Royal nonsense above just reminds me of the time they were caught in 3DMark03(?) actively occluding things to artificially lighten their load and produce higher performance that never saw an ounce of utility in the real-world games you would actually play... sounds awfully familiar.

The problem is, in this case AMD is pursuing the same technology as well, so this isn't a bush you can beat around much longer. They too are working on a DLSS-style implementation that takes a lower-resolution image and reconstructs it to a higher resolution via AI. And their results will likely have the same "weird artifacts" and "odd DOF" you speak of, probably worse if they do not have the same supercomputing resources as Nvidia to dedicate to training their AI networks.

If they do, they do; I have the choice not to use either implementation. I'd rather they spend their resources on things that actually improve my experience rather than give me a placebo-like effect to make me think I'm at a resolution I'm not, with image quality that is not as good as I should expect.
 
You also have the choice to run at 640x480.

As I said, you have the choice between native 4k and 4k DLSS. We know the strengths and benefits of each. So why are you here repeating, ad nauseam, the same "DLSS isn't native 4k" point that everyone on the planet already knows?

Do you understand what I meant by the term garbage in, garbage out? If a source is crappy, the output from that source is going to be crappy.

I'm sorry, but then you are wholly ignorant of the entire field of deep learning super resolution, which existed even before Nvidia renamed it "Deep Learning Super Sampling" for their own implementation. We had an entire discussion on this in the other thread, which you missed, with numerous examples of entirely different fields using what is essentially "DLSS" (Nvidia's term).
https://deepsense.ai/using-deep-learning-for-single-image-super-resolution/

No matter how much tomfoolery you want to believe can happen via the modern magic of "AI", it will never be the same as the native resolution.
For the last time, no one is saying that is the case. However, I suggest you do some research into deep learning super resolution, especially note the examples in that link.

It is like you believe this is the same technique used on CSI and other crime shows to infinitely zoom into a digital picture and magically get a crystal-clear capture of a license plate that was only a few pixels wide in the original; you cannot simply create pixel data from nothing.
No, no one believes this; YOU are the one trying to frame the argument that way (so you have something to argue against). Have fun with that.

As for the demos, there are two titles that actually use this right now, plus a canned benchmark. In the two games, where you choose whatever path you want, look wherever you want, and shoot whatever you want, we've seen that DLSS results in a blurrier picture with shimmering artifacts, textures that lack detail, and aliasing.
DLSS is trash in Battlefield 5 but excellent in Metro Exodus (and Final Fantasy XV). Every 2080 Ti owner here has reported there is only a minor loss of detail when using DLSS with Metro Exodus, on par with what you'd expect from an upscaled 1440p image. I don't see any shimmering, textures that lack detail, or anything "blurry". Turn on DLSS at 4k and you get clarity equivalent to 1800p (better than 1440p native) with zero aliasing and much better performance than native 4k.

You are arguing a frivolous point because you just want to argue a non-argument. Have you tested DLSS yourself? On what card? On what machine?

In the canned benchmark, where the developers choose the path you take, choose what you look at, choose what happens in the scene, and make sure it happens the same way every time, you see "better" performance from the AI. It is like congratulating the person who studied for a test by having a copy of the questions beforehand. If nothing changes from run to run, why would it not get better?
Then let's congratulate the developers of Metro for their DLSS implementation, because their post-patch results are MUCH superior to the blurry mess they had before.

Yet how is this in any way an accurate reflection of the technology for you, a gamer? What you want to prop up as amazing is just the masturbatory product of letting a computer look at the same sequence of images for hours upon hours and then letting it give you a result that is still inferior to the native resolution. No matter how "close" it is, it is still an unachievable result for any game where the player has any agency over the course of its strung-together images.
No matter how close it is? If it gets 90% of the way to a reference native 4k image, then THAT is a success. And that's pretty much what we got with the newest Metro DLSS patch, with none of that garbage you keep spewing because you haven't seen DLSS in action yourself.


I've dismissed it as an overpriced novelty from the get-go because that is entirely what it is. There is no redeeming value in spending another $400 on top of the cost of my 1080ti to achieve marginally faster results without any of the RTX-specific stuff. Furthermore, none of the games I play need it. I'm quite happy to sit at my 4K 60FPS for all of the PC games I play. There is no need for me to upgrade. That money is better saved for Ryzen 2 in the next few months.
Who cares? Take that BS elsewhere because this is not the thread for it.


As far as bias goes, some of it is well earned. The whole Port Royal nonsense above just reminds me of the time they were caught in 3DMark03(?) actively occluding things to artificially lighten their load and produce higher performance that never saw an ounce of utility in the real-world games you would actually play... sounds awfully familiar.
Yes, because ATI has never done the exact same sort of thing. :rolleyes:

The funny thing about DLSS is, it's all about image quality, so reviewers are specifically looking at image quality down to the pixel. So point out one credible source showing that Nvidia is "cheating" in Port Royal or "cheating" in some other way. Go ahead, I'll wait. Or could this be a case of the usual...

If they do, they do; I have the choice not to use either implementation. I'd rather they spend their resources on things that actually improve my experience rather than give me a placebo-like effect to make me think I'm at a resolution I'm not, with image quality that is not as good as I should expect.
Too bad; AMD is looking to have their own DLSS equivalent once DirectML is released, which will allow deep learning algorithms to run on their compute cores in DirectX 12.
 
Why would I get a vacation? I have done nothing wrong except state a different opinion that is not in agreement with yours, so please stop with your "vacation" threats.

Your baiting statements like these have been reported:

It's fun to watch all the RTX 2080ti owners fall for a placebo effect to substantiate their expensive purchase.

But, hey, keep believing in the placebo effect, if it makes you happy and feel better for spending $1300 on a GPU. :D



Essentially you dropped in with those statements with the clear intent to troll/bait.


I was countering your "what happened to the 'it won't be playable above 1080p'" comment; you just admitted that it can't be done at 4k, as well as admitting it isn't true 4k, so my comment stands that ray tracing is only viable at 1080p when fully utilized, not in the watered-down versions that we see in Battlefield V and Metro. Also, in Battlefield 5, they reduced the amount of ray tracing being done per frame (if I remember correctly, it was more than a 50% reduction in what is being ray traced), not just the back-and-forth lighting, etc. That was their answer to optimization.

All of your comments are wrong, and your issue is that you won't admit to being wrong even with facts presented directly to you. Such was the case when you spread all sorts of misinformation about deep learning itself and others had to correct you on numerous occasions.

As I said before, Battlefield 5 removed ray tracing on objects that didn't need it, like leaves. Did you see any reflections on leaves before the RTX performance patch? No, so that wasteful use of resources was eliminated. And they fixed an RTX rendering bug that allowed an excessive number of light bounces.

I feel I am repeating myself here. There are videos from Digital Foundry and Dice themselves on the specific improvements in the other Battlefield 5 thread. Go have a gander.

If all you wanted to say was that ray tracing is only really playable at 1080p when "fully utilized", then that is yet another wrong statement... if only for the fact that Battlefield 5 was fully playable at 1440p 60fps BEFORE any performance patches. Again, I feel I'm repeating myself here; this was posted in the Battlefield 5 RTX thread.
 
It is quite ironic you want me to educate myself on something you obviously fail to grasp.

Answer me this. If I take an image that is 1440p and upscale it to 4k, how do I account for the fact that there are fewer total pixels in the 1440p image? What technique is used to reach the higher pixel count?
 
This example right here pretty much demonstrates that 4k DLSS is just 1440p being upscaled, with both the ray tracing and DLSS being performed at 1440p before the upscale. Otherwise, you wouldn't be able to go from 30-40 fps at native 4k to 60 fps at 4k with DLSS (identical to the performance you get at 1440p native resolution). The math just doesn't add up any other way. Sorry!

Are you and Jay competing for the Nobel prize to see who can repeat the same thing over and over again?

We already know 4k DLSS is 1440p upscaled. The question is whether DLSS does a good enough job to warrant its use.

For Metro Exodus, YES.

For Battlefield 5, NO.

Metro Exodus gives a good, clear representation of the benefits of DLSS. It's quite funny reading the doomsday arguments about blurring, artifacts, etc. when I'm playing the game right now with everything on Ultra, including RTX, and DLSS looks as picture-perfect as can be. I'd say it's about 90% equivalent to the native 4k image, with none of the aliasing artifacts.

Don't take my word for it, though. There are other 2070/2080/2080 Ti owners here. DLSS also works great at 1440p according to acroig, since that's his native resolution.
 
Are you and Jay competing for the Nobel prize to see who can repeat the same thing over and over again?

We already know 4k DLSS is 1440p upscaled. The question is whether DLSS does a good enough job to warrant its use.

For Metro Exodus, YES.

For Battlefield 5, NO.

Metro Exodus gives a good, clear representation of the benefits of DLSS. It's quite funny reading the doomsday arguments about blurring, artifacts, etc. when I'm playing the game right now with everything on Ultra, including RTX, and DLSS looks as picture-perfect as can be. I'd say it's about 90% equivalent to the native 4k image, with none of the aliasing artifacts.

Don't take my word for it, though. There are other 2070/2080/2080 Ti owners here. DLSS also works great at 1440p according to acroig, since that's his native resolution.

This, 100% agreed.
 
It is quite ironic you want me to educate myself on something you obviously fail to grasp.

Answer me this. If I take an image that is 1440p and upscale it to 4k, how do I account for the fact that there are fewer total pixels in the 1440p image? What technique is used to reach the higher pixel count?

Did you bother reading the link I posted?

You know what happens when you take a 1440p image and upscale it to 4k in something like Photoshop? It looks like crap.

I'm 99% certain that's how you think deep learning super resolution works. And therein lies the fault in your argument.

To answer your question, answer my question....

What if you take that 1440p image and look at it side by side with a native 4k image: could you derive an algorithm that eventually comes close to the native 4k image? Because that's exactly what deep learning super resolution does... you train the AI on ground-truth images, feed it thousands of examples over and over for days or months, and eventually it learns to produce near-native-quality results from those lower-resolution images. Those "missing pixels" are extrapolated from the reference native images, something you can't do with a simple upscaling filter. If you understand this, then you can understand why Nvidia has a dedicated supercomputing cluster for training DLSS on games, and why more training time means better output. It probably also explains why it takes so long to get DLSS implemented in a game, and why they likely rushed it for Battlefield 5. They did mention the next BF5 patch will vastly improve DLSS; it's unfortunate they rushed it out prematurely.
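To make that training idea concrete, here is a minimal Python sketch of single-image super resolution (using PyTorch). The tiny three-layer network, the patch sizes, and the loop are purely illustrative; this shows the general technique described above, not Nvidia's actual DLSS network or training setup:

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySuperRes(nn.Module):
    """Upscale by 1.5x (e.g. 1440p -> 2160p) and learn a residual over plain bicubic."""
    def __init__(self, scale=1.5):
        super().__init__()
        self.scale = scale
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1),
        )

    def forward(self, low_res):
        up = F.interpolate(low_res, scale_factor=self.scale,
                           mode="bicubic", align_corners=False)
        return up + self.body(up)   # the network only has to learn the missing detail

model = TinySuperRes()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Stand-in training pair: a low-res crop and the matching "ground truth" crop
# from a native-resolution render. Real training uses huge sets of these pairs.
low_res_crop = torch.rand(1, 3, 96, 96)
ground_truth_crop = torch.rand(1, 3, 144, 144)

for step in range(1000):            # in practice: days of training on a GPU cluster
    optimizer.zero_grad()
    predicted = model(low_res_crop)
    loss = F.l1_loss(predicted, ground_truth_crop)
    loss.backward()
    optimizer.step()

After enough of this, the network reproduces detail that a plain bicubic filter cannot, which is the "extrapolated from the reference native images" part described above.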

Lastly, can you do me a favor and actually play with DLSS on and off firsthand in games like Metro Exodus/FFXV before passing judgement? I'm sorry, but 1080p YouTube videos don't cut it.
 
Curious to know more about the inner workings of the tensor cores and how they make DLSS possible. Need to learn more about this.
 