Rage3D Discussion Area

Rage3D Discussion Area (http://www.rage3d.com/board/index.php)
-   Other Graphics Cards and 3D Technologies (http://www.rage3d.com/board/forumdisplay.php?f=65)
-   -   Nvidia RTX DLSS/Ray Tracing Discussion (http://www.rage3d.com/board/showthread.php?t=34049038)

bill dennison Mar 1, 2019 11:33 AM

Quote:

Originally Posted by NWR_Midnight (Post 1338121729)
It's fun to watch all the RTX 2080ti owners fall for a placebo effect to substantiate their expensive purchase.

First, Battlefield V and Metro do not fully implement ray tracing. They only use it partially, as neither game would be playable at 4k if it were fully implemented; that is why it is only viable at 1080p when fully utilized. Removing and/or reducing what the game ray traces doesn't change that fact.

Second, DLSS and ray tracing at 4k is not 4k; it is a lower resolution (1440p) upscaled, which means the ray tracing (only partially implemented) and AA are applied to the lower-resolution image before it is upscaled (I originally thought ray tracing was done after the DLSS upscale, but realized that for DLSS to gain frame rate, the ray tracing must also be performed at the lower resolution before the upscale). This means it will never look as good as a true native 4k implementation. Basically it's the same as a 4k Blu-ray player that upscales standard Blu-ray movies to 4k: they may look a bit better than the original resolution, but it will never be true 4k. (In fact, if you run a game at 1440p on a 4k monitor, doesn't the monitor already upscale it to display it full screen? I know my 1440p monitor does it for 1080p... does that mean all games are now 4k... oh wait.)

As for the comparison videos only being 1080p, that doesn't change anything, because going from 4k to 1080p in a video affects the 4k DLSS example and the native 4k example equally. So if the videos were indeed encoded and watchable at 4k, the clarity differences noted would be identical to what we see in the 1080p videos, as both are affected equally.

But, hey, keep believing in the placebo effect, if it makes you happy and feel better for spending $1300 on a GPU. :D

But right now, this is just smoke and mirrors in its current form. Will it get better? Yes, at least where ray tracing is concerned. But DLSS, in its current form, is really just a fancy name for upscaling more so than AA.

Only DLSS is at a lower res.

You can have RT at full 4k at 30 to 45 FPS.


It's fun to watch all the non RTX owners with their sour grapes

………...
Is it perfect? No. Is it better than anything AMD has now? Oh hell yes.


And my 2080 Ti runs the Division 2 beta cranked up to max at 4k well, which is the main reason I got it (to run games at 4k), not for DLSS or RTX; they are just a bonus.

Sorry, AMD has no card that will run games at 4k well. When they do I will buy one, but till then Vega 2 is a joke, so NV is the only game in town.

acroig Mar 1, 2019 11:42 AM

Quote:

Originally Posted by Exposed (Post 1338121732)
Also I'm glad you pointed out 4k DLSS is not native 4k. None of us knew that before.

Quote:

Originally Posted by bill dennison (Post 1338121734)
It's fun to watch all the non RTX owners with their sour grapes

Sorry, AMD has no card that will run games at 4k well. When they do I will buy one, but till then Vega 2 is a joke, so NV is the only game in town.

I figured you two would respond so I stayed out. ;)

Jay20016 Mar 1, 2019 08:59 PM

Quote:

Originally Posted by Exposed (Post 1338121553)
Doesn't look worse either; it's comparable according to this video. Which means DLSS is actually doing its job: giving you a better image than 1440p by upscaling to 4k so you don't have to run RTX at native 1440p. Why mess with a custom resolution if you can just tick a box?

Also a big point is that you cannot use RTX with 1800p custom resolution with Metro Exodus.

Or you know, just run it at the resolution you want to claim you do.

The entire thing with DLSS, to me, is just one amazing example of how people still haven't mastered the idea of garbage in, garbage out. There is no way you are ever going to have parity between an upscaled resolution and native resolution. There may be some trickery to go with it; you'll see some canned examples of how it is near perfect in the form of synthetic demos, but it will not be quick enough in any actual "gameplay" to be on the same level as simply running at the resolution you want to claim it is.

4k DLSS isn't 4k. It is a marketing gimmick plain and simple.

SIrPauly Mar 1, 2019 09:35 PM

I tend to look at it as another welcome choice for the gamer. 4k may be out of reach for some titles, where 4k DLSS may be appreciated by some.

Jay20016 Mar 1, 2019 09:48 PM

Quote:

Originally Posted by SIrPauly (Post 1338121823)
I tend to look at it as another welcome choice for the gamer. 4k may be out of reach for some titles, where 4k DLSS may be appreciated by some.

It is like Jensen standing up there saying that a perpetual motion machine "just works" and the wallets opening up to agree. Because when you can convince someone to consider something a new feature when it is literally worse than a feature that has existed for a while, I guess you really do have a perpetual motion machine of a company that can just keep printing money...

Exposed Mar 1, 2019 10:15 PM

Quote:

Originally Posted by Jay20016 (Post 1338121816)
Or you know, just run it at the resolution you want to claim you do.

The entire thing with DLSS, to me, is just one amazing example of how people still haven't mastered the idea of garbage in, garbage out. There is no way you are ever going to have parity between an upscaled resolution and native resolution. There may be some trickery to go with it; you'll see some canned examples of how it is near perfect in the form of synthetic demos, but it will not be quick enough in any actual "gameplay" to be on the same level as simply running at the resolution you want to claim it is.

4k DLSS isn't 4k. It is a marketing gimmick plain and simple.

It amazes me how people who don't own the card want to make decisions for people who do own them.

I have a choice between native 4k 30-40fps, 1440p native at 60+, or 4k DLSS at 60+. 4k DLSS looks better than 1440p to me so yeah, that is the choice EYE make.

Also, I like how we're constantly reminded that 4K DLSS isn't native 4k, as if it's some unknown revelation only now revealed by the last person who said it.

SIrPauly Mar 1, 2019 10:32 PM

Quote:

Originally Posted by Jay20016 (Post 1338121827)
It is like Jensen standing up there saying that a perpetual motion machine "just works" and the wallets opening up to agree. Because when you can convince someone to consider something a new feature when it is literally worse than a feature that has existed for a while, I guess you really do have a perpetual motion machine of a company that can just keep printing money...

Not all agree blindly; some desire more maturity, content, quality, and compatibility from DLSS but do appreciate the engineering potential of the feature moving forward.

Jay20016 Mar 1, 2019 11:14 PM

Quote:

Originally Posted by SIrPauly (Post 1338121832)
Not all agree blindly; some desire more maturity, content, quality, and compatibility from DLSS but do appreciate the engineering potential of the feature moving forward.

Quote:

Originally Posted by Exposed (Post 1338121830)
It amazes me how people who don't own the card want to make decisions for people who do own them.

I have a choice between native 4k 30-40fps, 1440p native at 60+, or 4k DLSS at 60+. 4k DLSS looks better than 1440p to me so yeah, that is the choice EYE make.

Also, I like how we're constantly reminded that 4K DLSS isn't native 4k, as if it's some unknown revelation only now revealed by the last person who said it.

Therein lies the rub (for both of your points): you actually don't get to make the choice. Nvidia and the game developer decide whether you can use it or not, remember? They've got to put it through their AI to learn and adapt, so what if the game you want to use it in isn't supported? What about your card not being supported? Is it still really a choice?

I also find it troubling that they tout it as being faster than the native resolution, for the simple fact that the two are apples and oranges. You cannot have an actual comparison between native and DLSS and imply that one is 40% faster than the other, like some sites claim, while seemingly ignoring all of the weird artifacts, blurry textures, odd DoF, and extra aliasing present in one but not the other. It is just one more thing we're going to have to look for when someone makes any sort of claim about performance.

I'm not telling you not to use the fancy new "feature"; on the contrary, you do you (if you're allowed to)... I just worry about it further muddying already brackish waters when it comes to fair and accurate comparisons. Also, excuse my reluctance to herald Nvidia for adding more proprietary "features" that never seem to work out for the industry or the consumer.

Exposed Mar 2, 2019 12:12 AM

Quote:

Originally Posted by Jay20016 (Post 1338121837)
Therein lies the rub (for both of your points): you actually don't get to make the choice. Nvidia and the game developer decide whether you can use it or not, remember? They've got to put it through their AI to learn and adapt, so what if the game you want to use it in isn't supported? What about your card not being supported? Is it still really a choice?

Yes, it's still a choice. It's as easy as choosing a different AA method or turning it off completely or running whatever resolution/scaling/AA combo that existed since the decades prior to DLSS.



Quote:

I also find it troubling that they tout it as being faster than the native resolution, for the simple fact that the two are apples and oranges. You cannot have an actual comparison between native and DLSS and imply that one is 40% faster than the other, like some sites claim, while seemingly ignoring all of the weird artifacts, blurry textures, odd DoF, and extra aliasing present in one but not the other. It is just one more thing we're going to have to look for when someone makes any sort of claim about performance.
On the contrary, you can have an actual comparison, because the entire point of deep learning super resolution is to take a lower-resolution image and use AI to extrapolate a near-perfect higher-resolution sample. This existed long before Nvidia had anything to do with it and is used in various fields, including medical and satellite imagery. As for your laundry list of "weird artifacts" and "DOF" etc., you're only clinging to the worst-case scenario (Battlefield 5) and ignoring the success scenario (Port Royal), including implementations that have improved to the point where there is only a minor loss of detail versus native (Metro Exodus).

Quote:

I'm not telling you not to use the fancy new "feature"; on the contrary, you do you (if you're allowed to)... I just worry about it further muddying already brackish waters when it comes to fair and accurate comparisons. Also, excuse my reluctance to herald Nvidia for adding more proprietary "features" that never seem to work out for the industry or the consumer.
Then just say your pessimism is due to your negativity towards Nvidia rather than beating around the bush. You're in the same boat as NWR, who has long dismissed Turing/Nvidia even before the cards were released because of a preexisting bias.

Problem is, in this case AMD is pursuing the same technology as well, so this isn't a bush you can beat around much longer. They too are working on a DLSS-like implementation that takes a lower-resolution image and reconstructs it to a higher resolution via AI. And their results will likely have the same "weird artifacts" and "odd DOF" you speak of, probably worse if they do not have the same supercomputing resources as Nvidia to dedicate to their AI networks.

NWR_Midnight Mar 2, 2019 05:19 AM

Quote:

Originally Posted by Exposed (Post 1338121732)
We already know no game is "fully ray traced", that's impossible with current hardware. "Fully ray traced" would be near unlimited rays with near unlimited light bounces.

You must be referring to the performance gained by Battlefield 5. If so, the only thing they removed was the back and forth light bouncing from objects that didn't need to be ray traced, like leaves. That and further bug fixes and optimizations that resulted in much better performance without sacrificing image quality. This is old news and discussed in the Battlefield 5 thread numerous times already.

Also I'm glad you pointed out 4k DLSS is not native 4k. None of us knew that before.

Also keep up that tone, I'm sure a vacation will be in your future if you keep that up!

Why would I get a vacation? I have done nothing wrong except state a different opinion that is not in agreement with yours, so please stop with your "vacation" threats.

I was countering your "what happened to the 'it won't be playable above 1080p'" comment; you just admitted that it can't be done at 4k, as well as admitting it isn't true 4k, so my comment stands that ray tracing is only viable at 1080p when fully utilized, not in the watered-down versions we see in Battlefield V and Metro. Also, in Battlefield 5 they reduced the amount of ray tracing being done per frame (if I remember correctly, it was more than a 50% reduction in what is being ray traced), not just the back-and-forth lighting etc. That was their answer to optimization.

NWR_Midnight Mar 2, 2019 05:23 AM

Quote:

Originally Posted by bill dennison (Post 1338121734)
Only DLSS is at a lower res.

You can have RT at full 4k at 30 to 45 FPS.


It's fun to watch all the non RTX owners with their sour grapes

………...
Is it perfect? No. Is it better than anything AMD has now? Oh hell yes.


And my 2080 Ti runs the Division 2 beta cranked up to max at 4k well, which is the main reason I got it (to run games at 4k), not for DLSS or RTX; they are just a bonus.

Sorry, AMD has no card that will run games at 4k well. When they do I will buy one, but till then Vega 2 is a joke, so NV is the only game in town.

If only DLSS were being done at low res and then upscaled, why has it been demonstrated that upscaling 1800p to 4k without DLSS gives exactly the same performance and a better picture? There is no way you can take ray tracing with no DLSS that results in low fps, magically turn on DLSS, and magically increase the frame rate substantially. The only reasonable explanation is that both the ray tracing and the DLSS are performed at a lower resolution and then upscaled. Even in one of your past examples, when the 3DMark DLSS benchmark was released, you showed your fps without DLSS being... 17fps if I remember correctly, yet it magically increased substantially with DLSS.

Now, I may be wrong, but that is my take on it.
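For what it's worth, the resolution arithmetic behind this argument is easy to check. A rough sketch, assuming frame rate scales linearly with pixels shaded (real GPUs only approximate this, since geometry, CPU, and fixed per-frame costs don't scale with resolution):

```python
# Rough sanity check of the resolution math in this thread. Assumes fps
# scales linearly with pixels shaded, which real GPUs only approximate.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)      # 8,294,400 pixels
native_1440p = pixels(2560, 1440)   # 3,686,400 pixels

ratio = native_4k / native_1440p    # 2.25x the per-frame shading work
print(ratio)

# If a card does ~60 fps at 1440p, purely pixel-bound scaling predicts
# about 60 / 2.25 ~= 27 fps at native 4k -- the same ballpark as the
# 30-40 fps native-4k figures quoted in this thread.
print(round(60 / ratio, 1))
```

Which is why 4k-with-DLSS landing at roughly the native-1440p frame rate is consistent with the internal render happening at 1440p.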

NWR_Midnight Mar 2, 2019 05:45 AM

Quote:

Originally Posted by Exposed (Post 1338121830)
I have a choice between native 4k 30-40fps, 1440p native at 60+, or 4k DLSS at 60+. 4k DLSS looks better than 1440p to me so yeah, that is the choice EYE make.

This example right here pretty much demonstrates that 4k DLSS is just 1440p being upscaled, and that both the ray tracing and DLSS are performed at 1440p before the upscale. Otherwise, you wouldn't be able to go from 30-40fps at native 4k to 60fps at 4k with DLSS (identical to the performance you get at native 1440p). The math just doesn't add up any other way. Sorry!
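The ordering being argued here can be sketched in a few lines. Everything below is a hypothetical stand-in (these are not Nvidia's actual APIs, and real DLSS involves tensor cores, motion vectors, and much more); the only point is where the ray-tracing cost is paid in each path:

```python
# Sketch of the two pipeline orderings under debate. All functions are
# made-up stubs; "rt_pixels" just records how many pixels the stub
# ray-tracing pass had to cover.

def rasterize(scene, width, height):
    return {"scene": scene, "w": width, "h": height}

def ray_trace(frame):
    # record the pixel count the RT pass runs over
    return {**frame, "rt_pixels": frame["w"] * frame["h"]}

def dlss_upscale(frame, width, height):
    # learned reconstruction would go here; the stub only resizes
    return {**frame, "w": width, "h": height}

def render_native_4k(scene):
    frame = rasterize(scene, 3840, 2160)
    return ray_trace(frame)                     # RT cost paid at full 4k

def render_dlss_4k(scene):
    frame = rasterize(scene, 2560, 1440)
    frame = ray_trace(frame)                    # RT cost paid at 1440p
    return dlss_upscale(frame, 3840, 2160)      # then reconstructed to 4k

native = render_native_4k("metro")
dlss = render_dlss_4k("metro")
print(native["rt_pixels"] / dlss["rt_pixels"])  # 2.25x fewer pixels ray traced
```

Both paths output a 3840x2160 frame, but the DLSS path ray traces 2.25x fewer pixels, which is where the frame-rate headroom comes from.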

Jay20016 Mar 2, 2019 06:32 AM

Quote:

Originally Posted by Exposed (Post 1338121843)
Yes, it's still a choice. It's as easy as choosing a different AA method or turning it off completely or running whatever resolution/scaling/AA combo that existed since the decades prior to DLSS.

You also have the choice to run at 640x480.

Quote:

Originally Posted by Exposed (Post 1338121843)
On the contrary, you can have an actual comparison, because the entire point of deep learning super resolution is to take a lower-resolution image and use AI to extrapolate a near-perfect higher-resolution sample. This existed long before Nvidia had anything to do with it and is used in various fields, including medical and satellite imagery. As for your laundry list of "weird artifacts" and "DOF" etc., you're only clinging to the worst-case scenario (Battlefield 5) and ignoring the success scenario (Port Royal), including implementations that have improved to the point where there is only a minor loss of detail versus native (Metro Exodus).

Do you understand what I meant by the term garbage in, garbage out? If a source is crappy, the output from that source is going to be crappy. No matter how much tomfoolery you want to believe can happen via the modern magic of "AI", it will never be the same as the native resolution. It is like you believe this to be the same technique used on CSI and other crime shows to infinitely zoom into a digital picture and magically get a crystal-clear capture of a license plate that was only a few pixels wide in the original; you cannot simply create pixel data from nothing.

As for the demos, there are two titles that actually use this right now, plus a canned benchmark. In the two games, where you choose whatever path you want, look wherever you want, and shoot whatever you want, we've seen that DLSS results in a blurrier picture with shimmering artifacts, textures that lack detail, and aliasing. In the canned benchmark, where the developers choose the path you take, what you look at, and what happens in the scene, and make sure it happens the same way every time, you see "better" performance from the AI. It is like congratulating the person who aced a test by having a copy of the questions beforehand. If nothing changes from run to run, why would it not get better? Yet how is this in any way an accurate reflection of the technology for you, a gamer? What you want to prop up as amazing is just the masturbatory product of letting a computer look at the same sequence of images for hours upon hours and then letting it give you a result that is still inferior to the native resolution. No matter how "close" it is, it is still an unachievable result for any game where the player has any agency in the course of its strung-together images.


Quote:

Originally Posted by Exposed (Post 1338121843)
Then just say your pessimism is due to your negativity towards Nvidia rather than beating around the bush. You're in the same boat as NWR, who has long dismissed Turing/Nvidia even before the cards were released because of a preexisting bias.

I've dismissed it as an overpriced novelty from the get go because that is entirely what it is. There is no redeeming value to spending another $400 on top of the cost of my 1080ti to achieve marginally faster results without any of the RTX specific stuff. Furthermore, none of the games I play need it. I'm quite happy to sit at my 4K 60FPS for all of the PC games I play. There is no need for me to upgrade. That money is better served waiting for Ryzen 2 in the next few months.

As far as bias goes, some of it is well earned. The whole Port Royal nonsense above just reminds me of the time they were caught in 3DMark03(?) actively occluding things to artificially lighten their load for higher scores that never saw an ounce of utility in the real-world games you would actually play... sounds awfully familiar.

Quote:

Originally Posted by Exposed (Post 1338121843)
Problem is, in this case AMD is pursuing the same technology as well, so this isn't a bush you can beat around much longer. They too are working on a DLSS-like implementation that takes a lower-resolution image and reconstructs it to a higher resolution via AI. And their results will likely have the same "weird artifacts" and "odd DOF" you speak of, probably worse if they do not have the same supercomputing resources as Nvidia to dedicate to their AI networks.

If they do, they do, I have the choice to not use either implementation. I'd rather they spend their resources on things that actually improve my experience rather than give me a placebo like effect to think that I'm at a resolution I'm not at, with image quality that is not as good as I should expect.

Exposed Mar 2, 2019 08:24 AM

Quote:

Originally Posted by Jay20016 (Post 1338121872)
You also have the choice to run at 640x480.

As I said, you have the choice between native 4k and 4k DLSS. We know the strengths and benefits of each. So why are you here repeating, ad nauseam, that DLSS isn't native 4k, something everyone on the planet already knows?

Quote:


Do you understand what I meant by the term garbage in, garbage out? If a source is crappy, the output from that source is going to be crappy.
I'm sorry, but then you are wholly ignorant of the entire field of deep learning super resolution, which existed even before Nvidia renamed it "Deep Learning Super Sampling" for their own implementation. We had an entire discussion on this in the other thread, which you missed, with numerous examples from entirely different fields using what is essentially "DLSS" (Nvidia's term).
https://deepsense.ai/using-deep-lear...er-resolution/

Quote:

No matter how much tomfoolery you want to believe can happen via the modern magic of "AI", it will never be the same as the native resolution.
For the last time, no one is saying that is the case. However, I suggest you do some research into deep learning super resolution, especially note the examples in that link.

Quote:

It is like you believe this to be the same technique used on CSI and other crime shows to infinitely zoom into a digital picture and magically get a crystal-clear capture of a license plate that was only a few pixels wide in the original; you cannot simply create pixel data from nothing.
No, no one believes this, YOU are the one trying to portray this argument (and thus argue the point). Have fun with that.

Quote:

As for the demos, there are two titles that actually use this right now, plus a canned benchmark. In the two games, where you choose whatever path you want, look wherever you want, and shoot whatever you want, we've seen that DLSS results in a blurrier picture with shimmering artifacts, textures that lack detail, and aliasing.
DLSS is trash in Battlefield 5 but excellent in Metro Exodus (and Final Fantasy XV). Every 2080 Ti owner here has reported there is only a minor loss of detail when using DLSS with Metro Exodus, on par with what you'd expect from an upscaled 1440p image. I don't see any shimmering, textures lacking detail, or anything "blurry". Turn on DLSS at 4k and you get clarity equivalent to 1800p (better than native 1440p) with zero aliasing and much better performance than native 4k.

You are arguing a frivolous point because you just want to argue a non argument. Have you tested DLSS yourself? On what card? On what machine?

Quote:


In the canned benchmark, where the developers choose the path you take, what you look at, and what happens in the scene, and make sure it happens the same way every time, you see "better" performance from the AI. It is like congratulating the person who aced a test by having a copy of the questions beforehand. If nothing changes from run to run, why would it not get better?
Then let's congratulate the developers of Metro for their DLSS implementation, because their post-patch results are MUCH superior to the blurry mess they had before.

Quote:

Yet how is this in any way an accurate reflection of the technology for you, a gamer? What you want to prop up as amazing is just the masturbatory product of letting a computer look at the same sequence of images for hours upon hours and then letting it give you a result that is still inferior to the native resolution. No matter how "close" it is, it is still an unachievable result for any game where the player has any agency in the course of its strung-together images.
No matter how close it is? If it gets 90% of the way to a reference native 4k image, then THAT is a success. And that's pretty much what we got with the newest Metro DLSS patch, with none of that garbage you keep spewing about because you haven't seen DLSS in action yourself.


Quote:


I've dismissed it as an overpriced novelty from the get go because that is entirely what it is. There is no redeeming value to spending another $400 on top of the cost of my 1080ti to achieve marginally faster results without any of the RTX specific stuff. Furthermore, none of the games I play need it. I'm quite happy to sit at my 4K 60FPS for all of the PC games I play. There is no need for me to upgrade. That money is better served waiting for Ryzen 2 in the next few months.
Who cares? Take that BS elsewhere because this is not the thread for it.


Quote:

As far as bias goes, some of it is well earned. The whole Port Royal nonsense above just reminds me of the time they were caught in 3DMark03(?) actively occluding things to artificially lighten their load for higher scores that never saw an ounce of utility in the real-world games you would actually play... sounds awfully familiar.
Yes, because ATI has never done the exact same sort of thing. :rolleyes:

The funny thing about DLSS is, it's all about image quality, so reviewers are specifically looking at image quality down to the pixel. So point out one credible source showing Nvidia is "cheating" in Port Royal or "cheating" in some other way. Go ahead, I'll wait. Or could this be a case of the usual...

Quote:


If they do, they do, I have the choice to not use either implementation. I'd rather they spend their resources on things that actually improve my experience rather than give me a placebo like effect to think that I'm at a resolution I'm not at, with image quality that is not as good as I should expect.
Too bad; AMD is looking to have their equivalent of DLSS once DirectML is released, which will allow deep learning algorithms to run on their compute cores in DirectX 12.

Exposed Mar 2, 2019 08:43 AM

Quote:

Originally Posted by NWR_Midnight (Post 1338121859)
Why would I get a vacation? I have done nothing wrong except state a different opinion that is not in agreement with yours, so please stop with your "vacation" threats.

Your baited statements like these have been reported:

It's fun to watch all the RTX 2080ti owners fall for a placebo effect to substantiate their expensive purchase.

But, hey, keep believing in the placebo effect, if it makes you happy and feel better for spending $1300 on a GPU. :D



Essentially you dropped in with those statements with the clear intent to troll/bait.


Quote:

I was countering your "what happened to the 'it won't be playable above 1080p'" comment; you just admitted that it can't be done at 4k, as well as admitting it isn't true 4k, so my comment stands that ray tracing is only viable at 1080p when fully utilized, not in the watered-down versions we see in Battlefield V and Metro. Also, in Battlefield 5 they reduced the amount of ray tracing being done per frame (if I remember correctly, it was more than a 50% reduction in what is being ray traced), not just the back-and-forth lighting etc. That was their answer to optimization.
All of your comments are wrong, and your issue is that you won't admit being wrong even with the facts presented directly to you, such as when you had all sorts of misinformation about deep learning itself and others had to correct you on numerous occasions.

As I said before, Battlefield 5 removed ray tracing on objects that didn't need it, like leaves. Did you see any reflections on leaves before the RTX performance patch? No, so that wasteful resource was eliminated. And there was an RTX rendering bug that allowed a large number of light bounces, which they fixed.

I feel I am repeating myself here. There are videos from Digital Foundry and Dice themselves on the specific improvements in the other Battlefield 5 thread. Go have a gander.

If all you wanted to say was that ray tracing is only really playable at 1080p "fully utilized", then that is yet another wrong statement, if only because Battlefield 5 was fully playable at 1440p 60fps BEFORE any performance patches. Again, I feel I'm repeating myself here; this was posted in the Battlefield 5 RTX thread.

Jay20016 Mar 2, 2019 08:46 AM

It is quite ironic you want me to educate myself on something you obviously fail to grasp.

Answer me this: if I take an image that is 1440p and upscale it to 4k, how do I account for the fact that there are fewer total pixels in the 1440p image? What technique is used to reach the higher pixel count?
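For reference, the two textbook non-AI answers to that question, sketched in NumPy: nearest-neighbour repeats pixels and bilinear averages neighbours. Neither invents detail, which is exactly the "garbage in, garbage out" point; learned super resolution replaces these fixed rules with filters fit to training data:

```python
import numpy as np

def upscale_nearest(img, factor=2):
    # each low-res pixel is simply repeated into a factor x factor block
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def upscale_bilinear(img, factor=2):
    # each new pixel is a weighted average of its 4 nearest source pixels
    h, w = img.shape
    ys = np.clip((np.arange(h * factor) + 0.5) / factor - 0.5, 0, h - 1)
    xs = np.clip((np.arange(w * factor) + 0.5) / factor - 0.5, 0, w - 1)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

lo = np.array([[0.0, 1.0],
               [1.0, 0.0]])
print(upscale_nearest(lo))   # blocky: no new information, just bigger pixels
print(upscale_bilinear(lo))  # smoother, but still only blends what's there
```

The new pixels are pure functions of the old ones; no detail is added. The deep-learning variants instead predict plausible detail from training data, which is what the whole argument is about.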

Exposed Mar 2, 2019 08:50 AM

Quote:

Originally Posted by NWR_Midnight (Post 1338121862)
This example right here pretty much demonstrates that 4k DLSS is just 1440p being upscaled, and that both the ray tracing and DLSS are performed at 1440p before the upscale. Otherwise, you wouldn't be able to go from 30-40fps at native 4k to 60fps at 4k with DLSS (identical to the performance you get at native 1440p). The math just doesn't add up any other way. Sorry!

Are you and Jay competing for the Nobel prize to see who can repeat the same thing over and over again?

We already know 4k DLSS is 1440p upscaled. The question is whether DLSS is doing a good enough job to warrant its use.

For Metro Exodus, YES.

For Battlefield 5, NO.

Metro Exodus gives a good, clear representation of the benefits of DLSS. It's quite funny reading the doomsday arguments about blurring, artifacts, etc. when I'm playing the game right now with everything on Ultra, including RTX, and DLSS looks as picture-perfect as can be. I'd say it's about 90% equivalent to the native 4k image, with none of the aliasing artifacts.

Don't take my word for it though. There are other 2070/2080/2080Ti owners here. DLSS also works great at 1440p according to acroig since that's his native resolution.

acroig Mar 2, 2019 08:54 AM

Quote:

Originally Posted by Exposed (Post 1338121889)
Are you and Jay competing for the Nobel prize to see who can repeat the same thing over and over again?

We already know 4k DLSS is 1440p upscaled. The question is whether DLSS is doing a good enough job to warrant its use.

For Metro Exodus, YES.

For Battlefield 5, NO.

Metro Exodus gives a good, clear representation of the benefits of DLSS. It's quite funny reading the doomsday arguments about blurring, artifacts, etc. when I'm playing the game right now with everything on Ultra, including RTX, and DLSS looks as picture-perfect as can be. I'd say it's about 90% equivalent to the native 4k image, with none of the aliasing artifacts.

Don't take my word for it though. There are other 2070/2080/2080Ti owners here. DLSS also works great at 1440p according to acroig since that's his native resolution.

This, 100% agreed.

Exposed Mar 2, 2019 09:01 AM

Quote:

Originally Posted by Jay20016 (Post 1338121888)
It is quite ironic you want me to educate myself on something you obviously fail to grasp.

Answer me this: if I take an image that is 1440p and upscale it to 4k, how do I account for the fact that there are fewer total pixels in the 1440p image? What technique is used to reach the higher pixel count?

Did you bother reading the link I posted?

You know what happens when you take a 1440p image and upscale it to 4k in something like Photoshop? It looks like crap.

I'm 99% certain that's how you think deep learning super resolution works. And therein lies the flaw in your argument.

To answer your question, answer my question....

What if you took that 1440p image and looked at it side by side with a native 4k image: would you be able to derive an algorithm that eventually comes close to the native 4k image? Because that's exactly what deep learning super resolution does: you train the AI on ground-truth images, feed it thousands of examples over and over for days or months, and eventually it learns to produce near-native-quality results from those lower-resolution images. Those "missing pixels" are extrapolated from the reference native images, something you can't do with a simple upscaling filter. If you understand this, then you can understand why Nvidia has a dedicated supercomputing cluster for training DLSS on games, and why more training time means better output. It probably also explains why it takes so long to get DLSS implemented in a game, and why they likely rushed it for Battlefield 5. They did mention the next BF5 patch will vastly improve DLSS; it's unfortunate they rushed it out prematurely.
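The train-on-ground-truth loop described above can be caricatured in a few dozen lines of NumPy. To be clear, this is not DLSS (which trains a deep network on a supercomputing cluster and runs it on tensor cores); it's a toy linear model fitted by least squares, the synthetic image family is made up for illustration, and it only stands in for the idea of learning a low-to-high-resolution mapping from ground-truth pairs:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_hires(n=32):
    # smooth synthetic "ground truth" frame (stand-in for a native capture)
    y, x = np.meshgrid(np.linspace(0, np.pi, n), np.linspace(0, np.pi, n),
                       indexing="ij")
    a, b, c = rng.normal(size=3)
    return a * np.cos(y) + b * np.cos(x) + c * np.cos(y) * np.cos(x)

def downsample(hi):
    # 2x2 box average: the "render at the lower resolution" analogue
    h, w = hi.shape
    return hi.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def features(lo):
    # each low-res pixel plus its 4 axis neighbours (edge padded) plus a bias
    p = np.pad(lo, 1, mode="edge")
    c = p[1:-1, 1:-1]
    feats = [c, p[:-2, 1:-1], p[2:, 1:-1], p[1:-1, :-2], p[1:-1, 2:],
             np.ones_like(c)]
    return np.stack(feats, axis=-1).reshape(-1, 6)

def targets(hi):
    # the 2x2 high-res block each low-res pixel must be expanded into
    h, w = hi.shape
    return hi.reshape(h // 2, 2, w // 2, 2).transpose(0, 2, 1, 3).reshape(-1, 4)

# "training": least squares stands in for gradient descent on a network
train = [make_hires() for _ in range(50)]
X = np.concatenate([features(downsample(hi)) for hi in train])
Y = np.concatenate([targets(hi) for hi in train])
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def upscale_learned(lo):
    h, w = lo.shape
    blocks = features(lo) @ W
    return blocks.reshape(h, w, 2, 2).transpose(0, 2, 1, 3).reshape(h * 2, w * 2)

def upscale_nearest(lo):
    return np.repeat(np.repeat(lo, 2, axis=0), 2, axis=1)

# evaluate on a held-out frame
test_hi = make_hires()
test_lo = downsample(test_hi)
mse = lambda a, b: float(((a - b) ** 2).mean())
print(mse(upscale_learned(test_lo), test_hi),
      mse(upscale_nearest(test_lo), test_hi))
```

Even this crude version recovers more of the held-out ground truth than naive pixel repetition does, which is the point being made: the "missing pixels" are predicted from patterns seen during training, not conjured from nothing.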

Lastly, can you do me a favor and actually play with DLSS on/off first hand in games like Metro Exodus/FFXV before passing judgement? I'm sorry, but 1080p youtube videos don't cut it.

SIrPauly Mar 2, 2019 09:20 AM

Curious to know more about the inner workings of the tensor cores and how they make DLSS possible. Need to learn more on this.
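For what it's worth, the core primitive is simple: a tensor core executes a small fused matrix multiply-accumulate, D = A x B + C, on 4x4 tiles (FP16 inputs with FP32 accumulation on Turing). Emulated in plain Python just to show the shape of the operation:

```python
# One tensor-core step, emulated: D = A @ B + C on small square tiles.
# Real hardware does this on a 4x4 FP16 tile in a single operation;
# DLSS inference boils down to a huge number of these per frame.
def mma(a, b, c):
    n = len(a)
    return [[c[i][j] + sum(a[i][k] * b[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

ident = [[1, 0], [0, 1]]
bias = [[5, 5], [5, 5]]
print(mma(ident, [[1, 2], [3, 4]], bias))  # [[6, 7], [8, 9]]
```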

Exposed Mar 2, 2019 09:40 AM

Quote:

Originally Posted by acroig (Post 1338121890)
This, 100% agreed.

To be honest if there's no further DLSS updates for Metro I'm ok with it, it's pretty good as is.

Next game to have DLSS is Anthem, we'll see how that goes. Not sure if that's my kind of game though.

Jay20016 Mar 2, 2019 09:40 AM

You like to make assumptions and then go on tirades don't you? Would you like my login credentials so that you can continue to argue with yourself, or would you like to answer a question to continue?

My entire thing with this entire "feature" is that it is effort put into something that is only used when they allow it, and gives an inferior result to boot. It doesn't matter if your plucked-from-the-anus "90% as good" is enough for you; it's wasted energy and resources on creating a mummer's farce that could've been spent on other optimizations that benefit the whole.

acroig Mar 2, 2019 09:44 AM

Quote:

Originally Posted by Exposed (Post 1338121907)
To be honest if there's no further DLSS updates for Metro I'm ok with it, it's pretty good as is.

Next game to have DLSS is Anthem, we'll see how that goes. Not sure if that's my kind of game though.

I might pick it up when it goes on deep sale.... in 2 weeks. ;)

Exposed Mar 2, 2019 09:51 AM

Quote:

Originally Posted by Jay20016 (Post 1338121908)
You like to make assumptions and then go on tirades don't you? Would you like my login credentials so that you can continue to argue with yourself, or would you like to answer a question to continue?

My entire thing with this entire "feature" is that it is effort put into something that is only used when they allow and gives an inferior result to boot. It doesn't matter if your, plucked from the anus, 90% as good is enough for you it's wasted energy and resources on creating a mummer's farce that could've been spent on other optimizations that benefit the whole.

:lol: Plucked from the anus? :lol:

Your argument would be valid if Nvidia was taking away resources from something else. Let's call it VAJ69. So Nvidia works to implement VAJ69 in games, somehow I think the same "wasted resources" argument for DLSS would be applied to VAJ69 as well.

The way I see it, Nvidia gave us a choice with DLSS and it appears to be primarily aimed at RTX owners because ray tracing is demanding in the first place.

So 90% of native 4k at 140% of the performance seems like a really good option to me. OPTION. If it sucks, like in Battlefield 5, I won't use it. But it works very well in Metro, so I use it.

These 4k screenshots show there is very little detail lost if you want to see for yourself.

https://wccftech.com/metro-exodus-up...t-nvidia-dlss/

NWR_Midnight Mar 2, 2019 11:47 AM

Quote:

Originally Posted by Exposed (Post 1338121887)
Your baited statements like these have been reported:

It's fun to watch all the RTX 2080ti owners fall for a placebo effect to substantiate their expensive purchase.

But, hey, keep believing in the placebo effect, if it makes you happy and feel better for spending $1300 on a GPU. :D

Essentially you dropped in with those statements with the clear intent to troll/bait.

All of your comments are wrong, and your issue is that you won't admit being wrong even with facts presented directly to you — such as the case when you had all sorts of misinformation about Deep Learning itself and others had to correct you on numerous occasions.

As I said before, Battlefield 5 removed ray tracing on objects that didn't need it, like leaves. Did you see any reflections on leaves before the RTX performance patch? No, so that wasteful resource was eliminated. And there was an RTX rendering bug that allowed a large number of light bounces that they fixed.

I feel I am repeating myself here. There are videos from Digital Foundry and Dice themselves on the specific improvements in the other Battlefield 5 thread. Go have a gander.

If all you wanted to say was that ray tracing is only really playable in 1080p "fully utilized", then that is yet another wrong statement, if only for the fact that Battlefield 5 was fully playable at 1440p 60fps BEFORE any performance patches. Again, I feel I'm repeating myself here; this was posted in the Battlefield 5 RTX thread.

I'm sorry that you take my comment as a personal insult; it wasn't. It was just my observation that people justify their $1300 purchase. It's not a baited comment, it is my opinion of the situation.

The long-ago argument about deep learning was not argued with many others — only you, and one single comment from one other person — with me not continuing the argument, as we both have different perspectives on the definition of AI and deep learning.

You complain about baited questions, but you continue to bring up old arguments in different threads and topics, over and over, that happened months and years ago. You have done it with AMD, Intel, Nvidia, etc., just to try and bait me into starting up old arguments again, although I have dropped them — hence why I discontinued arguing with you about it. That includes coming into other forum topics (gaming the latest) just to drop snide comments and try to run me out of those discussions.

I have not conceded, or admitted I am wrong, concerning DLSS, because from how I read and interpret the full definition of AI and deep learning (machine learning), DLSS does not meet the definition, and I gave evidence that supported my position. But that is an old argument that we just need to agree to disagree on, so stop bringing it up.

As for Battlefield 5, they went from, I believe, 72 ray traces to 30 — a reduction of over 50%. To sit and eat up the excuse that it was un-needed is just a PR statement to make people feel okay with the change; it manipulates the facts so people accept as "ray tracing optimizations" what was really removing the ray tracing being done on a lot of surfaces. Again, Battlefield 5 does not fully implement ray tracing, nor does Metro, so you can stop talking out both sides of your mouth: one minute saying it is impossible to fully use ray tracing, as we don't have the hardware to do it yet, and the next saying that Battlefield 5 was playable at 60 fps with ray tracing before the fix, when you have already admitted that we cannot use ray tracing to its fullest. Battlefield 5 doesn't even come close. So you counter your own argument with your own statements. You are also manipulating facts, because the only way you could have been getting 60fps with ray tracing prior to the optimization patch was with all settings on low; on ultra, it wasn't happening according to this demonstration, and it has a hard time doing it even with the optimizations (no DLSS — but even DLSS added to it goes along with my comments above, since it is all being done at 1440p and then upscaled to 4k, including the ray tracing):


NWR_Midnight Mar 2, 2019 11:54 AM

Quote:

Originally Posted by Exposed (Post 1338121889)
Are you and Jay competing for the Nobel prize to see who can repeat the same thing over and over again?

We already know 4k DLSS is 1440p upscaled. The question is whether DLSS is doing a good enough job to warrant its use.

For Metro Exodus, YES.

For Battlefield 5, NO.

Metro Exodus gives a good, clear representation of the benefits of DLSS. It's quite funny reading the doomsday arguments about blurring, artifacts, etc. when I'm playing the game right now with everything on Ultra including RTX, and DLSS looks as picture perfect as can be. I'd say it's about 90% equivalent to the native 4k image, with none of the aliasing artifacts.

Don't take my word for it though. There are other 2070/2080/2080Ti owners here. DLSS also works great at 1440p according to acroig since that's his native resolution.

:lol: And you completely ignore that just upscaling without the use of DLSS has equal performance and a better quality picture — or, for argument's sake, equal quality — which translates to: DLSS is just smoke and mirrors and is doing nothing. This is where my placebo perspective comes into play.

It's also funny that nearly every review out there completely disagrees with your "90% equivalent of native 4k image", with some having the same opinion that has already been brought up here: that it just looks like they are using a sharpness filter in the latest fix in Metro. Now, we are not going to agree on most of this, so this is where you need to sit back and agree to disagree.

Everyone has a right to different opinions, even if they don't support your positive outlook on the technology/company. We all have a right to voice such opinions (even if they seem wrong to you, or, in the end, are completely wrong). But nobody has the right to chastise, talk down to, or make fun of others who have those differences of opinion, which it seems is what you like to do to everyone that disagrees with YOUR opinion. I admit, I am a stubborn person, I am very hard to sway from my beliefs, and I am a very coarse and straightforward-speaking person, which to some comes across as insulting (a personal trait that I have been trying to manage for years), but I don't generally attack people and make fun of people in such a manner continually and/or on purpose. I don't know if the same could be said about you, because once you get tired of someone's position, you don't simply say let's agree to disagree; you start making personal attacks, making fun of such people, dropping snide remarks, and you never drop it, and you seem to hold a grudge in every conversation and topic going forward. It's been 30+ years, but every time you do it, I feel like I am back in middle school and high school. So, do us all a favor: if you feel like you are repeating yourself and/or are tired of discussing the topic, just simply agree to disagree and don't bring it up again.

Exposed Mar 2, 2019 12:04 PM

It's clear the only reason you came in with those blanket insulting comments towards every 2080ti owner here is to poison this thread with vitriol, as you have in countless other threads. That's why I alerted the mods to your presence here, especially with those starting comments.

Rather than go back and forth with your nonsense, here are my thoughts on bf5 prior to any patch:

http://www.rage3d.com/board/showthread.php?t=34048514

Looks like I was getting 52+ at Rotterdam at Ultra at 1440p. Seems more than playable, and at more than just 1080p.

Now, show me evidence of image quality diminishing after the performance optimizations. Make sure it includes before and after shots.

NWR_Midnight Mar 2, 2019 12:20 PM

Quote:

Originally Posted by Exposed (Post 1338121930)
It's clear the only reason you came in with those blanket insulting comments towards every 2080ti owner here is to poison this thread with vitriol, as you have in countless other threads. That's why I alerted the mods to your presence here, especially with those starting comments.

read my edit in my above comment, this is a perfect example. Thanks.

Quote:

Rather than go back and forth with your nonsense, here are my thoughts on bf5 prior to any patch:

http://www.rage3d.com/board/showthread.php?t=34048514

Looks like I was getting 52+ at Rotterdam at Ultra at 1440p. Seems more than playable, and at more than just 1080p.

Now, show me evidence of image quality diminishing after the performance optimizations. Make sure it includes before and after shots.
Again, you are talking out both sides of your mouth, as you said:
Quote:

just for the fact that Battlefield 5 was fully playable at 1440p 60fps BEFORE any performance patches.
You did not say 52 fps. So thanks for showing us evidence of how you change up the facts to suit your position. For the record, I actually missed the 1440p; it didn't register because we have been mainly talking about 4k, so my actual response was geared towards 4k, not 1440p. But you have even validated my point with your own examples from your own previous posts, as 52 fps is not 60 fps either way. :D

As for showing you evidence of image quality diminishing after the performance optimizations: WHY?

I never made such a statement or implied that was the case. All I said is they reduced the amount of ray tracing by 50% and are running a watered-down version of ray tracing, so why would I show you evidence of image quality changes or diminishment after the optimizations that I never said exist?

Exposed Mar 2, 2019 01:06 PM

Quote:

Originally Posted by NWR_Midnight (Post 1338121934)
read my edit in my above comment, this is a perfect example. Thanks.

No such example existed. Thanks.


Quote:


Again, you are talking out both sides of your mouth, as you said: You did not say 52 fps. So thanks for showing us evidence of how you change up the facts to suit your position. For the record, I actually missed the 1440p as it didn't register because we have been mainly talking about 4k, so my actual response was geared towards 4k, not 1440p.. but you have even validated my point with your own examples using your own previous posts, as 52 fps is not 60 fps either way. :D

Thanks for showing us (yet again) that you only see what you want to see.

Do yourself a favor, scroll down a few posts, look at the top left corner of that screenshot. What framerate do you see there?

In fact, you're so hung up on that 52 number, here is what I said exactly:

60fps vsync locked in that area with all those reflections going on. Other areas did dip into the 30's but I was surprised overall it stayed above 50fps most of the time.


Seriously, you're doing the "you" thing where you dig yourself into a hole and refuse to climb back out with dignity due to your inability to acknowledge you're wrong or misunderstood something. I'll give you a pass on this one if it makes you feel better.


Quote:

As for showing you evidence of image quality diminishing after the performance optimizations: WHY?


Quote:

I never made such a statement, or implied that was the case,
Quote:

All I said is they reduced the amount of ray tracing by 50% and running a watered down version of ray tracing, so why would I show you evidence of such image quality changes or diminishes, that I never said exist?
What is a "watered down version of ray tracing"?

Please, show me a watered down version of this ray tracing. This would naturally show up in the reflection quality post patch.

Shouldn't be too hard; one of the things every reviewer checked after the performance patch was whether ray tracing quality diminished in any way. Please, show me your source where this was confirmed and the before/after screenshots.


Or... could it be the internal optimizations did not affect visible ray tracing output overall? Nah, it has to be watered down. Just waiting on your visual proof.....

NWR_Midnight Mar 2, 2019 01:22 PM

Quote:

Originally Posted by Exposed (Post 1338121937)
just more examples of me manipulating my own comments to mean what I need them to mean at the time I am using them to support my position

:lol: Why must you bring in a completely different thread to try and support your argument about a comment you made in this thread? Either way, being playable at 60 fps means you are averaging 60 fps, not above 50 fps, or 52 fps, or occasionally hitting 60fps. But you can manipulate your various comments to mean whatever you want, as I am done talking about it.

Battlefield V has been a watered down version of Ray tracing from the start, and their optimizations reduced the amount of ray tracing from there. But, again, you are cherry picking my comment, manipulating it, and trying to mold it into your own definition to mean what you want it to mean so you can argue about it, knowing damn good and well that is not what I said, meant, or implied.. so stop!

Exposed Mar 2, 2019 01:39 PM

Quote:

Originally Posted by NWR_Midnight (Post 1338121941)
Being playable at 60 fps means you are averaging 60 fps, not above 50 fps, or 52 fps, or hitting 60fps, but you can manipulate it into meaning whatever you want. I am done talking about it.

Nothing is manipulated, the data is there to see and you've found yourself on the wrong end again and can't admit to it. You stormed in here clamoring how ray tracing in BF5 was only good for 1080p, I showed you I was hitting 60 fps vsync locked at 1440p and RTX Ultra in that section I was testing in.


Quote:

Battlefield V has been a watered down version of Ray tracing from the start, and their optimizations reduced the amount of ray tracing from there. But, again, you are cherry picking my comment, manipulating it, and trying to mold it into your own definition to mean what you want it to mean so you can argue about it, knowing damn good and well that is not what I said, meant, or implied.. so stop!
Watered down from the start? From what? Please provide specifics.

Also still waiting on those screenshots that shows a reduction in ray tracing quality from before the performance patch and afterwards. Side by side would be preferable since it would immediately show the difference.

NWR_Midnight Mar 2, 2019 02:18 PM

Quote:

Originally Posted by Exposed (Post 1338121943)
Nothing is manipulated, the data is there to see and you've found yourself on the wrong end again and can't admit to it. You stormed in here clamoring how ray tracing in BF5 was only good for 1080p, I showed you I was hitting 60 fps vsync locked at 1440p and RTX Ultra in that section I was testing in.


Watered down from the start? From what? Please provide specifics.

Also still waiting on those screenshots that shows a reduction in ray tracing quality from before the performance patch and afterwards. Side by side would be preferable since it would immediately show the difference.

More cherry picking and manipulation. First, the talk of ray tracing only being good at 1080p was brought up months ago, when it was first introduced, and as optimizations and changes are made, you keep making comments towards those that stated such a thing; at the time those were accurate statements, and if ray tracing were fully implemented, they would still be accurate today. Tell you what: how about you go argue with Bill Gates over his comment "640K ought to be enough for anybody," since he stated that years ago, and see where it gets you — because you are kind of doing the same thing here with past comments made by many people, treating them as if they hold true today based off of what was happening months ago. Then you ignore parts of what I say — aka cherry picking my comments — and try to manipulate them to all mean the same thing.

In this thread, I said Battlefield 5 and Metro do not fully utilize ray tracing, and that if it were fully utilized, it would only be good at 1080p (I will add to that comment and say that it may not even be able to do that, as it might be a stretch with today's hardware). You even admitted that we don't have the hardware to fully utilize ray tracing. So what do you call it? I call it a watered-down version, since it can't be fully utilized. I have no reason to submit any screenshots of image quality differences, as I have not argued that point; it is all a made-up argument that is only in your head, because you have made it up to try and argue something else with me. Please stop.

Now you have gone from playable at 60 fps to hitting 60 fps — two completely different statements with two different meanings. But we can go round and round all day long, because you will just continue being you: manipulating, cherry picking, and trying to argue made-up stuff that was never intended. So it is time to firmly say we agree to disagree and end it.

Exposed Mar 2, 2019 02:29 PM

Quote:

Originally Posted by NWR_Midnight (Post 1338121947)
More cherry picking and manipulation. First the talk of ray tracing only being good at 1080p was brought up months ago, when it was first introduced. And as optimizations and changes are being made, you keep making comments towards those that stated such a thing as time progresses, which at the time was accurate statements and if ray tracing was fully implemented, is still accurate today. Tell you what, how about you go argue with Bill Gates on his comment "640K ought to be enough for anybody." since he stated that years ago.. and see where it gets you, because you are kind of doing the same thing here about past comments made by Many people as if they hold true today, based off of what was happening months ago. Then you ignore parts of what I say, aka cherry picking my comments and trying to manipulate them to all mean the same.

In this thread. I said Battlefield 5 and Metro does not fully utilize Raytracing, and if fully Utilized, it would only be good at 1080p (I will add to that comment, and say that it may not even be able to do that as it might be a stretch with todays hardware). You even admitted that we don't have the hardware to fully utilize ray tracing.. So what do you call it? I call it a water down version since it can't be fully utilized. I have no reason to submit any screenshots as I have not argued that point, it is all a made up argument that is only in your head, because you have made it up to try and argue something else with me. Please stop.

Now you have went from playable at 60 fps, to hitting 60 fps. Completely two different statements with two different meanings.

Please provide specifics. In what state was Battlefield 5 "fully ray traced" such that you claimed it was only playable at 1080p? At what point did it become "watered down"? Where are the before/after screenshots from when ray tracing in BF5 was "watered down" from its initial state?

Mangler Mar 2, 2019 02:31 PM

Has anyone ever claimed that the rtx cards would be able to fully raytrace anything?

Didn't nvidia themselves push the hybrid rendering aspect from the start?

NWR_Midnight Mar 2, 2019 02:32 PM

Quote:

Originally Posted by Exposed (Post 1338121948)
Please provide specifics. In what state was Battlefield 5 "fully ray traced" that you claimed was only playable at 1080p? At what point did it become "watered down"? Where is the before/after screenshots when ray tracing in BF5 was "watered down" from its initial state?

Now you are just trolling, as I never said Battlefield ever fully utilized ray tracing, even at 1080p. I said IF — hence why I consider it a watered-down version of ray tracing, which is what it is.

Exposed Mar 2, 2019 02:40 PM

Quote:

Originally Posted by NWR_Midnight (Post 1338121951)
Now you are just trolling, as I never said Battlefield ever fully utilized ray tracing even at 1080p I said IF, hence why I consider it a water down version of Ray tracing, which is what it is.

Is there a difference in image quality with RTX Ultra from the release game to after the RTX performance patch?

NWR_Midnight Mar 2, 2019 02:43 PM

Quote:

Originally Posted by Mangler (Post 1338121950)
Has anyone ever claimed that the rtx cards would be able to fully raytrace anything?

Didn't nvidia themselves push the hybrid rendering aspect from the start?

Not sure. But isn't DLSS the hybrid rendering aspect they are talking about?

Exposed Mar 2, 2019 02:52 PM

Quote:

Originally Posted by Mangler (Post 1338121950)
Has anyone ever claimed that the rtx cards would be able to fully raytrace anything?

Didn't nvidia themselves push the hybrid rendering aspect from the start?

No one has claimed that. Not even Microsoft's own DXR specification specifies anything of the sort.

What DXR allows is purpose-specific ray tracing that can be used in tandem with traditional rendering: reflections for BF5, global illumination for Metro Exodus.
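The shape of that hybrid frame, sketched in plain Python (every name here is illustrative, not a real engine or DXR API):

```python
# Hybrid rendering, roughly: rasterize everything the traditional way,
# then spend rays on just one targeted effect and overlay the result.
def rasterize(scene):
    return {surface: "raster" for surface in scene}

def trace_effect(scene, effect):
    # Each effect only needs rays for the surfaces it touches.
    touched = {"reflections": {"water", "glass"},
               "global_illumination": {"wall", "floor", "glass"}}
    return {s: effect for s in scene if s in touched[effect]}

def render_frame(scene, effect):
    frame = rasterize(scene)                   # traditional pipeline
    frame.update(trace_effect(scene, effect))  # RT pass layered on top
    return frame

scene = ["wall", "floor", "water", "glass"]
print(render_frame(scene, "reflections"))
```

The point of the sketch: the ray-traced pass replaces only what it targets, so "not fully ray traced" is by design, not a shortcoming of one game.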

That's why it is strange certain individuals would bring up the argument "well, it's not fully ray traced" while completely ignoring the uplift in visual quality this hybrid rendering already gives.

Case in point: another game will include ray tracing later this month (Justice MMO). HUGE difference in quality with the ray-tracing renderer enabled. This game does both ray-traced illumination and reflections.


Cream Mar 2, 2019 04:45 PM

Quote:

Originally Posted by Mangler (Post 1338121950)
Has anyone ever claimed that the rtx cards would be able to fully raytrace anything?

Didn't nvidia themselves push the hybrid rendering aspect from the start?

They certainly did.

Can't believe how childish some so-called adults can act over a piece of technology.

Cream Mar 2, 2019 04:49 PM

Quote:

Originally Posted by NWR_Midnight (Post 1338121954)
Not sure. But isn't DLSS the hybrid rendering aspect they are talking about?

Anyone that has put even a bit of effort into researching ray tracing and Nvidia SHOULD know that these cards are nothing but a stepping stone. Instead of looking at what they can't do, why not look at what they can do? Maybe appreciate the HUGE jump in ray-tracing performance over anything that has gone before.



Powered by vBulletin® Version 3.6.5
Copyright ©2000 - 2022, Jelsoft Enterprises Ltd.
All trademarks used are properties of their respective owners. Copyright 1998-2011 Rage3D.com