
Rage3D Discussion Area (http://www.rage3d.com/board/index.php)
-   Other Graphics Cards and 3D Technologies (http://www.rage3d.com/board/forumdisplay.php?f=65)
-   -   Nvidia RTX DLSS/Ray Tracing Discussion (http://www.rage3d.com/board/showthread.php?t=34049038)

SIrPauly Jul 19, 2020 04:51 PM

This investigative review of DLSS 2 is pretty intricate and detail-oriented. Personally, I investigate the quality of features starting from a foundational low resolution and work my way up.

https://youtu.be/N8M8ygA9yWc

the_sextein Jul 19, 2020 09:06 PM

That's an interesting video comparison. It looks like some light sources are blown out and the sparks are erased from the scene, but it provides better AA. I'm more concerned with texture quality myself, but the various methods of AA seem to have a negative effect on texture quality at times, so it's not a case where one always looks better than the other. At 4K I find AA less of a game changer than it used to be at lower resolutions, but it's still helpful to the overall image. Hopefully DLSS 3.0 is a bit better. All that aside, Nvidia still has to program support for DLSS on a per-game basis, so it simply won't be available in many titles. I'm glad it's around, though; it's always nice to have another option to play with to squeeze every last drop of quality out of the image when you're struggling for good performance.

Based on some of the videos I've seen, we may need DLSS even with a 3080 Ti to run Cyberpunk 2077 at 4K. I figure SLI won't be supported, or they would be proudly advertising support for it by now.

demo Jul 22, 2020 06:47 PM

I just tried playing DS on my monitor for the first time, 21:9 3440x1440, and have to admit DLSS looks far better than native on this display. Not just a little, but better by a country mile.

Really odd as I ran it on my 4k TV again and think native is better there.

Napoleonic Jul 22, 2020 09:16 PM

Quote:

Originally Posted by demo (Post 1338207942)
I just tried playing DS on my monitor for the first time, 21:9 3440x1440, and have to admit DLSS looks far better than native on this display. Not just a little, but better by a country mile.

Really odd as I ran it on my 4k TV again and think native is better there.

Do you have your TV's post-processing effects applied? TVs usually come pre-configured like that.

SirBaron Jul 23, 2020 02:36 AM

DLSS is great, but really only useful on my OLED; at 2560x1080 it's kinda pointless.

Not really in the market to buy a new monitor until they make an OLED with no burn-in (I can dream) that is ultrawide and the same size :lol:.

Or MicroLED starts to become dominant.

I really don't see much reason to upgrade, as this monitor looks great and is sharp enough, and I prefer playing on it over my OLED because I like to sit close to the monitor.

demo Jul 23, 2020 02:43 AM

Quote:

Originally Posted by Napoleonic (Post 1338207976)
Do you have your TV's post-processing effects applied? TVs usually come pre-configured like that.

Samsung QLED set to gaming mode, which supposedly removes post-processing effects.

It is, however, at 4:2:2 12-bit HDR, compared to RGB 10-bit HDR on the monitor. I wonder if anything is being lost or not translated well with chroma subsampling?
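For a rough sense of what that trade-off means in raw numbers, here's a minimal back-of-the-envelope sketch (Python, illustrative only; it ignores the actual HDMI/DisplayPort signalling and whatever the TV's processor does with the signal afterwards):

Code:

# Average bits stored per pixel for RGB vs chroma-subsampled YCbCr.
# Rough arithmetic only -- not a statement about perceived quality.

def bits_per_pixel(bit_depth, fmt):
    # Average samples per pixel:
    #   RGB / 4:4:4 -> 3.0  (all three channels at full resolution)
    #   4:2:2       -> 2.0  (luma full res, chroma halved horizontally)
    #   4:2:0       -> 1.5  (luma full res, chroma quartered)
    samples = {"RGB": 3.0, "4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[fmt]
    return bit_depth * samples

for depth, fmt in [(10, "RGB"), (12, "4:2:2"), (12, "4:2:0")]:
    print(f"{fmt} {depth}-bit: {bits_per_pixel(depth, fmt):.0f} bits/pixel")

So RGB 10-bit carries 30 bits per pixel, while 4:2:2 12-bit carries 24: more precision per sample, but half the chroma detail, which is exactly the kind of thing that could show up differently on the TV than on the monitor.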

Trunks0 Jul 23, 2020 02:56 AM

Try flipping it over to 10-bit? The panel is likely not true 12-bit anyway, and if it ends up looking better it will have been worth such an easy change.

demo Jul 23, 2020 03:13 AM

Yeah, 10 and 12-bit look identical. I even set 4:4:4 8-bit (HDR off) just to check, and 4K native still looks better IMO. Really strange, as on the monitor it's a night-and-day difference.

SIrPauly Jul 23, 2020 06:03 PM

DLSS at 8K:

https://www.tweaktown.com/articles/9...des/index.html

Trunks0 Jul 23, 2020 06:58 PM

So... could you layer Dynamic Super Resolution (DSR) and DLSS?

demo Jul 23, 2020 09:57 PM

I tried this and it looks fantastic, but I came across two issues: one I could resolve and the other I could not.

The first problem was that on a 4K TV, DSR uses 4096x2160 as the base resolution, meaning none of the DSR resolutions on offer scale into 3840x2160 nicely, and they end up looking worse. The solution was to use CRU to delete all entries for 4096x2160. After that, DSR uses 3840x2160 as the base, and 7680x4320 (4.00x) and 5760x3240 (2.25x) become available, and wow do they look great and run well with DLSS. :drool:

The other issue, though, is that I couldn't get DSR to run with 4:2:2 10-bit HDR. It keeps reverting to RGB 8-bit. In the end I decided I'd prefer to have HDR.
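To sanity-check those modes: the DSR factor is a pixel-count multiplier, so the per-axis scale is its square root. A minimal sketch of the arithmetic (Python; the factor list here only covers the two modes mentioned above):

Code:

# Derive DSR render resolutions from the base mode and DSR's pixel-count factors.
# Sketch only; the driver does its own rounding for the other factors.
import math

def dsr_modes(base_w, base_h, factors=(2.25, 4.00)):
    modes = []
    for f in factors:
        s = math.sqrt(f)                     # per-axis scale
        modes.append((f, round(base_w * s), round(base_h * s)))
    return modes

print(dsr_modes(3840, 2160))   # [(2.25, 5760, 3240), (4.0, 7680, 4320)]
print(dsr_modes(4096, 2160))   # [(2.25, 6144, 3240), (4.0, 8192, 4320)]

With 4096x2160 as the base, every mode is derived from a width the 3840-wide panel doesn't have, which lines up with why the image looked worse until the 4096x2160 entries were deleted with CRU.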

SIrPauly Jul 23, 2020 10:16 PM

It sure is fun to investigate.

the_sextein Jul 24, 2020 07:06 AM

Quote:

Originally Posted by demo (Post 1338208216)
I tried this and it looks fantastic, but I came across two issues: one I could resolve and the other I could not.

The first problem was that on a 4K TV, DSR uses 4096x2160 as the base resolution, meaning none of the DSR resolutions on offer scale into 3840x2160 nicely, and they end up looking worse. The solution was to use CRU to delete all entries for 4096x2160. After that, DSR uses 3840x2160 as the base, and 7680x4320 (4.00x) and 5760x3240 (2.25x) become available, and wow do they look great and run well with DLSS. :drool:

The other issue, though, is that I couldn't get DSR to run with 4:2:2 10-bit HDR. It keeps reverting to RGB 8-bit. In the end I decided I'd prefer to have HDR.

Can you use DSR at 4:2:2 10-bit without DLSS being enabled?

demo Jul 24, 2020 08:37 AM

Nope, when a DSR resolution is selected it defaults to RGB 8-bit. If I select 10 or 12-bit 4:2:2 or 4:2:0, the control panel wigs out and closes.

But I've just been playing with 8-bit RGB HDR with DSR+DLSS, and frankly it looks fantastic. I am now a DLSS convert; it's pretty awesome.

SIrPauly Jul 24, 2020 09:21 AM

If Nvidia can improve compatibility and content, that would also be awesome.

the_sextein Jul 24, 2020 10:51 AM

Quote:

Originally Posted by demo (Post 1338208275)
Nope, when a DSR resolution is selected it defaults to RGB 8-bit. If I select 10 or 12-bit 4:2:2 or 4:2:0, the control panel wigs out and closes.

But I've just been playing with 8-bit RGB HDR with DSR+DLSS, and frankly it looks fantastic. I am now a DLSS convert; it's pretty awesome.

Thanks demo. I don't have my 1080 Tis in my system, but when I was using HDR with Hitman 2018 I thought it looked good in 8-bit RGB; it probably varies on a game-by-game basis how important it is. I do think full RGB 8-bit looks better than 4:4:4 10-bit when HDR is not enabled.

I was hoping to use DSR with DLSS to push 8K on my 4K panel. I was not sure I would be able to do it, but now I know that I can, so this information helps. Thanks for the info. I would assume running HDR in 8-bit would kill a lot of the color accuracy in some titles, but it probably comes down to the game and whether or not the team pushed color to the limits for HDR.

I will probably use SLI to push 8K DSR on older games like Stalker and GTA V. I will have to use DLSS to do it on newer titles. I figure DLSS might be needed for 4K on some titles like Cyberpunk, and it will probably come in handy at 4K on a regular basis once the card has been out for a while. I just hope Nvidia manages to support a lot of games with it.

Destroy Jul 24, 2020 12:14 PM

I feel like this DLSS thing will become a crutch and promote inefficient, sloppy coding.

the_sextein Jul 24, 2020 01:24 PM

You're probably right. That is why I always liked SLI. They still had to optimize to get the game running nicely on a single GPU for the mainstream and the second card allowed you to decimate the game without having to rely on hacks.

Still, if I can play at 4K native or 8K DLSS, I'll probably choose 8K DLSS unless there are serious visual issues that outweigh the DSR upgrade. If I have to choose between 8K DSR and 4K HDR, I'll probably go 4K HDR, unless I try it out and the 10-bit color doesn't make much of a difference. I'll have to see on a case-by-case basis.

SIrPauly Jul 24, 2020 01:59 PM

If anything, DLSS-type features may make it possible to enjoy ray tracing or path tracing at higher resolutions than would otherwise be practical.

The feature makes sense for GPU-limited titles like Final Fantasy or Monster Hunter.

The feature makes sense for getting rid of temporal aliasing in modern engines with fewer limitations.

The keys may be quality and compatibility. DLSS 2.0 has made great strides here.

The other key is content. If Nvidia can get more developers to add it, and possibly offer a more global setting, that enhancement would be welcomed, one may imagine. For example: Red Dead Redemption 2 is GPU-limited and offers TAA. Wouldn't it be fantastic to have a DLSS choice there? Probably not going to happen, but it would be welcomed.

What an impressive crutch.

acroig Jul 24, 2020 03:23 PM

I agree with everything Pauly said.

bill dennison Jul 24, 2020 03:39 PM

DLSS is proprietary, isn't it?

So what is going to happen to it when AMD, with Sony and Microsoft, comes out with one that works on PS5, Xbox Series X, and Windows? :hmm:

acroig Jul 24, 2020 03:49 PM

Quote:

Originally Posted by bill dennison (Post 1338208363)
DLSS is proprietary, isn't it?

So what is going to happen to it when AMD, with Sony and Microsoft, comes out with one that works on PS5, Xbox Series X, and Windows? :hmm:

It will become part of DirectX.

Trunks0 Jul 24, 2020 04:28 PM

Quote:

Originally Posted by acroig (Post 1338208365)
It will become part of DirectX.

Basically.

I mean, nVidia did DLSS the way they did because it allowed them to do it sooner while also complementing their AI business (tensor cores' real target was never the consumer market). That was a win/win for nVidia. They got a feature out ahead of everyone, gave their RTX graphics cards an exclusive feature for a bit, and drove the cost of tensor cores down by putting them in a consumer device.

*Late edit*
P.S. You might also see this feature show up in a bigger engine before DX, Vulkan, or AMD drivers, etc., as it doesn't technically require anything special card-side to run something like DLSS. It can be done with general compute shaders, as we saw with DLSS 1.9 in Control.

the_sextein Jul 24, 2020 05:10 PM

Quote:

Originally Posted by SIrPauly (Post 1338208346)
If anything, DLSS-type features may make it possible to enjoy ray tracing or path tracing at higher resolutions than would otherwise be practical.

The feature makes sense for GPU-limited titles like Final Fantasy or Monster Hunter.

The feature makes sense for getting rid of temporal aliasing in modern engines with fewer limitations.

The keys may be quality and compatibility. DLSS 2.0 has made great strides here.

The other key is content. If Nvidia can get more developers to add it, and possibly offer a more global setting, that enhancement would be welcomed, one may imagine. For example: Red Dead Redemption 2 is GPU-limited and offers TAA. Wouldn't it be fantastic to have a DLSS choice there? Probably not going to happen, but it would be welcomed.

What an impressive crutch.

I agree. If Rockstar had implemented DLSS 2.0 without major quality issues, I think 2080 Ti users would have been very happy to push 4K DLSS in that title.

Exposed Jul 24, 2020 05:58 PM

I agree with everything acroig said.

SIrPauly Jul 24, 2020 06:08 PM

Quote:

Originally Posted by Trunks0 (Post 1338208375)
Basically.

I mean, nVidia did DLSS the way they did because it allowed them to do it sooner while also complementing their AI business (tensor cores' real target was never the consumer market). That was a win/win for nVidia. They got a feature out ahead of everyone, gave their RTX graphics cards an exclusive feature for a bit, and drove the cost of tensor cores down by putting them in a consumer device.

*Late edit*
P.S. You might also see this feature show up in a bigger engine before DX, Vulkan, or AMD drivers, etc., as it doesn't technically require anything special card-side to run something like DLSS. It can be done with general compute shaders, as we saw with DLSS 1.9 in Control.

I believe DLSS 2.0 has been added to Unreal Engine.

Trunks0 Jul 24, 2020 07:07 PM

Which isn't surprising. PhysX was the default physics engine in UE for a very long time. But I wouldn't be surprised if Epic is looking at its own vendor-agnostic DLSS alternative just for UE.

Nagorak Jul 24, 2020 10:51 PM

The problem with DLSS is that I expect it's not going to work or be supported much in VR applications, which means it will not be a true substitute for more brute-force power in VR.

Trunks0 Jul 25, 2020 01:01 AM

Quote:

Originally Posted by Nagorak (Post 1338208432)
The problem with DLSS is that I expect it's not going to work or be supported much in VR applications, which means it will not be a true substitute for more brute-force power in VR.

Not yet at least, but the potential is there. I know FB is working on something similar: AI-powered reconstruction for foveated rendering on Oculus VR.

Destroy Jul 26, 2020 05:22 AM

I'm just going to put this here.
Definitely a nice enhancement, but not OMG-worthy IMO.


SirBaron Jul 26, 2020 06:35 AM

It's not proper ray tracing; that's why it's not OMG-worthy.

By proper I mean the game engine was designed for it, rather than having it injected.

It looks nice standing still; running around inside interiors looks terrible because of it being injected.

Exposed Jul 26, 2020 01:27 PM

That's just fake ReShade "ray tracing." Not even worthy of this thread.

SIrPauly Jul 29, 2020 12:20 AM

DLSS 2.0 compared to checkerboard rendering:

https://youtu.be/9ggro8CyZK4


Nice, in-depth analysis!

demo Jul 29, 2020 04:49 AM

Great video. What I find interesting is when they compare the writing on the backpack at ~8:50. Native 4K is clearly sharper and more detailed while at the same time cleaner, looking at the letter N in the centre of the screen, for example. BUT I noticed the grass looks far better with DLSS on, way smoother. I think that's just inherent to SSAA-style filtering.

I do notice a slight detail loss with DLSS enabled, but overall the image is improved.

demo Jul 29, 2020 04:58 AM

At 13:45 they also mention the aliased "particle trail" in motion that I mentioned earlier in the thread. That is the worst point of DLSS IMHO; it looks horrid and is on EVERYTHING once you notice it.

Exposed Sep 11, 2020 10:15 AM

I'm not really sure what the point of this was, but here goes :lol:


I tested Death Stranding with DLSS Quality and no AA at 4K. There's no built-in benchmark, so I just loaded my last save point and simply looked around, and tried to make everything line up exactly, but Daryl sure likes to fidget a lot.

2080Ti and Intel 8700k, 16GB memory, all settings maxed minus motion blur.

3840x2160 no AA: [screenshot]

3840x2160 DLSS Quality mode: [screenshot]
So what was this for again? To show no AA is faster than DLSS Quality? It isn't. I didn't test with DLSS Performance because there's already a 30+ fps difference here, and Quality mode keeps the image quality comparable.

I didn't test with TAA because it's already been covered by dozens of websites.

Also, I tried testing at 1080p, but framerates were capped at 120fps according to Afterburner. I don't know if that was an Afterburner issue or the game engine, but a 2080 Ti is way overkill for 1080p, and every setting was locked to 120fps.
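For context on why Quality mode comes out that far ahead of native with no AA: DLSS renders internally at a reduced resolution and reconstructs up to the output size. A rough sketch using the commonly cited per-axis scale factors (Python, illustrative only; individual games can deviate, and these are not values pulled from Death Stranding):

Code:

# Approximate internal render resolution for DLSS 2.x presets at a given output.
# The scale factors below are the commonly cited defaults, used here as assumptions.
PRESETS = {
    "Quality":     2 / 3,   # ~2560x1440 internally for a 4K output
    "Balanced":    0.58,
    "Performance": 1 / 2,   # ~1920x1080 internally for a 4K output
}

def internal_resolution(out_w, out_h, preset):
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

for name in PRESETS:
    w, h = internal_resolution(3840, 2160, name)
    print(f"{name}: {w}x{h}")

On those assumptions, Quality mode shades roughly 44% of the pixels of native 4K before reconstruction, which is why it can beat even a no-AA native render by a wide margin despite the added reconstruction cost.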

OverclockN' Sep 11, 2020 10:24 AM

Wow, true 4K actually looks noticeably better in that game. In the others I had to struggle to see the difference, and it was mostly just lighting changes.

Exposed Sep 11, 2020 10:31 AM

Quote:

Originally Posted by OverclockN' (Post 1338223373)
Wow, true 4K actually looks noticeably better in that game. In the others I had to struggle to see the difference, and it was mostly just lighting changes.




Keep in mind, that's a no-AA image. There's lots of eye-bleeding shimmering and stair-stepping there. It is a LOT more noticeable when moving around, as you would expect running a game without AA.

OverclockN' Sep 11, 2020 10:38 AM

Quote:

Originally Posted by Exposed (Post 1338223379)
Keep in mind, that's a no-AA image. There's lots of eye-bleeding shimmering and stair-stepping there. It is a LOT more noticeable when moving around, as you would expect running a game without AA.

No, the differences I see are in the texture work. Specifically the rock is what caught my immediate attention. Then the scenery in the background, and then his backpack.

Exposed Sep 11, 2020 10:44 AM

Quote:

Originally Posted by OverclockN' (Post 1338223382)
No, the differences I see are in the texture work. Specifically the rock is what caught my immediate attention. Then the scenery in the background, and then his backpack.


You're seeing the sharpness of a no-AA image. You're not seeing the artifacts of playing it in motion, which are eye-bleeding.



If you're looking at the backpack, then surely you see the razor stepping of the metal on top?



Do you normally play games without AA? How do you stand the shimmering?

