Quote:
|
Honestly, the Metro DLSS update just looks like they added a sharpening filter with outlines around everything.
|
I think DLSS just isn't going to work. It was an inappropriate use of machine learning. There's too much variation in the image data that's generated by a game and it just ends up averaging everything together. It can't come up with some magic filter that can be applied for a better result because it simply doesn't exist.
For some things AI is very effective; for many other things it is absolute crap. In the latter cases there's still no substitute for human intelligence. I think what DLSS is showing us is that the upscaling and AA techniques that already exist are actually pretty good. There wasn't much hidden room for improvement. On the other hand, Nvidia has done a great job with its marketing to convince people to simply accept upscaling. So maybe there is some good in that.

Edit: Also, I see the argument that "DLSS isn't just upscaling, it's supposed to do X". It doesn't matter what it's supposed to do. It matters what it actually does. If the technique isn't effective, which is what I suspect, then it really makes no difference if they tried to match an image running at 1,000,000p; it's still not going to work. The truth is, sometimes you can crunch a lot of numbers and end up with nothing much at the end... it happens. |
Quote:
Nvidia has their own implementation on Turing: https://developer.nvidia.com/vrworks...blerateshading |
Quote:
Maybe we'll see better results with a smaller game like Atomic Heart. Though the current implementation in Metro is pretty darn good for what it gives us. I'm sure we'll see the same for Battlefield 5 in the next patch they promised. |
Quote:
|
Don't expect 4k DLSS to look like native 4k, but rather to look better in motion. The key is that the training images are 64x jittered and super-sampled, so the strength, one may imagine, is image quality while moving, with fewer AA limitations. It needs time and maturity.
|
It's fun to watch all the RTX 2080ti owners fall for a placebo effect to substantiate their expensive purchase.
First, Battlefield V and Metro do not fully implement ray tracing. They only use it partially, as neither game would be playable at 4k if it were fully implemented; that is why it is only viable at 1080p when fully utilized. Removing and/or reducing what the game is ray tracing doesn't change that fact.

Second, DLSS plus ray tracing at 4k is not 4k; it is a lower resolution (1440p) upscaled, which means the ray tracing (only partially implemented) and AA are applied to the lower-resolution image, which is then upscaled. (I originally thought the ray tracing was done after the DLSS upscale, but realized that to gain frame rate with DLSS, the ray tracing must also be performed at the lower resolution before the upscale.) This means it will never look as good as a true native 4k implementation. Basically, it's the same as a 4k Blu-ray player that upscales standard Blu-ray movies to 4k: they may look a bit better than the original resolution, but it will never be true 4k. (In fact, if you ran a game at 1440p on a 4k monitor, doesn't the monitor already upscale it to fill the screen? I know my 1440p monitor does it for 1080p... does that mean all games are now 4k... oh wait.)

As for the comparison videos being only 1080p, that doesn't change anything. Going from 4k to 1080p in a video affects both the 4k DLSS example and the native 4k example equally, so if the videos were encoded and watchable at 4k, the clarity differences noted would be identical to what we see in the 1080p videos.

But hey, keep believing in the placebo effect if it makes you happy and feel better about spending $1300 on a GPU. :D Right now, this is just smoke and mirrors in its current form. Will it get better? Yes, at least where ray tracing is concerned.
But DLSS, in its current form, is really just a fancy name for upscaling more so than AA. |
Scaling up and down, the basics still apply. Some material can scale in both directions, with an unsharp-style sharpening filter to clean it up, and still look very, very good. Film/photography scales well, being "analogue" in nature, but a pixel-sharp digital render not so much when comparisons are made.
If the scene is so heavily processed there's barely any contrast, or the scene has toon-shader-style graphics where the texture isn't grainy enough, scaling can look very good when sharpened up slightly. Keep text and GUI at native res, and it might not be so easy to spot any difference. But a "noisy" (as in lots of pixel detail, not just diffuse color with some shading) game like Metro... I think I'd keep the res native and turn down settings instead. I'd do anything to keep the game running at native res. Edit: it just hit me, of course it's a personal preference. It's good to have options! |
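The unsharp-style sharpening filter mentioned above is simple to sketch. Below is a minimal Python/NumPy illustration (a hypothetical `unsharp_mask` helper, not anything a game or driver actually ships): blur the image, then add back a fraction of the difference between the original and the blur, which exaggerates local contrast around edges.

```python
import numpy as np

def box_blur(img, radius=1):
    """Crude box blur with edge padding; stands in for a proper Gaussian."""
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / size**2

def unsharp_mask(img, amount=0.5, radius=1):
    """Sharpen by adding back (image - blurred) scaled by `amount`."""
    blurred = box_blur(img, radius)
    return np.clip(img + amount * (img - blurred), 0, 255)

# A hard vertical edge: sharpening keeps (or exaggerates) the step.
edge = np.array([[0.0, 0.0, 255.0, 255.0]] * 4)
sharp = unsharp_mask(edge)
```

Crank `amount` too high and you get exactly the haloed, "outlines around everything" look complained about earlier in the thread.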
Quote:
We already know no game is "fully ray traced", that's impossible with current hardware. "Fully ray traced" would be near unlimited rays with near unlimited light bounces. You must be referring to the performance gained by Battlefield 5. If so, the only thing they removed was the back and forth light bouncing from objects that didn't need to be ray traced, like leaves. That and further bug fixes and optimizations that resulted in much better performance without sacrificing image quality. This is old news and discussed in the Battlefield 5 thread numerous times already. Also I'm glad you pointed out 4k DLSS is not native 4k. None of us knew that before. Also keep up that tone, I'm sure a vacation will be in your future if you keep that up! |
Quote:
You can have RT at full 4k at 30 to 45 FPS. It's fun to watch all the non-RTX owners with their sour grapes... Is it perfect? No. Is it better than anything AMD has now? Oh hell yes. My 2080 Ti runs the Division 2 beta cranked up to max at 4k well, which is the main reason I got it (to run games at 4k), not for DLSS or RTX; those are just a bonus. Sorry, AMD has no card that will run games at 4k well. When they do, I will buy one, but until then Vega 2 is a joke, so NV is the only game in town. |
Quote:
|
Quote:
The entire thing of DLSS to me is just one amazing example of how people still haven't mastered the idea of garbage in, garbage out. There is no way you are ever going to have parity between an upscaled resolution and native resolution. There may be some trickery to go with it; you'll see some canned examples of how it is near perfect in the form of synthetic demos, but it will not be good enough in any actual gameplay to be on the same level as simply running at the resolution you want to claim it is. 4k DLSS isn't 4k. It is a marketing gimmick, plain and simple. |
Tend to look at it as another welcome choice for the gamer. 4k may be out of reach for some titles, where 4k DLSS may be appreciated.
|
Quote:
|
Quote:
I have a choice between native 4k at 30-40fps, native 1440p at 60+, or 4k DLSS at 60+. 4k DLSS looks better than 1440p to me, so yeah, that is the choice EYE make. Also, I like how we're constantly reminded that 4k DLSS isn't native 4k, as if it's some unknown revelation only now revealed by the last person who said it. |
Quote:
|
Quote:
I find it troubling to also have them tout it as being faster than the native resolution, for the simple fact that they are apples and oranges. You cannot make an actual comparison between native and DLSS and imply that one is 40% faster than the other, like some sites claim, while seemingly ignoring all of the weird artifacts, blurry textures, odd DoF, and extra aliasing present in one but not the other. It is just another thing we're going to have to look for when someone makes any sort of claim on performance. I'm not telling you not to use the fancy new "feature"; on the contrary, you do you (if you're allowed to)... I just worry for the time when it further muddies already brackish waters on fair and accurate comparisons. Also, excuse my reluctance to herald Nvidia for adding more proprietary 'features' that seemingly never work out for the industry or the consumer. |
Quote:
Problem is, in this case AMD is pursuing the same technology as well, so this isn't a bush you can beat around much longer. They too are working on a DLSS-style implementation that takes a lower-resolution image and reconstructs it to a higher resolution via AI. And their results will likely have the same "weird artifacts" and "odd DoF" you speak of, probably worse if they do not have the same supercomputing resources as Nvidia to train their AI networks with. |
Quote:
I was countering your "what happened to the 'it won't be playable above 1080p'" comment, and you just admitted it can't be done at 4k, as well as admitting it isn't true 4k. So my comment stands that ray tracing is only viable at 1080p when fully utilized, not the watered-down versions that we see in Battlefield V and Metro. Also, in Battlefield 5 they reduced the amount of ray tracing being done per frame (if I remember correctly, it was more than a 50% reduction in what is being ray traced), not just the back-and-forth lighting, etc. That was their answer to optimization. |
Quote:
Now, I may be wrong, but that is my take on it. |
Quote:
|
Quote:
As for the demos, there are two titles that actually use this right now, plus a canned benchmark. In the two games, where you choose whatever path you want, look wherever you want, and shoot whatever you want, we've seen that DLSS results in a blurrier picture with shimmering artifacts, textures that lack detail, and aliasing. In the canned benchmark, where the developers choose the path you take, choose what you look at, choose what happens in the scene, and make sure it happens the same way every time, you see "better" performance from the AI. It is like congratulating the person who studied for a test by having a copy of the questions beforehand. If nothing changes between runs, why would it not get better? Yet how is this in any way an accurate reflection of the technology for you, a gamer? What you want to prop up as amazing is just the masturbatory product of letting a computer look at the same sequence of images for hours upon hours and then letting it give you a result that is still inferior to the native resolution. No matter how "close" it is, it is still an unachievable result for any game where the player has any agency in the course of its strung-together images. Quote:
As far as bias goes, some of it is well earned. The whole Port Royal nonsense above just reminds me of the time they were caught in 3DMark03(?) actively occluding things to artificially lighten their load, resulting in higher performance that never saw an ounce of utility in the real-world games you would actually play... sounds awfully familiar. Quote:
|
Quote:
https://deepsense.ai/using-deep-lear...er-resolution/ Quote:
Quote:
You are arguing a frivolous point because you just want to argue a non-argument. Have you tested DLSS yourself? On what card? On what machine? Quote:
Quote:
The funny thing about DLSS is, it's all about image quality. So reviewers are specifically looking at image quality down to the pixel. So point out one credible source showing that Nvidia is "cheating" in Port Royal or "cheating" in some other way. Go ahead, I'll wait. Or could this be a case of the usual... Quote:
|
Quote:
It's fun to watch all the RTX 2080ti owners fall for a placebo effect to substantiate their expensive purchase. But, hey, keep believing in the placebo effect, if it makes you happy and feel better for spending $1300 on a GPU. :D Essentially you dropped in with those statements with the clear intent to troll/bait. Quote:
As I said before, Battlefield 5 removed ray tracing on objects that didn't need it, like leaves. Did you see any reflections on leaves before the RTX performance patch? No, so that wasteful resource was eliminated. And there was an RTX rendering bug that allowed a large number of light bounces, which they fixed. I feel I am repeating myself here. There are videos from Digital Foundry and Dice themselves on the specific improvements in the other Battlefield 5 thread. Go have a gander. If all you wanted to say was that ray tracing is only really playable at 1080p "fully utilized", then that is yet another wrong statement, if only for the fact that Battlefield 5 was fully playable at 1440p 60fps BEFORE any performance patches. Again, I feel I'm repeating myself here; this was posted in the Battlefield 5 RTX thread. |
It is quite ironic you want me to educate myself on something you obviously fail to grasp.
Answer me this. If I take an image that is 1440p and upscale it to 4k, how do I account for the fact that there are fewer total pixels in the 1440p image? What technique is used to reach the higher pixel count? |
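For what it's worth, the arithmetic behind that question is straightforward. A back-of-the-envelope in Python, assuming 2560x1440 for "1440p" and 3840x2160 for "4k":

```python
# Pixel-count arithmetic behind the 1440p -> 4k question.
w1, h1 = 2560, 1440   # "1440p" (QHD)
w2, h2 = 3840, 2160   # "4k" (UHD)

src = w1 * h1             # 3,686,400 source pixels
dst = w2 * h2             # 8,294,400 output pixels
ratio = dst / src         # 2.25: each source pixel must cover 2.25 output pixels
missing = 1 - src / dst   # ~55.6% of the 4k frame has no directly rendered pixel

print(src, dst, ratio, round(missing, 3))
```

So roughly 56% of the pixels in the output frame have to be synthesized by some technique, whether that's bilinear filtering, the monitor's scaler, or a trained network.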
Quote:
We already know 4k DLSS is 1440p upscaled. The question is whether DLSS is doing a good enough job to warrant its use. For Metro Exodus, YES. For Battlefield 5, NO. Metro Exodus gives a good, clear representation of the benefits of DLSS. It's quite funny reading the doomsday arguments about blurring, artifacts, etc., when I'm playing the game right now with everything on Ultra including RTX, and DLSS looks as picture perfect as can be. I'd say it's about 90% equivalent to the native 4k image, with none of the aliasing artifacts. Don't take my word for it though. There are other 2070/2080/2080Ti owners here. DLSS also works great at 1440p according to acroig, since that's his native resolution. |
Quote:
|
Quote:
You know what happens when you take a 1440p image and upscale it to 4k in Photoshop? It looks like crap. I'm 99% certain that's how you think deep learning super resolution works. And therein lies the fault of your argument. To answer your question, answer my question: if you took that 1440p image and looked at it side by side with a native 4k image, would you be able to derive an algorithm that eventually comes close to the native 4k image? Because that's exactly what deep learning super resolution does: you train the AI on ground-truth images, feed them through thousands of times over days or months, and eventually it learns to produce near-native-quality results from those lower-resolution images. Those "missing pixels" are extrapolated from the reference native images, something you can't do with a simple upscaling filter. If you understand this, then you can understand why Nvidia has a dedicated supercomputing cluster for training DLSS on games, and why more training time means better output. It probably also explains why it takes so long to get DLSS implemented in a game, and why they likely rushed it for Battlefield 5. They did mention the next BF5 patch will vastly improve DLSS; it's unfortunate they rushed it out prematurely. Lastly, can you do me a favor and actually play with DLSS on/off first hand in games like Metro Exodus/FFXV before passing judgement? I'm sorry, but 1080p YouTube videos don't cut it. |
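To illustrate the "train on ground truth" idea at toy scale: the sketch below (plain NumPy, 1-D signals, nothing like the actual DLSS network) learns a linear low-res-to-high-res mapping from example pairs by least squares, then beats naive pixel duplication on held-out data, precisely because it has seen what the high-res data tends to look like.

```python
import numpy as np

rng = np.random.default_rng(0)

def downsample(hr):
    """2x downsample by averaging pairs -- stands in for rendering at lower res."""
    return hr.reshape(-1, 2).mean(axis=1)

# "Ground truth" training data: smooth random 1-D signals (toy stand-ins for
# super-sampled reference frames) and their low-res counterparts.
hr_train = np.cumsum(rng.normal(size=(500, 16)), axis=1)
lr_train = np.array([downsample(h) for h in hr_train])

# "Training": fit a linear map lr -> hr by least squares. A real
# super-resolution network does this with gradient descent at vastly
# larger scale, but the principle is the same.
W, *_ = np.linalg.lstsq(lr_train, hr_train, rcond=None)

def upscale(lr):
    return lr @ W

# Evaluate on held-out signals the model has never seen.
hr_test = np.cumsum(rng.normal(size=(100, 16)), axis=1)
lr_test = np.array([downsample(h) for h in hr_test])
learned_err = np.mean((upscale(lr_test) - hr_test) ** 2)
naive_err = np.mean((np.repeat(lr_test, 2, axis=1) - hr_test) ** 2)
```

The learned map ends up with lower test error than dumb pixel duplication because the training pairs teach it the statistics of the high-res signals, which is the whole argument for "extrapolating missing pixels" rather than just stretching the image.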
Curious to know more about the inner workings of the tensor cores and how they make DLSS possible. Need to learn more about this.
|
All trademarks used are properties of their respective owners. Copyright ©1998-2011 Rage3D.com