Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs

You can add BF5 DX11 to that list per the HU video at the 28 minute mark. Who knows how many more? In DX12 there's no doubt there's a top-heavy perf loss. In Vulkan it might not matter if the game already outputs hundreds of fps...


It's been well known for years now that the BF series scales beyond 4 cores. Most people are not CPU bound in that title.
 
Well, the 3770K has 8 threads there and it's not helping... It seems the cutoff where this isn't an issue is a 6-core Intel or a 5600X and up, but that cutoff may grow over time with the size and complexity of games and the Nvidia driver.

IMO Nvidia needs to go back to a HW scheduler. I don't care to see AMD become king of the hill for too long any more than I like Nvidia or Intel being there. Crazy banana prices are all we get out of it.
 

Threads and cores are not the same thing.

I don't see a need to go back to a HW scheduler. The SW scheduler is still doing what it was designed to do - increase performance in games with poor CPU utilisation, where it's needed most.
 
If you look at the TechSpot benchmarks my Vega 56 is on par with a 5600 XT. Looking at these benches you can pinpoint my rig as being on par with a 1600X + 5600 XT combination.
So tell me, what upgrade is worth it in your opinion? I am targeting 1080p ultra high with full RTX implementation and resources left over for some years. My Vega has almost 4 years in service.
I am looking at Cyberpunk 2077 at 1080p ultra high... The combination worth considering is a 3090 + 5600 XT to actually feel a really big leap and still have some performance left for some years.
And look at 1440p ultra high... The only combination is a minimum of a 3600 + 3090... How funny is that? :) The 5600X upgrade offers 4-5 fps added to the minimum FPS. It's pretty much inevitable that one will go under 60 fps very quickly in the next years, and I am talking about a 3090 here, not some mid-range or low-end stuff...
I don't feel like paying for the 3000 series, especially at the prices right now. Also, by the time the 4000 series drops and I can upgrade, the CPU market landscape will look very different.

Well it's like Acroig said: if you want ray tracing at 1080p, and especially in Cyberpunk, there really is only one consideration, something from Nvidia. But even if it wasn't ray tracing you're looking at, you have to find a card that balances with your system and the settings you want to play games at. If I were really in your shoes, the video card would be the last thing to look at: first would be the monitor (1440p minimum, pretty cheap these days even with FreeSync), then the CPU, and then the video card third. Even just going to 1440p, you can invest in a video card that will do a lot more for you on the same processor than if you remained at 1080p.
 
We're sort of moving off the original topic, but right now if you see a GPU in the performance class you want from either AMD or Nvidia near to MSRP just buy it. Better ray tracing or whatever doesn't matter, just get whatever you have a shot at. Don't even think twice.
 
On the original topic, I do think this is something that is worth being aware of and I look forward to seeing more testing on it. No it won't matter if you have a high end CPU, but when building a PC for a friend who doesn't have a large budget, or maybe in a secondary PC, it's worth being aware if in some cases an AMD GPU would be a better choice.

I don't think it hurts to test in any case, even if it doesn't come into play all that frequently.
 
While it's lacking in terms of a DLSS equivalent, the 6000 series may not be that far behind at native resolution in RT when properly optimized... Some results show it at about 90% of RTX 3000. One person has shown that RT on the 6000 series is woefully underutilized when a game is only optimized for Nvidia.

But I'd wait for the DLSS equivalent first before making a choice, if you can wait that is.
 


The drop in Cyberpunk from 1080p ultra to 1440p ultra, 86 fps to 62 fps in minimum framerate, is really substantial. I keep a card for a number of years, which means it will be overwhelmed faster in newer RTX games, and I am talking about an RTX 3090 here. Also, I postponed the CPU upgrade because I am not upgrading anything anytime soon. I decided to play old games, or newer games like Doom Eternal that are easy on the current rig. When I do upgrade I will have different options and prices in the CPU market than now. Another matter is that you don't know when GPU prices will come back to decent levels. Investing now in a CPU that I cannot use fully is a waste of money. I am still upgrading collateral components like the monitor; the monitor market is not having the same drama as the GPU market right now, and there is stock available for my favourite models.
Also, you forget about VSR. It's easier to go up in resolution than to go down. I didn't see blurring and such when going up in resolution, but once you go to a 1440p native panel you cannot drop to 1080p if necessary; it will look like crap, and that is not acceptable for me. If you have guaranteed deep pockets and can afford expensive GPU upgrades every 2 years, by all means go ahead and buy a native 1440p. In Cyberpunk, 1080p ultra is equivalent to 1440p medium in performance.
 
There will probably not be another game as RT-heavy as Cyberpunk this gen; the consoles can't handle it.
 
Watch Dogs Legion only had ray traced reflections and they had to water that down significantly for consoles as well. These consoles just came out too. :bleh:
 
Get Rid Of Nvidia Bloat...

Extract the Nvidia driver package using whatever tool you like, or just run the exe and then cancel the installation; the install files will be under the C: drive.

Delete the highlighted folders:

[image: AhxjhyT.jpg]


You don't need Nvidia Audio or PhysX unless you run HDMI to an audio source like an AV receiver, TV or soundbar. And PhysX... well....
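If you end up doing this for every driver release, the folder cleanup can be scripted. Here's a minimal sketch in Python, assuming the package was extracted somewhere like C:\NVIDIA\DisplayDriver and that the audio and PhysX payloads sit in folders named HDAudio and PhysX - check the actual names in your own extracted package, since they can vary between driver versions:

[CODE]
# Minimal sketch: prune optional payloads from an extracted Nvidia driver package.
# The path and folder names are assumptions - adjust them to match what you see
# in your own extracted package before running this.
import shutil
from pathlib import Path

DRIVER_DIR = Path(r"C:\NVIDIA\DisplayDriver")  # hypothetical extraction path
FOLDERS_TO_REMOVE = ["HDAudio", "PhysX"]       # the components the post says you can drop

for name in FOLDERS_TO_REMOVE:
    target = DRIVER_DIR / name
    if target.is_dir():
        shutil.rmtree(target)
        print(f"Removed {target}")
    else:
        print(f"Skipped {target} (not found)")
[/CODE]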

Next, open the Setup.cfg file located in the Nvidia folder and scroll to the bottom.

[image: fL88lPz.jpg]


Delete the highlighted section.
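That edit can be scripted too. A rough Python sketch, assuming the section to remove is the small block of <file name="${{...}}"/> entries near the bottom of setup.cfg (the EULA/consent/privacy entries that slim-install guides usually point at) - verify against the screenshot above and your own file before letting it touch anything:

[CODE]
# Rough sketch: strip a block of <file> entries from setup.cfg.
# Which entries to remove is an assumption here - compare against your own
# setup.cfg first; driver versions differ.
from pathlib import Path

cfg_path = Path(r"C:\NVIDIA\DisplayDriver\setup.cfg")  # hypothetical extraction path
markers = ("EulaHtmlFile", "FunctionalConsentFile", "PrivacyPolicyFile")

lines = cfg_path.read_text(encoding="utf-8").splitlines(keepends=True)
kept = [ln for ln in lines if not any(m in ln for m in markers)]
cfg_path.write_text("".join(kept), encoding="utf-8")
print(f"Removed {len(lines) - len(kept)} line(s) from {cfg_path}")
[/CODE]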

Your folder should now resemble something like this:

[image: rdalQ53.jpg]


YAY!!!! No more junk, run the setup.exe

Don't be a fool, stay in skewl.
 

badsykes, if you're not in lockdown you need to get out more, as you're contemplating your navel too much. First the monitor issue of 1.07bn vs 16.7m colours, which most people won't see, and now driver overheads limiting slower CPUs. You're drilling down into the minutiae too much and missing the big picture (pun intended) that a lot of people prefer gaming at higher resolutions.

Personally I'd prefer 62 fps at 1440p to 86 fps at 1080p. I left 1080p behind a long time ago, as a lot of people have. I don't understand your comment about buying a 1440p native monitor and wanting to play at 1080p. Why buy the monitor in the first place? It's crazy.

The RTX 3090 won't be overwhelmed anytime soon; as Mangler pointed out, the consoles can't handle heavy RT, and as most games are console ports a 3090 will be fine for a few years to come. In fact, at 1440p the 3090 wasn't really being taxed; some benchmarks I ran were showing 50% GPU utilisation. Now that I'm gaming at 3440x1440 the GPU utilisation is in the 90s, so much better.
 

I wanted to move away from 16.7m-colour 8-bit panels that are being phased out... plain and simple. Also, 1.07bn colours is not worthless; it helps with gradients and fine detail. Regarding monitors, the gaming community has been so brainwashed into accepting any misery and all sorts of compromises. (I am pointing to Lurk's situation)

https://www.rtings.com/tv/tests/picture-quality/gradient --- For me personally it is noticeable.


A 3090 with 62 fps minimums will last far fewer years at 1440p ultra, never mind 4K. This is a $2000 card!!! It will be quickly overwhelmed. I would not give the 3090 more than 2 years before it can't sustain a minimum of 60 fps in fully RTX games. Also, the deep reliance on DLSS smells very bad. I have a feeling that if you deactivate DLSS at 1440p ultra you may not like it. If you have the deep pockets to pay $2000 every 2 years to keep a 60 fps minimum in RTX games at 1440p, be my guest.

Mangler has a valid perspective, but I still don't fully believe we will be "stuck" for the next few years. We will see.
 