Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs

The HU video is timestamped and you can see it mentioned at the 45-minute mark.

HU is doing a CPU scaling video, something most GPU reviews don't normally do, as they tend to test a GPU with the latest and greatest setup.

But the whole point of AIB GPUs is to upgrade the graphics card in an older rig. It's kind of surprising so few reviewers do CPU scaling.

IMO many if not most people are still running CPUs well below a 5600X, so such a review is valid today, especially now that game engines are well thread aware.

You might think it's not a problem pairing a 3070 with, say, a 7700K when that setup is delivering enough FPS for your liking, but if you are hitting 100% on the CPU you will get hitching. And you will have less performance than with a Radeon GPU, or even some Nvidia GPUs that cost a lot less.
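
A rough way to see whether that is happening on your own rig (a minimal sketch, assuming an Nvidia card plus the psutil and pynvml Python packages; it just prints what your system reports, it is not a benchmark):

```python
# Sample per-core CPU load and GPU load while a game is running.
# If one core sits pinned near 100% while the GPU stays well below 100%,
# the GPU is waiting on the CPU (including the driver's share of CPU work).
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(30):  # roughly one sample per second for ~30 seconds
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    gpu_load = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
    print(f"busiest core {max(per_core):5.1f}%   GPU {gpu_load:3d}%")

pynvml.nvmlShutdown()
```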

So what I think constitutes a problem, i.e. paying too much for a GPU for a given setup, might not be one for you. It gets subjective as to what is a problem or not. IMO, as I said, I think this is.

I would think that *most* folks who do their own upgrades and follow hardware would never expect the same level of performance and resource usage by pairing a 30x0 with an older CPU as opposed to a newer one.

Couple that with the fact that maintaining compatibility with older hardware can become problematic after a few years. Drivers are developed for newer hardware moving forward, not maintained for older hardware that just keeps getting older.

I have an old (3570K) CPU. I would never in my wildest dreams pair a 30x0 with it and expect stellar performance from anything that needs that CPU's resources.

I'm filing this info under "nice to know, but kind of obvious" stuff...:bleh:
 
It all depends, but there are many pairings that seem feasible in people's minds without much thought. I think the Steam survey still puts 6 cores or fewer in most gaming rigs, which would not be a good pairing for many GPUs. There are even examples, as HU showed, of a 1660 Ti not being a good pairing with a 3770K versus the R390.

So even low-end GPUs have to be scrutinized for CPU scaling on Nvidia.
 
The HU video is timestamped and you can see it mentioned at the 45-minute mark.

HU is doing a CPU scaling video, something most GPU reviews don't normally do, as they tend to test a GPU with the latest and greatest setup.

But the whole point of AIB GPUs is to upgrade the graphics card in an older rig. It's kind of surprising so few reviewers do CPU scaling.

IMO many if not most people are still running CPUs well below a 5600X, so such a review is valid today, especially now that game engines are well thread aware.

You might think it's not a problem pairing a 3070 with, say, a 7700K when that setup is delivering enough FPS for your liking, but if you are hitting 100% on the CPU you will get hitching. And you will have less performance than with a Radeon GPU, or even some Nvidia GPUs that cost a lot less.

So what I think constitutes a problem, i.e. paying too much for a GPU for a given setup, might not be one for you. It gets subjective as to what is a problem or not. IMO, as I said, I think this is.

First off, you don't need the latest and greatest CPU, and if you're using an old AMD CPU, well, that's your problem, because those CPUs were never good at gaming compared to Intel processors to begin with. So you're creating a CPU utilization issue just by using an inferior processor.

As for your timestamp, you focus on two sentences while ignoring the other 35 minutes. If you watch the entire video, the driver overhead issue is mitigated by moving the bottleneck from the CPU back to the GPU, if you insist on using a low-end AMD CPU.

This article shows this even further:
https://www.techspot.com/article/2201-four-years-of-ryzen-5-gpu-scaling/

You can see you don't need a top-of-the-line CPU; you simply move the bottleneck from the CPU to the GPU, and cards like the 3070 will outshine a 5700 XT at their expected scaling rate despite using a lower-end CPU.
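
As a back-of-the-envelope illustration of that "moving the bottleneck" idea (the millisecond figures below are made up for the example, not taken from the article): the slower of the two sides sets the frame rate, so once the GPU side takes longer than the CPU side, the driver's CPU cost stops showing up in the results.

```python
# Toy frame-time model: each frame costs CPU time (game logic plus the
# driver's CPU-side work) and GPU time; the slower side limits the frame rate.
def fps(cpu_game_ms, driver_cpu_ms, gpu_render_ms):
    frame_ms = max(cpu_game_ms + driver_cpu_ms, gpu_render_ms)
    return 1000.0 / frame_ms

# Hypothetical slow CPU, with the driver adding 3 ms of CPU work per frame.
print(fps(8, 3, 6))   # 1080p low:   CPU side (11 ms) limits -> ~91 fps, overhead visible
print(fps(8, 3, 14))  # 1440p ultra: GPU side (14 ms) limits -> ~71 fps, overhead hidden
```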

So again, you're trying to make a mountain out of a molehill and this overhead situation will not apply to most gamers at their intended game settings.
 
It all depends, but there are many pairings that seem feasible in people's minds without much thought. I think the Steam survey still puts 6 cores or fewer in most gaming rigs, which would not be a good pairing for many GPUs. There are even examples, as HU showed, of a 1660 Ti not being a good pairing with a 3770K versus the R390.

So even low-end GPUs have to be scrutinized for CPU scaling on Nvidia.

Yeah, Steam had 4-core CPUs at around 45%, I think... but at some point hardware and software have to meet, as in one drives the other to an extent. I'm not defending Nvidia as flawless, btw... I've had my issues with them. I just think that, realistically speaking, it's not a major issue, primarily because NV probably believes that anyone willing to drop the coin for a cutting-edge card is also going to be keeping the rest of their rig up to date.

That might not always be the case, but it's not an unreasonable assumption either.
 
So again, you're trying to make a mountain out of a molehill and this overhead situation will not apply to most gamers at their intended game settings.


It's not greatly outperforming the Radeon 5700 XT, especially when considering the cost. And many people put the emphasis on FPS over image quality, so at medium settings, or the automatic settings games pick for themselves, they will hit the bottleneck on the CPU.

My only 'agenda' is the truth. Nvidia had a great idea at the time to force thread awareness onto games under DX11. Now that games are thread aware themselves, that driver is a burden.

And it's not fixable driver-wise. You either have to buy a GPU that has a hardware scheduler or upgrade your CPU.
 
It all depends, but there are many pairings that seem feasible in people's minds without much thought. I think the Steam survey still puts 6 cores or fewer in most gaming rigs, which would not be a good pairing for many GPUs. There are even examples, as HU showed, of a 1660 Ti not being a good pairing with a 3770K versus the R390.

So even low-end GPUs have to be scrutinized for CPU scaling on Nvidia.


I agree with this, mainly for mainstream and entry-level GPUs.
 
It's not greatly outperforming the Radeon 5700 XT, especially when considering the cost. And many people put the emphasis on FPS over image quality, so at medium settings, or the automatic settings games pick for themselves, they will hit the bottleneck on the CPU.

My only 'agenda' is the truth. Nvidia had a great idea at the time to force thread awareness onto games under DX11. Now that games are thread aware themselves, that driver is a burden.

And it's not fixable driver-wise. You either have to buy a GPU that has a hardware scheduler or upgrade your CPU.


Not greatly outperforming the 5700 XT? Look at those charts again and see how the 3070 goes from a deficit at 1080p/low settings to pulling away significantly at 1440p and above with high/ultra settings.

That's called moving the bottleneck from the CPU to the GPU.

Like Lazy8 said, it should be obvious why this is happening. Most gamers will not pair a 3070 or 3080 with a low-end CPU to game at 1080p with low detail settings.

It's great that AMD GPUs are friendlier for owners of older/lower-end CPUs.

Having experienced this myself with my old 3770K and a 2080 Ti, the obvious choice to me was to upgrade the CPU. I normally would never buy a 2080 Ti for a 3770K system; this one happened to be a hand-me-down from when I bought a 3080 for my main gaming system. So I couldn't fully use the 2080 Ti until the 3770K was replaced with a 10700F. Why would I realistically expect the same framerates from a decade-old CPU as from a new one? This applies to anyone with a low-end CPU as well, even a newer one. Like HU stated, don't pair a 3090 with an i3 and expect top-of-the-line gaming performance.
 
FWIW, I think that "most people" would assume that adding something like an RTX 3070 to their older 7700K system would bring them higher framerates than their existing AMD RX 580, and would also be surprised to learn that their spanking new Nvidia GPU brings higher CPU overhead along with it, considerably limiting any performance gain.

I think threads like this are important as a public service and should not be mocked or dismissed.
 
I think threads like this are important as a public service and should not be mocked or dismissed.

I have not seen that happening in this thread, but if you do, please report it so I can look at it.

All opinions given in the thread are valid and supported by the HU video.
 
Yeah, back when my office PC still had an Intel 3770K (a 9-year-old CPU), I put my old 2080 Ti in there and got abysmal performance in many games.



I certainly would have been better off buying a low- or mid-range AMD card instead and still playing at lower framerates with minimal details, rather than upgrading the CPU to a 10700F and getting far better performance and quality. :rolleyes:

If you look at the TechSpot benchmarks, my Vega 56 is on par with a 5600 XT, so from those charts you can peg my rig as roughly equivalent to a 1600X + 5600 XT combination.
So tell me, what upgrade is worth it in your opinion? I am targeting 1080p ultra/high, full RTX implementation, and some headroom left for a few years. My Vega has almost 4 years in service.
I am looking at Cyberpunk 2077 at 1080p ultra/high... The combination worth considering is a 3090 + 5600XT to actually feel a really big leap and still have some performance left for a few years.
And look at 1440p ultra/high... the only combination that holds up is a minimum of a 3600 + 3090... How funny is that? :) The 5600X upgrade offers 4-5 fps added to the minimum FPS. The fact is that one will drop under 60 fps very quickly in the coming years, and I am talking about a 3090 here, not some mid-range or low-end stuff...
I don't feel like paying for the 3000 series, especially at the prices right now. Also, by the time the 4000 series drops and I can upgrade, the CPU market landscape will look very different.
 
Not really, because even in DX11, as HU said, most game engines have become thread aware, and that explains why the R290 beats a 780 Ti handily now.

So it's actually a problem in DX11 now as well, but it wasn't back when they made that switch 4-5 years ago.

It's only not a problem if you run a high-end CPU, a 5600X and up.


That's not exactly what they said, nor is it true. For example, the ONLY CPU-bound games I have are all DX11 titles with poor CPU utilisation, which is where the software scheduler shows the most benefit. Think MSFS 2020, ArmA 3, Hell Let Loose.
 
FWIW, I think that "most people" would assume that adding something like an RTX 3070 to their older 7700K system would bring them higher framerates than their existing AMD RX 580, and would also be surprised to learn that their spanking new Nvidia GPU brings higher CPU overhead along with it, considerably limiting any performance gain.

I think threads like this are important as a public service and should not be mocked or dismissed.


In the majority of titles, it does. This is good information though; I suspect Nvidia will put more effort into reducing overhead now.
 
I would upgrade a GPU at current prices only if my mental sanity depended on gaming.

A 3060 Ti at 1300 euros... Only one left!! Order soon.


https://www.amazon.de/dp/B08PFZ28CN...N&ascsubtag=x9MSKS-ScopkKE5lJP02VA&th=1&psc=1

An RX 6800 at 1500 euros... Only one left in stock!

https://www.idealo.de/preisvergleic...n-rx-6800-phantom-gaming-d-16g-oc-asrock.html

Nvidia just removed its mining hash rate lock in the 3060 driver, so don't expect prices to drop anytime soon.

https://twitter.com/ghost_motley/status/1371523980088184832
 
That's not exactly what they said, nor is it true. For example, the ONLY CPU-bound games I have are all DX11 titles with poor CPU utilisation, which is where the software scheduler shows the most benefit. Think MSFS 2020, ArmA 3, Hell Let Loose.

You can add BF5 in DX11 to that list, per the HU video at the 28-minute mark. Who knows how many more? In DX12 there's no doubt there's a top-heavy performance loss. In Vulkan it might not matter if the game already outputs hundreds of FPS...
 