At 1080p medium settings, though. Do you play at 1080p medium? If not, then I wouldn't worry about it much. The overhead issue is directly tied to CPU-limited scenarios, and with a 3060 Ti you'd more than likely be GPU-limited at all times regardless of your CPU.
It would depend. But the story doesn't change much at 1440p, which is a little closer in difficulty to my 3840x1080 situation. If I'm trying to keep that FPS as close to 144 as I can, then I would probably drop to medium settings to pull it off.
Of course what this really means is that I need to upgrade the core of my system. But I've known that for a while; it's just not a great time to do it.
For an enthusiast-class gamer this may be a non-issue, but if you're an entry-level or even midrange gamer it can be a real problem, and I'd like to see it improve.
The story does change at 1440p, and it changes significantly when the bottleneck shifts from the CPU to the GPU. If someone wants to pair an $800 3080 with a 10-year-old CPU or a $100 bottom-of-the-barrel new CPU, then that person should stop worrying about being CPU-limited at medium details and instead shift the bottleneck from the CPU to the GPU.
That means cranking up the details, applying supersampling AA, DSR, etc., whatever it takes, instead of focusing on medium. Being CPU-limited at 130 fps on a 3080 means you can hold that same 130 fps while cranking up the settings until the GPU becomes the bottleneck.
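To put some rough numbers on that argument (a toy sketch only, with made-up frametimes rather than benchmark results): the framerate is set by whichever side, CPU or GPU, takes longer per frame, so piling work onto the GPU costs nothing until it overtakes the CPU.

```python
# Toy bottleneck model; the frametimes here are hypothetical, purely for illustration.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    # Each frame needs CPU work (game logic + driver) and GPU work;
    # whichever takes longer per frame sets the framerate.
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(7.7, 5.0))   # ~130 fps, CPU-limited, GPU has headroom
print(fps(7.7, 7.0))   # still ~130 fps after raising the settings
print(fps(7.7, 10.0))  # ~100 fps, the GPU has finally become the bottleneck
```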
This is really making a mountain out of a molehill. Yes, under certain conditions you can make a 580 faster than a 3080, but under realistic conditions you can make a 3080 absolutely stomp a 580 into the dirt where it belongs on the exact same machine/display setup.
.... if you have 4 core or mid or low range 6 core ...
Define this, please.
Well, as HU listed it: AMD 3600X or lower, and Intel 4-cores, though I'm not sure which Intel 6-cores. But it certainly hits a lot of common CPUs.
IMO, some here are minimizing the problem. For many of us, whole system upgrades are not possible, so people put their money where they think the current bottleneck might be, which is often the GPU.
It is not remotely inconceivable that people with an older quad-core CPU will buy a high-end Nvidia 3000-series card to replace their aging Radeon, only to find that their FPS went down in the process, not realizing that the increased driver overhead has tanked the potential advantage. This phenomenon was unknown to me until recently, and I suspect most people assume that adding a high-end GPU will automatically improve framerates. Yet the evidence shows the opposite in CPU-limited scenarios, which really are not that uncommon for the many people who can't afford to constantly upgrade to the latest and greatest CPU/motherboard/RAM/monitor.
IMO, HWUB is providing important information for the community here, and it should be shared widely so that others can make informed system-upgrade decisions.
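To illustrate the scenario with a toy min(CPU, GPU) frametime model (the numbers and the per-frame driver cost are hypothetical, not HWUB's measurements): if the GeForce driver adds more CPU work per frame, the CPU-limited ceiling drops, and a much faster GPU can still deliver fewer FPS on an old quad-core.

```python
# Toy model with a hypothetical per-frame driver cost added to the CPU side.
def fps(game_cpu_ms: float, driver_cpu_ms: float, gpu_ms: float) -> float:
    cpu_ms = game_cpu_ms + driver_cpu_ms   # total CPU time per frame
    return 1000.0 / max(cpu_ms, gpu_ms)    # slower side sets the framerate

# Old quad-core + aging Radeon: slow GPU, light driver overhead -> GPU-limited.
print(fps(10.0, 2.0, 14.0))  # ~71 fps
# Same CPU + much faster GeForce: heavier driver overhead -> CPU-limited.
print(fps(10.0, 6.0, 7.0))   # ~63 fps, lower despite the faster card
```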
It's not just the 3080. They simply use it for shock value on the charts. It's the entire Nvidia lineup that's affected on older or many mid/lower-range CPUs. The point is, if you have a 4-core or a mid- or low-range 6-core, you can get hit with this.
3060 Ti or 3090, it doesn't matter. If you're CPU-limited on Nvidia with an old CPU, then you shift the bottleneck from the CPU to the GPU. It's funny to see the arguments cling to medium/low settings when cranking up to higher settings makes this "issue" basically disappear.
If cost were really a concern you'd be much better off just getting a console, and you wouldn't be limited to 1080p medium. You'd be running the same game, like Watch Dogs Legion, at (faux) 4K with much higher fidelity.
Or just quit being stingy and upgrade your CPU. It cost me $500 to upgrade my office 4770K system to an i7-10700F. That's CPU, motherboard, 16GB of memory, and a $40 cooler, paired with a 2080 Ti before and after the upgrade. Everything else, including the power supply and drives, stayed the same. Now I can play Watch Dogs 2 and Legion at something higher than 1080p/1440p medium, a much better prospect than switching the 2080 Ti out for an RX 580 and being stuck with those low details. People for some reason forget that for a gaming system, the CPU is just as important as the GPU.
It actually crossed my mind to pair a 3080 with my Ryzen 7 1700. I wanted to play with the full RTX implementation at 1080p.
10600K?