AMD vs. Nvidia new gen sales - one German reseller

I think a lot of people make a problem for themselves by buying insanely high res monitors, which then require several times more graphical power to drive. I had no use for high end graphics cards until VR came along. For normal gaming, a mid-range card is more than enough for me.
 
My take on this is that, with a 24 inch screen, 1080p is a very good to excellent resolution depending on eye-to-screen distance, and as such the vast majority of people already have a great setup for their needs.

I'd say a 27 inch 1440p screen is a slight improvement, but below that size the improvement is marginal.
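The sizes being compared here come down to pixel density. A quick back-of-envelope check of the PPI numbers behind this argument (the `ppi` helper is just for illustration):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal screen size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# The setups discussed in this thread:
print(f'24" 1080p: {ppi(1920, 1080, 24):.1f} PPI')  # ~91.8
print(f'27" 1440p: {ppi(2560, 1440, 27):.1f} PPI')  # ~108.8
print(f'25" 1440p: {ppi(2560, 1440, 25):.1f} PPI')  # ~117.5
```

So 27" 1440p is only about an 18% density bump over 24" 1080p, which is why the improvement reads as slight at normal viewing distances.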

As resolutions creep up slower and slower, the need for high end cards goes down proportionally...

Even with the RX 480, builders are showcasing full $650 1080p systems :)

This, of course, is great for the consumer.

Yup my 34" needed 1440p


Nah, I'll stick to 25" due to the higher PPI. I don't like seeing stretched images and less sharp text on those bigger screens. :D
 
I think a lot of people make a problem for themselves by buying insanely high res monitors, which then require several times more graphical power to drive. I had no use for high end graphics cards until VR came along. For normal gaming, a mid-range card is more than enough for me.

You can run a high res monitor at 1440p or 1080p; AMD's scaling is rather good. So games that don't play well at, let's say, 4K may run very well at a lower resolution. The benefit is that you get to play many games just fine at 4K, particularly older ones, and can still play the ones that won't. :yes:
 
You can run a high res monitor at 1440p or 1080p; AMD's scaling is rather good. So games that don't play well at, let's say, 4K may run very well at a lower resolution. The benefit is that you get to play many games just fine at 4K, particularly older ones, and can still play the ones that won't. :yes:

Out of curiosity, how is the interpolation when lowering res from 4k down to 1440p or 1080p? I always read about how "blurry" it is... Does it depend squarely on the monitor, graphics card, or both?
 
Out of curiosity, how is the interpolation when lowering res from 4k down to 1440p or 1080p? I always read about how "blurry" it is... Does it depend squarely on the monitor, graphics card, or both?

Both - it depends on the scaling option being used. Like noko pointed out before, AMD's GPU scaling is pretty good. Turn it off and you're at the mercy of whatever scaler your monitor has built in.
 
Both - it depends on the scaling option being used. Like noko pointed out before, AMD's GPU scaling is pretty good. Turn it off and you're at the mercy of whatever scaler your monitor has built in.

Gotcha. I would imagine having the GPU do the scaling would make the most sense... Will read more things.
 
That's a good point about the scaling. I guess things have come a long way there. When I've used non-standard resolutions lately I could barely even tell, which honestly should be the case, if you think about how seamlessly images resize in say Photoshop. The GPU should have plenty of horsepower to run a simple resize algorithm like that.
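For anyone curious what "a simple resize algorithm" actually means: below is a minimal sketch of bilinear interpolation, the basic technique most scalers build on (this is illustrative only, not AMD's actual implementation, which likely uses fancier filtering).

```python
def bilinear_resize(src, out_w, out_h):
    """Resize a 2D grid of pixel values using bilinear interpolation."""
    in_h, in_w = len(src), len(src[0])
    out = []
    for y in range(out_h):
        # Map the output coordinate back into source space
        fy = y * (in_h - 1) / max(out_h - 1, 1)
        y0 = int(fy); y1 = min(y0 + 1, in_h - 1); wy = fy - y0
        row = []
        for x in range(out_w):
            fx = x * (in_w - 1) / max(out_w - 1, 1)
            x0 = int(fx); x1 = min(x0 + 1, in_w - 1); wx = fx - x0
            # Blend the four nearest source pixels by distance
            top = src[y0][x0] * (1 - wx) + src[y0][x1] * wx
            bot = src[y1][x0] * (1 - wx) + src[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out

# Upscaling a 2x2 gradient to 3x3 fills in the midpoints:
print(bilinear_resize([[0, 100], [100, 200]], 3, 3))
# → [[0.0, 50.0, 100.0], [50.0, 100.0, 150.0], [100.0, 150.0, 200.0]]
```

The "blurry" look people complain about comes from exactly this blending step; sharper scalers swap in wider filters like bicubic or Lanczos.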
 
I think a lot of people make a problem for themselves by buying insanely high res monitors, which then requires several factors more graphical power to push. I had no use for high end graphics cards until VR came along. For normal gaming, a mid-range card is more than enough for me.

It's horses for courses really. I have a Fury Pro and game at 1440p with a 144hz freesync monitor and it's absolutely awesome. Fluid game play with no stuttering or tearing at very high FPS. Definitely happy chappy.

TBH 4K is still a long way from being mainstream (if ever?) and I agree that people buy 4K monitors without understanding the graphics horsepower required to run them for gaming. As for VR, the timescale is even longer. It's debatable whether it will ever get out of being a niche product until the cost of ownership drops substantially. Might end up like 3D TVs :lol:
 
Out of curiosity, how is the interpolation when lowering res from 4k down to 1440p or 1080p? I always read about how "blurry" it is... Does it depend squarely on the monitor, graphics card, or both?

With my AMD Nano, my 4K monitor scaled by the GPU to 1440p blows away my 1440p IPS monitor. Pixel density of the 4K is way higher (less dark space between the pixels), and 1080p is the same way - to me it gives better IQ. Using the monitor's scaler is usually crap; newer monitors are better, but manufacturers just don't put much money or attention into it, so scale using the GPU. I game at 4K with games that work well (Doom using Vulkan on Ultra stays in the Freesync range easily; in Mirror's Edge Catalyst I have the game render at 80% scale but still at 4K output resolution - looks awesome, plus numerous old games run flawlessly or actually too fast). Games that don't play well, like Deus Ex MD, I play at 1440p with the GPU doing the scaling, or I use my 1070 on the 3440x1440 monitor - except in that case the Nano and the 1070 are almost identical in performance on that game.
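The render-scale trick above makes sense once you count pixels. A rough back-of-envelope, assuming GPU load scales roughly with the number of shaded pixels:

```python
def megapixels(w, h):
    """Pixel count of a resolution, in millions."""
    return w * h / 1e6

res_4k   = megapixels(3840, 2160)  # ~8.29 MP
res_1440 = megapixels(2560, 1440)  # ~3.69 MP
res_1080 = megapixels(1920, 1080)  # ~2.07 MP

print(f"4K vs 1440p: {res_4k / res_1440:.2f}x the pixels")  # 2.25x
print(f"4K vs 1080p: {res_4k / res_1080:.2f}x the pixels")  # 4.00x

# An 80% render scale at 4K shades 0.8^2 = 64% of the pixels,
# then upscales to the native 4K output:
print(f"4K at 80% scale: {res_4k * 0.8**2:.2f} MP")  # ~5.31 MP
```

So 80% render scale at 4K sits between native 1440p and native 4K in cost, while keeping the monitor at its native resolution and avoiding the scaler entirely.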
 
With my AMD Nano, my 4K monitor scaled by the GPU to 1440p blows away my 1440p IPS monitor. Pixel density of the 4K is way higher (less dark space between the pixels), and 1080p is the same way - to me it gives better IQ. Using the monitor's scaler is usually crap; newer monitors are better, but manufacturers just don't put much money or attention into it, so scale using the GPU. I game at 4K with games that work well (Doom using Vulkan on Ultra stays in the Freesync range easily; in Mirror's Edge Catalyst I have the game render at 80% scale but still at 4K output resolution - looks awesome, plus numerous old games run flawlessly or actually too fast). Games that don't play well, like Deus Ex MD, I play at 1440p with the GPU doing the scaling, or I use my 1070 on the 3440x1440 monitor - except in that case the Nano and the 1070 are almost identical in performance on that game.

That is good to know and may actually affect my upcoming monitor purchase...

I've been waiting all year to jump to a decent 2k monitor, but now I'm thinking I can bring some 4k monitors into the mix as options, even if I'm not gaming at that rez.

Also, I must say that the options for decent gaming monitors out there seem to REALLY suck. Everything is a damn lottery as QC seems to be absolutely horrendous for most/all companies.
 
I would buy a fancy new Freesync monitor, but...

I run my sound through HDMI to a receiver. My monitor is hooked up via DisplayPort. I need to run in clone mode as HDMI audio shuts off without a video signal. That means I'm stuck at 1080p 60fps because my receiver doesn't speak 4k or 120/144hz. Looks like I might get around the resolution issue with Virtual Super Resolution enabled on the HDMI interface. In theory.

What I need is either:

1. A PCI-E HDMI audio card that outputs 8-channel PCM with a dummy video signal

2. AMD driver option to keep HDMI port always on with dummy video, but hide that interface from Windows so I don't have a phantom extended desktop
 
Does your receiver have an optical or digital coaxial input? Because if so, an easier method might be to get a sound card that can encode a multi-channel Dolby Digital or DTS signal. I know Creative cards can do that, and there are probably others as well.
 