Showdown: Fury X vs 980 Ti

Here's my take on the head-to-head fight.


If it's pure performance over everything else, then yes, this round goes to the 980 Ti below 4K, especially when OCed. To be fair, we haven't seen unlocked voltages on Fury yet, and that may or may not help.

Without dropping extra money on a non-reference 980 Ti, you are getting one hot and loud card. It thermal throttles at 84C because of power leakage concerns. Some non-reference coolers are only 30 bucks more to deal with the heat, but a CLC is an extra $100, so a CLC 980 Ti against the CLC Fury X is $749 vs $649. Average load temps in reviews have been around 65C for the Fury X, with a thermal throttle at 75C; most reviews never got it to 75C unless they ran FurMark.

If it's power usage, the 980 Ti wins-ish: they are within 20-30 watts and trading blows. To truly beat the Fury X you have to OC the 980 Ti, and there goes your power draw.

If it's price? A card that trades blows with the 980 Ti, especially among high-end enthusiast cards meant for high resolutions, and that comes factory water-cooled to make sure heat (and power leakage) isn't an issue, should be priced alongside its direct competition at what is now the enthusiast resolution, aka 4K.

So let's break it down by resolution and refresh rate on pure FPS:

1080p 60Hz - why the **** are you spending that much money for this res? Get a 970 or a 290 and be done with it.

1080p 144Hz - either should be OK, but the 980 Ti will give you the better time.

1440p 60Hz - either card will work; the 980 Ti has an advantage.

1440p 144Hz - you are getting two cards, deal with it.

4K 60Hz - either card works, but really you need two cards.

4K 144Hz - ha, see you in like three years.

Bottom line: if you want to OC a 980 Ti and still have it be reasonable in noise/heat, you are paying $30-$100 more than the reference card. So let's really call the price point $649 for the Fury X vs $679 (starting price) for the 980 Ti.
 
Edit: Thanks Gandalf good start to a new thread. :D


I've been lurking a bit since work has been busy lately. I'll start? What I say below is basically just what you said, kinda.

980 Ti
-Great performance at 1080p, 1440p, and 2160p
-6GB frame buffer
-Overclocking is very, very good: around 30-40%+

Fury X
-Great 2160p performance, not so much at 1080p-1440p
-Only 4GB frame buffer
-Low overclocking ability as of now


I'll add to this. The Fury and Fury Nano are what will make this gen worth it for AMD, IMHO. It seems both the Fury and the Nano will be 56 CUs, the Fury clocked at 1050MHz like the Fury X, and I'm guessing the Nano at 800-850MHz. Kombi said something about driver improvements that left everyone questioning it. Why? The 780 Ti beat out the 290X at one point, and now in some games the 290X and the 980 duke it out. Overall the 980 is the better card (performance per watt), but that's pretty damn good, especially when the 780 isn't even DX12-capable, as the 290 is.
 
@demo

The large amount of saved CPU time can be put towards other tasks, of course, in the region of 50-60% IIRC, possibly more. I know you're too good for consoles (your loss), but thanks to similar tech they're able to do some very impressive stuff considering the hardware they have to work with, things that would make your PC work up a sweat with current APIs.


Indeed they are. I just wish the PC platform got anywhere near the same "close to the metal" programming attention that consoles do... The general attitude towards the PC version of any title seems to be that as long as it runs well enough, it's good enough, so let's not waste time making it better...



And don't get 3x 390X over 2x Ti. A G1 Ti is only 4% behind a 295X2 at 4K; that says it all, really. 390X CFX is only what, 10% faster than a 295X2? 50% more power and heat for less performance and more issues is not clever.

You need to get this VRAM obsession out of your head. 6GB is fine for a while to come, easily until your next upgrade. In the rare scenario it's not, are you going to sacrifice performance, driver stability, power, heat, and whatever else just for the extra 2GB of VRAM on the 390X? Great, you just made your system slower and worse 99.9% of the time, just to cover the other 0.1% when you might need more VRAM. Just turn a setting down, for heaven's sake. If you're that worried, be a man and get 12GB Titans.

Btw, remember your VRAM readings in AB need to be divided by the number of GPUs you're using. I suspect you aren't using as much VRAM as you think.
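Taking that Afterburner claim at face value (that the readout sums memory use across the GPUs in an SLI/CFX setup — the poster's assertion, not something verified here), the per-card correction is just a division. The 7200 MB reading below is a made-up example figure:

```python
# Hypothetical example: correcting a summed VRAM readout to per-GPU usage,
# assuming (per the post above) the tool reports the total across all cards.
reported_mb = 7200  # made-up Afterburner reading with two cards installed
num_gpus = 2

per_gpu_mb = reported_mb / num_gpus
print(per_gpu_mb)  # 3600.0 MB, inside a 4 GB frame buffer after all
```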


4K is for peasants (joking)... I'm already at near-5K resolution using triple 1440p displays in portrait mode (4725x2560 with bezel compensation), and it's about 50% more demanding than 4K: roughly 8 million pixels for 4K versus 12 million at my resolution, for every frame rendered.


Frame buffer use is nuts at these resolutions, and there's no way cards with 4GB onboard are going to handle this... So mine may be the hot-running, energy-guzzling 290X, but thank the lord they have 8GB on board...


Spoiler: playing Battlefield 4 on ultra at that resolution is a 70-120 FPS experience, with the odd dip to 50 FPS once a shell from an enemy tank lands in my lap or something... :lol:


The idea is to move to 4K using triple panels, so it goes from 12 million pixels to 24 million for every frame... I essentially double the workload for each frame rendered, and need that extra memory.
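The per-frame pixel math in this post can be sanity-checked with a quick script (resolutions taken from the post itself; "MP" here just means millions of pixels per rendered frame):

```python
# Pixels per rendered frame for the display setups discussed above.
# 4725x2560 is the triple-1440p-portrait desktop with bezel compensation;
# 3840x2160 is a single 4K panel.

def megapixels(width: int, height: int) -> float:
    """Millions of pixels per rendered frame."""
    return width * height / 1e6

single_4k = megapixels(3840, 2160)     # ~8.3 MP
triple_1440p = megapixels(4725, 2560)  # ~12.1 MP
triple_4k = 3 * single_4k              # ~24.9 MP

print(f"Single 4K:    {single_4k:.1f} MP")
print(f"Triple 1440p: {triple_1440p:.1f} MP "
      f"(~{(triple_1440p / single_4k - 1) * 100:.0f}% more than 4K)")
print(f"Triple 4K:    {triple_4k:.1f} MP "
      f"(~{triple_4k / triple_1440p:.1f}x the triple-1440p load)")
```

So the "50% more demanding than 4K" figure is close (about 46% more pixels), and triple 4K is roughly double the current setup's load, which is where the memory worry comes from.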

 
The good thing, at least from what I've read: DX12 will combine frame buffers, so if I ever do get two Nanos, I'll have 8GB of mem. :)

And you are rocking 2x 290X 8GB, Shadow?
 
My showdown of the two cards.

980 Ti

Buy 2 of them.

Fury X

Spend money on lobotomy instead.

:bleh:

But seriously.

980 Ti
+ Performance at 1440p w/ 144Hz is boss
+ Game Ready drivers for the most part work well
+ SLI is hassle-free for the most part, or quickly fixed
+ Free game (sold my 2 codes for 54 / 50 after fees)
+ Multitude of cooling options
+ Overclocking performance makes my nuts tingle
- Price
- Heat; I was getting used to 70C temperatures (still to test though)

Fury X
+ Good CF scaling in old games
+ Comparable performance at 4K
+ Cooling performance at load
- Reported CrossFire issues/choppiness in some newer games
- No DVI
- No HDMI 2.0
- Forced AIO setup (forcing me to change case if I go that route)
- Forced new monitor purchase OR purchase of an active DVI adapter, possibly adding latency issues
- Noisy pump at idle
- Cooler Master products lol
- High price
- 4GB of RAM
- Typically delayed drivers for the latest games (especially this year)
 
@demo

Oh I thought you saw the light and were finally ditching the bezels! :p and moving to a single large 4k display, in which case 6GB should cut it for a while.


Not sure I'd be comfortable with 8GB for 4K surround though; that's quite a load. A quick Google says GTA V uses up to ~14GB at that res, and I know it's not uncommon for games to go over 5GB on a single 4K display.

If you're going to go for such a display setup, you might as well do it properly and get 12GB Titan X's, tbh; not only for the VRAM, but also because I don't think quad 290X/390Xs will cut it at that res. A single overclocked Titan X is often around 295X2 speeds already.
 


I'm actually hoping for larger-than-27" 5K displays, which is still 14 million pixels per frame... Something in the 32" range at least; otherwise reading text in 2D within Windows might be problematic unless I'm really close to the screen... That's a lot of pixels... LOL.


The triple-4K option would likely suit the next 16nm generation, as those GPUs might be close to twice as fast as Fiji or GM200, and HBM 2.0 allows capacities up to 32GB using four stacks, which is nuts, but I could see a high-end gaming card using 16GB... The 16nm Titan X successor, for instance, going for $1000 a pop, might be a contender.


Cheaper 16nm cards, but still high-end ones in the usual $600-700 price range, might move to 8GB of HBM 2.0 by default. And yes, there are still bezels to deal with in triple monitor, but there are some good 4K displays at the $1000 mark these days using IPS panels that swivel on their stands and have pretty thin bezels... Asus recently announced one:


http://www.asus.com/Monitors/PA328Q/





As I said... pretty thin bezels and color-calibrated AH-IPS goodness... :drool:
 
Just wondering, but how much can the 980 Ti OC without voltage adjustments?
 


I've heard that 1200MHz is easily doable, which is impressive, but I do keep in mind that hardly any game really stresses the chip to 100% load, hence why there's so little difference in performance between the Titan X and the GTX 980 Ti, despite the latter having disabled shaders and texture units... It hardly matters in real-world terms.


The real test will come once games push the hardware hard enough to use all of its resources at 100%... Current overclocks might not hold up so well then without increased voltage, but that could take years, and new hardware will be out by then anyhow.
 
Ya, that is pretty good. Bit surprised that games don't max the hardware though. You'd think the hardware would get maxed out when facing FPS limitations at higher resolutions like 4K.
 


Higher resolutions can stress some parts of the chip, like fill rate and memory bandwidth. But if we're talking about a game that's already out and was developed before GM200 even existed, the number of polygons in-game is hardly stressing the geometry engine, or the game simply isn't demanding enough on the shaders to need all of them, and the same goes for the amount of texturing power available.


Changing resolution doesn't really stress those parts that much more, basically... Not at all in the case of geometry.
 
Equal gaming experience.

PLP support
Eyefinity support, flexible solutions.
New tech built for the future.

Buying Fury is a no-brainer. :)
 
It would be nice if TweakTown did frametime analysis as well. This FPS comparison doesn't really paint the full picture.
 
Guru3D still does; they did some FCAT testing in their Fury review. Bit of a pain with FCAT being DVI-only.

Yeah, they did it on the 980 Ti single-card, but the whole point of it was SLI/CFX.

As for Fury:
Note: The AMD Radeon Fury X does not have a DVI output. For FCAT at 2560x1440 (WHQD) we need a Dual-link DVI connector, for which we split the signal to the frame-grabber. This is not possible. We converted the HDMI output to DVI, however that's not a dual-link and as such the highest resolution supported is Full HD. So we had a dilemma, not do FCAT at all, or revert to 1920x1080. We figured you guys would love to see FCAT results, hence we compromised for Full HD over WHQD.

I don't get it.

I have two DisplayPort-to-DVI-D adapters on the shelf; they are not hard to get, and they will do 1600p:

http://www.newegg.com/Product/Produ...ayport_to_dvi-d_adpter-_-12-400-320-_-Product
 