Mother Of God Dual Fury-X GPU Officially Announced

Hapatingjaky



http://www.guru3d.com/news-story/dual-fiji-amd-radeon-pro-duo-announced.html

My God :twitch:

A monster of a card, especially in DX12 titles.

Too bad there are no real reviews as of yet. But damn, I would like to test one of these out.
 
I want to see benchmarks of this thing SO BAD !!!

Also, wondering how flexible the Fury design is. Could a few tweaks let the new HBM2 (or whatever the new memory chips will be) be dropped in, or would it require a whole new architecture, new memory controllers, and all that?
 
This product is more for the content creation industry, VR, and workstations.

As a consumer product it's a sitting duck because Polaris is coming... power consumption and thermals are ridiculous... no wonder it's water-cooled.

Imagine the PSU you would need with two of these dual Fury X Radeons in quad-CrossFire on a dual-CPU-socket workstation board... a 2,000 W PSU?!

The lights would dim and the electric company would either love you or bomb you.
 
This product is more for the content creation industry, VR, and workstations.

As a consumer product it's a sitting duck because Polaris is coming... power consumption and thermals are ridiculous... no wonder it's water-cooled.

Imagine the PSU you would need with two of these dual Fury X Radeons in quad-CrossFire on a dual-CPU-socket workstation board... a 2,000 W PSU?!

The lights would dim and the electric company would either love you or bomb you.


You know things are bad when the grow op next door complains you're drawing too much power.
 
I wanna see benchmarks too... but since this is a content creation workstation product... I really wanna see quad-CrossFire on a dual-CPU-socket workstation board... just to know, at least. I mean, at $1,495.99 for a single card and workstation CPUs costing pretty pennies... man, I just wanna see the maxed-out potential.

Like this now-old but still supremely powerful dual-socket Intel workstation board (by Asus), capable of quad SLI/CrossFire.

http://techreport.com/news/27041/asus-dual-xeon-workstation-board-exudes-glorious-excess

Polaris is bringing FinFET and a 14 nm process... I'm expecting comparisons later on.
 
Polaris isn't the silver bullet you think it is. It looks like it covers the low/mid-range parts using HBM1, while Vega in early 2017 uses HBM2 and is likely the big brother to Polaris, with Navi coming in 2018. AKA, AMD is doing a new architecture every year for the next three years.
 
Maybe not for a while, but by then the Fury X will be way over a year old, so I wouldn't doubt AMD having a budget part that closes the gap on the Fury X.
 
This product is more for the content creation industry, VR, and workstations.

As a consumer product it's a sitting duck because Polaris is coming... power consumption and thermals are ridiculous... no wonder it's water-cooled.

This. Well said.

Polaris isn't the silver bullet you think it is. It looks like it covers the low/mid-range parts using HBM1, while Vega in early 2017 uses HBM2 and is likely the big brother to Polaris, with Navi coming in 2018. AKA, AMD is doing a new architecture every year for the next three years.

It seems P11 is using GDDR5 (not GDDR5X, from what they're saying), and P10 might be using HBM1.
 
I think it's funny that this dual card has basically the same radiator as my Fury X. Also, isn't this more of a dual *Nano* card, not a dual Fury X, with slightly dialed-back specs?

Also, I've repeatedly heard that dual cards don't do well in VR, apparently due to latency concerns. Yet the PR for this card mentions VR, specifically in reference to LiquidVR: is this tech supposed to solve the latency issues with dual screens?
 
I think it's funny that this dual card has basically the same radiator as my Fury X. Also, isn't this more of a dual *Nano* card, not a dual Fury X, with slightly dialed-back specs?

Also, I've repeatedly heard that dual cards don't do well in VR, apparently due to latency concerns. Yet the PR for this card mentions VR, specifically in reference to LiquidVR: is this tech supposed to solve the latency issues with dual screens?

I'm expecting continued driver improvements from AMD that will address any supposed latency issues, or what have you, with dual-GPU cards and other CrossFire setups. They seem pretty serious about this whole VR thing, and I assume they want to make a good impression.

Also, why is it funny that the radiator is the same? I read this type of commentary in another forum as well and it makes no sense. The radiator on the Fury X is absolute OVERKILL. I can run the fan at 15% speed full time, with temps never exceeding the 50 °C range on my Fury X, and that's with 24 mV added and an overclock to 1100 MHz on the core and 545 MHz on the VRAM. There is plenty of headroom to cool a dual-GPU setup. Also, for all we know the pumps may have some improvements as well.
 
I think it's funny that this dual card has basically the same radiator as my Fury X. Also, isn't this more of a dual *Nano* card, not a dual Fury X, with slightly dialed-back specs?

Also, I've repeatedly heard that dual cards don't do well in VR, apparently due to latency concerns. Yet the PR for this card mentions VR, specifically in reference to LiquidVR: is this tech supposed to solve the latency issues with dual screens?

The latency comes from the PCIe bus.

Dual-GPU cards have a built-in PLX chip, so it's very low latency.
 
I'm still debating getting a pair of these for some quad-CrossFire insanity..... :nuts:

The only things that could beat these would be Vega or a gaming version of Pascal GP100, and neither may come out this year; they may only show themselves next year.

I'd strip the stock cooler in a heartbeat and use my own gear...... All I'd have to wait for are the water blocks from EK and I'd be good to go.
 
I'm still debating getting a pair of these for some quad-CrossFire insanity..... :nuts:

The only things that could beat these would be Vega or a gaming version of Pascal GP100, and neither may come out this year; they may only show themselves next year.

I'd strip the stock cooler in a heartbeat and use my own gear...... All I'd have to wait for are the water blocks from EK and I'd be good to go.

With my current setup I don't have room for a 2nd card, so I may seriously consider this thing if I see some announcements (followed by actual progress) toward improved multi-GPU compatibility in games. The idea of having this much power in my small setup is mouth-watering :lol:
 


http://www.guru3d.com/news-story/dual-fiji-amd-radeon-pro-duo-announced.html

My God :twitch:

A monster of a card, especially in DX12 titles.

Too bad there are no real reviews as of yet. But damn, I would like to test one of these out.

CrossFire/SLI isn't a hit in DX12 at the moment, as developers haven't learned to code for it and, ahem, support in the newer engines isn't here yet.

I won't buy CrossFire, as everything I read says it's a nightmare for games, and the same goes for SLI. Give us Vega with HBM2 and we can have some powa at 4K.
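For anyone wondering what "coding for it" actually means: under DX12 the old driver-side CrossFire/SLI profiles largely go away, and the engine itself has to enumerate the GPUs and manage them explicitly (explicit multi-adapter). Here's a minimal sketch of just the adapter-enumeration part, assuming a standard Windows SDK with DXGI/D3D12 headers; it's an illustration, not code from any actual engine:

```cpp
// Minimal sketch of DX12 explicit multi-adapter setup: the application
// (not the driver) enumerates every GPU and creates a device per adapter.
// Assumes a Windows SDK with dxgi1_4.h / d3d12.h available.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"GPU %u: %s\n", i, desc.Description);
            devices.push_back(device);
        }
    }
    // From here on, splitting frames or work across `devices` is entirely up to
    // the engine; nothing happens automatically the way DX11 driver profiles did.
    return 0;
}
```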
 
CrossFire/SLI isn't a hit in DX12 at the moment, as developers haven't learned to code for it and, ahem, support in the newer engines isn't here yet.

I won't buy CrossFire, as everything I read says it's a nightmare for games, and the same goes for SLI. Give us Vega with HBM2 and we can have some powa at 4K.


There was a time when it was far more immature and getting CrossFire profiles for the latest games could take time, but over the last couple of years, as long as the user is running the latest drivers, the odds are good that as soon as a new game is released there's already a profile for it.



The only thing that bugs me is that it's still the brute-force approach, in that the cards are designed with triple 8-pin power connectors, so with the additional 75 watts coming from the PCIe slot itself, that's up to 525 watts that can be supplied to one of these cards...... Add a second card for quad CrossFire, and that can go up to 1,050 watts just for the cards.... :lol:



I have 2,400 watts in total between a pair of 1,200-watt power supplies, and they're single-rail units on the 12-volt line, so that's 200 amps on the 12-volt rail, more than enough to power even the nuttiest setup. But it's still the brute-force approach just the same, and when gulping down that much power, water cooling to keep temperatures in check and the system quiet is mandatory, not optional.
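If anyone wants to sanity-check that math, the back-of-the-envelope version looks like this (assuming the usual spec figures of 150 W per 8-pin connector and 75 W from the PCIe slot; the actual board power limits could differ):

```cpp
// Rough power-budget arithmetic for the numbers quoted above.
// Assumed figures: 150 W per 8-pin PCIe connector, 75 W from the x16 slot.
#include <cstdio>

int main() {
    const int watts_per_8pin  = 150;  // spec limit per 8-pin connector
    const int watts_from_slot = 75;   // spec limit for a PCIe x16 slot
    const int connectors      = 3;    // triple 8-pin on one of these cards

    int per_card  = connectors * watts_per_8pin + watts_from_slot;  // 525 W
    int two_cards = 2 * per_card;                                   // 1050 W

    // Two 1200 W single-rail PSUs: available 12 V amperage.
    int total_psu_watts = 2 * 1200;
    int amps_12v = total_psu_watts / 12;                            // 200 A

    printf("Per card: %d W, quad-CrossFire pair: %d W, 12 V budget: %d A\n",
           per_card, two_cards, amps_12v);
    return 0;
}
```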


It really is going full retard.....:nuts:
 