AMD Polaris architecture for Arctic Islands / Radeon 400 series GPUs

Agreed, but it sounds like they may only hit one of those goals, and that is a lower-end Polaris part in the middle of the year. When is Pascal scheduled to hit?
 
You're getting paid to run it, right?:bleh:


Nope, it's my personal setup.......:D


Got fed up with incremental upgrades that improve performance 20~30% on the CPU side of things. Intel has been under no pressure at all in the higher-end / HPC / server market for the last several years, so they take their sweet time releasing updates.


Until we see what Zen is all about, the major update on Intel's end is the Purley platform in 2018: a brand-new socket (replacing socket 2011) with six DDR4 memory channels per socket, and 14nm Skylake-EP Xeons with up to 28 cores on a single die. A dual-socket board therefore maxes out at 56 CPU cores and 112 threads with Hyper-Threading.
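As a quick sanity check on those quoted Purley numbers (28 cores per die, two sockets, two threads per core with Hyper-Threading), the arithmetic works out like this:

```python
# Rough topology check for the quoted Skylake-EP / Purley platform.
# These figures come from the post above, not from a spec sheet.
cores_per_socket = 28
sockets = 2
threads_per_core = 2  # Hyper-Threading

total_cores = cores_per_socket * sockets        # physical cores across the board
total_threads = total_cores * threads_per_core  # hardware threads with HT

print(total_cores, total_threads)  # 56 112
```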



That would double the CPU performance I can hit with my current board... Wake me up in 2~3 years from now, assuming I actually need it by then....:p
 
They also need to release first and hope Nvidia hits a delay and is only on 16nm. AMD needs to hard-launch the high-end and midrange cards.

A delay does not seem likely to happen.
If AMD fixes their bottlenecks they might have a chance.
AMD has been too much of a tech company versus a company that can execute cards people want. Can they change that approach with Polaris? Let's hope so.
 

If they can execute what they are claiming as far as a big bump in performance per watt, they have a very good chance. But they've talked a good game before, and we've seen what happens when they don't deliver. I'm cautiously optimistic. Like Gandalf said, being first to market would help tremendously.
 

aye:)
 
Dropping two nodes could hypothetically mean 3 - 4x performance for the same die size, the same performance for 1/3 - 1/4 the die size, or any trade-off in between. In the real world, of course, growing pains could occur, but frankly, I'd be very disappointed to see anything less than 2x performance per size/power draw.

Both Nvidia and AMD should be dropping two nodes this year (remember 20nm was skipped due to issues with big chip yields), so we should expect drastic performance gains.
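As a hedged back-of-envelope sketch: if the marketing node names scaled ideally, area per transistor would shrink with the square of the linear feature size, so 28nm to 14nm would give roughly 4x density. Real processes (especially "16nm"/"14nm" FinFET, which largely reuses 20nm-class metal pitches) deliver noticeably less than this ideal.

```python
# Idealized density scaling between process nodes: area per transistor
# scales with the square of the linear feature size. This is an upper
# bound; actual node-to-node gains are smaller.
def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

print(ideal_density_gain(28, 14))  # 4.0  (ideal, not achieved in practice)
print(ideal_density_gain(28, 16))  # 3.0625
```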

The released info is the kind of vague teaser I'd normally ignore, but after 4+ years of 28nm stagnation, I'm excited. I have a 2160p monitor, and I finished playing Assassin's Creed Syndicate at 30 - 35fps with mostly low and medium settings. I'd love to play the next Assassin's Creed Shovelware (not official title) on high at 60.
 
I hear you...I think these next releases are going to be pretty exciting for the 4K crowd from both vendors.
 
By the way, if NV decides to water-cool their high-end card at stock, who wants to bet the NV fans will sing its praises....
 


I sure hope they don't, and I don't see why they would need to anyway, if they keep the same power envelope and TDP/TBP, since cards have gone to 2x 8-pin. The only reason would be to keep GPU temps down. But with 16nm and FinFET, that should ease the burden of chip temperatures, since they will be using less voltage and leakage across the gates is easier to control (leakage, voltage, and GPU temps are linked together). This was the reason 20nm without FinFET wasn't a good match for GPUs: the cost savings per transistor and the wattage used just didn't make enough of an impact from a financial point of view.
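The voltage/power link mentioned above can be illustrated with the classic dynamic-power relation P ≈ C·V²·f (a simplification that ignores static leakage; the numbers below are made-up illustrative values, not real GPU figures):

```python
# Classic CMOS dynamic power: P = C * V^2 * f.
# Capacitance, voltage, and frequency values here are hypothetical,
# chosen only to show how strongly power depends on voltage.
def dynamic_power(cap_farads: float, volts: float, freq_hz: float) -> float:
    return cap_farads * volts ** 2 * freq_hz

p_planar = dynamic_power(1e-9, 1.2, 1.0e9)  # hypothetical 28nm-class operating point
p_finfet = dynamic_power(1e-9, 1.0, 1.0e9)  # same clock at a lower FinFET voltage

print(p_finfet / p_planar)  # ~0.69: roughly 31% dynamic-power saving from voltage alone
```

Because power scales with the square of voltage, even a modest voltage drop from a FinFET process buys a large reduction in heat, which is why stock water cooling may be unnecessary.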
 


Given that both Nvidia and AMD will try to extract as much performance as they can from Pascal and Greenland, and will do so as long as yields are halfway decent, both are likely to push clock speeds pretty high right from the start, at the expense of power consumption, even with FinFETs and the 14~16nm process.


It remains to be seen whether Nvidia hops on the integrated water block / pump like AMD did on the Fury X. If nothing else, it made for a cool-running card that is pretty silent, and it is especially useful in multi-GPU setups: the system overall can stay fairly quiet even with two cards, since the vast majority of the heat is dissipated by the radiators, which aren't near the cards themselves and have a greater surface area than a regular heatsink bolted directly onto the card, never mind that water is a better heat conductor to begin with.



60~65°C under load is pretty impressive for a stock cooler.
 
Looking at what Nvidia did with Maxwell, it appears to me that they will still have the upper hand with the new Pascal GPUs. I am not an Nvidia fanboy, as I just bought Crossfire R9 390Xs. I am just being realistic, and I also look after my wallet, for which the R9 390X was the best choice :)
 
By the way, if NV decides to water-cool their high-end card at stock, who wants to bet the NV fans will sing its praises....

You act like all nVidia customers were bashing the AMD WC setup :nuts: Way to show your true bias. I hate it when someone claims an entire group of people thinks/acts the same way. People like you disgust and sicken me.
 