AMD Polaris architecture for Arctic Islands / Radeon 400 series GPUs

Ah, the revolving talking points; gotta love them. By the time said tech is mature, Nvidia will have something new. NV loves this brand loyalty: who needs forward-looking hardware, just buy new every year.
 
I am also getting fantastic fps in DOOM with both OpenGL and Vulkan, so I'm not sure what your point is.

I was talking about async compute in Ashes of the Singularity.

I never purchased a 780 Ti since I am not a chump. I am also not purchasing the 1080 for the same reason.

That AMD had nothing worth purchasing after the 7970 is quite true if you wanted performance, less noise, less heat, fewer driver issues, etc. AMD seems to have stepped up its driver development, but performance on their cards is quite inconsistent (some games show great performance while others consistently lose to Nvidia cards). At least with Nvidia you know what you get. With AMD you are at the behest of developers/the driver team/some other random excuse to see if you will get the performance you should with a new game release.
 
Oh c'mon, the 290-390 are cards with a lot of legs. Much better value than their Nvidia counterparts. They are 3-5 year cards, easily, for almost anyone. My 290 non-X is two years old and I can see it getting another two years easily with the new APIs.
 
Ah, the revolving talking points; gotta love them. By the time said tech is mature, Nvidia will have something new. NV loves this brand loyalty: who needs forward-looking hardware, just buy new every year.
I talk from an enthusiast perspective, not that of the average forward-looking gamer.

There are countless people to whom I have recommended a 290X/390X once drivers had become stable and the performance was there. Also, Nvidia still does not have a decent card at the $200 price point; the 1060 might finally change that if sold at MSRP.

Either way, I do not wish to continue this conversation with you, since you will never see any reason to doubt AMD and its focus on the enthusiast GPU market.
 
Oh c'mon, the 290-390 are cards with a lot of legs. Much better value than their Nvidia counterparts. They are 3-5 year cards, easily, for almost anyone. My 290 non-X is two years old and I can see it getting another two years easily with the new APIs.
See post below yours to understand what I am saying.
 
Either way, I do not wish to continue this conversation with you, since you will never see any reason to doubt AMD and its focus on the enthusiast GPU market.
You clearly don't know my post history; I have been highly critical of AMD many times. I also predicted (although that was easy to do if being objective) that GCN, especially Fury X, would really start to show its muscle going forward.
 
You clearly don't know my post history; I have been highly critical of AMD many times. I also predicted (although that was easy to do if being objective) that GCN, especially Fury X, would really start to show its muscle going forward.
What is this muscle you talk about? I still see the Fury X getting destroyed in most benches unless you put it against a stock 980 Ti, which it sometimes matches; and I believe the only person running a 980 Ti at stock speeds is a complete ****ing moron!
 
Android adopted Vulkan, so that's a bit more than nobody.

No one will use Vulkan on Android. Do you realize there are thousands of different Android devices available? No mobile dev is going to sit there and optimize for thousands of Android devices with a low-level API. Hell, no mobile dev even cares about graphics on Android!

Also, Apple isn't using Vulkan, pushing Metal instead. So no mobile dev will support Vulkan, period, unless they aren't going to dev for iOS, and that would be fantastically stupid; you have to support both Android and iOS. That means most mobile devs will be sticking with OpenGL for the indeterminate future. Maybe Epic will make Infinity Blade 4 on iOS with Metal. No one else will care.
 
Shader intrinsic functions are the equivalent of writing software in assembler. It's literally coding to the metal; the shader intrinsic functions are literally AMD-specific instructions. There is nothing like that for Nvidia because Vulkan is based on Mantle, and that's where the shader intrinsic functions come from, since Mantle was AMD's low-level API that nobody ever ended up using except DICE.

Not completely correct.
For starters, 'intrinsics' is a pretty poor name for them; really they are just functions which wrap/expose hardware functionality not exposed via other means (e.g. the language lacks a min3() function). Personally, when I think 'intrinsics' I think SSE/AVX instruction-level stuff, used because I know better than the compiler about the data.
(Although you'd hope a GPU driver could spot "p = min(a, min(b, c))" and translate that to p = min3(a, b, c), but that's another matter and doesn't cover all cases.)

Secondly, the intrinsics in use aren't baked into Vulkan; they are exposed via a series of extensions AMD released. NV could also release/support them if they wanted; they likely have the same functionality in hardware for many things (ballot, lane swizzles, 3-component min/max/med are all very likely). However, there is nothing AMD-specific in the SPIR-V language spec; NV, Intel, Qualcomm, ImgTec etc. would have blocked it.

Finally, and not directly related to that quote, DX12 can also use driver extensions to enable extra functionality; AMD did this for a time, exposing the same functions as in Vulkan, but pulled it (might have returned, might still be gone) due to driver issues. It is, however, possible for DX12 to get access to the same feature set... I'm also pretty sure that rolling those features/extensions into DX12 proper is an upcoming plan, along with a new LLVM-based compiler.

But the fact it wasn't even there to begin with is kind of mind-boggling. id has been a pretty big fan of Nvidia for a long time, in good part, I suppose, due to their strong OpenGL drivers, but in general due to their hardware as well. I can't remember how many Carmack presentations talked pretty glowingly of Nvidia.

Currently NV doesn't seem to expose extensions to allow those functions to be used in their Vulkan driver so iD can't use them.

I'm sure as soon as NV releases the drivers, id will flip the switch (it might even be automatic if NV just uses the same extension name as AMD).

Vulkan has a good chance of being widely used as it's not OS-specific. And not OS *version* specific at that.

OpenGL wasn't OS specific either, look what happened to that.

Everyone goes on about Vulkan and other OSes like it matters but... truth is.. it doesn't.

Broadly speaking, things break down as follows:
- Big company, internal engine: will already support DX11, so that covers older stuff. DX12 is the logical next step as it gains you both the PC and Xbox One markets in one hit. PS4 does its own thing. No need for Vulkan support as a) old OSes are covered by DX11 and b) Linux isn't large enough to worry about. Apple doesn't support Vulkan.

- Indie developer or company using a 3rd-party engine: will use whatever the engine supports; so when UE4 and Unity get API support, they will get API support, and the best API per platform will be picked. DX12 is likely a priority for both 3rd-party engines due to Xbox One support.

- Indie developer doing their own thing: likely to pick DX11 or OpenGL. Unlikely to be pushing hardware to such a degree that the mental overhead of using DX12 or Vulkan is worth the hassle.

Vulkan also comes with its own amusing set of issues:
- it seems to lack support for non-identical adapters, so you can do SLI/CFX, but only when the adapters are of the same family
- seems to be missing a few 'draw' command types which DX12 and OpenGL (via extensions) support... *twitches*
- Documentation...

... which is a personal bugbear of mine and why, despite liking the look of it in general, I've no real desire to work with it. Docs are behind a 'sign up' wall, and the docs you get with the (ever ****ing changing, which is a negative on its own) SDK seem to be one long PDF file. DX12, on the other hand, is documented via MSDN pages which don't require you to sign up for anything to read them. (I've complained about this since the (botched) SDK release... it seems unlikely to change.)

Vulkan has some nice concepts, but they are just making it slightly too annoying to bother with... and with Win10 adoption rates still good and somewhat inevitable (Steam Hardware Survey, June: Windows 10 64-bit at 42.94%), choosing Vulkan over DX12 is either historical (id) or political (foam-at-the-mouth frothing anti-MS types), but rarely logical.
 
No one will use Vulkan on Android. Do you realize there are thousands of different Android devices available?

And you know this how? Do you know every single Android developer out there, to say for certain no one will do it?
Do you realize that even though there are plenty of Android phones out there, the ones with GPUs that support Vulkan are few?
Do you realize that by using Unreal or Unity (when it supports Vulkan), for example, you the developer don't need to mess with Vulkan directly?
Do you realize that, as the developer of an Android game, you can decide what kinds of phones your game will support?
Making a game for Android is no different from making a game for PC; the zoo of configurations on PC is also staggering, but games are still made.


bobvodka said:
OpenGL wasn't OS specific either, look what happened to that.

Everyone goes on about Vulkan and other OSes like it matters but... truth is.. it doesn't.

It may not matter to you, but it matters to others. id Software used it, Croteam used it, others are using it. You hated OpenGL but people still used it; you don't care about Vulkan but people will still use it.

bobvodka said:
- Indie developer or company using 3rd party engine; will use whatever the engine supports; so when UE4 and Unity get API support they will get API support, best API for platform will be picked. DX12 is likely a priority for both 3rd party engines due to Xbox One support.

Unreal Engine 4 supports Vulkan, and Unity will support Vulkan. Why? It's the most-used engine for mobile games, and the only low-level API for Android is Vulkan.

bobvodka said:
Indie developer doing their own thing: likely to pick DX11 or OpenGL. Unlikely to be pushing hardware to such a degree that the mental overhead of using DX12 or Vulkan is worth the hassle.

Agree.

bobvodka said:
Vulkan also comes with its own amusing set of issues;
- it seems to lack support for non-identical adapters, so you can do SLI/CFX, but only when the adapters are of the same family
- seems to be missing a few 'draw' command types which DX12 and OpenGL (via extensions) support... *twitches*

Didn't know that about mGPU support; are you really sure about that?
"Seems to be missing", you know, is different from "is missing"; perhaps you didn't search well enough? But I also don't know, just messing with you. ;)

bobvodka said:
- Documentation...
... which is a personal bugbear of mine and why, despite liking the look of it in general, I've no real desire to work with it. Docs are behind a 'sign up' wall, and the docs you get with the (ever ****ing changing, which is a negative on its own) SDK seem to be one long PDF file. DX12, on the other hand, is documented via MSDN pages which don't require you to sign up for anything to read them. (I've complained about this since the (botched) SDK release... it seems unlikely to change.)

Documentation here is a matter of: if you really wanted to, you would find all the info you want, but you are already dismissing Vulkan from the start, so not having all the info in a single easy place just scares you away.

But there's plenty of info about implementing Vulkan. Along with the official documentation, there are YouTube tutorials (not plenty of them, but they exist), there's the AMD GPUOpen site with blogs about Vulkan (Nvidia, I bet, also has info on how to implement it), there's the Khronos site and forum if you want to ask questions, there's the Vulkan Twitter feed with news about releases and other info, and perhaps others I don't know of. Books will eventually be written as well, as they were for OpenGL. So supporting Vulkan is not a matter of documentation, it is a matter of will, and anyone that loves D3D and hates OpenGL, for example, will certainly lack the will to learn or support Vulkan.

bobvodka said:
...(foam-at-the-mouth frothing anti-MS types), but rarely logical.

You call others anti-MS loonies, but you are obviously very fiercely pro-MS; in the end I need to question your critical thinking and impartiality in this process as well.
 
Making a game for Android is no different from making a game for PC; the zoo of configurations on PC is also staggering, but games are still made.

Oh god... Android is so, so much worse right now; the same hardware can have different bugs depending upon the phone provider, never mind the Android version itself.

And the tools... oh god, the tools... or lack thereof... Ironically, MS seem to be doing the best job of trying to sort this out with their Visual Studio extensions. Google have basically failed multiple times to get any sort of sane tooling together, and the phone vendors continue to do things to dick developers over.

I rocked up at my new job around 9 months ago and declared Android to be **** and that I would do everything not to touch bugs on it. Pretty much everyone said it couldn't be that bad... one by one they have worked on bugs, and one by one they have realised just what a steaming pile of **** the OS is for developers to work with.

**** still gets produced, sure.. but **** me if I'll have any part in it until Google get their god damn house in order... shiiiit...



It may not matter to you, but it matters to others. id Software used it, Croteam used it, others are using it. You hated OpenGL but people still used it; you don't care about Vulkan but people will still use it.

As a selling point, however, it really doesn't matter a great deal; aside from 'future Android mobile', Vulkan really lacks a USP. Windows/Xbox has D3D12, PS4 has its own API, Wii/WiiU their own, Apple is doubling down on Metal... so that leaves Linux (vanishingly small market share) and Android, which, as noted, is 'future'.

OpenGL persisted, but largely via legacy or people paying others to do the work; that's why Linux and OSX ports of things tend to be outsourced to other companies.

Unreal Engine 4 supports Vulkan, and Unity will support Vulkan. Why? It's the most-used engine for mobile games, and the only low-level API for Android is Vulkan.

I should look again at UE4's support, but I'm pretty sure it wasn't complete, and it was certainly completed by a contractor outside of Epic. Unity-wise, I know some guys who work there, and thus I know full well where it stands on their roadmap.

While plenty of people have 'signed up' for Vulkan support, until the hardware and software hits the public expect people to continue rolling along with the OpenGL|ES support on Android and Metal on Apple (which is where the money is anyway).


Didn't know that about mGPU support; are you really sure about that?
"Seems to be missing", you know, is different from "is missing"; perhaps you didn't search well enough? But I also don't know, just messing with you. ;)

It's not a huge, major thing, all told, as you can do CFX with two compatible AMD cards; however, afaik you currently can't mix and match AMD and NV; the hardware has to be compatible. (I think it's the same as 'linked adapter' mode in D3D12, but with restrictions on devices.)
(Although it did put the brakes on an idea I had and was going to implement in the engine at work, because of compatibility issues which might arise... so there is that, I guess.)

Well, yes, that draw call thing is kinda the point of my documentation rant; I didn't believe it either, but when I went googling I ran into the wall which is the sign-in bullshit.

Documentation here is a matter of: if you really wanted to, you would find all the info you want, but you are already dismissing Vulkan from the start, so not having all the info in a single easy place just scares you away.

No, it doesn't "scare me away"; it's about time usage and accessibility. Basically, if I can't trivially google your docs then you can **** off; my time isn't worth spending on signing up or trying to read what seemed to be a ****ing spec.

MSDN does it right: details, cross-linking, and easy to google. If I want to look something up, don't make me go digging in a ****ing PDF file or sign up for something; not when a competing product, which was out before you, has everything out there to find with ease.

The choice continued the ARB tradition of brain dead decisions...

But there's plenty of info about implementing Vulkan. Along with the official documentation, there are YouTube tutorials (not plenty of them, but they exist), there's the AMD GPUOpen site with blogs about Vulkan (Nvidia, I bet, also has info on how to implement it), there's the Khronos site and forum if you want to ask questions, there's the Vulkan Twitter feed with news about releases and other info, and perhaps others I don't know of. Books will eventually be written as well, as they were for OpenGL. So supporting Vulkan is not a matter of documentation, it is a matter of will, and anyone that loves D3D and hates OpenGL, for example, will certainly lack the will to learn or support Vulkan.

I like the way you think I don't know about these things :lol:
I have to keep abreast of this stuff; my job basically requires it... but **** me, they don't make it easy.

The official Khronos YouTube videos... oh, those make me despair. GDC content went up with messed-up sound and no ETA on when it would be fixed... useful when finally fixed, however. Then they did a dev day recently, and the sound quality was terrible; I could hear the audience more than I could the speaker and had to give up because my ears were bleeding.

The Twitter feeds, both Khronos and Vulkan, generally feel more like marketing hugs than useful information... patting themselves on the back for being awesome in an undeserving manner.

I know about the GPUOpen website; it covers both Vulkan and D3D12, which is how I knew about the intrinsics stuff from earlier :)

You are right, it is a matter of 'will', and I wanted to do stuff with Vulkan because it had a few interesting ideas in there... but **** me, they made it hard to get information, which sucks the will to do anything with it... and most professionals learn this **** on their own time and apply it back at work, so making it hard to support your API is just basically shooting yourself in the feet over and over again.

(And for the record, I started my 3D programming life with OpenGL; at this point I'd say I've been living in the ARB/Khronos world for 17 years. To start with, OpenGL vs D3D was a discussion you could have... but **** up after **** up made it harder to come up with a reason to use it... Longs Peak being the nail in the coffin for me, and that wasn't their last **** up at all. I wanted to support Vulkan; I followed its development as keenly as I followed Mantle and D3D12, despite the architects being the same people who declared for a year that 'OpenGL was the best way!' and dismissed Mantle/D3D12/lower-level APIs as 'not required'... but they have made it hard... and frankly my time is just worth more... hell, I find writing forum entries to be a better use of my time than trying to penetrate the Vulkan docs.)

You call others anti-MS loonies, but you are obviously very fiercely pro-MS; in the end I need to question your critical thinking and impartiality in this process as well.

No, I'm pro making good choices in areas I care about.
So far MS are doing the best job of that, which is why I'd use their stuff. They aren't perfect and there are areas where they could improve... but when it comes to programming support and documentation... they get **** done.

I've not arrived at this point by accident... as mentioned, I started with OpenGL and cursed MS plenty when they did dumb things. I played with Linux and tried to code against X11 (mistake! *shudders*). I've used Android phones, supported OpenAL, and pushed alternatives when they have seemed like the best choice.

To misquote Tim Minchin, if you show me something better I will spin on a ****ing dime and embrace it.. I'm a pragmatist and have very little in the way of loyalty.

(I'll also be supporting our D3D12 and Vulkan code bases going forward at work... a position I put myself forward for in our team because I wanted to do such things... the Vulkan part is just going to be painful is all... *sigh*)
 
Android isn't going anywhere, and there are plenty of programmers willing to code for it. It is by far the largest mobile platform, so basically it will be prioritized accordingly.
 
Good reads there, bob. I guess Samsung's rumored threat of moving to its own 'Tizen' OS is due to Android issues?

Couple of questions. Do you expect those 'shader intrinsics' to be unveiled for DX12 this summer with the Windows 10 update? I thought I had read that Mantle had also been looked at by MS for DX12, as it was for Vulkan.

Also, some are pooh-poohing the Zen CPU, thinking it'll only match Haswell, when it has AVX-512, which is only seen in Skylake and above. Does that instruction set have potential, or will it be another FMA3 vs FMA4 story?
 
Just to help you, bobvodka, so you don't waste time searching for good Vulkan tutorials. ;P

[yt]wHt5wcxIPcE&list=PLUXvZMiAqNbK8jd7s52BIDtCbZnKNGp0P[/yt]
 
I guess Samsung's rumored threat of moving to its own 'Tizen' OS is due to Android issues?

Maybe, might just want to do an Apple/MS and have more control over the hardware/software integration.


Couple of questions. Do you expect those 'shader intrinsics' to be unveiled for DX12 this summer with the Windows 10 update? I thought I had read that Mantle had also been looked at by MS for DX12, as it was for Vulkan.

Maybe; I'm not sure, if I'm honest. Those functions are certainly on the roadmap, but it might be a 'phase two' thing, after the new compiler stuff is sorted, maybe towards the end of the year. It doesn't need to be timed with a major OS update, so they can just ship it whenever.

With regards to Mantle: yes, much like Vulkan, MS took it as a starting point, an execution model if you like, but they would have aimed for something which would suit all hardware; so including hardware-specific instructions in shaders, if everyone didn't agree, wouldn't have worked. I guess in theory they could have included 'max3' et al. and let the driver expand things out, but I suspect they just wanted to make minimal changes to the shader language/compiler, as that already worked. (They might also have expected the driver to do the transform I mentioned earlier.)

Either way, best not to focus too much on the Mantle of it all :)

Also, some are pooh-poohing the Zen CPU, thinking it'll only match Haswell, when it has AVX-512, which is only seen in Skylake and above. Does that instruction set have potential, or will it be another FMA3 vs FMA4 story?

Depends on how well it executes it, really; if it is dog slow then AVX-512 would be avoided on that CPU.
Still, AVX support isn't widespread really, so I wouldn't expect anyone to give it any special attention; SSE3 is about as safe as it gets right now. (See the Steam Hardware Survey.)

In short *shrugs* - we'll see how it stacks up when it is released I guess...
 
Damn... SSE3 is from 2004. Things got too complicated after that?

So for improved perf in new CPUs we need to look at things other than new instructions, I suppose, like cache amounts and latencies...?
 
No, it's to do with market coverage.

SSE2 and SSE3 are covered by around 99% of CPUs.
SSE4.1 and 4.2 drop to 85.51% and 82.19% respectively.
AVX falls to 69.28%.

While compilers can automagically make use of those instructions, for certain things you want to tune/write by hand via intrinsic functions, as you know the data flow better than the compiler. An SSE3 version is pretty much a no-brainer, but after that, adoption drops off.

AVX might well still happen, consoles have it after all, but that's about the limit of it.

Going wider also doesn't always help; memory fetch speed/latency tends to become a limiting factor. An AVX-512 load will consume a whole cache line on its own, and in games there is little point in consuming data for the ALUs faster than the memory subsystem can return it.
(For non-games this could be less of a thing; CPU can idle while waiting, improving power consumption, so AVX would be a win in that regard.)
 
Anywhooo, back on topic. Looking at the 1060 reviews I now feel more comfortable going with a 3rd party 480 8GB.

The cards actually stack up pretty similarly to the whole AMD vs Nvidia line-up. In DX12 the RX 480 is generally faster. In DX11 the 1060 is generally a bit faster (except in games that run badly on AMD, where even the Fury X sometimes loses to it :nuts: ).
 
The cards actually stack up pretty similarly to the whole AMD vs Nvidia line-up. In DX12 the RX 480 is generally faster. In DX11 the 1060 is generally a bit faster (except in games that run badly on AMD, where even the Fury X sometimes loses to it :nuts: ).

Yup, that's how I see it. In DX11 the 1060 is slightly faster, but in DX12/Vulkan the 480 is a little better. I'm really liking the parity now, and it looks like a 3rd-party 480 8GB for me.
 