
Are AMD Planning to become the de facto Graphics hardware platform?


    bit-tech.net published an interesting piece yesterday about the differences between console gaming and PC gaming, and how the DirectX API limits developers. In conversation with AMD's Richard Huddy, a few interesting tidbits emerged that could point to an aggressive new strategy for AMD to grab marketshare:

    'It's funny,' says AMD's worldwide developer relations manager of its GPU division, Richard Huddy. 'We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way.' Huddy says that one of the most common requests he gets from game developers is: 'Make the API go away.'

    Hold on, you might be thinking, weren't shaders supposed to enable developers to be more innovative with their graphics anyway? Indeed they were, and the ability to run programs directly on the graphics hardware certainly enables some flexibility, particularly once we got past the fixed-function shaders of DirectX 8. However, with the exception of a few quirky-looking indie titles, there's no denying that many PC games look very much like one another.

    'The funny thing about introducing shaders into games in 2002,' says Huddy, 'was that we expected that to create more visual variety in games, but actually people typically used shaders in the most obvious way. That means that they've used shaders to converge visually, and lots of games have the same kind of look and feel to them these days on the PC. If we drop the API, then people really can render everything they can imagine, not what they can see – and we'll probably see more visual innovation in that kind of situation.'

    Consoles also have a major bonus over PCs here, which is their fixed architecture. If you program direct-to-metal on the PlayStation 3's GPU, then you know your code will work on every PS3. The same can't be said on the PC, where we have numerous different GPU architectures from different manufacturers that work in different ways.


    Click the link for the full 3-page article.

    Running close-to-metal (CTM) poses problems for code portability and compatibility, which underpin the main benefits of gaming on the PC: piece-by-piece assembly and upgrades, with a wide variety of price points to suit different consumers.

    For AMD, the Fusion APU strategy provides a common hardware platform: a single product that can be targeted close-to-metal (CTM) simply and easily. Providing an interface that removes much of the latency and interpretation that slows down title development, while adding functionality and creative freedom, could offer a way to increase marketshare through better games and applications for users.

    AMD have openly and frequently stated that they advocate open and common standards rather than proprietary technology. APIs like DirectX come to be considered common standards through marketshare and adoption - if AMD can achieve significant market penetration with their APU strategy, will there be a case for CTM in specialized applications like games?

    Per-title CTM tweaking doesn't make much sense, but this is an age in which game engines are licensed and reused dozens of times, with four or five engines powering nearly all of today's triple-A titles. If each engine were CTM-tuned for AMD APU hardware, the performance benefit would carry across every title built on it, as sketched below.
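    What that engine-level split might look like, as a purely hypothetical sketch: the engine exposes one rendering interface, with a portable DirectX backend for arbitrary PCs and a CTM backend used only when a fixed, known APU is detected. All class and function names here are illustrative, not any real engine's API.

```cpp
#include <memory>

// Engine-facing interface: identical for every title built on the engine.
class IRenderBackend {
public:
    virtual ~IRenderBackend() = default;
    virtual void DrawIndexed(unsigned indexCount) = 0;
    virtual void Present() = 0;
};

// Portable path: routed through the DirectX runtime and the driver's HAL.
class D3DBackend : public IRenderBackend {
public:
    void DrawIndexed(unsigned indexCount) override { /* ID3D11DeviceContext::DrawIndexed(...) */ }
    void Present() override { /* IDXGISwapChain::Present(...) */ }
};

// Hypothetical CTM path: emits command buffers for one known APU directly,
// bypassing the API and driver layers. Written once per engine, not per title.
class ApuCtmBackend : public IRenderBackend {
public:
    void DrawIndexed(unsigned indexCount) override { /* emit draw packet into a hardware command buffer */ }
    void Present() override { /* kick the command buffer to the GPU */ }
};

std::unique_ptr<IRenderBackend> CreateBackend(bool knownApuDetected) {
    if (knownApuDetected)
        return std::make_unique<ApuCtmBackend>(); // CTM only where the hardware is fixed and known
    return std::make_unique<D3DBackend>();        // otherwise fall back to the portable API path
}
```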

    Console hardware drives the game-engine technology and methodology of most major developers. Until the gap in features and functionality narrows, consoles will continue to dictate terms to PC gaming, as games are ported from console to PC and gussied up with a little DX11 tessellation and SSAO/DoF/shadows. If the rumors that AMD is hard at work on a new design to power a 2012 next-gen console are true, it could give them the leverage to converge platform functionality and become the common standard.

    Another route to standardization could be hardware virtualization. A hypervisor exposing CPU, I/O, and GPU with high-performance context switching could let game engines run as if there were no underlying OS or APIs, executing only what they want to run. This maintains compatibility with existing hardware and titles, and allows a close-to-metal open standard to be developed within the hypervisor for accessing the GPU's capabilities directly.

    The second option, using virtualization to create a CTM environment, leaves the playing field level for all parties to take a crack at it - and for consumers to select the hardware that supports their desired price/performance/feature mix. The first option leads down a path of high-priced hardware, forced obsolescence, and slowed innovation. Of course, none of this may come to pass: Microsoft may be listening to ISVs and IHVs, planning and drafting future improvements to DirectX to keep everyone happy. Sounds likely, doesn't it?

    #2
    I guess I picked a good day to be off ill...

    Firstly, when it comes to performance, the biggest issue right now is that NV, AMD and MS have collectively failed to provide decent multi-threaded rendering to PC developers. Both sides are blaming the other; NV is a little further along (but still Doing It Wrong(tm)), while AMD seem to have lost interest and are focusing on their APUs rather than on this pretty critical driver point. DICE highlighted it recently and, afaik, right now aside from Civ5 everyone is seeing a performance LOSS going multi-threaded due to driver issues. (It'd be interesting to see where they are seeing this factor-of-two increase, as I've not seen it; the DX SDK sample sees a 10fps increase at best, DICE sees a loss, and our test framework for our next engine performs slower in multi-threaded mode than in single-threaded.)
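    For reference, the D3D11 mechanism under discussion is deferred contexts: worker threads record command lists that the main thread later replays on the immediate context. A minimal sketch, assuming the device and immediate context were created elsewhere, with error handling omitted:

```cpp
#include <d3d11.h>

// Minimal sketch of D3D11 multi-threaded rendering via deferred contexts.
// Assumes device (ID3D11Device*) and immediateContext (ID3D11DeviceContext*)
// already exist; error handling omitted for brevity.
void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediateContext)
{
    // Worker thread: each one gets its own deferred context.
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // Record state changes and draw calls into the deferred context here,
    // e.g. deferred->IASetInputLayout(...), deferred->Draw(...).

    // Close the recording into a reusable command list.
    ID3D11CommandList* commandList = nullptr;
    deferred->FinishCommandList(FALSE, &commandList);

    // Main thread: replay the recorded work on the immediate context.
    // Whether this is actually faster is entirely down to the driver,
    // which is exactly the complaint above.
    immediateContext->ExecuteCommandList(commandList, FALSE);

    commandList->Release();
    deferred->Release();
}
```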

    Secondly: stability. If AMD or NV can't write drivers which keep the system stable via a known API, then how are developers going to do it? Right now, with a recent set of Catalyst drivers, when I start DoW2: Retribution I know I'm going to have to sit and wait a minute or two at some point while the drivers go through an internal reset phase - greying out my screen, corrupting my mouse pointer, and repeating the greys - before recovering to let me play the game. And that's via an API which backs onto drivers created by the company in question.

    Thirdly: we most certainly DO use a set of APIs on consoles. One is very close to DX9 (360) and one is very close to OpenGL (PS3); yes, there are some extras you can play with (direct command buffer creation on the PS3, direct command buffers on the 360), but this is only doable because we know the hardware. The lack of overhead comes from not having a HAL between the code and the hardware because... well, you don't need to abstract when the hardware is known.

    So, what do they want to do here? Have developers support 360 (DX), PS3 (PSGL), PC DX, PC AMD and PC NV APIs? The chances of getting NV and Intel behind this idea are slim (they have enough problems with OpenGL), and it would just fragment the PC space even more.

    Even with this idea you are going to need an API on top, and then we have the added problem of platform differences. Take the recent change from VLIW5 to VLIW4 on AMD's GPUs; if code had been written 'to the metal', suddenly everything that took advantage of co-issue and known hardware details would be useless and have to be re-coded! No small matter - plus it would require AMD to be VERY open VERY early with their hardware details, and to supply hardware to the people who want to do this. And then it all rests on market share being big enough to make it worthwhile for developers: when AMD or NV release a new card, that card's market share is effectively 0% if the underlying instruction set/hardware is different, which begs the question of which developer is going to sink at least 2+ man-months into optimising for hardware that might never gain significant market share. Now factor in that the hardware changes roughly every 18 to 24 months, and frankly it's workload++ and cost++ to support this.
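    Incidentally, this insulation is what the bytecode step in DirectX already buys: HLSL compiles to vendor-neutral bytecode, and the driver performs the final lowering to VLIW5, VLIW4, or whatever ships next when the shader object is created. A minimal sketch using the D3DCompile API, with error handling omitted:

```cpp
#include <d3dcompiler.h> // link against d3dcompiler.lib

// Compile HLSL source to IHV-neutral bytecode. Shipping this bytecode means
// a VLIW5 -> VLIW4 transition is the driver's problem, not the game's.
ID3DBlob* CompilePixelShader(const char* source, size_t length)
{
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors = nullptr;
    // "ps_5_0" targets a Shader Model, not any particular GPU's instruction set.
    D3DCompile(source, length, nullptr, nullptr, nullptr,
               "main", "ps_5_0", 0, 0, &bytecode, &errors);
    if (errors) errors->Release();
    // The returned blob goes to ID3D11Device::CreatePixelShader, where the
    // driver does the final, hardware-specific compile.
    return bytecode;
}
```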

    Yes, as a developer I'd like to have better performance and a thinner layer to the hardware, but at the same time on a platform like the PC I accept that there is always going to BE a layer there because we can't do everything.

    And if AMD do this then NV will do it too and, well, historically who throws more money at devs? Yep, team green. So suddenly all those TWIMTBP games start performing better on NV hardware than AMD, get more exclusive features ("Well, we would have done it for AMD, but NV paid us X amount to do the work, which paid for the dev time to do it"), and AMD's bright idea starts to backfire.

    As for 'games looking the same': this is down to common engine usage (a problem which would only get WORSE with this setup, as more and more companies would fall back to pre-made engines from companies with the time to support 4+ APIs and optimise them) and artwork, far more than it is down to the API.

    Frankly, I think AMD are pinning too much on APUs. Don't get me wrong - as an idea I like them - but AMD are positioning them more and more as The One True Way (to the extent that they are lagging behind NV on their DX11 driver work), expecting them to work some kind of huge miracle that gains enough market share to start throwing their weight around, and I just don't see it.

    Right now, as we get ready for a new project on an 18+ month timescale, we are having to fight to drop DX9 support (the rendering team want it gone; the producers/higher-ups aren't convinced... insert head-banging here) and support only DX11 on the PC. One of the producers has even floated the idea of dropping PC support; if this kind of idea became reality, it would just make such a choice even easier for them: "Why are we coding for 4+ APIs when most of our market comes from 2? Let's just drop the others..."



      #3
      The problem with everything looking the same is simple: 98% of all games use the Unreal Engine, which doesn't fully utilize the hardware capabilities of video cards. Maybe that will change with the most recent update to the engine, but I doubt it. They always show awesome-looking tech demos with each generation of the engine, but games have never come close to looking like those demos.



        #4
        Not only that, but 90% of games are developed for machines that have 512MB of RAM and target 720p.

        PC versions sometimes get better textures, better shadows and more options to tweak performance (like AF and AA). You can only do so much with 512MB of RAM (which goes about as far as 1.5GB on a Vista/7 PC, not counting video RAM).



          #5
          the problem always lies in the drivers and the laziness of IHVs to bother...

          no open bug-tracker system
          no timely, public fixing of issues
          no clear road ahead with milestones defined in public

          in the end you get titles released that behave differently, or fail to work outright, on each driver and each card series

          yet you've spent N man-hours across multiple coders making sure it works on the drivers and hardware the IHVs provide

          maybe it's time to finally concentrate on quality, stability and open driver development instead of this mess



            #6
            Originally posted by bobvodka View Post
            Thirdly: we most certainly DO use a set of APIs on consoles. One is very close to DX9 (360) and one is very close to OpenGL (PS3); yes, there are some extras you can play with (direct command buffer creation on the PS3, direct command buffers on the 360), but this is only doable because we know the hardware. The lack of overhead comes from not having a HAL between the code and the hardware because... well, you don't need to abstract when the hardware is known.

            So, what do they want to do here? Have developers support 360 (DX), PS3 (PSGL), PC DX, PC AMD and PC NV APIs? The chances of getting NV and Intel behind this idea are slim (they have enough problems with OpenGL), and it would just fragment the PC space even more.
            The Xbox 360 IS a DirectX 9c-based game development platform, nothing more. There are no direct GPU/CPU calls, as all games have to comply and conform to Microsoft's rules for their box - unlike the original Xbox, where some developers were allowed to port OpenGL games and program custom to the hardware, and those games still have problems with backwards compatibility.

            Basically, any Xbox 360 game can - if the developer/publisher feels like it - be ported over to the PC, where it will require at minimum an SM 3.0-compliant, DirectX 9.0c-compliant (I know, redundant, huh?) graphics card to run, even at full graphics quality settings.

            The PlayStation 3, like the PlayStation 2, PS1, Sega Saturn and the like, is a traditional console system where programmers have to use custom game development tools to make games, not APIs. The API route is what Microsoft pursues because they profit from selling their software, and they take a cut of the profit on every Xbox 360 game, unlike on the PC.



              #7
              I think this is the start of a strategy to lock down GPU hardware from an instruction-set point of view. Rather than the compiler having to change every generation for wildly new hardware, new hardware will be added that improves the performance of the existing instruction set (similar to what the x86 CPU market does).
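              As a rough illustration of that x86 model: one stable instruction set, with new capabilities exposed as additive feature bits that a single shipped binary can probe at runtime. A minimal, MSVC-specific sketch (GCC/Clang offer __get_cpuid in <cpuid.h>):

```cpp
#include <intrin.h>
#include <cstdio>

int main()
{
    int regs[4]; // EAX, EBX, ECX, EDX
    __cpuid(regs, 1); // leaf 1: processor info and feature bits

    // The base ISA never changes; new capabilities appear as feature bits.
    bool hasSSE2 = (regs[3] & (1 << 26)) != 0; // EDX bit 26
    bool hasAVX  = (regs[2] & (1 << 28)) != 0; // ECX bit 28

    // One binary runs everywhere; faster paths light up where supported.
    std::printf("SSE2: %d, AVX: %d\n", hasSSE2, hasAVX);
    return 0;
}
```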

              If I read this proclamation correctly, what AMD is saying is: "Soon all of our future GPUs will have the same shader assembly interfaces. Changes will mostly be about the number of cores, cache, and efficiency."

              From following Beyond3D and seeing only granular changes since DX10, this may already have occurred (?), and they are just pushing the CTM agenda now that the hardware has been standardized for a couple of generations.
