Doom alpha VGA benchmark ...

I would disregard both camps' mGPU performance, since it clearly looks like neither is working.

Single GPU results are shocking, though. A 7970 outperforms a 980! I wonder if async compute has anything to do with this (FTR, I'm just speculating here).
 
Remember the days when Nvidia just destroyed AMD at OpenGL? Now the tables have turned.

But a PC game that is locked at 60 FPS and 60 Hz, with no multi-GPU support, when Vulkan is designed with multi-GPU in mind? No thanks, I'll wait for the actual PC version when the BFG Edition comes out in a year.
 
Apparently the benchmark automatically adjusts graphical settings based on the video adapter ID? The only constants between cards are the resolution and the AA setting. So, for better or worse for either brand, these results mean squat.

One thing that does stick out is the plain bad performance, considering the use of SVO (sparse voxel octree) tech, which means static lighting with dynamic lighting effects on top of it. The lighting inconsistencies are easily noticeable on the enemies and remind me of RAGE.
 
Isn't AMD better at Vulkan?

I'd think so, but obviously this bench has driver issues, starting with the lack of SLI/CrossFire support. I expect the 290 to do well, but obviously Nvidia will fine-tune. It'd be cool if the game still sees the 290 that I paid less than $300 for get close to a 980 Ti, though.
 
Remember, though, the game is capped at 60 FPS/60 Hz, so does it need CrossFire or SLI? Apparently not.

Nvidia needs to make up 4 FPS with driver optimization, which will probably get done. It's an id game, so the graphics are going to suck anyway, judging from the bullshots and canned videos. This game will come down to gameplay.
 
I'd think so, but obviously this bench has driver issues, starting with the lack of SLI/CrossFire support. I expect the 290 to do well, but obviously Nvidia will fine-tune. It'd be cool if the game still sees the 290 that I paid less than $300 for get close to a 980 Ti, though.
id Software has always had problems with SLI/CrossFire support, and CrossFire more so.

RAGE sucked with CrossFire.
 
There is a big love for OpenGL at id Software. Seems the new love now is Vulkan.
With DX12 and Vulkan, as an owner of an Nvidia card I feel like I'm in a beta stage... :D
Seems AMD gambled that by releasing Mantle they would turn the industry inside out...
 
It's an id game, so the graphics are going to suck anyway, judging from the bullshots and canned videos. This game will come down to gameplay.

When did id Software games become correlated with bad graphics and bullshots? Serious question, because the claim above is not what I remember id Software for.

And the Doom 4 graphics suck?! Sometimes I just don't know what makes good graphics for some people, and in case some don't know, the guy who made CryEngine what it is, Tiago Sousa, is working on the new id Tech 6 engine.

And yes, id Software loves OpenGL, but there were really important historical reasons for it: first, id was a supporter of OpenGL from the beginning, even before DirectX existed; second, all id games had to run on Linux; and third, the engine had to use almost exclusively open software, so it could be made open source after some years, like John Carmack wanted.

That didn't prevent others, though, from implementing a D3D renderer on their version of the id Tech engine. For example, all Call of Duty and Medal of Honor games based on id Tech 3 have a D3D renderer, "Wolfenstein" (2009) on id Tech 4 uses a D3D renderer, "The Evil Within" on id Tech 5 uses a D3D renderer, and last but not least, on Xbox all id games use a D3D renderer.
 
Isn't AMD better at Vulkan?

Remains to be seen; however they did suck at OpenGL performance much like they sucked at D3D11 performance when compared to NV.

Single GPU results are shocking, though. A 7970 outperforms a 980! I wonder if async compute has anything to do with this (FTR, I'm just speculating here).

Unlikely, although if this time id used the native API's compute instead of using CUDA on NV and nothing on AMD, it might help matters (see: the Rage release).

Most likely, as I said above, this is a case of the Vulkan drivers not sucking as much as their OpenGL drivers do.
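For what "native API compute" would even mean there, a purely illustrative sketch (the shader, buffer sizes and GLFW/GLEW setup are my assumptions, nothing from Rage or Doom): compute work dispatched through OpenGL itself runs on any vendor's driver, unlike a CUDA path that only NV hardware executes.

```c
/* Hypothetical illustration only: a GLSL compute shader dispatched through
 * OpenGL 4.3, vendor-neutral by construction. All names and sizes are made up. */
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#include <stdio.h>

static const char *kComputeSrc =
    "#version 430\n"
    "layout(local_size_x = 64) in;\n"
    "layout(std430, binding = 0) buffer Data { float v[]; };\n"
    "void main() { v[gl_GlobalInvocationID.x] *= 2.0; }\n";

int main(void) {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);   /* compute needs GL 4.3+ */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);        /* offscreen context */
    GLFWwindow *win = glfwCreateWindow(64, 64, "compute", NULL, NULL);
    if (!win) return 1;
    glfwMakeContextCurrent(win);
    glewInit();

    /* Build the compute program. */
    GLuint shader = glCreateShader(GL_COMPUTE_SHADER);
    glShaderSource(shader, 1, &kComputeSrc, NULL);
    glCompileShader(shader);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, shader);
    glLinkProgram(prog);

    /* Upload 256 floats, double them on the GPU, read them back. */
    float data[256];
    for (int i = 0; i < 256; ++i) data[i] = (float)i;
    GLuint ssbo;
    glGenBuffers(1, &ssbo);
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, ssbo);
    glBufferData(GL_SHADER_STORAGE_BUFFER, sizeof(data), data, GL_DYNAMIC_COPY);
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);

    glUseProgram(prog);
    glDispatchCompute(256 / 64, 1, 1);               /* 4 workgroups of 64 */
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);

    glGetBufferSubData(GL_SHADER_STORAGE_BUFFER, 0, sizeof(data), data);
    printf("data[10] = %.1f\n", data[10]);           /* expect 20.0 */

    glfwTerminate();
    return 0;
}
```

The point being that a compute shader like this doesn't care whose GPU it lands on.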

Remember the days when Nvidia just destroyed AMD at OpenGL? Now the tables have turned.

But a PC game that is locked at 60 FPS and 60 Hz, with no multi-GPU support, when Vulkan is designed with multi-GPU in mind? No thanks, I'll wait for the actual PC version when the BFG Edition comes out in a year.

NV have always been driver performance kings; they throw resources at it. D3D12 and Vulkan have less performance wiggle room and rely more on what the apps are doing - so when you remove the driver overhead, much like with the D3D12 benchmarks floating about before now, you see a better picture of how the games themselves are driving the GPUs.
(As to how optimal the games are; well, the IHVs can still advise, of course...)

As to the second point; no. Vulkan was not designed with mGPU specifically in mind.
Vulkan serves the same purpose as D3D12: take an old API with a less-than-optimal design and produce something that more closely matches the hardware, removes driver overhead and allows for more control, which does, like D3D12, bring mGPU support with it.
(Just, ya know, more so, because while D3D11 had a degree of suck about it, after years of piling on crap, from a driver writer's PoV OpenGL suuuuucked!)
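To make the mGPU point concrete, a minimal sketch (not from Doom or any id code; the application name and array size are made up, and it assumes a Vulkan loader and driver are installed): under Vulkan the application sees every GPU explicitly and has to decide itself how to spread work across them, rather than inheriting an SLI/CrossFire profile from the driver.

```c
/* Hypothetical sketch: enumerate every physical GPU visible to a Vulkan app.
 * Any multi-GPU scheme (AFR, split frame, ...) is something the engine builds
 * on top of this itself, not a driver toggle. */
#include <vulkan/vulkan.h>
#include <stdio.h>

int main(void) {
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "mgpu-enum",          /* made-up name */
        .apiVersion = VK_API_VERSION_1_0,
    };
    VkInstanceCreateInfo ici = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "no Vulkan instance available\n");
        return 1;
    }

    /* Ask how many GPUs there are, then fetch their handles. */
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    VkPhysicalDevice gpus[16];
    if (count > 16) count = 16;
    vkEnumeratePhysicalDevices(instance, &count, gpus);

    for (uint32_t i = 0; i < count; ++i) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpus[i], &props);
        printf("GPU %u: %s\n", i, props.deviceName);
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}
```

Which is why whether a game actually gets SLI/CrossFire-style behaviour under Vulkan is up to the engine, not the driver.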
 
and third, the engine had to use almost exclusively open software, so it could be made open source after some years, like John Carmack wanted.

... which has nothing to do with the OpenGL usage, which was, as noted, historical.
(In fact, the first renderer was software; OpenGL support was added later via QuakeGL, if memory serves.)

Repeat after me; OpenGL is not open source.

They could have open sourced any of the engines regardless of the graphics API used.
 
... which has nothing to do with the OpenGL usage, which was, as noted, historical.
(In fact, the first renderer was software; OpenGL support was added later via QuakeGL, if memory serves.)

Why the tone, and where did I say OpenGL was open source? Yes, the first renderer was software, but we were discussing id's love for this API, so I had no need to bring the older software renderer into the picture, nor did I claim OpenGL was the first renderer. And it is GLQuake, btw, but that is just a small detail.

Repeat after me; OpenGL is not open source.

Repeat after me; I need to read better and stop making assumptions.

They could have open sourced any of the engines regardless of the graphics API used.

Yes, you could, but then you would be forcing users to write an OpenGL renderer for Linux or Mac, and John didn't want that. He was open sourcing his engines for a reason, for people to learn from him, so it was important that they had access to everything, including the renderer, no matter the OS, and only OpenGL gave him that option.
 
Any tone found was assumed and injected by yourself - I was simply pointing out perceived faults in your reasoning that the engine could only be open sourced because it used OpenGL - this was heavily implied/linked in the full paragraph that fragment was lifted from.

The open source thing was a misreading on my part, but frankly everyone makes that mistake so often that it bears repeating anyway - although my assertion that it had nothing to do with the open sourcing of the engines remains.
 