Official Navi2x/6x00 series thread

we know

you said you would last to the AMD review

but then you blew your load when crashcar24 undid his first button :hmm:


kids today, no staying power :p

Well hell yea, and it turned out to be the right choice on several levels, the top being he could actually buy one! :p
 
the 3080, and somewhat the 3090, are in for a death of a thousand cuts in the benchmarks and news


a 6800 XT reference card hitting 2.65GHz on air beats a 3090 on LN2

wait till the 25th, when the aftermarket cards like the Strix and that Red Devil hit and the reviewers get them and start pushing them

and if the AMD aftermarket cards hit with better stock like some say, and stock picks up by the 8th like past AMD cards did after the first few weeks
(and yes, both are normally slow on stock to start and pick up in a few weeks; NV didn't this time, so we will have to wait a month and see if AMD can)

but the 6900 XT may well kill the 3080 Ti before it is even out,
and if they make aftermarket cards like a 6900 XT Red Devil, it will
 

:lol: :lol: Billy, we already heard all that nonsense, it's just not the case.
 
Bots gonna buy all those too, bro. Nike, please release a bunch of new Jordans and whatever else people like to waste money on, ASAP.
 
I'll be trying to get a 6800 XT or 6900 XT. Using the 3080 till I can get a card; worked it out with the buddy who is buying it.

 
So, after watching some more reviews:

It looks to me like for 4K you want the RTX 3000 series. For 1440P and below the 6800 series looks good, at least if we're talking about rasterization. I have to wonder whether it's the limited memory bandwidth that's holding the cards back in 4K.
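For what it's worth, a quick back-of-envelope on that bandwidth question. These figures are from the public spec sheets, not from any review, so take them as a rough sketch of the raw gap Infinity Cache has to make up for:

```python
# Rough raw-bandwidth comparison from public spec sheets (my numbers, not
# from the reviews): GB/s = (bus width in bits / 8) * data rate in Gbps.
def raw_bandwidth(bus_width_bits, data_rate_gbps):
    """Raw memory bandwidth in GB/s for a GDDR bus."""
    return bus_width_bits / 8 * data_rate_gbps

rx_6800_xt = raw_bandwidth(256, 16)  # 256-bit GDDR6 @ 16 Gbps -> 512 GB/s
rtx_3080   = raw_bandwidth(320, 19)  # 320-bit GDDR6X @ 19 Gbps -> 760 GB/s

print(f"6800 XT: {rx_6800_xt:.0f} GB/s vs 3080: {rtx_3080:.0f} GB/s")
print(f"3080 advantage: {rtx_3080 / rx_6800_xt - 1:.0%}")  # ~48% more raw bandwidth
```

So on paper the 3080 has nearly 50% more raw bandwidth, which would line up with the 6800 series falling off at 4K once the cache hit rate drops.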

Ray tracing performance is passable, but a little disappointing. The lack of a DLSS equivalent really hurts, since ray tracing is basically unplayable in some games without it. I don't fully cut AMD slack on this based on it being their "first gen of ray tracing", because Nvidia didn't really see much improvement in ray tracing with Ampere, outside of it simply scaling with the cards being faster. I think it's still basically 1st gen vs 1st gen and AMD's implementation is not as good. It could be that newer drivers and tweaks by game developers will improve it, but the jury is still out on that one.

The efficiency is nice compared to Ampere; however, not all places saw the large difference that TechPowerUp did. I guess it depends on the game? Regardless, greater efficiency is still a plus.

Overall I think the cards are priced correctly, and are a decent product, not an amazing product. (However by that same measure, Ampere is also only a decent product.)

It's still impressive that AMD is back in the game and actually is more efficient for a change. This is their most competitive card released in a long time; I'd say going back all the way to the 290X, released 7 years ago. Hopefully this is the start of consistent improvement on the part of AMD, which means competition will heat up again in the GPU market.

In the end how well the cards do will come down to availability. If the AIBs show up in force next week then AMD will get a lot of sales. If the GPU shortage continues then the real determination will come down to how many cards AMD and Nvidia can deliver, with both selling everything they make.
 
Tessellation disabled... and I'd chalk a good portion of that score up to the LN2 5950X. You'd have to look at graphics scores to compare the GPUs.

AMD has historically been superior in Fire Strike, for whatever reason. They are slower in Time Spy though... no clue why.

Yeah exactly, Tess off and who knows what else, in 1080p FS with a 5950X @ 5.4. As you know, AMD does unnaturally well in FS for whatever reason and the scores don't translate well to other benchmarks or games.

In his comments he states the GPU score with Tess on is 'close to 50k' (for reference, I just scored 51k), and he has that benchmark run hidden, presumably because it doesn't look so hot. Then he goes on to say raster gaming is around 3080 level and DXR gaming around 2080 Ti level. No surprises there. He also predicts single-digit gains with the 6900 XT.
 
[yt]uaxnvRUeqkg[/yt]

@18:02

They are doing something interesting with Infinity Cache that they haven't announced yet. But some games have had insanely high 1% low scores. IC has some real potential.

Other time stamps:


@7:50 Kudos to AMD for launching a competitive card, with a trifecta kind of staggered launch: AMD is launching CPUs/GPUs/consoles.

@9:35 Both the CPU and GPU teams collaborated to bring out the RX 6000 series. Better power-to-performance metrics.

@14:25 AMD will have laptop variants but wouldn't comment on or elaborate about when/how/etc.

@18:02 Reason for using Infinity Cache. In particular, they have another announcement they are making about Infinity Cache. He didn't commit to a time frame, but said it will be pretty soon (tm).

@19:49 VRAM discussed: using 16GB of VRAM and Infinity Cache, the 6000 series still uses less power than Ampere.

@22:00 AMD will ensure that developers take advantage of the AMD ecosystem between consoles/PC, as developers don't want to worry about a myriad of PC configurations; they prefer a closed form factor. Because AMD's uarch is used in the consoles, game developers will use more VRAM, RT (the way AMD wants it), and all the other things AMD wants in games.

@26:40 RT/DLSS discussed. He stated that as new games are launched they will improve RT performance, but he emphasized new titles. He also emphasized that when you are coding for the console you are coding for AMD.

@28:30 DLSS discussed. AMD was originally going to develop their own API. Developers begged AMD not to create another API (it would only work with Radeon), so they are going for an open solution. Developers do not like having to fetch AMD/Nvidia reps to come on site to help code for their games (that's some juicy gossip right there, I didn't know that). They are still working on this with developers, which is why it's not ready yet. (Or else they would have launched this API he speaks of.)

@32:25 (several minutes) Smart Access Memory discussed. AMD never said that SAM wouldn't work with other hardware; AMD simply focused their efforts on this generation of hardware: validation work, communication protocol work, tweaking, etc. He said they are still undecided on older hardware but they love backward compatibility... they are still evaluating it. MORE PERFORMANCE IS COMING AS IT MATURES. (Is that tied into the Infinity Cache announcement he mentioned earlier??? Hmmm...) But it is clear that Intel has to be involved with their BIOS, etc. to get it to work for Nvidia. It's not just an Nvidia thing; it's more than just a driver update. Nvidia told PCWorld they felt that AMD would hard block them. Scott said they wouldn't hard block them. (Perhaps soft block them, lol /s)

@37:45 Question: Why are you doing that (allowing Nvidia in for SAM on AMD) when Nvidia tried to flip FreeSync to G-Sync and outmarketed AMD in branding? Do you run the risk of Nvidia re-marketing/rebranding SAM as something Nvidia will claim as their own? Answer: To be determined...
(I'm flabbergasted. Perhaps he can't say in front of the camera, but I hope AMD isn't dumb enough to let Nvidia in without charging monthly royalty fees/SAM for rent.)

@38:40 Smart Shift Discussion

@41:00 Is Infinity Cache just L3 cache? Mainly L3 cache with special sauce. He won't reveal exactly what it is yet.

@45:00 Sapphire does not help design the reference PCB

@51:45 Availability discussed. They are shipping cards every day. They like the EVGA queue system but wouldn't elaborate. AMD's website was able to stop scalpers; other partners were able to stop some of them.

@55:50 RT will be rolled out across the entire RX 6000 series stack.

@56:30 Variable Rate Shading discussed (a few minutes), along with Radeon Boost with ray tracing to improve performance

@57:10 DirectStorage coming... but not elaborated on
 
This part was interesting:

DLSS Discussed. Originally was going to develop their own API. Developers begged AMD not to create another API (would only work with Radeon). So they are going for an open solution.

DLSS may end up dead, with devs opting for an open alternative used on consoles. Nvidia's only hope would be to market DLSS as having superior IQ.
 

I don't think you are necessarily being fair in your view on ray tracing. Nvidia has been working on their generation 1 for 2 years, and when their ray tracing was first introduced 2 years ago, its performance was much, much worse than it is today, as well as running into bugs. So I don't think you can call it first gen against first gen unless you go back and compare day 1 AMD to day 1 Nvidia. Nvidia required multiple patches in BF5 to deal with some of the issues, along with performance optimizations, and that wasn't even full illumination. They have also had 2 years to optimize it via drivers. The fact that Ampere isn't really an improvement is a huge black mark on Nvidia. Out of the gate, AMD is expected to do full illumination from the get-go and shine? Yet full illumination didn't even happen for Nvidia until Metro Exodus. Not to mention that Nvidia had their hand in the cookie jar the whole time, having all these games optimized for them. So there should be a little more slack given to AMD than what it appears you are giving. Just my opinion.
 
This part was interesting:

DLSS may end up dead, with devs opting for an open alternative used on consoles. Nvidia's only hope would be to market DLSS as having superior IQ.

they won't, but the smart thing for NV to do would be a preemptive strike and open DLSS to all

but AMD will do it open with the Sony and M$ consoles and game devs, and it will take over just like FreeSync did

they could scream superior IQ all day, but if people can't really see it, it won't matter
and if you really want superior IQ, you want DLSS off.
 

:lol:

you can't; it was close to two months after the 2080 Ti came out before the first ray tracing game, Battlefield V, arrived and you could do more than a short demo with a 1200 buck RT card

and then it was months between new RT games for NV to tweak them,
then with some games RT and/or DLSS was added 6 months after the game came out

and let's get real

Ray tracing games you can play right now:


Amid Evil
Battlefield V
Bright Memory
Call of Duty: Modern Warfare (2019)
Control
Crysis Remastered
Deliver Us The Moon
Fortnite
Ghostrunner
Justice
Mechwarrior V: Mercenaries
Metro Exodus
Minecraft
Moonlight Blade
Pumpkin Jack
Quake II RTX
Shadow of the Tomb Raider
Stay in the Light
Watch Dogs Legion
Wolfenstein: Youngblood

https://www.rockpapershotgun.com/2020/10/20/confirmed-ray-tracing-and-dlss-games-2020/

and some of them just plain suck.
 


And another matter is...
When only a $700-800 GPU + fancy CPU can play RT at more than 60 fps at 1080p or 1440p, and these guys usually have 4K stuff... what do you do about native resolution if you want to play RT smoothly? Change the monitor? Rendering below native resolution looks bad, from what I know.
 

You wait for the 3085. Then you wait for the 4080, then the 4085 and so on.
 