Die shot that kind of reveals they're going with one GCD only. Looks like the 7950xtxxx refresh will just be more speed, faster RAM, and more Infinity Cache.
Hey guys, I just wanted to remind you all that we don't allow hotlinking on Rage3D. If you would like to post an image, please upload it to your preferred image hosting provider first.
Pax, I have edited your die shot and PowerColor design posts with new image links. Bill, I have edited your last post since it contained a quote from pax.
Posted this before watching it, lol. I appreciate GN and their general thoroughness and no-nonsense reporting. IMO, Steve's getting a little too hung up on the marketing BS, especially given his channel's target audience. At the end of the day all that really matters is performance, price and the user experience.
[edit] And the thumbnail was changed
Unlike last time, they avoided direct comparisons with Nvidia. This looks to be a slower card, but priced where its performance should sit relative to a 4080/4090.
True, but for $999 it's a steal even if it's 15% slower than a 4090.
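To put a rough number on "steal" (a back-of-the-envelope sketch using launch MSRPs and the hypothetical 15% gap above, not benchmark data):

```python
# Back-of-the-envelope value math for the "steal" claim above.
# The 15% deficit is the poster's hypothetical, not a benchmark.
msrp_4090, msrp_xtx = 1599, 999           # launch MSRPs in USD
rel_perf_4090, rel_perf_xtx = 1.00, 0.85  # 4090 normalized to 1.0

value_4090 = rel_perf_4090 / msrp_4090
value_xtx = rel_perf_xtx / msrp_xtx
print(f"7900 XTX perf-per-dollar advantage: {value_xtx / value_4090 - 1:.0%}")
# -> ~36% more performance per dollar, if the 15% gap holds
```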
I don't even think it will be as fast as that, because if it were, they'd have priced it accordingly. They even avoided direct comparisons with the 3080/3090/3090 Ti, probably because if they did, we'd know (indirectly) how it compares to the 4090, and so far the 4090 is a considerable generational leap over the 3090/3090 Ti (let alone the 3080) that probably wouldn't show the 7900 XTX in a good light.
People thought 4090 availability would be plentiful and getting one wouldn't be a problem. WRONG!
Same will happen with these cards. I'm referring to US purchasing, btw.
I didn't watch the presentation. Did they say anything about combating scalpers?
A 1.7x uplift in raster puts it directly in line with a 4090.
The 1.6x RT claim (which looked more like 1.4-1.6x) puts that portion more in line with the 4080 16GB.
The $999 pricing is just AMD knowing they don't have the mindshare or marketshare to f*** around and wanting to punch NV in the teeth, which I applaud.
Also, these are A LOT cheaper to make than AD102. But it sucks that we can't pre-order or set up an account on AMD Direct. Bots will buy them all in about a nanosecond.
We already know from 3rd-party testing that the 4090 is 2x or more faster than a 6950.
AMD's best-case scenario is from their own internal marketing slide. That is not in line with a 4090.
Their whole spiel from the beginning was about performance per watt, and coupled with the fact that the flagship draws only 355 watts, that pretty much indicates they're not aiming for the performance crown this time around.
If the 7900 XTX were in fact remotely competitive with the 4090, they would have flaunted it. There's a stark contrast between this presentation and their last launch, in which they openly admitted to trading blows with the 3080 and had a bunch of benchmarks to show it. Not this time around, and I'm pretty sure they have an internal testing lab with Nvidia's 4090 lying around, not collecting dust.
The 4090 is roughly 70% faster than a 3090 Ti in raster.
The 7900 XTX is roughly 70% faster (per the slides) than a 6950 XT in raster.
The 6950 XT and 3090 Ti were roughly equal at raster.
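Chaining those three rough ratios together shows where that reasoning lands (a sanity-check sketch; the ~70% figures are the slide claims above, not measurements):

```python
# Implied relative raster performance from the three rough claims above.
# Normalize the 3090 Ti to 1.0; every multiplier is a slide/thread claim,
# not measured data.
perf_3090ti = 1.00
perf_4090 = perf_3090ti * 1.70     # NV slides: ~70% over the 3090 Ti
perf_6950xt = perf_3090ti * 1.00   # roughly equal at raster
perf_7900xtx = perf_6950xt * 1.70  # AMD slides: ~70% over the 6950 XT

print(f"7900 XTX vs 4090: {perf_7900xtx / perf_4090:.2f}x")  # -> 1.00x
# If all three claims held, the two cards would tie in raster.
```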
The 7900 XTX is on the 5nm node with a 355W TDP/TGP. I would assume 5nm will scale well with increased power limit and voltage - better than the 4090, which scales pretty poorly above ~400W.
AIB cards will hopefully have higher power limits to allow higher clocks on the 7900 XTX.
If the RT can at least be above a 3090 Ti, it won't be bad for $600 less.
The other thing to consider is that FSR is inferior to DLSS, at least right now.
Sitting here with my 3080 Ti, it's frankly a difficult decision. Do I save the $600+ but lose DLSS and settle for a smaller RT performance gain?
If RT turns out to be around Ampere level, then AMD made a mistake... again. Hoping that won't be the case.
On average, the 4090 is 69.4% faster than the 6950 XT at 4K. That is not 2x. It all depends on the game: some might be 50% faster, others 90%. Your blanket statement is just wrong.
How much of the 4090's RT uplift is due to DLSS 3? Also, I think FSR3 has frame generation, but they weren't clear on that. It would make sense for it to deliver 2x the frames over FSR 2.2, though.
Price is compelling. Gonna wait for reviews, though.
FSR3 is DLSS3: fake frames, but much less outrage for some reason... I wonder why.
The 4090 vs 3090 Ti RT uplift is massive, just like the raster uplift, with no DLSS at all. This isn't something you have to guess at; there's plenty of information out there.
Yeah, googled it quickly, and raw RT seems about 70% faster than a 3090.
One thing, though: that 355W is far lower than expected. A lot of leakers said 420-450W. I think they left a lot of headroom for the AIBs to run much higher-clocked cards, unlike Nvidia's reference model. A 450W AIB model wouldn't seem that outrageous.
With FSR3 and a boosted AIB card, it could be a lot faster.
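For a sense of what a 450W AIB limit might actually buy: dynamic power scales roughly with clock times voltage squared, and voltage rises with clock, so power grows roughly with the cube of frequency. A crude illustrative model, not real RDNA3 data:

```python
# Crude dynamic-power rule of thumb: P ~ f * V^2, with V rising roughly
# in step with f, so P grows ~ f^3. Illustrative only; real
# voltage/frequency curves vary per chip and per node.
p_ref, p_aib = 355.0, 450.0  # reference TBP vs hypothetical AIB limit

clock_gain = (p_aib / p_ref) ** (1 / 3) - 1
print(f"~{clock_gain:.0%} clock headroom for +{p_aib - p_ref:.0f}W")
# -> roughly 8% more clocks for ~27% more power under this model
```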
You can pick and choose your average, but it's already been shown the 4090 can be CPU limited even at 4K. And when it's not, it absolutely trounces all the prior-gen cards.
By the way, you do know you're comparing independent reviews to AMD's best-case marketing slide, don't you? Or do you have a link to their 13-game average that I missed in the presentation?
NV's own slides said ~70% raster improvement over the 3090 Ti.
The 4090 is CPU limited at 4K in what games? On what CPU? I've yet to see that in anything besides games that have historically been CPU limited anyway.
Originally posted by curio
Eat this protein bar, for it is of my body. And drink this creatine shake, for it is my blood.
"If you can't handle me when I'm bulking, you don't deserve me when I'm cut." -- Marilyn Monbroe
And another one:
It's also important to note that the 55% average includes games that are still hitting CPU bottlenecks, like Flight Simulator. The RTX 4090 barely drops in performance going from 1080p ultra to 1440p ultra to 4K ultra — which is part of why Nvidia's DLSS 3 Frame Generation technology is so exciting, but we'll get to that in a bit.
In the eight individual test results, the RTX 4090 beats the 3090 Ti by anywhere from 11% (Flight Simulator) to 112% (Total War: Warhammer 3). Those are the two outliers, with the remaining six games falling in a tighter range of 46% (Far Cry 6, Red Dead Redemption 2) to 70% (Forza Horizon 5).
It's also worth noting what DLSS 2 in Quality mode does for performance in the four games that support it. Flight Simulator performance drops 4%, again due to the CPU limited nature. Horizon Zero Dawn only gains 10%, Watch Dogs Legion gets a 13% boost, and Red Dead Redemption performance improves by 14%. The RTX 3090 Ti saw up to a 35% increase in performance with DLSS 2 Quality mode, so again there are clear CPU bottlenecks coming into play, even at 4K.
I mean, it's quite clear the 4090 is an absolute monster that's held back even at 4K in many games, so you need to look at that average over a 6950 really closely. There's probably even more "free" 4K performance to be had from a 4090 once faster CPUs arrive.
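To make the skew concrete, here's a toy recalculation of that review's average with and without the CPU-limited title. Only the 11%, 112%, 46%, 46%, and 70% figures come from the quoted text; the three filler values are made up to sit inside the stated 46-70% range:

```python
from statistics import mean

# Per-game 4090-vs-3090 Ti uplifts (%). Only 11, 112, 46, 46, and 70 come
# from the quoted review; the three fillers are hypothetical values inside
# the stated 46-70% range.
uplifts = {"Flight Simulator": 11, "TW: Warhammer 3": 112,
           "Far Cry 6": 46, "Red Dead Redemption 2": 46,
           "Forza Horizon 5": 70,
           "filler A": 55, "filler B": 60, "filler C": 65}

with_outlier = mean(uplifts.values())
without = mean(v for g, v in uplifts.items() if g != "Flight Simulator")
print(f"average incl. CPU-limited title: {with_outlier:.0f}%")  # ~58%
print(f"average excluding it:            {without:.0f}%")       # ~65%
```

A single CPU-limited outlier drags the headline average down by several points, which is the point being argued about blanket averages.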
I hate the idea that interpolation is becoming the next thing... I'm really not interested in inserting junk fake frames for a BS boost to the frame rate. But I get why AMD is doing it: nVidia will market the crap out of it otherwise and hold it over everyone as proof they're better. So **** it.
I'm more interested in whether AMD is going to go the XeSS-style route with the resolution scaling portion - a non-AI-accelerated path plus an enhanced AI path, like XeSS - or whether they manage improvements that make the AI path not really worth it.
There's nothing wrong with it, IMO, as long as latency isn't affected for the worse.
I've turned this on on my old LG TV and enjoyed it even with the added latency it brought. Glad AMD is going this route too, so now we have a proper native method rather than the TV's, without the drawbacks of traditional frame interpolation.
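For reference, a simplified model of why frame interpolation costs latency: the interpolator has to buffer one real frame before it can generate the in-between one, so input lag grows by roughly one source frame time. This ignores render queues, display lag, and vendor-specific mitigations like Reflex:

```python
# Simplified latency model for frame interpolation: to display a frame
# generated between N and N+1, frame N+1 must already exist, so real
# frames reach the screen about one source frame time late.
def latency_ms(base_fps: float) -> tuple[float, float]:
    frame_time = 1000.0 / base_fps
    native = frame_time            # crude one-frame baseline
    interpolated = frame_time * 2  # + one buffered source frame
    return native, interpolated

for fps in (30, 60, 120):
    nat, interp = latency_ms(fps)
    print(f"{fps:>3} fps base: ~{nat:.1f} ms native vs ~{interp:.1f} ms "
          f"interpolated (shown at {fps * 2} fps)")
# The output rate doubles, but responsiveness tracks the higher figure.
```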
Yeah, the last couple of posts are why this frame-generation garbage is such an issue. Spider-Man's framerate might look high, but it plays terribly. It works well in MSFS 2020, and that's about it.
Looks good for the numbers, but those numbers don't really matter.
We'll know for sure in a month though.
Yeah, I'm skeptical. It definitely looks like a better card than the 4080, but really not in the same league as the 4090, and it's priced accordingly.
You said the 4090 is 2x. You are wrong. I gave you an average from Techspot. I guess I picked and chose it... oh wait, I didn't, Steve and Tim did.
I never said anything about the AMD presentation; I was fact-checking your statements. I couldn't care less how much people love a company's cards; it's just that false statements do very little to help the community.
That's funny, it's about 2x faster when not CPU limited, less than that when it is. You're the one who brought up averages, which include CPU-limited cases, not me.
You may wanna reread those links I posted.
Sorry, I apologize. I keep forgetting 4K is CPU limited per Tom's Hardware benchmarks. I'll reread them ASAP.
I mean, you can make light of it all you want, but it's pretty standard knowledge that a 4090 is going to be CPU limited in many cases starting from 1440p and bleeding into 4K. That will skew your "13 game average" results, and many sources already acknowledge this.
If AMD really thought they had a 4090 competitor, even just in rasterization, they'd have shown it (and priced it accordingly)... especially after all the digs they took during the presentation.
But it's pointless to talk about this further until the same independent reviewers that benchmarked the 4090 do the same for the 7900 XTX. So we'll see.
But the 7900 XTX is not CPU limited? How do we know?
The digs were funny. Yes, direct comparisons would have shown them losing in benchmarks. But does it matter if the card is priced at $999? Remember the 4870?
OK, we're on the same page finally. I'm going to bed, but good times, my friend.