Official RTX 30x0 thread

In the first video of his that I watched, he claimed he had a source at Nvidia and that he was 95% confident the info was correct. In that video he stated that:
The 3090 had 12GB of VRAM at 18 Gbps speeds (actually has 24GB at 19.5 Gbps)
The 3090 was 7nm (actually 8nm)
The 3090 had 5,376 CUDA cores (actually has 5,248, or 10,496 with the doubled FP32 counting; see the note after this list)
The 3090 sample consumed 220 to 230 watts (actual TDP is 350W)
4x the ray tracing performance (actually 2x)
4x to 5x the performance of a Titan RTX in the game Minecraft (actually 2x to 3x)
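Quick note on that doubled FP32 figure, as I understand the Ampere whitepaper (so treat the exact per-SM split as my reading of it): each Ampere SM has 128 FP32 lanes, half on a dedicated FP32 path and half on a shared FP32/INT32 path, and the 3090 has 82 SMs, so 82 x 128 = 10,496 "CUDA cores", which is exactly 2 x 5,248. That's where the doubled marketing core count comes from.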

He was waaaaaaay off on all of this.

Here is a link to his video. He updated his information based on what other rumors were saying; he had no real info. His source was playing him.
https://www.youtube.com/watch?v=oCPufeQmFJk

Things change, and nobody expects his predictions to be perfect, but his source was totally off on everything.


Yes, this is what AIBs and other sources give people. I am just saying that engineering samples are different than the actual final chips. His ONE source was off; that happens A LOT. He also said the 3090 would have 21 Gbps memory chips, and he was wrong.
Do we know for sure it's 8nm? Right before launch the rumor was 7nm, because the press info being passed around listed 5,248 and 4,352 as the cards' CUDA core counts and Samsung 7nm (can't confirm).

As Steve from GN said, it can do FP+FP or FP+INT. Is that a fact? Well, a 30 TFLOPS FP32 3080 would be about 2x faster than a 2080 Ti, and that is not the case.
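As a rough sanity check on those numbers (assuming reference boost clocks and the usual cores x 2 ops x clock formula): a 2080 Ti is about 4,352 x 2 x 1.545 GHz ≈ 13.4 TFLOPS, and a 3080 is about 8,704 x 2 x 1.71 GHz ≈ 29.8 TFLOPS. That's ~2.2x on paper, but nobody expects games to scale anywhere near that, which is exactly why the paper TFLOPS aren't the whole story.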
 
No benchmarks have leaked; how is that possible?
I'll let you in on that secret. The AIB partners have all been prepping their cards for months now; they have had the products and engineering boards for a while. NVIDIA, however, has not released a driver that works with anything other than the test software they supply. So get this: I am writing this article on September 1st, hours before the presentation, and still the board partners have no idea what the performance is going to be like. We need to expand on that, as the board partners do not even know the thermal behavior of their own products. NVIDIA has provided them with test software that works with the driver. Basically, these are DOS-like applications that run stress tests. No output is given other than PASS or FAIL. We know the names of these test applications: the NVfulcrum test and the NVUberstress test. For thermals, there is another unnamed stress test, but here again the board partners can only see PASS or FAIL. Well, we assume they have tested with thermal probes. What is the point of this paragraph? Well, to show you the secrecy that NVIDIA applied to this Ampere project.

https://www.guru3d.com/articles-pages/geforce-rtx-3080-and-3090-what-we-know,1.html

Lots of smoke and mirrors.
Other than price, all we know is the smoke NV is blowing out their ass.

When are reviews out?
 
Damn!

The ROG Strix uses three 8-pin connectors, which can pull 400 watts. For OCing headroom, no doubt.
 
https://www.guru3d.com/articles-pages/geforce-rtx-3080-and-3090-what-we-know,1.html Lots of smoke and mirrors. Other than price, all we know is the smoke NV is blowing out their ass. When are reviews out?



That's not good. Remember the weird graphical issues with the 2080 Ti when it launched? The drivers weren't refined; they did fix it, but I hope we don't see that again!

https://forums.evga.com/Another-death-2080-TI-11GP42383kR-m2889936.aspx

Guess they are rushing it out fast, before Navi 21? Let's hope AMD has good drivers now.
 
Nvidia did go on record that Ampere is their largest generational leap. That has to mean something.

I would like to think so but I'm not so sure just yet. If you include RTX and DLSS performance then sure, of course it's going to be a larger jump than previous generations that didn't have those features or didn't have enough hardware to support those features properly. I'm just curious how much of a generational leap it is when it comes to games like Red Dead Redemption.
 
PEOPLE, really? I have to say this again: this is for OFFICIAL CONFIRMED INFORMATION about Ampere cards. Do not post rumor-mongering or things about unannounced products in this thread. I am all for memes and debate, but this is not the thread for that.

If you are posting links to legitimate reviewers doing initial analysis, that is one thing; the people who trade in nothing but rumors and leaks don't belong here. Nor is this the place to rationalize what you are or are not buying.

Next time I have to say this, I am doing a thread cleaning.
 


We might need to wait for the 17th to continue this thread then..... :bleh:
 
Assuming there are no reviews before the 17th. If not, that's a dick move, NV. They will sell out day one, probably for weeks, and people can either end up stupid happy or feel jaded by the independent reviews.

There's also the fact that none of the vendors seem to be doing pre-orders. That sucks!
 
I would like to think so but I'm not so sure just yet. If you include RTX and DLSS performance then sure, of course it's going to be a larger jump than previous generations that didn't have those features or didn't have enough hardware to support those features properly. I'm just curious how much of a generational leap it is when it comes to games like Red Dead Redemption.

YES! Exactly what I was thinking. All the games Nvidia "benched" have ray tracing + DLSS enabled. I want to know RDR2 4K max graphics frame rates. That will be a truer indicator.
 
Damn!

The ROG Strix uses three 8-pin connectors, which can pull 400 watts. For OCing headroom, no doubt.


I noticed this too. I'm wondering just how much OC headroom there is. The reference model with the 12-pin is using the equivalent of 2x 8-pin. Is it going to be severely power limited in comparison? Like, will the Strix actually be able to make use of that extra power for 24/7 gaming rigs (not suicide benchmarks under LN2)? There are 2080 Tis with 3x 8-pin as well, but the difference for real-world gaming clocks was negligible.
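Rough spec-level math, going by the PCIe connector ratings (which cards routinely exceed in practice): each 8-pin is rated for 150 W and the slot for 75 W, so three 8-pins give about 3 x 150 + 75 = 525 W of headroom on paper, versus roughly 2 x 150 + 75 = 375 W for a 12-pin fed from two 8-pins. Whether the card's BIOS power limit actually lets it use that difference is another question.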
 
Yeah, I'm eyeballing a Strix, but it comes with three 8-pin connectors... I'll need to have a deep and meaningful conversation with my Corsair 850i and convince her that she'll be OK. :cry:


I really don’t want to upgrade my PSU. Please tell me 850w will be OK. :cry:
 
I'm in the same boat. Honestly, 850W should be plenty, and the HXi is Platinum rated, so I think it'll be fine TBH.
 
Mine's Gold rated, as it's the RM. Moving from one 8-pin to three 8-pins is making me nervous.
 
I'm going for the flounders edition. I'd be very surprised if, as usual, there is any significant difference for gaming rigs with a 24/7 OC. I'm sure reference (under water too) will clock just fine with only a single poor man's 12-pin. Those three-plug jobbies are really for 3DMark chasers with exotic cooling.

Also, it's far more efficient for your PSU and system to run, say, an 850W at 80% capacity than a 1600W PSU at 40% capacity. You actually use less power and create less heat. IIRC you really want to load your PSU north of 60%, at least.
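For reference, going by the published 80 Plus thresholds (115 V): Gold requires roughly 87% / 90% / 87% efficiency at 20% / 50% / 100% load, and Platinum roughly 90% / 92% / 89%, so the curve is fairly flat and peaks around half load. As a worked example, a ~500 W gaming load on an 850 W Gold unit is ~59% of capacity, so at ~90% efficiency the wall draw is about 500 / 0.90 ≈ 556 W, with the other ~56 W ending up as heat in the PSU.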
 