Nvidia RTX 30XX Rumor thread

AMD couldn't care less about SLI or anything cutting edge regarding PC gaming. They are with Microsoft selling consoles. If you don't believe that then fine, but I disagree. They will continue to make CPUs and GPUs that are a year late and slower than the competition to keep PC progress to a minimum in the gaming sector while trailblazing in the console and server markets. It's been this way for decades and their intentions are obvious.
 

That would have made sense (and still does on the GPU side), but their CPU side is shitting all over Intel at the moment in everything except gaming, and even that is pretty close.

Intel is trying to do damage control by releasing new CPUs (new, lol), but those are just the same CPUs with more cores, running hotter.
 
That's a good call. I think the 3080 Ti will be close to a 1080 Ti SLI setup in performance, and it can handle 4K, so you will be good to go at that resolution. I can't believe how long Nvidia waits between card launches these days for the same 30% increase at double the price though. So silly. The 3080 Ti will deliver the same performance at the same price as a 1080 Ti SLI setup from 4 years prior. We have literally gone nowhere for half a decade.


Moore's law is slowing down, which is why Nvidia added immersion/fidelity hardware through the RT and tensor cores to add value and differentiation, imho.
 


AMD's new line of chips is powering the next-gen consoles and making them a pretty penny. They have a new line of server chips that is smoking Intel. That much we can agree on. In the mainstream sector they have done the same thing they always do. They released a couple of 8-core/16-thread chips that got smoked by year-old Intel chips that have been improving at a snail's pace for a decade, and so they released a server chip with more cores to make it look like they are trying to compete.

The truth is that they released a slower line of mainstream chips a year late, just like they always do. It's intentional, and their GPUs do the same thing. A year late, they will release a card that is a little slower than Nvidia's year-old card at the same price. Both CPUs and GPUs follow the same pattern because they have no interest in pushing PC gaming forward, since that would cause the PC platform to pull away from their consoles. They have worked with Microsoft to make the drivers as close to the metal as possible, which just happened to kill SLI (one of the PC's main advantages over consoles). Next year their consoles will be very close to a PC in terms of graphics and resolution. AMD is not the incompetent baby it wants us to believe it is. It's using a toxic method of beating its competition by not competing with them. Same thing Apple does in the phone sector.
 

I agree, technology is slowing down at the moment. I also don't mind RTX effects. I really enjoyed Nvidia's PCSS shadows, its HairWorks effects, and HBAO shading. Like most new effects, I knew that RTX would not be playable yet on the RTX 2080 line of cards, and that is fine. When Nvidia released Shader Model 3.0 for the first time, I knew that its HDR lighting was not going to be used very much with the cards of that time. It eventually became a major staple of graphics. I see the benefits of real-time ray tracing in all of the render work that I do, and I know that what is being done right now is the tip of the iceberg.

My problem is that SLI has been killed off, which has set me back half a decade in terms of performance given the slower improvements to the technology, and despite the lack of performance improvements Nvidia just keeps increasing its prices, which is completely unjustified. This happens because of a lack of competition. If Intel doesn't do something to compete with Nvidia, the GPU sector of PC gaming is going to go bust. I keep hoping Nvidia will jump into the CPU market to give Intel the push it needs to improve, but so far they are too distracted with AI and the car industry, which is understandable.
 
I don't know if I agree with that. Most RTX games I've played at my res have been 90 FPS+ with RTX enabled.

The issue is the difference in performance. If you're getting 90 FPS with RTX enabled and 150+ FPS with it disabled, then the RT hardware is too slow. Also, I don't consider 90 FPS to be that great for a $1200 card. That's more like $500 card territory, if not $300.

The other thing is the lower end RTX cards perform even worse. So, even if the 2080 Ti is borderline acceptable in RTX that doesn't say much for the rest of the line-up.
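For what it's worth, the FPS figures quoted above (estimates from this thread, not measured benchmarks) can be turned into a per-frame cost, which is a clearer way to see what enabling RTX actually buys:

```python
# Rough frame-time math using the FPS figures quoted above (assumptions
# from the thread, not benchmarks): 90 FPS with RTX on, 150 FPS with it off.
fps_rtx_on, fps_rtx_off = 90.0, 150.0

frame_time_on = 1000.0 / fps_rtx_on    # ms per frame, RTX on (~11.1 ms)
frame_time_off = 1000.0 / fps_rtx_off  # ms per frame, RTX off (~6.7 ms)

added_ms = frame_time_on - frame_time_off            # extra GPU time per frame
pct_fps_lost = (1.0 - fps_rtx_on / fps_rtx_off) * 100.0

print(f"RTX costs {added_ms:.1f} ms per frame, a {pct_fps_lost:.0f}% FPS drop")
# → RTX costs 4.4 ms per frame, a 40% FPS drop
```

By that measure the ray-traced effects are consuming roughly 40% of the frame budget on the top card, which is the complaint in a nutshell.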
 
No high-end AMD card this year, it looks like, judging by AMD's CES showing.

Glad I got 1500 bucks in cash from tips in the drawer for a 3080 Ti :bleh:
 
I wonder if the 3080 Ti will come out at the same time as the 3080 like last time, or if they plan to wait a year before they release the Ti like they used to do. I am also bent over waiting for insertion because I decided to disassemble my custom liquid cooling loop in order to simplify my setup. I didn't feel like taking apart my cards, so I'm using the built-in GPU on the 9900K at the moment :lol: No gaming for me for the next few months!

I enjoyed the benefits of my cooling loop, but I just didn't feel like keeping up with it. My radiators were starting to show signs of corrosion, and my pump was only going to last so long. I had used it for over 3 years, but I felt like doing away with it. I decided to rebuild my rig with an AIO for the CPU. I'll probably go back to a dual GPU setup, and depending on how hot it gets I might look into some AIOs for those at a later time.

It looks like Nvidia's Founders Edition cards went with a dual-fan cooler rather than the old-school blower for the RTX line of cards, so I have rebuilt my new system with air movement in mind, hoping I won't have to switch to water, though most likely I will with the long renders that I do. Most of the aftermarket air coolers were 2.5 slots thick, so you couldn't SLI them even if you wanted to.

Whatever Nvidia eventually comes out with I'll probably end up buying it and it will probably a$$ rape me financially.
 
Here is an optimistic bit of speculation regarding the 3080.
https://www.pcgamer.com/looking-forward-to-nvidias-rtx-3080-and-the-next-generation-7nm-gpus/

The 2080 Ti is about 30% faster than a 1080 Ti, so if the 3080 were to be 40-50% faster than a 2080 Ti it would be a worthwhile boost in performance for all of us 1080 Ti users. It would probably best my 1080 Ti SLI setup by itself, which would deliver a solid 4K 60 FPS on all modern titles. A 50% bump in ray tracing performance would make RTX effects playable at 4K on some titles.
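Chaining those percentages together gives a quick sanity check of the SLI comparison. The 30% and 40-50% figures are this thread's rumors, not confirmed numbers, and the SLI scaling factor below is an assumption:

```python
# Hypothetical generational gains quoted in this thread, not benchmarks.
gain_2080ti_over_1080ti = 1.30   # 2080 Ti ~30% faster than 1080 Ti
gain_3080_over_2080ti = 1.45     # midpoint of the rumored 40-50% jump

# Relative performance with a single 1080 Ti as the 1.0x baseline.
perf_3080 = gain_2080ti_over_1080ti * gain_3080_over_2080ti  # ~1.89x

# SLI rarely scales to a full 2.0x; assume ~1.8x for a well-scaling title.
perf_1080ti_sli = 1.8

print(f"3080 vs single 1080 Ti: roughly {perf_3080:.2f}x")
print("Would edge out 1080 Ti SLI:", perf_3080 > perf_1080ti_sli)
```

So under those assumptions a rumored 3080 lands at roughly 1.9x a single 1080 Ti, which is indeed about where a well-scaling 1080 Ti SLI pair sits.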

If the crypto boom fizzles out, and with the PlayStation 5 looming later in the year, you never know, maybe the prices will come down. :hmm:
 
I will get a 3080 Ti if it has HDMI 2.1, even if it offers only a marginal performance improvement over an RTX 2080 Ti.
 
You would think that it would, considering 8K TVs have hit the scene and some 4K 120Hz panels as well. It's still a few months out from its launch, so I would hope they include HDMI 2.1.
 
Feels like I just bought my TV yesterday, but I think it's been like 5-6 years now.

Will get a 4K 120Hz LG or Samsung display once it has the appropriate HDMI specs for native 4K 120Hz and is priced around 1k (without sacrificing color quality or HDR quality). Might have to wait a few more years for that.
 

I'm in the same boat. I've had my TV for about 4 years. G-Sync and 120Hz 4K panels are going to become the norm over the next few years. I'll probably buy a new screen once the dust has settled. I'm in no rush for 8K, but 120Hz would be nice, especially if GPUs can push beyond 60 FPS on a regular basis. The next generation of cards after the 3080 series will probably be when I do it, which could end up being more than 2 years from now.
 

I don't need 4K 120Hz, but a bigger FreeSync Premium Pro and G-Sync window I do want.

If the FreeSync Premium Pro and G-Sync window on these new screens is, say, 40 to 90 FPS, I'll buy a 55-inch one this year.
That will put almost all games in that window at 4K with a 2080 Ti or a 3080 Ti.

But it must work with both FreeSync Premium Pro and G-Sync.
 
If my FPS is slipping around between 35 and 90 it feels terrible even with G-Sync, because the game is actually being rendered at different speeds and you can feel it. For me, getting a perfectly stable frame rate is more important than a high refresh rate. I would rather play at a solid 30 FPS than play at 30-45 FPS.

Right now we can't get a stable 60 FPS at 4K, so it's not high on my list. I think the 3080 Ti will be able to hit that mark, but stabilizing 80 FPS minimums is still a ways away. When it's possible I will be happy to take advantage of it with a higher refresh rate monitor and a frame limiter. Having a higher window, as you described it, is spot on and I see what you mean.

If SLI worked it would be more realistic later this year. Games like Stalker that are old and support SLI are hitting CPU limits down around 60 FPS due to a lack of multithreading, and new games are GPU limited with little or no SLI support. Games keep getting more intensive, and the demands usually boil down to playing at lower resolutions at higher refresh rates or playing at modern resolutions around 60 FPS. The jump between 4K and 8K is large enough that it might be possible to hit 100 FPS at 4K before 8K is playable, so that will probably be the time I jump on it. It's never been high on my list, but if the GPU and CPU power are there then I will make sure my monitor can take advantage of it at that time.

With the few games that did support SLI this year I still had some GPU limitation at times, but I also observed CPU limitations in the 70 FPS range at 4K once the GPU limitations were removed in some games. I don't think the performance will be there for me and what I want in terms of FPS stability until the entire PC system from top to bottom has moved on to the next level. I know people who feel a huge difference right now with frame rates that fly from 35 to 144, and they like it, so it's probably just a personal preference on how much you get from it and when. If your monitor updates 144 times a second but your GPU only delivers 35 to 60 frames a second, then it's going to feel much the same as a 60Hz panel. Big fluctuations between 60 and 120 FPS bother me, and I don't want to feel them. A solid 80 or 90 FPS on a high refresh rate monitor would be super nice though.
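The solid-30-over-fluctuating-30-45 preference comes down to frame-time consistency rather than average FPS. A rough sketch with invented frame-time samples (not captures) shows how a run can average *more* FPS yet have far worse frame-time jitter, which is what you actually feel:

```python
import statistics

# Invented per-frame render times in milliseconds: a locked 30 FPS run
# versus one swinging between roughly 30 and 45 FPS.
locked = [33.3] * 8
swinging = [33.3, 22.2] * 4   # alternating ~30 FPS and ~45 FPS frames

def avg_fps(frame_times_ms):
    """Average FPS implied by a list of per-frame render times."""
    return 1000.0 / statistics.mean(frame_times_ms)

for name, run in (("locked 30 FPS", locked), ("30-45 FPS swing", swinging)):
    jitter = statistics.stdev(run)  # frame-time spread in ms
    print(f"{name}: {avg_fps(run):.0f} FPS average, {jitter:.1f} ms jitter")
```

The swinging run averages about 36 FPS to the locked run's 30, yet its frame-time spread is several milliseconds instead of zero, which lines up with the "feels terrible even with G-Sync" description above.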
 