Feb 11, 2020, 07:33 PM   #61
bill dennison
Radeon Arctic Islands

Quote:
Originally Posted by the_sextein
I would. I'm looking at buying 2x 3080 Ti, and if I had them both running at x16 on PCIe 3.0 I would be fine with that. PCIe 4.0 isn't going to offer any performance benefit, if you ask me. You can tell by how the 2080 Ti reacts to x8 and x16 PCIe speeds: it's not limited by more than 2 FPS at x8, so there isn't any bottleneck that I can see. SLI, maybe, but with dual x16 ports you won't see any limitations until PCIe 5.0 comes around. Maybe around the time the 4080 Ti comes out you would see some benefit with SLI, but by then 5.0 will be out.
We don't know yet.

I'm sure there will be a small difference between 3.0 and 4.0 with the 3080 Ti at 4K, and more so with SLI.

There was with 7970 CFX going from PCIe 2.0 to 3.0 at 1600p back in the day; NV never did get PCIe 3.0 working right on the X79.

But then I'm all set for a 3080 Ti at 4K; I have 4.0 now.

But I'm not paying NV $3,000+ for two 3080 Tis.

Feb 11, 2020, 07:48 PM   #62
the_sextein
Radeon Evergreen

You are right, I don't know for certain, but PCIe 4.0 runs at 16 GT/s per lane, roughly 31.5 GB/s across a x16 slot. If it drops down to x8 mode when you hook up a second card, that should cut the bandwidth in half to about 15.75 GB/s.

PCIe 3.0 runs at 8 GT/s per lane, about 15.75 GB/s at x16. Since the Z490 can deliver x16 speeds to both cards at once, you should see the exact same GPU throughput. Both platforms offer 40 PCIe lanes, so SLI should not be affected one way or the other.
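
For reference, here is a quick back-of-the-envelope sketch of the theoretical link bandwidths in play (both PCIe 3.0 and 4.0 use 128b/130b encoding; real-world throughput is lower, so treat these as upper bounds):

Code:
# Theoretical one-direction PCIe bandwidth, ignoring protocol overhead
# beyond the 128b/130b line encoding used by PCIe 3.0/4.0/5.0.
GT_PER_LANE = {"3.0": 8.0, "4.0": 16.0, "5.0": 32.0}  # gigatransfers/s per lane

def bandwidth_gb_s(gen, lanes):
    encoding = 128 / 130                        # 128b/130b: ~1.5% line-code overhead
    bits_per_s = GT_PER_LANE[gen] * 1e9 * lanes * encoding
    return bits_per_s / 8 / 1e9                 # bits -> bytes -> GB/s

for gen, lanes in [("3.0", 8), ("3.0", 16), ("4.0", 8), ("4.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {bandwidth_gb_s(gen, lanes):.1f} GB/s")
# PCIe 3.0 x8:  ~7.9 GB/s     PCIe 3.0 x16: ~15.8 GB/s
# PCIe 4.0 x8: ~15.8 GB/s     PCIe 4.0 x16: ~31.5 GB/s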

Also, since the 2080 Ti is not affected by PCIe 3.0 at x8 speeds and the 3080 Ti will not be twice as fast, it's a pretty safe bet that you will see no performance difference at all in either single- or dual-card mode.

The 4080 Ti will probably come out 2.5 years from now if Nvidia continues at its current pace, and by then PCIe 5.0 will be out.

Last edited by the_sextein : Feb 11, 2020 at 07:56 PM.

Feb 13, 2020, 07:55 AM   #63
the_sextein
Radeon Evergreen

It looks like I was mistaken. I have a friend in the valley who has never steered me wrong, and maybe it was me who misinterpreted what was said, but the dual x16 PCIe 3.0 and quad-channel RAM support is apparently not going to happen.

A few weeks ago Intel's PCIe 4.0 implementation was cancelled, and I was told that Intel was bringing its HEDT features to the mainstream in order to compete. It looks like it is bringing 40 PCIe lanes to the party along with higher single-core and multi-threaded performance, but this makes the platform launch much less exciting to me.

It will achieve 5 GHz all-core, which is the same single-core speed Intel's 9900K has had since 2018. Multi-threaded performance will increase 30% if all cores are engaged, but how many mainstream activities actually support or need more than 8 cores anyway?

Also, while I don't think a 3080 Ti or even a 4080 Ti will push PCIe 3.0 x16 to its limit, dual 3080 Tis forced down to PCIe 3.0 x8 mode will probably show some loss at the GPU connection. I figure Nvidia's HB (high-bandwidth) bridges will handle the cross-traffic without issue.

The problem for me is that now that we are in 2020 and the 3080 Ti is closer to launch, buying a system that won't be able to handle dual 4080 Tis would be a problem. At least one GPU upgrade after the initial purchase of the system should be possible.

Personally, if I were in the market to buy a new rig right now, I think I would wait until next year. AMD has a real chance at delivering better single- and multi-threaded performance, with PCIe 4.0 capability, at the start of the year.

If Intel is releasing 10nm+ Tiger Lake mobile in Q3 of this year, then there is a possibility they are preparing a 10nm++ Tiger Lake desktop CPU to spoil AMD's launch next year. If they were to release an 8-core, 125 W TDP monster on 10nm++ next year with PCIe 4.0 and 6 GHz clocks, that would also be worth waiting for. My stance is to wait, because both platforms offer underwhelming value at the moment. While Intel may be a little faster and AMD has PCIe 4.0, which could benefit some multi-GPU users, I think next year we will see solid releases from both companies that bring superior performance across the board with maxed-out features at launch. They are probably worth waiting for.

As of now, performance is still hovering around the late-2018 9900K level for mainstream applications. It's not worth buying in during 2020 with a new level of performance less than a year away.

Last edited by the_sextein : Feb 13, 2020 at 08:43 AM.

Feb 13, 2020, 09:58 AM   #64
OverclockN'
Radeon Arctic Islands

Where is there actual, verifiable proof that a 2080Ti is even remotely close to saturating the 8x bandwidth? I've looked before and couldn't find anything convincing.

If there isn't, why are we assuming a 3080Ti will be a problem?

Feb 13, 2020, 11:25 AM   #65
bill dennison
Radeon Arctic Islands

Quote:
Originally Posted by OverclockN'
Where is there actual, verifiable proof that a 2080Ti is even remotely close to saturating the 8x bandwidth? I've looked before and couldn't find anything convincing.

If there isn't, why are we assuming a 3080Ti will be a problem?
Never said it will be a problem.

But they said the same thing going from 2.0 to 3.0, and at high resolutions it did matter a few FPS, and more so with CFX/SLI.

At 4K with a single 3080 Ti it may only be 2 to 6 FPS depending on the game, but since I have 4.0 I'll take it.

The only problem is Intel's lack of innovation, just pumping more and more watts in.

Quote:
Intel needs AMD CPUs in order to test their future PCIe 4.0 SSDs
Read more: https://www.tweaktown.com/news/69702...sds/index.html

Last edited by bill dennison : Feb 13, 2020 at 11:39 AM.

Feb 13, 2020, 11:43 AM   #66
the_sextein
Radeon Evergreen

Even if the 2080 Ti saturates PCIe 3.0 at x8, it would take double that to saturate a x16 port, and the 3080 Ti will probably only be 30 to 40% faster. Even the 4080 Ti will probably be fine on PCIe 3.0, if you ask me. I might see a handful of frames lost or minor utilization deficiencies with dual 3080 Tis in x8 mode when I render, but we will see. If AMD and Intel really push the limit next year, I may upgrade after three years of ownership if I don't see 100% utilization on both cards during renders.
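
To put rough numbers on that headroom argument (a sketch only, treating the x8 link as fully saturated as a worst case and assuming a card's bus demand scales linearly with its rendering speed, which is a big simplification):

Code:
# Hypothetical headroom check with illustrative numbers: if a 2080 Ti just
# fits inside a PCIe 3.0 x8 link (~7.9 GB/s) and the next card is ~40%
# faster, does its estimated demand still fit inside a x16 link (~15.8 GB/s)?
pcie3_x8_gb_s = 7.9    # theoretical PCIe 3.0 x8 bandwidth
pcie3_x16_gb_s = 15.8  # theoretical PCIe 3.0 x16 bandwidth

assumed_2080ti_demand = pcie3_x8_gb_s                 # worst case: x8 fully saturated
assumed_3080ti_demand = assumed_2080ti_demand * 1.4   # "30 to 40% faster" card

print(f"Estimated 3080 Ti demand: {assumed_3080ti_demand:.1f} GB/s "
      f"vs. x16 budget: {pcie3_x16_gb_s:.1f} GB/s")
# ~11.1 GB/s vs. 15.8 GB/s -> still well inside a PCIe 3.0 x16 link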

I highly doubt it will be a problem, though, as I saw no PCIe issues with 3 GPUs and 10,000 CUDA cores at 100% utilization on my current system. Due to the way cards share data in multi-GPU, it's still very possible that there will be no loss even in x8 mode (each card produces half of a frame). I don't think I would trust PCIe 3.0 at x8 speeds for dual 4080 Tis, though, and that is why I would hold off on the new Intel platform if I were in the market: I'm a multi-GPU user and I want to upgrade my dual-graphics setup at least once after I build the system before moving on.

I'm more interested in the specs for Tiger Lake, to be honest; I'm just curious what kind of approach Intel is going to take. In my opinion, more cores are a waste of time for the mainstream until consoles move on five years from now. I would like to see a traditional high-clock CPU, or at the very least an 8-core CPU with massive efficiency. There is no point in buying a 32-core chip so 24 cores can sit around doing nothing but sucking power and creating heat. If we are going to move forward, it has to be through IPC and frequency, for the next five years anyway. AMD is going to bring an IPC and frequency boost with the 4000 series, and I hope Intel does the same. The last thing I want to see is a 16-core mainstream processor or some other gimmick to make up for a lack of real CPU power.

Last edited by the_sextein : Feb 13, 2020 at 12:57 PM.

Feb 15, 2020, 04:36 PM   #67
Kain
Radeon Caribbean Islands

Quote:
Originally Posted by OverclockN'
Mine is on auto.

I literally put the hardware together and went straight to gaming. I did try undervolting just a tiny bit once, but it blue screened as soon as I ran Prime95. It wasn't playing nice just from the one change, but I haven't tried anything else.
That's why your gaming temps are what he gets during stress testing. He is in the 50s C during gaming.

Feb 15, 2020, 06:23 PM   #68
the_sextein
Radeon Evergreen

BTW, I'm not saying dual 4080 Tis will be cut short on PCIe 3.0 at x8. I have no idea, but that is why I wouldn't buy into it at this point: I would need assurance that my costly GPUs would not be held back by the system before buying into it.

Feb 15, 2020, 07:12 PM   #69
bill dennison
Radeon Arctic Islands

Quote:
Originally Posted by the_sextein
BTW, I'm not saying dual 4080 Tis will be cut short on PCIe 3.0 at x8. I have no idea, but that is why I wouldn't buy into it at this point: I would need assurance that my costly GPUs would not be held back by the system before buying into it.
Depends on whose type of ray tracing wins.

And even though I have an RTX 2080 Ti, I bet NV loses this one too, as it is another proprietary POS that only works for them.
All their proprietary crap dies in the end, but they keep doing it.

Intel's version works well on Ryzen too, and with any GPU,
and it uses the CPU as well as the GPU, so it will most likely up the PCIe traffic a lot.

Quote:
World of Tanks will put your multi-core CPU to better use ray tracing
https://www.pcgamesn.com/world-of-ta...cing-encore-rt

Last edited by bill dennison : Feb 15, 2020 at 07:20 PM.

Feb 15, 2020, 07:40 PM   #70
the_sextein
Radeon Evergreen

Thanks for that, interesting to see.

If nobody else is making new effects and Nvidia is going to take the time and trouble to do it, then I can see why they make it just for their hardware. However, I agree with you that if a company like Intel later makes another version of what Nvidia is doing and it works on all hardware, it will win. Game developers will support effects that reach the largest pool of customers.

It's still too early to tell just what kind of effect this will have, though, and we don't really know if the 2080 Ti is actually limited at x8. I've seen a test that showed a 2% difference, but that could simply be the result of clamping the PCIe bus at x8 for all we know.

That being said, I don't like to take chances. My plan is to upgrade to dual 3080 Tis and ride it out until the 4080 Ti rolls around about 2.5 years from now; by then PCIe 5.0 will be supported by all major hardware. The thing with my setup is that I'm using a 60 Hz monitor, so my FPS never goes over 60 and the dual cards split it into 30 FPS apiece. So I don't see my system having a problem at all with the 3080 Ti, but it's too soon to say what the 4080 Ti will offer, what kind of effect RTX will have, or whether it will even work with SLI, etc. If it's between running a game at 45 FPS on a single card vs. running it at 60 FPS on a dual-GPU setup, I would just disable RTX. I'd disable RTX on a single-card configuration as well if it knocks the FPS down below 60. Later on it will lead to game-changing graphics, but right now it's not all that impressive and kills performance on the current-gen cards.
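
As a back-of-the-envelope illustration of the "30 FPS apiece" point, assuming alternate-frame rendering under a hard 60 FPS cap (a sketch of the arithmetic, not a measurement):

Code:
# With alternate-frame rendering, each card handles every other frame, so a
# 60 FPS cap leaves each GPU producing ~30 frames/s and roughly twice the
# per-frame time budget of a single card.
def per_gpu_budget(fps_cap, num_gpus):
    fps_per_gpu = fps_cap / num_gpus        # frames each card renders per second
    frame_budget_ms = 1000.0 / fps_per_gpu  # time each card has to render its frame
    return fps_per_gpu, frame_budget_ms

for gpus in (1, 2):
    fps, budget = per_gpu_budget(60, gpus)
    print(f"{gpus} GPU(s): {fps:.0f} FPS each, {budget:.1f} ms per frame")
# 1 GPU(s): 60 FPS each, 16.7 ms per frame
# 2 GPU(s): 30 FPS each, 33.3 ms per frame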

My biggest concern regarding dual GPU isn't SLI, since it's hardly supported. My real concern is when both GPUs run at 100% utilization for number crunching on renders, which pushes raw data across the PCIe lanes continuously.

Last edited by the_sextein : Feb 15, 2020 at 07:50 PM.

Feb 15, 2020, 08:20 PM   #71
the_sextein
Radeon Evergreen

Personally, since the GPU is much faster at physics and ray tracing, I'd rather they find a way to add support for a dedicated GPU that can do that; in my case, a third card that runs in the third PCIe slot. If I have a choice between CPU and GPU ray tracing, it will always be GPU, and right now Nvidia is the only one that can do it. It's up to AMD to make their own version that supports only their GPUs, or both theirs and Nvidia's if they want to crush Nvidia's implementation. The only reason they would have to do that, though, is if Nvidia's implementation is better than their own, and that would surely piss off Nvidia customers. If their implementation is superior, they would probably want to keep it to themselves, just like Nvidia. So it really is down to them to compete with their own technology, just like Nvidia does.

If both companies have their own implementation, will developers even bother with RTX if they aren't being paid by Nvidia and AMD to implement it? Once Intel gets into the GPU business it's going to get even worse, so maybe they will all have to work on it together in the future.

The thing about Intel's ray tracing is that I could probably use it with SLI if Nvidia refuses to support RTX on the GPU with multi-GPU enabled, so it will be a cool option to have. The most cutting-edge, power-demanding RT effects are going to need a GPU, though, and I would rather see a dedicated card for ray tracing than have Nvidia give up half of its rasterization progress to RTX.

Last edited by the_sextein : Feb 15, 2020 at 08:46 PM.

Feb 15, 2020, 09:00 PM   #72
bill dennison
Radeon Arctic Islands

Quote:
Originally Posted by the_sextein
Personally, since the GPU is much faster at physics and ray tracing, I'd rather they find a way to add support for a dedicated GPU that can do that. [...]
No one cares about Nvidia customers, not even Nvidia, or the prices would not be nuts.

And with the core wars going on right now, how big a hit RT takes on the GPU at 4K, how bad DLSS sucks at 4K, and all those unused cores going cheap, most gamers will have a 16c/32t CPU within the next year or so, or will have one of the new gaming consoles, and this will most likely work on the new consoles as well.

Feb 15, 2020, 10:40 PM   #73
the_sextein
Radeon Evergreen

I don't think consoles will be able to do anything significant, especially with the CPUs they have. Either way, I don't see the CPU being very useful for ray tracing compared to a GPU, or a need for more cores in gaming. I'll buy a 16-core CPU if it has monstrous 8-core processing power, but the extra cores won't do anything other than add heat and power usage that none of us need. If extra cores can be used for watered-down ray tracing, that is better than nothing, but it would be better off on a GPU in my opinion. Tasking CPU cores is a good way to introduce micro-stutter in games; a dedicated card is what I want. Once consoles move on to 16-core chips, we will see games that actually utilize them, and it won't be a gimmick to sell weak chips to sheep. That's five years away at this point. If ray tracing slows down a 2080 Ti, I would hate to see what it would do to a 9900K, let alone a PS5.

Last edited by the_sextein : Feb 15, 2020 at 10:52 PM.

Feb 16, 2020, 07:31 AM   #74
SirBaron
Hallowed are the Ori

Quote:
Originally Posted by the_sextein
I don't think consoles will be able to do anything significant, especially with the CPUs they have.
Next gen, while not PC-destroying, will have a custom variant of the Ryzen 3700. Plus they will have hardware-based RT, so the CPU won't even be doing it anyway.

Feb 16, 2020, 12:07 PM   #75
the_sextein
Radeon Evergreen

I haven't done a lot of research on the next-gen consoles, but with 8 cores in such a small enclosure I kind of figured it would be like a 3700X with lower core clocks to keep the heat down.

I doubt the GPU in the PS5 will be able to outstrip a 2080 Ti, and ray tracing is a problem even for that GPU. I figure consoles will utilize minor first-generation RT effects at 30 FPS using the GPU. It will probably be more of a buzzword than anything substantial graphically.

Feb 16, 2020, 02:03 PM   #76
Gandalfthewhite
Hardware Enthusiast

You do realize both Sony and MS have stated they are targeting 4K60 as the base target for the consoles, and both are looking at 10+ TFLOP GPUs, right? Heck, they are talking about letting devs drop to 1080p to target 120 FPS in games where it makes sense (first-person shooters). This new console generation is basically targeting the system performance of a ~3700 with an RT-enabled ~5700, some form of SSD (it sounds like a PCIe 4.0 M.2 drive, to be honest), and 16 GB of RAM for the system.

They are getting very close to offering an upper-mid-tier experience at a sub-$600 price point. And with a single system configuration to tune for, it will be much easier to do proper optimizations for RT. But devs will choose the type of RT they use; my guess is more global illumination and possibly reflections vs. some of the other things you can do with it.

Feb 16, 2020, 02:13 PM   #77
the_sextein
Radeon Evergreen

Yeah, I realize that, and I have made comparisons before showing that consoles are quickly catching up to PCs due to the lack of competition on the desktop. However, there is only so much they can pack into a small console, and only so much power they can push into a $600 box while taking a loss.

Based on past statements about upcoming consoles, I'm willing to go out on a limb here and say that those targets will apply to a very small percentage of games, like fighting games or indie games.

I seriously doubt you are going to see Red Dead Redemption 2 at 4K 60 FPS unless they give us options to turn the graphics settings down lower than a PlayStation 4's. If the 2080 Ti costs $1,200 and can't push 4K 60 FPS in new games without RTX enabled, there is simply no way it's going to happen on a $600 console with ray tracing enabled.

The 2080 Ti gets about 43 FPS at 4K in Red Dead Redemption 2. So games on next-gen consoles that look as good or better are going to run at 30 FPS unless the console has a GPU more powerful than a $1,200 PC graphics card. AMD's most powerful card costs more than a console and can't beat a 1080 Ti, so I just don't see it happening. If ray tracing is used, it will be minor and the frame rate will be 30 FPS at best.
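
For rough context on that 43 FPS figure, here is the simple scaling arithmetic, assuming frame rate scales linearly with GPU throughput (in practice it usually scales a bit worse than this):

Code:
# How much faster a GPU would need to be to go from ~43 FPS at 4K to a
# locked 60 FPS, under a naive linear-scaling assumption.
current_fps = 43.0  # quoted 2080 Ti figure at 4K
target_fps = 60.0
speedup_needed = target_fps / current_fps
print(f"Need roughly {(speedup_needed - 1) * 100:.0f}% more GPU throughput")  # ~40%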

Last edited by the_sextein : Feb 16, 2020 at 02:49 PM.