Nvidia RTX 2080 Ti graphics cards are dying on a lot of users

acroig

Just another Troll
"Concerns are mounting over the failure rate of Nvidia’s RTX 2080 Ti graphics card, with increasing numbers of reports of dead and dying cards from early adopters. Some display issues involving artifacting and instability immediately after being installed, whilst others begin to show signs of degradation after a few days, despite a lack of manual overclocking or voltage manipulation."

https://www.digitaltrends.com/computing/nvidia-rtx-2080-ti-graphics-cards-dying/
 
"Concerns are mounting over the failure rate of Nvidia’s RTX 2080 Ti graphics card, with increasing numbers of reports of dead and dying cards from early adopters. Some display issues involving artifacting and instability immediately after being installed, whilst others begin to show signs of degradation after a few days, despite a lack of manual overclocking or voltage manipulation."

https://www.digitaltrends.com/computing/nvidia-rtx-2080-ti-graphics-cards-dying/

I wonder if some of these people are causing the problem by trying to run the system with a low-quality or underpowered power supply. The fact that it has two 8-pin connectors indicates it will pull over 300 watts under full load.

Some of the issues described in that article are exactly the same problems Vega 64 owners had when using an inferior or underpowered power supply. Most issues like this disappeared once they replaced their power supplies with a quality, higher-wattage unit.
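
For reference, a minimal power-budget sketch using the PCIe spec ceilings (75 W from the slot, 150 W per 8-pin connector; these are connector limits, not measured draw):

```python
# Rough PCIe power-budget sketch (spec maximums, not measured draw).
SLOT_W = 75          # PCIe x16 slot limit per spec
EIGHT_PIN_W = 150    # 8-pin PCIe connector limit per spec

def max_board_power(num_8pin: int) -> int:
    """Spec-limited power available to a card with the given connectors."""
    return SLOT_W + num_8pin * EIGHT_PIN_W

print(max_board_power(2))  # 375 -> two 8-pins plus the slot allow up to 375 W
```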
 
Excellent point, have not seen any issues with mine yet.
 
No issues here either, and I'm still using a many-year-old Corsair 850W power supply from my 1080 SLI days.
 
I wonder if some of these people are causing the problem by trying to run the system with a low-quality or underpowered power supply. The fact that it has two 8-pin connectors indicates it will pull over 300 watts under full load.

Some of the issues described in that article are exactly the same problems Vega 64 owners had when using an inferior or underpowered power supply. Most issues like this disappeared once they replaced their power supplies with a quality, higher-wattage unit.

Some of the crashing, yes, has been fixed by better PSUs or by making sure the card uses two PCIe power cables from two different rails, and not one of those crap PCIe cables with two 8-pin plugs on one wire.

But not all of it, and not the artifacting or some of the blue screening. There have been lots of RMAs.

Sounds to me, since it is almost all reference PCBs, and with all the delays and low stock, like they had a bad first run of 2080 Ti reference PCBs. Most likely past it by now.
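
A rough sketch of why the daisy-chained cable matters, assuming a typical cable with three +12V conductors back to the PSU (conductor counts vary by PSU, so this is illustrative):

```python
# Why a single daisy-chained PCIe cable runs hotter than two separate cables.
# Assumes 3 +12V conductors per cable back to the PSU (typical, but varies).
VOLTS = 12.0
WIRES_12V = 3

def amps_per_wire(load_watts: float) -> float:
    """Current each +12V conductor carries for a given total load."""
    return load_watts / VOLTS / WIRES_12V

print(round(amps_per_wire(150), 1))  # ~4.2 A/wire: one 8-pin on its own cable
print(round(amps_per_wire(300), 1))  # ~8.3 A/wire: two 8-pins daisy-chained on one cable
```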
 
The usual suspects:

1.) They OC'd past what they should have. This is probably the majority of the failures. They pushed the card too far and killed it. I've seen this time and time again.

2.) They used a PSU that did not meet the amp requirements on the rail. They use a 550W PSU from the old system which barely met the amp requirements of their old card. It worked for a little while until the rail gave out and/or tripped, damaging the card. Apparently 38 to 42 amps is required for the 2080 Ti; see the sketch after this list. ( See this a lot )

3.) They modified the card, either by replacing the thermal paste or the cooler, and damaged it in the process. ( Jay2Cents, Linus come to mind )

4.) They put it in a case with poor ventilation. ( Rarely seen this one )

5.) Lemon from the factory ( I've had this happen to me ).
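
Since the amp figures in point 2 can look abstract, here's a minimal conversion sketch (plain watts = volts x amps arithmetic on the +12V rail; the 38-42 A figure is the one quoted above, not an official spec):

```python
# Converting the quoted +12V rail requirement to watts, and back.
VOLTS = 12.0

def amps_to_watts(amps: float) -> float:
    return amps * VOLTS

def watts_to_amps(watts: float) -> float:
    return watts / VOLTS

print(amps_to_watts(38), amps_to_watts(42))  # 456.0 504.0 -> 38-42 A is roughly 456-504 W on +12V
print(round(watts_to_amps(300), 1))          # 25.0 -> the card alone is ~25 A; the rest is CPU, fans, drives
```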
 
It does make me wonder again what is going to happen if and when they turn on the other half of the chip for ray tracing and DLSS and it starts drawing more power :hmm:
 
It will probably draw less power with DLSS; "upscaling" from 1440p will probably be less demanding on the card than real 4K.
 
Not necessarily, as you will be using the tensor cores as well as the streaming cores when using DLSS, whereas traditional AA just uses the streaming cores. We really don't know how efficient the tensor cores are when it comes to power draw. We may find out that they are power hungry.
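
The shader-side saving from rendering at 1440p is easy to quantify; whether the tensor cores eat it back in power is exactly the open question here. A minimal pixel-count sketch (assuming DLSS renders at 2560x1440 and targets 3840x2160):

```python
# Pixel-count comparison: 1440p render upscaled to 4K vs native 4K.
def pixels(w: int, h: int) -> int:
    return w * h

native_4k = pixels(3840, 2160)     # 8,294,400 pixels
render_1440p = pixels(2560, 1440)  # 3,686,400 pixels

print(round(render_1440p / native_4k, 2))  # 0.44 -> the shaders rasterize ~44% of the pixels
```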
 
The Titan V has been out for over a year with more tensor cores than the 2080 Ti, and was designed to run deep learning algorithms on both tensor and CUDA cores for hours/days/weeks at a time. Not a peep of an issue. I suspect adding a tensor/RT workload on the 2080 Ti will not be any different; it certainly wasn't an issue raised by reviewers, who did have access to DLSS/RTX demos to run.
 
No issues here either, and I'm still using a many-year-old Corsair 850W power supply from my 1080 SLI days.

Yes, but you have more brains than most and wouldn't try to run it on a much smaller power supply like many do.

We are talking about those that have 600 watt power supplies, do the math for their complete system, and believe that they can run that 600 watt power supply balls to the wall at its peak output without any problem because the math says they can. Then they blame everything but the power supply, or their own stupidity for thinking you can run a power supply at its max output constantly without it sooner or later giving you the middle finger. All it would take in this type of situation is a problem with the local power grid to cause system instability, as a maxed-out power supply would have NO room to compensate for the dirty power coming into the building.
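
A rough sizing sketch built on the common 80%-sustained-load rule of thumb (the 80% figure is a guideline, not a spec, and the 600 W example load is illustrative):

```python
# PSU headroom sketch: size the supply so sustained load stays at ~80% of rating.
# The 80% figure is a common rule of thumb, not a specification.
def recommended_psu_watts(system_draw_w: float, max_load_fraction: float = 0.8) -> float:
    return system_draw_w / max_load_fraction

# A ~375 W card plus ~225 W for CPU, board, drives, and fans:
print(recommended_psu_watts(600))  # 750.0 -> a 600 W system wants a ~750 W supply
```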
 
The Titan V has been out for over a year with more tensor cores than the 2080 Ti, and was designed to run deep learning algorithms on both tensor and CUDA cores for hours/days/weeks at a time. Not a peep of an issue. I suspect adding a tensor/RT workload on the 2080 Ti will not be any different; it certainly wasn't an issue raised by reviewers, who did have access to DLSS/RTX demos to run.

I think you may have misunderstood. If you are not running anything on the tensor cores, they are just sitting there doing nothing (idle) and will draw significantly less power than when they are actually being used. We are not saying it will be a problem; we are simply saying that the card will most likely draw more power when the whole chip is being used, rather than just half the chip, which is the current situation for 2080 Ti owners.

I do get what you are saying, but also keep in mind the Titan V isn't normally being used to game at 4K while running all those deep learning algorithms. It is possible that when doing both simultaneously, the power draw will be higher. But until we actually have games out for reviewers to fully test this on, we really won't know.
 
Yes, but you have more brains than most and wouldn't try to run it on a much smaller power supply like many do.

Thanks :lol:
We are talking about those that have 600 watt power supplies, do the math for their complete system, and believe that they can run that 600 watt power supply balls to the wall at its peak output without any problem because the math says they can. Then they blame everything but the power supply, or their own stupidity for thinking you can run a power supply at its max output constantly without it sooner or later giving you the middle finger. All it would take in this type of situation is a problem with the local power grid to cause system instability, as a maxed-out power supply would have NO room to compensate for the dirty power coming into the building.

Yeah, 600 watts isn't enough these days for any high-end graphics card. Or it might be teetering on the edge, and overclocking just a tiny bit might push it to the point of instability/artifacting.

I think you may have misunderstood. If you are not running anything on the tensor cores, they are just sitting there doing nothing (idle) and will draw significantly less power than when they are actually being used. We are not saying it will be a problem; we are simply saying that the card will most likely draw more power when the whole chip is being used, rather than just half the chip, which is the current situation for 2080 Ti owners.

I do get what you are saying, but also keep in mind the Titan V isn't normally being used to game at 4K while running all those deep learning algorithms. It is possible that when doing both simultaneously, the power draw will be higher. But until we actually have games out for reviewers to fully test this on, we really won't know.

Well, that will be the first thing I check when these demos make their appearance. RTX might be off for a bit unless someone makes a Vulkan RTX demo; the 1809 update from MS is suspended until they work out the deleted-files issue, and that version is required for ray tracing.
 

You're welcome!

Yeah, 600 watts isn't enough these days for any high-end graphics card. Or it might be teetering on the edge, and overclocking just a tiny bit might push it to the point of instability/artifacting.

You know that, I know that, but the internet is full of people who say the numbers say different.. then bitch when it doesn't work. :lol:

Saw tons of it when the Vega64 came out.

Well, that will be the first thing I check when these demos make their appearance. RTX might be off for a bit unless someone makes a Vulkan RTX demo; the 1809 update from MS is suspended until they work out the deleted-files issue, and that version is required for ray tracing.


Agreed, I will be watching just because it will be interesting to see the results, but I would suggest taking any results derived from demos with a grain of salt, because demos designed to show off a new technology usually give better performance/results than you will get in real-world gaming.

We have seen it time and time again in the past:

"XYX looks great" (runs out and buys XYZ, fires up the first full game released with XYZ) "WTF this XYZ piece of **** is buggy and looks like crap.. something must be broken because it doesn't look or perform like the demo" :lol:


Kind of like the hamburger you see in commercials.. looks great.. when you actually get the real thing.. not so great.. :lol: gotta love marketing. :D
 
Agreed, I will be watching just because it will be interesting to see the results, but I would suggest taking any results derived from demos with a grain of salt, because demos designed to show off a new technology usually give better performance/results than you will get in real-world gaming.

Some even on this forum found that out the hard way.
 
Don't forget the "gamer" trend of buying the cheapest, crapiest, $65 mini-ITX mobo to pair with your top tier CPU and GPU.
 
Hell, my 1080 Ti took out my 850W PSU, but it was old.

Hope my new RM1000i and HX1200i are all right.

Yeah, but it met the amp requirement for the card.

See, people shy away from buying a large PSU because they think it's going to pull 1000W. Just because the PSU can supply 1000W doesn't mean the system will pull 1000W.
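
A minimal sketch of that point: the rating is capacity, and what the system actually pulls from the wall is just the DC load divided by efficiency (the 90% efficiency figure here is illustrative):

```python
# A PSU delivers only what the load demands; the rating is capacity, not consumption.
# Wall draw = DC load / efficiency. The efficiency figure here is illustrative.
def wall_draw_w(dc_load_w: float, efficiency: float = 0.90) -> float:
    return dc_load_w / efficiency

# A 450 W system on a 1000 W unit pulls ~500 W from the wall, not 1000 W.
print(round(wall_draw_w(450)))  # 500
```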
 