
Official AMD Radeon RX 7000 Series (RDNA 3) Thread


    #81
    Originally posted by bill dennison View Post
    i think MCM chiplets will require HBM.
    How come?



      #82
      Originally posted by acroig View Post
      How come?
      because no infinity cache will be big enough for MCM chiplets

      i think they will need memory as close as possible, and that means on-die HBM. that is where i think they have been going with HBM and infinity cache from the start: stepping stones to MCM

      i think with 2 or 4 or more chiplets all working on a frame at the same time, off-die memory will be too slow and add too much latency



        #83
        I truly hope they make infinity cache something that works then, because right now it's useless.

        Bill you're obsessed with MCM
        Originally posted by curio
        Eat this protein bar, for it is of my body. And drink this creatine shake, for it is my blood.
        "If you can't handle me when I'm bulking, you don't deserve me when I'm cut." -- Marilyn Monbroe



          #84
          Originally posted by Nunz View Post
          I truly hope they make infinity cache something that works then, because right now it's useless.

          Bill you're obsessed with MCM
          yep

          we all should be

          we are at 5nm and GPU chips are getting bigger every gen
          how much longer before they run out of nodes that work?

          or before GPUs get so big they cost 2000+ bucks



            #85
            I don't think MCM will make costs go down. If there's one thing that's been a constant in the world, it's that things don't get cheaper. AMD/NV will just take the extra profit margin. Look at the costs of the 5000 series; AMD saw that Intel doesn't have a 12-core or 16-core direct comparison chip, so they jacked the prices up by a nice chunk. The only thing that keeps prices reasonable is competition, but if both companies jack it up, then we're all screwed.

            I don't think it'll get cheaper. We've seen prices on the low and mid-range chips go up every year, to the point where people think the 3070 and 6800 are good values.


              #86
              Originally posted by Nunz View Post
              I don't think MCM will make costs go down. If there's one thing that's been a constant in the world, it's that things don't get cheaper. AMD/NV will just take the extra profit margin. Look at the costs of the 5000 series; AMD saw that Intel doesn't have a 12-core or 16-core direct comparison chip, so they jacked the prices up by a nice chunk. The only thing that keeps prices reasonable is competition, but if both companies jack it up, then we're all screwed.

              I don't think it'll get cheaper. We've seen prices on the low and mid-range chips go up every year, to the point where people think the 3070 and 6800 are good values.
              50 bucks is "jacked the prices up by a nice chunk"?
              the 16-core went from $749 for a 3950X to $799 for a 5950X


              before Ryzen, Intel was charging 1000+ for 4, 6 and 8 core CPUs
              and if Intel had Ryzen instead of AMD, that 5950X would be 1500 to 2000+


              I think it will get a little cheaper, but more likely it will just slow the rise in prices
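              For what it's worth, that bump works out to well under 10%; a quick sketch, using the MSRPs quoted above:

```python
# Launch MSRPs as quoted above: Ryzen 9 3950X -> 5950X.
old_price, new_price = 749, 799
bump_pct = (new_price - old_price) / old_price * 100
print(f"${new_price - old_price} bump = {bump_pct:.1f}% increase")  # $50 bump = 6.7% increase
```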



                #87
                AMD made a billion dollars more in Q4, and multiple billions in 2020
                All of that can translate to eye-popping dollar signs: the chipmaker just announced its Q4 and full-year 2020 earnings, and it’s adding billions of dollars wherever you look:

                $3.2 billion in revenue this quarter, up 53 percent from $2.1 billion last quarter
                $1.78 billion in profit this quarter, up 948 percent from $170 million last quarter*
                $9.76 billion in revenue this past year, up 45 percent from $6.7 billion in 2019
                $2.49 billion in profit this past year, up 630 percent from $341 million in 2019*
                *Those profits include “a fourth quarter income tax benefit of $1.30 billion associated with a valuation allowance release,” according to the company.
                https://www.theverge.com/2021/1/26/2...-earnings-2021

                yea i think we will see RDNA 3 in 10 to 14 months
                they got the cash to do it now and they need to get the jump on NV
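                The growth figures quoted above roughly check out; a quick sanity check (the article's percentages differ by a point in places because it works from exact, unrounded numbers):

```python
def pct_change(new, old):
    """Whole-percent change from old to new."""
    return round((new - old) / old * 100)

# Figures in $B as quoted above: (label, new value, prior quarter/year value).
figures = [
    ("Q4 revenue", 3.2, 2.1),
    ("Q4 profit", 1.78, 0.17),
    ("2020 revenue", 9.76, 6.7),
    ("2020 profit", 2.49, 0.341),
]
for label, new, old in figures:
    print(f"{label}: up {pct_change(new, old)}%")
```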



                  #88
                  Originally posted by bill dennison View Post
                  https://www.theverge.com/2021/1/26/2...-earnings-2021

                  yea i think we will see RDNA 3 in 10 to 14 months
                  they got the cash to do it now and they need to get the jump on NV
                  Bill, this is a comment from the article you posted.

                  And while AMD CEO Lisa Su did claim that the new Radeon 6000 cards had “launch quarter shipments three times larger than any prior AMD gaming GPU priced above $549,” that’s not much of a claim. The company’s Vega cards were also insanely hard to find, and to our knowledge AMD never previously sold a single gaming GPU above $550. The Radeon HD 7970 launched at $549, excluding it from AMD’s claim, and though the 6990 and 7990 cost $700 and $1,000 respectively, they were niche cards that had two GPUs each.

                  And before anyone shouts Radeon VII, that was a failed Instinct card that only got launched because of the stupid prices of the RTX 20 series. All the profit has come from console sales, EPYC, Ryzen and mobile chips.

                  It's also interesting to note that the ASP on desktop processors dropped due to the mix of sales, which translates to people like me who ended up buying a 3800X Zen 2 because I couldn't buy a 5800X. Also don't forget the fiasco surrounding Zen 3 not working on X470 M/Bs, which needed a new BIOS that is only now being released.

                  If AMD want to keep making money, all they need to do is release the 6700XT, 6700 and any mid-range card below, because that is where the big sales and money are. They don't need an enthusiast 4K card. Lisa Su is a very intelligent business person and not a 4K enthusiast like you, I'm afraid.
                  Ryzen 7 3800X, ASUS Prime X470 Pro, KFA2 RTX 3090 SG, 16GB Crucial DDR4 LPX 3000 Ram, iiyama G-Master GB3466WQSU 3440x1440 freesync 144hz, 250gb Samsung SSD, 750mb Seagate SSHD, 2TB Seagate Barracuda H/D, 1 TB Samsung H/D, 850w PSU, Windows 10



                    #89
                    fact of life: the top high end cards sell the midrange cards

                    if AMD can't come out with a winning high end card to beat NV, their midrange won't sell well


                    flat out they need to beat the 3090 and smoke the 3080 soon, or whatever NV's top cards are at the time,
                    to get market buzz and sell the midrange money makers


                    take last time, as AMD doesn't have 6000 midrange out yet

                    most people didn't say to themselves "hey, NV has the best fastest card in the 2080 ti but i can't afford that 1200 bucks monster, so i'll buy a midrange RX 5700 that is slower than NV's last top card, the 1080 ti"
                    no, they went for the 2060/70/80 that they could afford, and that's why AMD lost a crap ton of market share

                    the masses may not buy the winning top card, but they tend to buy the winning brand name of that top card.

                    and AMD needs to get there soon



                      #90
                      Originally posted by Nunz View Post
                      Overclocking the memory won't fix an issue with the bus width not being wide enough to handle more throughput.
                      If the bus width is 128-bit and the memory clock is 10GHz, it's still the same memory bandwidth as a 256-bit bus with 5GHz memory, so I'm not sure what you mean by handling throughput better. If RDNA2 were bandwidth constrained, it would show a near-linear performance increase at 4k from overclocking the memory. But that is not what is happening. A 7% boost in memory clocks is fetching one or two frames per second.


                      The better way to look at it is that the 6800 is scaling better than the 3000 series at lower resolutions because of the "garbage" infinity cache. Not because it is memory bandwidth constrained at 4k, but because at resolutions below 4k, much less data has to leave the package. If the 3080 had a 128MB cache, it would probably show a similar delta between 1440p and 4k.

                      A deeper dive and investigation is needed. But there is no evidence to suggest RDNA2 is bandwidth bottlenecked.
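                      The arithmetic behind the bus-width point is straightforward: peak bandwidth is pin count times per-pin data rate. A sketch (the 6800 XT line assumes its published 256-bit / 16 Gbps GDDR6 configuration):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: pins x per-pin data rate / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# The post's example: a narrow fast bus equals a wide slow one exactly.
assert peak_bandwidth_gbs(128, 10) == peak_bandwidth_gbs(256, 5) == 160.0

# For scale, 256-bit GDDR6 at 16 Gbps (the 6800 XT's setup) gives 512 GB/s,
# before any effective-bandwidth contribution from Infinity Cache hits.
print(peak_bandwidth_gbs(256, 16))  # 512.0
```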
                      i10400
                      WX3200 Radeon Pro
                      Sound BlasterX G6 + Sony MDR 7506
                      LG 43UD79-B



                        #91
                        Originally posted by Nunz View Post
                        HBM is just another money sink that has nose-dived AMD before. They don't need it. GDDR6 puts out very good bandwidth, especially if they use Samsung memory chips instead of Hynix/Micron. If they had used a 384-bit bus, or maybe a 352-bit bus, we wouldn't be talking about HBM. Look at GDDR6X and the bandwidth it's putting out; I don't care for HBM at all.

                        HBM seemed genius when GDDR4 AND GDDR5 were slow as piss, but GDDR6 and 6X dramatically increased memory throughput and I think it shows that the memory bandwidth is not the limiting factor right now. Combine the raster performance of the 6900XT with a good sized bus instead of the gimped garbage AMD put on the card and it likely changes much of what we see in 4K performance.
                        HBM2 is cheaper at this point than GDDR6X. Rumors are starting to make the rounds that NV cards are getting another price bump because of GDDR6X production woes. At this point the "HBM2 is too expensive" rhetoric is out of date, especially with HBM2E being a thing. The real question is: can production of HBM keep up with demand?

                        I don't disagree about the bus width being the downside of the current 6x00 cards.
                        Main rig: look at system spec tab
                        Storage Server: Dual AMD Opteron 6120 CPUs, 64Gigs ECC Ram 50TB usable space across 3 zfs2 pools


                        HOURGLASS = most appropriate named ICON/CURSOR in the Windows world :-)

                        In a dank corner of ATI central, the carpet covered with corn flakes, the faint sound of clicking can be heard........Click......click, click............as the fate of the graphics world and the future of the human race hangs in the balance.

                        I know....I know........Keep my day job :-)- catcather



                          #92
                          I don't typically chime in on "rumor" threads like this because my friend works on the CPU side... but:

                          There is a reason RDNA2's IMC supports HBM2. It's the same IMC used in their CDNA arch. Don't count HBM2 out just yet.
                          Be a pirate.



                            #93
                            Originally posted by bill dennison View Post
                            fact of life: the top high end cards sell the midrange cards

                            if AMD can't come out with a winning high end card to beat NV, their midrange won't sell well


                            flat out they need to beat the 3090 and smoke the 3080 soon, or whatever NV's top cards are at the time,
                            to get market buzz and sell the midrange money makers


                            take last time, as AMD doesn't have 6000 midrange out yet

                            most people didn't say to themselves "hey, NV has the best fastest card in the 2080 ti but i can't afford that 1200 bucks monster, so i'll buy a midrange RX 5700 that is slower than NV's last top card, the 1080 ti"
                            no, they went for the 2060/70/80 that they could afford, and that's why AMD lost a crap ton of market share

                            the masses may not buy the winning top card, but they tend to buy the winning brand name of that top card.

                            and AMD needs to get there soon
                            Another way of putting this is the masses barely know what they are doing and just buy whatever company name they recognize.



                              #94
                              AMD should highlight its main brand name over that of Radeon at this point. People know AMD for CPUs; they won't hesitate as much over a GPU if they see that main brand on the box.

                              But honestly, 2 main GPU makers doesn't cut it. If we want out of the perennial crazy banana-pants prices, we'll need Intel, and some Chinese maker I've read about, in there at some point.
                              I talked to the tree. Thats why they put me away!..." Peter Sellers, The Goon Show
                              Only superficial people cant be superficial... Oscar Wilde

                              Piledriver Rig 2016: Gigabyte G1 gaming 990fx. FX 8350 cpu. XFX RX 480 GTR Cats 22.7.1, SoundBlaster ZXR, 2 x 8 gig ddr3 1866 Kingston. 1 x 2tb Firecuda seagate with 8 gig mlc SSHD. Sharp 60" 4k 60 hz tv. Win 10 home.

                              Ryzen Rig 2017: Gigabyte X370 K7 F50d bios. Ryzen 5800X3D :). 2 x 8 ddr4 3600 (@3200) Cas 16 Gskill. Sapphire Vega 64 Reference Cooler Cats 22.4.1. 1700 mhz @1.1v. Soundblaster X Ae5, 32" Dell S3220DGF 1440p Freesync Premium Pro monitor, Kingston A2000 1TB NVME. 4 TB HGST NAS HD. Win 11 pro.

                              Ignore List: Keystone, Andino... -My Baron, he wishes to inform you that vendetta, as he puts it in the ancient tongue, the art of kanlee is still alive... He does not wish to meet or speak with you...-
                              "Either half my colleagues are enormously stupid, or else the science of darwinism is fully compatible with conventional religious beliefs and equally compatible with atheism." -Stephen Jay Gould, Rock of Ages.
                              "The Intelligibility of the Universe itself needs explanation. It is not the gaps of understanding of the world that points to God but rather the very comprehensibility of scientific and other forms of understanding that requires an explanation." -Richard Swinburne

                              www.realitysandwich.com

                              www.plasma-universe.com/pseudoskepticism/



                                #95
                                Originally posted by pax View Post
                                AMD should highlight its main brand name over that of Radeon at this point. People know AMD for CPUs; they won't hesitate as much over a GPU if they see that main brand on the box.

                                But honestly, 2 main GPU makers doesn't cut it. If we want out of the perennial crazy banana-pants prices, we'll need Intel, and some Chinese maker I've read about, in there at some point.
                                we don't need anything from China.

                                the prices are more about the fabs at this point. i think we need lots more fabs, and most in North America



                                  #96
                                  Originally posted by bill dennison View Post
                                  fact of life: the top high end cards sell the midrange cards

                                  if AMD can't come out with a winning high end card to beat NV, their midrange won't sell well

                                  flat out they need to beat the 3090 and smoke the 3080 soon, or whatever NV's top cards are at the time,
                                  to get market buzz and sell the midrange money makers

                                  take last time, as AMD doesn't have 6000 midrange out yet

                                  most people didn't say to themselves "hey, NV has the best fastest card in the 2080 ti but i can't afford that 1200 bucks monster, so i'll buy a midrange RX 5700 that is slower than NV's last top card, the 1080 ti"
                                  no, they went for the 2060/70/80 that they could afford, and that's why AMD lost a crap ton of market share

                                  the masses may not buy the winning top card, but they tend to buy the winning brand name of that top card.

                                  and AMD needs to get there soon
                                  TBH I'm not convinced by the argument that having a halo 4K card sells more mid-range; there is far more to it than that. I agree (and you've mentioned it numerous times) my last three GPUs were mid-range AMD cards in the scheme of things. During that time Nvidia always had the halo card but that never bothered me. There were a couple of things that made me stick with AMD.

                                  Firstly, I had a Freesync monitor so I was in the AMD ecosystem, plus at the time Nvidia didn't support it. Secondly, I looked at the price/performance ratio and AMD's cards always seemed better in that respect. We're all individuals, so by lumping us all together as a homogeneous mass, as Nagorak mentions, you're making the assumption we don't know what we're doing, which is a bit disingenuous and sounds like you're looking down on people who buy mid-range cards as stupid.

                                  Two other things you ignore: if you already own an Nvidia card it's simply a plug and play upgrade, whereas buying a new AMD card means getting used to newer things and driver settings. This is quite off-putting for a lot of people who are not used to it.

                                  TBH I'm still finding my way around the Nvidia driver panel; it's a lot different to AMD's Adrenalin software. You shouldn't underestimate that as it's a steep learning curve. I'm replicating the optimal settings in GeForce Experience because I haven't used an Nvidia card for a decade, so any help I can get is welcome.

                                  The other thing you're ignoring is AMD's (justified or not) reputation for drivers. Whether it's true or not, people just believe it and use it as a reason not to buy an AMD card. I've had my issues to be sure, and I agree that Nvidia have their own issues, but the myth that AMD drivers are bad has become a fact people believe without questioning.

                                  There are a lot of things AMD need to do, and just releasing a halo card is not going to fix them.


                                    #97
                                    Originally posted by LordHawkwind View Post
                                    TBH I'm not convinced by the argument that having a halo 4K card sells more mid-range; there is far more to it than that. I agree (and you've mentioned it numerous times) my last three GPUs were mid-range AMD cards in the scheme of things. During that time Nvidia always had the halo card but that never bothered me. There were a couple of things that made me stick with AMD.

                                    Firstly, I had a Freesync monitor so I was in the AMD ecosystem, plus at the time Nvidia didn't support it. Secondly, I looked at the price/performance ratio and AMD's cards always seemed better in that respect. We're all individuals, so by lumping us all together as a homogeneous mass, as Nagorak mentions, you're making the assumption we don't know what we're doing, which is a bit disingenuous and sounds like you're looking down on people who buy mid-range cards as stupid.

                                    Two other things you ignore: if you already own an Nvidia card it's simply a plug and play upgrade, whereas buying a new AMD card means getting used to newer things and driver settings. This is quite off-putting for a lot of people who are not used to it.

                                    TBH I'm still finding my way around the Nvidia driver panel; it's a lot different to AMD's Adrenalin software. You shouldn't underestimate that as it's a steep learning curve. I'm replicating the optimal settings in GeForce Experience because I haven't used an Nvidia card for a decade, so any help I can get is welcome.

                                    The other thing you're ignoring is AMD's (justified or not) reputation for drivers. Whether it's true or not, people just believe it and use it as a reason not to buy an AMD card. I've had my issues to be sure, and I agree that Nvidia have their own issues, but the myth that AMD drivers are bad has become a fact people believe without questioning.

                                    There are a lot of things AMD need to do, and just releasing a halo card is not going to fix them.
                                    it is the same as with CPUs: it took 3 years of top CPUs to get people to start dropping Intel
                                    and it will be the same with GPUs
                                    the 6800 xt is a good start, but they need more now, with RT

                                    I would take either a 3080 Strix or a 6800 xt Strix at this point, as i don't really care about RT. it's just not ready for prime time even after 2+ years
                                    and without RT they are about the same at 4k, but the 6800 xt uses less watts
                                    but they will both only be about 25% to 30% faster at 4k over what i have

                                    but RT is all the buzz, even if it will take 4 years to be ready for prime time, so AMD needs it

                                    i wish the NVidia control panel was like the Adrenalin software and that POS GeForce Experience was gone

                                    the whole lot of that ATI/AMD bad driver thing started with the Nvidia Focus Group pushing it around the net
                                    some of it was deserved at times, but mostly it was that NV had better SLI drivers in more games than Crossfire, up till the 290x
                                    single card is about the same and always has been; they both have problems now and then with some games



                                      #98
                                      I remember the ugly CP on my 1060. Do not miss it. Nor the Linux hell I went through.


                                        #99
                                        Originally posted by SuperGeil View Post
                                        I remember the ugly CP on my 1060. Do not miss it. Nor the Linux hell I went through.
                                        it's the same one, it has not changed since Windows 3.0



                                          Infinity cache with AI acceleration?

                                          This confirms they are going all in on on-chip memory, but the patent indicates they don't just want memory to function as a superfast cache - they want the memory modules to also accelerate AI.

                                          CDNA 2? RDNA 3?


                                            Originally posted by SuperGeil View Post
                                            I remember the ugly CP on my 1060. Do not miss it. Nor the Linux hell I went through.
                                            Originally posted by bill dennison View Post
                                            it's the same one, it has not changed since Windows 3.0
                                            FPS with RTX/DLSS > Cute CP.



                                              I'm not interested in turning a pretty good Radeon architecture thread into another NV marketing circus show.


                                                Originally posted by SuperGeil View Post
                                                I'm not interested in turning a pretty good Radeon architecture thread into another NV marketing circus show.
                                                This is a forum, you make a statement, someone refutes it. That's how it works, verstehen?



                                                  Originally posted by acroig View Post
                                                  This is a forum, you make a statement, someone refutes it. That's how it works, verstehen?
                                                  That is an interesting way to refute a statement about a CP being ugly. You should not be cognitively bothered by it.

                                                  EDIT: you got a moderator to ban me from speaking German to another German, and then you proceed to condescend to me in German. I know you don't like me, but trust me, I'm not going to affect Nvidia's sales too much


                                                    Originally posted by SuperGeil View Post
                                                    That is an interesting way to refute a statement about a CP being ugly. You should not be cognitively bothered by it.
                                                    The fact that I disagree with you and explained the reasoning behind it is not nV marketing circus, it's an opinion.

                                                    Have the last word on this.



                                                      Originally posted by acroig View Post
                                                      FPS with RTX/DLSS > Cute CP.
                                                      preparation DLSS sucks KAC's hairy ass



                                                        Originally posted by bill dennison View Post
                                                        preparation DLSS sucks KAC's hairy ass
                                                        Maybe.... but the FPS are still up there.

                                                        Stop arguing or I'll ask Exposed to chime in about DLSS.



                                                          Originally posted by acroig View Post
                                                          The fact that I disagree with you and explained the reasoning behind it is not nV marketing circus, it's an opinion.

                                                          Have the last word on this.
                                                          But it's not a disagreement about the CP being ugly, because you would have said otherwise. It's cleverly ambiguous in that regard. Did you mean "you are right, the CP is ugly, but we have ray tracing and DLSS", or did you mean "no, you are wrong, the control panel is pretty because we have ray tracing and DLSS"? Either way, it makes no sense.

                                                          I think you know what you are doing.


                                                            Originally posted by acroig View Post
                                                            Maybe.... but the FPS are still up there.

                                                            Stop arguing or I'll ask Exposed to chime in about DLSS.
                                                            i like RT for the most part, when it is not turning clear glass into a funhouse mirror like in 2077


                                                            but if i have to turn on more preparation DLSS than quality at 4k, i will turn RT off also and run without both.

                                                            and i get better FPS in 2077 at 4k without RT & preparation DLSS quality than with both on
                                                            and i have been playing 2077 with both off

                                                            and performance & balanced are just bad.
                                                            if i wanted to play at 1080p i wouldn't have paid 2000+ bucks for this 4K OLED 55" LG CX

                                                            hell, let's go back to a CGA CRT to get FPS for RT

                                                            what i really need a new card for is HDMI 2.1
                                                            Last edited by bill dennison; Jan 29, 2021, 12:11 PM.



                                                              Originally posted by bill dennison View Post
                                                              i like RT for the most part, when it is not turning clear glass into a funhouse mirror like in 2077


                                                              but if i have to turn on more preparation DLSS than quality at 4k, i will turn RT off also and run without both.
                                                              Not to beat a dead horse but DLSS is quite good on Cyberpunk.

                                                              Comment


                                                                Originally posted by acroig View Post
                                                                Not to beat a dead horse but DLSS is quite good on Cyberpunk.
                                                                quality is ok at 4k, but it is not much of an FPS boost over DLSS off

                                                                and being 36 inches away from a 4K OLED 55" LG CX, i don't like performance & balanced.

                                                                Comment


                                                                  Originally posted by bill dennison View Post
                                                                  preparation DLSS sucks KAC's hairy ass
                                                                  None of the games I play with ray tracing offer DLSS, only Ultra RT. OK, not 4K but 1440p, but if it comes with the card why not use it? TBH 1440p native with Ultra RT looks pretty good to me.

                                                                  Do you really think AMD's version of DLSS is going to be any better? I think the jury might be out on that one. Let's see, I guess.
                                                                  Ryzen 7 3800X, ASUS Prime X470 Pro, KFA2 RTX 3090 SG, 16GB Crucial DDR4 LPX 3000 Ram, iiyama G-Master GB3466WQSU 3440x1440 freesync 144hz, 250gb Samsung SSD, 750mb Seagate SSHD, 2TB Seagate Barracuda H/D, 1 TB Samsung H/D, 850w PSU, Windows 10

                                                                  Comment


                                                                    Originally posted by LordHawkwind View Post
                                                                    None of the games I play with ray tracing offer DLSS, only Ultra RT. OK, not 4K but 1440p, but if it comes with the card why not use it? TBH 1440p native with Ultra RT looks pretty good to me.

                                                                    Do you really think AMD's version of DLSS is going to be any better? I think the jury might be out on that one. Let's see, I guess.
                                                                    no, and i won't use AMD's either if it's not

                                                                    and 1440p is less than the 30 inch 2560x1600 i started using in 2007,
                                                                    so i can deal with it

                                                                    in 4 years, 2 to 3 generations, RT will be great when GPUs can do it without preparation DLSS

                                                                    till then it is great to look at for a bit, then turn off to play the game, if it is like 2077. and even then you can play at 30 to 35 FPS with RT on and G-sync, or 35 to 40 FPS without RT;
                                                                    DLSS quality splits the difference
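Editor's aside: the FPS ranges above can be restated as frame-time budgets, which makes the RT cost concrete. This is just arithmetic on the numbers quoted in the post, nothing else assumed:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds the GPU spends on each frame at a given frame rate."""
    return 1000.0 / fps

# FPS figures quoted in the post above
for label, fps in [("RT on, low end", 30), ("RT on, high end", 35),
                   ("RT off, high end", 40)]:
    print(f"{label}: {frame_time_ms(fps):.1f} ms/frame")
```

At those rates RT adds roughly 5 to 8 ms per frame (33.3 ms at 30 FPS vs 25.0 ms at 40 FPS), which is the gap an upscaler has to claw back to "split the difference".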

                                                                    Comment


                                                                      Originally posted by bill dennison View Post
                                                                      quality is ok at 4k, but it is not much of an FPS boost over DLSS off

                                                                      and being 36 inches away from a 4K OLED 55" LG CX, i don't like performance & balanced.
                                                                      Wow, you sit three feet away from a 55" screen? I sit a few feet away from my 27" monitor and it seems big.

                                                                      I have a 2018 LG 4K 50" HDR TV in the living room and sit maybe 8 to 10 feet away, and it's difficult to tell the difference when watching 4K output. Now HDR I can tell the difference straight away, very nice indeed. The Xbone X looks good with HDR content, but again I'm up to 10 feet away.

                                                                      Good for sports sims and racing, but I need my 1440p monitor and keyboard for first-person shooters.
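Editor's aside: the viewing-distance observations above can be quantified as pixels per degree of visual angle. The ~60 PPD threshold for 20/20 acuity is a common rule of thumb (my assumption, not from this thread); a rough sketch for the two setups mentioned:

```python
import math

def pixels_per_degree(diag_in, h_pixels, dist_in):
    """Horizontal pixels per degree of visual angle, assuming a 16:9 panel."""
    width_in = diag_in * 16 / math.hypot(16, 9)   # panel width from its diagonal
    pitch_in = width_in / h_pixels                # inches per pixel
    # visual angle subtended by one pixel, in degrees
    deg_per_px = math.degrees(2 * math.atan(pitch_in / (2 * dist_in)))
    return 1 / deg_per_px

# 55" 4K viewed at 36" (the post above) vs 50" 4K viewed at ~9 ft (this post)
print(f'55" 4K at 36 in:  {pixels_per_degree(55, 3840, 36):.0f} PPD')
print(f'50" 4K at 108 in: {pixels_per_degree(50, 3840, 108):.0f} PPD')
```

The first setup lands around 50 PPD, below the ~60 PPD rule of thumb, so upscaling artifacts are resolvable; the second is well over 150 PPD, consistent with not being able to tell 4K from lower resolutions at that distance.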
                                                                      Ryzen 7 3800X, ASUS Prime X470 Pro, KFA2 RTX 3090 SG, 16GB Crucial DDR4 LPX 3000 Ram, iiyama G-Master GB3466WQSU 3440x1440 freesync 144hz, 250gb Samsung SSD, 750mb Seagate SSHD, 2TB Seagate Barracuda H/D, 1 TB Samsung H/D, 850w PSU, Windows 10

                                                                      Comment


                                                                        Originally posted by bill dennison View Post
                                                                        preparation DLSS sucks KAC's hairy ass
                                                                        Billy and his nonsense posts. I see no issues while using DLSS. No reason to be so upset about it, Billy!

                                                                        Comment


                                                                          Originally posted by Nascar24 View Post
                                                                          Billy and his nonsense posts. I see no issues while using DLSS. No reason to be so upset about it, Billy!
                                                                          i don't like it. sorry if that offends you, but i don't really give a Sh*t

                                                                          and i wasn't the one that brought NV DLSS into a RDNA 3 Rumor Discussion thread







                                                                          ..............

                                                                          Could AMD build its next RDNA graphics card out of chiplets?
                                                                          https://www.pcgamer.com/amd-mcm-gpu-...s-card-rdna-3/


                                                                          .........
                                                                          AMD Files Patent for Chiplet Machine Learning Accelerator to be Paired With GPU, Cache Chiplets
                                                                          This could give AMD a modular way to add machine-learning capabilities to several of their designs through the inclusion of such a chiplet, and might be AMD's way of achieving hardware acceleration of a DLSS-like feature.
                                                                          https://www.techpowerup.com/277856/a...e-chiplets?amp

                                                                          and if it is not better than DLSS 2 Q i won't use it either
                                                                          Last edited by bill dennison; Jan 29, 2021, 03:48 PM.

                                                                          Comment


                                                                            Originally posted by bill dennison View Post
                                                                            i don't like it. sorry if that offends you, but i don't really give a Sh*t

                                                                            and i wasn't the one that brought NV DLSS into....
                                                                            My bad, let's move on, the CP argument seemed silly so I brought up other points.

                                                                            Comment


                                                                              Originally posted by acroig View Post
                                                                              Maybe.... but the FPS are still up there.

                                                                              Stop arguing or I'll ask Exposed to chime in about DLSS.

                                                                              Comment




                                                                                  Originally posted by acroig View Post
                                                                                  My bad, let's move on, the CP argument seemed silly so I brought up other points.

                                                                                  You knew what you were doing and what was going to happen. You were not that insecure about ugly CP comments.

                                                                                  Originally posted by SuperGeil View Post
                                                                                  I'm not interested in turning a pretty good Radeon architecture thread into another Nv marketing circus show.
                                                                                  Back to RDNA3
                                                                                  i10400
                                                                                  WX3200 Radeon Pro
                                                                                  Sound BlasterX G6 + Sony MDR 7506
                                                                                  LG 43UD79-B

                                                                                  Comment
