    Announcement, & Talking About, FSR Thread

    AMD is dropping the hammer with support for everything from the RX 500 series onward, and Nvidia's 1000 series onward, as well as consoles. This is big news. Ten developers/engines are on board right now, with more to come.

    https://videocardz.com/newz/amd-fide...d-geforce-gpus
    Main rig: look at system spec tab
    Storage Server: Dual AMD Opteron 6120 CPUs, 64Gigs ECC Ram 50TB usable space across 3 zfs2 pools


    HOURGLASS = most appropriate named ICON/CURSOR in the Windows world :-)

    In a dank corner of ATI central, the carpet covered with corn flakes, the faint sound of clicking can be heard........Click......click, click............as the fate of the graphics world and the future of the human race hangs in the balance.

    I know....I know........Keep my day job :-)- catcather

    #2
    Those are very impressive numbers they are marketing.

    I hope it's true... does anyone think it will be that close?

    I would assume the example they give (Godfall) is their best one. Anyone have ideas or thoughts on what the average increase will be?

    I am not familiar enough with DLSS numbers; how do these compare?

    In terms of image quality, the link also claims an 'almost identical' image to the native 4K one (I assume at the Ultra setting... not sure about the other options). Is that similar for DLSS?

    On a scale of 1 to 10, how true do you feel this announcement is?
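
    (For rough context on where numbers like these come from: the gains mostly track the drop in rendered pixels. The per-preset render scales below are assumptions for illustration only, not figures from AMD's announcement.)

[code]
# Back-of-the-envelope only: assumes each preset renders at the listed
# fraction of 4K per axis (assumed values for illustration, not official
# figures) and that raster cost scales roughly with pixel count.
target_w, target_h = 3840, 2160  # 4K output

presets = {
    "Ultra Quality": 1 / 1.3,   # assumed per-axis render scale
    "Quality":       1 / 1.5,
    "Balanced":      1 / 1.7,
    "Performance":   1 / 2.0,
}

native_pixels = target_w * target_h
for name, scale in presets.items():
    render_pixels = round(target_w * scale) * round(target_h * scale)
    print(f"{name:<13} renders ~{render_pixels / native_pixels:.0%} of native pixels"
          f" -> up to ~{native_pixels / render_pixels:.1f}x raster headroom")
[/code]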
    O.D.

    I propose a toast....

    Comment


      #3
      [yt]eHPmkJzwOFc[/yt]

      I remain skeptical. But the wide support going way back is uber sweet.
      -Trunks0
      not speaking for all and if I am wrong I never said it.
      (plz note that is meant as a joke)


      System:
      Asus TUF Gaming X570-Pro - AMD Ryzen 7 5800x - Noctua NH-D15S chromax.Black - 32gb of G.Skill Trident Z NEO - Asus DRW-24F1ST DVD±RW - Samsung 850 Evo 250Gib - 4TiB Seagate - PowerColor RedDevil Radeon RX 7900XTX - Creative AE-5 Plus - Windows 10 64-bit

      Comment


        #4
        Just to get this straight. This is to be baked into games and will not be a "Radeon Settings" option.. right?
        <---Computer #1 in System Specs Button

        Computer#2), MSI 970 Gaming, FX 8350, Antec 750w True, 128GB Sandisk SSD, 1TB WD Black, Win 10pro x64
        Computer #3) MSI A88X-G45, AMD A10-7850K, IGP, AMD 2x4GB 2133 Ram (R938G2130U1K),Thermaltake 550w, Win10pro

        Comment


          #5
          Originally posted by Gandalfthewhite View Post
          AMD is dropping the hammer with support for everything from the RX 500 series onward, and Nvidia's 1000 series onward, as well as consoles. This is big news. Ten developers/engines are on board right now, with more to come.

          https://videocardz.com/newz/amd-fide...d-geforce-gpus
          This is awesome news, TY.

          Comment


            #6
            Originally posted by Hardwood View Post
            Just to get this straight. This is to be baked into games and will not be a "Radeon Settings" option.. right?
            I haven't read anything yet on where the control over the quality settings will be located, but given that it supports GTX and RTX cards it's a safe bet it won't be a "Radeon Settings" option.

            Comment


              #7
              [yt]9UoghWAZ_L0[/yt]


              HU seems a bit sceptical about whether it will actually be competitive with DLSS 2.0.


              So far, from AMD's own slide, it looks more like DLSS 1.0.


              It's welcome, especially for non-RTX owners, but a software scaler likely won't ever give the same kind of qualitative output as an AI-trained hardware scaler can.

              Comment


                #8
                Originally posted by Exposed View Post
                It's welcome, especially for non-RTX owners, but a software scaler likely won't ever give the same kind of qualitative output as an AI-trained hardware scaler can.
                Yeah but at this point it makes RT feasible on Radeon cards without the huge FPS hit.

                Comment


                  #9
                  Very smart showcasing on a Gtx 1060.
                  Really enjoy 3d gaming flexibility; a gamer's best friend!

                  Comment


                    #10
                    From what people are saying via leaks, it's in between DLSS 1 and DLSS 2, but way easier to implement, and with such broad support it seems like it will be a much easier sell for implementation.

                    I view it as the new FreeSync vs G-Sync, and we all saw how that went.

                    Sent from my GM1917 using Tapatalk
                    Main rig: look at system spec tab
                    Storage Server: Dual AMD Opteron 6120 CPUs, 64Gigs ECC Ram 50TB usable space across 3 zfs2 pools


                    HOURGLASS = most appropriate named ICON/CURSOR in the Windows world :-)

                    In a dank corner of ATI central, the carpet covered with corn flakes, the faint sound of clicking can be heard........Click......click, click............as the fate of the graphics world and the future of the human race hangs in the balance.

                    I know....I know........Keep my day job :-)- catcather

                    Comment


                      #11
                      Originally posted by Gandalfthewhite View Post
                      I view it as the new FreeSync vs G-Sync, and we all saw how that went.

                      Sent from my GM1917 using Tapatalk
                      How has that gone? GSync hardware modules are still superior to FreeSync and monitors with those modules still sell extremely well with a price mark-up.
                      Originally posted by curio
                      Eat this protein bar, for it is of my body. And drink this creatine shake, for it is my blood.
                      "If you can't handle me when I'm bulking, you don't deserve me when I'm cut." -- Marilyn Monbroe

                      Comment


                        #12
                        Originally posted by Nunz View Post
                        How has that gone? GSync hardware modules are still superior to FreeSync and monitors with those modules still sell extremely well with a price mark-up.
                        Given that at this time there are around 91 actual G-Sync monitors versus hundreds of FreeSync monitors, that Nvidia has had to target its marketing at "G-Sync Compatible" to stay relevant, and that HDMI 2.1 is now picking up steam and pushing it further out of that market... sure, it may be the "better" solution, but it lost the war.

                        Sent from my GM1917 using Tapatalk
                        Main rig: look at system spec tab
                        Storage Server: Dual AMD Opteron 6120 CPUs, 64Gigs ECC Ram 50TB usable space across 3 zfs2 pools


                        HOURGLASS = most appropriate named ICON/CURSOR in the Windows world :-)

                        In a dank corner of ATI central, the carpet covered with corn flakes, the faint sound of clicking can be heard........Click......click, click............as the fate of the graphics world and the future of the human race hangs in the balance.

                        I know....I know........Keep my day job :-)- catcather

                        Comment


                          #13
                          Many of those hundreds of monitors are literally trash though.

                          FreeSync Premium/Premium Pro monitors are fine, but anything predating that isn't even remotely comparable to G-Sync monitors in terms of quality standards.
                          Last edited by Mangler; Jun 1, 2021, 09:03 AM.

                          Comment


                            #14
                            Drop the analogy, and if you want to have a FreeSync/G-Sync debate, please create a new thread.


                            **Further comments on the FreeSync/G-Sync debate will be deleted from this thread**
                            -Trunks0
                            not speaking for all and if I am wrong I never said it.
                            (plz note that is meant as a joke)


                            System:
                            Asus TUF Gaming X570-Pro - AMD Ryzen 7 5800x - Noctua NH-D15S chromax.Black - 32gb of G.Skill Trident Z NEO - Asus DRW-24F1ST DVD±RW - Samsung 850 Evo 250Gib - 4TiB Seagate - PowerColor RedDevil Radeon RX 7900XTX - Creative AE-5 Plus - Windows 10 64-bit

                            Comment


                              #15
                              Originally posted by Trunks0 View Post
                              Drop the analogy, and if you want to have a FreeSync/G-Sync debate, please create a new thread.


                              **Further comments on the FreeSync/G-Sync debate will be deleted from this thread**
                              What about Vsync, Adaptive Sync?
                              Intel Core i9 10900K @ 5.2GHz, Asus Maximus XII Apex, GSkill Trident-Z Royal DDR4 3200MHz 32GB CAS11, Asus Strix 3080Ti OC, Creative Labs SXFI Theater, Samsung 970 Evo Plus 1TB, Corsair AXI 1500i PSU, ThermalTake View 71, Corsair K95 Platinum RGB, Corsair Dark Core RGB SE, Acer Predator X34, Windows 10 Professional X64

                              Comment


                                #16
                                Originally posted by Hapatingjaky View Post
                                What about Vsync, Adaptive Sync?
                                -Trunks0
                                not speaking for all and if I am wrong I never said it.
                                (plz note that is meant as a joke)


                                System:
                                Asus TUF Gaming X570-Pro - AMD Ryzen 7 5800x - Noctua NH-D15S chromax.Black - 32gb of G.Skill Trident Z NEO - Asus DRW-24F1ST DVD±RW - Samsung 850 Evo 250Gib - 4TiB Seagate - PowerColor RedDevil Radeon RX 7900XTX - Creative AE-5 Plus - Windows 10 64-bit

                                Comment


                                  #17
                                  Anyway, the other news going around is that it looks better than TAA, performs better than TAA, is easier to implement than DLSS, and works on the last 5 years of GPUs from AMD and Nvidia if they decide to use it. AND it supports both the PS5 and Xbox Series S/X.


                                  I have a feeling the new UE5 temporal reconstruction stuff is actually based on FSR, or is the foundation for it.
                                  Main rig: look at system spec tab
                                  Storage Server: Dual AMD Opteron 6120 CPUs, 64Gigs ECC Ram 50TB usable space across 3 zfs2 pools


                                  HOURGLASS = most appropriate named ICON/CURSOR in the Windows world :-)

                                  In a dank corner of ATI central, the carpet covered with corn flakes, the faint sound of clicking can be heard........Click......click, click............as the fate of the graphics world and the future of the human race hangs in the balance.

                                  I know....I know........Keep my day job :-)- catcather

                                  Comment


                                    #18
                                    I'm pretty excited it works on GCN, because my R9 Fury is GCN and this could potentially help me wait out the current supply/demand and pricing issues with GPUs. Assuming it will work on my R9 Fury, that is. At the moment, they just say as far back as the RX 500 series. But GCN never really changed much feature-wise... going to be interesting.

                                    *late edits!*
                                    Oh and if it gets implemented on anything I play of course.
                                    -Trunks0
                                    not speaking for all and if I am wrong I never said it.
                                    (plz note that is meant as a joke)


                                    System:
                                    Asus TUF Gaming X570-Pro - AMD Ryzen 7 5800x - Noctua NH-D15S chromax.Black - 32gb of G.Skill Trident Z NEO - Asus DRW-24F1ST DVD±RW - Samsung 850 Evo 250Gib - 4TiB Seagate - PowerColor RedDevil Radeon RX 7900XTX - Creative AE-5 Plus - Windows 10 64-bit

                                    Comment


                                      #19
                                      Bring it on, AMD.

                                      And it's not proprietary. Good.
                                      Intel 10600K @4.9GHz, Nvidia RTX 3070(ZOTAC Twin Edge),MSI MPG 490 Gaming Edge, Corsair Vengeance 16GB 3200MHZ,LG 27GL850-B.

                                      Comment


                                        #20
                                        Originally posted by SIrPauly View Post
                                        Very smart showcasing on a Gtx 1060.


                                        i think i can hear the grinding of the jacket's teeth from here

                                        Comment


                                          #21
                                          Great news, at least we can compare it to DLSS if both are supported.
                                          CROSSHAIR X670E HERO / R9 7950X3D / RTX 4090 GAMING OC / TRIDENT Z5 NEO RGB 6000 CL30 / SAMSUNG 980pro 1TB / 2x SAMSUNG 980 1TB / H150i ELITE LCD / ATH-A2000Z / HX1200 / AW3821DW 38" / LG C2 OLED evo 55" / Enthoo 719 / K70 MKII + Zowie S2 / K57 + Harpoon / Xbox Series X Controller / REVERB G2 V2
                                          ____________________

                                          Comment


                                            #22
                                            Originally posted by bill dennison View Post


                                            i think i can hear the grinding of the jacket's teeth from here
                                            Originally posted by demo View Post
                                            Great news, at least we can compare it to DLSS if both are supported.
                                            If nVidia really wanted to, they could open DLSS up to work without the tensor cores... they choose not to.

                                            Look no further than Control. That game launched with shader-based DLSS and was later patched to DLSS 2.0, which switched to using the tensor cores.
                                            -Trunks0
                                            not speaking for all and if I am wrong I never said it.
                                            (plz note that is meant as a joke)


                                            System:
                                            Asus TUF Gaming X570-Pro - AMD Ryzen 7 5800x - Noctua NH-D15S chromax.Black - 32gb of G.Skill Trident Z NEO - Asus DRW-24F1ST DVD±RW - Samsung 850 Evo 250Gib - 4TiB Seagate - PowerColor RedDevil Radeon RX 7900XTX - Creative AE-5 Plus - Windows 10 64-bit

                                            Comment


                                              #23
                                              I was under the impression moving to tensor cores gave IQ and performance advantages under 2.0+, so I wouldn't consider that a negative.
                                              CROSSHAIR X670E HERO / R9 7950X3D / RTX 4090 GAMING OC / TRIDENT Z5 NEO RGB 6000 CL30 / SAMSUNG 980pro 1TB / 2x SAMSUNG 980 1TB / H150i ELITE LCD / ATH-A2000Z / HX1200 / AW3821DW 38" / LG C2 OLED evo 55" / Enthoo 719 / K70 MKII + Zowie S2 / K57 + Harpoon / Xbox Series X Controller / REVERB G2 V2
                                              ____________________

                                              Comment


                                                #24
                                                It would still give a performance boost. The image quality would be the same; it just may not be as performant.
                                                -Trunks0
                                                not speaking for all and if I am wrong I never said it.
                                                (plz note that is meant as a joke)


                                                System:
                                                Asus TUF Gaming X570-Pro - AMD Ryzen 7 5800x - Noctua NH-D15S chromax.Black - 32gb of G.Skill Trident Z NEO - Asus DRW-24F1ST DVD±RW - Samsung 850 Evo 250Gib - 4TiB Seagate - PowerColor RedDevil Radeon RX 7900XTX - Creative AE-5 Plus - Windows 10 64-bit

                                                Comment


                                                  #25
                                                  IQ advantages in the sense it can run more complex AI algorithms for better reconstruction - sure, at better performance.
                                                  CROSSHAIR X670E HERO / R9 7950X3D / RTX 4090 GAMING OC / TRIDENT Z5 NEO RGB 6000 CL30 / SAMSUNG 980pro 1TB / 2x SAMSUNG 980 1TB / H150i ELITE LCD / ATH-A2000Z / HX1200 / AW3821DW 38" / LG C2 OLED evo 55" / Enthoo 719 / K70 MKII + Zowie S2 / K57 + Harpoon / Xbox Series X Controller / REVERB G2 V2
                                                  ____________________

                                                  Comment


                                                    #26
                                                    Was expecting more, but it has terrible image quality from what they showed. It's just basic upscaling. Hell, those on the console side will be the most disappointed; they were hoping for a DLSS equivalent to boost their performance.

                                                    However, what they have now is already as good, if not better.

                                                    At least 3D cache on Zen 3+ looks good.

                                                    Hopefully AMD works on this more and puts some actual money into a DLSS competitor.
                                                    Fantards the scourge of the universe:

                                                    Comment


                                                      #27
                                                      Originally posted by bill dennison View Post


                                                      i think i can hear the grinding of the jacket's teeth from here
                                                      Stop it Billy, you don't like DLSS 2; how in the world would you like this?

                                                      Comment


                                                        #28
                                                        Originally posted by Nascar24 View Post
                                                        Stop it Billy, you don't like DLSS 2; how in the world would you like this?
                                                        i don't, and i have said i think it is not good for 4k IQ on AMD or NV

                                                        but i just love that AMD's version works on more NV and older hardware (Pascal, maybe more) than NVidia's own DLSS does
                                                        plus the game consoles


                                                        AMD will catch up to NV quickly in IQ and then NV's dlss will be dead because it only works on RTX cards
                                                        Last edited by bill dennison; Jun 2, 2021, 08:26 AM.

                                                        Comment


                                                          #29
                                                          Originally posted by bill dennison View Post
                                                          AMD will catch up to NV quickly in IQ and then NV's dlss will be dead because it only works on RTX cards
                                                          Sad part is nV could expand DLSS to other gen cards using shader cores but......

                                                          Comment


                                                            #30
                                                            That’s a classic Bill opinion thinking AMD will equal NV’s DLSS “quickly” .. I don’t see that happening, at least for a while.
                                                            Originally posted by curio
                                                            Eat this protein bar, for it is of my body. And drink this creatine shake, for it is my blood.
                                                            "If you can't handle me when I'm bulking, you don't deserve me when I'm cut." -- Marilyn Monbroe

                                                            Comment


                                                              #31
                                                              Pretty harsh article from wccftech, but I think he makes a few mistakes:

                                                              https://wccftech.com/no-amds-fsr-fid...u-should-care/

                                                              FSR does not use any machine learning or inference and while it is an amazing tool to have in the absence of a DL system - it is not comparable in any way to an AI-powered image upscaling system. The former will always have a quality cost associated with it while the latter can actually get to a point where it would be impossible to see differences between native and AI-upscaled images. With the non-DL implementation AMD has rolled out with FSR, you are looking at quality that is worse than DLSS 1.0 on the highest preset. Performance presets should impact quality even more.
                                                              I'd read that DLSS up to version 1.9 was also a spatial upscaler. They gave up on it and moved to AI/DL in 2.0. That doesn't mean it can't work well; it may simply need a lot more work before we can call its quality good. Also, with game engines like UE5 putting in their own upscaling tech, I suspect we may end up using that more often than what the GPU vendors provide as time goes on.

                                                              He also used the 1060 demo, which used the medium setting, to draw a comparison. It was enabled for the 1060 but not optimized.

                                                              AMD's Scott Herkelman has stated that they have no intention of optimizing FSR for NVIDIA GPUs and that NVIDIA should do that work. While it would have been a completely reasonable expectation in normal circumstances, the fact that AMD expounded on NVIDIA support, absorbed a ton of good press on this and is now basically back tracking makes it seem like a bait and switch situation. This also implies that FSR for NVIDIA users will be optimized only for Godfall unless NVIDIA wants to adopt the technology (which, in my opinion, they absolutely should for non-RTX cards).


                                                              Hardly fair, when Nvidia doesn't provide any upscaler for GTX cards. I don't see how we can expect AMD to help write Nvidia drivers. It's probably not even legal.
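
                                                              (To make the "spatial upscaler" distinction concrete: a purely spatial scaler rebuilds each output pixel only from nearby pixels of the current low-res frame, with no motion vectors, no frame history and no trained network. The sketch below is just that general idea in Python/NumPy; it is not AMD's actual EASU/FSR algorithm, and the function name and example resolutions are illustrative.)

[code]
import numpy as np

def spatial_upscale(img, out_h, out_w):
    """Minimal single-frame (purely spatial) upscale via bilinear filtering.
    Every output pixel is rebuilt only from its four nearest low-res
    neighbours: no motion vectors, no frame history, no trained network."""
    in_h, in_w = img.shape[:2]
    # Map output pixel centres back into source coordinates.
    ys = np.clip((np.arange(out_h) + 0.5) * in_h / out_h - 0.5, 0, in_h - 1)
    xs = np.clip((np.arange(out_w) + 0.5) * in_w / out_w - 0.5, 0, in_w - 1)
    y0 = np.minimum(np.floor(ys).astype(int), in_h - 2)
    x0 = np.minimum(np.floor(xs).astype(int), in_w - 2)
    wy = (ys - y0)[:, None, None]   # vertical blend weights
    wx = (xs - x0)[None, :, None]   # horizontal blend weights
    tl = img[y0][:, x0]             # top-left neighbours
    tr = img[y0][:, x0 + 1]         # top-right
    bl = img[y0 + 1][:, x0]         # bottom-left
    br = img[y0 + 1][:, x0 + 1]     # bottom-right
    return (tl * (1 - wy) * (1 - wx) + tr * (1 - wy) * wx +
            bl * wy * (1 - wx) + br * wy * wx)

# e.g. reconstruct a 1080p frame from a 720p render
low_res = np.random.rand(720, 1280, 3).astype(np.float32)
output = spatial_upscale(low_res, 1080, 1920)
[/code]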
                                                              I talked to the tree. Thats why they put me away!..." Peter Sellers, The Goon Show
                                                              Only superficial people cant be superficial... Oscar Wilde

                                                              Piledriver Rig 2016: Gigabyte G1 gaming 990fx. FX 8350 cpu. XFX RX 480 GTR Cats 22.7.1, SoundBlaster ZXR, 2 x 8 gig ddr3 1866 Kingston. 1 x 2tb Firecuda seagate with 8 gig mlc SSHD. Sharp 60" 4k 60 hz tv. Win 10 home.

                                                              Ryzen Rig 2017: Gigabyte X370 K7 F50d bios. Ryzen 5800X3D :). 2 x 8 ddr4 3600 (@3200) Cas 16 Gskill. Sapphire Vega 64 Reference Cooler Cats 22.4.1. 1700 mhz @1.1v. Soundblaster X Ae5, 32" Dell S3220DGF 1440p Freesync Premium Pro monitor, Kingston A2000 1TB NVME. 4 TB HGST NAS HD. Win 11 pro.

                                                              Ignore List: Keystone, Andino... -My Baron, he wishes to inform you that vendetta, as he puts it in the ancient tongue, the art of kanlee is still alive... He does not wish to meet or speak with you...-
                                                              "Either half my colleagues are enormously stupid, or else the science of darwinism is fully compatible with conventional religious beliefs and equally compatible with atheism." -Stephen Jay Gould, Rock of Ages.
                                                              "The Intelligibility of the Universe itself needs explanation. It is not the gaps of understanding of the world that points to God but rather the very comprehensibility of scientific and other forms of understanding that requires an explanation." -Richard Swinburne

                                                              www.realitysandwich.com

                                                              www.plasma-universe.com/pseudoskepticism/

                                                              Comment


                                                                #32
                                                                Originally posted by Nunz View Post
                                                                That’s a classic Bill opinion thinking AMD will equal NV’s DLSS “quickly” .. I don’t see that happening, at least for a while.
                                                                i give it a year, two tops, till it is close enough that game devs won't want to spend the extra time and money programming different code just for NV RTX, when one standard works on the new game consoles and on PC for both AMD and NV

                                                                oh i'm sure NV will pour cash on a game dev here and there but it won't last


                                                                then RDNA 3 chiplets in a year & RDNA 4 in 2.5 to 3 years may well make it not needed much in about the same time frame

                                                                Comment


                                                                  #33
                                                                  It's going to be needed once true ps5/xsx games appear and not crossgen stuff.

                                                                  There's always going to be something to destroy our fps.
                                                                  Fantards the scourge of the universe:

                                                                  Comment


                                                                    #34
                                                                    Here's a full resolution image of "quality mode" on a 1060.

                                                                    Unfortunately, this doesn't look too promising. The right side of the screen, where quality mode is enabled, is noticeably blurrier, especially in terms of texture detail, like by the columns. Was DLSS 1.0 this bad? I can't remember, but it was rightfully slammed for its blurriness, especially in BF5.

                                                                    Don't know what "ultra quality" looks like yet; we'll find out soon enough. But it does look like AMD has to go through their "DLSS 1.0" moment before they get their "DLSS 2.0" act together eventually down the road.

                                                                    Comment


                                                                      #35
                                                                      Originally posted by bill dennison View Post
                                                                      i give it a year, two tops, till it is close enough that game devs won't want to spend the extra time and money programming different code just for NV RTX, when one standard works on the new game consoles and on PC for both AMD and NV

                                                                      oh i'm sure NV will pour cash on a game dev here and there but it won't last


                                                                      then RDNA 3 chiplets in a year & RDNA 4 in 2.5 to 3 years may well make it not needed much in about the same time frame
                                                                      Oh Bill.. you would want something that matches console quality rather than exceeds it.
                                                                      Originally posted by curio
                                                                      Eat this protein bar, for it is of my body. And drink this creatine shake, for it is my blood.
                                                                      "If you can't handle me when I'm bulking, you don't deserve me when I'm cut." -- Marilyn Monbroe

                                                                      Comment


                                                                        #36
                                                                        Originally posted by Exposed View Post
                                                                        Here's a full resolution image of "quality mode" on a 1060.

                                                                        Unfortunately, this doesn't look too promising. The right side of the screen, where quality mode is enabled, is noticeably blurrier, especially in terms of texture detail, like by the columns. Was DLSS 1.0 this bad? I can't remember, but it was rightfully slammed for its blurriness, especially in BF5.

                                                                        Don't know what "ultra quality" looks like yet; we'll find out soon enough. But it does look like AMD has to go through their "DLSS 1.0" moment before they get their "DLSS 2.0" act together eventually down the road.

                                                                        Are you sure it isn't the left side that has quality enabled? Seriously, I admit my eyes are older and tired, but the portion of the image on the left looks better to me. Detail on the walkway stones... actually... everything.

                                                                        To make a point, not start an argument, I would say new consoles easily beat that in games I've seen.
                                                                        Last edited by Lazy8s; Jun 2, 2021, 02:38 PM. Reason: spelling

                                                                        Comment


                                                                          #37
                                                                          Yeah, where I see the image distortion on the floor on the right, the vegetation on the left looks like garbage. Hopefully folks will get access soon and be able to give reactions in motion.
                                                                          Last edited by Gandalfthewhite; Jun 2, 2021, 02:33 PM.
                                                                          Main rig: look at system spec tab
                                                                          Storage Server: Dual AMD Opteron 6120 CPUs, 64Gigs ECC Ram 50TB usable space across 3 zfs2 pools


                                                                          HOURGLASS = most appropriate named ICON/CURSOR in the Windows world :-)

                                                                          In a dank corner of ATI central, the carpet covered with corn flakes, the faint sound of clicking can be heard........Click......click, click............as the fate of the graphics world and the future of the human race hangs in the balance.

                                                                          I know....I know........Keep my day job :-)- catcather

                                                                          Comment


                                                                            #38
                                                                            I bet that blurriness would be nicely cleaned up with Radeon Image Sharpening. nVidia has a similar driver-side feature, doesn't it?
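
                                                                            (For the curious: driver-side sharpening is basically a local, contrast-aware unsharp mask. The snippet below is a rough single-pass sketch of that idea in Python/NumPy; it is not the actual FidelityFX CAS shader, and the strength parameter and border handling are simplifications.)

[code]
import numpy as np

def adaptive_sharpen(img, strength=0.5):
    """Rough contrast-adaptive sharpening sketch: push each pixel away from
    the average of its 4 neighbours (an unsharp mask), but scale the boost
    down where local contrast is already high so hard edges don't ring.
    np.roll wraps at the image borders, which is fine for a sketch."""
    up    = np.roll(img,  1, axis=0)
    down  = np.roll(img, -1, axis=0)
    left  = np.roll(img,  1, axis=1)
    right = np.roll(img, -1, axis=1)
    blur = (up + down + left + right) / 4.0
    local_min = np.minimum.reduce([img, up, down, left, right])
    local_max = np.maximum.reduce([img, up, down, left, right])
    contrast = local_max - local_min            # 0 = flat area, 1 = hard edge
    amount = strength * (1.0 - contrast)        # back off on strong edges
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# e.g. sharpen an upscaled frame with values in [0, 1]
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
sharpened = adaptive_sharpen(frame, strength=0.6)
[/code]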
                                                                            -Trunks0
                                                                            not speaking for all and if I am wrong I never said it.
                                                                            (plz note that is meant as a joke)


                                                                            System:
                                                                            Asus TUF Gaming X570-Pro - AMD Ryzen 7 5800x - Noctua NH-D15S chromax.Black - 32gb of G.Skill Trident Z NEO - Asus DRW-24F1ST DVD±RW - Samsung 850 Evo 250Gib - 4TiB Seagate - PowerColor RedDevil Radeon RX 7900XTX - Creative AE-5 Plus - Windows 10 64-bit

                                                                            Comment


                                                                              #39
                                                                              Originally posted by Nunz View Post
                                                                              Oh Bill.. you would want something that matches console quality rather than exceeds it.
                                                                              i want it off, and the power to do RT on both AMD & NV at 4k without any kind of DLSS

                                                                              but i think AMD will make sure the PC looks better than the new consoles as they will have more power to do so

                                                                              .....

                                                                              as for this, you can't tell from still pictures on the net

                                                                              you need to see it live; DLSS 2.0 looks great in still pictures but drops some in motion

                                                                              Comment


                                                                                #40
                                                                                4K and lower resolutions with just native rendering and no anti-aliasing component also suffer when in motion, Bill.
                                                                                Really enjoy 3d gaming flexibility; a gamer's best friend!

                                                                                Comment
