• narc0tic_bird@lemm.ee · 1 year ago

      Nvidia isn’t the only game in town. AMD (and to an extent Intel) usually offer much better value at these mid-range (and dare I say “low-end” at like $200) price points.

      And while Nvidia probably still sells more GPUs than AMD (for whatever reason there are actually people out there buying 4060 (Ti) cards), it’s not like AMD doesn’t sell any cards. The 7800 XT was priced very well from AMD’s standpoint because it sat right at the edge of what people considered solid price-to-performance. It probably sold and still sells quite well.

      • QuantumSparkles@sh.itjust.works · 1 year ago

        Can anyone give me a suggestion for what cards I should be looking at to get a little better than PS5 graphics without breaking the bank? It’s been a while since I worked on my last PC and I’m really lost these days.

        • sugar_in_your_tea@sh.itjust.works · 1 year ago

          This article claims your baseline should be:

          • RX 6600XT - I have the 6650XT and I think this is fair
          • RX 7600
          • A750
          • RTX 3060
          • RTX 2070 Super - from this LTT forum post

          Those should all be about as good or a little better than the PS5.

          That said, your mileage may vary because console games may be better tuned for the console’s hardware than PC games are for equivalent PC hardware. So maybe go up a step to be safe. If you want ray tracing, go NVIDIA; otherwise AMD or Intel will probably offer better value.

          I paid a little over $200 for my RX 6650XT, so expect to pay $200-300 to match or slightly exceed the PS5.

        • narc0tic_bird@lemm.ee · 1 year ago (edited)

          While current Nvidia cards are certainly more efficient, RDNA3 still improves efficiency over RDNA2, which itself was actually more efficient than Ampere (mostly due to Ampere being based on the Samsung 8nm process).

          A 7800 XT is more efficient than both a 6800 XT and an RTX 3080, with the RTX 4070 being the most efficient in this performance ballpark.

          I feel like you’re blowing this way out of proportion.

          • lowleveldata@programming.dev · 1 year ago

            What is the right proportion? The 7800 XT uses 25% more power than the 4070 (250 W vs. 200 W). That seems significant to me.

            • sugar_in_your_tea@sh.itjust.works · 1 year ago

              Are you measuring power actually used, or are you just looking at TDP figures on the marketing material? You can’t directly compare those marketing numbers on products from different gens, much less different companies.

              To really understand what’s going on, you need to look at something like watts per frame.
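              To make that concrete, here’s a minimal sketch of what I mean by watts per frame. The FPS and power numbers below are made-up placeholders, not measurements, so plug in averages from an actual benchmark run:

              ```python
              # Compare GPU efficiency as energy per frame instead of raw TDP.
              # All numbers here are illustrative placeholders, not real measurements.
              cards = {
                  "Card A": {"avg_fps": 100.0, "avg_board_power_w": 250.0},
                  "Card B": {"avg_fps": 95.0, "avg_board_power_w": 200.0},
              }

              for name, d in cards.items():
                  # watts / (frames per second) = joules per frame
                  joules_per_frame = d["avg_board_power_w"] / d["avg_fps"]
                  fps_per_watt = d["avg_fps"] / d["avg_board_power_w"]
                  print(f"{name}: {joules_per_frame:.2f} J/frame, {fps_per_watt:.3f} FPS/W")
              ```

              Dividing measured board power by measured frame rate puts cards from different vendors and generations on the same footing, which the sticker wattage alone doesn’t.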

              • DavLemmyHav@lemmy.blahaj.zone · 1 year ago

                The numbers here are the maximum number of watts used, if I recall correctly. So most of the time when you’re gaming it’s probably gonna be close to those numbers.

                • sugar_in_your_tea@sh.itjust.works · 1 year ago

                  No, it’s TDP, like with CPUs. So a 200W GPU needs a cooler rated to dissipate 200W worth of thermal load (and that’s not scientific; AMD and NVIDIA do it differently). The actual power usage can be higher than that under full load, and it could be lower during normal, sustained usage.

                  So the wattage rating doesn’t really tell you much about expected power usage unless you’re comparing two products from the same product line (e.g. RX 6600 and 6700), and sometimes between generations from the same company (e.g. 6600 and 7600), and even then it’s just a rough idea.
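                  If you want to see the gap between the rating and the actual draw yourself, a rough sketch like this works (assuming an NVIDIA card with nvidia-smi available; AMD exposes similar readings through hwmon/sensors). Run it while a game is going and compare against the card’s TDP:

                  ```python
                  import subprocess
                  import time

                  # Poll actual GPU board power (watts) via nvidia-smi, which reports
                  # measured draw rather than the rated TDP.
                  def sample_power_draw(seconds=60, interval=1.0):
                      samples = []
                      for _ in range(int(seconds / interval)):
                          out = subprocess.run(
                              ["nvidia-smi", "--query-gpu=power.draw",
                               "--format=csv,noheader,nounits"],
                              capture_output=True, text=True, check=True,
                          ).stdout.strip()
                          samples.append(float(out.splitlines()[0]))  # first GPU only
                          time.sleep(interval)
                      return samples

                  if __name__ == "__main__":
                      s = sample_power_draw(seconds=30)
                      print(f"avg {sum(s) / len(s):.0f} W, peak {max(s):.0f} W over {len(s)} samples")
                  ```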

            • narc0tic_bird@lemm.ee · 1 year ago

              You think a 50 watt difference will noticeably heat up your room? You must have a tiny room then, or the difference will hardly be measurable.
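              For a rough sense of scale, this is the back-of-envelope I’d do. Every input below (room size, insulation) is an assumption, so swap in your own numbers:

              ```python
              # Back-of-envelope only; room size and heat-loss coefficient are assumptions.
              extra_power_w = 50.0                              # difference between the two cards
              room_volume_m3 = 4 * 4 * 2.5                      # assumed room size
              air_mass_kg = room_volume_m3 * 1.2                # air density ~1.2 kg/m^3
              air_heat_capacity_j_per_k = air_mass_kg * 1005.0  # c_p of air
              room_loss_w_per_k = 50.0                          # assumed leakage through walls/ventilation

              # Steady state: extra heat in equals extra heat leaking out
              steady_state_rise_k = extra_power_w / room_loss_w_per_k
              print(f"steady-state rise: ~{steady_state_rise_k:.1f} K")

              # Upper bound if the room were perfectly sealed and insulated (it isn't)
              sealed_rise_k_per_hour = extra_power_w * 3600 / air_heat_capacity_j_per_k
              print(f"sealed-room upper bound: ~{sealed_rise_k_per_hour:.1f} K per hour")
              ```

              The sealed-room figure is an upper bound since real rooms leak heat, and either way it only applies while the card is actually under full load.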

    • Coelacanth@feddit.nu · 1 year ago

      That’s more or less true. NVIDIA knows they’re holding aces with DLSS + Frame Gen, which is just strictly superior to FSR, so they’ll probably try to bully the market into accepting current pricing. Better ray tracing performance on NVIDIA cards might also be a factor if we start seeing more and more games where it really makes a difference, like Alan Wake 2.

      What they end up doing with the rumoured Super series coming next year will be a good indication of where we’re at, I think.