• narc0tic_bird@lemm.ee · 1 year ago

      Nvidia isn’t the only game in town. AMD (and to an extent Intel) usually offers much better value at these mid-range (and dare I say “low-end”, at around $200) price points.

      And while Nvidia probably still sells more GPUs than AMD (for whatever reason, there are actually people out there buying 4060 (Ti) cards), it’s not like AMD doesn’t sell any cards. The 7800 XT was priced very well from AMD’s standpoint because it was right at the edge of what people considered solid price-to-performance. It probably sold, and still sells, quite well.

      • QuantumSparkles@sh.itjust.works · 1 year ago

        Can anyone give me a suggestion for what cards I should be looking at to get a little better than PS5 graphics without breaking the bank? It’s been a while since I built my last PC and I’m really lost these days.

        • sugar_in_your_tea@sh.itjust.works · 1 year ago

          This article claims your baseline should be:

          • RX 6600XT - I have the 6650XT and I think this is fair
          • RX 7600
          • A750
          • RTX 3060
          • RTX 2070 Super - from this LTT forum post

          Those should all be about as good or a little better than the PS5.

          That said, your mileage may vary because console games may be better tuned for the console’s hardware than their PC ports are, even if the hardware is equivalent. So maybe go up a step to be safe. If you want ray tracing, go NVIDIA; otherwise AMD or Intel will probably offer better value.

          I paid a little over $200 for my RX 6650XT, so expect to pay $200-300 to match or slightly exceed the PS5.

          • narc0tic_bird@lemm.ee · edited · 1 year ago

          While current Nvidia cards are certainly more efficient, RDNA3 still improves efficiency over RDNA2, which itself was actually more efficient than Ampere (mostly due to Ampere being based on the Samsung 8nm process).

          A 7800 XT is more efficient than both a 6800 XT and an RTX 3080, with the RTX 4070 being the most efficient in this performance ballpark.

          I feel like you’re blowing this way out of proportion.

            • lowleveldata@programming.dev · 1 year ago

            What is the right proportion? The 7800 XT uses 25% more power than the 4070 (250W vs. 200W). That seems significant to me.

              • sugar_in_your_tea@sh.itjust.works · 1 year ago

              Are you measuring power actually used, or are you just looking at TDP figures on the marketing material? You can’t directly compare those marketing numbers on products from different gens, much less different companies.

              To really understand what’s going on, you need to look at something like watts per frame.
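
              For example, a crude joules-per-frame comparison (all numbers below are made up for illustration, not measured benchmarks):

```python
def joules_per_frame(avg_power_watts: float, avg_fps: float) -> float:
    """Efficiency as energy per rendered frame: watts / fps = joules per frame."""
    return avg_power_watts / avg_fps

# Hypothetical cards: raw power draw alone is misleading.
card_a = joules_per_frame(avg_power_watts=250, avg_fps=110)  # higher draw, but faster
card_b = joules_per_frame(avg_power_watts=200, avg_fps=80)   # lower draw, but slower
print(f"A: {card_a:.2f} J/frame, B: {card_b:.2f} J/frame")  # A: 2.27, B: 2.50
```

              Despite drawing 25% more power, hypothetical card A does more work per joule, which is why per-frame efficiency in a real benchmark tells you more than the number on the box.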

                • DavLemmyHav@lemmy.blahaj.zone · 1 year ago

                The numbers here are the maximum wattage used, if I recall correctly. So most of the time when you’re gaming, it’s probably going to be close to those numbers.

                  • sugar_in_your_tea@sh.itjust.works · 1 year ago

                  No, it’s TDP, like with CPUs. So a 200W GPU needs a cooler rated to dissipate 200W worth of thermal load (and even that isn’t standardized; AMD and NVIDIA calculate it differently). The actual power usage can be higher than that under full load, and it can be lower during normal, sustained usage.

                  So the wattage rating doesn’t really tell you much about expected power usage unless you’re comparing two products from the same product line (e.g. RX 6600 and 6700), and sometimes between generations from the same company (e.g. 6600 and 7600), and even then it’s just a rough idea.

              • narc0tic_bird@lemm.ee · 1 year ago

              You think a 50-watt difference will noticeably heat up your room? Either you have a tiny room, or the difference will hardly be measurable.
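
              For scale, here’s a worst-case sealed-box estimate (room size and constants are assumptions for illustration; real rooms shed most of this heat through walls and ventilation, so the actual temperature difference is far smaller):

```python
AIR_DENSITY_KG_M3 = 1.2       # approximate density of air at room temperature
AIR_HEAT_CAPACITY_J = 1005.0  # specific heat of air, J/(kg*K)

def sealed_room_temp_rise_k(extra_watts: float, hours: float, room_m3: float) -> float:
    """Upper-bound temperature rise if ALL extra heat stayed in the room air."""
    energy_j = extra_watts * hours * 3600
    air_mass_kg = room_m3 * AIR_DENSITY_KG_M3
    return energy_j / (air_mass_kg * AIR_HEAT_CAPACITY_J)

# 50 W of extra draw over a 4-hour session in an assumed 30 m^3 room:
rise = sealed_room_temp_rise_k(extra_watts=50, hours=4, room_m3=30)
print(f"{rise:.1f} K")  # 19.9 K with zero heat loss; real rooms lose most of it
```

              Even this no-loss bound builds up slowly (about 5 K per hour here), and in practice 50 W is roughly one dim incandescent bulb of extra heat.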

    • Coelacanth@feddit.nu · 1 year ago

      That’s more or less true. NVIDIA knows they’re holding aces with DLSS + Frame Generation, which is strictly superior to FSR, so they’ll probably try to bully the market into accepting current pricing. Better ray tracing performance on NVIDIA cards might also become a factor if we start seeing more and more games where it really makes a difference, like Alan Wake 2.

      What they end up doing with the rumoured Super series coming next year will be a good indication of where we’re at I think.

  • narc0tic_bird@lemm.ee · 1 year ago

    Probably because pricing in the mid-range is at least somewhat okay-ish again.

    This, and people being desperate for new GPUs at some point because system requirements for newer games have skyrocketed. People are probably finally upgrading from their trusty old GTX 1060 because many new games barely even run on low settings anymore.

    • M500@lemmy.ml · 1 year ago

      I’m at the point where if it doesn’t run on my steam deck then I’m not playing it.

      I think the developers of games coming out now didn’t anticipate how big the handheld market would be.

      I think we’ll see future AAA games include graphics options that ensure they run on handhelds.

      • Jrockwar@feddit.uk · 1 year ago

        I’ve gone as far as to “downgrade” my desktop computer to a combination of a MacBook Pro and a Steam Deck. The MacBook is heaps faster for any workload other than gaming, so now my most powerful computer fits in my backpack. The Steam Deck is such a joy to play on, and thanks to the microSD slot I don’t have to worry about disk space requirements anymore. Yes, it’s not as fast in terms of raw performance, but I don’t care. I can now play in bed, on the sofa, or in the garden. If it doesn’t run on the Deck, I don’t care for it. I already have way too many games I haven’t finished.

        • M500@lemmy.ml · 1 year ago

          I’d say I’m in a similar situation but my desktop was hardly more powerful than the steam deck 😂

          Now my desktop is just a general use computer.

      • taladar@sh.itjust.works · 1 year ago

        Not so sure about that. AAA games have been notoriously bad about cross-platform support, which almost every single indie developer offers at this point. They seem very inflexible.

    • retrieval4558@mander.xyz · 1 year ago

      I’ve got a 1660 Super that I paid WAY too much for like 3 years ago. It’s still definitely adequate, but I do get the itch to upgrade…

      • SaltySalamander@kbin.social · 1 year ago

        This is why I never buy mid-range. The 1080 Ti I bought 6 years ago is only now really showing its age. Buy top or near-top-end, and that itch holds off quite a bit longer.

    • sugar_in_your_tea@sh.itjust.works · 1 year ago

      Yeah, I finally upgraded when midrange AMD GPUs returned to $200-300 (6700XT and 6650XT for my wife and me). That same class of card was easily twice the price just two years ago.

  • AutoTL;DR@lemmings.world [bot] · 1 year ago

    This is the best summary I could come up with:


    According to the GPU sales analysts at Jon Peddie Research, things may finally be evening out.

    Its data shows that GPU shipments have returned to quarter-over-quarter and year-over-year growth after two years of shrinking sales.

    This is the second consecutive quarter this has happened, which “strongly indicates that things are finally on the upswing for the graphics industry.”

    When comparing year-over-year numbers, the biggest difference is that Nvidia, AMD, and Intel all have current-generation GPUs available in the $200–$300 range, including the GeForce RTX 4060, the Radeon RX 7600, and the Arc A770 and A750, all of which were either unavailable or newly launched in Q3 of 2022.

    JPR warned against reading too much into the sales increase, noting that it “largely reflects a cleaning out and straightening up of the distribution channel.”

    In other words, supply and demand are syncing back up, but the overall market for PCs and the components that go in them is still expected to continue its gradual decline.


    The original article contains 491 words; the summary contains 164 words (67% shorter).