• Blackmist@feddit.uk · 1 year ago

      The triple whammy of the semiconductor shortage, the pandemic, and cryptocunts has really fucked PC gaming for a generation. Prices are way out of line with the capabilities you get compared to a PS5.

      I’m still on a 1060 for my PC, and it’s only my G-Sync monitor that saves it. Variable refresh rate is genuinely great for all PC games tbh. You don’t have to frig about with settings as much just because Opening Bare Area runs at 60fps while the later Hall of a Million Alpha Effects runs at 30. You just let it rip between 40 and 80: no tearing, and fairly even frame pacing. The old “is this game looking as good as it can on my hardware while still playing smoothly?” question goes away, because you simply get extra frames instead, and knock the whole thing down one notch when it gets too bad. I’m spending more time playing and less time tweaking, and that can only be a good thing.
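
      To put rough numbers on it, here’s a toy sketch (hypothetical frame times, not measured from any real game) of why pacing stays even with VRR where fixed 60Hz vsync would snap every frame to 16.7 or 33.3 ms:

      ```python
      # Toy model: under fixed vsync a finished frame waits for the next
      # refresh tick, so presentation times snap to multiples of the refresh
      # interval. Under VRR the display refreshes the moment a frame is ready.
      import math

      VSYNC_HZ = 60
      interval_ms = 1000 / VSYNC_HZ  # ~16.7 ms per refresh at 60 Hz

      render_times_ms = [12.5, 18.0, 22.0, 25.0, 14.0]  # hypothetical frame costs

      for t in render_times_ms:
          vsync_ms = math.ceil(t / interval_ms) * interval_ms  # snaps to 16.7/33.3
          print(f"render {t:4.1f} ms -> vsync shows {vsync_ms:4.1f} ms, VRR shows {t:4.1f} ms")
      ```

      Every frame that takes even slightly over 16.7 ms becomes a full 33.3 ms under vsync (an effective 30fps), while VRR just shows it a few milliseconds later.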

  • LetMeEatCake@lemm.ee · 1 year ago

    GPU prices being affordable is definitely not a priority of AMD’s. They price everything to be barely competitive with the Nvidia equivalent: 10–15% cheaper for comparable raster performance, but with far worse RT performance and no DLSS.

    Which is odd, because back when AMD was at a similar performance deficit on the CPU front (Zen 1, Zen+, and Zen 2), they had absolutely no qualms or (public) reservations about pricing their CPUs where they needed to be. They were the value kings on that front, which is exactly what they needed to be at the time. They need the same thing with GPUs and just refuse to go there; they follow Nvidia’s pricing lead.

      • justsomeguy345@lemmy.ml · 1 year ago

        Something many people overlook is how intertwined Nvidia, Intel, and AMD are. Not only does personnel routinely move between those companies, but they also have the same top shareholders. There’s no natural competition between them. It’s like a choreographed lightsaber fight where all of them are swinging but none seem to have any intention of hitting flesh: a show to make sure nobody says the m word.

      • LetMeEatCake@lemm.ee · 1 year ago

        I agree, and it’s strange from a business perspective too. Obviously the people in charge of AMD feel this is the correct course of action, but they’ve been losing ground in the GPU space for years and years. At least to an outside observer, this approach is not serving them well in GPUs. Pricing more aggressively today would hurt their margins temporarily, but in such a mindshare-dominated market they need to start growing their marketshare early. They need people to use their shit and realize it’s fine. They did it with CPUs…

  • Redderthanmisty@lemmygrad.ml · edited · 1 year ago

    AMD’s your friend now, but they’re only undercutting NVIDIA like this to get on top of the market. Once they’ve done that, it will be NVIDIA doing the undercutting, and AMD will be the one clamping down and exploiting their position.

    It has happened time and time again.

    Don’t simp for corporations. They’ll never return the favour.

  • ciko22i3@sopuli.xyz · 1 year ago

    I recently bought a used Titan Xp and found out it doesn’t support DLSS, while the much weaker RTX 2060, only two years newer, does.

  • phoenixz@lemmy.ca · 1 year ago

    I’ll never go for Nvidia ever again.

    I’ve been a Linux-only user for over twenty years now, and Nvidia is the fucking devil. Their drivers range in quality anywhere from “ugh” to “wtf!”, and my current Nvidia card (it’s a loan) gives me continuous screen artifacts and KWin (KDE’s window manager) crashes. AMD drivers just work.

  • Alto@kbin.social · 1 year ago

    Corporations aren’t your friend.

    My rig is full AMD (5800X/5700 XT), but that’s purely because they happened to be the better value at the time. The second they get a lead in the consumer GPU market (which they likely will, since Nvidia simply doesn’t care about it versus the ML market now), prices are going to rise again.

    And don’t pretend that these prices are anything resembling affordable. That would be when you could get a legitimately mid-range card for ~$150 (RX 580).

  • Mojo@ttrpg.network · 1 year ago

    Have I just had bad luck with my AMD products?
    I’ve had four Nvidia GPU/Intel CPU computers with no issues.
    I’ve had three AMD GPU/AMD CPU computers, and they have all been loud, hot, and slightly unstable. A bit cheaper, sure, but I’d rather have a silent and stable experience.
    This has made me see AMD as inferior low-budget crap. But maybe I have just bought from the wrong manufacturer or something.

    • OADINC@feddit.nl · 1 year ago

      I can’t speak to older stuff, but my Ryzen 5 5600X and RX 6800 have been great. I’ve had this PC for a year now and have only had the GPU drivers crash twice. That’s about on par with my older GTX 1070.

  • gerryflap@feddit.nl · 1 year ago

    My problem when buying my last GPU was that AMD’s answer to CUDA, ROCm, was just miles behind and not really supported on their consumer GPUs. From what I see now, that has changed for the better, but it’s still hard to trust when CUDA is so dominant and mature. I don’t want to reward Nvidia, but I want to use my GPU for some deep learning projects too, and I don’t really have a choice at the moment.
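
    If it helps anyone weighing the ROCm side: on the ROCm builds of PyTorch, AMD GPUs are exposed through the same torch.cuda namespace, so a lot of CUDA-targeted code runs unchanged. A minimal sanity check (a sketch assuming a CUDA or ROCm build of PyTorch is installed):

    ```python
    # Report which GPU backend this PyTorch build targets and whether a GPU
    # is visible. On ROCm builds, AMD GPUs still appear under torch.cuda.
    import torch

    print("torch:", torch.__version__)
    print("ROCm/HIP build:", torch.version.hip is not None)
    print("GPU available:", torch.cuda.is_available())

    if torch.cuda.is_available():
        device = torch.device("cuda")  # also selects an AMD GPU on ROCm builds
        x = torch.randn(1024, 1024, device=device)
        y = x @ x  # matmul dispatched to CUDA or HIP depending on the build
        print("checksum:", y.sum().item())
    ```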

    • Beefalo@midwest.social · 1 year ago

      I’ve become more and more convinced that considerations like yours, which I do not understand since I don’t rely on GPUs professionally, have been the main driver of Nvidia’s market share. It makes sense.

      The online gamer talk is that people just buy Nvidia for no good reason; that it’s internet guys refusing to do any real research because they only want a reason to stroke their own egos. But the gamer-based GPU market is a loud minority whose video games don’t seem to rely heavily on any particular card features for decent performance or compatibility. Hence the constant idea that people “buy Nvidia for no good reason except marketing”.

      But if AMD cards can’t really handle things like machine learning, then obviously that is a HUGE deficiency. The public probably isn’t certain of its needs when it spends $400 on a graphics card; it just notices that serious users choose Nvidia for some reason. So the public buys Nvidia, just in case. Maybe they want to do something they haven’t thought of yet. I guess they’re right. The card also plays games pretty well, if that’s all they ever do.

      If you KNOW for certain that you just want to play games, then yeah, the AMD card offers a lot of bang for your buck. People aren’t that certain when they assemble a system, though, or when they buy a pre-built. I would venture that the average shopper at least entertains the idea of doing some light video editing; the use case feels inevitable for the modern PC owner. So already they’re worrying about possible compatibility issues with software they haven’t bought yet. I’ve heard a lot of stories like yours, and so have they. I’ve never heard the reverse; never heard somebody say they’d like to try Nvidia but they need AMD. So everyone tends to buy Nvidia.

      The people dropping the ball are the reviewers, who should be putting a LOT more emphasis on use cases like yours. They’re pouring money into labs for exhaustive testing of cooling fans, for fuck’s sake, while running the same old gaming benchmarks as if that’s the only thing anyone will ever do with the most expensive component in a modern PC.

      I’ve also heard of software that simply does not work without CUDA (a typical guard is sketched at the end of this comment). Those differences between cards should be tested and the results made public. The hardware journalism scene needs to stop focusing so hard on damned video games and start covering all the software where Nvidia vs AMD really does make a difference; maybe it would force AMD to step up its game. At the very least, the gamebros would stop acting like people buy Nvidia cards for no reason except some sort of weird flex.

      No, dummy, AMD can’t run a lot of important shit that you don’t care about. There’s more to this than the FPS count on Shadow of the Tomb Raider.
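
      To make that concrete, here’s a hypothetical but typical example of the kind of guard such software ships with; on an AMD card without a working ROCm setup, the whole application refuses to start:

      ```python
      # Hypothetical hard CUDA requirement: the tool bails out entirely
      # if PyTorch cannot see a CUDA-compatible device.
      import torch

      if not torch.cuda.is_available():
          raise RuntimeError("This application requires a CUDA-capable NVIDIA GPU.")

      device = torch.device("cuda")  # everything downstream assumes this exists
      ```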

  • Moubai@lemmy.ml · 1 year ago

    I can’t encode my video with an AMD GPU; this is why I stay with Nvidia and its NVENC. When AMD offers this kind of capability, maybe I will change my GPU.
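
    For context, the workflow in question is basically a one-liner around ffmpeg; a minimal sketch (assuming ffmpeg is on PATH and was built with the relevant hardware encoders; the AMD-side encoder names are noted in the comments, with the usual caveats about setup and quality):

    ```python
    # Hardware H.264 transcode via ffmpeg's vendor encoders.
    # "h264_nvenc" = Nvidia NVENC; "h264_amf" = AMD on Windows;
    # "h264_vaapi" = AMD/Intel on Linux (needs extra device/format
    # flags not shown here).
    import subprocess

    def hw_encode(src: str, dst: str, encoder: str = "h264_nvenc") -> None:
        subprocess.run(
            ["ffmpeg", "-y", "-i", src, "-c:v", encoder, "-b:v", "6M", dst],
            check=True,
        )

    hw_encode("input.mp4", "output.mp4")  # NVENC path; swap encoder to compare
    ```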

  • beq@feddit.ch · 1 year ago

    I have read many of the comments in the thread, but there is a very basic question I hope someone can help me with: what does the OP even mean?

    I know what AMD is and what they do, but “taking W’s”? And “giving them away”?

    • jsdz@lemmy.ml · 1 year ago

      “W” is a letter often used to represent a “win”, which I assume is what’s meant here, since winning is what AMD has been doing.

  • Brkdncr@artemis.camp · 1 year ago

    AMD has been a shitshow of a company since its beginning. Don’t believe they wouldn’t be gouging if they could.