I am probably unqualified to speak about this, as I am using a low-profile RX 550 and a 768p monitor and almost never play newer titles, but I want to kickstart a discussion, so hear me out.

The push for more realistic graphics has been going on for longer than most of us can remember, and it made sense for most of that time, as anyone who has looked at an older game can confirm - I am someone who has fun making fun of weird-looking 3D people.

But I feel games’ graphics have reached the point of diminishing returns. Today’s AAA studios spend millions of dollars just to match the graphical level of their previous titles - often sacrificing other, more important things along the way - and people are unnecessarily spending lots of money on power-hungry, heat-generating GPUs.

I understand getting an expensive GPU for high-resolution, high-refresh-rate gaming, but for 1080p? You shouldn’t need anything more powerful than a 1080 Ti for years. I think game studios should just slow down their graphical improvements, as they are unnecessary - in my opinion - and only prevent people with lower-end systems from enjoying games. Who knows, maybe we will start seeing 50-watt gaming GPUs that are viable, capable of running games at medium/high settings, and going for cheap - even iGPUs render good graphics now.

TL;DR: why pay more and hurt the environment with higher power consumption when what we have is enough - and possibly overkill?

Note: it would be insane of me to claim that there is not a big difference between the two pictures - Tomb Raider (2013) vs. Shadow of the Tomb Raider (2018) - but can you really call either of them bad, especially the one on the right (5 years old)?

Note 2: this is not much more than a discussion starter and is unlikely to evolve into something larger.

  • MentalEdge@sopuli.xyz · 1 year ago

    Shadow can definitely look a lot better than this picture suggests.

    The biggest advancements in game graphics have not happened in characters, except perhaps in animation and subsurface scattering tech.

    The main character always gets a disproportionate graphical resource allocation, and we achieved “really damn good” in that category a while ago.

    Adam Jensen didn’t look that much better in Mankind Divided than he did in Human Revolution, but Prague IS SO MUCH MORE DETAILED than Detroit was.

    Then there are efficiency improvements in rendering brought by systems like Nanite, material shader improvements, more detailed lighting systems, and more efficient ambient occlusion.

    Improvements in inverse kinematics are something I’m really excited about as well.

    • Doods@infosec.pub (OP) · 1 year ago

      There are two different ways in which I managed to interpret this; care to elaborate further?

      • kemsat@lemmy.villa-straylight.social · 1 year ago

        I used to have a top of the line GPU, and when I did, I would have agreed with you. Now that I don’t, I can see the difference between my experience & the videos & screenshots I see online.

        So yeah, once I have a newer card that can handle the pretty games I like, I’ll start agreeing with you. Probably because I don’t want to have to upgrade the GPU eventually lol

        • Doods@infosec.pub (OP) · 1 year ago

          I am pretty sure we just want to run the games we want at ultra without fearing stutters, even if said games have PS3-level graphics.

  • Callie@pawb.social · 1 year ago

    I’ve disliked realistic art styles for quite a while now. In the short term the games look beautiful, but in the long term they’ll look dated. I’d much prefer a game having its own look and style, something that says “yeah, this is X game” just from a screen cap.

    Look at Jet Set Radio, Okami, and Minecraft, just to name a few; they’re easily identifiable because they have their own style.

  • sculd@beehaw.org · 1 year ago

    Pushing for even more realistic graphics will drive development costs even higher with no significant change in how much players enjoy the games.

    Players enjoyed games back when we had Super Nintendos and DOS games. Those actually gave players more room for imagination.

  • LanAkou@lemm.ee · 1 year ago

    Is it diminishing returns? Yes, of course.

    Is it taxing on your GPU? Absolutely.

    But, consider Control.

    Control is a game made by the people who made Alan Wake. It’s a fun AAA title that is better than it has any right to be. Packed with content. No microtransactions. It has it all. The only reason it’s as good as it is? Nvidia paid them a shitload of money to put raytracing in their game to advertise the new (at the time) 20 series cards. Control made money before it even released thanks to GPU manufacturers.

    Would the game be as good if it didn’t have raytracing? Well, technically yes. You can play it without raytracing and it plays the same. But it wouldn’t be as good if Nvidia hadn’t paid them, and that means raytracing has to be included.

    A lot of these big-budget AAA “photorealism” games for PC are funded, at least partially, by Nvidia or AMD. They’re the games you’ll get for free if you buy their new GPU that month. Consoles are the same way. Did Bloodborne need to have shiny blood effects? Did Spider-Man need to look better than real-life New York? No, but these games are made to sell hardware, and the tradeoff is that the games don’t have to make piles of money (even if some choose to include mtx anyway).

    Until GPU manufacturers can find something else to strive for, I think we’ll be seeing these incremental increases in graphical fidelity, to our benefit.