I’ve been wondering about this recently. I grew up on Atari/NES/SNES, and of course almost all of those games (pretty sure all of them) were written in assembly and are rock solid, smooth, and responsive for the most part. I wonder if this has affected how I can’t stand to play badly optimized games with even a hint of a laggy feel. I’ve always been drawn to Quake and CS for that reason: damn smooth. And no, it doesn’t just need to be FPS games either. I can’t play Beat Saber with even a modicum of lag or I suck massively, but others can play just fine and not even notice the lag.

It’s odd. I feel like a complainer, but maybe I just notice it more easily than others?

  • bigmclargehuge@lemmy.world · +1 · 3 hours ago

    As someone in my 20s who grew up on Windows XP era games, then lots of PS3 games, I’m very attuned to latency. My computer was lower mid-tier at best, and the performance standards for console games were nowhere near what they are today, so the first time I played a game on a high-performance machine at 100+ FPS with a high refresh rate, it was like seeing color for the first time.

  • ThunderComplex@lemmy.today · +21/-1 · 1 day ago

    No. The thing is, AAA games are now being released in an unoptimized state way too often. Even if you still get good FPS, microstuttering and short lag spikes still occur frequently.

    Of course this can make you wonder if this is a you problem and you just got too sensitive.

    Nope, this is an industry problem. Why would you optimize a game? No, legitimately asking. It doesn’t affect sales numbers, and it often doesn’t significantly tank your Steam review score (which most publishers don’t care about anyway), so there are practically no downsides to not optimizing your game.
    But if you do value optimization, it lowers dev velocity, requires more training/awareness for devs and artists, and you won’t be able to ship as fast anymore. And on top of that you get… nothing. A few more sales, maybe?

    • NuXCOM_90Percent@lemmy.zip · +6 · 23 hours ago

      I’m going to push back on that a fair bit.

      I used to agree it was “optimization” problems. And there are definitely some games/engines with those (I love Team Ninja but… god damn).

      But it is also that mindsets have changed. Most people know the “can it run Crysis?” meme… if only from Jensen. But it was a question for a reason, because Crysis (and other games) genuinely pushed the envelope of what desktop computers could handle. It was an era when you really would put a LOT of effort into figuring out what settings would get you what framerate, and “ultra” was something only the super rich, or people who had JUST built a new computer, could expect to run.

      But around the launch of the PS4/XBONE, that all changed. Consoles were just PCs for all intents and purposes, and basically all games “worth playing” were cross-platform. So rather than taking advantage of the latest nVidia card, or going sicko mode for the people who got the crazy powerful single-thread-performance i7, they just targeted what the consoles could run. So when people did their mid-gen PC upgrades… suddenly “ultra” and “epic” were what we began defaulting to. Just crank that shit up, turn off whatever you don’t like, check your framerate, and go from there.

      The refresh SKU consoles bumped up the baseline but… not all that much, since those games still had to run on a base XBONE. And then we got the PS5/XSEX, which… you know how it is never a good time to build a new PC? It was REALLY not a good time to build a new console, as ray tracing and upscaling/framegen rapidly became the path forward in the hardware/graphics space. But also? Those launched during COVID, so the market share of the previous gen remained very large and all those third parties continued to target the previous gen anyway.

      Which gets back to PC gaming. Could more effort be put in to improve performance? Yeah, definitely. But we are also getting reminded of what things were actually like until the mid-2010s, when you might only play a game on Medium or High, and wanting that new game to be gorgeous was what motivated you to drive down to Best Buy and get a new GPU.

      But instead it is the devs’ fault that we can’t play every game on maxed-out Epic settings at 4K/240Hz… because this generation never knew any different.

      • ThunderComplex@lemmy.today · +3 · 21 hours ago

        I get what you’re trying to say, but I’ve definitely experienced performance problems even on the lowest settings.
        The issue isn’t that everyone tries to run the game maxed out. The issue is that fundamental problems are often left in games that you can’t fix just by lowering quality settings.

        • NuXCOM_90Percent@lemmy.zip · +2 · edited · 21 hours ago

          And there is a reason the +/- (?) buttons literally changed the render window in DOOM and the like. Like… those iconic HUDs were specifically there so that those playing on a 640×480 monitor might actually only have to worry about a 640×360 game, and so forth.

          Same with those of us who played games like Unreal Tournament at 18-24 FPS at 800×600.
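
          To put rough numbers on it (a quick sketch; the resolutions are just the ones mentioned above):

          ```python
          # Fill-rate arithmetic for shrinking the render window.
          full = 640 * 480      # full-screen pixels
          windowed = 640 * 360  # render window behind a chunky HUD
          print(f"pixels saved: {1 - windowed / full:.0%}")  # -> 25%

          # Dropping from 800x600 to 640x480 cuts even more:
          print(f"pixels saved: {1 - (640 * 480) / (800 * 600):.0%}")  # -> 36%
          ```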

          Like I said, there are definitely some problem children (again: Team Ninja). But it is also worth remembering that most games are still targeting a previous gen console SKU at 1080p. And, ironically, the optimizations are going to be geared more towards that.

          Which… is why upscaling is such a big deal. Yeah “AI Upscaling” is a great buzzword. But it really is no different than when we used to run OFP at a lower resolution on the helicopter missions. It is just that now we can get “shockingly good” visuals while doing that rather than thinking Viktor Troska looks extra blocky.

          Like, I’ll always crap on Team Ninja’s PC ports because they are REALLY bad… even if that is my preferred platform. But it took maybe 2 minutes of futzing about (once I got to Yokohama proper and had my game slow to sub 20 FPS…) to get the game to look good and play at a steady 60 FPS. No, it wasn’t at Epic (or whatever they use) but most of the stuff was actually on High. Is it the same as just hitting auto-detect and defaulting to everything maxed out? Of course not. But that gets back to “Can it run Crysis?”

        • IngeniousRocks (They/She) @lemmy.dbzer0.com · +1 · 21 hours ago

          Much of this specifically is devs implementing MSAA, which once upon a time was cheap, efficient, and looked fine. Nowadays, with RT added into the mix, MSAA simply can’t perform well on modern hardware, to the point where even city builders like Cities: Skylines II will crawl at 14-15 FPS on low settings if you haven’t overridden the graphics pipeline to remove MSAA and replace it with an anti-aliasing method that actually works.

  • hondaguy97386@sh.itjust.works · +59/-2 · 1 day ago

    Nostalgia is a hell of a drug. Older games were a laggy mess when there was too much on the screen, not to mention sprites disappearing. The issue, I think, is that we got better and better over the decades until recently. We are just now seeing a backward slide in performance (for many reasons, not just poor optimization).

    • rozodru@piefed.social · +3 · edited · 1 day ago

      Examples: Virtua Racing on the Genesis or Star Fox on the SNES. They were slow and quite laggy. Sure, they were essentially pushing the limits of what the console could do, and in the case of Star Fox it had to have the FX chip in the cartridge, but I wouldn’t call racing around in Virtua Racing on the Genesis a “smooth” experience.

      Other games are like this with loading, too. Mortal Kombat on the Sega CD: you get to the Shang Tsung fight and the game has to load every time he morphs. Other games would also slow to a crawl if there was a lot on the screen. To your point, Ranger X on the Genesis had these little tadpole enemies that could quickly populate the screen; if you didn’t take them out fast, the game would slow down. The same would happen on the PSX with the game Loaded.

    • Nikls94@lemmy.world · +4 · 1 day ago

      I only recently (2 years ago) started to play older games I was interested in but never had the time to play. I even got a 16:9 CRT TV and modded all the original consoles. It totally depends on the game whether it’s a smooth and optimized experience or just an unresponsive mess of code.

      • bridgeenjoyer@sh.itjust.works (OP) · +3 · 22 hours ago

        Yeah, it really does depend on the game, which is obvious, but still. Games that push the hardware are obviously gonna feel laggy.

  • Klear@quokk.au · +22 · edited · 24 hours ago

    The other way around. I grew up playing games on PCs that were quite underpowered, for a long time. I played Doom like this. Hell, I had to reduce the screen size even in Wolfenstein 3D. I loved the fog in GTA San Andreas because it reduced draw distance, and when it was raining in Las Venturas I had to look at my feet like I was speedrunning GoldenEye. I played through Oblivion in a 640×480 window and thought it looked amazing. I still have to fight not to turn off AA completely the first time I run a game on my RTX 3080, because it was the first thing to go for so long.

    All of this trained my brain, so now I have built-in antialiasing and frame generation. I don’t give a shit. Give me good art direction and a solid gameplay loop and I can just generate smooth graphics in my head.

    • ThunderComplex@lemmy.today · +4 · 1 day ago

      I grew up with a super underpowered PC and it influenced my imagination. For a long time, stuff I’d imagine also ran at like 15-20 FPS. Really weird effect.

  • rafoix@lemmy.zip · +7 · 1 day ago

    If you folks want to have a really hard time, find a way to play the NES version of Mike Tyson’s Punch-Out on original hardware with a CRT, and then play it in any emulator on a modern monitor. You will feel like you’ve aged 80 years.

    • invertedspear@lemmy.zip · +1 · 20 hours ago

      I was playing Punch-Out on the Switch the other day, and 100% this. That game was all about proper timing and reaction speed, and all the little latencies add up to make it nearly impossible. I never beat the game as a kid, but I could get to the last fighter (Tyson in my version, Mr. Dream in the non-Tyson version?). Anyway, I can’t even beat the Russian dude who laugh-taunts me on the Switch. I know what to hit and when to hit it, but HDMI lag, upscaling lag, and Bluetooth controller lag all add up to make it nearly impossible to react.

  • jordanlund@lemmy.world · +28 · edited · 1 day ago

    I feel the opposite when I hear people complain about load times… “We want you to buy our SSD so your game will boot in 11 seconds instead of 19 seconds!”

    Son, let me tell you about loading games from cassette tape.

    You’d start it loading, get up and go have dinner with the family. After 30 minutes, maybe it would be done. Maybe.

    Maybe it hit an error 5 minutes after you walked away, and now you need to rewind and try again.

      • Redredme@lemmy.world · +17/-1 · 1 day ago

        Oh, you sweet summer child…

        Up to the 90s, my friend. Then 3.5" floppies took over (1.44 MEGAbytes!), then came Zip disks (100MB) but only for rich people, then it became the era of CD and later DVD burning. Internet speeds were not measured in Mbits back then, and most of the time not even in Kbits. The internet was not a valid delivery system: it was slow and very expensive. Then came the first memory cards (CF) around the millennium, and from there it went on to the 2010s, and around there you got the pivot to what we have now.

        Tape is still around in computing; it’s cheap, it’s cheerful, it’s dependable, and it has quite a throughput. Seeking on it is still horrible, though. But anyway, watching a real mechanised tape library do its thing backing up computer systems is still mesmerizing.

        • FigMcLargeHuge@sh.itjust.works · +2 · edited · 1 day ago

          You left out 5 1/4" floppy disks, which were actually floppy. Yes, I know there were 8" floppies, but those were mostly for business use and specialized drives that you didn’t really get in the home computer market. Atari, Commodore, Radio Shack, etc. all had 5 1/4" floppy drives, and when I got my first box of floppies, it was $50 of early-1980s money for 10 disks. And on my Atari they held about 90K of space each.

      • Acamon@lemmy.world · +9 · 1 day ago

        Not only on tape: some radio shows would transmit computer programs that you could record off the air and use. I know the UK and Finland did this, but I think other European countries did it too.

      • Jrockwar@feddit.uk · +7 · 1 day ago

        The Amstrad, Spectrum, etc. generation had its games on tape. I would say they were the closest thing to a console pre-NES, so 1980s. I had an Amstrad that was handed down to me by a friend of my older sister, and it had tapes like this.

        • Redredme@lemmy.world · +4/-1 · 1 day ago

          Nobody had this; it was way too expensive for what it was. Everybody just kept saving for an MSX or a Commodore and skipped it.

          • jordanlund@lemmy.world · +1 · 1 day ago

            I had one, and I had the tape drive for the Commodore 64 as well.

            The Supercharger back in the day wasn’t that expensive, about $70, or the price of 2 games, because you had to supply your own tape player; the Supercharger just connected to it with a wire.

  • NuXCOM_90Percent@lemmy.zip · +24/-2 · 1 day ago

    I grew up on Atari/NES/SNES, and of course almost all of those games (pretty sure all of them) were written in assembly and are rock solid, smooth, and responsive for the most part.

    HA!

    Older games were laggy as all fuck and had very significant input delay.

    But ignoring the rose-tinted glasses: I DO think there is some element of truth to this. My formative years of online gaming were 56k and an ATI Rage. I probably logged at least a thousand hours of UT at 20-ish FPS, and my ping was regularly in the hundreds. I can definitely appreciate lower-latency games, but I mostly just need VRR (for screen tearing and the like) and I am set. Whereas one of the younglings from work pretty much can’t play anything below 60 FPS… and we have tested this.

    • bridgeenjoyer@sh.itjust.works (OP) · +2 · 22 hours ago

      I’m not certain what input delay you’re referring to. It is likely very dependent on the games I play as well. Of course, some of the older games pushing the hardware to the max were laggy when a lot of sprites etc. were loading.

  • DigDoug@lemmy.world · +19 · 1 day ago

    There are only a few reasons I can surmise that this would be the case:

    CRTs don’t add any input lag

    There’s no extra latency from being connected to the internet

    There’s no latency from a Bluetooth/wireless controller

    And it’s not that old games actually ran better, because most older games are extremely badly optimised by today’s standards. The original Metroid slows to an absolute crawl when there are more than about 4 sprites on the screen; the dragon boss in Mega Man (2, I think) was such a laggy, slippery mess that I gave up trying to beat the game; Ocarina of Time runs at 20 FPS (worse if you’re in a PAL territory like I am), and that’s one of the better-playing N64 games.

    I think you’re either noticing one of these extra sources of delay, or you’re blinded by nostalgia.

    • bridgeenjoyer@sh.itjust.works (OP) · +1 · 22 hours ago

      Yes, there’s definitely processing lag in some of those games where they were pushing it.

      Then you have Joust on the 7800, which is ridiculously smooth.

    • Frezik@lemmy.blahaj.zone · +6/-1 · 1 day ago

      If you’re measuring display lag the same way we measure it with modern LCDs, then yes, CRTs do have lag.

      • DigDoug@lemmy.world · +3/-1 · 1 day ago

        Unless it’s an HD one, there’s no input buffer, so it’s impossible for a CRT to have more than a frame of input lag. And the console needs a frame to notice your input anyway.

        • Frezik@lemmy.blahaj.zone · +5/-1 · 1 day ago

          You measure lag by capturing when an input shows up on screen, referenced to the middle of the frame; by that convention, CRTs have an input lag of half their frame period. For NTSC, that’s about 8ms. For PAL, 10ms.

          Incidentally, a modern gaming LCD has a ~2ms average pixel response time, which is about the same as the difference between NTSC and PAL.
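
          A quick back-of-the-envelope check of those figures (a sketch assuming the half-frame measurement convention described above):

          ```python
          # Average display lag under the "measure to the middle of the
          # frame" convention: half the frame period for a refresh rate.
          def avg_scanout_lag_ms(refresh_hz: float) -> float:
              return (1000.0 / refresh_hz) / 2

          print(f"NTSC (60 Hz): {avg_scanout_lag_ms(60):.1f} ms")  # ~8.3 ms
          print(f"PAL  (50 Hz): {avg_scanout_lag_ms(50):.1f} ms")  # 10.0 ms
          ```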

  • cerebralhawks@lemmy.dbzer0.com · +6/-1 · 1 day ago

    Been wondering this, or something like this.

    I used to be good at Mario 1, but I cannot play it on emulators. It feels like there’s a delay. It feels a little like Mario is on ice, much like the ice levels of Mario 2. Mario is running, and I want to jump or stop, but there’s a noticeable delay and it makes me feel like my old ass has lost my touch. But playing any modern game, my reflexes are good enough. In a Nintendo to Nintendo comparison, I play Animal Crossing on the Switch, and sure enough, if I’m running and pull back on the stick, my villager skids at exactly the time I want them to. But on that same Switch with the same controller, I can’t control Mario in Mario 1 worth a damn. I do just fine in Super Mario Wonder, though.

    (Side note, more to do with Animal Crossing than older games, but I’ve noticed a wired controller, plugged into the Switch dock via USB with the Switch on the dock, gets more latency than the Switch in handheld mode, which I’m pretty sure uses Bluetooth to connect to its controllers even when they’re physically attached (not 100% sure on that). For one example, fishing (even the five-star rarity fish) is quite easy in handheld. But with the wired connection, I mash A as soon as the fish bites, and it still slips my hook. Maybe the latency isn’t from the controller to the dock to the Switch; maybe it’s from the Switch to the dock to the TV (and speakers, since I close my eyes and listen for the sound, which most Animal Crossers agree is the best way to fish).)

    • dustyData@lemmy.world · +7 · 1 day ago

      It’s mostly the TV. The input difference between wired and BT should be very small, though the Switch is not optimized for wired controllers. The variability of TV response times, on the other hand, is massive in comparison. Especially modern TVs with heavy post-processing that think they’re clever trying to interpolate frames, or other shit like bad HDR implementations, etc. HDMI DRM also adds latency.

      All that causes most TVs to be subpar for gaming. I still game on TV, mostly cozy games. But I accept that nothing competitive will come out of gaming on a TV.

  • Odo@lemmy.world · +13 · edited · 1 day ago

    It’s so weird to me that no one uses the term “slowdown” any more. Lag and latency meant networking delays back in the days you’re talking about. Not a complaint, just an observation that I’ve been wondering about the last few years.

    But yeah, as others said, slowdown/lag was pretty common. I immediately think of the ninjas jumping out of the water in TMNT3, the beginning of Top Man’s stage in Mega Man 3, and the last boss of The Guardian Legend, but there were many more. Early 3D is shocking too, with more sub-30-FPS games than you remember. Some capped themselves at 20, even. [Edit: Now that I think about it, even some NES games capped at 20. Strange times.]

    • Chozo@fedia.io · +12 · 1 day ago

      I believe OP is referring to input latency, which isn’t so much the system slowing down under increased load as it is the game running in a consistently slowed-down state, causing a delay between your inputs and their being reflected on-screen. There are several reasons why this is happening more often lately.

      Part of it has to do with the displays we use nowadays. In the past, most players used a CRT TV/monitor to play games, which have famously fast response times (the time between receiving the video signal and rendering that signal on the screen is nearly zero). But modern displays, while having a much crisper picture, often tend to be slower at the act of actually firing pixels on the screen, causing that delay between pressing Jump and seeing your character begin jumping.

      Some games also strain their systems so hard that, after various layers of post-processing effects get applied to every rendered frame, the displayed frames are already “old” before they’re even sent down the HDMI cable, resulting in a laggier feel for the player. You’ll see this difference in action with games that have a toggle for a “performance/quality” mode in the graphics settings. Usually this setting will enable/disable certain visual effects, reducing the load on the system and allowing your inputs to be registered faster.
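
      One way to picture that is as a simple end-to-end latency budget; a minimal sketch, and every number below is an illustrative assumption rather than a measurement of any real system:

      ```python
      # Illustrative input-to-photon budget. All figures are made-up
      # assumptions chosen only to show how the stages add up.
      budget_ms = {
          "input polling":       4,   # controller sampled once per frame slice
          "game simulation":     17,  # one tick at ~60 Hz
          "render + post-fx":    17,  # heavy effects hold the frame back
          "display processing":  10,  # TV scaler / "picture enhancement"
      }
      total = sum(budget_ms.values())
      print(f"total: {total} ms (~{total / (1000 / 60):.1f} frames at 60 Hz)")
      ```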

      • bridgeenjoyer@sh.itjust.works (OP) · +2 · 22 hours ago

        You’re right. Yes, there are slowdowns in a lot of older games, but not necessarily input lag. The slowdowns barely bother me at all. I think you hit right on it!

      • Lojcs@piefed.social · +1 · edited · 1 day ago

        Input latency includes the time it takes to render the frame. CRTs have a small inherent latency advantage compared to modern LCDs, but they’re not instant, and that advantage is minuscule compared to the disadvantage of a lower framerate. A game running at 30 FPS on a gaming LCD will have lower input lag than a game running at 20 FPS on a CRT. I’m sure there are outliers that poll inputs in a silly way that increases input lag, but for most games the render time will be the greatest factor. Performance modes usually simply reduce the render time (even if the framerate is unchanged).
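
        Putting rough numbers on that comparison (a sketch; the display-lag figures are assumptions, with the CRT number taken from the half-frame convention discussed upthread):

        ```python
        # One frame of render time plus average display lag.
        def input_lag_ms(fps: float, display_lag_ms: float) -> float:
            return 1000.0 / fps + display_lag_ms

        print(f"20 FPS on a CRT (~8 ms scanout): {input_lag_ms(20, 8):.0f} ms")  # ~58 ms
        print(f"30 FPS on a fast LCD (~5 ms):    {input_lag_ms(30, 5):.0f} ms")  # ~38 ms
        ```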

    • NuXCOM_90Percent@lemmy.zip · +2 · 24 hours ago

      “Lag” does indeed come from network/signal theory and does indeed refer to networking. It’s been a minute, but I want to say lag is the round-trip delay and latency is A to B, but don’t quote me on that.

      That said? Nobody cared. “Lag” was always the time between action and response. Some of that might be input delay. Some of that might be display delay (which has always been over-exaggerated but…). And a lot of that really was network delay. These days it tends to be more rendering/logic delay because people who are playing on shitty internet connections know it.

  • Spesknight@lemmy.world · +11/-2 · 1 day ago

    You obviously did not play on PC back when, if you didn’t have the newest graphics card, everything was laggy but still playable.

    • bridgeenjoyer@sh.itjust.works (OP) · +1 · 22 hours ago

      I wouldn’t play games my PC couldn’t run, for the aforementioned reason, ha! I also don’t buy new games until they’ve been out 10 years, because money.

  • chunes@lemmy.world · +6 · 1 day ago

    On the one hand, we’re more accustomed to better hardware latency. On the other hand… we played first-person shooters on 56K modems. The lag was legendary.

    • ampersandrew@lemmy.world · +2 · 1 day ago

      Wasn’t prediction baked into the netcode very early in the FPS genre? I wasn’t playing multiplayer in the Doom days, but by the late 90s, you wouldn’t have latency so much as you’d have rubberbanding. Games also use very little bandwidth, so 56K was no different than broadband, from my recollection.

      • chunes@lemmy.world · +3 · 1 day ago

        First multiplayer FPS I played was Jedi Knight: Dark Forces II (released in '97). In that game, you had to lead your shots to a silly degree to actually hit anyone. But I think you’re right; by then most games weren’t suffering from that problem as much.

      • NuXCOM_90Percent@lemmy.zip · +2 · edited · 24 hours ago

        Yes and no.

        Different games (really, engines) had different models for it. In some games you would feel things grind to a halt while you waited for a packet. In others you would get rubber banding, where the prediction of what your opponent would do was wrong and they’d teleport 2 meters to the right. And a select few would produce endless double kills as you both killed the predictions.

        The big difference was that arena shooters (which DOOM effectively was) tended to have encounters where you might have 3 or 4 players all shooting each other at once, with a high enough TTK (time to kill) that it was very easy to lose track of one enemy because you saw a more immediate threat. So it was a lot easier to just assume the rubber banding was a you problem, or to not notice it at all.

        Then we had CoD and it all became about super short TTK and 1on1 fights. And now? Now it was incredibly obvious when someone warped because they were your only concern.

        Back in the day, my games were UT (mostly the good one, sometimes 2k4), Jedi Knight 2, Tribes 2, and Operation Flashpoint. I was a cool kid… But even then, it was almost never perceptible in UT even though the Unreal Engine had “the worst netcode”. Also not OFP since your encounter ranges were so long and you were squinting through iron sights so you had no idea if you missed because of lag or what. But JK2 and Tribes 2 were VERY obvious when the network was acting up because you were generally dueling someone or taking out a lone flag carrier while skiing across a field.

    • bridgeenjoyer@sh.itjust.works (OP) · +1 · 22 hours ago

      It’s ironic: network latency has drastically decreased while game optimization tanked, leading us back to where we were originally!

    • doingthestuff@lemy.lol · +2 · 1 day ago

      I played using a cell phone connected by USB with a 14k data connection. It was slow af but I got unlimited data for $5 a month and it didn’t tie up the land line.

  • yaroto98@lemmy.world · +12/-1 · 1 day ago

    I distinctly remember Mario Bros. on the NES. There was like a 1/3-second latency between pressing the button and Mario jumping. You had to time your jumps (especially when running) further back than you’d expect to compensate. You just kinda got used to it after a while.
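
    For scale, assuming the NES’s ~60 Hz output, that claimed delay is a lot of frames:

    ```python
    # Frames of latency implied by a 1/3-second delay at 60 Hz.
    print(round((1 / 3) * 60))  # -> 20 frames
    ```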

      • yaroto98@lemmy.world · +2/-1 · 22 hours ago

        Yep. On the emulators now it is instant. I recently stayed at an Airbnb with an NES and played with my kids. The lag is definitely there. Even my kids were falling off stuff, shouting that they’d pushed the jump button.

  • Frezik@lemmy.blahaj.zone · +10/-1 · edited · 1 day ago

    An effect you may be noticing is motion smoothing, or the lack of it.

    If you play Pong on an old console, it likely moves the paddle at full speed the moment it gets input to move. Acceleration is instant. This is very precise, but it also feels unnatural.

    Modern versions will usually have some acceleration time that smooths out movement. It can be a very small effect, but it feels more natural and most people prefer it. It’s also less precise. People generally learn to compensate for it over time.
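
    A minimal sketch of the difference (the speed, timestep, and smoothing constant here are arbitrary assumptions):

    ```python
    # Instant vs. smoothed paddle movement, updated once per frame.
    MAX_SPEED = 300.0    # pixels/second, arbitrary
    ACCEL_TIME = 0.15    # seconds to reach full speed when smoothing

    def instant_velocity(input_dir: float) -> float:
        # Old-school: full speed the moment input arrives.
        return input_dir * MAX_SPEED

    def smoothed_velocity(current: float, input_dir: float, dt: float) -> float:
        # Modern: ramp toward the target speed over ACCEL_TIME seconds.
        target = input_dir * MAX_SPEED
        step = (MAX_SPEED / ACCEL_TIME) * dt
        if current < target:
            return min(current + step, target)
        return max(current - step, target)

    # First frame at 60 FPS after the player presses "right":
    print(instant_velocity(1.0))                # 300.0 -> precise but abrupt
    print(smoothed_velocity(0.0, 1.0, 1 / 60))  # ~33.3 -> eases in
    ```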