I’ve been wondering this recently. I grew up on Atari/NES/SNES, and of course almost all of those games (pretty sure all of them) were written in assembly and are rock solid smooth and responsive for the most part. I wonder if that’s why I can’t stand to play badly optimized games with even a hint of a laggy feel to them. I’ve always been drawn to Quake and CS for that reason: damn smooth. And no, it doesn’t just apply to FPS games either. I can’t play Beat Saber with even a modicum of lag or I suck massively, but others can play just fine and not even notice the lag.
It’s odd. I feel like a complainer, but maybe I just notice it more easily than others?
It’s so weird to me that no one uses the term “slowdown” anymore. “Lag” and “latency” meant networking delays back in the days you’re talking about. Not a complaint, just an observation I’ve been wondering about for the last few years.
But yeah, as others said, slowdown/lag was pretty common. I immediately think of the ninjas jumping out of the water in TMNT3, the beginning of Top Man’s stage in Mega Man 3, and the last boss of The Guardian Legend, but there were many more. Early 3D is shocking too, with more sub-30-fps games than you remember. Some capped themselves at 20, even. [Edit: Now that I think about it, even some NES games capped at 20. Strange times.]
I believe OP is referring to input latency, which isn’t the system slowing down under increased load so much as a consistent delay between pressing a button and seeing the result on-screen. There are several reasons why this is happening more often lately.
Part of it has to do with the displays we use nowadays. In the past, most players used a CRT TV/monitor, which has a famously fast response time (the time between receiving the video signal and lighting up the screen is nearly zero). Modern displays, while much crisper, often take longer to actually put pixels on the screen, adding a delay between pressing Jump and seeing your character begin jumping.
Some games also strain their systems so hard that, after various layers of post-processing effects get applied to every rendered frame, the displayed frames are already “old” before they’re even sent down the HDMI cable, resulting in a laggier feel for the player. You can see this difference in action in games that have a “performance/quality” toggle in the graphics settings. Usually this setting enables/disables certain visual effects, reducing the load on the system so your inputs show up on-screen sooner.
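To make that pipeline concrete, here’s a toy frame loop in Python. Everything in it is made up for illustration (the stage names and timings are assumptions, not any real engine’s API): the input sampled at the top of the frame isn’t visible until present() at the bottom, so every millisecond the post-processing stage spends sits between your button press and the pixels changing.

```python
import time

def sample_input():
    return "jump"                                 # player's button press is read here

def simulate(cmd):
    time.sleep(0.004)                             # ~4 ms of game logic
    return cmd

def render(state):
    time.sleep(0.016)                             # ~16 ms to draw the scene
    return state

def post_process(img, quality_mode):
    time.sleep(0.012 if quality_mode else 0.002)  # bloom, motion blur, etc.
    return img

def present(img):
    pass                                          # frame finally heads down the cable

for quality_mode in (True, False):
    start = time.perf_counter()
    present(post_process(render(simulate(sample_input())), quality_mode))
    ms = (time.perf_counter() - start) * 1000
    label = "quality" if quality_mode else "performance"
    print(f"{label} mode: ~{ms:.0f} ms from input sample to present")
```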
You’re right. Yes, there are slowdowns in a lot of older games, but not necessarily input lag. The slowdowns hardly bother me at all. I think you hit right on it!
Input latency includes the time it takes to render the frame. CRTs have a small inherent latency advantage over modern LCDs, but they’re not instant, and that advantage is minuscule compared to the disadvantage of a lower framerate. A game running at 30 fps on a gaming LCD will have lower input lag than a game running at 20 fps on a CRT. I’m sure there are outliers that poll inputs in a silly way that adds input lag, but for most games the render time is the biggest factor. Performance modes usually just reduce the render time (even if the framerate is unchanged).
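Quick back-of-envelope sketch of that comparison (the display-delay numbers here are rough assumptions, not measurements): an input sampled at the start of a frame shows up roughly one full frame later, plus whatever delay the display itself adds.

```python
def input_latency_ms(fps: float, display_delay_ms: float) -> float:
    # One frame of render time, plus the display's own processing delay.
    return 1000.0 / fps + display_delay_ms

print(input_latency_ms(fps=20, display_delay_ms=0))   # CRT at 20 fps        -> 50.0 ms
print(input_latency_ms(fps=30, display_delay_ms=5))   # gaming LCD at 30 fps -> ~38.3 ms
```

Even charging a few extra milliseconds to the LCD, the higher framerate wins.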
“Lag” does indeed come from network/signal theory and originally referred to networking. Been a minute, but I want to say lag is the round-trip delay and latency is one-way, A to B, but don’t quote me on that.
That said? Nobody cared. “Lag” was always the time between action and response. Some of that might be input delay. Some of that might be display delay (which has always been exaggerated, but…). And a lot of it really was network delay. These days it tends to be more rendering/logic delay, because people playing on shitty internet connections know it.