Digital Foundry Article Technical Discussion Archive [2014]

Now that I think about it, 60fps was a product of the CRT era, and the system was outputting an interlaced signal to get that 60fps, at 480i, not 480p. Some games could do progressive at 60fps, but most couldn't. 60fps has always been the exception, not the rule. Frankly, I'm not sure where this sudden fascination with 60fps has come from, except PC gamers that are used to it. But a solid 30fps is perfectly fine for most titles, the operative word being "solid". Most of the "30fps" games that people complain about actually have framerates that fluctuate wildly, dipping as low as the teens and rarely actually hitting the 30fps target. I suppose some people get it into their heads that that's what 30fps is, and so they hate it. But take something like Lego Marvel on PS4. The action is so smooth that I honestly had a hard time telling that it was "only" 30fps instead of 60. I had to look for specific markers to tell what was going on, markers that 99.99% of gamers don't even know exist, much less would think to look for. They probably assume the game runs at 60fps because of how smooth it is.

lol what? Where are you getting this bullshit from?

People like 60+ frames per second because of input latency, especially in competitive games. Call of Duty is one of the best-selling franchises of all time; you think people won't notice the difference going from 60 to 30 frames per second? It's 16.6ms vs 33.3ms per frame, so with players being near equal, guess who has the decision-making advantage.
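
For anyone who wants the arithmetic spelled out, here's a rough back-of-the-envelope Python sketch (only the per-frame interval; real input latency also stacks engine buffering, display lag and so on on top):

    # Per-frame time budget at common refresh targets. This is only the frame
    # interval itself; real input latency adds engine and display delays on top.
    for fps in (30, 60, 120):
        frame_ms = 1000.0 / fps
        print(f"{fps:3d} fps -> {frame_ms:5.1f} ms per frame "
              f"(an input can wait up to ~{frame_ms:.1f} ms for the next frame)")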
 
Now that I think about it, 60fps was a product of the CRT era, and the system was outputting an interlaced signal to get that 60fps, at 480i, not 480p. Some games could do progressive at 60fps, but most couldn't. 60fps has always been the exception, not the rule. Frankly, I'm not sure where this sudden fascination with 60fps has come from, except PC gamers that are used to it.

Console games used to always be 60fps to give that smooth feel, going way back to the Atari 2600, NES, Genesis etc. days. It's not just a PC thing; if anything, PCs took a long time before they could run anything at 60fps, fumbling around at much lower framerates for years back when consoles were putting out a solid 60fps on everything. In fact, a console not running a game at 60fps was controversial: remember when the Super Nintendo version of NHLPA ran at 30fps whereas the Genesis version ran at 60fps? Everyone noticed, and it was a very big deal at the time, back in the early 90s. The situation has flipped now, with consoles running at low framerates and PCs able to run just about anything at 60fps.
 
The situation has flipped now, with consoles running at low framerates and PCs able to run just about anything at 60fps.

This is not true.
I have a laptop/PC and it can't run just about anything at a constant 60fps, never mind at max settings.
Not every PC is good for gaming, nor does a PC easily beat consoles in every area.
 
This is not true.
Joker means that as a platform, PC can run 60 fps by putting in more hardware, not that all PCs can run 60 fps. Although in my experience, even running a modern 2D game on a capable PC, the framerate is forever at risk of juddering from OS-level interruptions. You never know when some background task is going to drop your framerate.
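
If you want to see those hiccups for yourself, a crude frame-time logger is enough. This is just an illustrative Python sketch: time.sleep stands in for rendering a frame, and in a real game loop the spikes it flags would come from whatever the OS decides to do in the background:

    import time

    BUDGET_MS = 1000.0 / 60.0          # 16.7 ms target at 60 fps
    prev = time.perf_counter()
    for frame in range(300):
        time.sleep(0.016)              # stand-in for "render one frame"
        now = time.perf_counter()
        frame_ms = (now - prev) * 1000.0
        prev = now
        if frame_ms > BUDGET_MS * 1.5: # anything 1.5x over budget shows as a hitch
            print(f"frame {frame}: {frame_ms:.1f} ms - judder")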

The original Lego SW game on PS2 was fabulous for its 60 fps. The fact that they dropped to an unstable 60fps and then gave up on 60fps altogether shows a change in values at TT, perhaps because of the WB buyout. Snowblind Studios also produced top-drawer PS2 games, but their last-gen release under WB was a bit poopy. May just be a coincidence.
 
I have quite a good gaming PC, not top dog of course, but capable, with a 670 and an i7. I have never experienced stable framerates in any of the games I play. PC games in my experience fluctuate like hell... e.g. playing COD I get 45-150 Hz output... and I really have to substantially crank down settings to get 60+ Hz so that vsync gives me stability... just not worth it imo.

So, to me, the last thing that comes to mind about PC gaming is stable framerates?!? And that's not even talking about micro-stuttering. What am I doing wrong?
 
I think PC framerates just tend to vary more than on consoles because of the less predictable driver/API. I'm not sure it's a case of background tasks (OS housekeeping, antivirus?) kicking in unless you've got a pretty poorly configured system; it's more that you'll notice fluctuations on screen if you use vsync and the framerate drops below the threshold, or if, without vsync, you drop even temporarily into stuttery territory (less than 30fps).

If you can maintain a minimum framerate over the vsync threshold at all times, or you play without vsync but never drop below, say, 40fps, then you should always have a nice smooth, consistent experience (although without vsync you can obviously get tearing).
Last night for instance I was playing 2-player Lego Lord of the Rings on the PC through the TV. I didn't check the framerate but it would have been very high. I guess vsync was on because there was no tearing, and not once did we see any slowdown, jerkiness or stutter of any kind. Just a 100% silky smooth experience start to finish. That's what a PC can deliver if you have sufficient performance for your settings, or now, thanks to G-Sync, even if you don't have enough performance to stay above the vsync threshold. Note I have a similar setup to yours and I'm able to get that kind of experience on mine in plenty of games. In others I prefer to dial everything up and play in the 45fps region with vsync off, so I will see the occasional framerate drop into juddery territory. As long as it's the exception and not the rule, it's a compromise I'm willing to live with in most cases.
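
To put a number on that threshold effect, here's a toy model of double-buffered vsync on a 60Hz display: any frame that takes longer than ~16.7ms to render has to wait for the next vblank, so presentation snaps to multiples of the refresh interval. The render times in this Python sketch are made up purely for illustration:

    import math

    REFRESH_MS = 1000.0 / 60.0                  # one vblank interval at 60 Hz

    def presented_interval(render_ms):
        # A frame occupies however many whole vblank slots it needs.
        slots = max(1, math.ceil(render_ms / REFRESH_MS))
        return slots * REFRESH_MS

    for render_ms in (14.0, 16.0, 18.0, 25.0, 35.0):    # hypothetical render times
        shown = presented_interval(render_ms)
        print(f"render {render_ms:4.1f} ms -> presented every {shown:5.1f} ms "
              f"(~{1000.0 / shown:.0f} fps)")

So a game averaging, say, 55fps unsynced ends up displayed at 30fps with vsync on, which is exactly the drop you notice on screen.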
 
I've been playing Awesomenauts on a Haswell i7 4770. With the resolution set to 720p, it's 60 fps except that every now and then it stumbles. And that's with and without V-sync. It was the same way back when with NWN or Dungeon Siege. At a time when I was playing the gorgeous 60 Hz BG: DA on PS2, the more powerful PC couldn't maintain a decent, stable framerate. A lot comes down to software, of course, but it's a standard part of the PC experience that I've always had and hoped was gone, but it's still there, whatever the cause.

What this means for gamers by and large is the death of the 60 fps arcade experience, because the consoles aren't providing that either. Maybe a monster PC rig or a carefully configured box will manage it, but for Joe Shmoe Consumer, main-screen gaming doesn't have much swish, as it's too low a priority for most devs.
 
Console games used to always be 60fps to give that smooth feel, going way back to the Atari 2600, NES, Genesis etc. days.

True. But those games were very different, and just about all the games in the same or similar genres we're seeing today (side-scrollers, beat-em-ups) still run at 60fps. As a matter of fact, they run an awful lot better than they ever did. It's not like old 8- and 16-bit games didn't suffer from a shit-ton of slowdown.
 
I'll chalk it up to giving a shit.

:LOL:

So very, very true. TT stopped caring about the technical quality of their output at the last console transition; it took what, 3 or 4 titles on the PS3, before they stopped being a screen-tearing judderfest? They know their gameplay is good enough to paper over the cracks, so they get away with murder on the technical side, and I love me some Lego games.
 
Again, as I said, this approach of 30fps over 60 has been going on since the beginning of 3D rendering on consoles, when devs actually had to start prioritizing a lot of things because asset complexity became a thing. Is it really useful to go back 20+ years to when you only had 2D to think about?

My main point is that it being a big deal now is weird when it's been the case for a long arse time.
 
Joker means that as a platform, PC can run 60 fps by putting in more hardware, not that all PCs can run 60 fps. Although in my experience, even running a modern 2D game on a capable PC, the framerate is forever at risk of juddering from OS-level interruptions. You never know when some background task is going to drop your framerate.

The original Lego SW game on PS2 was fabulous for its 60 fps. The fact that they dropped to an unstable 60fps and then gave up on 60fps altogether shows a change in values at TT, perhaps because of the WB buyout. Snowblind Studios also produced top-drawer PS2 games, but their last-gen release under WB was a bit poopy. May just be a coincidence.

I know what you mean, but to be honest Lego games don't require super low latency; they are not competitive FPSes, they are E-rated action-platformers that are perfectly enjoyable and playable at 30fps... or do you disagree?
Surely at 60fps they would be even better, but IMO in this case targeting 30fps is not a terrible choice/trade-off.
I am sure kids love them regardless of the 30fps.

BTW, DF constantly pits an unknown-specs PC vs consoles, but that's not very professional IMO.
DF is not biased, that's not what I mean, but I would like to know what kind of PC is always winning every single face-off.


EDIT
For the record, I know very well what the benefits of 60fps are, and I am not saying that 60fps is useless... I just can't despise 30fps ;)
 
Console games used to always be 60fps to give that smooth feel, going way back to the Atari 2600, NES, Genesis etc. days. It's not just a PC thing; if anything, PCs took a long time before they could run anything at 60fps, fumbling around at much lower framerates for years back when consoles were putting out a solid 60fps on everything. In fact, a console not running a game at 60fps was controversial: remember when the Super Nintendo version of NHLPA ran at 30fps whereas the Genesis version ran at 60fps? Everyone noticed, and it was a very big deal at the time, back in the early 90s. The situation has flipped now, with consoles running at low framerates and PCs able to run just about anything at 60fps.
Wow, colour me surprised. I thought that either the majority of old consoles had a very bad framerate or ran at significantly lower framerates than 60, because there were very few frames of animation.

Say, a jump would be like 3-4 frames of animation, so running at 60 frames would be pointless.

This is the first time I've heard that 60fps was the goddamn standard on home consoles.
 
DF have posted specs in the past and they usually tend to be reasonably modest PCs. I've certainly seen reference to a 270-based setup on more than one occasion. I agree a 30fps experience can be fine, especially a completely solid one with some nice motion blur, but 60fps (or more) is instantly and obviously noticeable to me over 30.
 
I've been playing Awesomenauts on a Haswell i7 4770. With the resolution set to 720p, it's 60 fps except that every now and then it stumbles. And that's with and without V-sync. It was the same way back when with NWN or Dungeon Siege. At a time when I was playing the gorgeous 60 Hz BG: DA on PS2, the more powerful PC couldn't maintain a decent, stable framerate. A lot comes down to software, of course, but it's a standard part of the PC experience that I've always had and hoped was gone, but it's still there, whatever the cause.

What this means for gamers by and large is the death of the 60 fps arcade experience, because the consoles aren't providing that either. Maybe a monster PC rig or a carefully configured box will manage it, but for Joe Shmoe Consumer, main-screen gaming doesn't have much swish, as it's too low a priority for most devs.

I don't think you have to have a crazy powerful PC or go overboard on configuration to achieve this to be honest. Just make sure you don't download/install a load of unwanted crap like "Ask Toolbar" and Adobe update manager etc...

I did just make a video of AC4 going a bit crazy with the camera and uploaded it to YT in an attempt to prove the point, but unfortunately YT has dropped the frame rate back to 30fps and it ends up looking like a juddery mess. In reality this is silky smooth, without a single hiccup, stall or sudden bogging down of the framerate, and that's in one of today's most demanding games, without vsync and mostly sub-60fps.

Here's the video anyway, just because I spent so much time uploading it, but aside from the Fraps counter it doesn't really serve to illustrate my point. I could always throw up a 120fps vsynced Lego Marvel video, but I suspect that would suffer from the same YT quality issues.

http://www.youtube.com/watch?v=sdeWCmWdvF4&feature=youtu.be
 
I know what you mean, but to be honest Lego games don't require super low latency; they are not competitive FPSes, they are E-rated action-platformers that are perfectly enjoyable and playable at 30fps... or do you disagree?
Surely at 60fps they would be even better, but IMO in this case targeting 30fps is not a terrible choice/trade-off.
I am sure kids love them regardless of the 30fps.

They feel worse, but the TT Games owners can make money this way. I am just disappointed that TT has turned into this kind of studio.
 
Wow, colour me surprised. I thought that either the majority of old consoles had a very bad framerate or ran at significantly lower framerates than 60, because there were very few frames of animation.

Say, a jump would be like 3-4 frames of animation, so running at 60 frames would be pointless.

This is the first time I've heard that 60fps was the goddamn standard on home consoles.

It started that way because of the Atari 2600. That machine had no frame buffer; you were drawing pixels as the TV beam scanned its way across, all the time. There was no way to go with a lower framerate, because if you stopped drawing the user would see nothing: the machine was tied to the 60Hz nature of TVs and had no frame buffer to re-display if you didn't draw anything new, hence every game on that box was 60fps. They would do other performance tricks, like drawing things on alternate frames, but they were always building 60 new frames per second.

Its competitor, the Intellivision, didn't always go with 60fps, and as a result it had the reputation of its games being slower, more sluggish and less crisp feeling. People noticed 60fps back then, even on games that had no scrolling, which gave the Atari 2600 the reputation as more of the arcade gaming machine compared to the more sluggish-feeling Intellivision, which was the strategy machine. Believe me, I used to have that argument with friends all the time, as I was an Atari guy back then.

Later consoles continued having 60fps as the standard, like the NES/SMS and Genesis/SNES, with the occasional controversy like 30fps NHLPA on the SNES, which Sega took full advantage of to say they had the more powerful machine than Nintendo since their version ran at 60fps. During those times getting 60fps on a PC was borderline impossible, but the few games that could do it, like Jazz Jackrabbit, really stood out and were noticed by people for their fluidity and smoothness.
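
For anyone who hasn't seen how "racing the beam" actually works: the game code sets up the display chip's registers for each scanline just before the beam draws it, 262 lines per NTSC field, 60 fields a second. Below is a heavily simplified, Python-flavoured sketch of that loop; WSYNC and COLUBK are real TIA register names, but everything else is a stub purely to show the shape of it:

    SCANLINES_PER_FIELD = 262          # NTSC: vblank + visible picture + overscan

    class StubTIA:
        """Stand-in for the TIA display chip - it has no frame buffer at all."""
        def __init__(self):
            self.lines_drawn = 0

        def write(self, register, value):
            pass                       # a real kernel writes COLUBK, GRP0, PF0-2, ...

        def strobe_wsync(self):
            self.lines_drawn += 1      # WSYNC: stall the CPU until the scanline ends

    def run_one_field(tia, frame_number):
        # Game logic happens during vblank; then every visible line is rebuilt live.
        background = frame_number & 0xFF
        for line in range(SCANLINES_PER_FIELD):
            tia.write("COLUBK", (background + line) & 0xFF)  # per-line register setup
            tia.strobe_wsync()                               # "race the beam" to the next line

    tia = StubTIA()
    for frame in range(60):            # one second of NTSC video = 60 fields
        run_one_field(tia, frame)
    print(tia.lines_drawn, "scanlines built - stop feeding the chip and the picture collapses")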
 
It started that way because of the Atari 2600. That machine had no frame buffer; you were drawing pixels as the TV beam scanned its way across, all the time.
Anybody curious about this may find the Wired article 'Racing the Beam' an interesting read. The Atari 2600 was my first experience of video games and I still remember the day my dad brought one home :D

Looking back, it's crazy what Atari (and Activision) programmers achieved with so little. You can google David Crane's recollection of developing Pitfall.
 