Digital Foundry Article Technical Discussion Archive [2014]

Anybody curious about this may find the Wired article 'Racing the Beam' an interesting read. The Atari 2600 was my first experience of video games and I still remember the day my dad brought one home :D

Looking back, it's crazy what Atari (and Activision) programmers achieved with so little. You can google David Crane's recollection of developing Pitfall.

David Crane! There's a name I haven't heard in a long time. Those old-school programmers were very talented: they did the level design, music, art and all the coding, typically working alone for months, and then sat down with management to see if they had something to go to market with.
 
David Crane! There's a name I haven't heard in a long time. Those old-school programmers were very talented: they did the level design, music, art and all the coding, typically working alone for months, and then sat down with management to see if they had something to go to market with.

Little Computer People, C64 disk version! I'm sure the little bastard cheated at Poker. I can't remember my LCP's name, though :cry:
 
Again, as I said, this preference for 30fps over 60 has been going on since the beginning of 3D rendering on consoles, when devs actually had to start prioritizing a lot of things because asset complexity became a real factor.
There have been plenty of 60 fps games on PS1 and PS2. It wasn't so much a standard, but it was still very much the 'console experience' back then.

I know what you mean, but to be honest Lego games don't require super-low latency. They are not competitive FPSes; they are E-rated action-platformers that are perfectly enjoyable and playable at 30fps... or do you disagree?
Higher temporal resolution always makes it 1) easier to track what's happening, and 2) just nicer to look at. There's a visceral smoothness, like a thick, full-fat, creamy yoghurt versus a (30 Hz) cheap, low-fat yoghurt. 60 Hz is the difference between the best chocolates (Monty Bojangles truffles, Hotel Chocolat) and a Hershey bar or Dairy Milk.

Say, a jump would be like 3-4 frames of animation, so running at 60 frames would be pointless.
It's most important for tracking movement. Even objects with static artwork, like bullets or the central player-controlled spaceship sprite, were moved and redrawn 60 times a second, making object tracking that much better and more comfortable.
 
I've been playing Awesomenauts on a Haswell i7 4770. Setting the resolution to 720p, it's 60 fps except that every now and then it stumbles, and that's with and without V-sync. It was the same way back when with NWN or Dungeon Siege: at a time when I was playing the gorgeous 60 Hz BG: DA on PS2, the more powerful PC couldn't maintain a decent, stable framerate. A lot comes down to software, of course, but stutter has always been part of my PC experience; I hoped it was gone, but it's still there, whatever the cause.

What this means for gamers by and large is the death of the 60 fps arcade experience, because the consoles aren't providing it either. Maybe a monster PC rig or a carefully configured box will manage it, but for Joe Schmoe Consumer, main-screen gaming doesn't have much swish, as it's too low a priority for most devs.

A stuttering framerate is mostly down to the game programming or some odd driver issue, and it's not something limited to the PC.
Games like Mass Effect on Xbox 360 and KOTOR 1 and 2 on the OG Xbox have big stuttering problems themselves.
The reason games like NWN have problems is their engine (which happens to be the basis of the engine that KOTOR uses); BG: DA, on the other hand, uses a completely different engine.
 
Little Computer People, C64 disk version! I'm sure the little bastard cheated at Poker. I can't remember my LCP's name, though :cry:

I remember, like it was last week, all the cool games advertised in magazines like Compute!. I see sites that are archiving all this stuff, and I really hope the industry steps up and puts together a formal body to catalogue and store it for later generations. Games like Seven Cities of Gold, M.U.L.E., Archon, Defender of the Crown, Wizardry and the Sierra King's Quest series, not to mention hardware like the Atari 800XL, Apple II, C64 or Amiga 500, will always amaze me.
 
Wow, colour me surprised. I thought that the majority of old consoles either had a very bad framerate or ran at framerates significantly lower than 60, because there were very few frames of animation.

Say, a jump would be like 3-4 frames of animation, so running at 60 frames would be pointless.

Motion was decoupled from animation in the 2D sprite days: commonly, characters would move at a fixed step each frame while the animation played at a lower rate, often holding different frames for different periods.

There was also a lot less input latency because the screen wasn't double buffered; you updated the sprites and scroll positions in the vblank.
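
To make the decoupling concrete, here's a minimal sketch of that kind of 60 Hz loop, with motion stepped every tick and the animation frame held for several ticks; all names (including the hw.write_sprite call) are illustrative, not taken from any real title:

# Minimal sketch: motion at the full 60 Hz tick rate, animation at a lower rate.
# Everything here is illustrative, not from a real engine.
ANIM_FRAMES = [0, 1, 2, 1]   # tile indices for a simple walk cycle
ANIM_HOLD = 8                # hold each animation frame for 8 ticks (~7.5 fps)
WALK_SPEED = 2               # fixed step in pixels per 60 Hz tick

class Sprite:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.anim_tick = 0

    def update(self, input_dx):
        self.x += input_dx * WALK_SPEED                  # motion every tick
        self.anim_tick = (self.anim_tick + 1) % (len(ANIM_FRAMES) * ANIM_HOLD)

    def current_frame(self):
        return ANIM_FRAMES[self.anim_tick // ANIM_HOLD]  # changes every 8 ticks

def on_vblank(sprite, hw):
    # No double buffer: sprite table and scroll registers are written during
    # the vertical blank, so this tick's input is visible on the next scanout.
    hw.write_sprite(0, sprite.x, sprite.y, sprite.current_frame())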

I wrote a lot of Genesis titles and they all ran at 60Hz; it was uncommon for that not to be the case. I used to hate supporting NTSC, because it meant you got 3.3ms less frame time (16.7ms per frame at 60Hz versus 20ms at 50Hz PAL) and, probably more importantly, less time in the vblank.
Some of the early arcade game code is a work of art.

Things have changed a lot: back then it was about programming, now it's more about software engineering.
 
Higher temporal resolution always makes it 1) easier to track what's happening, and 2) just nicer to look at. There's a visceral smoothness, like a thick, full-fat, creamy yoghurt versus a (30 Hz) cheap, low-fat yoghurt. 60 Hz is the difference between the best chocolates (Monty Bojangles truffles, Hotel Chocolat) and a Hershey bar or Dairy Milk.

I know what 60fps brings to the table.
No need to tell me, truly.

Also, I am all in favor of food analogies.
More of those and fewer car analogies.
 
Higher temporal resolution always makes it 1) easier to track what's happening, and 2) just nicer to look at. There's a visceral smoothness, like a thick, full-fat, creamy yoghurt versus a (30 Hz) cheap, low-fat yoghurt. 60 Hz is the difference between the best chocolates (Monty Bojangles truffles, Hotel Chocolat) and a Hershey bar or Dairy Milk.

But devs are choosing 30 fps for a reason. It's not like they are oblivious to the benefits of a higher frame rate.

Maybe the reason we see so many 30 fps games is that, like you said, it's a thick, full-fat, creamy yogurt versus a cheaper low-fat yogurt, but where the cost saving is spent on berries, fruit and other toppings. One is more visually appealing even though it's not as thick, fatty or creamy, and is thereby more marketable and sells more.
 
But devs are choosing 30 fps for a reason. It's not like they are oblivious to the benefits of a higher frame rate.

I think they have fallen into the "good enough" trap. Lots of people here seem to have fallen into it as well.
 
But devs are choosing 30 fps for a reason. It's not like they are oblivious to the benefits of a higher frame rate.

Maybe the reason we see so many 30 fps games is that, like you said, it's a thick, full-fat, creamy yogurt versus a cheaper low-fat yogurt, but where the cost saving is spent on berries, fruit and other toppings. One is more visually appealing even though it's not as thick, fatty or creamy, and is thereby more marketable and sells more.
In some cases they add eye-candy, but in others, IMO, they are just not trying hard enough. That may be wrong and sounds a bit cruel, but Lego was hardly a tour-de-force of lighting and shading techniques. The style of the original games was perfectly balanced for pixel quality and framerate. Heck, it even had good reflective floors, so it was hardly a bare-bones visual experience. Compare something like CON, a top-down hack and slash, with X-Men Legends, the same style of game: X-Men Legends had a lousy framerate and none of the polish of CON. And look at the load of UE3 games running at 30 fps (or less) this gen on PS3. Even something like Fat Princess failed to hit 60 fps.

So overall I think it's a business decision to stop at something suitable rather than pay the disproportionately high cost of doubling the framerate. I guess on 16-bit and earlier hardware the devs had fixed hardware with set numbers of sprites etc. that literally defined the parameters of the game. Now that everything is programmable and there's nothing to stop devs adding more and more, the devs themselves need to decide when to stop adding polys or shaders, and rather than pick a target that's right on the edge of what's possible and optimised to Betsy to run at 60 fps, they can pick a safe 30 fps target (and even then they frequently overstep the mark, like Borderlands 2's slideshow encounters).
 
These 1080p 30fps games would probably run at 720p 60fps. The total pixel throughput would be similar, since 1080p is only slightly more than 2x the pixel count of 720p, so 1080p at 30fps pushes only about 12.5% more pixels per second than 720p at 60fps. It might even allow for a better anti-aliasing solution with that slight 'bandwidth' saving.
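
For what it's worth, the raw arithmetic behind that 12.5% figure (just pixel counts, no engine assumptions):

# Raw pixel throughput: 720p @ 60 fps vs 1080p @ 30 fps (pure arithmetic).
px_720p = 1280 * 720             #   921,600 pixels per frame
px_1080p = 1920 * 1080           # 2,073,600 pixels per frame (2.25x of 720p)
per_sec_720p60 = px_720p * 60    # ~55.3M pixels per second
per_sec_1080p30 = px_1080p * 30  # ~62.2M pixels per second
print(per_sec_1080p30 / per_sec_720p60)  # 1.125 -> 1080p30 pushes ~12.5% more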

It would also be smoother and have less input lag. In the average living room, the advantage of 1080p over 720p is diminished anyway by factors like TV size and sitting distance. They could also use a horizontally squeezed resolution (854x1080), which has roughly the same number of pixels as 720p, and probably also hit 60fps.

They should at least give you the option. I would always choose 720p @60fps over 1080p @30fps unless I was capturing screenshots.
 
In some cases they add eye-candy, but in others, IMO, they are just not trying hard enough. That may be wrong and sounds a bit cruel, but Lego was hardly a tour-de-force of lighting and shading techniques.

I dunno, the last couple of Lego games have looked spectacular IMO. I'm playing Lego LoTR at the moment and I've been continuously impressed by it. And from the small amount of Marvel I've played, it's even nicer by a healthy margin.
 
That's assuming you're wholly pixel limited though.

Right. If there were something limiting the framerate to 40fps regardless of resolution, then what I said wouldn't matter. But if a game is wholly (or mostly) pixel limited, I still get the feeling they would go with 1080p@30fps for marketing reasons. A lot of people don't even know what temporal resolution is or why it matters; only spatial resolution matters to them, and that's what they see in screenshots. I could be wrong though.
 
Can you explain it for me in depth?
When you think of a single frame of video, you think of a horizontal axis and a vertical axis (both spatial axes) along which point-samples of visual data (pixels) are situated. When you have a video stream, you have a whole bunch of frames layered one after another. This layering can be represented by simply introducing a third axis, a time axis.

"Temporal resolution" refers to precision along the time axis, in exactly the same way that "spatial resolution" refers to precision along either/both of the spatial axes of the image.

Superior temporal resolution is (basically) what you get when you up the framerate.
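
If it helps, the same idea can be pictured as a plain 3D array, where two axes are spatial and the third is time; this little numpy sketch is purely illustrative:

import numpy as np

# One second of hypothetical 720p60 video as a (time, height, width) volume.
fps, height, width = 60, 720, 1280
clip = np.zeros((fps, height, width), dtype=np.uint8)

# Spatial resolution = sample density along the last two axes (height, width).
# Temporal resolution = sample density along the first axis (frames per second).
# Raising the framerate adds samples along the time axis, exactly as raising
# the resolution adds samples along the spatial axes.
print(clip.shape)  # (60, 720, 1280)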
 
Does temporal resolution refer only to the time domain (framerate), or is it literally a product of framerate and spatial resolution?
 
Superior temporal resolution is (basically) what you get when you up the framerate.

Yeah, I basically get all that, but does it really matter? He is saying he'll take 720p@60 over 1080p@30 every time. What about 480p@240: how great is that temporal resolution? How much does temporal resolution matter, and where are the crossover points? I certainly prefer 1080p 30fps in many situations over a 720p image, regardless of the framerate.
 
Yeah, I basically get all that, but does it really matter? He is saying he'll take 720p@60 over 1080p@30 every time. What about 480p@240: how great is that temporal resolution? How much does temporal resolution matter, and where are the crossover points? I certainly prefer 1080p 30fps in many situations over a 720p image, regardless of the framerate.
There's a topic on framerate somewhere in this forum. No need to start it again.
 
These 1080p 30fps games would probably run at 720p 60fps. The total pixel throughput would be similar, since 1080p is only slightly more than 2x the pixel count of 720p, so 1080p at 30fps pushes only about 12.5% more pixels per second than 720p at 60fps. It might even allow for a better anti-aliasing solution with that slight 'bandwidth' saving.
In my experience (as a long time console graphics programmer) 720p @ 60 fps is harder to achieve than 1080p @ 30 fps.

GPU:
Yes, it's true that the 1080p pixel count is 2.25x that of 720p, so the total pixels pushed by 1080p @ 30 fps is 12.5% higher per second. However, if the triangle count remains the same, each triangle at 1080p covers on average 2.25x as many pixels, resulting in much better quad efficiency. 720p might, for example, give 60% quad efficiency while 1080p gives 75% with the same content, so 720p does 0.75/0.6 = 1.25x the per-pixel work (25% more ALU & fill wasted). Memory bandwidth savings are similar: each visible surface at 1080p covers more pixels, resulting in a better texture cache hit rate (reducing BW), and at 1080p more pixels on screen are sampling mip 0 (and thus bilinearly upsampling the same texel to multiple pixels instead of using a 1:1 mapped texture -> savings in BW). In conclusion, my experience is that pixel shader cost roughly doubles from 720p -> 1080p (not 2.25x).
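
Written out, the quad-efficiency arithmetic above goes like this (the 60% and 75% figures are the example numbers from the post, not measurements):

# Quad efficiency: fraction of 2x2-quad pixel shader invocations that land on
# covered pixels. Example figures from the paragraph above.
quad_eff_720p, quad_eff_1080p = 0.60, 0.75
work_per_pixel_720p = 1 / quad_eff_720p     # ~1.67 invocations per useful pixel
work_per_pixel_1080p = 1 / quad_eff_1080p   # ~1.33 invocations per useful pixel
print(work_per_pixel_720p / work_per_pixel_1080p)  # 1.25 -> 720p wastes ~25% more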

Vertex shader cost doesn't increase at all (if the same content is used and LOD calculation is based on distance). Thus all vertex bound draw calls (such as high polygon skinned characters) only have slightly higher cost on 1080p compared to 720p. There are also always some resolution independent stages (and setup stages) in all graphics engines. These stages have exactly the same GPU cost on 720p and 1080p.

Conclusion: 720p->1080p GPU cost is approximately in the 1.5x to 1.9x ballpark (when the content remains the same). Thus 720p @ 60 fps is usually slightly harder to achieve than 1080p @ 30 fps when the GPU is the bottleneck.
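
As a toy cost model of that conclusion (the 70/30 split between resolution-dependent and resolution-independent work is an assumed figure for illustration; the ~2x pixel cost scaling is the estimate from above):

# Toy per-frame GPU cost model in arbitrary "720p frame" units.
pixel_cost_720p = 0.7     # assumed resolution-dependent share of a 720p frame
fixed_cost = 0.3          # assumed vertex/setup/resolution-independent share
pixel_scale_1080p = 2.0   # ~2x pixel shader cost going 720p -> 1080p (not 2.25x)

cost_720p = pixel_cost_720p + fixed_cost                       # 1.0
cost_1080p = pixel_cost_720p * pixel_scale_1080p + fixed_cost  # 1.7 (in the 1.5x-1.9x range)

# Per-second GPU load: 720p @ 60 fps vs 1080p @ 30 fps.
print(cost_720p * 60)   # 60.0
print(cost_1080p * 30)  # 51.0 -> 720p60 is the heavier GPU load here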

CPU:
60 fps doubles the CPU cost of game logic and render setup (draw call overhead, animation, etc.). However, clever tricks can be used to reduce the update rate of some operations to every other frame (and to interleave the updates to balance the frame cost). But no matter how many tricks are used, the extra CPU cost is the biggest bottleneck to a 60 fps frame rate for most game genres (on consoles).
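
A minimal sketch of that interleaving trick (the system names are made up; the point is that the two 30 Hz buckets never land on the same frame):

# Interleave half-rate systems across alternating 60 Hz frames so the
# per-frame CPU cost stays balanced. All systems are empty placeholders.
def update_player_input(dt): pass   # full rate (60 Hz)
def update_camera(dt): pass         # full rate (60 Hz)
def update_ai(dt): pass             # half rate (30 Hz), even frames
def update_far_physics(dt): pass    # half rate (30 Hz), odd frames

def tick(frame_index, dt=1/60):
    update_player_input(dt)
    update_camera(dt)
    # 30 Hz systems get a doubled timestep since they run every other frame.
    if frame_index % 2 == 0:
        update_ai(dt * 2)
    else:
        update_far_physics(dt * 2)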
 