Digital Foundry Article Technical Discussion Archive [2010]

I think the fast, responsive controls provided by a roughly 60Hz game allow them to design a game that is incredibly fast and intense, whereas the same game design might struggle a bit with a less responsive control scheme. There are other things too, like the analog control being implemented perfectly for a fast arcade shooter. The amount of sensitivity you can get is unreal, and coupled with the responsiveness, they allow for some very frantic combat.
 
Ok, guys... if 60Hz is the reason for financial success, why aren't other devs jumping in and producing 60Hz games?!

1. 60 fps on its own is no guarantee
2. 60 fps is pretty difficult in itself for a start
3. 60 fps AND competitive graphics are pretty damn hard

And by the way, do you guys remember the discussion Insomniac had about how they studied the impact of 60Hz gameplay and found no correlation between financial success, review success and 60Hz framerate? I think there was even a thread here on the B3D forum about this topic...

Insomniac and Treyarch/IW are free to have different opinions on the matter ;)
 
1. 60 fps on its own is no guarantee
2. 60 fps is pretty difficult in itself for a start
3. 60 fps AND competitive graphics are pretty damn hard



Insomniac and Treyarch/IW are free to have different opinions on the matter ;)

Also:

4. 60 fps doesn't look as good in screenshots.
5. Trailers don't show off 60 fps; most sites don't even release video at that framerate.
6. Most other games simply use 30 and are fine for it, but a 60 fps product has to compete on looks as well as gameplay.

Anyway, FWIW, Nintendo is pretty much strictly 60 fps as well, IIRC. So there's another high-framerate, high-sales product to use as an example.
 
That's something I really respect about Nintendo. They understand what a console experience should feel like, and despite weak SD visuals, their games still have a sense of class. Do they ever drop below 60fps? I'd like to see a DF article on Wii's framerates across several games.
 
A few questions about Theater mode:

1) Why not limit it to 30 fps, since you need not worry about controller feedback? Also, internet video (and YouTube specifically) is 30 fps, and it would allow for a bit more 'bling' in the effects/higher res (potentially).
2) Why don't the screenshots have better AA applied?
3) Why is YouTube limited to 360p instead of 480p?
4) Why not allow straight 720p video to the HDD?
 
That's something I really respect about Nintendo. They understand what a console experience should feel like, and despite weak SD visuals, their games still have a sense of class. Do they ever drop below 60fps? I'd like to see a DF article on Wii's framerates across several games.
Well, the last Zelda was a 30fps game. I can't remember about the previous ones, though; it's been a long time since I played them.
 
You cannot compare different games feature-for-feature as a measure of engine efficiency. Game engines are way more complex than a list of features connected together. Seeing an effect in one game does not mean its absence in another game shows the devs aren't very good. This isn't about protecting the feelings of developers, but about using sound reason to form our opinions, and cross-game comparisons provide pretty much no insight whatsoever, so it's saving your own time to abandon that line of reasoning, and find answers through different channels.
Well said. I mean, the HD DVD player on XBox used all 6 threads and the GPU to 100%, and all it did was play movies... :)
 
Halo is big, but it hasn't got either the sales or the number of online players that the x360 version of the newest COD game has, at least as far as I know. It's a solid second place and offers a lot of unique and cool stuff, but it's still not as massive for some reason.

Aye, it appears that the COD series has most certainly supplanted Halo as THE online FPS game. And a large part of that is going to be the control and display latency or lack thereof compared to 30 fps games.

Average Joe FPS player may not realize why the game seems more fluid, accurate and easier to control, but experienced players, especially those who played FPS games competitively on PC, will most certainly notice straight away that it's not a 30 fps game, even if they weren't told.

Regards,
SB
 
I'm curious about some old comments you had about MLAA prototyping. Was it COD:BO that you were testing MLAA on?

This was on my old gig, which is unfortunately no more. I still did get a nice appreciation for MLAA which is why I still think it's the way of the future (especially next gen with more programmable hardware etc).
 
1. 60 fps on its own is no guarantee
2. 60 fps is pretty difficult in itself for a start
3. 60 fps AND competitive graphics are pretty damn hard

Exactly.
For the longest time I was on the fence, especially because most of my previous projects were 30fps games.
Recently though I changed my mind and now I do believe 60fps IS a significant competitive advantage for COD (and basically any other game with fast moving camera and/or objects).
It all happened after I read some of the research that was published at this year's SIGGRAPH, specifically the paper from the LucasArts guy on frame extrapolation from 30fps to 60fps. There's some related research by other people on the subject, pretty much confirming the same observations (the importance of eye tracking, motion blur as an effect of the eye rather than the camera, etc). Anyway, I recommend anyone take a look at the demo videos provided with the paper. The difference in fluidity when switching back and forth between 30/60fps is stunning, even though every other frame is fake!
It's a subconscious thing. As you guys say, regular Joe doesn't even know it's 60 fps, they just know it's smooth and feels better.
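For anyone curious what "every other frame is fake" means in practice, here is a minimal toy sketch of the general motion-vector extrapolation idea. It is not the actual technique from the SIGGRAPH paper; it assumes the renderer can hand over the last frame plus per-pixel screen-space motion vectors, and it ignores disocclusion holes entirely.

```python
# Toy sketch of motion-vector frame extrapolation (not the LucasArts method).
# Assumes per-pixel motion vectors in pixels/frame and ignores disocclusions.
import numpy as np

def extrapolate_frame(frame, motion, step=0.5):
    """Predict an in-between frame by pushing each pixel half a frame
    forward along its motion vector (nearest-neighbour scatter)."""
    h, w, _ = frame.shape
    out = frame.copy()                                    # crude hole fill: reuse the source frame
    ys, xs = np.mgrid[0:h, 0:w]
    tx = np.clip(np.round(xs + step * motion[..., 0]).astype(int), 0, w - 1)
    ty = np.clip(np.round(ys + step * motion[..., 1]).astype(int), 0, h - 1)
    out[ty, tx] = frame[ys, xs]                           # scatter pixels to their predicted positions
    return out

# Render at 30fps, display at 60fps by inserting one predicted frame per real frame.
frame  = np.random.rand(720, 1280, 3).astype(np.float32)  # stand-in for a rendered frame
motion = np.zeros((720, 1280, 2), dtype=np.float32)        # stand-in motion vectors
motion[..., 0] = 4.0                                       # e.g. the whole scene panning right
fake_frame = extrapolate_frame(frame, motion)
```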
 
While gamers like us may recognize and appreciate higher frame rates, Regular Joe doesn't know the difference between 60fps and 30fps. It's for this reason that I don't understand why IW or Treyarch don't lock the framerate down to 30fps and add in some new effects.

They don't know, but they can FEEL it. You would not be able to have such a fast FPS as MW2 without a high (around 50 fps) framerate.
 
Yes, the memory saving from turning MSAA off is a lot less than 18MB.
Plus, if you want to do MLAA you'll have to move your frame buffer to main memory, which is usually more scarce than video memory.
And don't forget that the cost of MLAA was quoted at 20ms on 1 SPU (for 720p, I believe; for COD it would be less). At any rate, that's quite a lot of SPU time for a 60fps game that needs to get everything done in 16ms.

No offence, I really appreciate your input here, but BO is not often a 60 fps game on the PS3. But thanks for taking the time with us!
 
No offence, I really appreciate your input here, but BO is not often a 60 fps game on the PS3. But thanks for taking the time with us!
Which shows they were already going over budget without MLAA! Squeezing in 20 more SPU-ms is clearly not as straightforward as some seem to feel.
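A quick back-of-the-envelope on that 20 SPU-ms figure, assuming the commonly quoted six SPUs available to game code (the exact split varies by title, so treat the numbers as illustrative):

```python
# Rough budget check on the 20 SPU-ms MLAA cost quoted above.
# Assumption: ~6 SPUs usable by the game; adjust to taste.
SPUS_AVAILABLE  = 6
MLAA_COST_SPUMS = 20.0   # quoted cost at 720p on a single SPU

for fps in (30, 60):
    frame_ms     = 1000.0 / fps
    total_budget = SPUS_AVAILABLE * frame_ms             # total SPU-ms available per frame
    share        = MLAA_COST_SPUMS / total_budget
    print(f"{fps}fps: {total_budget:.0f} SPU-ms per frame, MLAA takes {share:.0%}")

# 60fps: ~100 SPU-ms per frame, so MLAA alone eats ~20% of all SPU time
# (more than one whole SPU for the entire frame). At 30fps it is only ~10%,
# which is why MLAA is a far easier fit in 30fps titles.
```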
 
Halo is big, but it hasn't got either the sales or the number of online players that the x360 version of the newest COD game has, at least as far as I know. It's a solid second place and offers a lot of unique and cool stuff, but it's still not as massive for some reason.

On the 360 they've been like-for-like most of the time. A multi-platform title, however, gets more buzz, as everyone knows and talks about it even if they own a different platform, and the sales can get bigger, the marketing bigger, and so on. Lately COD is also doing much better story-wise, which is a big appeal to many; you wouldn't believe how many people still buy it for the offline experience.

Regardless, it is definitely also still a lot about brand. A lot of people I know wouldn't have bought or played Killzone 2 unless I had urged them to do so. So far there has been no exception: everyone who has actually played it was hugely impressed with the single-player campaign. Only the online players have tended to gravitate back to CoD.

Here, in contrast to what Billy Idol says, latency is in fact a big factor. Since latency gets large with online games anyway, everything you can shave off on the game's side brings the total closer to the ~150ms that grandmaster/DF's latency investigation concluded is the 'magic point' where people start complaining about response times. This matched the experience with Killzone 2, where latency averaged around 150ms, with spikes to 180ms even offline (at least before the later-patched option to improve response, which did help some). That is just about acceptable offline, but online it causes problems very quickly unless you have a very good connection. Killzone 3 brings it down to Halo levels, but that is about the best a 30fps game can do and still almost 0.4 higher than the best 60fps games manage (like Burnout Paradise). Still, it's good enough to keep it at or below the 'magic' 150ms mark on most decent connections, and it's excellent offline of course.
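As a rough illustration of how frame time stacks up into those end-to-end numbers, here is a small sketch; the three-frame pipeline depth, 30ms of display lag and 50ms of network round trip are made-up assumptions for the example, not DF's measured breakdowns:

```python
# Illustration of how frame time multiplies through a game's pipeline.
# Pipeline depth, display lag and network figures are assumptions, not measurements.
def end_to_end_latency_ms(fps, pipeline_frames=3, display_lag_ms=30, network_ms=0):
    frame_ms = 1000.0 / fps
    return pipeline_frames * frame_ms + display_lag_ms + network_ms

for fps in (30, 60):
    offline = end_to_end_latency_ms(fps)
    online  = end_to_end_latency_ms(fps, network_ms=50)   # hypothetical 50ms round trip
    print(f"{fps}fps: ~{offline:.0f}ms offline, ~{online:.0f}ms online")

# 30fps: ~130ms offline / ~180ms online; 60fps: ~80ms / ~130ms.
# The same pipeline depth costs twice the wall-clock time at 30fps, which is
# why 60fps titles can stay under the ~150ms mark even with network latency added.
```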

On a higher level, I think it is interesting to note that the borderline of a human being's best response time in terms of pressing a button after a signal is right in between 1/60th and 1/30th of a second (i.e. one frame of a 60fps game vs one frame of a 30fps game); for most people it is about 0.22 seconds, and it goes up as you get older, apparently. ;)

This goes hand-in-hand with research back from the CRT days, when displays would flicker: the borderline for most people to be able to detect the flicker was between 70 and 80Hz. Not everyone would notice without being trained, but they'd still get headaches or tire more quickly at 60Hz or lower refresh rates, particularly when the image you are looking at is mostly static and bright. It's basically this ability to detect movement and then respond that matters a lot.

For moving images, things get more complicated on the visual side, as your brain interpolates a lot, which is why you can do so much at 30fps by adding object-based motion blur to make things look incredibly smooth.
 
Which shows they were already going over budget without MLAA! Squeezing in 20 more SPU-ms is clearly not as straightforward as some seem to feel.
How do you know that's because of the CPU and not the GPU? Usually it's the GPU that's over budget on the PS3.

In any case, I don't think MLAA is a good idea for sub-720p, so Treyarch did the right thing. 720p shows acceptable results, but I think GOWAA would shine at 1080p. I actually think 1080p + MLAA would be close enough to "real" AA that it wouldn't be worth bothering with more computationally intensive AA methods.

A more interesting thought is how much difference there would be between true 60fps and the TFU2 motion blur method that didn't make it into the game, and whether it would be worth doing that instead of true 60fps in a shooter.
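On the resolution point: if MLAA's cost scales roughly with pixel count (an assumption, not a measured figure), the quoted 20 SPU-ms at 720p translates roughly as follows; the sub-720p resolution below is just an example, not Black Ops' actual framebuffer size:

```python
# Scaling the quoted 20 SPU-ms (720p) MLAA cost by pixel count -- an assumed,
# roughly linear model. The sub-720p resolution is an arbitrary example.
BASE_W, BASE_H, BASE_COST_SPUMS = 1280, 720, 20.0

def mlaa_cost_spums(width, height):
    return BASE_COST_SPUMS * (width * height) / (BASE_W * BASE_H)

for w, h in [(1024, 600), (1280, 720), (1920, 1080)]:
    print(f"{w}x{h}: ~{mlaa_cost_spums(w, h):.1f} SPU-ms")

# ~13.3 SPU-ms at 1024x600, 20.0 at 720p, ~45.0 at 1080p -- so 1080p + MLAA
# is attractive image-quality-wise but a much bigger slice of the SPU budget.
```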
 
It really is a shame that some devs can't release their SPU usage information (charts of which systems run within a frame, etc). I would love to see how much free SPU time is left over, or how few jobs are running on the SPUs, in CoD:BO! I also wonder about the streaming system in the PS3 version (whether it's basically the same as the 360's).

It seems as if this game is built to maximize all of the 360's strengths and almost none of the PS3's. Of course, that would yield results similar to what we see here.

Would definitely be interesting. I'm surprised, by the way, by Barbarian's comment about main memory generally being harder to spare, after all the complaints in the past about the split memory pool on the PS3 not leaving enough room for textures in graphics memory.
 
You cannot compare different games feature-for-feature as a measure of engine efficiency. Game engines are way more complex than a list of features connected together. Seeing an effect in one game does not mean its absence in another game shows the devs aren't very good. This isn't about protecting the feelings of developers, but about using sound reason to form our opinions, and cross-game comparisons provide pretty much no insight whatsoever, so it's saving your own time to abandon that line of reasoning, and find answers through different channels.

Of course it's just as hard to compare the same engine between two different maps or scenes, or different generations of engines, or engines with common ancestry, and so on.

This is of course the reason developers use profiling tools to get objective measurements about a scene's performance instead of querying idle speculators on forums.
 
Would definitely be interesting. I'm surprised, by the way, by Barbarian's comment about main memory generally being harder to spare, after all the complaints in the past about the split memory pool on the PS3 not leaving enough room for textures in graphics memory.

Are you sure you don't have it backwards? There's no reason textures can't go in non-graphics memory.
 
Halo is big, but it hasn't got either the sales or the number of online players that the x360 version of the newest COD game has, at least as far as I know. It's a solid second place and offers a lot of unique and cool stuff, but it's still not as massive for some reason.

It's because Halo is harder to play than COD (longer kill times etc.) and it doesn't reward levelling up as much as COD, which works on much the same principles that keep thousands of people addicted to WOW.

Being a player of both I know that COD feels more worthwhile to sink hours into because you keep unlocking new stuff (and not just a few pieces of armor like in Reach).

I actually had this conversation a while back with co-workers. The consensus was that we made it too easy for devs to get stuff working on our console, and now it's backfiring on us by making developers lead on PS3.

Oh are you with MS? That's pretty hilarious then.

It also has something to do with the widespread assumption that the PS3 is the far more powerful machine, so if a game looks better on 360 it means the developers are lazy - this leads to something of a self-fulfilling prophecy.

As is typical, it depends on the game. The simplest example is AI: what if you need a humongous data set resident in RAM at all times? What do you do, compromise AI to try to implement MLAA? The simplest solution is to lead on PS3; that way you can restrict all systems to its parameters and everything will automatically port perfectly to the other machine. For example, say your AI coder is all giddy one day about a new system he developed and wants to demo it to everyone. One of two things happens, depending on which platform he is working on (see the sketch after the two cases):

1) He demos it on 360. It looks great, but then he reveals that he needs 110MB reserved for his system at all times. Crap, now what do we do? It looks legit, but it simply won't work on PS3. Who in heck has 110MB of DRAM to spare on PS3? Sorry, but it needs to be pared back or scrapped.

2) He demos it on PS3. It looks good. More memory would improve it, make it more human-like, but hey, we don't have more memory, and this version will automatically work on the 360. So we go with it, because it's ready to go on all platforms.
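A toy illustration of the budget check driving those two scenarios; the free-memory figures are made up for the example, not real platform numbers:

```python
# Toy version of the "lead on the most constrained platform" logic above.
# The spare main-RAM figures are invented for illustration only.
FREE_MAIN_RAM_MB = {"x360": 180, "ps3": 90}

def fits_everywhere(required_mb):
    """A feature only ships if it fits the smallest (worst-case) budget."""
    return required_mb <= min(FREE_MAIN_RAM_MB.values())

print(fits_everywhere(110))   # False: scenario 1 -- the 110MB AI set gets pared back
print(fits_everywhere(80))    # True:  scenario 2 -- a PS3-sized version runs everywhere

# Leading on PS3 simply means every system is authored against min() from day one.
```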


Thanks for the insight Joker.

Any guesses on how big the AI dataset in a game like Halo Reach is?
And how do games like KZ2/3 deal with the issue on PS3 (since they have pretty good AI)?
 
I think owners of both consoles (like myself) would like to see multiplat titles exploit the strengths of each console and not pare back the tech or the content so it can run on both consoles to achieve the notion of parity - which I think is fundamentally misguided.

Sales would need to take a leap for that. A ton of quality games sell poorly, and AAA games (like a CoD) are on such tight schedules that the important thing is getting a stable, refined, quality experience out the door, not hitting every bell and whistle theoretically possible. And even if the money were available, it would have to be shown to the bean counters that the extra resources had a direct effect on sales.
 