Digital Foundry Article Technical Discussion Archive [2010]

Another common misperception of the issue. Resolution on its own has no impact on perceived aliasing.

What you're actually referring to is that as pixel density (PPI, pixels per inch) increases, some forms of aliasing become less noticeable. So, for example, a 480p image on a 10" screen (high PPI) would appear to have less aliasing than a 1080p image on an 80" screen (very low PPI). Put both onto the same size screen and the situation reverses: 480p on a 40" screen (low PPI) versus 1080p on a 40" screen (high PPI), and now the 1080p image shows less perceived aliasing.

It's incorrect to discuss PPI without taking viewing distance into account though. As you view something from further away, your eyes perform SSAA for you for free :) All I was saying is that a typical 1080p LCD TV viewed at a typical distance is already pushing the limits of the human eye's resolving power. Now, if you sit 3 ft away from your 46" TV or get a 100" projector screen, you'd obviously start noticing the individual pixels more, but those aren't typical cases. At a typical distance on a typical size screen, 1080p pixels are already small enough that aliasing doesn't become so noticeable, and MLAA should be enough to take care of what remains. IMHO, the next generation of consoles can get away with 1080p+MLAA rendering and don't need to budget for more traditional, computationally intensive forms of anti-aliasing; they should spend the GPU budget on other things.
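The back-of-the-envelope version of that claim can be checked with simple geometry. This is just an illustrative sketch; the 16:9 aspect ratio and the ~1 arcminute figure commonly cited for 20/20 acuity are my assumptions, not from the post:

```python
import math

def pixel_arcminutes(diagonal_in, horizontal_px, distance_in):
    """Angle subtended by one pixel, in arcminutes, assuming a 16:9
    screen. All lengths are in inches."""
    # Physical screen width from the diagonal: width = diag * 16 / sqrt(16^2 + 9^2)
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    pixel_in = width_in / horizontal_px           # physical size of one pixel
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

# A 46" 1080p TV viewed from 8 ft (96"): each pixel subtends roughly
# 0.75 arcminutes, under the ~1 arcminute acuity threshold.
print(round(pixel_arcminutes(46, 1920, 96), 2))
```

Move to 3 ft (36") and the same pixel subtends roughly 2 arcminutes, which is why individual pixels start to show up in the atypical close-viewing case.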
 
Technical discussion of DF's articles, currently looking at COD: Black Ops, remains here.

Technical discussion of multiplatform development is moved here.

Business discussion of multiplatform development and exclusives is removed, as it never ends well and doesn't fit with B3D's technical mantra.
My apologies for collateral damage.
 
Any guesses on how big the AI dataset on a game like Halo Reach is?

Based on how Halo 1's AI works, I'd say it isn't that big...

A large part of the system is included in the levels, in the form of invisible navigation nodes. This data is used to drive the larger scope of the battle, i.e. to mark the positions that the AI wants to try to defend or occupy.
The rest shouldn't take up too much memory and is fairly generic, as all kinds of enemies can be encountered in any kind of environment. As far as I know there aren't such things as preset moves or attack patterns based on Q&A. Variety and complexity are the result of emergent gameplay created with a few simple but robust systems.

Each AI agent also maintains a small local memory of where it last saw the player and so on. But I'd say the animation data for the enemies takes up more RAM than the actual AI.
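As a purely illustrative sketch of the two pieces described above — static navigation nodes baked into the level plus a tiny per-agent memory — here's roughly what the data could look like. None of these names come from Bungie's actual code; this is just to show why the per-agent footprint would be small:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class NavNode:
    """Invisible marker baked into the level data."""
    position: Tuple[float, float, float]
    role: str                 # e.g. "defend" or "occupy"

@dataclass
class AgentMemory:
    """Small per-agent state: where the player was last seen."""
    last_seen_pos: Optional[Tuple[float, float, float]] = None
    last_seen_time: Optional[float] = None

    def spot_player(self, pos, t):
        self.last_seen_pos, self.last_seen_time = pos, t

# The level supplies the strategic layer; each agent only carries a
# handful of fields, which keeps the AI's RAM footprint small.
nodes = [NavNode((12.0, 0.0, 4.0), "defend"),
         NavNode((30.0, 0.0, 9.0), "occupy")]
agent = AgentMemory()
agent.spot_player((15.0, 0.0, 5.0), t=31.5)
```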
 
I've been out for the weekend so hopefully this post isn't seen as OT, if so delete away. :p

Joes notice. It may not significantly impact their enjoyment or influence their purchases, but they can discern many things just like gamers do. They just don't care as much.

Sure they notice if you point it out to them, but even then it's hit or miss IMO. Of course I can only go by what I've experienced in my gaming life.

COD is by far the best-selling franchise in the world (well, I guess a few Nintendo franchises may compete...) and runs at 60 FPS (or close to it, depending on platform). While I've often agreed with you, there's no way they're going to do something risky and fundamentally change the game now. If the game's sales were faltering, then maybe.

What I see as the primary selling points for the series really wouldn't change much (if at all) moving the game from 60fps to 30fps.

Yeah, its never-ending appeal in multiplayer is certainly related to the low control latency. Even if players can't tell what the difference is, I'm sure they're aware that it's better than other games in some way.

See, I have a large group of friends who I would call mainstream gamers. They aren't really into the whole casual genre, but at the same time they only know of the high-profile titles coming out and what little info they care to know (not much). I've consistently tried to teach them the fundamental basics of game engines, with frame rates being a huge ongoing topic.

I know my evidence is only anecdotal, but from what I've experienced, most joe-gamers don't pay attention to frame rates and the benefits they may bring. In person and over Live, I've loaded up racers and shooters at 30fps and 60fps, and my friends were usually hard pressed to see or even feel a difference.

When asking my CoD-obsessed friends why they play CoD above other shooters, I don't believe the feel or responsiveness of the game ever came up. It was always about other factors that are more obvious.

Halo is big, but it hasn't got either the sales or the number of online players that the X360 version of the newest COD game has, at least as far as I know. It's a solid second place and offers a lot of unique and cool stuff, but it's still not as massive for some reason.

I think there are plenty of reasons why CoD has taken over as the shooter of choice for many gamers this gen:

-Last gen, few if any shooters matched Halo 2's basic online features; that's not the case this gen. Gamers now have more options than Halo if they want a quick, quality online experience.

-Every current gen system has launched with a quality CoD title, capturing mind-share early on, and building awareness through lack of competition at launch along with good word of mouth.

-While popular before, the series really shot up in the minds of gamers once the setting was changed to modern day. Now gamers have a modern-day military shooter that is high in quality but devoid of the complexities found in other military shooters.

-What I see to be the key pillars of CoD's design seem to appeal to the masses: solid gameplay that is easy to learn and offers quick, almost instant gratification, plus what I consider the greatest pillar of CoD's MP appeal: the carrot on a string pushing gamers to unlock the next weapon/item or reach the next rank.

---------------------------------------------

There are other reasons as well (Halo being the older series and CoD the newer kid on the block, for example), but I hope I explained my point well enough. :smile:

Yes, just think about it: if 60fps didn't give them some kind of edge, they probably would've abandoned it already in order to catch up to their competitors in other graphical features.

Not sure about Treyarch, but I'm pretty sure that early this gen I read quotes or comments from people at IW saying 60fps was the studio's standard. That could be the reason for sticking with 60fps throughout the series.

Also, as plenty of gamers (and publications) have praised the last few CoD titles for not just the gameplay but also the graphics, I'm not sure many people think they really need to catch up to the competition. The fact that they can remain competitive graphically while shooting for 60fps gives them even less reason to drop down to 30fps.

60fps for COD is a must. It's the same reason racing sims (Forza/GT) are 60fps: more responsiveness and a much smoother experience, which is very important for multiplayer. Other games don't necessarily need 60fps, and I'm fine with that.

I understand why 60fps would be needed for a racing sim, but not so much for a shooter, even a twitch based shooter like CoD.
 
How about a 60fps game where only your weapon and the crosshair are updated at 60Hz while the rest is updated at 30Hz? Sure, you'd have to render what's behind your weapon instead of culling it, but that's a very small cost compared to true 60Hz rendering. It'd be like a Move/Wii FPS game where your edge-detect turning box is equal to how much you could turn in a single 16.6 ms frame.
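A minimal sketch of that mixed-rate idea (the render functions are hypothetical stand-ins, not any real engine API): the world layer is re-rendered every other frame and reused in between, while the weapon/crosshair layer is drawn fresh every frame and composited on top.

```python
def render_world(frame):      # expensive full-scene pass (stand-in)
    return f"world@{frame}"

def render_weapon(frame):     # cheap foreground pass (stand-in)
    return f"weapon@{frame}"

def run(n_frames):
    """Simulate n 16.6 ms frames: world at 30 Hz, weapon at 60 Hz."""
    output, world_cache = [], None
    for i in range(n_frames):
        if i % 2 == 0:                 # world only refreshes on even frames
            world_cache = render_world(i)
        # Composite the cached world with a freshly drawn weapon layer.
        output.append((world_cache, render_weapon(i)))
    return output

frames = run(4)
# frames[1] reuses the world image from frame 0 but has a fresh weapon:
# ('world@0', 'weapon@1')
```

The weapon layer stays responsive at full rate while the full-scene cost is halved; the visible trade-off is that everything behind the weapon judders at 30Hz.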
 
Nope. DF articles are so much more professional than LOT's, with much more insight, so you can't go wrong. It's interesting to see that the PS3 dips more when the draw distance in the city is bigger, presumably because of more geometry to render, while the 360 seems to struggle more when the destruction kicks in.
 
DF seems to have missed some of the differences Lot found like the light beams in the church. They might have needed to spend a little more time comparing them.
 
Am I blind or is there a difference in motion blur during the horse riding scene in the city with the explosions?
 
DF seems to have missed some of the differences Lot found like the light beams in the church. They might have needed to spend a little more time comparing them.
Or perhaps it was a bug.
Am I blind or is there a difference in motion blur during the horse riding scene in the city with the explosions?
I thought I noticed more motion blur on the PS3 version but I didn't pay close attention to that scene. I'll watch it again.
 
DF seems to have missed some of the differences Lot found like the light beams in the church. They might have needed to spend a little more time comparing them.

Yes, DF seems to have glossed over some differences in shadowing between the two, and some differences in foliage rendering too; LOD also seems to be a bit more aggressive on PS3 in the cityscapes:
[Comparison screenshots: 360_007.jpg (Xbox 360) and PS3_007.jpg (PS3)]


It is interesting since the game had the PS3 as the lead console.

Furthermore, let us not forget that the PS3 is aided by a mandatory 4 GB install, whereas the 360 is running straight from disc and indeed can run even without a HDD. That is no mean feat.
 
That lighting bit was either a bug or perhaps drowned out by LOT's bad capture itself. The beams are way too bright and blurry, compounded by the QAA blur.

The shadow differences in the above shot are likely situational due to the position of the clouds.
 
I wonder if the increase in the PS3's mandatory HDD install to 4GB (compared to AC2's 2GB) is supposed to improve the game's performance. I mean, maybe there are no serious improvements to the engine itself, just Ubisoft using more HDD space to push down some of the performance issues shown in the past two games.

Or is it just more HDD space being used because of Brotherhood's bigger scale compared to its predecessor?
 
Quincunx and gamma aside, the two games are effectively a match, but this is nothing new, as the differentiating factor with the AC titles has always been the performance. In the original Assassin's Creed, performance wasn't spectacular on 360 but was positively woeful on PS3. The sequel showed little to no improvement on the Microsoft platform, but a lot of work had clearly been carried out on the PS3 engine – 360 still commanded the overall advantage, but in several places the Sony console outperformed it.

Where did AC2 PS3 outperform the 360 version? The video it links to doesn't show any situation where the PS3 does better.

Still, really impressive how far this engine has come on PS3, while still adding a ton of new stuff to it. The original was a real chore to play because of the performance.
 