Digital Foundry Article Technical Discussion Archive [2011]

If it switches as you get closer, doesn't that mean it's not likely to be a bug but just an LOD transition based on proximity?

More than likely, yeah. It seems plausible that they would tweak geometry LOD if they felt that it was a bottleneck for longer distances.

:oops: More work?! You realise most mods are ex-devs kicked out from game development because we were too lazy to even be lazy devs, right? ;)

And ex-devs have high Rants Per Minute. :D

Maybe it's a LoD issue that relates to memory?

For LOD to work (as opposed to, say, a dynamically tessellated mesh), you need the models loaded into memory; otherwise there's going to be a delay, much like if you were trying to stream higher-LOD textures (basically).
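To illustrate the point about pre-loaded LODs switching instantly, here is a minimal hypothetical sketch of distance-based LOD selection. All names and thresholds are invented for illustration; no real engine is being described.

```python
# Hypothetical sketch: choosing a pre-loaded LOD mesh by camera distance.
# The threshold distances are illustrative, not from any real engine.

def select_lod(distance, thresholds):
    """Return the index of the LOD to draw: 0 = highest detail.

    thresholds is an ascending list of switch distances. Every LOD mesh
    is assumed to already be resident in memory, so the switch is
    instantaneous, with no streaming delay.
    """
    for level, max_dist in enumerate(thresholds):
        if distance <= max_dist:
            return level
    return len(thresholds)  # beyond the last threshold: lowest detail

# e.g. switch at 5 m, 20 m, 60 m
print(select_lod(2.0, [5, 20, 60]))   # close-up: LOD 0
print(select_lod(30.0, [5, 20, 60]))  # mid-range: LOD 2
```

The key property is that the selection itself is trivial; the hard part, as the thread discusses, is keeping the right meshes resident in memory.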
 
If it switches as you get closer, doesn't that mean it's not likely to be a bug but just an LOD transition based on proximity?

The body is high poly and the head low... the head is switching between low and high when I'm 2 meters from the guy...

[image: 6234c9bftw1dmhlksn5krj.jpg]


This guy on the right on PS3 is high poly, even the head...
 
For LOD to work (as opposed to, say, a dynamically tessellated mesh), you need the models loaded into memory; otherwise there's going to be a delay, much like if you were trying to stream higher-LOD textures (basically).

Not just a delay: if you didn't have the room, they'd never load up! At least not until you'd made room by dumping something else based on, for example, moving closer. That seemed to fit with what RDK had seen, so I was offering it as a possibility. You'd expect them to take zooming into account, but in a multiplatform game running on systems with different memory quantities, who knows what choices the system managing assets might make.

That was my line of thinking anyway.
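The eviction idea described above can be sketched in a few lines. This is a purely hypothetical toy, a least-recently-loaded cache with a fixed byte budget; the class name, asset names, and sizes are all made up for illustration.

```python
# Hypothetical sketch: before a higher-LOD asset can stream in, something
# else must be dumped from a full memory budget. Sizes are illustrative.
from collections import OrderedDict

class AssetCache:
    def __init__(self, budget):
        self.budget = budget          # bytes available for streamed assets
        self.used = 0
        self.assets = OrderedDict()   # name -> size, oldest first

    def load(self, name, size):
        # Evict oldest-loaded assets until the new one fits.
        while self.used + size > self.budget and self.assets:
            _, evicted_size = self.assets.popitem(last=False)
            self.used -= evicted_size
        if self.used + size > self.budget:
            return False              # can't fit even after evicting everything
        self.assets[name] = size
        self.used += size
        return True

cache = AssetCache(budget=100)
cache.load("head_low", 30)
cache.load("body_high", 60)
cache.load("head_high", 50)  # evicts head_low, then body_high, to make room
```

On a platform with a bigger budget the third load would evict nothing, which is one way the same code could produce visibly different LOD behaviour across consoles with different memory quantities.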
 
Not just a delay: if you didn't have the room, they'd never load up! At least not until you'd made room by dumping something else based on, for example, moving closer. That seemed to fit with what RDK had seen, so I was offering it as a possibility. You'd expect them to take zooming into account, but in a multiplatform game running on systems with different memory quantities, who knows what choices the system managing assets might make.

That was my line of thinking anyway.

I don't quite understand what you mean here... are you saying it's definitely not a bug or a glitch? Why? That's one way of looking at it, but to me it seems pretty much impossible to establish with certainty without knowing how this scene works through code debugging...
 
As far as I know that's how MLAA works; just explain to me what you mean by MLAA. I'm probably talking about Sony's MLAA method, I imagine; I could be wrong.

I could be the one who's wrong, but I'm pretty sure MLAA (even Sony's method) does not add extra triangles.
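That point about MLAA is worth illustrating: it is a post-process that operates on the finished colour buffer, looking for discontinuities between pixels and blending along them; geometry is never touched. Here is a toy sketch of just the first stage (discontinuity detection), with a made-up threshold and a tiny grayscale "image", not Sony's actual SPU implementation.

```python
# Toy sketch of MLAA's first stage: finding colour discontinuities in an
# already-rendered image. Input and output are just pixels; no vertices
# or triangles are involved. The threshold value is illustrative.

def find_edges(image, threshold=0.1):
    """Mark discontinuities between vertically adjacent pixels."""
    edges = set()
    for y in range(len(image) - 1):
        for x in range(len(image[0])):
            if abs(image[y][x] - image[y + 1][x]) > threshold:
                edges.add((x, y))  # edge between row y and row y+1
    return edges

# 1.0 = white, 0.0 = black: a hard horizontal edge between rows 1 and 2
frame = [[1.0, 1.0, 1.0],
         [1.0, 1.0, 1.0],
         [0.0, 0.0, 0.0]]
print(sorted(find_edges(frame)))  # [(0, 1), (1, 1), (2, 1)]
```

Real MLAA then classifies these edges into patterns (L, Z, U shapes) and blends pixel colours along them, but the whole pipeline stays in image space.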

I don't quite understand what you mean here... are you saying it's definitely not a bug or a glitch? Why? That's one way of looking at it, but to me it seems pretty much impossible to establish without knowing how this works through code debugging...

He's trying to say that if they were streaming the LOD into memory, something else would have to be dumped from that memory in order to fit the newly streamed LOD asset.
 
I could be the one who's wrong, but I'm pretty sure MLAA (even Sony's method) does not add extra triangles.



He's trying to say that if they were streaming the LOD into memory, something else would have to be dumped from that memory in order to fit the newly streamed LOD asset.

Ah ok, never mind, my fault. I thought he was maintaining that this kind of thing isn't a bug or a glitch, but he was just explaining what's going on; my apologies. This problem was probably missed during code debugging, but I imagine it's impossible to fix everything with this huge amount of code. Maybe a specific algorithm that detects when memory is full and tries to preserve the visual details most important to the camera's perspective, while downgrading negligible ones, could be helpful, but I imagine that's easier said than done...
 
He's trying to say that if they were streaming the LOD into memory, something else would have to be dumped from that memory in order to fit the newly streamed LOD asset.

Yeah that's what I was getting at. I was offering as a possibility that the higher LOD assets weren't in memory yet on PS3 while they were on systems with more memory - they would enter memory (and therefore be available to be shown on screen) once the player had moved closer to the models and the contents of memory had been altered accordingly. Perhaps keeping LOD loading/transitions a little closer in parts of the game where memory has become a little tight.

This is just a suggestion though, and AlStrong may well be more on the money when he talks about geometry processing bottlenecks.

Ah ok, never mind, my fault. I thought he was maintaining that this kind of thing isn't a bug or a glitch, but he was just explaining what's going on; my apologies. This problem was probably missed during code debugging, but I imagine it's impossible to fix everything with this huge amount of code. Maybe a specific algorithm that detects when memory is full and tries to preserve the visual details most important to the camera's perspective, while downgrading negligible ones, could be helpful, but I imagine that's easier said than done...

It could be a bug for all I know, but I can imagine situations where it's about load balancing of some form or another.

An algorithm that could determine the optimal use of memory across multiple platforms (with varying amounts of memory) could be a great bonus to a multiplatform game, especially if you could just put the same assets on disc and throw them at any machine. I have no idea how you'd do this though, and there could be times when parity between platforms simply wasn't possible as a result of trying to use each platform optimally.
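One purely hypothetical way such a scheme could work: score each asset's visual importance for the current camera, then greedily fill whatever memory budget the platform offers, so the same disc assets degrade gracefully on the smaller machine. Every name, size, and score below is invented for illustration.

```python
# Hypothetical sketch: the same asset list evaluated against two different
# platform memory budgets. Assets are kept highest-importance-first until
# the budget is exhausted; all values are made up for illustration.

def pick_assets(assets, budget):
    """assets: list of (name, size, importance). Returns names kept resident."""
    chosen, used = [], 0
    for name, size, _ in sorted(assets, key=lambda a: -a[2]):
        if used + size <= budget:
            chosen.append(name)
            used += size
    return chosen

assets = [
    ("hero_head_high", 40, 0.9),  # close to camera: high importance
    ("hero_body_high", 50, 0.8),
    ("crowd_high",     60, 0.3),  # far away: first candidate to drop
]

print(pick_assets(assets, budget=150))  # roomy platform: everything fits
print(pick_assets(assets, budget=100))  # tight platform: crowd_high dropped
```

The same asset list produces different resident sets per platform, which is exactly the kind of divergence that would make pixel-for-pixel parity impossible while still being "optimal" on each machine.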
 
Six years in, and we're just seeing more and more that the two consoles are relatively equal, with some minor advantages and disadvantages for each. Seems the multiplatform devs are able to produce tech-heavy games that are up there with the best.

Cross platform developers shoot for parity as much as possible. Of course they (talented developers) will be close after 6 years. ^_^

Good to see that the PS3 version shows its edge though. Lighting and post processing effects (e.g., MLAA) should be better with the SPUs helping in rendering. Here we hit the DVD limit again.

Early in the gen, everyone, myself included, thought the first parties would eventually be able to put out tech that the multiplatform devs couldn't reproduce. It's been pretty interesting to see, through Digital Foundry, how parity (roughly) has become more the norm, and the 3rd parties are pushing the systems just as well as anyone else.

It seems that first parties focused on post processing last year, and stereoscopic 3D this year. Does BF3 support 3D?

I just don't think it's all about doing things that cross platform developers can't reproduce. They should share tech with third parties. It's about integrating all the tech together to make a fun and impressive game. I suspect U3 will be able to squeeze more out of PS3.

BTW, is it confirmed that U3 doesn't use Havok?

EDIT: Hmm... I see the Havok logo at the back of the U3 case.
 
Good to see that the PS3 version shows its edge though. Lighting and post processing effects (e.g., MLAA) should be better with the SPUs helping in rendering. Here we hit the DVD limit again.

Oops I didn't know I loaded into Gamefaqs. :LOL:
 
DICE put in a lot of effort in the PS3 version, of course I'm happy they get good results. Also extremely happy that the final product is better than the beta, which people were so concerned about.

We were talking about "Early in the gen, everyone, myself included, thought ..."

In the early days, many people didn't believe developers would run into the DVD storage limit. Sony developers threw in multi-lingual audio tracks and other assets to use up the space. Only recently did third parties like id and DICE run into the DVD storage limit.
 
That is a strawman, and not a very good one, considering the amount of digital ink spilled on these very pages. I don't think anyone who was informed thought that no game could ever exceed the DVD limits, seeing as last-gen titles could already do this depending on the genre. I don't know who these many crazy people you speak of are; do they post here??

No, the articulated issue was what kind of games would push the limits (some games are much more prone to push the limits than others) and at what cost (content creation). The general argument -- and might I add the one that has proven correct by a landslide -- is that between cost/genre considerations very few games pushed against the limit, and in most cases reasonable solutions were found (disc spanning, HDD installs, etc). And this is even allowing for how long this generation has gone (Holiday #7 right around the corner).

The number of games that are multidisk is a small fraction of the total market and it is a rare game that required more space than a single DVD but also couldn't find a solid resolution such as disk spanning.

DVD space is now becoming a much more looming issue for multiplatform developers for a multitude of reasons. Which, not ironically, is one of the natural pressures indicating that new hardware is needed.
 
That is a strawman, and not a very good one, considering the amount of digital ink spilled on these very pages. I don't think anyone who was informed thought that no game could ever exceed the DVD limits, seeing as last-gen titles could already do this depending on the genre. I don't know who these many crazy people you speak of are; do they post here??

No, the articulated issue was what kind of games would push the limits (some games are much more prone to push the limits than others) and at what cost (content creation). The general argument -- and might I add the one that has proven correct by a landslide -- is that between cost/genre considerations very few games pushed against the limit, and in most cases reasonable solutions were found (disc spanning, HDD installs, etc). And this is even allowing for how long this generation has gone (Holiday #7 right around the corner).

The number of games that are multidisk is a small fraction of the total market and it is a rare game that required more space than a single DVD but also couldn't find a solid resolution such as disk spanning.

DVD space is now becoming a much more looming issue for multiplatform developers for a multitude of reasons. Which, not ironically, is one of the natural pressures indicating that new hardware is needed.


I'm kind of surprised there's not much consumer backlash.

By and large few seem to care that increasing numbers of 360 titles have used multiple discs recently, outside of the usual fanboy stuff.

I really didn't hear much complaining at all over BF3's use of multiple discs.

Ironically one oddball negative factor I recently discovered is that for some reason Redbox has a policy of not renting multi-disc games. As such the PS3 version of BF3 was available at their kiosks but not the 360 one. And I was seriously jonesing to rent and blast through the allegedly short campaign.
 
DICE put in a lot of effort in the PS3 version, of course I'm happy they get good results. Also extremely happy that the final product is better than the beta, which people were so concerned about.

We were talking about "Early in the gen, everyone, myself included, thought ..."

In the early days, many people didn't believe developers would run into the DVD storage limit. Sony developers threw in multi-lingual audio tracks and other assets to use up the space. Only recently did third parties like id and DICE run into the DVD storage limit.

Let's put this another way, do you think the 360 version of Dark Souls is superior due to the slight blur filter in the ps3 version?
 
No, the articulated issue was what kind of games would push the limits (some games are much more prone to push the limits than others) and at what cost (content creation). The general argument -- and might I add the one that has proven correct by a landslide -- is that between cost/genre considerations very few games pushed against the limit

I think actually very many games pushed against this limit, and there is plenty of evidence of it too for those who care to look (crap compression rates for video in many games, having different language versions for different countries, many games using up the available disc space, etc.). A lot of games decided to hold themselves to this limit, not just because of content generation, but because not targeting the 360 was becoming too costly for developers, even for some Japanese developers.

Rockstar very early on was one of the few multi-platform developers who dared to publicly talk about the matter and already indicated they knew that Microsoft was working on solutions to overcome the DVD size-limit.

and in most cases reasonable solutions were found (disc spanning, HDD installs, etc). And this is even allowing for how long this generation has gone (Holiday #7 right around the corner).

But these solutions came much earlier than just this Holiday. Forza 3 was my first game that had a two disc setup that if you didn't install the second one, you'd lose out on some of the game's content (and not small bits).

DVD space is now becoming a much more looming issue for multiplatform developers for a multitude of reasons. Which, not ironically, is one of the natural pressures indicating that new hardware is needed.

I reckon we will never agree on this subject. I think Sony was right in thinking BluRay is the right size for this generation, especially if you expect to make games for it for 10 years. I don't actually think that can so easily be disputed as you made it sound.

I also don't think Microsoft ever believed differently. The question they were forced to ask themselves was "So we've come to the conclusion that basically HD DVD is not an option for us if we want to meet our two crucial requirements of getting a head-start, and being affordable. Can we get away with just having a DVD drive again?" Just like they had done with an optional or attachable HDD after suffering the fixed cost of it last gen against the HDD-less PS2. The decisions may even have been linked (e.g. we need the HDD if we get a slower HD DVD drive).

They answered the question with "Yes", and they were right, more than Sony ever even imagined, and Microsoft gained significant market-share from Sony, and may now have a permanent first place in the U.S. (of course there were more reasons).
 
Cross platform developers shoot for parity as much as possible. Of course they (talented developers) will be close after 6 years. ^_^

People say that now, but that was not the expectation early this gen, when people thought the PS3 would run roughshod over the Xbox 360. Early on, most multiplatform games heavily favoured one console or another, which made comparisons interesting. Now, they are essentially nitpicking differences, making the comparisons less than interesting.

Good to see that the PS3 version shows its edge though. Lighting and post processing effects (e.g., MLAA) should be better with the SPUs helping in rendering. Here we hit the DVD limit again.

If by showing its edge you mean negligible differences, as is the case with 99% of multiplatform releases, then ok. Lighting, post processing are relatively the same on both releases.


It seems that first parties focused on post processing last year, and stereoscopic 3D this year. Does BF3 support 3D?

I just don't think it's all about doing things that cross platform developers can't reproduce. They should share tech with third parties. It's about integrating all the tech together to make a fun and impressive game. I suspect U3 will be able to squeeze more out of PS3.

BTW, is it confirmed that U3 doesn't use Havok?

EDIT: Hmm... I see the Havok logo at the back of the U3 case.

I'm not sure what this is supposed to mean, or how it relates to my post. The expectation this gen was that the first party titles and exclusives would be able to take advantage of the strengths of the platforms and showcase the systems beyond the multi-platform releases. It hasn't really happened. There are many multi-platform titles that are just as impressive in their own right.
 
DICE put in a lot of effort in the PS3 version, of course I'm happy they get good results. Also extremely happy that the final product is better than the beta, which people were so concerned about.

We were talking about "Early in the gen, everyone, myself included, thought ..."

In the early days, many people didn't believe developers would run into the DVD storage limit. Sony developers threw in multi-lingual audio tracks and other assets to use up the space. Only recently did third parties like id and DICE run into the DVD storage limit.

Again, I have no idea how your post relates to what I wrote. Everyone knew DVDs would not be sufficient and multi-disc releases would be necessary. The "..." in that quotation, which is the entire context and meat of the statement, was that people thought first party titles would be "superior" to third party titles, and it has not turned out to be true. The multi-platform releases are pushing the systems just as hard as anyone else. Parity has become the norm, because the systems have turned out to be relatively equal. They have different strengths and weaknesses, but the devs have figured out how to leverage them in different ways. I was trying to point out how impressive it is that DICE was able to make a game look nearly identical on two platforms, with significantly different hardware, and with a game that is pushing visuals and tech as heavily as anything else.
 

They still seem to have put effort into optimising the engine for PC. I'm pleasantly surprised I can run SP with ultra/max settings with SSAO/FXAA at 1680x1050 and yet have it almost all the time at 30-60fps, and in many parts be near or at 60fps even during firefights. That's on a stock 4890 and E8400 dual-core. A helluva lot more optimised than BFBC2. Something like the 8800GTX and a 6850 dual-core could certainly do as well at lower res, and that is 2006-era hardware.
 
I thought Sony or some other dev said before that if you optimise a game to run well enough on the RSX, a lot of the work is already done for the PC. Like CD using the PS3 code for the PC version of Tomb Raider: Underworld.
 
IMO you're wasting your time there, Scott. I'm still waiting for him to answer my one simple question, but it seems like he's disappeared from this thread for now. :LOL:
 