Digital Foundry Article Technical Discussion Archive [2011]

Status
Not open for further replies.
I should say it's more washed out in that particular screen rather than "stronger". You want the bloom to be bright near the light source and not bleed out too much from it. Actually, Bungie has some screenshot comparisons in their HDR lighting presentation of their attempts to reduce bloom and lighting washout due to "insufficient precision". Might want to have a peek at it.

Maybe Laa-Yosh would care to comment since he'd know more about it than I do. :p

I'm at work, so I can't see any of the pictures you're referring to; maybe when I look at them at home, it will all make more sense to me. I do think I understand what you mean by the washed-out effect, though.

Now that you mention it, I do recall the Bungie slides. I'll have to take another look at it when I get home.

I would be interested to read Laa-Yosh's comments on it if he posts though. :D Edit: Never mind, saw his post above mine.
 
Kind of weird that the result came out this way, with the PS3 version slowing down on physics and the 360 version slowing down on the fog. Shouldn't it be the other way around?

Even more odd is the fact that both titles are apparently implemented with Sony's PhyreEngine middleware.

The middleware has been available for free since day one, and some multiplatform games like GripShift used it ages ago. I don't think it's a full engine like UE3, but it should get devs started quickly.

The developers, available resources and their design goals will usually determine the outcome. Tools and h/w come into play when all the factors are aligned. I think content plays a huge part too.

I suspect in Dark Souls' case, the PhyreEngine particle tool generates the code (via a UI tool, according to the PhyreEngine slides). The developers probably didn't have time to hand-optimize it later. As for physics affecting PS3 visuals, it may be because the developers schedule both physics and graphics tasks on the same pool of SPUs without segregating them. It's a general worker-pool model (simple and general, instead of a highly custom setup), likely based on SPURS? A dedicated setup would require lots of time to tune to avoid stalling any SPUs or the RSX.
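The general-workers-pool idea can be illustrated with a tiny scheduling toy. This is not SPURS or PhyreEngine code, and the job counts and costs are invented; it just shows how a physics spike on a shared pool can push graphics work late, while dedicated pools isolate it:

```python
# Hypothetical toy, not SPURS/PhyreEngine: greedy scheduling of mixed
# physics ("phys") and graphics ("gfx") jobs onto worker pools.

def run_shared_pool(jobs, num_workers):
    """Any idle worker takes the next job, regardless of type.
    Returns the time the last graphics job finishes."""
    workers = [0] * num_workers            # time each worker becomes free
    last_gfx_done = 0
    for kind, cost in jobs:
        w = min(range(num_workers), key=lambda i: workers[i])
        workers[w] += cost
        if kind == "gfx":
            last_gfx_done = max(last_gfx_done, workers[w])
    return last_gfx_done

def run_dedicated(jobs, gfx_workers, phys_workers):
    """Same jobs, but graphics and physics each own their workers."""
    gfx, phys = [0] * gfx_workers, [0] * phys_workers
    for kind, cost in jobs:
        pool = gfx if kind == "gfx" else phys
        w = min(range(len(pool)), key=lambda i: pool[i])
        pool[w] += cost
    return max(gfx)

# A heavy physics spike lands in the middle of the frame's job list:
frame = [("gfx", 1)] * 4 + [("phys", 6)] * 4 + [("gfx", 1)] * 4
print(run_shared_pool(frame, 4))   # 8: physics delays the last gfx job
print(run_dedicated(frame, 2, 2))  # 4: gfx untouched by the spike
```

The shared pool keeps every worker busy (good average throughput, simple to set up) at the cost of latency spikes on the graphics side, which matches the "simple and general instead of highly custom" trade-off described above.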

If the developers intend to keep the versions on par, they can usually achieve it, give or take a little. PhyreEngine 3.0 can run on Vita now; in due time, we may well see the Vita running this game "on par" with the PS3 and 360 too.

I remember in one of the cross-platform racing games, the developers used only three SPUs for rendering, according to their tech slides. Since it achieved parity (within reasonable bounds), project-wise it may be wiser to tackle other issues.


EDIT: For the BF3 beta analysis, I am curious about MLAA vs FXAA. *If* there is no visible difference for this game, then it may be interesting to move to FXAA on PS3. I remember BF3 follows PhyreEngine's MLAA: the SPUs do the calculations while the RSX performs the blending. Doing FXAA on RSX alone takes about 1.2ms, which should free some SPU cycles for other purposes. But it's hard to say without knowing the full picture.
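As a back-of-the-envelope on that trade-off: only the 1.2ms RSX FXAA figure comes from the post above; the MLAA costs below are invented purely for illustration.

```python
# Illustrative frame-budget arithmetic; the MLAA costs are assumptions,
# only the 1.2 ms FXAA-on-RSX figure is from the discussion above.
FRAME_MS = 1000.0 / 30.0   # ~33.3 ms budget at 30 fps

mlaa_spu_ms = 3.0          # assumed SPU time for MLAA edge/blend math
mlaa_rsx_ms = 0.3          # assumed cheap RSX blend pass
fxaa_rsx_ms = 1.2          # figure quoted above for FXAA on RSX alone

spu_freed = mlaa_spu_ms                # SPU time released by switching
rsx_added = fxaa_rsx_ms - mlaa_rsx_ms  # extra RSX time taken on
print(f"free {spu_freed:.1f} ms of SPU for {rsx_added:.1f} ms more RSX "
      f"({rsx_added / FRAME_MS:.1%} of the frame)")
```

Whether that trade is worth it depends entirely on whether the RSX or the SPUs are the bottleneck in a given scene, which is the "full picture" caveat above.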
 
I'd love to hear how Rage performs on PS3 with an SSD. Would also be nice to learn why PS3 doesn't have a full install option.
 

From what Carmack said, it's because Sony imposes some limitations on what you can install, or how much HDD space you can use for it.
 
Hmm, it's pretty interesting how some surfaces consistently close to the camera, like the buggy's dashboard, will only display a lower MIP level on the X360 compared to the PC, while others, like that old dude with the laser scanner, have the same detail. Or how the PS3 has a lower MIP with the old guy.
I wonder what controls or constrains this...

Oh and also strange to see how much aliasing is still there on the PC despite the 8xAA.
 
From what Carmack said, it's because Sony imposes some limitations on what you can install, or how much HDD space you can use for it.

Yeah, I think the 8GB limit is imposed on id.

Carmack also said that, in hindsight, they might have wanted to load the high-res textures directly from the HDD instead of loading the low-res textures first. The latter may have taken some space and time away.

Would be interesting to compare the differences between DICE's approach and id's. Based on Carmack's presentation, his framework starts from an abstract, clean, high-level concept that is then mapped to the PS3 under various constraints. The final PS3 code became a complex mix of workarounds for hardware bottlenecks. The PC version is clean and nice.

I wonder what difference it would make if he had started from the hardware constraints and strengths, and designed the streaming system from the ground up. That's why I've always wanted to see DICE's streaming solution after they presented their SPU-based rendering system (how they designed it).
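The difference between the two loading orders can be sketched with a toy timing model. All the read times below are invented; this is not id's pipeline, just the shape of the trade-off:

```python
# Toy model of "low-res first, then high-res" vs "high-res direct from HDD".
# All read times are invented; only the shape of the trade-off matters.

def low_res_first(low_ms, high_ms):
    """Stage the low-res page, show it, then stream the high-res page."""
    time_to_first_texel = low_ms
    time_to_full_detail = low_ms + high_ms   # extra read delays full detail
    return time_to_first_texel, time_to_full_detail

def high_res_direct(high_ms):
    """Read the high-res page straight from the HDD."""
    return high_ms, high_ms

print(low_res_first(5, 20))    # (5, 25): something on screen fast
print(high_res_direct(20))     # (20, 20): full detail sooner, less I/O
```

Low-res-first hides pop-in at the cost of extra reads and a later arrival of full detail, which is consistent with the hindsight remark above.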
 
DICE probably doesn't have nearly as much texture data in BF3. Much as I like BF3's looks, the completely unique texturing stuff is, well, unique to id for now.
 
PC being clean and nice is a bit... well...

The "highest res" textures are... abysmal in MANY cases. There are things in this game which receive less texture detail than PS1 games. And I am NOT kidding here; in parts, it is this extreme. And it's not my PC either, as I've got quite a good one, I have the most recent Rage performance drivers, and... yes, I did my homework.

I guess unique textures make the game too big. But making this game look like it should on a high-end PC would probably take three Blu-rays instead of three DVDs.

Still, the game does look pretty good, as long as you don't fix your eyes onto something. In movement, everything is "perfect" (except some of the smaller bugs still present).

It's also the first game that really maxes out my CPU. My GPU, otoh, is "idling around" most of the time, even at 8xAA.
 
patsu said:
Yeah, I think the 8GB limit is imposed on id.

Carmack also said that, in hindsight, they might have wanted to load the high-res textures directly from the HDD instead of loading the low-res textures first. The latter may have taken some space and time away.

Would be interesting to compare the differences between DICE's approach and id's. Based on Carmack's presentation, his framework starts from an abstract, clean, high-level concept that is then mapped to the PS3 under various constraints. The final PS3 code became a complex mix of workarounds for hardware bottlenecks. The PC version is clean and nice.

I wonder what difference it would make if he had started from the hardware constraints and strengths, and designed the streaming system from the ground up. That's why I've always wanted to see DICE's streaming solution after they presented their SPU-based rendering system (how they designed it).

Are we sure there is actually more than 8GB of data worth streaming? The absolute max is about 12GB if I take off one disc for MP, 2GB for sound, and assume no duplication of data on discs 1 and 2. So not that much difference there, I suspect. Personally, I think encryption imposes at least some burden and makes the HDD slower on the PS3, but the memory structure could have hurt the PS3 also. I don't really buy that Sony arbitrarily put the limit at 8GB if 12GB would have been the absolute max. An SSD does seem to help the PS3 version to the point that texture pop-in all but disappears.
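For what it's worth, the ~12GB ceiling above reconstructs roughly as follows. The ~7GB usable-per-disc figure is my assumption; the rest follows the post:

```python
# Rough reconstruction of the "about 12GB max" estimate above.
usable_per_disc_gb = 7.0   # assumed usable payload of a 360 DVD
sp_discs = 3 - 1           # three discs, one set aside for multiplayer
audio_gb = 2.0             # sound budget per the post
streamable_gb = sp_discs * usable_per_disc_gb - audio_gb
print(streamable_gb)       # 12.0
```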
 
The SSD option is superb. We need a way to get SSD performance into a budget package, say a 16 GB unit, as a cache for next-gen. Virtual textures and meshes could be streamed very nicely from there.
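The "small SSD as a streaming cache" idea is essentially a page cache in front of a slow disc. A minimal sketch with LRU eviction and invented sizes; nothing here is a real console API:

```python
# Hypothetical LRU page cache in front of a slow optical read.
from collections import OrderedDict

class PageCache:
    def __init__(self, capacity_pages):
        self.capacity = capacity_pages
        self.pages = OrderedDict()        # page_id -> True, in LRU order
        self.hits = self.misses = 0

    def fetch(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)   # refresh LRU position
            self.hits += 1
            return "ssd"                      # fast path
        self.misses += 1
        if len(self.pages) >= self.capacity:
            self.pages.popitem(last=False)    # evict least-recently-used
        self.pages[page_id] = True
        return "disc"                         # slow path, now cached

cache = PageCache(capacity_pages=3)
for p in [1, 2, 3, 1, 2, 4, 1]:
    cache.fetch(p)
print(cache.hits, cache.misses)   # 3 4
```

Hot pages stay on the fast tier and the disc is only touched on a miss, which is exactly where streamed virtual texture and mesh pages would benefit.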
 
I think the PC version uses the same assets (installation is about the same size as on 360). And the VT directory is 11.4GB in size (mp is in a different directory altogether and can be deleted if you don't care about it).

I am actually a bit amazed that the streaming on PS3 from disc AND HDD doesn't work better (haven't watched the DF videos yet, as I'm surfing via UMTS at the moment). My PC has slow HDDs in it (well, older ones), but they're completely defragmented, which probably helps a lot. The only streaming "issues" I get are from my CPU not being fast enough transcoding the textures. As I said before, my GPU is more or less 50% idle all the time, even at 1080p @ 8xAA (GPU-Z says so, at least). Given these observations, Rage is more CPU-dependent than GPU-dependent, so I find it highly strange that the PS3 has these issues. Or is it really the slow disc/HDD access that's the culprit? A commenter at EG states that he uses an SSD in his PS3 and basically has no issues at all... I'd love to see a comparison between those systems.
 
How is the VT directory organized? How many resolutions does it support? As in, how are those 11GB distributed? PS Home supports up to a 12GB HDD cache, but it's optional.
 
It's two files per "area", i.e. "wasteland.pageLines" and "wasteland.pages" (the first one seemingly being a TOC of some sort). So there's no telling what resolutions these files have inside them. The sizes of the "pages" files range from 50MB to 1.7GB (the 1.7GB one stands out, as it has an underscore in front of it... probably NPC models etc.).
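The real .pageLines/.pages formats are undocumented, but the TOC-plus-blob pattern guessed at above generally looks like this (layout and record format entirely hypothetical):

```python
# Purely hypothetical sketch of a TOC file mapping page ids to
# (offset, length) ranges inside a large blob file. Not id's format.
import io
import struct

def write_area(toc_buf, pages_buf, pages):
    """pages: dict of page_id -> bytes. Appends blob data + TOC records."""
    offset = 0
    for page_id, data in pages.items():
        pages_buf.write(data)
        toc_buf.write(struct.pack("<III", page_id, offset, len(data)))
        offset += len(data)

def read_page(toc_bytes, pages_bytes, want_id):
    """Scan fixed-size TOC records for want_id, slice the blob."""
    for i in range(0, len(toc_bytes), 12):
        page_id, off, length = struct.unpack_from("<III", toc_bytes, i)
        if page_id == want_id:
            return pages_bytes[off:off + length]
    return None

toc, blob = io.BytesIO(), io.BytesIO()
write_area(toc, blob, {7: b"grass", 9: b"rock"})
print(read_page(toc.getvalue(), blob.getvalue(), 9))   # b'rock'
```

The advantage of the split is that the small TOC can live in memory while the multi-gigabyte blob is read with single seeks per page, which fits the streaming behavior discussed in this thread.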
 

No mention of AA method used on consoles, does anyone know?

I clicked on a couple of the PS3 shots and they had horrible aliasing. The dynamic resolution could possibly be blamed but I'm not certain.

Edit: Actually the dynamic resolution is blamed in the article, didn't read it closely enough the first time. But even still it seems like there's no attempt at AA on the PS3 shots.
 
Uh, could we possibly see some tech interviews with maybe id, Epic, DICE, or Codemasters in the near future? Haven't had those in a while. :smile:
 