Digital Foundry Article Technical Discussion Archive [2012]


Just pure speculation, but I noticed the review stated that the large number of characters in the hospital is the reason behind the significant slowdown in that scene. I was wondering: could it instead be the large amount of transparency used for the leaves of the two trees?
It occurred to me because I saw that the Wii U version of Darksiders 2 also lacks the trees on the field where the Xbox 360 version has them.
I know it's unlikely that such a new GPU would have problems with a "few" leaves, but why would they remove the trees from Darksiders 2 then? Or perhaps they just removed them because of the shadows?
 
If the WiiU GPU isn't clocked as high as the Xbox 360 GPU then it will probably have lower triangle and fill rates. Depending on how the WiiU's eDRAM is set up, it may also have less bandwidth, affecting fill rate accordingly.

So it's possible that the WiiU GPU isn't even as fast as the Xbox 360 GPU in every way. That might explain some of the drops in the Mass Effect 3 cut-scene graph.

Edit: maybe texturing bandwidth could be an issue for the Darksiders trees too? I can't really see how they're done in that video, but if you're alpha testing and drawing lots of bits of the tree for each pixel, that could start eating into main memory bandwidth.
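To put rough numbers on that, here's a back-of-the-envelope sketch; every input is an assumption for illustration, not a measurement from the game:

```python
# Rough sketch of why layered alpha-tested foliage gets expensive.
# All numbers are assumed for illustration, not taken from Darksiders 2.

width, height   = 1280, 720   # render resolution
coverage        = 0.25        # fraction of the screen the trees cover (assumed)
overdraw_layers = 6           # leaf quads stacked per covered pixel (assumed)
bytes_per_tap   = 4           # one RGBA8 texel fetch per alpha test, uncached
fps             = 30

taps = width * height * coverage * overdraw_layers
gbs  = taps * bytes_per_tap * fps / 1e9
print(f"~{gbs:.2f} GB/s of texture reads just to alpha-test the foliage")
# Caches absorb much of this in practice, but every rejected layer still
# costs a texture fetch and rasterizer work, so stacked leaves add up fast.
```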
 
Well, if the teardown was accurate, the Wii U has half the main memory bandwidth of the 360 and PS3. No idea on eDRAM bandwidth or size, though.
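For reference, the raw arithmetic behind that claim works out like this, assuming the teardown's reading of a 64-bit DDR3-1600 interface is right:

```python
# Peak-bandwidth arithmetic: transfers per second times bytes per transfer.
# Wii U figure assumes the teardown (DDR3-1600, 64-bit bus); the 360's
# GDDR3 runs at 700 MHz (1400 MT/s effective) on a 128-bit bus.

def peak_bw_gbs(mt_per_s, bus_bits):
    """Peak bandwidth in GB/s for a given transfer rate and bus width."""
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

wiiu = peak_bw_gbs(1600, 64)    # -> 12.8 GB/s
x360 = peak_bw_gbs(1400, 128)   # -> 22.4 GB/s
print(f"Wii U main RAM: {wiiu:.1f} GB/s vs 360 GDDR3: {x360:.1f} GB/s")
```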
 
Is memory bandwidth really that bad?

On paper the PS3 has a good chunk more bandwidth (nearly double?) than the 360 because of its separate memory pools, yet you don't see developers complaining about a lack of bandwidth on the 360 because the eDRAM takes up most of the slack.

Same concept on the Wii U, but with much bigger eDRAM? Also remember that on paper the first Xbox had a lot less bandwidth than the PS2 and GC and yet produced much better looking games than both.

I think a lot of it has to do with it being a new console, running ports and games that it was never designed for.
 

Are you saying the GPU can't access the 22.4GB/s memory bandwidth on the XB 360?

The original Xbox had 6.4GB/s of memory bandwidth, which was close enough to a high-end PC graphics card of the time.

The WiiU's memory bandwidth is equivalent to the lowest-end graphics cards (like the 6450), but then there is the eDRAM... even modest PC cards like the 6670 and 7750 currently have over 70GB/s of memory bandwidth.

7 years later, I would expect a console to run even "bad ports" well enough...
 

Indications that the WiiU is already memory bandwidth bound in the first round of ports probably do mean it's really that bad.

Anyway, the first identifiable problems seem to indicate that the bottlenecks are either the weak CPU or the lousy bandwidth.
Whichever becomes the bottleneck seems to depend on the game itself right now. If you don't see much of a CPU issue, the game probably has some kinks in it that point to a bandwidth problem instead.

If you do see a CPU issue... well, so far some devs have gone ahead and given up instead of trying.
 
Is memory bandwidth really that bad?
We don't know.

On paper the PS3 has a good chunk more bandwidth (nearly double?) than the 360 because of its separate memory pools, yet you don't see developers complaining about a lack of bandwidth on the 360 because the eDRAM takes up most of the slack.

Same concept on the Wii U, but with much bigger eDRAM? Also remember that on paper the first Xbox had a lot less bandwidth than the PS2 and GC and yet produced much better looking games than both.
Much depends on the implementation of the eDRAM. Xenos had the ROPs coupled to the eDRAM, saving a massive amount of BW from having to cross the eDRAM/logic bus. If Wuu is working from the eDRAM over an ordinary bus, it'll be equivalent to any other ordinary bandwidth configuration. e.g. if Wuu has 12 GB/s main RAM and 30 GB/s eDRAM BW, that's a total of 42 GB/s available for rendering and game code vs. PS3's 48 GB/s total.

None of these BW figures are directly comparable as there are various read/write limitations, but Wuu's position as regards BW depends entirely on that eDRAM, for which we have no information. If it's 30+ GB/s and has the ROPs embedded like the 360, it shouldn't be an issue beyond devs learning the system. If it's < 30 GB/s and has to serve the ROPs, Wuu could be more BW starved than the PS3.

A lot of us were thinking the eDRAM was included for speed, but seeing the rest of the system, Nintendo may have gone that way for price. Instead of putting in 30-40 GB/s of system RAM on a wide bus, they used a pokey 64-bit bus for 12 GB/s and eDRAM to make up the rest. There may be nothing clever about it at all.
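For what it's worth, the budget comparison above works out like this; the eDRAM figure is pure assumption, since Nintendo has published nothing:

```python
# Minimal sketch of the bandwidth-budget comparison. Only the main-RAM
# and PS3 numbers come from published specs; the eDRAM figure is assumed.

wuu_main_gbs  = 12.8          # 64-bit DDR3-1600, per the teardown
wuu_edram_gbs = 30.0          # hypothetical -- no official figure exists
ps3_total_gbs = 25.6 + 22.4   # XDR main RAM + GDDR3 video RAM

wuu_total = wuu_main_gbs + wuu_edram_gbs
print(f"Wuu total (assumed): {wuu_total:.1f} GB/s vs PS3: {ps3_total_gbs:.1f} GB/s")
# Raw totals like these ignore read/write limitations and which clients
# (ROPs, texture units, CPU) can actually reach each pool.
```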
 
The problem is that some data is just too large to fit into the eDRAM but has to be accessed every frame: all the textures, for example, or (cascaded) shadow maps.
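A quick back-of-the-envelope illustrates the point; the cascade count and resolution are assumptions, not figures from any shipping game:

```python
# Why 32 MB of eDRAM can't hold everything touched every frame.
# Sizes below are illustrative assumptions.

MB = 1024 * 1024

# Four shadow-map cascades at 2048x2048 with a 32-bit depth format each:
shadow_mb = 4 * 2048 * 2048 * 4 / MB   # = 64 MB
edram_mb  = 32

print(f"cascaded shadow maps alone: {shadow_mb:.0f} MB vs {edram_mb} MB of eDRAM")
# Anything that spills out of the eDRAM has to come over the ~12.8 GB/s
# main-RAM bus every frame, which is where the bandwidth pressure lands.
```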
 

I imagine, to look at the matter simplistically, the main problem is that eDRAM is not a magic word for everything if the main RAM is not appropriately matched. From what I have understood, eDRAM needs to be balanced by adequate main RAM for everything else.
 
Looks like the article is missing the shadow LOD differences and other stuff (like the missing normal maps).

Are you sure the former is not from the split-screen/gamepad mode, and that the latter was a streaming issue that someone found? They're pretty clearly identical in the captured (campaign) footage.
 
Hmm, that may be the case. But I think the shadow LOD screenshots (posted in the IQ thread) were from SP; I'm not sure though.

Edit:

It was from the MP map Hijacked.

Ok, thx.
 
For the ROPs, yes, there's a fair bit of bandwidth consumption, but those presumably render to eDRAM. Of course, we don't have specs for how much bandwidth the eDRAM provides, but one would think they'd have a sufficient amount if they're going to bother implementing 32MB on a relatively expensive process.

Screen-filling transparencies/overdraw do incur high demands on raw fillrates, which is what's more curious. Speculation of a ~110mm^2 GPU @ 40nm does make the notion of an 8-ROP chip somewhat likely. 16 ROPs would almost certainly be out of the question given how much space they'd take relative to other important aspects of the GPU.
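As a rough sanity check, here's what those ROP counts would imply for fillrate; the Wuu clock here is an assumption, since no official figure has been published:

```python
# Fillrate arithmetic for the 8-vs-16 ROP speculation. Xenos figures are
# the 360's known specs; the 550 MHz Wuu clock is assumed for illustration.

def fill_and_writes(rops, clock_mhz, bytes_per_pixel=4):
    """Pixel fillrate (Gpix/s) and the colour-write traffic it generates."""
    gpix = rops * clock_mhz * 1e6 / 1e9
    return gpix, gpix * bytes_per_pixel   # GB/s of RGBA8 colour writes

for label, rops, clk in [("Xenos, 8 ROPs @ 500 MHz",      8, 500),
                         ("Wuu, 8 ROPs (assumed clock)",   8, 550),
                         ("Wuu, 16 ROPs (assumed clock)", 16, 550)]:
    gpix, gbs = fill_and_writes(rops, clk)
    print(f"{label}: {gpix:.1f} Gpix/s, ~{gbs:.1f} GB/s of colour writes")
# Whether those writes are "free" depends on whether the ROPs sit on the
# eDRAM, as on Xenos, or have to cross a narrower bus to reach it.
```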
 

It's always hard to determine hardware features from software, but the fact that they aren't running ported 360 games at a higher res says to me that the WiiU probably has a similar fillrate and probably 8 ROPs, or the bandwidth to eDRAM isn't spectacular.
 