HD problems in Xbox 360 and PS3 (Zenji Nishikawa article @ Game Watch)

Dave Glue said:
Can anyone point to some concrete info on what the 360 actually does with Xbox games in terms of resolution? Are they still running at 640*480 and just scaled to 720p with some free AA, or are they actually running in 720 res? I would expect if they were just upsampled we would also see a lot of aspect ratio problems which don't appear to be an issue.
They get free AA at 480p and are then upscaled to 720p. When Bungie showed their Halo 2 360 shot you could tell that it had been upscaled.

http://www.bungie.net/images/news/inlineimages/halo2cine2.jpg
The aliasing on the trouser pocket is two pixels wide compared to this one:
http://www.bungie.net/images/news/inlineimages/halo2_480i.jpg
 
Dave Glue said:
Can anyone point to some concrete info on what the 360 actually does with Xbox games in terms of resolution? Are they still running at 640*480 and just scaled to 720p with some free AA, or are they actually running in 720 res? I would expect if they were just upsampled we would also see a lot of aspect ratio problems which don't appear to be an issue.

It's run at 640x480 with AA and then upscaled to 720 lines (it obviously won't use all 1280 horizontal pixels, or you'd get goofy stretching, so it keeps the aspect ratio at 4:3 unless the game was widescreen to begin with). So about 960x720 pixels are actually used, with the remaining pixels just being the black bars on a 4:3 480p game.
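
For what it's worth, that 960x720 figure is just the aspect-ratio arithmetic; a minimal sketch in plain C (not anything the 360 scaler is documented to do internally):

```c
#include <stdio.h>

int main(void)
{
    const int src_w = 640, src_h = 480;   /* Xbox1 render resolution */
    const int out_w = 1280, out_h = 720;  /* 720p output frame       */

    /* Scale uniformly so the 4:3 image fills the 720 lines of height. */
    const double scale = (double)out_h / src_h;    /* 720/480 = 1.5 */
    const int scaled_w = (int)(src_w * scale);     /* 640*1.5 = 960 */
    const int bar_w    = (out_w - scaled_w) / 2;   /* (1280-960)/2  */

    printf("scaled image: %dx%d, black bars: %d px each side\n",
           scaled_w, out_h, bar_w);                /* 960x720, 160 px */
    return 0;
}
```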
 
Funnily enough, this posting asserts that BC is more work for XB360 than "next-gen" games :oops:

http://xbox360.qj.net/Xboxexpert-s-Watercooled-360/pg/49/aid/28958


Halo 2 nonstop for 1h:
CPU: 88F/31C, GPU: 100F/38C

Condemned: Criminal Origins nonstop for 1h:
CPU: 73F/23C, GPU: 84F/29C, GDDR3: 95F/35C

GRAW nonstop for 1h:
CPU: 73F/23C, GPU: 82F/28C

Note these temperatures have been measured in a watercooled mod of the 360.

(Found over at Rage3D)

Jawed
 
Does that mean that no X360 game runs at a true 720p resolution?

Are games like FN3 or GRAW not 720p? Is it really noticeable, if true?
 
rosman said:
Does that mean that no X360 game runs at a true 720p resolution?

Are games like FN3 or GRAW not 720p? Is it really noticeable, if true?

They're talking about XBox1 games running on the 360.
 
rosman said:
Does that mean that no X360 game runs at a true 720p resolution?

Are games like FN3 or GRAW not 720p? Is it really noticeable, if true?

I just know that a lot of people reading this thread are going to come away thinking like this gentleman. No, Rosman, this is not the case, and honestly there's no reason to be concerned. The only game that wasn't true 720p was PGR3 (because it was rushed), and it still looks amazing.
 
Fafalada said:
If they are still allocating one shadow map per object, I doubt the little efficiency loss in shaders would make a difference. :p
I doubt they're still doing this with a full-screen shadowing pass, because I think it would be a mess to select the correct shadow map to sample on a per-pixel basis (well, it can be done, but not in a single pass; the stencil buffer can probably be your friend here).
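
A CPU-side toy sketch of that selection idea (purely illustrative; the obj_id buffer below stands in for the stencil buffer, the data is dummy, and nothing here reflects how any shipped engine actually does it): one pass per object, where each pass only touches the pixels tagged with that object's ID and samples only that object's shadow map.

```c
#include <stdio.h>

#define W 4
#define H 4
#define NUM_OBJECTS 2

/* Per-object shadow maps and a per-pixel object-ID buffer (the role the
 * stencil buffer would play on real hardware). */
static float shadow_maps[NUM_OBJECTS][W * H];
static int   obj_id[W * H];
static float shadow_term[W * H];

int main(void)
{
    /* Dummy scene: object 0 covers the left half, object 1 the right half. */
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            obj_id[y * W + x] = (x < W / 2) ? 0 : 1;
            shadow_maps[0][y * W + x] = 0.25f;
            shadow_maps[1][y * W + x] = 0.75f;
        }

    /* One "full-screen pass" per object: only pixels whose ID matches are
     * shaded, so each pass needs to bind just that object's shadow map. */
    for (int i = 0; i < NUM_OBJECTS; ++i)
        for (int p = 0; p < W * H; ++p)
            if (obj_id[p] == i)
                shadow_term[p] = shadow_maps[i][p];

    printf("shadow term at (0,0)=%.2f, at (3,0)=%.2f\n",
           shadow_term[0], shadow_term[3]);
    return 0;
}
```

The point of the stencil trick is the same: a single pass would have to pick among N shadow maps per pixel, whereas N stencil-tested passes each use exactly one.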
 
Jawed said:
Funnily enough, this posting asserts that BC is more work for XB360 than "next-gen" games :oops:

http://xbox360.qj.net/Xboxexpert-s-Watercooled-360/pg/49/aid/28958

Note these temperatures have been measured in a watercooled mod of the 360.

(Found over at Rage3D)

Jawed

Wow!!! And here I believed GRAW was actually using some of the console's power.

Pretty clear it is not. There's still a long way to go before usage of the console is maximized; too many cores and ALUs are still sitting idle in there.
 
Jawed said:
Funnily enough, this posting asserts that BC is more work for XB360 than "next-gen" games :oops:

http://xbox360.qj.net/Xboxexpert-s-Watercooled-360/pg/49/aid/28958

Halo 2 nonstop for 1h:
CPU: 88F/31C, GPU: 100F/38C

Condemned: Criminal Origins nonstop for 1h:
CPU: 73F/23C, GPU: 84F/29C, GDDR3: 95F/35C

GRAW nonstop for 1h:
CPU: 73F/23C, GPU: 82F/28C


Note these temperatures have been measured in a watercooled mod of the 360.

(Found over at Rage3D)

Jawed

Hmmm, I always thought the CPU would be the heat beast here, not the GPU, but from this it seems the GPU is the warmer one. Any idea how effective that water cooling is and what the temperatures would be with the standard cooling?...
 
UE3 Engine and AA?

Regarding this UE3 issue that was brought up and "sorta" explained, here comes a dummy question.
Isn't Gears of War *said* to have 4xAA (if it doesn't, please correct me)? It's built on UE3, so that made me stop and I have to ask.

ERP, nAo or Faf, any more info compiled into dumbass format on the UE3 engine problem with tiling and shadows?
 
Platon said:
Hmmm, I always thought the CPU would be the heat beast here, not the GPU, but from this it seems the GPU is the warmer one. Any idea how effective that water cooling is and what the temperatures would be with the standard cooling?...
I didn't notice that, and yes it is surprising (when you consider the relative sizes of the heatsinks on the stock XB360). Hmm...

Obviously Xenos is a bigger die with much less on-die memory (which I guess runs cooler, so a larger percentage of memory on the die means a cooler die overall), but then there are the clocks :oops:

Jawed
 
Jawed said:
I didn't notice that, and yes it is surprising (when you consider the relative sizes of the heatsinks on the stock XB360). Hmm...

Exactly. Going by the heatsink size and the fact that it is stuffed under the DVD drive, one would think that heat would not be a problem for the GPU. I wonder whether the much-talked-about overheating issues have to do with the CPU or the GPU, because going by this it seems to be the GPU. A bit of a design flaw to put it under the DVD drive?...
 
This is a homebuilt water-cooling kit, is it not? If so, you might want to check the flow order of the water cooling before coming to any conclusions (i.e. does the water pass over the CPU and then the GPU?).
 
So what I have been saying since the announcement of the XBOX360 specification was TRUE!!! :D

I am very happy!!!

I was sure that Microsoft and ATI's decision to include 10 MB of eDRAM to do 720p + free 4x anti-aliasing using tile rendering was just asking for trouble. A lot of trouble for developers. But unfortunately almost everyone on this forum was saying: you are wrong! Tile rendering is a realistic, intelligent, not-so-difficult-to-implement solution that won't make XBOX360 programming more difficult and will allow free 4x anti-aliasing... :rolleyes:

Microsoft just repeated the WORST technical problems of the XBOX and added to them the worst technical problems of the PS2, which makes exploiting the console to the maximum a NIGHTMARE even for talented developers:

- 10 MB of eDRAM!!! What the hell?!! The PS2 had 4 MB of eDRAM to render ONLY 480i images with ONLY 2x AA, and everyone was saying that was a nightmare for programmers. So how the hell did Microsoft and ATI repeat the same error?!! How do they expect programmers to do real 720p resolution + 4x AA in 10 MB?!!!!!!!!!! :oops: (the rough arithmetic is sketched after this post) They could do it of course, but after how many years?!! 3 years? 4 years? And how many developers will achieve this goal?!! 2? 3 developers?

- 512 MB of SHARED memory!!! The same problem as the XBOX1... this simply makes using the GPU's maximum power IMPOSSIBLE. If you don't have dedicated memory and bandwidth for textures and shaders, you are just asking for big problems... a conflict between the CPU's demand for bandwidth and the GPU's demand for bandwidth. How many developers could achieve this almost impossible equilibrium between CPU bandwidth demand and GPU bandwidth demand? At what cost? And after how many years of development? 4 years?...

- the CPU: 3 CORES with 6 threads :LOL: Are they really expecting programmers to write games that run 6 threads?!! Using 3 cores is easy (like using all 6 available SPU cores on CELL), but adding on top of that the balancing act of managing 2 threads on each core... this is just CRAZY...
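
The arithmetic behind that first bullet is easy to sketch. Assuming the commonly quoted figures of 32-bit colour plus 32-bit Z/stencil per sample (an assumption on my part, not a measurement), a 720p 4xAA framebuffer comes out to roughly three 10 MB tiles:

```c
#include <stdio.h>
#include <math.h>

int main(void)
{
    const int    width = 1280, height = 720;   /* 720p                      */
    const int    samples = 4;                  /* 4x MSAA                   */
    const int    bytes_per_sample = 4 + 4;     /* 32-bit colour + 32-bit Z  */
    const double edram = 10.0 * 1024 * 1024;   /* 10 MB of eDRAM            */

    double fb_bytes = (double)width * height * samples * bytes_per_sample;
    int    tiles    = (int)ceil(fb_bytes / edram);

    printf("framebuffer: %.1f MB -> %d tiles of 10 MB\n",
           fb_bytes / (1024 * 1024), tiles);   /* ~28.1 MB -> 3 tiles */
    return 0;
}
```

So the question isn't whether the whole framebuffer fits in 10 MB (at 720p with 4xAA it clearly doesn't); it's how painful rendering it in a few tiles turns out to be in practice, which is what the rest of the thread argues about.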
 
You have a lot of anger inside you... breathe some fresh air, boy :)


On the other hand, about the eDRAM: why did they choose 10 MB?
Why not 15 or 20 MB? Purely for cost reasons, or...?
 
FouadVG, that was the worst post I've seen yet on these boards. You completely ignored all the information in this thread, and threw in more of your misconceptions on top.

A) If XB360 is having bandwidth problems, it shows they made the right decision, as lack of eDRAM would make the problems worse.

B) PS2's eDRAM was for both the framebuffer and textures. XB360 has up to 512 MB for texture memory.

C) Your comparisons to last gen are backwards. XB360 is more like PS2 because the eDRAM offloads all framebuffer traffic. RSX is more like XBox1 as framebuffer and texture traffic are all on the same bus. Note that the FSB of XBox1's CPU can only get 1GB/s anyway.

D) Unified memory means you don't need any management in your code. If you are bandwidth limited, you'll always be using peak bandwidth, regardless of how the hardware distributes things. On the contrary, if RSX needs more bandwidth it's out of luck, even if CELL isn't using much, as the majority of the data needs to be read from or written to GDDR3. Splitting bandwidth between two buses doesn't increase efficiency, it decreases it. Let's not forget the problems of two different memory pools on the PS3.

In terms of sharing resources, this argument is nonsense. GPUs have memory controllers that balance requests from the render back end, the z/stencil testing, the RAMDAC, the texture units, the vertex shaders, the command processor, and probably other clients also. Adding CPU requests is nothing, and Xenos doesn't need to handle the large volume from the first two on this list either. RSX is going to be juggling more than Xenos.

E) Game engines take years to write, and actual games take a while on top of that. Tiling is a simple concept, but you need to know about it early to put it in your engine properly. Even if you told developers about this at the end of 2004, you're unlikely to see it used in many games until maybe 2007.

F) To use 6 SPEs, you need six different threads as well. There's no magic that lets CELL run non-parallel code on six processors.
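
To make point F concrete, here's a minimal sketch in plain C with pthreads (deliberately generic host code, not actual SPE or Xenon code): whichever console you target, exploiting six hardware threads means the game work has to be explicitly split into six software threads and coordinated.

```c
#include <pthread.h>
#include <stdio.h>

#define NUM_WORKERS 6   /* one software thread per SPE / hardware thread */

/* Each worker would own an independent slice of the frame's work
 * (animation, audio mixing, particle update, ...). */
static void *worker(void *arg)
{
    long id = (long)arg;
    printf("worker %ld doing its share of the frame\n", id);
    return NULL;
}

int main(void)
{
    pthread_t threads[NUM_WORKERS];

    /* Whether the targets are six SPEs or three dual-threaded PPC cores,
     * the code still has to be decomposed into this many parallel threads. */
    for (long i = 0; i < NUM_WORKERS; ++i)
        pthread_create(&threads[i], NULL, worker, (void *)i);
    for (int i = 0; i < NUM_WORKERS; ++i)
        pthread_join(threads[i], NULL);

    return 0;
}
```

The hard part, of course, is not spawning the threads but finding six chunks of game work that can actually run in parallel without stalling on each other.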
 
Mintmaster said:
C) Your comparisons to last gen are backwards. XB360 is more like PS2 because the eDRAM offloads all framebuffer traffic. RSX is more like XBox1 as framebuffer and texture traffic are all on the same bus. Note that the FSB of XBox1's CPU can only get 1GB/s anyway.


This is exactly how I see the graphics playing out this coming generation.

PS3 = XBOX
XBOX 360 = PS2

I think the PS3 will start higher and keep the edge for the generation, but not improve as much over time, just like the XBOX.

The XBOX 360 will start lower but improve more over time, just like the PS2.

I also predict the ****** arguments will be the complete reversal of what they were last gen.
Actually, that's not a prediction; it's already happening... ;)
 
Tech-design-wise, I would rather say:
XBox360 = NGC
PS3 = XBox (except for CELL of course)
and expect:
Wii = NGC
 