Wii U hardware discussion and investigation *rename

G3 and G4 are overrated; back in the day Apple cheated in benchmarks, offering completely cooked and meaningless Photoshop results to claim that their CPU was twice as fast as a PC CPU. They did it first with the PowerPC G3 vs the Pentium II.
(Just write a couple of bogus Photoshop plugins specially tailored to give you that result, write some shit up and pay for ad space in the press. Well, that worked 15 years ago.)
 
Motorola fell too far behind in clock speed. That was the only problem. A big problem lol

Developers have said the 486MHz Gekko was pretty competitive with the 733MHz Celeron in the Xbox, though.
 
Bringing up Metroid Prime again (you can never bring it up often enough!): that game ran skinning, animation of all visible objects and, additionally, ragdoll simulation of enemies on the CPU, at 60fps. It also handled little bonus touches, like the reflection of Samus' face on the inside of the visor, or her arm and fingers when using the X-ray visor, without any slowdown. Some scenes also had a lot of particles on-screen, like snow or rain.

It was a really, really tight piece of coding, methinks.
 
It's a PS360 game at PS360 resolution and PS360 frame rates with PS360 geometry but some improved assets because it has more memory. It's doing nothing unexpected.

If it was a 320 shader PC part it would be ripping the PS360 new assholes, just like 320 shader PC parts do with console ports. So either the WiiU doesn't have the power of those PC parts, or it's crippled. Nothing has changed.

Once again, no super special secret sauce. The goal posts for WiiU powah validation are now practically on top of the PS360.
 
So, can we conclude that WiiU lacks hardware accelerated shadow/depthmap filtering?

Adding to Graham's "No", I'll add "No, we can conclude that WiiU has weaker hardware than top end PC hardware" which should be no surprise to anyone.

It's a PS360 game at PS360 resolution and PS360 frame rates with PS360 geometry but some improved assets because it has more memory. It's doing nothing unexpected.

And yet if it was 160 one could claim it shouldn't be able to match PS360
 
Not according to benchmarks of 160 shader PC parts. Even with only 8 TMUs and 4 ROPs they can give the 360 a run for its money. Even with all the PC AIDS that hangs around in PC land.
 
So, can we conclude that WiiU lacks hardware accelerated shadow/depthmap filtering?
Why would you need specific hardware for that? I don't see why regular filtering hardware couldn't treat a depth/shadow map like any other type of texture map, it's still simply binary data.
 
It's a PS360 game at PS360 resolution and PS360 frame rates with PS360 geometry but some improved assets because it has more memory. It's doing nothing unexpected.

:LOL:
 
Why would you need specific hardware for that? I don't see why regular filtering hardware couldn't treat a depth/shadow map like any other type of texture map, it's still simply binary data.

Because 'normal' bi- and trilinear filtering multisamples a color map and blends it, while depthmap filtering multisamples a map, compares all samples to a certain depth and determines an attenuation based on that.

In comparison, the alternative would be programming the pixelshader to sample the map, compare the depth and add to the attenuation value for each sample, which is much slower.
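
To make the difference concrete, here is a minimal CPU-side sketch of 2x2 percentage-closer filtering; the ShadowMap struct and its fetch() helper are made up purely for illustration. A hardware depth-compare sampler does this compare-then-blend inside the texture unit, while the shader-based alternative has to spell out every sample and compare itself.

Code:
// Minimal CPU-side sketch of 2x2 percentage-closer filtering (PCF).
// ShadowMap and fetch() are hypothetical helpers for illustration only.
#include <algorithm>

struct ShadowMap {
    const float* depths;   // stored light-space depths, row-major
    int width, height;
    float fetch(int x, int y) const {
        x = std::clamp(x, 0, width - 1);
        y = std::clamp(y, 0, height - 1);
        return depths[y * width + x];
    }
};

// Returns attenuation in [0,1]: the bilinearly weighted fraction of the
// 2x2 neighbourhood that is lit relative to the reference depth.
float pcf2x2(const ShadowMap& sm, float u, float v, float refDepth) {
    float fx = std::max(u * sm.width  - 0.5f, 0.0f);
    float fy = std::max(v * sm.height - 0.5f, 0.0f);
    int   x0 = static_cast<int>(fx);
    int   y0 = static_cast<int>(fy);
    float wx = fx - x0;
    float wy = fy - y0;

    // Compare each sample first (1 = lit, 0 = in shadow), then blend the
    // comparison results instead of the raw depths (colour filtering blends
    // the raw texel values, which is exactly what you don't want here).
    auto lit = [&](int x, int y) { return sm.fetch(x, y) >= refDepth ? 1.0f : 0.0f; };
    float top    = lit(x0, y0)     * (1 - wx) + lit(x0 + 1, y0)     * wx;
    float bottom = lit(x0, y0 + 1) * (1 - wx) + lit(x0 + 1, y0 + 1) * wx;
    return top * (1 - wy) + bottom * wy;
}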

I thought I read hw depthmap sampling (or perhaps multisampling) is a DX10 feature.

@Function: The WiiU version does feature more and better reflection mapping. Of course, they may have precalculated the maps, just requiring more memory. But if it is calculated in realtime, which I suspect, it needs to calculate a lot more views compared to the XBOX version. Since we know the WiiU is lacking in bandwidth (8 ROPs, 550MHz), it must feature more calculation power so that, in the end, it gains in fillrate.
 
It's a PS360 game at PS360 resolution and PS360 frame rates with PS360 geometry but some improved assets because it has more memory. It's doing nothing unexpected.

I don't really disagree, but I assume the weak CPU / low memory bandwidth could be hindering things to where we don't see 320 shaders yet, if they are there.
 
depthmap filtering multisamples a map, compares all samples to a certain depth and determines an attenuation based on that.
Ah, I thought it was handled like a greyscale bitmap. Thanks a lot for the info and clarification! It's appreciated.
 
It's a PS360 game at PS360 resolution and PS360 frame rates with PS360 geometry but some improved assets because it has more memory. It's doing nothing unexpected.

I'm not so sure. That 160 shader PC card was a DirectX11 part. I haven't seen any evidence that running games designed around a DX9 foundation on a comparable DX10.1 chip (which Latte reportedly is) gives performance advantages substantial enough to make up for an 80 shader disparity. That is to say nothing of the fact that Wii U does not use DirectX.

While there are a variety of explanations for the disappointing framerates we've seen in Wii U games, I do agree with your assessment that a 320 shader part should be able to run PS360 games in higher resolutions. However, it is also possible that the amount of eDRAM is limiting the size of the framebuffer, and this is the reason for the lack of a jump in resolution. This would be especially true if devs are also using it to store local render targets and perhaps even run CPU code.
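
For a rough sense of the numbers behind the eDRAM argument, here is a back-of-envelope sketch, assuming a 32-bit colour buffer plus a 32-bit depth/stencil buffer and the commonly reported 32MB eDRAM pool, with no MSAA or extra render targets counted.

Code:
// Back-of-envelope framebuffer footprints. All figures are assumptions:
// 4 bytes of colour + 4 bytes of depth/stencil per pixel, 32 MB of eDRAM.
#include <cstdio>

int main() {
    const double MiB = 1024.0 * 1024.0;
    const struct { const char* name; int w, h; } modes[] = {
        {"720p",  1280,  720},
        {"1080p", 1920, 1080},
    };
    for (const auto& m : modes) {
        const double bytes = double(m.w) * m.h * (4 + 4);   // colour + depth/stencil
        std::printf("%-6s ~%4.1f MiB of a 32 MiB eDRAM pool\n", m.name, bytes / MiB);
    }
    return 0;
}

On paper a single 1080p colour + depth pair is roughly 16 MiB and would fit, but the headroom shrinks quickly once extra render targets (or CPU data, as speculated above) have to share the same pool.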
 
I do agree with your assessment that a 320 shader part should be able to run PS360 games in higher resolutions.

Fourth, doesn't this only apply if the PS360 versions don't hit anywhere near maximum fillrate? I don't know how many SPU clocks today's shaders take, but the fact that the PS360 doesn't filter shadowmaps either makes me think they don't use an excess of them. At most 20-30 ops, divided over all 5 ALUs, shouldn't impact the fillrate at all with 240 SPUs (rough numbers sketched after this post). But it's a wild guess; sadly, I don't have experience with modern GPUs.

A custom ISA that provides some 'shortcuts' to accelerate general shading algorithms, not being used due to an immature GX2 API or unwillingness to port code to DX10.1++, may be a reason too. Though I think this is my own sad desire. :)
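
As a rough sanity check of that hunch, using ballpark figures that are pure assumptions (240 scalar SPUs at 500MHz, a 1280x720 target at 60fps, no overdraw counted):

Code:
// Very rough ALU budget per shaded pixel; every number here is an assumption.
#include <cstdio>

int main() {
    const double aluOpsPerSecond = 240.0 * 500e6;      // assumed scalar ALU slots per second
    const double pixelsPerSecond = 1280.0 * 720 * 60;  // shaded pixels per second at 720p60
    std::printf("~%.0f ALU ops available per pixel per frame\n",
                aluOpsPerSecond / pixelsPerSecond);    // prints roughly 2170
    return 0;
}

On a budget of roughly two thousand ops per pixel, an extra 20-30 for manual shadow-map filtering is a small slice, which fits the guess that it wouldn't show up as a fillrate problem.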
 
Couldn't it be that a lot of the vectorizable code such as the physics engine etc is being offloaded onto the GPU to compensate for the weak CPU?
 