YIKES!
I didn't realize the protagonist of the game was elderly and arthritic.[/QUOTE]
Story spoiler:
[color=white]you get mutated into something that gives you a "primal" form.[/color]
Well, remember that the Wii only has a 1 MB texture buffer
and if I recall correctly from the original Gamecube, some kind of virtual texturing.
This virtual texturing probably only reads from the 24 MB main memory pool
leaving developers to manually shuffle textures around between the 64 MB pool and the 24 MB pool
as well as whatever work may need to be done with the texture cache itself.
In any event, I'd hardly describe it as more efficient, though it may yield higher performance.
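To make the "manually shuffle textures around" point concrete, here's a toy sketch in Python of the kind of staging step a developer would have to do by hand: a texture sitting in the large pool has to be copied into the smaller GPU-visible pool before the hardware can sample it. The pool sizes come from the thread; the function name and layout are purely illustrative, not any real SDK API.

```python
MEM1_BYTES = 24 * 1024 * 1024   # 24 MB GPU-accessible 1T-SRAM pool
MEM2_BYTES = 64 * 1024 * 1024   # 64 MB pool the GPU can't texture from directly

def stage_texture(mem1: bytearray, mem2: bytes,
                  src: int, size: int, dst: int) -> bool:
    """Copy a texture from the staging pool into the GPU-visible pool.

    Returns False if the texture would overflow the small pool, in which
    case something else has to be evicted first -- that bookkeeping is
    exactly the manual work the post is describing.
    """
    if dst + size > len(mem1):
        return False
    mem1[dst:dst + size] = mem2[src:src + size]
    return True
```

On a UMA design like the Xbox's, this copy simply doesn't exist: the GPU textures straight out of the one pool.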
I prefer the look of the Xbox version based on those screens; the textures don't turn to crap 5 feet away from the camera.
One thing I do notice is that they've added some light bloom, just like Red Steel. The enemies also seem to have slightly more detailed textures (look at the arms and the veins).
Compare it to the Xbox version...
That looks better than the Wii version. I think they are underestimating the Wii's graphical capabilities, expecting that the game will sell anyway thanks to the original gameplay.
My main point here is that the Wii GPU is limited to directly accessing a smaller amount of total memory than the original Xbox (24 MB vs 64 MB). That's bound to introduce some porting issues into the equation when moving from the Xbox to the Wii. Now, maybe that smaller amount of memory has lower average latency, but GPUs are hardly sensitive to this kind of thing and bandwidth plays a much larger role.
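The latency-vs-bandwidth claim above can be put in numbers with a toy model: a deeply pipelined GPU keeps many requests in flight, so it pays the latency roughly once while the rest of the transfer is bandwidth-bound, i.e. time ≈ latency + bytes / bandwidth. The figures below are illustrative, not real console specs.

```python
def transfer_us(nbytes: float, latency_us: float, gb_per_s: float) -> float:
    """Approximate streaming-transfer time in microseconds.

    1 GB/s = 1e3 bytes per microsecond, so nbytes / (gb_per_s * 1e3)
    is the bandwidth-bound portion; latency is paid roughly once.
    """
    return latency_us + nbytes / (gb_per_s * 1e3)
```

For a 64 KB fetch at 2 GB/s, doubling latency from 0.1 µs to 0.2 µs changes the total by well under 1%, while halving the bandwidth nearly doubles it — which is why bandwidth dominates for GPU workloads.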
Wasn't the GameCube 24 MB, and the Wii upgraded to something like 96 MB?
Fact of life is that the Xbox's architectural decisions ended up so bad that even overly expensive components could not mend that. How much effort was wasted on the development of the NV2A's 'omnipotent' crossbar memory controller, only to find out at the end of the day that it was not that magic bullet*? How much did the highly specced DDR delivering the 'massive' bandwidth cost, only to be crippled by insane latencies?
* OK, it ended up being a fine local vidmem controller, so NV actually gained from it. At somebody else's expense, though.
I don't know, I think you're being a bit too hard on the Xbox here. It is, after all, the console with the best graphics of the past generation. It's also the only last-generation console with games running at HD resolutions (GT4's 1080i notwithstanding). A big factor in its ability to do that was in fact its UMA design.
It keeps the original 24 MB 1T-SRAM pool, while the GameCube's 16 MB "A-RAM" is upgraded to 64 MB with a faster interface.
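Summing the pools as described in this thread (reported figures, not verified specs) answers the "~96 MB" question asked earlier: the total comes out slightly lower.

```python
# Wii memory pools as described in the thread (reported, not verified):
MEM1_MB = 24   # 1T-SRAM pool carried over from the GameCube
MEM2_MB = 64   # upgraded from the GameCube's 16 MB "A-RAM"
TOTAL_MB = MEM1_MB + MEM2_MB  # 88 MB total, a bit under the "~96 MB" guess
```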