Zelda Twilight Princess on Wii is a solid 30fps - it is absolutely *not* 60fps - same as the Gamecube version, 30fps.
I'll be sure to tell my good friend that he is, indeed, an idiot! I didn't think it was 60fps either.
I don't particularly remember physics in the monkey ball games besides the monkey bouncing, and the wii sports physics seem better than most havok/ragdoll stuff.
Monkey bowling!!!!! The physics weren't perfect, but they were there.
the PS2 played the PS1 startup when playing a PS1 game as well. however, on Wii if you play a GC game you have to reset the system to get the wiimotes to function again or to get back to the wii menu. it's downright annoying, because you need a wiimote to start the GC game as well, and the power button on the wiimote won't even reset the system. unless, of course, i'm missing something. i skimmed through the manual but don't remember reading anything about it.
But since Wind Waker's 30fps was as close to 60fps as 30fps is ever going to get (I don't know what they did, but it was damn effective), I'm quite fine with that.
I played Mario Galaxy which really just felt right. BTW the game runs at a super super super super smooth 60fps. Placed right next to DQ Swords, which was 30fps, it really makes you appreciate just how awesome 60fps is. In fact, seeing the game in person genuinely impressed me. It's really really clean.
So now that we know the Wii has 64MB of GDDR3 and 24 MB 1T-SRAM it brings up the question of how devs will use this RAM.
1. What is GDDR3 typically best suited for (graphics?) vs. 1T-SRAM (other stuff?)
2. Will they use just the 64MB for graphics, or will all 88 MB be available for rendering a scene?
3. Is all 88 available for Wii games, or only the 64+3? (Would seem like a huge waste with a system already so limited on resources to let 24MB sit idle.)
Why would they limit it?
Also I noticed watching a GC game being played on the Wii that when you start a GC game, instead of the game just starting, the GC startup happened first, as if the Wii is emulating the GC instead of just playing the game. I don't know what it means, I just thought it was interesting.
It does not do the GC bootup, it just plays the GC bootup noise while you're still in the Wii OS looking at the disc channel. Now for a crazy wackjob theory.....the extra 20% of unexpected die size on the cpu is due to virtualization hardware! (doubt it, you can't even quit out to the host OS from within a cube game)
Also interesting, while I didn't do any exhaustive testing, it appears Gamecube games are free of slowdown when played on the Wii. I was only playing on an SDTV though and not an HDTV, but the games I tried (RE4, the demo of Rebel Strike, and Super Smash Bros Melee with poke balls on very high) seemed to maintain whatever their default framerate was. Additionally, the flicker filter option in SSBM seemed to give better results than I remember on the cube, but since I haven't played over anything less than component video (even when running interlaced) in years, it's probably just the extra blur from composite hiding most of the detail of the image anyway, jaggies and textures. I'd be surprised if Nintendo left the clock speeds at full when running Gamecube games (though as long as they maintain the same ratios, it should work, right? well, unless Nintendo allowed games to be made that didn't use vsync), but it would be nice if Gamecube games had slowdown removed. Though they never had all that much slowdown to begin with.
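For what it's worth, the commonly cited clocks support that ratio point: the usual community figures (nothing official was published) put Broadway and Hollywood at exactly 1.5x Gekko and Flipper, so the CPU:GPU ratio would be unchanged at full Wii speed. A quick sanity check:

```c
#include <stdio.h>

/* Commonly cited clock speeds in MHz -- community figures, not
   official Nintendo numbers. */
int main(void) {
    double gekko = 486.0, flipper = 162.0;      /* GameCube CPU / GPU */
    double broadway = 729.0, hollywood = 243.0; /* Wii CPU / GPU      */

    printf("GC  CPU:GPU ratio: %.2f\n", gekko / flipper);      /* 3.00  */
    printf("Wii CPU:GPU ratio: %.2f\n", broadway / hollywood); /* 3.00  */
    printf("Wii over GC:       %.2fx\n", broadway / gekko);    /* 1.50x */
    return 0;
}
```

If those figures are right, GC games run at full Wii clocks would keep every ratio intact, just 50% faster across the board, which would square with the lack of slowdown.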
No memory type really can have a natural best application. GDDR is GDDR just to separate it from the PC main memory spec revs, not because it was "optimized for graphics" in any way. Though you could say that type of memory is optimized for being soldered onto a PCB to act as non-upgradeable memory, as opposed to optimized for a mixed-and-matched jumble of memory sticks, with an unknown total number and different sizes and manufacturers, if you catch my drift.
What's far more important here is that the GDDR has such a dinky interface: low bandwidth, and in all likelihood terrible latency, especially in comparison to the 1T-SRAM.
So in the Wii the GDDR is really slow memory while the 1T-SRAM is fast -- no idea how fast exactly (yet) but faster than the GDDR in any case.
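To put a rough number on "dinky interface": peak bandwidth is just bus width times data rate, so a narrow bus caps you hard no matter how fast the chips are clocked. The widths and rates below are made-up illustrations (the Wii's actual figures weren't public), not real specs:

```c
#include <stdio.h>

/* Peak bandwidth = bus width (in bytes) * data rate (in MT/s).
   Illustrative numbers only -- not the Wii's actual interfaces. */
static double peak_gb_per_s(int bus_bits, double mt_per_s) {
    return (bus_bits / 8.0) * mt_per_s / 1000.0;
}

int main(void) {
    printf(" 32-bit @ 700 MT/s: %4.1f GB/s\n", peak_gb_per_s(32, 700.0));
    printf("128-bit @ 700 MT/s: %4.1f GB/s\n", peak_gb_per_s(128, 700.0));
    return 0;
}
```

And peak bandwidth says nothing about latency, which is the other half of the complaint.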
Developers will avoid using the GDDR for graphics as hard as they can. They just won't be able to avoid it completely.
It's probably safe to say that a majority of the system memory, and that includes the GDDR space of course, will be used for graphics assets, but that applies to all consoles, not just the Wii. Better textures and more detailed models always make a difference and hence are a good way to use up the available memory.
It probably isn't a huge problem either way. Flipper has edram for the framebuffer and an enormous texture virtual memory kinda thing/cache so it can cope with poorly performing external memory really well. Hollywood should have inherited that ability.
I have absolutely no doubt that all of it will be available to native Wii games. As you say, it would be a waste. Also because the GDDR alone would be just too slow for a new console, even for Nintendo's humble standards.
So then why use GDDR at all, is it cheap? Doesn't it seem strange that the largest pool of memory would be the worst performing? Ever since I saw these specs I've been trying to figure out why Nintendo devised this memory setup, and this idea that GDDR3 is poor compared to 1T-SRAM makes it more confusing. Last gen the largest pool was the fastest. I thought Nintendo made efficient, well thought out systems.
The GDDR3 is simply a very cheap mass-market product. The SRAM die alone, even though it's not even half the storage space, is two thirds of the size of the whole GDDR3 chip package. That should tell you something.
Having a small amount of fast memory plus a larger amount of slower memory isn't all that bad, it's not even unusual. It forces programmers to be careful about where which data goes and when, but they usually can handle that. It's the same with the GBA btw, and with the DS and many other systems. You do it on other levels as well: every PC processor (or Broadway itself) has a cache that is a smaller, faster companion to the main memory, and on the other side of the spectrum virtual memory (on the HDD) to complement system memory.
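As a purely hypothetical sketch of what "being careful about where which data goes" looks like in code, imagine the SDK handing you one allocator per pool. The allocator names below are invented for illustration (stubbed with malloc so it compiles); they are not the real Wii SDK API:

```c
#include <stdio.h>
#include <stdlib.h>

/* Invented split-pool allocators, stubbed with malloc -- on real
   hardware these would carve from the two physical pools instead. */
static void *fast_alloc(size_t n) { return malloc(n); } /* 24 MB 1T-SRAM */
static void *slow_alloc(size_t n) { return malloc(n); } /* 64 MB GDDR3   */

int main(void) {
    /* Touched every frame and latency-sensitive: fast pool. */
    void *display_list  = fast_alloc(512 * 1024);
    void *physics_state = fast_alloc(256 * 1024);

    /* Bulk assets read in big, latency-tolerant chunks: slow pool. */
    void *textures = slow_alloc(16 * 1024 * 1024);
    void *music    = slow_alloc(8 * 1024 * 1024);

    printf("all pools allocated: %s\n",
           (display_list && physics_state && textures && music) ? "yes" : "no");

    free(display_list); free(physics_state); free(textures); free(music);
    return 0;
}
```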
I'm far more concerned about the total size of the memory. 88MB is just extremely little.
i mostly agree. this time, on Wii, the slower memory isn't super-slow like it is on Gamecube (the 16 MB of slow DRAM called ARAM).
88 MB is extremely little, compared to the other two new consoles, but it's a lot for a last-gen console, which is what Wii is, sans the all-new controller.
The wii is (by definition) next gen.
It's a stupid definition, though. The 360, PS3 and Wii are current gen, if the PS2 is last gen. Or you can call the new consoles "next gen", if you refer to PS2 as "current gen" (not that this makes much sense to me). But having just "next gen" and "last gen" means that there's no current generation of consoles. Right?
If there's 4 MB on there, wouldn't that be enough for proper 32 bit colour? In which case, why are some titles dithered 24 bit? A throwback to GC development that'll end with the first gen titles? Or an indicator that there isn't much framebuffer room in there?
You're forgetting the z-buffer in your calculations. But no worries, what I wrote actually doesn't make sense at all. If Hollywood is straight 2xFlipper, the edram would be doubled too. 4MB of framebuffer memory would be more than enough for proper widescreen.
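Back-of-the-envelope version of that z-buffer point, assuming a 640x480 target and straightforward packing (the real eFB is tiled and the formats differ in detail, so this is only the rough shape of the argument):

```c
#include <stdio.h>

int main(void) {
    const double MB = 1024.0 * 1024.0;
    const double px = 640.0 * 480.0; /* typical SD render target */

    /* "Proper" 32-bit colour plus a 32-bit Z value per pixel. */
    printf("RGBA8 + Z32: %.2f MB\n", px * (4 + 4) / MB); /* ~2.34 MB */

    /* The dithered GC-style layout: 24-bit colour + 24-bit Z. */
    printf("RGB8  + Z24: %.2f MB\n", px * (3 + 3) / MB); /* ~1.76 MB */
    return 0;
}
```

The 24+24 layout squeaks into a Flipper-sized (roughly 2 MB) eFB, the 32+32 one doesn't, and both fit a doubled 4 MB eFB with room to spare, which is consistent with both the dithering question and the answer above.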