WiiGeePeeYou (Hollywood) what IS it ?

But since Wind Waker's 30fps was as close to 60fps as 30fps is ever going to get (I don't know what they did, but it was damn effective), I'm quite fine with that.
 

It was a steady 30fps, and I think triple buffering was used to guard against framerate drops.

Monkey bowling!!!!! The physics weren't perfect, but they were there.

I still have Super Monkey Ball, and I don't remember its physics being even close to on par with Wii Sports. I should probably check, though.
 
So now that we know the Wii has 64 MB of GDDR3 and 24 MB of 1T-SRAM, it brings up the question of how devs will use this RAM.
1. What is GDDR3 typically best suited for (graphics?) vs. 1T-SRAM (other stuff?)
2. Will they use just the 64 MB for graphics, or will all 88 MB be available for rendering a scene?
3. Is all 88 MB available for Wii games, or only the 64+3? (It would seem like a huge waste, with a system already so limited on resources, to let 24 MB sit idle.)

Also, I noticed while watching a GC game being played on the Wii that when you start a GC game, instead of the game just starting, the GC startup happened first, as if the Wii is emulating the GC instead of just playing the game. I don't know what it means; I just thought it was interesting.
 
The PS2 played the PS1 startup when playing a PS1 game as well. However, on Wii, if you play a GC game you have to reset the system to get the Wiimotes to function again or to get back to the Wii menu. It's downright annoying, because you need a Wiimote to start the GC game as well, and the power button on the Wiimote won't even reset the system. Unless, of course, I'm missing something. I skimmed through the manual but don't remember reading anything about it.
 
2. Will they use just the 64 MB for graphics, or will all 88 MB be available for rendering a scene?

There is more in a game than just gfx: sound, AI maps, physics, animations...
Anyway, I guess the GDDR3 will mostly be used for gfx and sound, and the CPU will work mainly with the 1T-SRAM.

3. Is all 88 MB available for Wii games, or only the 64+3? (It would seem like a huge waste, with a system already so limited on resources, to let 24 MB sit idle.)

Why would they limit it?
 
The PS2 played the PS1 startup when playing a PS1 game as well. However, on Wii, if you play a GC game you have to reset the system to get the Wiimotes to function again or to get back to the Wii menu. It's downright annoying, because you need a Wiimote to start the GC game as well, and the power button on the Wiimote won't even reset the system. Unless, of course, I'm missing something. I skimmed through the manual but don't remember reading anything about it.

Well, there is hope now that a Nintendo system can be updated.
 
But since Wind Waker's 30fps was as close to 60fps as 30fps is ever going to get (I don't know what they did, but it was damn effective), I'm quite fine with that.

I didn't think Wind Waker was anywhere remotely close to 60fps; it was 30fps. The look of the game and the filters/blurring perhaps made it look different/smoother to most people than a typical 30fps game, but in no way was Wind Waker anything other than ~30fps. The animation was great; perhaps that too had a hand in making the game appear to have a higher framerate than it did.

The only thing Wind Waker related that *might* have been 60fps was the first and very shocking trailer shown at SpaceWorld 2001. But the game itself was definitely 30fps. That's the framerate Nintendo usually finds adequate for most games.


Now, as for the upcoming Mario Galaxy: in some of the videos, it *appears* that the framerate is close to 60fps, but don't quote that as gospel, I could well be wrong, and even IF it is ~60fps, Nintendo might end up cutting it to 30fps as they've done before. Although I would want any platformer to be 60fps.

edit: seems I was right about Mario Galaxy

I played Mario Galaxy, which really just felt right. BTW, the game runs at a super super super super smooth 60fps. Placed right next to DQ Swords, which was 30fps, really makes you appreciate just how awesome 60fps is. In fact, seeing the game in person genuinely impressed me. It's really, really clean.

http://www.neogaf.com/forum/showthread.php?t=130738
 
Also, I noticed while watching a GC game being played on the Wii that when you start a GC game, instead of the game just starting, the GC startup happened first, as if the Wii is emulating the GC instead of just playing the game. I don't know what it means; I just thought it was interesting.

It does not do the GC bootup; it just plays the GC bootup noise while you're still in the Wii OS looking at the disc channel. Now for a crazy wackjob theory... the extra 20% of unexpected die size on the CPU is due to virtualization hardware! (Doubt it, you can't even quit out to the host OS from within a Cube game.)

Also interesting: while I didn't do any exhaustive testing, it appears GameCube games are free of slowdown when played on the Wii. I was only playing on an SDTV and not an HDTV, but the games I tried (RE4, the demo of Rebel Strike, and Super Smash Bros. Melee with Poké Balls on very high) seemed to maintain whatever their default framerate was. Additionally, the flicker filter option in SSBM seemed to give better results than I remember on the Cube, but since I haven't played over anything less than component video (even when running interlaced) in years, it's probably just the extra blur from composite hiding most of the detail of the image anyway, jaggies and textures.

I'd be surprised if Nintendo left the clock speeds at full when running GameCube games (though as long as they maintain the same ratios, it should work, right? Well, unless Nintendo allowed games to be made that didn't use vsync), but it would be nice if GameCube games had slowdown removed. Though they never had all that much slowdown to begin with.
 

I just watched the video again, and you're right, it doesn't do the full bootup.
As to the other stuff, I wouldn't be surprised to see some form of very subtle improvement with GC games. I was reading an article on 1UP and they stated that VC games showed cleaner and better IQ even though the games were perfectly emulated. It's nothing conclusive, but it does point to something better going on.

http://www.1up.com/do/feature?cId=3155418


3. If Mario 64 is any indication, N64 software is going to look way better than on an actual N64. Textures look cleaner, sprite objects are sharper, and all that blurry anti-aliasing has been toned down. The result? Crisp, attractive graphics. And finally playing N64 games on a better controller makes the package even better.
 
Well, I'd expect N64 games to look better. The Zelda games emulated on GameCube looked better than their N64 counterparts, and PC emulation showed a long time ago that proper bilinear filtering and higher res can go a long way to helping N64 games. Real bilinear filtering and increasing the res from 320x240 to 640x480 were all Nintendo needed to make the emulated Mario 64 look much better than what the N64 could do.

And I doubt the emulated 2D games look better than they do in a PC emulator. They're just pixel-perfect versions of the classic games, but with high-quality video output free of interference and insufficient bandwidth.
 
ninzel said:
So now that we know the Wii has 64 MB of GDDR3 and 24 MB of 1T-SRAM, it brings up the question of how devs will use this RAM.
1. What is GDDR3 typically best suited for (graphics?) vs. 1T-SRAM (other stuff?)
No memory type really has a natural best application. GDDR is called GDDR just to separate it from the PC main memory spec revisions, not because it was "optimized for graphics" in any way. Though you could say that type of memory is optimized for being soldered onto a PCB to act as non-upgradeable memory, as opposed to being optimized for a mixed-and-matched jumble of memory sticks with an unknown total number and different sizes and manufacturers, if you catch my drift.

What's far more important here is that the GDDR has such a dinky interface, low bandwidth, and in all likelihood terrible latency, especially in comparison to the 1T-SRAM.

So in the Wii the GDDR is really slow memory while the 1T-SRAM is fast -- no idea how fast exactly (yet) but faster than the GDDR in any case.
ninzel said:
2. Will they use just the 64 MB for graphics, or will all 88 MB be available for rendering a scene?
Developers will avoid using the GDDR for graphics as hard as they can. They just won't be able to avoid it completely.
It's probably safe to say that a majority of the system memory, and that includes the GDDR space of course, will be used for graphics assets, but that applies to all consoles, not just the Wii. Better textures and more detailed models always make a difference and hence are a good way to use up the available memory.

It probably isn't a huge problem either way. Flipper has eDRAM for the framebuffer and a large texture cache that behaves a bit like virtual memory, so it can cope with poorly performing external memory really well. Hollywood should have inherited that ability.
ninzel said:
3. Is all 88 MB available for Wii games, or only the 64+3? (It would seem like a huge waste, with a system already so limited on resources, to let 24 MB sit idle.)
I have absolutely no doubt that all of it will be available to native Wii games. As you say, it would be a waste. Also, the GDDR alone would be just too slow for a new console, even by Nintendo's humble standards.
 

So then why use GDDR at all, is it cheap? Doesn't it seem strange that the largest pool of memory would be the worst performing? Ever since I saw these specs I've been trying to figure out why Nintendo devised this memory setup, and this idea that GDDR3 is poor compared to 1T-SRAM makes it more confusing. Last gen the largest pool was the fastest. I thought Nintendo made efficient, well thought out systems.
 
The GDDR3 is simply a very cheap mass-market product. The SRAM die alone, even though it's not even half the storage space, is two thirds of the size of the whole GDDR3 chip package. That should tell you something.

Having a small amount of fast memory plus a larger amount of slower memory isn't all that bad; it's not even unusual. It forces programmers to be careful about which data goes where and when, but they usually can handle that. It's the same with the GBA, btw, and with the DS and many other systems. You do it on other levels as well: every PC processor (or Broadway itself) has a cache that is a smaller, faster companion to the main memory, and on the other side of the spectrum there's virtual memory (on the HDD) to complement system memory.

I'm far more concerned about the total size of the memory. 88MB is just extremely little.
 

I mostly agree. This time, on Wii, the slower memory isn't super-slow like it is on GameCube (the 16 MB of slow DRAM called ARAM).

88 MB is extremely little compared to the other two new consoles, but it's a lot for a last-gen console, which is what Wii is, sans the all-new controller.
 

The Wii is (by definition) next gen.

Sony and MS probably like the "last gen with a new controller" idea for marketing reasons, but that doesn't change the fact.

The other point is that, in the case of the PS3 and X360, a big part of the memory is needed for hi-def. That means the Wii can deliver the same visual performance, only at a lower resolution. Not a bad deal, is it?
 
The Wii is (by definition) next gen.
It's a stupid definition, though. The 360, PS3 and Wii are current gen, if the PS2 is last gen. Or you can call the new consoles "next gen", if you refer to PS2 as "current gen" (not that this makes much sense to me). But having just "next gen" and "last gen" means that there's no current generation of consoles. Right?
 
You're forgetting the z-buffer in your calculations. But no worries, what I wrote actually doesn't make sense at all. If Hollywood is straight 2x Flipper, the eDRAM would be doubled too. 4 MB of framebuffer memory would be more than enough for proper widescreen.
If there's 4 MB on there, wouldn't that be enough for proper 32-bit colour? In which case, why are some titles dithered 24-bit? A throwback to GC development that'll end with the first-gen titles? Or an indicator that there isn't much framebuffer room in there?
 