OK, full interview from an anonymous third party about the Wii GPU.

Just remember that the CPU is not even twice as fast and the RAM size is less than doubled.

Even if you include GC's A-RAM, Wii still has more than twice the memory (91MB vs 43MB).

Really though, you just can't compare GC's A-RAM to any of Wii's memory (apart from maybe the 512MB flash memory or the 16MB buffer memory in Wii's DVD drive). It was used mostly to speed up load times, with about 2-4MB being used for sound. We don't know how much faster the CPU is; it could very easily be twice as fast.
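For what it's worth, here's the back-of-the-napkin tally behind those numbers. The pool sizes are the commonly reported ones, so treat this as a sketch rather than a confirmed spec sheet:

```python
# Rough memory tally (commonly reported pool sizes, in MB; approximate, not official).
gc_pools  = {"1T-SRAM": 24, "A-RAM": 16, "embedded framebuffer/texture cache": 3}
wii_pools = {"1T-SRAM (MEM1)": 24, "GDDR3 (MEM2)": 64, "embedded": 3}

gc_total, wii_total = sum(gc_pools.values()), sum(wii_pools.values())
print(gc_total, wii_total)                       # 43 vs 91 -> the figures quoted above
print(round(wii_total / gc_total, 2))            # ~2.1x with A-RAM included
print(round(wii_total / (gc_total - 16), 2))     # ~3.4x if you leave A-RAM out of GC's total
```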

The GPU may be twice the clock speed, but it isn't much more impressive in feature set apparently. So be prepared for some Cube-like graphics with maybe more polys and slightly better texture resolution.

The GPU runs 50% faster than Flipper, but it seems extremely likely that the GPU is doubled up, which would leave it at three times the power of Flipper. This balances well with what we know of the rest of Wii's main specs compared to GC (rough arithmetic after the list):

3 times the memory bandwidth
3.5 times the amount of main memory
3 times the GPU power
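Putting rough numbers on that (the 162/243 MHz clocks are the widely reported figures, and the "doubled up" pipelines are still an assumption, not a confirmed spec):

```python
# Rough scaling arithmetic behind the ratios above (162 MHz Flipper / 243 MHz
# Hollywood are the commonly reported clocks; doubled-up pipelines are assumed).
flipper_clock_mhz, hollywood_clock_mhz = 162.0, 243.0
clock_ratio = hollywood_clock_mhz / flipper_clock_mhz   # 1.5x from clock speed alone
pipeline_ratio = 2.0                                     # if the hardware really is doubled up
print(clock_ratio * pipeline_ratio)                      # 3.0 -> the "3x GPU power" figure
```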

Unless we get some quality devs in addition to N, who put effort into it, Cube-level may be the best we see for the most part. I'd wager that Mario Galaxy and Metroid Prime 3 are the machine's peak. After all, it's basically a Cube arch and people know it well. It's not like it's a weird, quirky design à la PS2. Cube was praised a zillion times for its leniency with bad code, and so we saw it get fully utilized rather quickly.

Being lenient with poor code doesn't mean it would get fully utilised easily. If anything, that would likely have the opposite effect.

Very few developers showed us anywhere near what GC could do. There's really no evidence to suggest that GC was ever fully utilised.
 
How, more so than dodgy port after dodgy port of PS2 titles? As long as developers target the Wiimote properly, ports won't be an issue. And if devs are just going to port software, it'd be better to port down from PS360 to high-quality, AA'd SDTV-res games than to port upwards from PS2 to low-quality, jaggy games, no? Well, to be honest, it doesn't make any difference. If Wii just gets downports, you'd be better off with an XB360. And if it just gets PS2 ports, you'd be better off with a PS2. Wii's only good for the custom software that uses its motion controls.

For me personally, I'd be happy taking Wii's abilities as they are, adding 4xAA, losing any dithering, upping the poly counts on characters a bit, and providing some great per-pixel lighting effects. It doesn't need a 48-pipe, 500 MHz GPU to manage that!

I agree that custom software is clearly the best thing for Wii.
 

Which one is?
If you are looking at current sales only, it looks like the Wii, but you have to take into account the current PS2 installed base, plus its continued sales, plus its software sales. I'm sure there are way more people playing the PS2 than the Wii right now.
 
If you're talking about the current most popular console, you have to look at current sales, not what it sold over 7 years. If you count like that, it might as well be that in 5, or maybe even 10 years' time, the PS2 is still the most popular console because it sold more in total than other consoles, but that doesn't mean it's still the most popular console 5 years from now.
 
If you're talking about the current most popular console, you have to look at current sales, not what it sold over 7 years. If you count like that, it might as well be that in 5, or maybe even 10 years' time, the PS2 is still the most popular console because it sold more in total than other consoles, but that doesn't mean it's still the most popular console 5 years from now.

It's a combination of current hardware and software sales, continued developer support, and past sales, not just past sales.
If there were no current PS2 software sales and no current hardware sales, it wouldn't be a factor. But there are.
I see what you are saying, which is why I'm not looking at PS1 or GameCube or N64. I think that the PS2 is in a unique position in that it continues to be so popular that it can't be ignored.
 
I was only posting to point out to some that there isn't really any price/cost consideration for Nintendo's choices.

That leaves us then with only one other possibility: it was a bewildering, foolhardy, and very shortsighted decision with no serious rationale. But I don't think that fits with anything Nintendo does, not as the most cost-conscious player in the entire console industry. That's why I think there are almost certainly costs and risks that you haven't accounted for on the back of your napkin, regardless of whether you are personally satisfied by the ones I have suggested.

Anyway, I wonder how many Wii games are going to be cross-platform with the PSP and PS2. Both of them have quite a bit of life left in them.
 
Something I was pondering today gave me cause to ask this question: the 24MB of 1T-SRAM is on the Hollywood die, right?

Could it be placed there to remove any bottlenecks associated with using the TEV's 16 stages?

This memory should be at a faster clock than what's in the GC; being backwards compatible, it clocks down during GC mode... I'm assuming. Also, why would Nintendo not at least ask for a solution to this problem?
 
Something I was pondering today gave me cause to ask this question: the 24MB of 1T-SRAM is on the Hollywood die, right?

Could it be placed there to remove any bottlenecks associated with using the TEV's 16 stages?

This memory should be at a faster clock than what's in the GC; being backwards compatible, it clocks down during GC mode... I'm assuming. Also, why would Nintendo not at least ask for a solution to this problem?

No. The TEV memory architecture (the 1MB texture buffer) is absolutely integrated with the TEV itself.

The only thing you can do is increase the memory bandwidth between the GPU and the main memory. And they did that. :)

Based on the information from the SDK, they would have had to double the TEV's memory: if they want to increase the number of pipelines, they have to increase the size of the texture buffer.


In the GC, the memory bandwidth was 512 bits, 128 bits for every pipeline; the texture memory has 16 separate areas, each of them able to access the memory separately.
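If you put numbers on that 512-bit path (using Flipper's usual 162 MHz clock, which is a commonly reported figure rather than something from the SDK quote above), it works out to roughly:

```python
# Rough texture-cache bandwidth sketch for a 512-bit path at Flipper's 162 MHz.
bus_width_bits, per_pipeline_bits, clock_hz = 512, 128, 162e6

pipelines = bus_width_bits // per_pipeline_bits              # 4 pixel pipelines
bandwidth_gb_s = (bus_width_bits / 8) * clock_hz / 1e9       # bytes per cycle * clock
print(pipelines, round(bandwidth_gb_s, 1))                   # 4 pipelines, ~10.4 GB/s on-chip
```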
 
No. The TEV memory architecture (the 1MB texture buffer) is absolutely integrated with the TEV itself.

The only thing you can do is increase the memory bandwidth between the GPU and the main memory. And they did that. :)

Based on the information from the SDK, they would have had to double the TEV's memory: if they want to increase the number of pipelines, they have to increase the size of the texture buffer.


In the GC, the memory bandwidth was 512 bits, 128 bits for every pipeline; the texture memory has 16 separate areas, each of them able to access the memory separately.

If a dev wanted to, could they use a portion of the main mem as a buffer?

Is Hollywood an 8-pixel-pipe part?
 
Even if you include GC's A-RAM, Wii still has more than twice the memory (91MB vs 43MB).

Really though, you just can't compare GC's A-RAM to any of Wii's memory (apart from maybe the 512MB flash memory or the 16MB buffer memory in Wii's DVD drive). It was used mostly to speed up load times, with about 2-4MB being used for sound. We don't know how much faster the CPU is; it could very easily be twice as fast.
I had thought Cube had 48 MB for some reason. You are also right that A-RAM was rather useless for general use. I should've thought more about that.

But what about the distinct possibility that the hardware may have the same size eDRAM? Doesn't that mean the hardware will perhaps have many of the same inherent limitations as Cube?

Very few developers showed us anywhere near what GC could do. There's really no evidence to suggest that GC was ever fully utilised.

Just what does "fully utilized" mean? I hate it when people say this about consoles. It reminds me of the stupid comments by PR people in interviews, where they say game X pushes the machine only Y%. Obviously this is just a way to hype people up more.

Factor 5 pushed the machine harder than they should've with Rebel Strike. It put out some amazing effects for something not far from a GeForce 2 (NSR!). The game ran at what looked like 15 fps sometimes, as a result. But it sure was pretty. I find it rather astonishing that you could think there is some more "untapped" hardware in there that they just missed.

I thought F-Zero's totally fluid graphics were even more impressive simply because, well, games on consoles should be smooth as butter. It's a closed platform, after all. When a dev like F5 has frame rate issues, what immediately comes to mind is that they are probably really scraping for performance to keep all their features going.

And then RE4 was basically the defining moment for the console's graphics, IMO, even if they had to pre-bake most of the lighting and shadowing to make the game. It's not like the hardware could do full-screen per-pixel lighting and stencil shadows, or run a higher color depth to eliminate all that banding. It also had some nasty texture aliasing (looked like missing mipmaps) because of the devs perhaps fighting with RAM space somewhere in the scheme of things. Twilight Princess has this same problem, too.

I know some people will bring up the Metroid Prime games. I haven't played more than part of the original (wasn't my thing), so I can't comment on those.

To me, Cube was pushed really hard in more than a few games. Right now, other than Nintendo, I'm not convinced that Wii has the same caliber of people working on it. I am also pretty worried about the number of ports. Will we get down-to-the-metal games from more than Nintendo? Actually, I'm quite disappointed by the game lineup in general.
 
Sorry, what I meant was, as additional buffer mem.

OK. The GPU needs some texture. It moves it from the main memory, or more precisely it moves some piece of the texture, and when it needs another one, it moves that into the buffer and trashes the previous one.
Because of this, the bandwidth between the memory and the GPU is the only thing that counts; the other things (even the latency) are just a cherry on the cake.
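In other words, something like this toy model, purely as an illustration (the buffer size and eviction policy here are made up for the example, not how the hardware actually schedules it):

```python
# Toy sketch of the fetch-and-trash pattern described above: a small on-chip buffer
# holds only the texture pieces currently in use; anything not resident has to come
# back over the GPU<->main-memory bus, which is why that bandwidth is what counts.
from collections import OrderedDict

class TextureBuffer:
    def __init__(self, capacity_pieces=8):          # capacity is arbitrary for the example
        self.capacity = capacity_pieces
        self.pieces = OrderedDict()                  # piece_id -> data, oldest first

    def fetch(self, piece_id):
        if piece_id in self.pieces:                  # already in the buffer: no bus traffic
            self.pieces.move_to_end(piece_id)
            return self.pieces[piece_id]
        if len(self.pieces) >= self.capacity:
            self.pieces.popitem(last=False)          # trash the oldest piece to make room
        data = f"<piece {piece_id} read over the main-memory bus>"
        self.pieces[piece_id] = data                 # move the new piece into the buffer
        return data
```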
 
Doesn't that mean the hardware will perhaps have many of the same inherent limitations as Cube?

As far as resolution and colour depth, perhaps; nothing relating to the extra main memory that I can think of, though.

Just what does "fully utilized" mean? I hate it when people say this about consoles. It reminds me of the stupid comments by PR people in interviews, where they say game X pushes the machine only Y%. Obviously this is just a way to hype people up more.

Hold on there, you were actually the one who used the term.. I just used it in response to your post..

Factor 5 pushed the machine harder than they should've with Rebel Strike. It put out some amazing effects for something not far from a GeForce 2 (NSR!). The game ran at what looked like 15 fps sometimes, as a result. But it sure was pretty. I find it rather astonishing that you could think there is some more "untapped" hardware in there that they just missed.

Neither of us knows how much more Factor 5 could have gotten out of GC. Framerate doesn't tell us anything in that respect, since a lower framerate doesn't necessarily show a limitation of the system itself.

There's nothing astonishing about thinking that Factor 5 could have gotten more out of GC with even more development time, either. People thought RL was amazing and claimed GC had been maxed out with a launch game; then came RS, which technically blew it out of the water.

You mention RE4 being a defining moment for GC graphics. But how many defining moments did PS2 have during its time? If PS2 development had stopped shortly after MGS2 people would have claimed that game fully utilised PS2.

BTW, I'm not claiming that I know for a fact that Capcom could have made a sequel to RE4 on GC that would have blown RE4 out of the water. Or that Factor 5 could have blown RS out of the water. Or that no game pushed the console hard. What I do think is that with the level of effort put into PS2 software over the same kind of time period, GC games would have kept on improving. Developers always find new tricks, little ways to squeeze more performance out of each new engine. So I just don't agree that it was necessarily fully utilised, which I take to mean that no developer could have improved on the visuals in the best-looking GC games.
 
It's strange to me that he thinks Wii's capabilities won't allow it to exceed where RS or RE4 technically end. I mean, GC's major limitation has always been its low allocation of memory. That's what held back the TEV's 16-stage shader capabilities. I'm not suggesting Wii will output something comparable to Uncharted: DF, but I do expect it to surpass what has been considered the best from Xbox 1.
 
It's strange to me that he thinks Wii's capabilities won't allow it to exceed where RS or RE4 technically end. I mean, GC's major limitation has always been its low allocation of memory. That's what held back the TEV's 16-stage shader capabilities. I'm not suggesting Wii will output something comparable to Uncharted: DF, but I do expect it to surpass what has been considered the best from Xbox 1.

That's kind of what I expect. Maybe halfway to the next gen or something. It should look awesome for an SD console, anyway.

I'm looking at the DS, too. Have you guys seen the trailer for Final Fantasy Crystal Chronicles on it? It's amazing!
 
It's strange to me that he thinks Wii's capabilities won't allow it to exceed where RS or RE4 technically end. I mean, GC's major limitation has always been its low allocation of memory. That's what held back the TEV's 16-stage shader capabilities. I'm not suggesting Wii will output something comparable to Uncharted: DF, but I do expect it to surpass what has been considered the best from Xbox 1.

No no no. I'm not saying that I think Wii is less impressive than Cube. I'm sure it will do better things simply because it's clocked higher, has more RAM, and supposedly has an 8-pipe GPU.

I think that those facts will let them push the Cube arch further than before, even if just using the same techniques.

I'm not entirely convinced of its superiority to NV2A yet though. Mainly due to feature-set, not fillrate or bandwidth.
 
BTW, I'm not claiming that I know for a fact that Capcom could have made a sequel to RE4 on GC that would have blown RE4 out of the water. Or that Factor 5 could have blown RS out of the water. Or that no game pushed the console hard. What I do think is that with the level of effort put into PS2 software over the same kind of time period, GC games would have kept on improving. Developers always find new tricks, little ways to squeeze more performance out of each new engine. So I just don't agree that it was necessarily fully utilised, which I take to mean that no developer could have improved on the visuals in the best-looking GC games.

Yeah it is definitely true that PS2 got way more attention. An order of magnitude more, lol. So, yeah, maybe Cube missed out on some interesting implementations because of that.

Still, I've always thought of the Factor 5 guys as the Futuremark of the console industry (with regards to their talent for digging deep into hardware). Hell, some of the dev tools for Wii and Cube are built by them! Since N64, they've been putting out amazing stuff.

It's rather odd actually. I don't really like anything they've done with respect to gameplay. Regardless, I own all of their N64 and Cube games just because of technical fascination. Rebel Strike does things that, at the time, I had only seen on my Radeon 9700 with DirectX 9.0 games. That was truly amazing considering what's inside that Cube.
 
Yeah it is definitely true that PS2 got way more attention. An order of magnitude more, lol. So, yeah, maybe Cube missed out on some interesting implementations because of that.

Still, I've always thought of the Factor 5 guys as the Futuremark of the console industry (with regards to their talent for digging deep into hardware). Hell, some of the dev tools for Wii and Cube are built by them! Since N64, they've been putting out amazing stuff.

It's rather odd actually. I don't really like anything they've done with respect to gameplay. Regardless, I own all of their N64 and Cube games just because of technical fascination. Rebel Strike does things that, at the time, I had only seen on my Radeon 9700 with DirectX 9.0 games. That was truly amazing considering what's inside that Cube.

Indeed, that's why I'm not as hyped up on Lair as some people are. Not to mention that game is really brownish.

I do think that the visuals of Wii games will improve a lot. But developers will probably have to learn how to do great things with lesser resources. Hell, look at Conker for the Xbox. That game looks damn near next gen to me. ;)

Some screenshots.

[Screenshots: Conker: Live & Reloaded]


Now, it's not next gen, but for a console looking to entertain those still using SDTVs, I think it's more than enough.

Another good example is Dewy's Adventure for the Wii. A video can be found here.

And another example of hardware being pushed to do things thought not to be possible (on that hardware) is Final Fantasy Crystal Chronicles for the DS.

Edit: Video seems to be down. :( A screenshot will have to do.

[Screenshot: Final Fantasy Crystal Chronicles (DS)]


Now, this is the first time I know of that the DS was able to render a town and characters all at once in a 3D environment. In the video they're constantly running around and fighting. There is some slowdown, though. But if I remember correctly, it also uses the bottom screen (in FFIII, the bottom screen wasn't used due to hardware limitations).

So yeah, the Wii could end up being pushed well beyond its limits in the future. It just depends on how successful the Wii is, how many third parties jump on the Wii, and how many of those are skilled developers.

Wow, run on sentence. :O
 