Teasy said:
The IGN article that said no shaders had been added to the GPU, and then went on to say that its own development source had no knowledge of the GPU at that point? I'm still unsure how much of that report to believe. It just seems very strange to me that ATI would claim that Hollywood has been developed from the ground up for the last 4 to 5 years for Wii and is not based on Flipper (which they did come out and say) if it was in fact just a 50% overclocked Flipper. I believe ATI also said outright that Hollywood would be DirectX9 minimum spec.
By the way, some of the early Wii game shots I've seen already show good use of bump mapping and other effects, even in some multi-platform games. Marvel Alliance, a multi-platform game from a company obviously not very technically gifted (look at the XBox version of the game, it stinks visually), has heavy bump mapping on some characters, detailed shadows on all characters and specular highlights all over the place. Those kinds of shiny effects were very rare even in the best exclusive games on GC, because of the extra work that had to be put into the TEV to program those kinds of effects, but they seem to be easy on Wii.
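For what it's worth, the "shiny" highlight itself is just a small piece of per-pixel math. A minimal sketch of the standard Blinn-Phong specular term below shows the kind of computation involved; the vector inputs and the shininess value are illustrative, not taken from any of the games discussed:

```python
# Minimal sketch of the standard Blinn-Phong specular term -- the per-pixel
# math that "shiny" highlights boil down to. Inputs are 3D tuples; names and
# the shininess exponent are illustrative assumptions, not from any real game.

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def blinn_phong_specular(normal, light_dir, view_dir, shininess=32.0):
    """Return the scalar specular intensity for one pixel."""
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    # Half vector between light and view directions
    h = normalize(tuple(a + b for a, b in zip(l, v)))
    return max(dot(n, h), 0.0) ** shininess

# Highlight is strongest when the surface faces the half vector:
print(blinn_phong_specular((0, 0, 1), (0, 0, 1), (0, 0, 1)))  # -> 1.0
```

On shader hardware this is a couple of instructions; on fixed-function combiner hardware the same result has to be faked with lookup textures and chained combiner stages, which is where the extra work comes from.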
H.A.M.M.E.R is another nice example of this. On the level I saw, the road looked bump mapped, but it was also fully destructible. If you hit the hammer on the floor, paving stones would break off in clumps; cars, lampposts, basically everything breaks when you hit it. This is another very rare thing on GC because of the lack of a vertex shader. So to me I'm seeing more than just a 50% overclock here, though I see you also agree with that to a degree. I just wish we'd hear something concrete about the GPU, so the speculation could end.
Couldn't really tell about hammer, the video I saw was too small and low res.
The Marvel game is also in development for PS3, so those could be shots from a crappy PS3 version.
If you check out the Super Smash Bros. Brawl video, you'll notice that the first characters are shown as they were in Super Smash Bros. Melee (NGC) before being turned into the new versions, which look better.
Well, Kirby and Pikachu didn't change...
Anyhow, it looked like just a new texture for both. Mario was given a slightly more detailed texture, and Link was made to look like he does in the new Zelda, rather than in Ocarina of Time. It could just be the additional graphics memory, if anything, that made this possible if it wasn't on GameCube.
About the shaders: they basically do the same thing as the TEV, but on the GC the ops/data that the TEV stages can do/store are very limited IIRC. As for the IGN article, that thing is really bad (quality-wise); I can't even tell whether he meant that the GPU has 3x more texture cache or whether he is just talking about the same 3MB from Flipper.
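To make the comparison concrete, here is a rough sketch of one TEV colour-combiner stage, based on the formula commonly given in GameCube/Wii homebrew documentation: out = (d ± ((1 − c)·a + c·b) + bias) × scale, clamped. The single-channel floats here are a simplification; real TEV stages operate on RGB/alpha values drawn from a fixed menu of inputs:

```python
# Rough sketch of one GameCube/Wii TEV colour-combiner stage, per the formula
# commonly given in homebrew GX documentation:
#   out = (d +/- ((1 - c) * a + c * b) + bias) * scale, clamped to [0, 1].
# Inputs a, b, c, d are plain floats for one colour channel here -- a
# simplification; real stages work on RGB/alpha from fixed selectable sources.

def clamp01(x):
    return max(0.0, min(1.0, x))

def tev_stage(a, b, c, d, bias=0.0, scale=1.0, subtract=False):
    """One fixed-function TEV combiner stage for a single channel."""
    lerp = (1.0 - c) * a + c * b        # blend a and b by c
    core = d - lerp if subtract else d + lerp
    return clamp01((core + bias) * scale)

# c = 0 picks input a, c = 1 picks input b (with d = 0, no bias/scale):
print(tev_stage(0.25, 0.75, 0.0, 0.0))  # -> 0.25
print(tev_stage(0.25, 0.75, 1.0, 0.0))  # -> 0.75
```

Each stage only blends and adds, so an effect like per-pixel specular has to be built by chaining several such stages and lookup textures, whereas a pixel shader just expresses the math directly; that's the gap being discussed here.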
Anyway, I still find it very strange that, on average, aside from textures, most Wii games look way worse than they should (considering it is at least GC HW + 2x the memory + 50% speed) once you compare them to SW/RE4/SC3. Even Zelda, which from what I read is one of the Wii's best-looking games, looks EQUAL to the GC version.
I'm not even sure the games have better textures overall. Red Steel had some hideous textures and character models, but the Need for Speed Underground style special effects help hide that really well. (They certainly made Underground look much better than its textures or polygon counts alone would, but there's somewhat of a backlash against that visual style... Beyond Good and Evil used it too, and many games now use it in combination with high polygon counts and good textures.)
The Zelda game IS the GameCube version, so I wouldn't judge it as a Wii game. Rather, judge that it looked better than most of the Wii games shown. If IGN's comments about specs are right, it looks exactly like it should: like GameCube graphics but slightly more. How good graphics look doesn't scale linearly with power (with the exception of major feature additions, like shaders); usually it's more like 10x the power is required before many people even notice a significant difference in quality, assuming all features are equal. The Xbox 360 just barely makes that power jump and gets ragged on for looking too much like Xbox. And then there are still people who claim the Dreamcast had better graphics than the PS2...
Graphics Processing Unit: Being developed with ATI.
Don't put too much hope into a grammatical error. Final clock speed probably isn't locked down yet, maybe even final memory amount isn't, but the hardware is not going to change significantly prior to launch.
Oh well, at least the games look good. It saves Nintendo work too: adding bump mapping and shininess to all their characters would require yet another artistic re-envisioning, and the one they already went through going from N64 to GameCube made many of their characters more detailed and realistic than fans thought appropriate. (Mario in Melee had visible denim stitches, Bowser looked like a vicious dinosaur; it was the "Who Framed Roger Rabbit?" of Nintendo.)