Analysts optimistic about Wii; THQ talk about Wii costs

The Virtual Boy was a long time ago, like the Power Glove; technology changes and becomes more feasible. I'm sure Nintendo are looking closely at VR, and if they find a good technology I've no doubt they'd use it. GPU power isn't an issue either. If the idea Nintendo have requires lots of power then they'll have no problem producing the necessary hardware. Having said that, I have no particular belief that their next console will be VR. Might be, might not be :)
 
Teasy said:
Not that it's important, considering I doubt very much that Wii is simply an overclocked GC to start with, and the fact that I think you're just fishing at the moment Demo :).. But since it's relevant to your post I might as well post it here. In Iwata's latest interview he said that Nintendo expect HDTV to be the standard in 5 years, so they will definitely support HD resolutions with the successor to Wii.

Good, I might actually have an HDTV in 5 years :)

Forget HD. Right now, most people can't use HD.
Wii barely even supports shaders! It's quite possible that Wii will never surpass the best looking xbox games because of this, it seems like quite a serious oversight.
 
Fox5 said:
Forget HD. Right now, most people can't use HD.
Wii barely even supports shaders! It's quite possible that Wii will never surpass the best looking xbox games because of this, it seems like quite a serious oversight.
I love the amount of trolling exuded from that last sentence.
Wii footage has already shown the console is capable of shaders and much more.
 
Fox5 said:
Forget HD. Right now, most people can't use HD.
Wii barely even supports shaders! It's quite possible that Wii will never surpass the best looking xbox games because of this, it seems like quite a serious oversight.

Wha? Games don't look "best" just because the platform they're on is the "most powerful".

In other words, look at it this way. The platform is the POTENTIAL look of the game but it's up to the developer to live up to that potential (become ACTUAL).

RE4 happens to be one of the best looking games (imo) of the previous generation. There is little reason to believe that a souped-up GCN couldn't have games that look superior to the "best looking" Xbox game.
 
off topic

Fox5 said:
Forget HD. Right now, most people can't use HD.
Wii barely even supports shaders! It's quite possible that Wii will never surpass the best looking xbox games because of this, it seems like quite a serious oversight.

the difference between the output of shaders and fixed texture combiners is not so much in the look achievable by both as it is in the performance of those looks - shaders allow for sophisticated shading techniques w/o dying of bandwidth and/or trisetup starvation somewhere in the middle of the fixed-pipeline multi-passes. for things that can be implemented in one or two passes in a clever fixed-texcombiners pipeline, shaders do not have much to offer though.
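to make that bandwidth point concrete, here's a toy back-of-the-envelope model. this is a sketch with made-up numbers (the two-stage-per-pass limit, the RGBA8 640x480 target) - not actual Flipper/Hollywood figures - of why multi-pass fixed-function blending eats framebuffer bandwidth while a shader keeps intermediate math in registers:

```python
# toy model, illustrative numbers only: a fixed combiner pipeline with a
# limited number of stages per pass has to resolve long blend chains by
# re-reading and re-writing the framebuffer once per extra pass, while a
# shader does all the intermediate math in registers and writes once.

STAGES_PER_PASS = 2          # assumed combiner stages available per pass
BYTES_PER_PIXEL = 4          # assumed RGBA8 render target

def passes_needed(blend_ops):
    # ceiling division: how many passes to fit all the blend ops
    return -(-blend_ops // STAGES_PER_PASS)

def fixed_pipeline_traffic(blend_ops, pixels):
    # every pass costs a full read + write of the render target
    return passes_needed(blend_ops) * pixels * BYTES_PER_PIXEL * 2

def shader_traffic(pixels):
    # one final write; intermediates live in the register file
    return pixels * BYTES_PER_PIXEL

if __name__ == "__main__":
    px = 640 * 480
    for ops in (2, 4, 8):
        print(ops, "ops:",
              fixed_pipeline_traffic(ops, px) // 1024, "KiB fixed vs",
              shader_traffic(px) // 1024, "KiB shader")
```

with these made-up numbers an 8-op effect costs 8x the framebuffer traffic on the fixed path vs the shader path, which is roughly the "dying of bandwidth" failure mode above; a 1-2 op effect costs nearly the same either way.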
 
sfried said:
I love the amount of trolling exuded from that last sentence.
Wii footage has already shown the console is capable of shaders and much more.

Nothing beyond what was done on gamecube, just at a higher framerate.
The shading in the Mario Universe game looks on par with Pikmin 2's boss battles, but running at 60 fps instead of an unsteady 30fps.
The pixel shader effects in Red Steel look like Need for Speed Underground (I'm sure there's a better example, but it's all I can think of at the moment), but once again with a better framerate.

In other words, look at it this way. The platform is the POTENTIAL look of the game but it's up to the developer to live up to that potential (become ACTUAL).

RE4 happens to be one of the best looking games (imo) of the previous generation. There is little reason to believe that a souped-up GCN couldn't have games that look superior to the "best looking" Xbox game.

I think shaders are a defining next-gen feature. GCN didn't have too bad of a time up against Xbox because only the very latest and best Xbox games made good use of shaders. However, most of the current Xbox 360 games look like the best of the Xbox 1 games, but in high res. Wii games... look like GameCube games... at standard resolution and a higher framerate.
But it's pretty telling when Zelda, a GCN game, looks as good as or better than most of the Wii games, and Rebel Strike looks better than any Wii game shown, and Rebel Strike is no Chaos Theory.

shaders allow for sophisticated shading techniques w/o dying of bandwidth and/or trisetup starvation somewhere in the middle of the fixed-pipeline multi-passes. for things that can be implemented in one or two passes in a clever fixed-texcombiners pipeline, shaders do not have much to offer though.

Wii's got edram (and so does Cube), shouldn't that solve the bandwidth limitation? And for the trisetup, well the Wii's new cpu is decently fast compared to the gpu.
 
Fox5 said:
Forget HD. Right now, most people can't use HD.
Wii barely even supports shaders! It's quite possible that Wii will never surpass the best looking xbox games because of this, it seems like quite a serious oversight.

Excellent! Someone finally knows the specifics of the Hollywood GPU! Please outline the chip and its features for us, and of course link me to the announcement of the specs. Can't wait!

By the way, when you talk about shaders that are beyond what was possible on GC, could you give me some examples of what you see as beyond what was possible on GC? I'm being serious this time :)
 
Fox5 said:
I think shaders are a defining next-gen feature. GCN didn't have too bad of a time up against Xbox because only the very latest and best Xbox games made good use of shaders. However, most of the current Xbox 360 games look like the best of the Xbox 1 games, but in high res. Wii games... look like GameCube games... at standard resolution and a higher framerate.

shaders are next-gen, no doubt about it - production-wise, though. what do we care from our consumer perspective (as regardless of our occupations and pastime interests, we're consumers when we buy it)? can fixed pipelines produce immersive visuals - yes, they can. from there on it's the duty of the developer to use that in a way that would make their game enjoyable and atmospheric. you, me, and the next guy's dog need not worry how it will be done ..as strange as it may sound coming from me on a tech board.. that is unless, of course, you want to discuss a certain visual effect in a certain title.

But it's pretty telling when Zelda, a GCN game, looks as good as or better than most of the Wii games, and Rebel Strike looks better than any Wii game shown, and Rebel Strike is no Chaos Theory.

i personally liked what i saw from the red steel footage. actors in the play shots had nice self-shadowing (cough, something some boastful titles on other platforms still seem to have issues with, cough) and the shading was artistic, but most of all the game does seem to have atmosphere. i don't care how that's done (even if i have to apply effort to turn off certain centers in my brain while playing it). i like what i see in red steel. don't you?

Wii's got edram (and so does Cube), shouldn't that solve the bandwidth limitation?

edram is still way slower than a shader's register file.

And for the trisetup, well the Wii's new cpu is decently fast compared to the gpu.

i'd be surprised if the trisetup was not done on board the gpu. i.e. the cpu would have little to do aside from setting up display lists, it'd be exclusively up to hollywood. point being, it's still faster not having to do it.
 
Teasy said:
Excellent! Someone finally knows the specifics of the Hollywood GPU! Please outline the chip and its features for us, and of course link me to the announcement of the specs. Can't wait!

By the way, when you talk about shaders that are beyond what was possible on GC, could you give me some examples of what you see as beyond what was possible on GC? I'm being serious this time :)

Bleh, going with IGN's statement (no new shaders added to the hardware), along with the fact that the graphics look just like GameCube but more, I'd say it's safe to assume the hardware has the same feature set. Bump mapping is so prevalent at this point that if the hardware was readily capable of it, you can be sure at least one Wii game would have been shined all the way to hell.

As for beyond what was possible on GC, nothing on the Wii looks beyond the capabilities of the GameCube. As for other systems, well, if you discount performance limitations and the existence of Rebel Strike, I'd say you could look at almost any recent Xbox, Xbox 360, or PC game as beyond the GameCube's feature set. Of course, Rebel Strike contradicts that: it looks close to as good as any Xbox game and runs at near 60 fps versus 30 fps for most Xbox games, so maybe with the right programming expertise the GameCube/Wii is capable of modern-looking graphics. But I haven't heard of the Rebel Strike engine being licensed by anyone, nor has anyone else developed an engine of similar ability for the hardware. (Perhaps the Rebel Strike engine doesn't play to the GameCube's strengths, or maybe it's just not easy to do? Still, wasn't the advantage of using the same hardware supposed to be an easier development process and established tools?)

actors in the play shots had nice self-shadowing (cough, something some boastful titles on other platforms still seem to have issues with, cough) and the shading was artistic, but most of all the game does seem to have atmosphere. i don't care how that's done (even if i have to apply effort to turn off certain centers in my brain while playing it). i like what i see in red steel. don't you?

IIRC, wasn't self-shadowing something the GameCube hardware was good at? It was strange that few games other than Super Smash Bros. Melee used it.
The game does look good, better as a whole than its individual parts would suggest, but it also has the same washed-out, overbright look used in just about every racing game for the last 3 years.

BTW, I kind of hope it's revealed that the Wii has two TEVs. What's currently shown, though it looks just like GameCube but more and at a higher framerate, looks like more than just the clock speed upgrade could give. A second TEV would mean the current graphics are still a bit conservative, and would put the Wii more on par with what the performance rumors were saying prior to IGN's announcement that it's basically an overclocked Cube.
 
Bleh, going with IGN's statement (no new shaders added to the hardware), along with the fact that the graphics look just like GameCube but more, I'd say it's safe to assume the hardware has the same feature set. Bump mapping is so prevalent at this point that if the hardware was readily capable of it, you can be sure at least one Wii game would have been shined all the way to hell.

The IGN article that said no shaders had been added to the GPU and then went on to say that its own development source had no knowledge about the GPU at that point? I'm still unsure of how much to believe about that report. It just seems very strange to me that ATI would claim that Hollywood has been developed from the ground up for the last 4 to 5 years for Wii and is not based on Flipper (which they did come out and say) if it was in fact just a 50% overclocked Flipper. I believe ATI also said outright that Hollywood would be DirectX9 minimum spec.

By the way, some of the early Wii game shots I've seen already show good use of bump mapping and other effects, even in some multi-platform games. Marvel Alliance, a multi-platform game from a company obviously not very technically gifted (look at the Xbox version of the game, it stinks visually), has heavy bump mapping on some characters, detailed shadows on all characters and specular highlights all over the place. Those kinds of shiny effects were very rare even in the best exclusive games on GC, because of the extra work that had to be put into the TEV to program them, but they seem to be easy on Wii.

H.A.M.M.E.R is another nice example of this: on the level I saw, the road looked bump mapped, but it was also fully destructible. If you hit the hammer on the floor, paving stones would break off in clumps; cars, lampposts, basically everything breaks when you hit it. This is another very rare thing on GC because of the lack of a vertex shader. So to me I'm seeing more than just a 50% overclock here, though I see you also agree with that to a degree. I just wish we'd hear something concrete about the GPU, so the speculation could end.
 
If you check out the Super Smash Bros. Brawl video, you'll notice that the first characters are shown as they were in Super Smash Bros. Melee (NGC) before being turned into the new versions, which look better.

I think that Hollywood has shaders, although I wouldn't bet on it.
 
Personally, besides textures, after seeing SW:RL/RE4, every Wii game looks worse than those IMO.

About the shaders: they basically do the same thing as the TEV, but on the GC the ops/data that textures can do/store are very limited IIRC. As for the IGN article, that thing is really bad (quality-wise); I can't even understand whether he meant that the GPU has 3x more texture cache or if he is just talking about the same 3 MB from Flipper.

Anyway, I still find it very strange that on average, besides textures, most Wii games looked way worse than they should (considering that it is at least GC hardware + 2x the memory + 50% speed) once you compare them to SW/RE4/SC3. Even Zelda, which from what I read is one of the Wii's best looking games, looks EQUAL to the GC version.

So I really wonder if they tuned the gfx down for E3. That would explain why the games look that way and why some big games/names didn't even show up (RE5, SC4, CoD3, the MGS guy's project...), plus why we heard in so many places that it would be on par for multiplatform games, or even explain the strange MR comments on HD rather than raw power/features. It wouldn't surprise me, although I wouldn't bet on it.

Anyway, maybe we will see more, as it looks like they don't have final hardware yet (although, even then, I don't expect something as dramatic as the difference from E3 '05 XB360 games to today's games).

http://wii.nintendo.com/hardware.html

CPU: PowerPC CPU (code-named "Broadway"). Made with a 90 nm SOI CMOS process, jointly developed with and manufactured by IBM.

Graphics Processing Unit: Being developed with ATI.
 
Teasy said:
The IGN article that said no shaders had been added to the GPU and then went on to say that its own development source had no knowledge about the GPU at that point? I'm still unsure of how much to believe about that report. It just seems very strange to me that ATI would claim that Hollywood has been developed from the ground up for the last 4 to 5 years for Wii and is not based on Flipper (which they did come out and say) if it was in fact just a 50% overclocked Flipper. I believe ATI also said outright that Hollywood would be DirectX9 minimum spec.

By the way, some of the early Wii game shots I've seen already show good use of bump mapping and other effects, even in some multi-platform games. Marvel Alliance, a multi-platform game from a company obviously not very technically gifted (look at the Xbox version of the game, it stinks visually), has heavy bump mapping on some characters, detailed shadows on all characters and specular highlights all over the place. Those kinds of shiny effects were very rare even in the best exclusive games on GC, because of the extra work that had to be put into the TEV to program them, but they seem to be easy on Wii.

H.A.M.M.E.R is another nice example of this: on the level I saw, the road looked bump mapped, but it was also fully destructible. If you hit the hammer on the floor, paving stones would break off in clumps; cars, lampposts, basically everything breaks when you hit it. This is another very rare thing on GC because of the lack of a vertex shader. So to me I'm seeing more than just a 50% overclock here, though I see you also agree with that to a degree. I just wish we'd hear something concrete about the GPU, so the speculation could end.

Couldn't really tell about H.A.M.M.E.R; the video I saw was too small and low-res.
The Marvel game is also in development for PS3, so those could be shots from a crappy PS3 version.

If you check out the Super Smash Bros. Brawl video, you'll notice that the first characters are shown as they were in Super Smash Bros. Melee (NGC) before being turned into the new versions, which look better.

Well, Kirby and Pikachu didn't change...
Anyhow, it looked like just a new texture for both. Mario was given a slightly more detailed texture, and Link was made to look like he does in the new Zelda rather than in Ocarina of Time. If anything, it could just be the additional graphics memory that made this possible, if it wasn't doable on GameCube.

About the shaders: they basically do the same thing as the TEV, but on the GC the ops/data that textures can do/store are very limited IIRC. As for the IGN article, that thing is really bad (quality-wise); I can't even understand whether he meant that the GPU has 3x more texture cache or if he is just talking about the same 3 MB from Flipper.

Anyway, I still find it very strange that on average, besides textures, most Wii games looked way worse than they should (considering that it is at least GC hardware + 2x the memory + 50% speed) once you compare them to SW/RE4/SC3. Even Zelda, which from what I read is one of the Wii's best looking games, looks EQUAL to the GC version.

I'm not even sure if the games have better textures overall. Red Steel had some hideous textures and character models, but the Need for Speed Underground style special effects help hide that really well. (It certainly made Underground look much better than its textures or polygon counts alone would, but there's somewhat of a backlash against that visual style... Beyond Good and Evil used it too, and many games now use it in combination with high polygon counts and good textures.)

The Zelda game IS the GameCube version, so I wouldn't judge it as a Wii game. Rather, judge that it looked better than most of the Wii games shown. If IGN's comments about specs are right, it looks exactly like it should: like GameCube graphics but slightly more. How good graphics look doesn't scale linearly with power (with the exception of major feature additions, like shaders); usually it's more like 10x the power is required before many people even notice a significant difference in quality, assuming all features are equal. The Xbox 360 just barely makes that power jump and gets ragged on for looking too much like Xbox. And then there are still the people who claim the Dreamcast had better graphics than the PS2...
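As a rough illustration of that claim (a toy model I'm making up for this post, not a real perceptual metric): treat the perceived visual improvement as growing with the log of the raw power ratio, so a 1.5x bump barely registers while a 10x jump reads as a full generational step.

```python
import math

# toy model, not a real metric: perceived improvement grows roughly with
# the log of the raw power ratio, so small multipliers barely register
# while an order-of-magnitude jump counts as one "generation".
def perceived_gain(power_ratio):
    return math.log10(power_ratio)

if __name__ == "__main__":
    for r in (1.5, 3.0, 10.0):
        print(f"{r}x power -> {perceived_gain(r):.2f} 'generations'")
```

Under this made-up scale, a 50% overclock is worth about 0.18 of a generation, which is consistent with Wii games reading as "GameCube but more".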

Graphics Processing Unit: Being developed with ATI.

Don't put too much hope into a grammatical error. The final clock speed probably isn't locked down yet, and maybe even the final memory amount isn't, but the hardware is not going to change significantly prior to launch.

Oh well, at least the games look good. It saves Nintendo work too; adding bump mapping and shininess to all their characters would require yet another artistic re-envisioning, and the one they already went through going from N64 to GameCube made many of their characters more detailed and realistic than fans thought appropriate. (Mario in Melee had visible denim stitches, Bowser looked like a vicious dinosaur; it was the "Who Framed Roger Rabbit?" of Nintendo.)
 
I pretty much agree that overall the games look just like GC ones, most of them worse than the best of GC. I will not really comment on textures till I see better movies. And taking into account that a 970FX + X1300-X1600 would be very low cost too, it is a real shame IMO.

About the "grammatical error", I think I just had a bit more hope.



Since ATI has been working on physics, maybe we will get a real upgrade. Although now I have really low hopes for that.
 
pc999 said:
I pretty much agree that overall the games look just like GC ones, most of them worse than the best of GC. I will not really comment on textures till I see better movies. And taking into account that a 970FX + X1300-X1600 would be very low cost too, it is a real shame IMO.

About the "grammatical error", I think I just had a bit more hope.



Since ATI has been working on physics, maybe we will get a real upgrade. Although now I have really low hopes for that.

Yeah, it would really stink if Nintendo skipped over better hardware for backwards compatibility. Well, unless they're going for a system-on-a-chip process, which given the specs might be possible, discounting memory. Actually, the GameCube already was almost a system on a chip; wasn't everything except the CPU and memory on the same die as the GPU?
As for textures, there are some hi-def movies out on the Internet. They seem to show that the Wii will retain the same crisp output the GameCube had, and they look good. But many GameCube games still look good, especially the ones that focused on creating a solid experience rather than pushing the limit. It's nice to see a game where you don't have to think "that would look so much better if it just had a higher/stable framerate" or "anti-aliasing" or "proper texture filtering". (I don't think Wii will be using AA though, but try to find the high-res Smash Bros and Mario Galaxy videos.)
 
Actually, the GameCube already was almost a system on a chip; wasn't everything except the CPU and memory on the same die as the GPU?
As for textures, there are some hi-def movies out on the Internet.

IIRC, yes. I will try to find them.

Fox5 said:
Yeah, it would really stink if Nintendo skipped over better hardware for backwards compatibility.

Maybe, just maybe...

 
NANOTEC said:
Nice find pc999:D :oops:

Nice HTML markup, I didn't know you could link a quote box.

That is an interesting comment, but Nintendo has made ambiguous statements before that turned out to be nothing like what was expected. I really like the job Nintendo marketing is doing with the Wii so far. Maybe the hardware is a bit more powerful than expected though; the 50W power envelope seems a bit high for essentially GameCube hardware shrunk from 180nm to 90nm, considering that the GameCube had a power envelope of about 40W.
I feel that the Wii, far more so than the Xbox 360 or PS3, deserves two SKUs. They actually could differentiate on power relatively easily (have the resolution and maybe a Dolby Digital encoder chip be the only things different between the systems) and keep the base functionality the same between the two, unlike the Xbox 360 and PS3, which differentiate on core features between their high and low SKUs. A second graphics chip would give the Wii the power it needs to go from 480p widescreen to 720p (though since all it really needs is doubled fillrate, memory bandwidth, and framebuffer size, it may make sense to just make a high-end chip rather than duplicating everything). I know it won't happen, but I'd love to see the base Wii launch at $200, and then a hi-def Wii launch at $300 or $400. Artistically, Zelda looks better than many Xbox 360 games, so something competing against the Xbox 360 price-wise for graphics might not be looked on too badly, with the right games.
 
Fox5 said:
Artistically, Zelda looks better than many Xbox 360 games, so something competing against the Xbox 360 price-wise for graphics might not be looked on too badly, with the right games.
To each his own taste; I don't like the new Zelda that much. I feel the gfx made it lose some of its charm...

That said I wasn't convinced about Wind Waker (screenshots) until I saw it in motion...
 
Fox5 said:
Nice HTML markup, I didn't know you could link a quote box.

That is an interesting comment, but Nintendo's had ambiguous statements before that turn out to be nothing like what's expected. I really like the job Nintendo marketting is doing with the Wii so far. Maybe the hardware is a bit more powerful than expected though
Without knowing the specs, the thing to look at is the graphics and games themselves (actually you're better off looking at these than at spec sheets!). If the 'hardcore graphics-fiend' gamers are supposed to keep paying attention and expect something better than guessed so far, wouldn't that show in the games? Or is the final Wii spec going to improve noticeably over what the devs have been developing for and have shown their games on? Because at the moment, though the Wii graphics are workable and not bad, they're not providing anything indicative of 'next-gen' power.
 