Nintendo Wii specs (courtesy of maxconsole.net)

hupfinsgack said:
Already in discussion in this thread.

OT: New, valid, discussion-worthy information should not be buried in 900+ post threads that are dominated by aimless conjecture and predictions. Most of us gave up on following that thread as it carried on, and on, and on with no decent summary of new information (because there was none!).

Seriously, this deserved a new thread. Anyhow, on topic: it looks like Matt was pretty much on the money if this ends up being right, and seeing as Matt claims Wii devs as his source, I think he is, especially after we saw the E3 footage. All those Matt bashers may owe him a beer. I would be VERY disappointed if this did not launch at $199 and quickly hit $149. It should be enough power and memory to get some pretty nice games, though.
 
Acert93 said:
I would be VERY disappointed if this did not launch at $199 and quickly hit $149.
Well, given the pricing on the other consoles, I think they can easily get away with $250 at launch. But yeah, I should hope they go lower.

PowderKeg said:
The Gamecube uses 1MB of its eDRAM for texture cache.
Which is about 30x more than most GPUs (including Xenos) have for texture cache. How you fathom that this constitutes a bandwidth limitation is beyond me.

darkblu said:
from themselves to the otherwise 100% (overclocked) Gekko that those specs depict?
As noted, they are 100% Gekko specs (the same 750cx bit is in the Gekko docs as well). If Nintendo pulls it off, it'll be a nice mockery of the next-gen concepts MS and Sony are pushing for; if there was ever an appropriate time for the "console Version 1.5" moniker, this would be it.
 
Fafalada said:
Well, given the pricing on the other consoles, I think they can easily get away with $250 at launch. But yeah, I should hope they go lower.

$249 Wii versus a $299 Xbox 360 Core... as a long-time Nintendo owner, and somewhat excited about the potential of the Wii controller, a $50 difference just doesn't convince me. Nintendo's 3rd-party software support has lagged, and it looks like people are scrambling to support Wii, whereas the 360 would have significantly better hardware and much broader software support.

Of course the 360 would not have the Wiimote, and MS has nowhere near the mindshare of great family-friendly franchises. But I am on B3D for a reason, and as much as I cherish gameplay, I would have a hard time throwing down $249 when for $50 more I could get a proper next-gen console.

Of course this is just me, but it seems to me $249 would put it more in competition with the "proper" next-gen consoles. $199, on the other hand... that is a console, 2 games, and maybe even an extra controller.
 
We have two much more credible sites (for those who don't remember) telling us that it is 2.5-3x GC (with or without advancements in features?), so why would we believe this one that says 1.5x GC? Unless GC really needs a lot of memory/BW, I don't see how a 50% speed-up of the same chips would make it.


Plus, in the last two pages (35-36 IIRC :LOL:) of the other thread there are a lot of reasons why this is at least strange, some from a tech POV, others not (like why they are still paying ATI/IBM if the job isn't done; or Retro saying that it is a new architecture; $200-250 and barely losing money, or maybe still losing, etc...).


BTW, I completely agree that if this is real then it is completely overpriced, unless it comes with some games and extra controllers, but I would prefer a cheaper console.
 
pc999 said:
We have two much more credible sites (for those who don't remember) telling us that it is 2.5-3x GC (with or without advancements in features?), so why would we believe this one that says 1.5x GC? Unless GC really needs a lot of memory/BW, I don't see how a 50% speed-up of the same chips would make it.
I think the change from GC to Wii is comparable to the change from PS2 to the slim PS2. The slim PS2 gained an Ethernet controller, while Wii gains auxiliary chips such as flash and WiFi. Wii has more memory than GC, but that's because Nintendo replaced ancient chips with currently available ones, which are cheaper on the market than extinct SDRAM and happen to come in bigger sizes. The 1.5x overclock is also a byproduct of the chip shrink; it might have been possible for the slim PS2 too. Overall, cost reduction is Nintendo's priority.
 
Great

thenefariousone said:
Maxconsole.net apparently has the full Nintendo Wii specs.

Thank you, my friend, for the Wii specs. I think this will have great graphics, better than the old gen. I feel Rogue Squadron and Resident Evil 4 had amazing graphics (maybe the best of the old gen, no?), so 3x the power is great!
 
ihamoitc2005 said:
I feel Rogue Squadron and Resident Evil 4 had amazing graphics (maybe the best of the old gen, no?), so 3x the power is great!

I'd guess a couple of Xbox games did a good deal more technically (Riddick, for one), but for overall polish and performance you could certainly make that case.

The question to me is how this config is going to age in 2 or 3 years next to a 360/PS3 that are really hitting their stride in visuals and in game design scope and complexity. Now, I'm excited that Nintendo is pushing in this asymmetric direction, and I'm interested to see how the development community embraces and evolves with the new interface. But to me this is a visual medium as much as it is an interactive one. I don't think the two aspects can be separated at all. Its potential relies on both marching forward. So for such a valued brand as Nintendo to essentially eschew one over the other is fascinating, and at the same time a bit disconcerting on another level. Interesting times ahead.
 
fearsomepirate said:
More like, how much difference did that extra 10 GB/s for framebuffer effects and texturing make? Answer: enough that a system with a much, much simpler and 30% slower CPU and about 40% less total RAM was able to output relatively comparable graphics and maintain the most consistent framerates of any of the 3 consoles. And it didn't lose money.
I've had little experience with GC, but weren't a lot of its games limited to 16-bit FBs? These specs would produce current-level graphics with better texture resolution and a few more characters, say, but in lower colour fidelity and without high levels of AA; it's going to look weak by comparison on visuals alone. That might not matter to their customers, but I for one would much rather play with smooth, jaggie-reduced graphics (XB360 and PS3 ought to be able to do amazing-quality SDTV) than what we have now. Current-gen graphics with lots of AA and AF would be fine, but with none of the IQ niceties, Wii is going to grate. Especially for those with big TVs.
 
Shifty Geezer said:
I've had little experience with GC, but weren't a lot of its games limited to 16-bit FBs?
Games that wanted destination alpha had to use a 24-bit format (6:6:6:6), which resulted in banding. It's most visible in P.N.03 (because of its use of grays and near-textureless surfaces), which by the way really deserves a successor IMO. Hear me, Capcom?
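To illustrate the banding point with a rough, hypothetical sketch (not GC code, just the arithmetic of channel depth): dropping from 8 to 6 bits per channel leaves only 64 grey levels, so a smooth ramp across a 640-pixel-wide screen repeats each value for roughly ten pixels, which reads as visible bands on flat, low-texture surfaces.

```python
# Rough illustration only: quantize an 8-bit grey ramp to 6 bits per channel,
# the way a 6:6:6:6 destination-alpha framebuffer would store it.

def quantize(value8, bits=6):
    """Truncate an 8-bit channel value to 'bits' of precision and expand back."""
    step = 256 // (1 << bits)              # 4 for 6-bit channels
    return (value8 // step) * step

ramp = list(range(256))                     # smooth 8-bit grey gradient
banded = [quantize(v) for v in ramp]
print(len(set(ramp)), "distinct levels vs", len(set(banded)))   # 256 vs 64
```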
 
Acert93 said:
$249 Wii versus a $299 Xbox 360 Core...
FWIW: I've already seen the Xbox 360 Core incl. Ridge Racer 6 as low as $215 (+VAT) on sale from a big retail chain around here (owned by Dixons). This indicates to me that $249 will be too high an asking price for Wii, since MS will be able to hit them on price with the 360 Core. They might not reduce the RRP, but they sure have the muscle to get a good number of consoles out at a competitive price through big retail partners/customers to declaw a $249 Wii launch. Thus, I doubt it will be more than $199.
 
[maven] said:
Games that wanted destination alpha had to use a 24-bit format (6:6:6:6), which resulted in banding. It's most visible in P.N.03 (because of its use of grays and near-textureless surfaces), which by the way really deserves a successor IMO. Hear me, Capcom?

seconded. hear us, capcom?
 
one said:
I think the change from GC to Wii is comparable to the change from PS2 to the slim PS2. The slim PS2 gained an Ethernet controller, while Wii gains auxiliary chips such as flash and WiFi. Wii has more memory than GC, but that's because Nintendo replaced ancient chips with currently available ones, which are cheaper on the market than extinct SDRAM and happen to come in bigger sizes. The 1.5x overclock is also a byproduct of the chip shrink; it might have been possible for the slim PS2 too. Overall, cost reduction is Nintendo's priority.


Personally I don't agree very much; I would if this turns out to be completely real.

BTW, besides memory, everything only increased 1.5x (in those specs); would that give it 2.5-3x the performance of GC?

Personally I don't particularly mind relatively low specs as long as the price reflects it (and they basically said that price and cost would be very close, so if they are indeed looking to sell it at $200-250 it should be a bit more than this, no?), but what I don't like is if it can't do new gameplay forms like physics-driven gameplay (i.e., HL2 level). I will let the price speak for itself.

I think, given everything we know, this is very strange if real (multiple reasons above), and I doubt it is real.
I also find it strange that Nintendo did nothing to improve the specs beyond 2-3x (even with their strategy), but that is another thing.
 
If, as a developer, Wii had 16MB of eDRAM, is there anything you would use it for?

I was guessing AA, assuming the bandwidth of the eDRAM would be pretty high.

I want to know why Nintendo would release devkits without Hollywood if it was just an overclocked Flipper.
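For what it's worth, here is a back-of-the-envelope sizing of the AA guess, assuming RGBA8 colour and a 24/8 Z/stencil buffer (assumptions on my part, not known Hollywood formats): a 480p target with 4x multisampling would fit comfortably inside a 16MB pool.

```python
# Rough sizing sketch with assumed formats (not confirmed Hollywood specs):
# does a 640x480 framebuffer with 4x multisampling fit in 16MB of eDRAM?

WIDTH, HEIGHT = 640, 480
COLOR_BYTES = 4          # RGBA8 (assumption)
DEPTH_BYTES = 4          # 24-bit Z + 8-bit stencil (assumption)
SAMPLES = 4              # hypothetical 4x MSAA

pixels = WIDTH * HEIGHT
msaa_buffers = pixels * SAMPLES * (COLOR_BYTES + DEPTH_BYTES)   # ~9.4 MB
resolved = pixels * COLOR_BYTES                                  # ~1.2 MB

print(f"4x MSAA colour+Z: {msaa_buffers / 2**20:.1f} MB")
print(f"resolved colour:  {resolved / 2**20:.1f} MB")
print(f"total:            {(msaa_buffers + resolved) / 2**20:.1f} MB of 16 MB")
```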
 
pc999 said:
BTW, besides memory, everything only increased 1.5x (in those specs); would that give it 2.5-3x the performance of GC?
Doesn't the GPU have some new fixed functions added? Also, the 1T-SRAM might be upgraded to 1T-SRAM-Q. These minor changes could collectively contribute additional performance, though it's not clear how much they really add.
pc999 said:
but what I don't like is if it can't do new gameplay forms like physics-driven gameplay (i.e., HL2 level). I will let the price speak for itself.
As for physics, in this June 21 interview with the producer of Elebits, the Wii game by Konami, he talks about implementing physics in the game based on a certain engine (probably Havok).
http://watch.impress.co.jp/game/docs/20060621/ele.htm
-- I assume there are limitations such as calculation cost and memory constraints; how did you allocate resources?

Mukaitouge: Rather than resources, the calculation speed was the painful part, as we had feared... The bottleneck is not graphics but physics; the frame rate was terrible. On several occasions I thought, "Is this right? Perhaps we should give up on physics." Then, about a month ago, a bit before E3, the goal came in sight. Until then each programmer had been optimizing here and there, bit by bit, without being able to produce a good result. But a month ago all the pieces fell into place, and the day of "hey, it's running at 60fps!" came; the developers were really excited. Though I don't know if it will run at 60fps in the final version, it can keep 30fps at least, I think. That part was the most painful. Now that it has reached a level where it's OK as a game, I suppose it can proceed as it is.

-- You mean the graphics are not as painful?

Mukaitouge: It seems the graphics have headroom, relatively speaking. Elebits is not a game in which you do something with the latest technologies such as normal mapping. Since it has headroom in terms of graphics, I'm planning to do something more.
You know, I read a Wii developer interview in the other thread in which the guy says the Wii CPU is comparable to an Athlon and the Wii GPU has some physics acceleration, then I read this interview with the Elebits developer, which was enough to make me regard the rumor of a Wii GPU with physics acceleration as BS, unless the Elebits producer is talking about an old GC devkit throughout the interview.
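(As a general aside, and purely as an illustration rather than anything the Elebits team has described: the usual way a physics-bound game keeps a stable frame rate is to run the simulation on a fixed timestep, decoupled from rendering, so a heavy physics load costs rendered frames rather than simulation stability.)

```python
# Generic fixed-timestep loop, illustration only (not Elebits or Havok code).
# Physics advances in constant 1/60 s steps; rendering just interpolates,
# so heavy simulation shows up as fewer rendered frames, not a broken sim.
import time

PHYSICS_DT = 1.0 / 60.0

def step_physics(dt):
    pass                            # placeholder for the real simulation step

def render(alpha):
    pass                            # placeholder; alpha blends between states

def main_loop(run_seconds=1.0):
    accumulator, previous = 0.0, time.perf_counter()
    deadline = previous + run_seconds
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= PHYSICS_DT:     # catch up in fixed-size steps
            step_physics(PHYSICS_DT)
            accumulator -= PHYSICS_DT
        render(accumulator / PHYSICS_DT)

main_loop()
```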
 
one, that interview you're referring to was fake, or at the very least something Ubisoft has had to deny took place; since fearsome washed his hands of it, no one takes it seriously.
 
one said:
Doesn't the GPU have some new fixed functions added? Also, the 1T-SRAM might be upgraded to 1T-SRAM-Q. These minor changes could collectively contribute additional performance, though it's not clear how much they really add.

According to this leak we can't know, but I hope for (at least) that.
I had interpreted it as everything being 2.5-3x faster (i.e., a CPU 2.5-3x faster, a GPU 2.5-3x faster...).

As for physics, in this June 21 interview with the producer of Elebits, the Wii game by Konami, he talks about implementing physics in the game based on a certain engine (probably Havok).
http://watch.impress.co.jp/game/docs/20060621/ele.htm

Thank you very much for the translation :D.

It is hard to tell if there is more or less physics than HL2 (really different games), or whether it is enough for other kinds of / more complex games, e.g. FPS/RTS... (personally it seems there is more physics than HL2, if everything reacts at the same time), but is it still there after you put in animations, AI...? I hope so.

Anyway, the really interesting question is how they resolved the lack of calculation power if optimization hadn't worked before, especially at 60fps (that should be at least 3x faster) :?:


You know, I read a Wii developer interview in the other thread in which the guy says the Wii CPU is comparable to an Athlon and the Wii GPU has some physics acceleration,

That was a supposed interview by a supposed friend of fearsomepirate; he should confirm to us later whether it's real, but even he already said it must be fake (he had no news from him).

then I read this interview with the Elebits developer, which was enough to make me regard the rumor of a Wii GPU with physics acceleration as BS, unless the Elebits producer is talking about an old GC devkit throughout the interview.

Personally, that gives me even more doubts, because according to IGN/EA/Retro they only got dev kits that resemble Wii hardware a little before E3 (IIRC between 1 month and 2 weeks before), around the same time they got them working. Plus, if there is a GPU physics rumor, you should blame them.

Elebits Interview on May 11 said:
IGN: Is the hardware as easy to use on the Wii as it was with the GameCube? The two systems are very similar in structure, we're told.

Konami: Yes, the structure is very similar to GameCube, but you already knew that. The development was not that difficult, as the Wii system has built-in physics simulation. That helped the process.

BTW/PS: one of the other arguments at the time was, if Broadway is just an overclocked Gekko and there is nothing to help with physics, then why is Havok optimising their engine for Wii? Although I am not sure whether this is a good argument or not.
 
darkblu said:
not really. because the framebuffer is such a negligible BW contributor. it's really easy to forget about it. i mean, i still can't figure out for the life of me why MS bothered with all the trouble to implement one in the 360, poor shmucks..
Negligible?! http://www.beyond3d.com/forum/showpost.php?p=502341&postcount=44
The only reason PS3 didn't go with eDRAM this time around was that it would have taken way too much of it to contain an HDTV frame.
Wii, with its EDTV resolution, can do it without going over budget. I just really hope Nintendo has included enough for an RGBA8 mode.
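Quick arithmetic on that, assuming RGBA8 colour plus a 32-bit depth buffer for both targets (the formats are assumptions): a single 720p frame already needs roughly three times the eDRAM of a 480p frame, before any multisampling.

```python
# Back-of-the-envelope framebuffer footprint, assumed formats (RGBA8 + 32-bit Z).
BYTES_PER_PIXEL = 4 + 4        # colour + depth (assumption)

for name, w, h in [("480p EDTV", 640, 480), ("720p HDTV", 1280, 720)]:
    print(f"{name}: {w * h * BYTES_PER_PIXEL / 2**20:.1f} MB")
# 480p EDTV: ~2.3 MB   720p HDTV: ~7.0 MB -- before any multisampling
```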
 
Squeak said:
John Ruskin

I just had to laugh when I read that quote. I mean, that's a lot coming from a guy who could just have said "to be read and understood, keep it short and simple."

/offtopic
 