WiiGeePeeYou (Hollywood) what IS it ?

Man, hoping this stuff is all true. Has anything cropped up to indicate that this info is bogus?

I like the stated amount of eDRAM / 1T-SRAM in that interview. More than double that of Flipper.
 
Urian said:
Why do you need the A-RAM for compatibility purposes?

You don't really; you can just patch the calls to A-RAM so that addresses which would point to A-RAM in GCN software point to main RAM instead, something they can easily work out IMHO.
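As a rough sketch of that idea (only the 16 MB A-RAM size is a real GCN figure; the base address and function names below are hypothetical, purely for illustration):

```python
# Hypothetical sketch of the remapping idea: a GCN title's A-RAM accesses get
# redirected into a reserved slice of main RAM. Only the 16 MB A-RAM size is a
# real GCN figure; the base address and names are made up for illustration.

ARAM_SIZE = 16 * 1024 * 1024            # GCN A-RAM capacity
MAIN_RAM_RESERVED_BASE = 0x0            # hypothetical start of the reserved slice

def remap_aram_offset(aram_offset: int) -> int:
    """Translate a GCN A-RAM offset into an offset in the reserved main-RAM slice."""
    if not 0 <= aram_offset < ARAM_SIZE:
        raise ValueError("offset outside emulated A-RAM")
    return MAIN_RAM_RESERVED_BASE + aram_offset

# A patched A-RAM DMA call would then read/write main RAM at
# remap_aram_offset(offset) instead of touching real A-RAM.
```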
 
Sup all. I e-mailed Jessica asking to listen to the interview. I'll keep y'all posted. I'm posting zilch on GAF, IGN, GSpot, or G4. Pearls before swine and all that.
 
The performance numbers for the CPU are very interesting. An Athlon XP 2400+ is 2 GHz, which means it has better single-threaded integer performance than any of the next-gen consoles. Remember, it's OoOE, so it has something like 2x the IPC of a PPE.

So either it's a PowerPC 970 derivative, a 4 GHz PPE, or a dual 2 GHz PPE.

The first is impossible given how hot that chip is, and 4 GHz may be beyond what the PPE can deliver. That leaves a dual-core 2 GHz PPE as the most likely candidate.

PS: Anyone getting the sense that the guy just listed the specs of a laptop? Makes more sense than a real leak.
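A back-of-the-envelope version of that reasoning (the 2x IPC figure is the poster's assumption, not a measured number):

```python
# Back-of-the-envelope version of the argument above. The 2x per-clock advantage
# of an out-of-order Athlon XP over an in-order PPE is an assumption, not data.
athlon_clock_ghz = 2.0      # Athlon XP 2400+ actual clock
ipc_ratio = 2.0             # assumed OoO-vs-PPE IPC advantage

required_ppe_clock_ghz = athlon_clock_ghz * ipc_ratio
print(required_ppe_clock_ghz)   # 4.0 -> hence "a 4 GHz PPE, or a dual 2 GHz PPE"
```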
 
I didn't say you couldn't do it, but the question was about fitting inside 8MB, and clearly, it won't be done without tiling.
 
I hope the new information is true; it only makes me more excited to get my hands on a Wii later this year. If the Wii can really do 8x AF and 4x AA in most games, it would be one pleasant surprise to me.
 
Alstrong said:
640 x 480 x (32bpp + 24-bit Z) / 8 x 4 / 1024 / 1024 = 8.2 MB
You also forgot that it will do 480p widescreen, so 854x480. (It comes to 11 MB.)

The framebuffer seems rather small, though, at 2 MB. You'd be trying to compress 6.25 MB of frame into that (which I'm not sure is possible). And do you really need a Z-value for each FSAA pixel? I mean, you're going to end up with 24672, 24672, 24672, 24672 saved in memory for one pixel. Without duplicating all of those values, a 32-bit 480p Z-buffer is only 1.56 MB.
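For reference, that arithmetic spelled out as a sketch (assuming 32bpp color, 24-bit Z and 4 samples per pixel as above):

```python
def buffer_mb(width, height, color_bits=32, z_bits=24, samples=4):
    """Uncompressed color + Z storage for a multisampled frame, in MB."""
    bits = width * height * (color_bits + z_bits) * samples
    return bits / 8 / 1024 / 1024

print(buffer_mb(640, 480))                # ~8.2 MB  (the 4:3 figure above)
print(buffer_mb(854, 480))                # ~10.9 MB (the "comes to 11 MB" widescreen case)
print(854 * 480 * 32 * 4 / 8 / 1024**2)   # ~6.25 MB of color samples alone
print(854 * 480 * 32 / 8 / 1024**2)       # ~1.56 MB for a single 32-bit 480p Z plane
```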
 
The "typical" case would be 640x480, and the point is that the lowest common denominator indicates that the frame will not fit within a single tile, even with the lower precision Z-Buffer.

Resolution alone does not dictate "widescreen" view angle. The image is usually stretched.
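A quick sanity check of the tiling point, taking the interview's 2 MB framebuffer figure as the per-tile budget (an assumption, not a confirmed spec):

```python
import math

def tiles_needed(width, height, color_bits, z_bits, samples, tile_mb=2):
    """How many tiles an uncompressed multisampled frame would need at a given tile budget."""
    bytes_needed = width * height * (color_bits + z_bits) * samples / 8
    return math.ceil(bytes_needed / (tile_mb * 1024 * 1024))

print(tiles_needed(640, 480, 32, 24, 4))   # 5 tiles in the "typical" 4:3 case
print(tiles_needed(640, 480, 32, 16, 4))   # 4 tiles even with a lower-precision 16-bit Z
```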
 
Alstrong said:
The "typical" case would be 640x480, and the point is that the lowest common denominator indicates that the frame will not fit within a single tile, even with the lower precision Z-Buffer.

Resolution alone does not dictate "widescreen" view angle. The image is usually stretched.
Well then no problem there. Just render at 320x240 and stretch the heck out of it. Plenty of room for 4x FSAA then.
 
OtakingGX said:
You also forgot that it will do 480p widescreen, so 854x480. (It comes to 11 MB.)

The framebuffer seems rather small, though, at 2 MB. You'd be trying to compress 6.25 MB of frame into that (which I'm not sure is possible). And do you really need a Z-value for each FSAA pixel? I mean, you're going to end up with 24672, 24672, 24672, 24672 saved in memory for one pixel. Without duplicating all of those values, a 32-bit 480p Z-buffer is only 1.56 MB.
Yes, you need a Z value for each FSAA sample. For a screen-aligned triangle like you describe, Z compression will save bandwidth, but since you can't be assured that compression is always possible, you need to allocate memory for the worst case. Lossy compression is the only way to guarantee a smaller memory footprint. Parhelia is the only GPU I know of that I'd consider to have a form of lossy compression, and that led to artifacts in certain situations.
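Putting that allocation point in numbers, as a sketch (the 4:1 "typical" ratio is purely illustrative):

```python
# Why lossless Z compression saves bandwidth but not footprint: the buffer must
# still be allocated for the incompressible worst case.
def z_alloc_mb(width, height, z_bits, samples):
    return width * height * z_bits * samples / 8 / 1024**2

worst_case = z_alloc_mb(854, 480, 32, 4)   # ~6.25 MB has to be reserved regardless
typical = worst_case / 4                   # e.g. 4:1 on flat geometry (illustrative only)
print(worst_case, typical)                 # bandwidth benefits from the typical case; allocation cannot
```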
 
OtakingGX said:
Well then no problem there. Just render at 320x240 and stretch the heck out of it. Plenty of room for 4x FSAA then.

OtakingGX, that's been the normal way to do widescreen, no need to get grumpy.
 
I'm really surprised that nobody has called this article fake yet. Not that I'm saying it is, but usually there are always some skeptics. For some reason it feels slightly unnatural that everyone is accepting this interview as real.

Being the ignorant, good-for-nothing bastard that I am:

I think it's fake, based on nothing more than gut feeling and a little common sense.

An identified third-party developer heading an important project, just out of the blue and in an informal interview, reveals what Nintendo has been hiding the most? Their biggest secret?
I don't think so.

Btw, I'm not questioning Fearsome's credibility. In fact, I really like his posts.

Finally, the games don't reflect this hardware.
 
nonamer said:
So either it's a PowerPC 970 derivative, a 4 GHz PPE, or a dual 2 GHz PPE.

The first is impossible given how hot that chip is, and 4 GHz may be beyond what the PPE can deliver. That leaves a dual-core 2 GHz PPE as the most likely candidate.

Why not an extension of Gekko (e.g. Intel did a great job going from the PIII to the Core Duo)? Or any other core that hadn't seen the light of day until now (like the "PPE")?

Refreshment said:
An identified third-party developer heading an important project, just out of the blue and in an informal interview, reveals what Nintendo has been hiding the most? Their biggest secret?
I don't think so.

Actually, the Konami guy has also talked about physics HW; this is just the first time someone has asked a dev directly about specs (at least that we have read; in every other interview the reporter either assumed the specs were low (IGN) or didn't ask, and anyway there are very few Wii interviews). There are also comments like Mark Rein saying that UE3 just wouldn't run in HD (which fits if these are the specs), Perry Kaplan, the next-gen-games-on-Wii leak from Ubi, ERP... Plus we don't know if this is the big secret (if there is any left).

Finally, the games don't reflect this hardware.

Did last year's E3 playable XB360 games reflect the 360 HW? IMO most of them showed the same kind of improvement that Mario/MP/... showed over their GC versions.

I am not saying that this is true, just that there are many things that support it and fit right here: it can give you an upgraded experience, yet it's still cheap and cool to produce.
 
OtakingGX said:
You also forgot that it will do 480p widescreen, so 854x480. (It comes to 11 MB.)

The framebuffer seems rather small, though, at 2 MB. You'd be trying to compress 6.25 MB of frame into that (which I'm not sure is possible). And do you really need a Z-value for each FSAA pixel? I mean, you're going to end up with 24672, 24672, 24672, 24672 saved in memory for one pixel. Without duplicating all of those values, a 32-bit 480p Z-buffer is only 1.56 MB.

What if the anti-aliasing is only edge AA? With 8X anisotropic filtering for textures.

Also, to be honest, I doubt he literally means that the Z/framebuffer memory is in two separate 2 MB pools. It's not very efficient to do it that way AFAICS, since the Z-buffer is unlikely to ever need to be as big as the framebuffer. With Flipper the embedded memory was split into two pools, 1 MB for texture cache and 2 MB for buffers, so I'd guess that if this interview is real, Hollywood's embedded memory will also be split into two pools of 4 MB.
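Under that guess (two 4 MB pools, extrapolated from Flipper's 1 MB + 2 MB split and not a confirmed spec), the buffer budget would look like this sketch:

```python
# What a hypothetical 4 MB buffer pool would hold. The 4 MB figure is the guess
# from the post above, not a confirmed Hollywood spec.
POOL_BYTES = 4 * 1024 * 1024

def frame_bytes(width, height, color_bits, z_bits, samples=1):
    return width * height * (color_bits + z_bits) * samples // 8

print(frame_bytes(640, 480, 32, 24) <= POOL_BYTES)     # True: plain 480p (~2.05 MB) fits
print(frame_bytes(640, 480, 32, 24, 4) <= POOL_BYTES)  # False: 4x AA (~8.2 MB) would still need tiling
```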
 
I thought his answers were pretty handwavy. From what I gathered, the Red Steel floor demo was the Gamecube version. The trailer looked fairly consistent with this (very vaguely defined) hardware.

Factor 5 got Rogue Leader running on Gamecube in 480i at fairly consistent 60fps with 3-subpixel antialiasing. And that was a system with 1 MB of framebuffer and 2.6 GB/s main bandwidth--so go figure. 4x AA at 480p doesn't seem unbelievable to me if these specs are real.

We'll have proof one way or the other soon enough.
 
Alstrong said:
640 x 480 x (32bpp + 24-bit Z) / 8 x 4 / 1024 / 1024 = 8.2 MB

That's assuming it'll be doing 32bpp. Oh, is that calculation just for a single buffer?

An Athlon XP 2400+ is 2 GHz, which means it has better single-threaded integer performance than any of the next-gen consoles. Remember, it's OoOE, so it has something like 2x the IPC of a PPE.

Yeah, but an Athlon XP 3200+ is 2.2 GHz with nearly double the bus bandwidth and half the latency. For some things, better memory performance (mainly latency) will scale performance nearly linearly (games actually seem to be one of those), while for other things it won't.
Could be upwards of 3x the per-clock performance of a PPE...

Factor 5 got Rogue Leader running on Gamecube in 480i at fairly consistent 60fps with 3-subpixel antialiasing. And that was a system with 1 MB of framebuffer and 2.6 GB/s main bandwidth--so go figure. 4x AA at 480p doesn't seem unbelievable to me if these specs are real.

Did the AA shut off at 480p, in which case it was more of a flicker filter? BTW, was it ever revealed what type of AA algorithm the GC supported? I remember sub-pixel antialiasing mentioned as a spec, but unless they were using subsampling (which the flicker filters generally looked like) then I don't think that would refer specifically to any form of AA.
 
Is it possible Ubisoft Paris has final hardware already, considering their particularly close relationship with Nintendo at this time?

Why can't he comment on memory? It seems to suggest Nintendo hasn't come to a final number for main memory, which could also affect the console's price.
 