WiiGeePeeYou (Hollywood) what IS it ?

Again, I think this is down to a misinterpretation of leaked/rumored information. Both memory pools are sure to be external, as is the case with the current Gamecube.

Could be, but there is a saving to be made by embedding that 24 MB, leaving only one external pool. If not, they should really design it with a single larger memory pool.
 
Hi all, I lost my old account; I follow this forum mainly as a reader because the people here are serious. I have read this about the chip code-named "Hollywood":

The graphics processor (code-named "Hollywood") will use an ATI chipset running at 243 MHz with 3 MB of texture memory. It might also have 32 shader pipelines -- 16 fewer than the Xbox 360. However, the Nintendo GPU is rumored to run at 500 million triangles per second (100 million sustained) -- roughly equivalent to the Xbox 360. It will also be able to handle 50 billion shader operations per second, which is about the same as the 360 as well.

source: http://electronics.howstuffworks.com/nintendo-revolution1.htm

What do you think about this? Thank you for replying. Bye.


Does nobody know whether this is fake?
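One way to gauge it is to check whether the quoted numbers are even internally consistent. Here's a quick back-of-the-envelope in C; every figure is taken from the rumor itself, not from any confirmed spec:

```c
#include <stdio.h>

int main(void)
{
    /* All figures below come from the quoted rumor, not confirmed specs. */
    double clock_hz    = 243e6;  /* rumored GPU clock: 243 MHz           */
    double pipelines   = 32.0;   /* rumored shader pipeline count        */
    double claimed_ops = 50e9;   /* claimed shader operations per second */

    /* Baseline: one op per pipeline per clock. */
    double ops_per_sec = clock_hz * pipelines;
    printf("at 1 op/pipe/clock: %.2f Gops/s\n", ops_per_sec / 1e9);

    /* Ops per pipeline per clock the 50 Gops/s claim would require. */
    printf("ops/pipe/clock needed for the claim: %.1f\n",
           claimed_ops / ops_per_sec);
    return 0;
}
```

At one op per pipeline per clock, the rumored configuration yields under 8 billion ops/s, so the 50 billion figure would require more than six ops per pipeline per clock. The article's numbers don't hang together particularly well.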
 
Could be, but there is a saving to be made by embedding that 24 MB, leaving only one external pool. If not, they should really design it with a single larger memory pool.

That's not really a saving. A likely 64-bit GDDR bus to some surface-mounted DRAMs is not a black art. Bloating your GPU by another 200M+ transistors, and sacrificing yields in the process, is something that simply cannot be afforded in an economical $250 console.

This is all pretty straightforward. The GameCube had two external memory pools. For the sake of compatibility, Wii retains these, including the existing 24 MB 1T-SRAM pool. The 8-bit/16 MB DRAM interface is swapped for something a bit beefier. The CPU and GPU receive modest clock speed increases and perhaps a few other upgrades (512 KB of L2 on the CPU, potentially some changes on the GPU).

Anything else is really stretching it, in my opinion.
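As a rough sketch of where a "200M+ transistors" figure comes from: 1T-SRAM stores roughly one bit per transistor, so just the storage cells for 24 MB already land in that neighbourhood (sense amps, refresh logic and other overhead ignored):

```c
#include <stdio.h>

int main(void)
{
    /* 1T-SRAM stores one bit per transistor; sense amps, refresh
       logic and other overhead are ignored here. */
    long long bytes = 24LL * 1024 * 1024;  /* the 24 MB pool        */
    long long bits  = bytes * 8;           /* one transistor per bit */

    printf("minimum transistors: %lld\n", bits);  /* ~201 million */
    return 0;
}
```

So embedding the 24 MB pool means roughly 200 million extra transistors before any overhead, which is exactly the die-size and yield hit described above.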
 
darkblu said:
very interesting.. and it has always been a possibility. do you happen to know the bandwidth or the clock of that pool?
Don't remember the exact number, but IIRC it was marginally higher than the SRAM's (kinda like XDR and GDDR in PS3).

V3 said:
If not, they should really design it with a single larger memory pool.
I have no idea if the 24 MB is embedded, but a unified pool would probably be cost-prohibitive. And making all the RAM some cheaper variant might compromise backward compatibility.
 
I wonder if another reason for the two dissimilar memory banks is so that one (the presumably higher-power GDDR3 bank) can be powered down when the device is in the low-power 'off' state, leaving the low-power 1T-SRAM bank running to support the WiiConnect24 background operations?
 
Could be, but there is a saving to be made by embedding that 24 MB, leaving only one external pool. If not, they should really design it with a single larger memory pool.

given Faf's info is right, the 24MB pool has to be external - the proverbial MoSys press release speaks about embedded and external usage of 1T. so if it's not the 64MB pool (that one not being 1T-SRAM), then it has to be the 24MB pool which is the external one.
 
AFAIK the other 64MB is GDDR (and with a bit more bandwidth than the 24MB pool, to boot).

Isn't the GDDR latency higher than the A-RAM's on the GC? If so, considering that A-RAM is used not just for audio but also for code, how can they have BC with all GC games?

Also, this means they did some good rework on Flipper/Hollywood so it can handle much higher latency than the GC's 1T-SRAM. I wonder what else they reworked for Wii.

BTW

FC:V pics+Interview

http://wii.ign.com/articles/733/733951p1.html

Is it just me, or are they worse than the XB version? (I blame a bad/rushed port, IMO.)

IGN Wii: The Wii has been described as more powerful than the original Xbox, but not as powerful as Xbox 360. What type of visuals can we expect from the Wii version of Far Cry?

Fabrice Cuny: Overall the Wii is more powerful than an Xbox, even if the Xbox can do some stuff that the Wii can't. But remember the Wii is more focused on its unique gameplay using the Wii controller and not on power. We can expect a game as beautiful as what we used to see on Xbox.


IGN Wii: Will the game run at 60 frames per second?


Fabrice Cuny: Like the previous titles, Far Cry Vengeance will be running at 30 frames per second to ensure stability.
 
Isn't the GDDR latency higher than the A-RAM's on the GC? If so, considering that A-RAM is used not just for audio but also for code, how can they have BC with all GC games?

Since the A-RAM had an atrocious bandwidth of 81 MB/s, I highly doubt it's even anywhere near the latency of GDDR. The GDDR should have much lower latency.
 
Since the A-RAM had an atrocious bandwidth of 81 MB/s, I highly doubt it's even anywhere near the latency of GDDR. The GDDR should have much lower latency.

Thanks.

Does anyone know what they mean by "Overall the Wii is more powerful than an Xbox, even if the Xbox can do some stuff that the Wii can't."? I guess it can do more polys/lights per second, and other things also possible on the GC (e.g. self-shadowing and the techniques presented in the Gamasutra article), faster than the XB would, but not some of the advanced shading methods possible on the XB (e.g. vertex-shader-based effects, "fast" normal mapping...).
 
Isn't the GDDR latency higher than the A-RAM's on the GC? If so, considering that A-RAM is used not just for audio but also for code, how can they have BC with all GC games?

Also, this means they did some good rework on Flipper/Hollywood so it can handle much higher latency than the GC's 1T-SRAM. I wonder what else they reworked for Wii.

BTW

FC:V pics+Interview

http://wii.ign.com/articles/733/733951p1.html

Is it just me, or are they worse than the XB version? (I blame a bad/rushed port, IMO.)


The textures are low resolution, and there's no normal mapping to speak of, which I assume is what they are suggesting the Wii can't do. But I believe that if Crytek was able to pull off their version on the Cube, it's more than likely possible on Wii.

This is one of the problems I see with Nintendo's decision not to use something similar to PC tech. Unless you have a developer willing to put in the time to come up with the algorithm to perform normal mapping, you won't see it. This is when I wish Nintendo had picked up Factor 5.
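For reference, the per-pixel math behind basic normal mapping is just a dot product in tangent space. The plain-C sketch below shows that dot3 step; the helper names are illustrative, and this is the reference math a TEV/combiner setup would have to reproduce, not actual GX code:

```c
#include <stdio.h>

/* Reference for per-pixel dot3 bump lighting: decode a tangent-space
   normal from a normal-map texel and dot it against the tangent-space
   light direction. */

typedef struct { float x, y, z; } Vec3;

static float dot3(Vec3 a, Vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* Normal maps store components as 0..255; remap to -1..1. */
static Vec3 decode_normal(unsigned char r, unsigned char g, unsigned char b)
{
    Vec3 n = { r / 127.5f - 1.0f, g / 127.5f - 1.0f, b / 127.5f - 1.0f };
    return n;
}

int main(void)
{
    Vec3 n = decode_normal(128, 128, 255);  /* "flat" texel, points +Z */
    Vec3 l = { 0.0f, 0.0f, 1.0f };          /* light straight on       */

    float diffuse = dot3(n, l);
    if (diffuse < 0.0f) diffuse = 0.0f;     /* clamp back-facing light */

    printf("diffuse term: %.3f\n", diffuse); /* ~1.0 for this texel    */
    return 0;
}
```

On Flipper/Hollywood-class hardware that dot product has to be mapped onto TEV stages (or done with EMBM-style indirect texture lookups) rather than written as a pixel shader, which is the extra developer effort being described.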
 
Since the A-RAM had an atrocious bandwidth of 81 MB/s, I highly doubt it's even anywhere near the latency of GDDR. The GDDR should have much lower latency.

that bandwidth was a function of the ultra-low clock and ultra-narrow bus. we don't know anything about the RAM latency per se, although the bus clock already sets fairly lax latency requirements. for what it's worth, the GDDR may have 4 times higher latencies per bus clock, but a 4 times faster bus would keep the pool's latencies, as seen from outside, at A-RAM levels.
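To put rough numbers on that point: bandwidth is bus width times transfer rate, while wall-clock latency is cycles divided by clock. Assuming the 81 MB/s comes from a byte-wide bus at 81 MHz, and using made-up cycle counts purely for illustration (the GDDR figures here are placeholders, not known specs):

```c
#include <stdio.h>

int main(void)
{
    /* bandwidth = bus width (bytes) * transfer rate; assuming the
       81 MB/s comes from a byte-wide bus clocked at 81 MHz.       */
    double aram_bw = 1.0 * 81e6;
    printf("A-RAM bandwidth: %.0f MB/s\n", aram_bw / 1e6);

    /* wall-clock latency = cycles / clock; the cycle counts below
       are made up purely to illustrate the break-even point.      */
    double aram_latency_ns = 4.0  / 81e6  * 1e9;  /*  4 cycles @  81 MHz */
    double gddr_latency_ns = 16.0 / 324e6 * 1e9;  /* 16 cycles @ 324 MHz */
    printf("A-RAM latency: %.1f ns\n", aram_latency_ns);
    printf("GDDR latency:  %.1f ns\n", gddr_latency_ns);
    return 0;
}
```

Four times the cycle latency on a four times faster bus lands at exactly the same latency in nanoseconds, which is the break-even darkblu describes.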
 
Thanks.

Does anyone know what they mean by "Overall the Wii is more powerful than an Xbox, even if the Xbox can do some stuff that the Wii can't."? I guess it can do more polys/lights per second, and other things also possible on the GC (e.g. self-shadowing and the techniques presented in the Gamasutra article), faster than the XB would, but not some of the advanced shading methods possible on the XB (e.g. vertex-shader-based effects, "fast" normal mapping...).

What I don't get is the overall visual quality of SC:DA compared to FC:V: texture quality, what looks like normal mapping, shadows.

Also, why the lack of bump mapping? And what is going on with Midway pulling off some new shading effects in Rampage?
 
The textures are low resolution, and there's no normal mapping to speak of, which I assume is what they are suggesting the Wii can't do. But I believe that if Crytek was able to pull off their version on the Cube, it's more than likely possible on Wii.

This is one of the problems I see with Nintendo's decision not to use something similar to PC tech. Unless you have a developer willing to put in the time to come up with the algorithm to perform normal mapping, you won't see it. This is when I wish Nintendo had picked up Factor 5.

Yeah, it looks terrible. This is a disgrace. How can you not have normal mapping in a game like Far Cry? Nintendo has shot themselves in the foot with this hardware.
 
No-one believes that info.

Who's this No-one character? ;)

edit: Wow, those Far Cry shots really do look like ass. I don't remember my Xbox Far Cry Instincts looking like that... These shots look very blurry, with quite poor textures. The forest up the mountains has been culled down a ton too. What? Couldn't it handle the 2D trees!?

http://media.wii.ign.com/media/846/846381/img_3931987.html
I mean YIKES! There's some horrible dithering going on in the smoked area. OMG or is that DRAW IN?!? The gun looks like something out of Project IGI (heh!!). The ground texture is appealing to the judges for a worst-ever award. The tail of the Heli has some beautiful heavy aliasing going on. But! Wait! There is bloom! This game's running 480p and looks like that!?

Let me just say that I played Far Cry on a Radeon 9600 with 64 MB of RAM. A Mobility Radeon 9600. And it looked a hell of a lot nicer than this.
 
re those unfortunate recent FC:V shots:

what do you expect, people? Ubi have been totally bandwagon-jumping the wii with those 8 titles. originally 2 of those (RS and RRR) were actually designed as full-fledged titles for the platform, with the rest being just quick cash-ins. i would not be surprised if the level of development effort that went into the low-profile ones was somewhere in the vicinity of xbl arcade games (no offence to the live arcade intended).

i would not draw any conclusions from such low-profile titles, and even less so take them as hw estimates.
 