WiiGeePeeYou (Hollywood) what IS it ?

Megadrive1988 said:
I played the N64 OoT emulated on PC--there was an option to boost the framerate, which really worked -- it seemed to be running at a good 30fps. I was impressed. I forget the name of that N64 emulator (dang!)... Years later I used the Project64 emulator; the version I have runs the game at basically the original ~20fps, and I don't see any options to boost the framerate.

It probably was UltraHLE; it was the first high-level emulator around, and it supported Glide at that time.
Linky Wiki
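
For anyone wondering what "high level emulator" means in this context: instead of simulating the N64 hardware instruction by instruction, an HLE emulator spots calls into known OS/microcode routines and substitutes fast native code (UltraHLE's Glide output worked along these lines). A rough Python sketch of the idea; every name, address, and emulator attribute in it is hypothetical, purely for illustration:

```python
# Sketch of the high-level-emulation (HLE) idea: when the emulated program
# counter reaches the entry point of a recognized library routine, run a fast
# native replacement instead of interpreting the guest code.
# All addresses, names, and emulator attributes here are hypothetical.

def native_gfx_task(emu):
    # Translate the guest's display list straight into host graphics calls.
    ...

def native_audio_task(emu):
    ...

HLE_HOOKS = {
    0x80012340: native_gfx_task,    # pretend entry point of the graphics microcode
    0x80015670: native_audio_task,  # pretend entry point of the audio microcode
}

def step(emu):
    hook = HLE_HOOKS.get(emu.pc)
    if hook is not None:
        hook(emu)        # do the whole routine's work at native speed
        emu.pc = emu.ra  # return as if the guest routine had executed
    else:
        emu.interpret_one_instruction()  # fall back to low-level emulation
```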
 
fearsomepirate said:
You're joking, right? It's 4 times the pixels. It makes a huge difference in the sharpness and clarity of the picture. Just put in your N64 and your Cube and switch back and forth between the two pictures, and you'll see what I mean. Of course, it doesn't do anything for the texture resolution. And I really don't know why this matters--it was a freebie pack-in bonus so you could play an N64 classic on your Gamecube. They weren't selling it as a $49.99 stand-alone game. Obviously, if they wanted to port it and enhance the graphics, they could have, just like they did with Mario 64 on the DS.

Huh? No I wasn't. Someone earlier said that they didn't even up the resolution on OoT on the Gamecube, and I pointed out that he was wrong. Seriously, who are you talking to? You shouldn't just make up random things and attribute them to people.

It may be 4x the pixels, but I didn't see a huge difference in clarity. IMO, a PC emulating the game at 640x480 provided a much bigger jump.

Oh, and who said that they didn't even up the resolution of OoT on Gamecube? Was it even said?
My statement was
Based on how little nintendo improved the graphics of Ocarina of Time when emulated on the Cube, I doubt they'll do any enhancements. Forcing a higher resolution or AA would have been nice though.

I didn't say they didn't enhance Ocarina of Time, just that very little was done to it. The 640x480 was likely a side effect of the Gamecube's default resolution being 640x480. The comment about forcing a higher resolution or AA was in reference to Gamecube backwards compatibility. You're right that it wasn't you who made the comment about 720p backwards-compatible games, though. Sorry, forgive me? :)

Megadrive1988 said:
I played the N64 OoT emulated on PC--there was an option to boost the framerate, which really worked -- it seemed to be running at a good 30fps. I was impressed. I forget the name of that N64 emulator (dang!)... Years later I used the Project64 emulator; the version I have runs the game at basically the original ~20fps, and I don't see any options to boost the framerate.

Majora's Mask actually runs at a lower framerate. Anyhow, Zelda 64 seemed to have everything keyed to 20fps (or maybe 15fps, or something around there), not 30fps, so if you run it at a higher speed in an emulator then either the animation and sound speed up, or they look and sound exactly like they do in the N64 version. I recall seeing both in an emulator, depending on the options selected, but it appears that the jerky look the game had was innate to it, since it exists even at 60fps unless the actual game is sped up, which then generally breaks the sound.
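
Put differently, the game advances its world by a fixed step per rendered frame instead of by real elapsed time, so an emulator can only choose between "more frames = faster game" and "correct speed but the same ~20 poses per second". A tiny sketch of the two behaviours, with all numbers invented for illustration (nothing here is from Zelda 64 itself):

```python
# Why a game "keyed" to 20fps can't simply be made smoother by rendering more
# frames: it either speeds up, or it stays visually 20fps. Illustrative only.

GAME_TICK = 1.0 / 20.0   # the game logic assumes one fixed 1/20 s step per frame

def game_time_after_one_second(render_hz, use_real_frame_time):
    t = 0.0
    for _ in range(render_hz):        # render one second's worth of frames
        if use_real_frame_time:
            t += 1.0 / render_hz      # advance by real elapsed time
        else:
            t += GAME_TICK            # advance by the game's fixed step
    return t

print(game_time_after_one_second(60, use_real_frame_time=False))  # 3.0 -> game,
# animation and sound all run 3x too fast
print(game_time_after_one_second(60, use_real_frame_time=True))   # ~1.0 -> correct
# speed, but only ~20 distinct animation poses exist per second, so motion still
# looks as jerky as on the real console
```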
 
There was a leak of sorts on IGN: Broadway is supposed to be a 750CL. Max speeds are noted as being 900MHz. So the question is, would it produce too much heat to be in the Wii console?
 
It's possible, or even likely, that Broadway is based on or derived from the 750CL, but probably not just a stock 750CL.

Edit: it seems there is doubt that an IBM PowerPC 750CL even exists.

Back on topic: waiting for some real info to leak on Hollywood. Dang, Nintendo and ATI are really keeping a fairly tight lid on it -- unlike with Gamecube. We knew almost all the details on Flipper well over a year before Gamecube came out.
 
I don't think there's any such thing as a 750CL; they must have meant 750GL. By the way, did anyone here actually see the article before it disappeared? If so, what exactly did it say?
 
It actually said 750CL, and it wasn't a typo, as someone who is actually reliable confirmed it. My guess is the 750CL is a lot like the VX that never showed up, for whatever reason. The same source also said the cache sizes were the same. Here is the picture of the original version of the article.
 
I thought you said in the other thread that you thought it was actually meant to be 750GL and your source also backed that up? What has changed now to make you think IGN were correct when they said 750CL?

By the way, thanks for the pic :)
 
I inferred the GL part because I thought it was a typo, but the post does say CL (see the enclosed picture). So I checked with the source and he said it was CL, despite the fact that it doesn't seem to actually exist. He told me more about the CPU than I've shared.
 
Well, according to the IGN article, the 750CL is a continuation of the 750GX (the GL is also a continuation of the GX, by the way). Does anyone know in what way the GX is better than the CXe (which Gekko is derived from)? Other than the far bigger and faster L2 cache, that is.
 
PowerPC 750GX
The 750GX (codenamed Gobi), revealed in 2004, is the latest and most powerful G3 processor from IBM. It has an on-die L2 cache of 1 MB, a top frequency of 1.1 GHz, and support for bus speeds up to 200 MHz, among other enhancements compared to the 750FX. It is manufactured using a 0.13 μm copper-based fabrication process with low-k dielectric and silicon-on-insulator technology. The 750GX has 44 million transistors, a die size of 52 mm², and consumes less than 9 W at 1 GHz under typical loads.

A low-power version of the 750GX is available, called the 750GL.

http://en.wikipedia.org/wiki/PowerPC_G3#PowerPC_750GX
 
Don't forget dual PLLs. Apparently these appeared in the 750FX in order to be able to run the chip at two different speeds. I think this is what will allow for Wii to be always on, as the 750GL documentation indicates that the chip naps and sleeps at 500 MHz instead of full speed for reduced power consumption.
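
As a rough sanity check on why clocking down saves power: first-order CMOS dynamic power scales linearly with frequency and with the square of the supply voltage, so a 500 MHz nap state already roughly halves dynamic power versus ~1 GHz, and saves more if the voltage is dropped too. The voltages below are pure guesses; only the ratios matter:

```python
# First-order CMOS dynamic power: P ~ alpha * C * V^2 * f.
# Only ratios are computed, so alpha and C cancel out. Voltages are guesses.

def relative_dynamic_power(f, v, f_ref=1000e6, v_ref=1.2):
    return (f / f_ref) * (v / v_ref) ** 2

print(relative_dynamic_power(1000e6, 1.2))  # 1.00  full speed (reference point)
print(relative_dynamic_power(500e6, 1.2))   # 0.50  nap at 500 MHz, same voltage
print(relative_dynamic_power(500e6, 1.0))   # ~0.35 nap at 500 MHz with a voltage drop
```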

I remember when speculation was flying that Broadway could be a 970FX (with 2-5 cores), and I asked, "What if it is based on Gekko?" So I looked around and found the 750FX and 750GX (which until late last year was a mythical chip that never surfaced for the Mac). I think we can pretty authoritatively say at this point that Broadway is some sort of PowerPC 750GX/GL/CL variant. Now the question is, what is Hollywood? Oh wait, that's the subject of this thread.
 
Why was that G3 even produced? I'm guessing IBM never adopted Motorola's G4?

That G3 seems very inefficient transistor-wise. For that amount of transistors, there have been CPUs with 50% more performance per clock than the G3, clocked higher as well. I'm guessing that G3 is mostly cache, though, which I think is denser than logic, but I'd guess a 10-15% increase at most in per-clock performance over the classic G3, and I'm assuming they did something more than add cache.*

*Oh wait, the old G3s would have had an external cache, so integrating it alone could net maybe a 20% performance increase in many applications (and nothing in some), but I think Gekko already had an internal cache, so the performance increase per clock over Gekko is likely to be much smaller.

Anyhow, looking at some performance results, going from a 256KB L2 cache G3 to a 1MB L2 cache G3 yielded about a 15% performance increase, though the benchmark was presented by IBM and is likely to be a best-case scenario.
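
For what it's worth, a ~15% gain from quadrupling the L2 is in the right ballpark under a very simple CPI model; the miss rates and penalty below are assumptions chosen only to show the shape of the arithmetic, not measured 750 numbers:

```python
# Toy model: CPI = base_CPI + (L2 misses per instruction) * (miss penalty).
# All inputs are illustrative assumptions, not measured PowerPC 750 figures.

def ipc(base_cpi, l2_misses_per_1k_instr, miss_penalty_cycles):
    cpi = base_cpi + (l2_misses_per_1k_instr / 1000.0) * miss_penalty_cycles
    return 1.0 / cpi

small_l2 = ipc(1.0, 4.0, 100)   # assumed 256 KB L2: 4 misses per 1000 instructions
big_l2   = ipc(1.0, 2.0, 100)   # assumed 1 MB L2: miss rate roughly halved

print(f"per-clock gain: {big_l2 / small_l2 - 1:.1%}")   # ~16.7% with these inputs
```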
 
The graphics processor (code-named "Hollywood") will use an ATI chipset running at 243MHz with 3 MB of texture memory. It might also have 32 shader pipelines -- 16 fewer than the Xbox 360. However, the Nintendo GPU is rumored to run at 500 million triangles per second (100 million sustained) -- roughly equivalent to the Xbox 360. It will also be able to handle 50 billion shader operations per second, which is about the same as the 360 as well.

Found this on howstuffworks.

Now don't get me wrong, I take these specs with a pillar of salt.

I just have a question: are those numbers even possible with the right amount of R&D? Specifically, could costs be cut by trying different routes with a simpler architecture?
 
It has a much lower clock speed and fewer "pipelines", yet somehow matches performance? Sounds like a ******'s wet dream, and matching theoretical specs wouldn't make sense for an extension of the Cube's hardware. Cube's hardware had lower specs, but higher utilization of them; if Hollywood does match the Xbox 360 in specs, then I'd imagine it's doing less complex operations.
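
One way to see the tension is to divide the rumored shader throughput by the rumored pipeline count and clock; the figures below just restate the howstuffworks numbers quoted above, nothing here is a confirmed Hollywood spec:

```python
# Plausibility check on the rumored figures, nothing more.
clock_hz   = 243e6      # rumored Hollywood clock
pipelines  = 32         # rumored "shader pipelines"
shader_ops = 50e9       # rumored shader ops per second

print(shader_ops / (pipelines * clock_hz))   # ~6.4 ops per pipeline per clock

# For comparison, the 360 figure the rumor claims to match (~48 billion ops/s)
# works out to about 2 ops per ALU per clock at 48 ALUs x 500 MHz:
print(48e9 / (48 * 500e6))                   # 2.0
```

So the rumor would need each Hollywood pipe to do roughly three times the per-clock work of a 360 ALU while running at half the clock, which is exactly the kind of mismatch being pointed out.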
 
HowStuffWorks' source is the bogus rumor that han_solo was spreading something like a year ago. They've simply revised the clockspeed to fit the latest rumor. At the time that article came up, it said 550 MHz, just like the early wet-dream rumor. Seriously, people, check your sources. HowStuffWorks is not going to have an inside track that IGN doesn't have.
 
PGC said:
PGC: What do the programmers think of the Wii's graphic hardware? Is it as easy to program for as Nintendo claims?

Ness: It’s definitely been very easy for us to work with so far. We got everything up and running fast, within a couple of days of getting dev kits. And we haven’t hit any major technical snags either. As far as graphic hardware, we think it’s definitely suitable for the game we’re making. The ability to use bump maps to textures and anti-aliasing effects has made a big difference and the extra memory has allowed us to add more things to the screen - pedestrians, animations, destructibles, etc. – all the stuff that makes the game feel more alive.

http://www.planetgamecube.com/specialArt.cfm?artid=11911

Not much else in the interview...
 
Not sure what you mean there...?

I'm gonna guess he's referring to the fact that the Gamecube had all kinds of hardware features (such as emboss bump-mapping) that went virtually unused because developers simply ported from the PS2.

If you were to take a survey of cross-platform titles (which made up the bulk of current-gen games on Cube and Xbox), you wouldn't know that 2 of the 3 machines of the console generation had hardware plenty capable of bump-mapping, water shaders, specular highlights, self-shadowing, environment mapping, high bit-depth textures, and more than 2 or 3 hardware lights.
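
For anyone curious, the "emboss bump-mapping" mentioned above is a very cheap trick: sample the height map twice, offset along the light direction, and use the difference to brighten or darken the lit base texture. A minimal numpy sketch of the general technique (not Gamecube/Flipper code, just the idea):

```python
import numpy as np

def emboss_term(height_map, light_offset_uv, strength=1.0):
    """Classic emboss bump mapping: subtract a copy of the height map shifted
    along the (tangent-space) light direction; the result is a per-texel
    brightness offset to add onto the diffusely lit base texture."""
    du, dv = light_offset_uv                          # integer texel offsets, e.g. (1, 0)
    shifted = np.roll(height_map, shift=(dv, du), axis=(0, 1))
    return strength * (height_map - shifted)

# Tiny usage example with a made-up 4x4 height map and a light roughly along +u.
h = np.array([[0.0, 0.2, 0.4, 0.2],
              [0.2, 0.6, 0.8, 0.4],
              [0.4, 0.8, 1.0, 0.6],
              [0.2, 0.4, 0.6, 0.2]])
print(emboss_term(h, light_offset_uv=(1, 0)))
```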
 