WiiGeePeeYou (Hollywood): what IS it?

V3 said:
This spec is old, I heard they added more memory.

I reckon they should achieve GC backward compatibility some other way. The way they are doing it right now really compromises the Wii performance that can be achieved with a given amount of silicon.

Are you sure? Because I really doubt that; things like a multicore/VMX CPU, or a GPU with more (or even an equal number of) pipelines, would certainly boost performance and keep BC.

We have two much more credible sites (for those who don't remember) telling us that it is 2.5-3x GC (with or without feature advancements?), so why would we believe this?

Anyway, going back to the new RS screenshot topic, I think things like this show that if it does look bad, it isn't because of the Wii HW.

RS
http://media.wii.ign.com/media/821/821973/img_3783668.html
http://media.wii.ign.com/media/821/821973/img_3784412.html

So we have this level of detail/complexity/scale with more and much better lighting, bump mapping and self-shadowing, with a lot of action at the same time (and other effects too, like EMBM, at least in other parts of the game).

Especially after looking at this or this.

Now I would guess that, with this kind of specs and the low scale of RS, it would be possible for someone to achieve the early RS visuals (although what I want is early RS at a large scale and with more physics).

Can't they just give dev kits to XB devs? At least then it wouldn't take too long until we get some nice overview like in the 360's case :LOL:.
 
Rockster said:
The Wii might be able to do a similar amount of work per pixel considering it needs to draw far fewer.
Not on a GC clocked 1.5x higher. You'd need new hardware. Case-size-wise, they can fit laptop-class components and get a lot more bang for the buck. Thus, to maintain BC, they could take the GC architecture and beef it up to laptop class (it doesn't need to be a GeForce 7800 or the like, either), with multiple graphics pipes for example, dropping down to just one pipe at a slower clock speed when running GC games. The idea of negligible architecture changes and moderate clock increases shows no technological progress at all in 5 years!
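A back-of-envelope sketch of that trade-off. Only Flipper's 4 pixel pipes at 162 MHz are the known baseline; the beefed-up pipe count and clock below are illustrative assumptions, not leaked figures:

```python
# Peak theoretical fillrate = pipes * clock, assuming one pixel per pipe per clock.

def fillrate_mpix(pipes: int, clock_mhz: float) -> float:
    """Peak fillrate in millions of pixels per second."""
    return pipes * clock_mhz

flipper = fillrate_mpix(4, 162)   # known GC config: 648 Mpix/s
beefed = fillrate_mpix(8, 243)    # hypothetical "laptop-class" config
bc_mode = fillrate_mpix(4, 162)   # BC mode: drop to GC's pipe count and clock

print(f"Flipper:      {flipper:.0f} Mpix/s")
print(f"Hypothetical: {beefed:.0f} Mpix/s ({beefed / flipper:.1f}x)")
print(f"BC mode:      {bc_mode:.0f} Mpix/s (matches GC timing)")
```

The appeal of the scheme is that pipe count and clock can be dropped at runtime to match GC behaviour exactly, which software emulation can't do as cheaply.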
 
Rockster said:
I know, maybe a robot came back from the future with new microprocessor technology that no one has seen yet. Wii powered by Skynet CPUs!

Yeah, I heard it's called a 90nm chip process, and more power-efficient memory, among other things. Do you realise that many laptops, with far less internal area than the Wii, have components far more advanced than the GC chips? Obviously not.
 
Teasy said:
Yeah, I heard it's called a 90nm chip process, and more power-efficient memory, among other things. Do you realise that many laptops, with far less internal area than the Wii, have components far more advanced than the GC chips? Obviously not.

I don't think so. Laptops are WAY bigger than the Wii. Wii is barely larger than your standard DVD drive. I mean just look at the thing: http://www.aeropause.com/archives/2006/05/game_consoles_a.php It's smaller than the PS2 slim (and with a much larger set of capabilities and computational power BTW). You'll be lucky to even get a heatsink on those chips. It's not like the GC is 10 years old. I think it's amazing what they have packed inside the Wii. And like I said before, they have to / should stay with the same basic architecture for backwards compatibility. Otherwise, the chips would have to be orders of magnitude faster to support software emulation (which means much bigger and hotter chips and likely <100% compatibility) or have both the new and old chips, which would eat up space and drive up manufacturing costs. What processors did you want / expect to see in there?
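To put rough numbers on the software-emulation point: a minimal sketch, assuming a 10-20x per-instruction overhead for pure software emulation (a generic ballpark, not a measured figure; only Gekko's 485 MHz clock is a known quantity):

```python
GEKKO_MHZ = 485  # known GameCube CPU clock

# Assumed overhead range for emulating a PowerPC core entirely in
# software, with no hardware assists; the 10-20x span is a rough guess.
for overhead in (10, 15, 20):
    print(f"{overhead}x overhead -> ~{GEKKO_MHZ * overhead / 1000:.1f} GHz host CPU needed")
```

Even the optimistic end lands far beyond anything that would fit in the Wii's thermal envelope, which is the "much bigger and hotter chips" problem above.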
 
Rockster said:
I don't think so. Laptops are WAY bigger than the Wii.
Laptops have to be bigger not to fit the components in, but to provide a usable keyboard, screen and large battery. In the depth of a laptop you have screen + keyboard + mobo and chips and heatsinks. The Wii is taller: tall enough to fit a laptop's components in the bottom with the DVD drive on top.
 
fearsomepirate said:
I wonder if most of Wii's hardware footprint is simply taken up by the disc drive and wireless components.

Do you mean in terms of cost, space, power...?
 
'Footprint' is the area the console takes up on whatever surface it stands on, length * width as it's a nice box shape. The disc drive is the defining element for the width I think, but the Wii is a fair bit longer IIRC. I don't know how much room the wireless components would take up, but I can't imagine they're that large. I think the Wii is as wide as it is just to fit the DVD drive, as thin as possible to look classy, and as long as it is to fit the rest of the components and heatsinks into that width.
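For concreteness, a tiny worked example using the approximate published Wii dimensions (~44 x 157 x 215 mm; treated as assumptions here):

```python
# Footprint = length * width of whichever face the console stands on.
WIDTH, HEIGHT, DEPTH = 44, 157, 215  # mm, approximate published figures

flat_cm2 = HEIGHT * DEPTH / 100     # lying flat on its largest face
upright_cm2 = WIDTH * DEPTH / 100   # standing on edge, ignoring the stand

print(f"Flat footprint:    {flat_cm2:.0f} cm^2")    # ~338 cm^2
print(f"Upright footprint: {upright_cm2:.0f} cm^2") # ~95 cm^2
```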
 
I'm basically wondering if there's anything under the disc drive. It's awfully thin, so I'm thinking maybe all the chips are at the back of the machine, meaning the processing and memory may actually be located on a very tiny piece of circuit board real estate.
 
Ooh-videogames said:
I'm still wondering why Nintendo released devkits without Hollywood present, if it was just an overclocked Flipper.

Weren't they supposed to have shipped the final dev kits by the end of June / beginning of July? I thought it was finished by now.
 
hupfinsgack said:
Weren't they supposed to have shipped the final dev kits by the end of June / beginning of July? I thought it was finished by now.

When IGN started revealing specs from their sources, early devkits didn't have Hollywood present. So what I'm saying is: why, if we are to believe these so-called leaked specs with no architectural changes? The chips were just moved to a 90nm process, with an increase in MHz.

Also, in a real interview with a Ubisoft dev, it is said that the Wii is more powerful than the Xbox. Can you get an increase in raw polygon numbers and real-world numbers just from increasing a GPU's clock speed?
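For what a pure clock bump buys in theory, a minimal sketch; Flipper's 162 MHz is the known GC clock, 243 MHz is the widely reported Wii figure, and the linear scaling ignores memory bandwidth and CPU limits:

```python
FLIPPER_MHZ = 162    # known GameCube GPU clock
HOLLYWOOD_MHZ = 243  # widely reported Wii GPU clock

scale = HOLLYWOOD_MHZ / FLIPPER_MHZ
print(f"Theoretical throughput scaling: {scale:.2f}x")  # 1.50x

# Applying it to a ballpark GC real-world polygon figure (6-12M polys/s
# was the commonly quoted range; 12 is picked purely for illustration).
gc_polys_m = 12
print(f"~{gc_polys_m * scale:.0f} M polys/s at the higher clock")
```

Fixed-function setup and fillrate do scale roughly with clock, but real-world numbers usually scale less, since memory bandwidth doesn't come along for free.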
 
Ooh-videogames said:
When IGN started revealing specs from their sources, early devkits didn't have Hollywood present. So what I'm saying is: why, if we are to believe these so-called leaked specs with no architectural changes? The chips were just moved to a 90nm process, with an increase in MHz.

Just a guess, but if Hollywood is an overclocked Flipper and the early devkits didn't include Hollywood, what's to say the early kits didn't just include a Flipper?

Wouldn't that make sense and clear up any confusion?

Also, in a real interview with a Ubisoft dev, it is said that the Wii is more powerful than the Xbox. Can you get an increase in raw polygon numbers and real-world numbers just from increasing a GPU's clock speed?

Did he say how much more powerful? 1%? 50%? Did he say exactly what he meant by more powerful?

I could easily see an overclocked GC being more powerful than an Xbox in certain processes under certain conditions. Actually, I could see a regular GC fitting those parameters.
 
RancidLunchmeat said:
Just a guess, but if Hollywood is an overclocked Flipper and the early devkits didn't include Hollywood, what's to say the early kits didn't just include a Flipper?

Wouldn't that make sense and clear up any confusion?

Uh, do you get what I'm saying? Why not include Hollywood in the early devkits if it's just an overclocked Flipper manufactured on the 90nm process? What is believed is that the leaked specs are a representation of the final hardware.



RancidLunchmeat said:
Did he say how much more powerful? 1%? 50%? Did he say exactly what he meant by more powerful?

I could easily see an overclocked GC being more powerful than an Xbox in certain processes under certain conditions. Actually, I could see a regular GC fitting those parameters.

Now you want to argue over semantics.
 
Shifty Geezer said:
'Footprint' is the area the console takes up on whatever surface it stands on, length * width as it's a nice box shape. The disc drive is the defining element for the width I think, but the Wii is a fair bit longer IIRC. I don't know how much room the wireless components would take up, but I can't imagine they're that large. I think the Wii is as wide as it is just to fit the DVD drive, as thin as possible to look classy, and as long as it is to fit the rest of the components and heatsinks into that width.


Thanks, I didn't know the term.

It is interesting if the case design does have such an impact (reducing the specs).

Ooh-videogames said:
Uh, do you get what I'm saying? Why not include Hollywood in the early devkits if it's just an overclocked Flipper manufactured on the 90nm process? What is believed is that the leaked specs are a representation of the final hardware.

I agree this is a good question. After all, if this is true, there are similar CPUs on the same process as fast as this, and the GPU was supposedly 200 MHz at the time (apparently bad yields led to the slower chips in the final console), so they would only need better cooling solutions to have a fully functional Wii.
 
Ooh-videogames said:
Uh, do you get what I'm saying? Why not include Hollywood in the early devkits if it's just an overclocked Flipper manufactured on the 90nm process? What is believed is that the leaked specs are a representation of the final hardware.

Clearly not, because what you are asking makes no sense.

Taking the position that Hollywood is 'just' an overclocked Flipper manufactured on the 90nm process, what is your question?

Why would N rush, or even care whether Hollywood was included in the early devkits? Including a Flipper, which is NOT Hollywood, would still allow the devs to get their kits, probably earlier and at a reduced cost.

Now you want to argue over semantics.

I suggest you look up the definition of semantics.
 
If Nintendo literally took the same GPU/CPU out of the GC and overclocked them, how much would that cost them? Literally nothing, right? So why say ATI is developing the GPU and IBM is developing the CPU, etc., when the work is already done? It isn't even an issue of redesigning the chip with more layers to attain an extreme overclock: 50% over the original speed with two die shrinks should be attainable with the existing GC chips in boxes on the shelf now.

OT: would this technique not be desirable for the PS4 or X720? Basically, just pack as many CPU/GPU cores as possible into the most advanced chip process at the time, overclock the thing, and call it a day. Not that this would be the best from a cutting-edge tech perspective, but talk about cheap R&D! Plus BC would be a no-brainer. :LOL:
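As a rough sanity check on the two-die-shrink claim; the 15-30% clock gain per node is a generic historical rule of thumb, not a Flipper-specific figure, and 180nm as Flipper's original process is the commonly cited node:

```python
FLIPPER_MHZ = 162  # known GC GPU clock on the original (180nm-class) process

# Assumed per-shrink clock gain; two shrinks ~ 180nm -> 130nm -> 90nm.
for gain in (0.15, 0.30):
    clock = FLIPPER_MHZ * (1 + gain) ** 2
    print(f"{gain:.0%} per shrink -> ~{clock:.0f} MHz ({clock / FLIPPER_MHZ:.2f}x)")
```

The reported 243 MHz (1.5x) lands inside that ~214-274 MHz window, so the claim is at least plausible on paper.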
 
TheChefO said:
If Nintendo literally took the same GPU/CPU out of the GC and overclocked them, how much would that cost them? Literally nothing, right? So why say ATI is developing the GPU and IBM is developing the CPU, etc., when the work is already done? It isn't even an issue of redesigning the chip with more layers to attain an extreme overclock: 50% over the original speed with two die shrinks should be attainable with the existing GC chips in boxes on the shelf now.

Well, although I am still sceptical of the "leaked" specs, I am playing devil's advocate here: they obviously had to implement a new memory controller, the bus structure is different, the DSP is probably revised, and there's more eDRAM. So even with the new specs it'd be more than just a simple overclock.
 
RancidLunchmeat said:
Clearly not, because what you are asking makes no sense.

Taking the position that Hollywood is 'just' an overclocked Flipper manufactured on the 90nm process, what is your question?

Why would N rush, or even care whether Hollywood was included in the early devkits? Including a Flipper, which is NOT Hollywood, would still allow the devs to get their kits, probably earlier and at a reduced cost.



I suggest you look up the definition of semantics.

Nevermind.
 