WiiGeePeeYou (Hollywood) what IS it?

I have to agree with RancidLM here.

First of all, if the specs are true, there are still enough changes to the architecture to require a redesign, and secondly we can't possibly know why the first dev kits didn't include Hollywood, since none of us has any insight into Nintendo's decision making. It can't be used as an argument for or against the "leaked" specs. In fact, it's not even an argument at all.

Hopefully we're going to hear something from IGN or GameSpot on the matter, because until now they've been awfully quiet. Maybe they're trying to get some confirmation.
 
RancidLunchmeat said:
Right.

Again, I don't seem to understand this 'quite clear' question. Was Hollywood available at 90nm and 243MHz when the devkits came out?

That is the question: why wouldn't it be available when the devkits came out?

Was there any need for N to include it, rush it, spend any extra expense on including it when they could just include the Flipper instead?

Of course, to help developers get their games finished as soon as possible. Every console maker tries to give game developers finished hardware as soon as possible.

I don't see how the fact that the chip was designed in 2000 and was capable of 200MHz on 180nm has any impact on the 243MHz 90nm chip.

Then this discussion is a lost cause... I mean, how can you not see the impact?

It seems like a very plausible answer to the question is readily available, so I don't understand the mystery.

With respect, it's not plausible at all. What you're saying is that Nintendo could easily have given final kits to devs straight away, but they just decided, on a whim, not to, because they just "didn't care"...
 
hupfinsgack said:
I have to agree with RancidLM here.

First of all, if the specs are true, there are still enough changes to the architecture to require a redesign, and secondly we can't possibly know why the first dev kits didn't include Hollywood, since none of us has any insight into Nintendo's decision making. It can't be used as an argument for or against the "leaked" specs. In fact, it's not even an argument at all.

Hopefully we're going to hear something from IGN or GameSpot on the matter, because until now they've been awfully quiet. Maybe they're trying to get some confirmation.

The site clearly says that Hollywood is identical to Flipper but clocked 50% faster. Wii has been years in the making; this kind of redesign should take months.

As I said, Wii has been years in the making, long enough for an entirely new GPU to be built for the system (just like NES, SNES, N64 and GC). Yet apparently Nintendo couldn't even get a chip identical to Flipper on a 90nm process inside their dev kits until a couple of months ago (quite close to launch)... how is that not genuinely questionable?

To be clear, obviously this doesn't prove beyond a shadow of a doubt that this rumour isn't true. It's just one of the many questions which cast a big shadow of doubt on these rumoured specs.
 
RancidLunchmeat said:
So you're saying it should be easy to reduce die size and increase MHz? Sure. Maybe it was. Maybe it was so easy to do that N was in no rush to do it.

console vendors as a rule try to provide devs with devkits as accurate as possible. if we assume that (1) hollywood is not identical to flipper, and (2) it was so easy for nintendo to come up with hollywood a year ago, then there's absolutely no reason why this chip would not have been in the devkits back then.

I still don't grasp the 'If Hollywood is 'only' an overclocked Flipper at 90nm why wasn't it in the early devkits' question.

Because it wasn't available?

as in 'nobody cared to produce it, despite all necessary conditions being available'? don't be surprised at people's scepticism.

Because those 'only' propositions make it a chip that is definitionally different than the Flipper (so developers who say the kits didn't include Hollywood are correct), but not practically different (so developers have a good representation of the final hardware even without Hollywood).

developers must have some representation of the final hw within reasonable terms before launch. to the extent that some vendors would release 100% desktop configurations as early devkits. ergo if there's new hardware to be released and a GC devkit is the closest thing on earth, a GC devkit will be sent out to devs, regardless of whether it's identical to the final wii hw or reasonably different from wii. i guess you don't expect ninty to keep devs without devkits <6 months before launch?

It seems like a very plausible answer to the question is readily available, so I don't understand the mystery.

well, apparently others find there's one. regardless of whether hollywood ends up being identical to, or even half of, flipper feature- and performance-wise, there's some unknown factor here that makes ninty's behaviour WRT devkits suspicious. to me personally it says: final hw unavailable late into the pre-launch cycle. now, the next logical question would be: why is it unavailable?
 
darkblu said:
developers must have some representation of the final hw within reasonable terms before launch. to the extent that some vendors would release 100% desktop configurations as early devkits. ergo if there's new hardware to be released and a GC devkit is the closest thing on earth, a GC devkit will be sent out to devs, regardless of whether it's identical to the final wii hw or reasonably different from wii. i guess you don't expect ninty to keep devs without devkits <6 months before launch?

MS had launch devs running PC graphics cards and dual G5 processors up to about 3-4 months before games had to go gold. Also, Wii is backwards compatible with the GCN, so maybe some corners were cut until final hardware was available. I would expect Hollywood to have some extra oomph and features, but if final silicon was not available until July/August 2006 I don't see why overclocked GCN dev kits could not be used, especially if there is a lot of feature/design overlap. Getting OCed chips should not be too hard, given that the GCN is a 5-year-old design; those chips on newer processes should clock fairly high.
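
As a rough sanity check on that last point, here's a minimal back-of-the-envelope sketch in Python. It assumes naive first-order scaling (gate delay roughly proportional to feature size), which real silicon never achieves exactly, so treat the numbers as a loose upper bound rather than a prediction.

[code]
# Naive first-order process scaling: gate delay shrinks roughly with
# feature size, so clock headroom grows as the inverse. Real chips
# fall short of this ideal; treat it as a rough upper bound only.
old_nm, new_nm = 180, 90        # GC process vs rumoured Wii process
flipper_mhz = 162               # Flipper's published clock
headroom = old_nm / new_nm      # ideal ~2.0x

print(f"ideal 90nm ceiling: ~{flipper_mhz * headroom:.0f} MHz")
print(f"rumoured Hollywood: 243 MHz ({243 / flipper_mhz:.1f}x Flipper)")
# By this estimate a 1.5x clock on a half-size process is conservative.
[/code]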

Anyhow, we will know more in ~3 months. Until then the only reliable information we truly have is Matt Casamassina (and the accuracy of the information fed to him by developers, and how much that information, even if correct, resembles the finished product) and the software at E3.
 
I noticed that there hasn't been much talk about Nintendo's shader patent, which I think it's safe to assume is part of Hollywood's architecture. The games in development for launch so far don't show any visible shader FX. Nintendo's decision to have developers start development with upgraded GC devkits seems to be the reason for the lack of shader FX, which leads me to believe this was the motivation for Retro's decision not to include bump mapping, or maybe parallax mapping. Although EMBM is available, you get more defined bumps with shader-based bump mapping. Emboss isn't worth their time IMO.

[Patent figure: us007034828-013.jpg]


Most developers wouldn't have enough time to include shaders without setting development back, and missing launch is just not in the cards this gen. Better lighting is expected; per-pixel, I wish. Sooner or later, I think, it's a given.

One title I think may feature shaders is Monolith Soft's "Disaster: Day of Crisis", but knowing it's a Japanese dev house, there's a good chance it won't.

Flipper can do 8 textures in a single pass; it was obviously limited by speed, bandwidth, and available memory. If you're Nintendo, do you keep this spec or increase it? How many textures per pass, to be clear. For the benefit of pixel fill rate, it would be a good idea to increase the number of pixel pipelines. It would be great for AA; I'm hoping available bandwidth is high.
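
To put rough numbers on the fill-rate question, here's a minimal sketch in Python. Flipper's 4 pixel pipelines at 162 MHz are published GameCube specs; the 243 MHz clock is the rumoured figure, and the 8-pipeline row is purely hypothetical, just illustrating the suggestion above.

[code]
# Peak pixel fill rate = pipelines * clock (1 pixel per pipe per clock).
# Flipper's 4 pipes / 162 MHz are published GC specs; 243 MHz is the
# rumoured Hollywood clock; the 8-pipe row is a pure hypothetical.

def fill_rate_mpixels(pipelines, clock_mhz):
    return pipelines * clock_mhz

configs = [
    ("Flipper (GC)", 4, 162),
    ("Rumoured Hollywood", 4, 243),
    ("Hypothetical 8-pipe design", 8, 243),
]

for name, pipes, mhz in configs:
    print(f"{name}: {fill_rate_mpixels(pipes, mhz)} Mpixels/s")
# Flipper (GC): 648 Mpixels/s
# Rumoured Hollywood: 972 Mpixels/s
# Hypothetical 8-pipe design: 1944 Mpixels/s
[/code]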

My stance on the specs posted by Maxconsole is that they're based on old devkits which still lacked final hardware (more representative of devkit stages), and they had no mention of the Wiimote's speaker. Basically the progression of an architecture for the CPU, citing dev comments (IGN) about an increase in CPU processing performance compared to earlier kits.
They also said that Hollywood wasn't present, which suggests it wasn't complete, so an overclocked Flipper was the next best thing.

Not to boast, or toot my own horn, but I called this a year or so ago (Nintendo would build on the GC design, removing design flaws and keeping what proved to be great ideas) when talk of next gen ramped up, after it became apparent that MS would be the first to launch. You heard it here first, on B3D. Many of you thought it wouldn't happen.

Sorry for the small image; I resized it in Photobucket to 25% of its original size, big mistake. FIXED.
 
Ooh-videogames said:
Although EMBM is available, you get more defined bumps with shader-based bump mapping. Emboss isn't worth their time IMO.
You do know that EMBM isn't emboss, right?
Environment Mapped Bump Mapping.
One of the great advantages of EMBM is that it can do non-Phong shading quite readily, without having to do a lot of fiddling.
 
Squeak said:
You do know that EMBM isn't emboss, right?
Environment Mapped Bump Mapping.
One of the great advantages of EMBM is that it can do non-Phong shading quite readily, without having to do a lot of fiddling.

Yes, bad grammar I guess.
 
If Hollywood is a Flipper * 1.5, then the point is not that Nintendo didn't need to wait until recently to ship the hardware in dev kits; the point is that they shipped final hardware a LONG time ago. We knew about GameCube hardware with a 50% overclock around Christmas. If those were the final specs, then devs have had final hardware to work with for ages.
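
For concreteness, here's the "* 1.5" arithmetic as a tiny Python sketch. The GameCube clocks are published specs (Gekko is usually quoted at 485-486 MHz); the 1.5x multiplier is the rumour under discussion, not a confirmed Wii figure.

[code]
# The "GameCube * 1.5" rumour, worked out. GC clocks are published
# specs; the 1.5x multiplier is the rumour, not a confirmed figure.
GC_CLOCKS_MHZ = {"Gekko (CPU)": 486, "Flipper (GPU)": 162}
MULTIPLIER = 1.5

for chip, mhz in GC_CLOCKS_MHZ.items():
    print(f"{chip}: {mhz} MHz -> {mhz * MULTIPLIER:.0f} MHz")
# Gekko (CPU): 486 MHz -> 729 MHz
# Flipper (GPU): 162 MHz -> 243 MHz  (matches the rumoured clock)
[/code]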

What happens to the early dev kits when a developer gets the latest and greatest from Nintendo? I doubt it goes in the trash. Nintendo probably takes the hardware back in some sort of exchange. Where does it go then? My guess is that it goes to a smaller studio that has shown interest in Wii development. So some employee at a little mom and pop game development shop sees the documentation for their "new" Wii development kit and runs to the nearest (ir)reputable game site with a list of specs!
 
OtakingGX said:
What happens to the early dev kits when a developer gets the latest and greatest from Nintendo? I doubt it goes in the trash. Nintendo probably takes the hardware back in some sort of exchange. Where does it go then? My guess is that it goes to a smaller studio that has shown interest in Wii development. So some employee at a little mom and pop game development shop sees the documentation for their "new" Wii development kit and runs to the nearest (ir)reputable game site with a list of specs!

well, it could just as well be the case : )
 
Ooh-videogames said:
I noticed that there hasn't been much talk about Nintendo's shader patent, which I think it's safe to assume is part of Hollywood's architecture. The games in development for launch so far don't show any visible shader FX. Nintendo's decision to have developers start development with upgraded GC devkits seems to be the reason for the lack of shader FX, which leads me to believe this was the motivation for Retro's decision not to include bump mapping, or maybe parallax mapping. Although EMBM is available, you get more defined bumps with shader-based bump mapping. Emboss isn't worth their time IMO.
i urge anyone who has a poor opinion of EMBM to check out these demos:
Radeon Arc
PVR Wheel
PVR Fire (http://www.pvrdev.com/pub/PC/eg/h/Fire.htm)

and if you can find any, the Matrox G400 demos are also fine examples.

most of the stuff developers were doing with DX8 pixel shaders could also be done with a little creativity and EMBM.
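
to make that concrete, here's a minimal NumPy sketch of what DX6-era EMBM does per pixel: read a signed (du, dv) pair from the bump texture, transform it by a small 2x2 matrix, offset the environment-map lookup with it, then sample. this is the generic technique, not GC/Flipper-specific code, and all the data in the usage example is fabricated for illustration.

[code]
# generic DX6-style EMBM per-pixel step, not GC/Flipper-specific code
import numpy as np

def embm_sample(env_map, uv, bump_duv, mat2x2):
    """env_map: (H, W, 3) image; uv: (2,) coords in [0, 1);
    bump_duv: signed (du, dv) from the bump texture in [-1, 1];
    mat2x2: app-supplied scale/rotation for the perturbation."""
    h, w, _ = env_map.shape
    perturbed = np.mod(uv + mat2x2 @ bump_duv, 1.0)  # offset + wrap
    x = int(perturbed[0] * (w - 1))                  # nearest-neighbour
    y = int(perturbed[1] * (h - 1))
    return env_map[y, x]

# tiny usage example with fabricated data
env = np.random.rand(64, 64, 3)          # environment map
uv = np.array([0.5, 0.5])                # interpolated texcoords
duv = np.array([0.1, -0.05])             # bump texture sample
mat = np.array([[0.2, 0.0],
                [0.0, 0.2]])             # controls bump strength
print(embm_sample(env, uv, duv, mat))
[/code]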

as for the current rumored specs i don't believe them. we've heard ati say hollywood was a new design, nintendo announced things like the speaker and memory in the wiimote that are absent from the rumored spec... there's too much conflicting information from official sources.
 
c_k_i_t said:
There was a conference call from MoSys, which says that the Wii will use 1T-SRAM-R and not 1T-SRAM-Q.

What is 1T-SRAM-R? The standard one? The one in GCN?
 
Teasy said:
Was that supposed to be a reply to anyone in particular


Well, yes, to you and many others; it's a fact IMO. If gamers want prettier games there are other choices. It's just that simple. No need to defend something that we already know.
Cheap components, an expensive controller (relatively, I assume) and more choices than GC: that's the Wii console IMO.
 
hupfinsgack said:
What is 1T-SRAM-R? The standard one? The one in GCN?

Don't know if it's the one used in the GameCube, but someone posted this on NeoGAF:

• 1T-SRAM-R, a version that includes Transparent Error Correction™ (TEC), which automatically corrects memory errors during operation, including soft errors caused by high-energy particles, and eliminates the need for laser repair in manufacturing test. This is accomplished without adding silicon area or cost. Introduced in November 2001, our 1T-SRAM-R has now become the standard for most of our licensing activities.

• 1T-SRAM-M, a lower power version that is well suited to particular applications requiring very low operating and standby power, such as cell phone handsets, PDAs and other consumer wireless devices. We introduced 1T-SRAM-M in April 2001.

• 1T-SRAM-Q, an extended density memory that has twice the density of the original version of our technology and up to four times the density of traditional SRAM. We introduced 1T-SRAM-Q in December 2002.

There's also a link to a nice article about 1T-SRAM-R:
http://www.charteredsemi.com/design/memory_ip.asp
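
For anyone curious what "Transparent Error Correction" means in practice, here's a generic single-error-correcting Hamming(7,4) sketch in Python. It's a textbook illustration of the idea only; MoSys hasn't published how TEC is actually implemented, so don't read this as their circuit.

[code]
# Generic Hamming(7,4) single-error correction, illustrating what
# "transparent error correction" does conceptually. This is a
# textbook code, NOT MoSys's actual TEC implementation.

def encode(d):  # d: list of 4 data bits
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                  # covers codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                  # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4                  # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]  # positions 1..7

def correct(c):  # c: 7-bit codeword, possibly with one flipped bit
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]     # re-check positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]     # re-check positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]     # re-check positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3    # 0 = clean, else error position
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1           # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]    # extract the data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                           # simulate a soft error
assert correct(word) == [1, 0, 1, 1]   # corrected transparently
print("single-bit error corrected")
[/code]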
 
It doesn't make much sense to me that the specs are so low. Games such as Pokemon look well ahead of the Cube. Iwata always insisted that games would look almost as good as those on other next-gen machines in SD. However, many developers and publishers have strongly downplayed the capabilities:

e.g. Atari CEO :
"...accelerating some of the Wii titles, because with the technology being so close to the GameCube, we are able to effectively convert teams faster than on PS3."

Associate Producer for Activision Chris Palmisano :
"Fundamentally, it's almost using the same [character] models as the current-gen. They're a little bit better. It's actually got real-time shadows in it, and the way the Wii processes graphics is a little bit better,"

I have a feeling the specs will be close to the IGN figures but I hope not!
 
denis_carlin said:
It doesn't make much sense to me that the specs are so low. Games such as Pokemon look well ahead of the Cube. Iwata always insisted that games would look almost as good as those on other next-gen machines in SD.
Iwata also said Zelda looked next-gen, so his idea of next-gen seems different to most people's. Also, for the better-looking games we don't know whether they were mockups or realtime renders. When you look at the actual game footage of titles, a lot of it doesn't look far removed from current gen. Some of it looks very poor (Elebits) while others look a small step up from GC (that golf game). The most uncertain feature for me is the nice lighting and self-shadowing that's been shown. It's been suggested there was hardware support for such features, which would explain it, and that alone would make the real visual difference. Pokemon with basic Gouraud shading wouldn't look anywhere near as nice as what we've seen. I think it would only take a better lighting and self-shadowing model to make a real difference over current gen and make those better Wii shots doable.
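
As a rough illustration of why a per-pixel lighting model makes such a visible difference over basic Gouraud shading, here's a minimal NumPy sketch (generic graphics math, nothing Wii-specific). Gouraud lights the vertices and interpolates the resulting intensities; per-pixel shading interpolates the normal and lights each fragment, so it catches highlights that fall between vertices.

[code]
# Gouraud (per-vertex) vs per-pixel diffuse lighting at one
# interpolated point. Generic graphics math, nothing Wii-specific.
import numpy as np

light = np.array([0.0, 0.0, 1.0])          # unit light direction

def diffuse(normal):
    n = normal / np.linalg.norm(normal)
    return max(np.dot(n, light), 0.0)      # Lambert term

# Two vertex normals tilted away from each other; sample the midpoint.
n0 = np.array([ 0.9, 0.0, 0.436])
n1 = np.array([-0.9, 0.0, 0.436])
t = 0.5                                    # interpolation weight

# Gouraud: light the vertices, then interpolate the intensities.
gouraud = (1 - t) * diffuse(n0) + t * diffuse(n1)

# Per-pixel: interpolate the normal, then light the fragment.
per_pixel = diffuse((1 - t) * n0 + t * n1)

print(f"Gouraud:   {gouraud:.3f}")   # ~0.436
print(f"Per-pixel: {per_pixel:.3f}") # 1.000: the peak Gouraud misses
[/code]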
 
Shifty Geezer said:
Iwata also said Zelda looked next-gen, so his idea of next-gen seems different to most people's. Also, for the better-looking games we don't know whether they were mockups or realtime renders. When you look at the actual game footage of titles, a lot of it doesn't look far removed from current gen. Some of it looks very poor (Elebits) while others look a small step up from GC (that golf game). The most uncertain feature for me is the nice lighting and self-shadowing that's been shown. It's been suggested there was hardware support for such features, which would explain it, and that alone would make the real visual difference. Pokemon with basic Gouraud shading wouldn't look anywhere near as nice as what we've seen. I think it would only take a better lighting and self-shadowing model to make a real difference over current gen and make those better Wii shots doable.

I'm pretty sure Pokemon was realtime, just as the Cube titles were. Pokemon Battle Revolution did look a generation better than the Cube ones. What other games do you think were mockups?
 