WiiGeePeeYou (Hollywood) - what IS it?

Status
Not open for further replies.
If you read some more of the thread you'll see that Hollywood is about three times the size of Flipper. So there really doesn't seem to be any question that Hollywood must have more inside it.

I saw that, but knowing Nintendo, it wouldn't surprise me if Nintendo only gave us a 1.5x GC, because that's enough for them at 480p.
 
If they did that they might as well have used only GC hardware. What's the use of 1.5x GC hardware? IMO that's just a waste of time and money.

Fillrate, and better use of the CPU for vertex shading. And maybe because it was the frequency Nintendo opted for against the Xbox (the almost identical frequency).
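For what it's worth, the clock part of that is easy to put numbers on. A quick sketch, using the commonly cited clock speeds and assuming 4 pixel pipes on both chips (the pipe count is an assumption, not a confirmed spec):

```python
# Commonly cited clocks: Flipper ~162 MHz, Hollywood ~243 MHz; assume
# 4 pixel pipes on both (an assumption, not a confirmed spec).
flipper_mhz, hollywood_mhz, pipes = 162, 243, 4

flipper_fill_mpix = flipper_mhz * pipes       # peak Mpixels/s
hollywood_fill_mpix = hollywood_mhz * pipes

# With the pipe count unchanged, fillrate scales exactly with clock:
speedup = hollywood_fill_mpix / flipper_fill_mpix
```

Which is just the "1.5x" everyone keeps quoting, restated as fillrate.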
 
If you read some more of the thread you'll see that Hollywood is about three times the size of Flipper. So there really doesn't seem to be any question that Hollywood must have more inside it.
If so then why aren't Wii games looking any better? Zelda, arguably the best-looking game, looks just as good on GameCube. Red Steel and Rabbids, while nice, don't do anything that's really above and beyond GameCube either.

One might argue they're early titles and don't use Hollywood fully, blah blah. I don't buy that. Either there IS more, and then titles show tangible differences and improvements - which isn't the case - or there isn't any more, in which case it's just a GameCube chip upclocked a bit. Which seems to fit the facts, looking at what we got in the end.


Peace.
 
If so then why aren't Wii games looking any better? Zelda, arguably the best-looking game, looks just as good on GameCube. Red Steel and Rabbids, while nice, don't do anything that's really above and beyond GameCube either.

One might argue they're early titles and don't use Hollywood fully, blah blah. I don't buy that. Either there IS more, and then titles show tangible differences and improvements - which isn't the case - or there isn't any more, in which case it's just a GameCube chip upclocked a bit. Which seems to fit the facts, looking at what we got in the end.


Peace.
Zelda is nothing but a direct port from GC, with widescreen support and apparently a bit more lighting added.
The fact remains that all the current games are ~90% or even more developed on the famous overclocked GC devkits, if not normal GC devkits.
 
Either there IS more, and then titles show tangible differences and improvements - which isn't the case - or there isn't any more, in which case it's just a GameCube chip upclocked a bit. Which seems to fit the facts, looking at what we got in the end.

Several games that were already playable (by E3 or TGS, i.e. even on OC GC) show that, e.g. Mario Galaxy, MP3, RE:UC, BTWii...

Plus, I don't know why some think it is that hard to believe that games made in less than a year, for a primary market that isn't interested in gfx (at least for now), where developers needed to work with a brand new machine and get used to a completely different architecture... that such early games didn't show more than minor improvements.

Anyway, if anyone has a better explanation for why the GPU is at least 2x as big and the CPU about 4/3, please share it with us.

BTW, does anyone know if this light scattering video is somewhere else? Thanks in advance.
 
Just for a clearer view:
http://developer.nvidia.com/object/combiners.html

This is the doc about the Riva TNT and GF256 texture combiners.
With this hardware, you were able to do all of the effects that can be done by the GF3 PS. The advantage of the PS is that you have a more visual and clear piece of software, instead of the dxdevice->settextureblending(pointer,pointer,pointer) style of programming.
This is true for the VS too. If you have a fixed pipeline, the main issue is not the flexibility, but the transparency and the compatibility between programs.
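To make that concrete, here's a toy software model of one fixed-function combiner stage. The operation names and structure are illustrative only, not the real NV register combiner or D3D texture stage API:

```python
# Toy model of a fixed-function texture combiner stage.  Real hardware
# selects one of a small set of fixed operations per stage; a
# "program" is just a chain of enabled stages.  Names illustrative.

def combine(op, arg1, arg2):
    """Apply one fixed combiner operation per RGB channel."""
    ops = {
        "modulate": lambda a, b: a * b,            # e.g. diffuse * texture
        "add":      lambda a, b: min(a + b, 1.0),  # saturating add
        "subtract": lambda a, b: max(a - b, 0.0),  # saturating subtract
    }
    return tuple(ops[op](a, b) for a, b in zip(arg1, arg2))

# A two-stage "program": lit base colour, then an additive glow map.
diffuse  = (0.8, 0.8, 0.8)
base_tex = (1.0, 0.5, 0.25)
glow_tex = (0.1, 0.1, 0.3)

stage0 = combine("modulate", diffuse, base_tex)
stage1 = combine("add", stage0, glow_tex)
```

The point being: a GF3-style pixel shader expressing the same two operations is the same math, just written as instructions instead of state settings.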

Thanks.
In that case, how hard would it be to implement a GF3 level of shader-like functionality in software on an "old" Flipper (but didn't ERP say that the Flipper TnL engine is even simpler than the GF2 one?); what about performance, what kind of loss are we talking about to have "shaders" on it; and why did we never see anyone doing that (e.g. in Wave Race and the Factor 5 games, they both did the wave physics on the CPU)?

Anyway, I guess that, this being the case, it should be easier to upgrade the TnL engine to one that is better at vertex shading.
 
Oh, they're definitely both T&L units, it's just that one is fixed function (does certain fixed tasks you enable or disable) and the other you program to do what you want it to do, within the boundaries of transform and lighting of course.

It's just that I'd imagine the first would be the faster/cheaper option for static geometry and would allow perfect out-of-the-box compatibility with GC games, while the second would be slower but able to handle more varied tasks. Which is why I thought that they could possibly use the fast and cheap Flipper T&L unit for 100% GC compatibility and static geometry, while using an additional vertex shader to do whatever the CPU would have had to do on GC.
Even "fixed function" T&L is so flexible that you really want something programmable/heavily configurable to implement it in silicon. There's up to eight different lights, which can be spots, points, directional with programmable fall-off functions, not to mention the various modes of automatic texcoord generation that many of the "fixed function" T&L engines can accelerate.
That multiplies out to a whole damn lot of different programs that need to be supported somehow. Instead of building a static logic block that manages to support everything (perhaps with lots of loop-backs), just jumping for a little processor that can do vector MADDs and dot products and is programmable just to the necessary extent is a pretty attractive solution.
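That "little processor that can do vector MADDs and dot products" idea can be sketched like this (purely illustrative, not Flipper's or any real chip's actual instruction set):

```python
# Illustrative only: a "vertex unit" whose entire repertoire is a
# vector MADD and a dot product, running the kind of directional-
# light "program" a fixed-function T&L block accelerates.

def madd(a, b, c):
    """Componentwise a * b + c."""
    return tuple(x * y + z for x, y, z in zip(a, b, c))

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def directional_light(normal, light_dir, light_col, ambient):
    """One short 'instruction sequence': col * max(N.L, 0) + ambient."""
    n_dot_l = max(dot3(normal, light_dir), 0.0)   # clamp backfacing
    return madd((n_dot_l,) * 3, light_col, ambient)

# Surface facing straight up, white light from directly above:
out = directional_light((0.0, 1.0, 0.0), (0.0, 1.0, 0.0),
                        (1.0, 1.0, 1.0), (0.1, 0.1, 0.1))
```

Swap the "program" for a different sequence of the same two ops and you have a spot light, a fall-off curve, texcoord generation, and so on - which is exactly why a small programmable block beats a giant static one.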

Where exactly you draw the line between configurable and programmable is pretty arbitrary anyway, it's just tiny little steps between the concepts in practice. According to popular wisdom a Geforce 3's pixel and vertex pipes are programmable while a Geforce 2's are not, and that doesn't make much sense to me. Geforce 3 re-uses the exact same ALU blocks, it's just that it can loop to allow longer chains of "instructions" to be executed on it. Okay, that's forgetting the TS, but that leads me back to Flipper's TEV and that's where I want to be: what makes Flipper look "not programmable", above all else, is that Microsoft doesn't have a version number for it. Not the technology itself. It may not support everything a Geforce 3, which has a number assigned, supports, but the same thing is true in the opposite direction.

"Fixed function" T&L is perceived as not programmable because Microsoft doesn't have a version number for it. If you program the chip directly you'll never feel a difference, you'll never feel like you've just crossed that line. And down at the hardware level, transistors at large have been found to not care whether they are executing a pompous MUL instruction or are just casually forming a multiplier.
 
The fact remains that all the current games are ~90% or even more developed on the famous overclocked GC devkits, if not normal GC devkits.
Afaik, you don't actually DEVELOP on the console itself, overclocked or not. You develop on a PC (or possibly a Mac) and then run the game on a console.

And finished Wii hardware ought to have been available long enough for developers to at least put in rudimentary support for any new features, assuming they exist. After all, it's just code on the PC side that doesn't need specific support on that end. All you need is a text editor and a compiler...

Nintendo hasn't bragged about or even let slip anything about Wii. Quite the opposite: didn't that Kaplan dudette let slip by mistake, a long time ago, that Wii would be GC * 1.5?

No Wii dev has leaked any secret information on new features, so either Nintendo has dirt on every dev and uses it blackmail-style to make them shut up about it, or nothing new exists to leak about. You decide.

Occam's razor and all that you know..


Peace.
 
No Wii dev has leaked any secret information on new features, so either Nintendo has dirt on every dev and uses it blackmail-style to make them shut up about it, or nothing new exists to leak about. You decide.

Occam's razor and all that you know..


Peace.

That makes sense, but in that case what is that 40 mm² on the core? A big ATI label or a GDDR3 mem controller?
 
Afaik, you don't actually DEVELOP on the console itself, overclocked or not. You develop on a PC (or possibly a Mac) and then run the game on a console.

And finished Wii hardware ought to have been available long enough for developers to at least put in rudimentary support for any new features, assuming they exist. After all, it's just code on the PC side that doesn't need specific support on that end. All you need is a text editor and a compiler...

Nintendo hasn't bragged about or even let slip anything about Wii. Quite the opposite: didn't that Kaplan dudette let slip by mistake, a long time ago, that Wii would be GC * 1.5?

No Wii dev has leaked any secret information on new features, so either Nintendo has dirt on every dev and uses it blackmail-style to make them shut up about it, or nothing new exists to leak about. You decide.

Occam's razor and all that you know..


Peace.

I believe the quote was Xbox * 1.5.

Interestingly, RAM-wise it pretty much is.
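A rough sanity check of that, using the commonly cited memory totals (treat them as approximate):

```python
# Commonly cited totals, all approximate: Xbox has 64 MB unified;
# Wii has 24 MB of 1T-SRAM plus a 64 MB GDDR3 pool.
xbox_ram_mb = 64
wii_ram_mb = 24 + 64

ratio = wii_ram_mb / xbox_ram_mb   # ~1.4x, in the ballpark of "* 1.5"
```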
 
Afaik, you don't actually DEVELOP on the console itself, overclocked or not. You develop on a PC (or possibly a Mac) and then run the game on a console.

That is true, but they need to test/benchmark on it, they need time to put ideas into practice, and if (as is the case) they want playable SW they need to match it with the previous devkits... There are a lot of things that can justify this; remember that they had the controller for about 1.5 years and many of the best improvements came in the last months. Plus there are more than pure tech reasons for that.

Nintendo hasn't bragged about or even let slip anything about Wii. Quite the opposite: didn't that Kaplan dudette let slip by mistake, a long time ago, that Wii would be GC * 1.5?

Actually at the time she said about 3x GC (which isn't possible with just 50% more clock).

And finished Wii hardware ought to have [...] Occam's razor and all that you know

With all the merits it can have, it can also easily be used to set traps and/or make someone seem guilty of a crime...

That is one of the reasons why I think one can make a philosophical argument about it.

And you still don't know why it is bigger than it should be.
 
Afaik, you don't actually DEVELOP on the console itself, overclocked or not. You develop on a PC (or possibly a Mac) and then run the game on a console.

And finished Wii hardware ought to have been available long enough for developers to at least put in rudimentary support for any new features, assuming they exist. After all, it's just code on the PC side that doesn't need specific support on that end. All you need is a text editor and a compiler...

Nintendo hasn't bragged about or even let slip anything about Wii. Quite the opposite: didn't that Kaplan dudette let slip by mistake, a long time ago, that Wii would be GC * 1.5?

No Wii dev has leaked any secret information on new features, so either Nintendo has dirt on every dev and uses it blackmail-style to make them shut up about it, or nothing new exists to leak about. You decide.

Occam's razor and all that you know..


Peace.

Let's put it this way - they had overclocked Wii devkits to play on, to test on, etc. - heck, even at the last computer-related fair (or whatever they're called, where they show things off), the Wii stuff they showed was running on OC'd GC devkits rather than Wiis.

That "GC * 1.5" could be almost believable, if you forgot, for example, that Hollywood is estimated to have, what, at least twice the transistors compared to Flipper, if not more - and Broadway was something like that too, wasn't it?
 
I saw that, but knowing Nintendo, it wouldn't surprise me if Nintendo only gave us a 1.5x GC, because that's enough for them at 480p.

I don't think you understood me. The GPU is three times bigger; you don't make a GPU bigger for no reason...

If so then why aren't Wii games looking any better? Zelda, arguably the best-looking game, looks just as good on GameCube. Red Steel and Rabbids, while nice, don't do anything that's really above and beyond GameCube either.

One might argue they're early titles and don't use Hollywood fully, blah blah. I don't buy that. Either there IS more, and then titles show tangible differences and improvements - which isn't the case - or there isn't any more, in which case it's just a GameCube chip upclocked a bit. Which seems to fit the facts, looking at what we got in the end.

You don't accept the idea that current Wii games don't take advantage of Wii's hardware. But you do accept the idea that a chip three times bigger than Flipper is just Flipper?

I don't see how what you're saying fits the facts, to be honest. As you say, every game we've seen so far is no better looking than GC games. But we know for a fact that Wii has 3.5 times as much memory as GC, three times the memory bandwidth and 50% faster clock speeds. So then why isn't all that extra memory/bandwidth and clock speed making a difference? Perhaps for the same kind of reason that the extra transistors in the GPU aren't making a difference, yet.
 
And finished Wii hardware ought to have been available long enough for developers to at least put in rudimentary support for any new features, assuming they exist. After all, it's just code on the PC side that doesn't need specific support on that end. All you need is a text editor and a compiler...

Third parties started getting final Wii development kits in August, AFAIR. So they had about 3 months to improve GC-developed games for Wii and put them on shelves.

Rainbow Man said:
Nintendo hasn't bragged about or even let slip anything about Wii. Quite the opposite: didn't that Kaplan dudette let slip by mistake, a long time ago, that Wii would be GC * 1.5?

Do you really think it wouldn't be worth bragging if the GPU in Wii was 3 times as powerful as GC's? Also, Kaplan didn't say anything about GC 1.5; she said it was 3 times as fast as GC. Which incidentally fits what's being speculated here about the GPU being three times as fast as Flipper, along with 3.5 times the memory, 3 times the memory bandwidth etc.
 
If indeed Wii is 3 times as powerful as GameCube, along with the known fact that Wii has 3.5 times as much fast memory, and knowing Wii does not have to spend any of that extra power & RAM to hit higher resolutions, then developers who push Wii to the limits will be able to do some pretty incredible things.

I saw StarFox Assault on GameCube last week; even though it's an old game now, and even though it wasn't all that fantastic overall, the graphics are excellent and 60fps... done by the Ace Combat team at Namco. Looking at F-Zero and RE4... the Wii is going to rock when certain developers push it.
Soul Calibur II still looks excellent on GCN.


We may never know the inner workings of the Hollywood GPU (I still think we will eventually), but we will see what the thing is capable of over the next 4-5 years.
 
I do not fear Wii's hardware and Wii developers as much as Wii publishers... you know, pushing a graphical engine that does AMAZING (tm) things no longer seems like the cost-driving part of the game development equation; it is the "stuff" that takes advantage of the potential given by the engine (a game engine really only gives you potential) and wows gamers that costs a greater and greater fortune.

Texture-set creation, object and level modeling, gameplay design, composing sound effects and music, testing and tweaking, tweaking, more tweaking (which might involve re-running through all the steps you have gone through so far), game plot, character design and dialog lines, etc...

If a publisher hears that the team wants to spend $10-15 million developing a Wii game and pushing the graphical boundaries of that system, I fear that the publisher would not even let them get to the "but we are also pushing the boundaries of 3D motion-sensing-based controls" part of the developer's pitch and would just say "let's move this project over to Xbox 360 and/or PLAYSTATION 3!".
 
That makes sense, but in that case what is that 40 mm² on the core? A big ATI label or a GDDR3 mem controller?

Never-before-seen levels of DRM. Insane redundancy for improved yields. The smallest possible die size NEC's fabs are set up to output for parts running in the hundreds of MHz.
 
You don't accept the idea that current Wii games don't take advantage of Wii's hardware. But you do accept the idea that a chip three times bigger than Flipper is just Flipper?

I don't see how what you're saying fits the facts, to be honest.
Actually, facts are what I do accept. Along with believing things when I see them.

What I don't accept is speculation and wishful thinking etc.

What is fact and what is wishful thinking? Actual fact: no games whatsoever show any kind of substantially improved capabilities over GameCube that cannot be explained by a faster clock or more memory. No screenshots of coming games show that either. Nintendo isn't talking about anything like that at all, even though it wouldn't exactly harm them to do so.

Every FACT does in fact point towards there not being any major differences (apart from USB, WiFi, RAM size, clock speed and that stuff).

I don't see why people are so eager to speculate about there being secret unannounced features as if that were fact, when no such features have been announced, leaked or otherwise divulged by anyone. Even to the point of chiding me for not playing along in what right now is nothing more than mass delusion...

That's not logical.


Peace.
 