WiiGeePeeYou (Hollywood) what IS it ?

Status
Not open for further replies.
I agree. It's a testament to the futility of the internet. 100,000 views, 73 pages of posts from people not knowing what will be in Wii.

but you have to admit we're slowly getting there.. emphasis on slowly ; )

in all seriousness though, it could be a good idea if a mod sanitized this thread a bit (again) - all the 'ohhs' and 'ahhs', 'OMGs' and 'how can that be!?' comments that contribute to ~3/4s of those 73 pages. anybody up for the task? : )
 
in all seriousness though, it could be a good idea if a mod sanitized this thread a bit (again) - all the 'ohhs' and 'ahhs', 'OMGs' and 'how can that be!?' comments that contribute to ~3/4s of those 73 pages. anybody up for the task? : )
NO!!! That's like saying we cut down the tangly bits of rainforest so we can see the pretty flowers. As a World Internet Heritage Thread, we have to take the bad with the good. It's like Grade 1 listed buildings that are ghastly concrete lumps - they're protected to show how wrong 1960s architects were. Heritage Threads must be preserved exactly as is. If anything, mods should take measures to protect posts from being edited before it's too late!
 
What is the WiiGiiPiiOo? Really. What is it? I'm sure that in 75 pages someone answered this...?
Isn't it accepted that the GPU is modelled on Flipper, using the same architecture but clocked faster and with some extras? I think we even have hints of how many functional units it has too.
 
What IBM calls 'graphics features' are just the extra instructions that were already present in the Gekko (paired SPFP ops and int-to-float / float-to-int conversions in the L/S pipe). From the 750CL datasheet, those are the only significant differences from the other processors in the 750 family. The 750CL basically looks like a commercial version of the Gekko shrunk to 90nm. We don't know for certain that the Wii's CPU is a 750CL, though it sounds likely.
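For reference, the 'graphics feature' in question is the quantized load/store: converting small fixed-point integers to floats (and back) with a power-of-two scale, a pair at a time, in the load/store pipe (the psq_l family of instructions). A toy Python model of the dequantize step, purely illustrative - the 8-bit input width and scale factor here are example values, not a fixed hardware format:

```python
# Toy model of Gekko/750CL's quantized load: small fixed-point integers
# are converted to floats with a power-of-two scale as part of the load.
# On the real CPU this happens for a pair of values in one instruction
# (psq_l); this Python version is only an illustration of the math.
def dequantize(raw, scale_bits):
    # e.g. with scale_bits=6, an 8-bit value 0..255 maps to 0.0..~4.0
    return [v / (1 << scale_bits) for v in raw]

print(dequantize([64, 128], 6))  # [1.0, 2.0]
```

The point of doing this in the load/store pipe is that vertex data can stay compressed in memory (saving bandwidth) and still arrive in the FPU as ready-to-use floats.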

I initially also thought of the data quantization they mention in the data sheet, although I was hoping for something more, with more impact on gameplay than just gfx.

yep, i think so too. still, that extra sauce sounds like something that could have a nice impact on the cpu-gpu collaboration, so i'd say mentioning that as 'graphics features' is not unjustified. i mean, console vendors have been known to brag about such features in their systems, so let's grant IBM the right to boast a little too, after all they made it. neither is that document necessarily targeted at cube aficionados : )

Didn't remember that :oops:.
 
Okay, since we can't come to a reasonable conclusion, who's willing to sacrifice their Wii in the name of science come November 19? Time to look through the electroscope, fellas....
 
360's CPU isn't that powerful.

How powerful is it, then?



By the way, I was doing some thinking. Once there turns out to be an HD standard, and Nintendo eventually goes the HD route, then there HAS to be a minimum level of tech required for HD. What would you guys say that minimum would be (CPU, GPU, RAM, etc.)?


Edit: Sorry, forgot to mention. This is assuming the standard is 720p.
 
How powerful is it, then?



By the way, I was doing some thinking. Once there turns out to be an HD standard, and Nintendo eventually goes the HD route, then there HAS to be a minimum level of tech required for HD. What would you guys say that minimum would be (CPU, GPU, RAM, etc.)?


Edit: Sorry, forgot to mention. This is assuming the standard is 720p.

360's CPU is an in-order mosquito on speed. It's potentially very fast, but it's really, really hard to get it there. It has a ton of drawbacks to make it cheap but still potentially fast. Lots of developer effort is required, and ports from other platforms are not easy to pull off. There's a whole thread about the reasoning behind the current console CPUs, by the way.

As for a magical HD spec? Nope. A Voodoo3 from 1999 can run 1280x720 just fine. Wii just isn't designed to allow for that much framebuffer. It's tailored to run 720x480, and they cost-optimized it in various ways around that limit. The eDRAM makes that optimization a critical thing - eDRAM is expensive.
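The framebuffer argument can be put in rough numbers. This is only a sketch: the ~2 MiB embedded framebuffer figure is the one usually quoted for Flipper, and the 6 bytes/pixel (24-bit color + 24-bit Z) format is an assumption for illustration, not a confirmed Hollywood spec.

```python
# Rough framebuffer budget. Assumptions (not confirmed Hollywood specs):
# ~2 MiB of embedded framebuffer, as usually quoted for Flipper, and
# 6 bytes per pixel (24-bit color + 24-bit Z).
BYTES_PER_PIXEL = 6

def fb_bytes(width, height):
    # color + depth buffer footprint for one rendered frame
    return width * height * BYTES_PER_PIXEL

edram = 2 * 1024 * 1024  # assumed ~2 MiB embedded framebuffer

print(fb_bytes(640, 480) / edram)   # under 1.0: SD fits on-chip
print(fb_bytes(1280, 720) / edram)  # ~2.6: 720p blows the budget
```

Under these assumptions an SD frame fits in the embedded memory while a 720p frame needs more than 2.5x the on-chip capacity, which is the cost-optimization point being made above.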
 
5 years in the computer world is a long time; Wii is essentially a dinosaur. A small one, sure, but still a dinosaur lol. Real shame.



but even dinosaurs can output some amazing visuals in the right hands. certain arcade games running at 60fps on 6-10 year old hardware look impressive to me, and more impressive than many games on Wii, and I know Wii is more powerful than those old arcade systems. we haven't seen nearly what Wii can do when pushed. Resident Evil 4, with textures that are not dithered, plus AA and a higher framerate, would be pretty freaking impressive.

sure C1/Xenos and RSX are a lot more powerful than Hollywood, but those GPUs have to spend 3x the performance at 720p just to do the same graphics as at 480p.

I've seen Xbox1 and PS2 games that look more impressive than Xbox 360 games because the last-gen games are running at 60fps. that makes up for a lot.

anyway Wii will not prove itself with graphics, but gameplay. although this thread is all about graphics, since it's about ze mysterious Hollywood.
 
anyway Wii will not prove itself with graphics, but gameplay. although this thread is all about graphics, since it's about ze mysterious Hollywood.

Yeah, didn't you start this thread eons ago? ;)

Sadly I don't think Wii can really do AA. It's not in any screenies I've seen. And, we've seen dithered Wii shots too so that's not so positive looking either.

N64 did AA in nearly every title!!!! Funny how that ended with newer hardware. But wow did N64 ever have screwy texture filtering lol.
 
sure C1/Xenos and RSX are a lot more powerful than Hollywood, but those GPUs have to spend 3x the performance at 720p just to do the same graphics as at 480p.

Is that really true? Well, 720x480*3 = 1,036,800 and 1280x720 = 921,600, so it's not really 3x. The true square-pixel 16:9 res would be 854x480, and 720p is about 2.25x bigger than that. But I guess Wii won't be using square pixels for widescreen?
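The arithmetic in that post checks out and is easy to verify; a minimal sketch using only the resolutions already mentioned:

```python
# Pixel counts for the resolutions discussed in the thread.
def pixels(width, height):
    return width * height

sd      = pixels(720, 480)   # 345,600 px (NTSC SD, non-square pixels)
sd_wide = pixels(854, 480)   # 409,920 px (square-pixel 16:9 SD)
hd      = pixels(1280, 720)  # 921,600 px (720p)

print(sd * 3)            # 1,036,800 - slightly more than 720p
print(round(hd / sd, 2))       # 2.67, not 3x
print(round(hd / sd_wide, 2))  # 2.25, the widescreen comparison
```

So "3x" is a round-up: 720p is roughly 2.7x the pixels of 720x480 and 2.25x the pixels of square-pixel widescreen SD.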

Well that's not the point really, the point is, does 3x pixels require 3x power across the board? Or just 3x fillrate and various amounts in other places?
 
Well that's not the point really, the point is, does 3x pixels require 3x power across the board? Or just 3x fillrate and various amounts in other places?

well, for starters, if you want to maintain decent pixel-texel ratios your texel numbers would scale proportionally. so that's a proportional bandwidth hit right there.
 
Well that's not the point really, the point is, does 3x pixels require 3x power across the board? Or just 3x fillrate and various amounts in other places?
You won't need any increase for the vertex/geometry work (assuming same geometry complexity when rendering 720p, and it's not the case that XB360 is using XB assets!), but will for the shading and drawing side of things. As that's the predominant workload on GPUs, a 3x resolution increase probably needs in the order of 2x the 'power' to render, and 3x the BW. When you up the resolution of textures and meshes, you push up the needs a little more. I think a ballpark 'requirements increase=resolution increase' is about right.
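That estimate (vertex work flat with resolution, pixel work scaling with it) can be written as a toy cost model. The 50/50 split between vertex-bound and pixel-bound work is an assumed illustrative figure, not a measurement:

```python
# Toy render-cost model following the reasoning above: vertex/geometry
# work is independent of resolution, while fill/shading scales with
# pixel count. vertex_share=0.5 is an assumed workload split chosen
# only to illustrate the ballpark, not a measured figure.
def relative_cost(res_scale, vertex_share=0.5):
    pixel_share = 1.0 - vertex_share
    return vertex_share * 1.0 + pixel_share * res_scale

print(relative_cost(1.0))  # 1.0 - baseline at the original resolution
print(relative_cost(3.0))  # 2.0 - "about 2x the power" for 3x pixels
```

Bandwidth is the exception: the framebuffer and texture traffic are per-pixel, so that term scales with the full 3x, which is why the post pairs "~2x power" with "3x BW".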
 
Well that's not the point really, the point is, does 3x pixels require 3x power across the board? Or just 3x fillrate and various amounts in other places?

The latter of course.

Wii will not give us the same level of graphics as PS3 or 360 "just at 480p". That's a ludicrous statement. PS3 and 360 will give us PS3 and 360 graphics at 480p.
Geometry and shading, along with CPU-assisted tasks, are on very different levels between PS3/360 and Wii, and resolution is just one aspect of graphics.

It's not just about fillrate, but it definitely isn't "3x the power across the board".
 
The latter of course.

Wii will not give us the same level of graphics as PS3 or 360 "just at 480p". That's a ludicrous statement. PS3 and 360 will give us PS3 and 360 graphics at 480p.
Geometry and shading, along with CPU-assisted tasks, are on very different levels between PS3/360 and Wii, and resolution is just one aspect of graphics.

It's not just about fillrate, but it definitely isn't "3x the power across the board".

Considering the thread is about Hollywood, Wii's GPU, I'd think they're comparing the difference in GPU workload between 480p and 720p, rather than the entire system's workload. Also, who said Wii will put out graphics as good at 480p as 360 or PS3? Nobody believes that.
 
Didn't see them in the usual places yet, but there are new SC:DA screenshots here. The first one looks very good; for some reason it reminds me of the first screenshots of F.E.A.R. (although it is not on par with F.E.A.R.).

PS: I hope you can see them; these sites sometimes block traffic.
 
Considering the thread is about Hollywood, Wii's GPU, I'd think they're comparing the difference in GPU workload between 480p and 720p, rather than the entire system's workload. Also, who said Wii will put out graphics as good at 480p as 360 or PS3? Nobody believes that.

Exactly. I asked because there has been some talk here that the penalty for using 1080p instead of 720p on PS3 games might not be as big as the pixel difference would imply. I figured that should also translate to this situation...
 