Nintendo Wii specs (courtesy of maxconsole.net)

has more than a decade of GPU history really taught you nothing?

i mean, i'm all for software rasterizers, but you're just outright ridiculous here.

just one simple question for you: what hit do you think EMBM would impose on the SPE's LS scheme? mind you, flipper/hollywood have oomphs of transistors dedicated to the task, combined with low-latency embedded memory at all levels of the memory hierarchy.

but let's assume for the sake of argument that after some inhuman heroism a clever coder manages to squeeze some high-enough framerates out of cell at some hollywood-biased tasks. what CPU power do you expect to have left after that for actual game code?

i think some people on these forums should really cut down on the hype dose..

Cell's not a typical cpu; it could likely handle software rasterizers an order of magnitude or two more powerful than what we've seen so far. Plus, the Wii's hardware is extremely outdated at this point. If you cut out some filtering algorithms that cpus would struggle with, I believe a top-of-the-line cpu now could handle a bit more than a GeForce 3's level of graphics (so IQ would be worse, but I think it could match its polygon counts, resolution, and framerate).

As far as EMBM, well, until we start seeing Wii games making heavy use of it, I think it's more relevant that Cell would destroy Wii as a system in polygon throughput.
I agree that using Cell for a Wii-level software rasterizer wouldn't leave much of anything to run game code, though.
 
Nintendo was forced to re-use the GCN technology.

Nintendo chose to use GCN technology. They didn't have to make the case the size of a DVD drive. They didn't have to use an old chip and could have designed something new instead. Lots of "couldas." Nintendo made their decision. Now let's see if it was the right one.
 
I believe a top-of-the-line cpu now could handle a bit more than a GeForce 3's level of graphics (so IQ would be worse, but I think it could match its polygon counts, resolution, and framerate).

i'm afraid your beliefs are way out on the wild side.

As far as EMBM, well, until we start seeing Wii games making heavy use of it

why wait? there're GC games that make decent use of EMBM with hi-res textures, at rock-steady framerates. i'd really love to see a recreation of that on a cell. a 512x512 true-color texture is a megabyte's worth. use it with EMBM and see how cell handles that. i'm afraid it won't be pretty.
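to make that concrete, here's roughly what the inner loop looks like in plain C. purely an illustrative sketch (the names and texture layout are made up, and it's not SPE intrinsics code), but it shows where the dependent read lands:

/* a hypothetical sketch of an EMBM inner loop in plain C -- not SPE
 * code, and all names/layouts are made up; it only shows where the
 * dependent read lands. */
#include <stdint.h>

#define ENV_DIM  512                    /* 512x512 RGBA = 1 MB */
#define ENV_MASK (ENV_DIM - 1)

typedef struct { int8_t du, dv; } Bump; /* signed perturbation texel */

/* env[] is the 1 MB environment map -- four times an SPE's 256 KB
 * local store. every perturbed lookup below is a potential DMA round
 * trip, and you can't prefetch it: the address only exists after the
 * bump texel has been read. dedicated EMBM hardware hides exactly
 * this latency with deep pipelining and on-chip texture memory. */
void embm_span(uint32_t *out, int count,
               const Bump *bump, int bu, int bv,
               const uint32_t *env, int eu, int ev)
{
    for (int i = 0; i < count; ++i) {
        Bump b = bump[(bv & ENV_MASK) * ENV_DIM + ((bu + i) & ENV_MASK)];

        /* the dependent read: u/v depend on the texel just fetched */
        int u = (eu + i + b.du) & ENV_MASK;
        int v = (ev     + b.dv) & ENV_MASK;

        out[i] = env[v * ENV_DIM + u];
    }
}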

, I think it's more relevant that Cell would destroy Wii as a system in polygon throughput.

cell could destroy wii at vertex handling, and possibly at untextured polygon fillrate. unfortunately that is a bit insufficient to create a full-fledged title these days, so i can't really see how it's more relevant. when was the last time you played an untextured game?

I agree that using Cell for a Wii-level software rasterizer wouldn't leave much of anything to run game code, though.

well, that'd be a problem now, wouldn't it?
 
why wait? there're GC games that make decent use of EMBM with hi-res textures, at rock-steady framerates. i'd really love to see a recreation of that on a cell. a 512x512 true-color texture is a megabyte's worth. use it with EMBM and see how cell handles that. i'm afraid it won't be pretty.


Personally I can't remember any game that made really nice use of it; maybe on the floor in RL, Lego SW, or Mario Galaxy (I can't remember much more). But until it's possible to use it in a game like Black (on many surfaces), I personally don't think there is heavy use of it, and that kind of use would make such a game much more beautiful. That is the kind of thing I would like to see on Wii.
 
has more than a decade of GPU history really taught you nothing?

i mean, i'm all for software rasterizers, but you're just outright ridiculous here.

What do you know about software rasterizers???

/looks at sig

Oh. ;)

Sorry, I just find these "CELL does everything better" opinions fascinating.

As an aside — having also written a complete software pipeline (which still shipped, albeit in a disabled state) — there are really nice things you can do to "fake" things (e.g. specular / overbright built into your colour-lookup tables), but those are hardly things reasonably current GPUs struggle with. Usually, for a software engine to make even a hint of sense, it has to be provided with the SDK, as otherwise the engineering effort is simply too high (ask any PS2 coder about clipping :)), and getting a pre-made engine usually precludes you from using all this nifty low-level trickery.
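Roughly the kind of table I mean, as a minimal sketch (layout and ranges are illustrative, not the engine's actual code):

/* a minimal illustrative sketch of baking overbright/"specular" into
 * a colour-lookup table; table layout and ranges are made up. */
#include <stdint.h>

#define SHADES 64   /* light levels per base colour */

static uint32_t clut[256][SHADES];

static uint32_t clamp8(int v) { return v > 255 ? 255 : (uint32_t)v; }

/* build one CLUT row: the lower shades form a plain diffuse ramp,
 * the top few push toward white, so the "specular" pop costs nothing
 * per pixel -- it's already baked into the table. */
void build_row(int idx, int r, int g, int b)
{
    for (int s = 0; s < SHADES; ++s) {
        int diffuse = s * 256 / SHADES;                   /* 0..252 */
        int over = s > SHADES - 8 ? (s - (SHADES - 8)) * 32 : 0;
        clut[idx][s] = clamp8((r * diffuse >> 8) + over)
                     | clamp8((g * diffuse >> 8) + over) << 8
                     | clamp8((b * diffuse >> 8) + over) << 16;
    }
}

/* the inner loop then shades with a single table fetch per pixel */
uint32_t shade(uint8_t texel, uint8_t light) { return clut[texel][light]; }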
The engine was also one of the few that did actually work properly on a Matrox Mystique, without texture filtering but with stippled transparency...
 
@ [maven]

;)

Sorry, I just find these "CELL does everything better" opinions fascinating.

yes, so do i. but when somebody takes such fascinating opinions one step further and starts claiming things that are outright fantastic, the original sense of fascination is quickly replaced by the scent of burning weed..

btw, thurp is neither my first nor will it be my last encounter with sw rasterizers; for one reason or another, on every game project i've worked on since the mid-'90s there's been a sw rasteriser present in one form or another, often written by me. even last year i spent some time with pixomatic doing a 'failsafe' rendering backend for our product - turned out good, but unfortunately was not enough to save the product from its fate *shrug* but anyhow, you can say that i generally have a clue about the subject.

As an aside — having also written a complete software pipeline (which still shipped, albeit in a disabled state) — there are really nice things you can do to "fake" things (e.g. specular / overbright built into your colour-lookup tables), but those are hardly things reasonably current GPUs struggle with. Usually, for a software engine to make even a hint of sense, it has to be provided with the SDK, as otherwise the engineering effort is simply too high (ask any PS2 coder about clipping :)), and getting a pre-made engine usually precludes you from using all this nifty low-level trickery.

shhhh! please don't mention 'ps2' & 'clipping' in the same sentence unless you're prepared to face the grim consequences of Faf storming the building! : ))

The engine was also one of the few that did actually work properly on a Matrox Mystique, without texture filtering but with stippled transparency...

the mystique was a fine piece of vga hw, as long as you did not make it do any 3d rasterisation : ) i should still have a couple of those in some dust-collecting rigs back in my hometown.
 
darkblu, it's very fascinating reading your thoughts about embm software rendering on cell vs other platforms, but what's the point, really? That's not how cell is going to be used; there's plenty of embm shader power and all the texture cache you're ever gonna need in RSX. You may be arguing well, but you're arguing a lost cause.

Cell will be used for software rendering, no doubt about it, but indirect addressing of 1MB textures through embm won't be the rendering scheme used. Look at Warhawk, where you have a single SPU drawing a sky full of large, fluffy volumetric clouds at 60 frames per second. THAT is how software rendering is going to be used...
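Just to illustrate the shape of that workload (a purely hypothetical sketch, I obviously have no idea what the Warhawk code actually does): stream thousands of soft particles and blend them into the framebuffer. Every access is predictable, which is exactly what an SPU's DMA-fed local store loves, and the opposite of EMBM's dependent reads:

/* purely hypothetical cloud-splatting sketch in plain C -- no
 * relation to the real Warhawk code, just the shape of the job:
 * stream soft particles, blend each one into the framebuffer. */
#include <stdint.h>

#define W 640
#define H 360

typedef struct { int x, y, r; int alpha; } Puff; /* alpha: 0..255 */

/* back-to-front blend of round white "puffs" over the colour buffer.
 * all addresses are known up front, so on an SPU the framebuffer
 * tiles and the particle stream can be double-buffered through the
 * local store with DMA, no stalls on unpredictable fetches. */
void splat_clouds(uint32_t *fb, const Puff *p, int count)
{
    for (int i = 0; i < count; ++i) {
        for (int dy = -p[i].r; dy <= p[i].r; ++dy) {
            for (int dx = -p[i].r; dx <= p[i].r; ++dx) {
                int x = p[i].x + dx, y = p[i].y + dy;
                if (x < 0 || x >= W || y < 0 || y >= H) continue;
                if (dx * dx + dy * dy > p[i].r * p[i].r) continue;

                uint32_t dst = fb[y * W + x], out = 0;
                for (int c = 0; c < 24; c += 8) {  /* per channel */
                    int d = (dst >> c) & 0xff;
                    out |= (uint32_t)(d + ((255 - d) * p[i].alpha >> 8)) << c;
                }
                fb[y * W + x] = out;
            }
        }
    }
}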
 
the point, Guden, is that as potent as cell might be for certain tasks, it still does not hold a lot against the general, 'big picture' class of GPU tasks. which involves nasty stuff like addressing indirection/perturbation, funky filtering modes, and overall state-of-the-art latency hiding in the graphics context. particularly not when you factor in transistor-budget efficiency/power consumption.

i believe i've mentioned it before: i'd love to get my hands on cell for some wild fully-sw rasterizer fun, but when it comes to commercially-viable projects on a console-class platform like the ps3, rsx-assisting is much more realistic WRT rasterizer applications of cell. all this IMHO, of course.

btw, from my observations in the rasterizer field throughout the years, i've developed the general impression that there's a ~9-year gap between GPU and CPU tech when it comes to performance under a similar feature-set (i.e. cpus have a generally unlimited featureset, they just don't perform optimally). of course, with GPUs attempting to step further into general-purpose territory, this time gap may shrink.
 
the point, Guden, is that as potent as cell might be for certain tasks, it still does not hold a lot against the general, 'big picture' class of GPU tasks.
Yes, I know, but it's still a moot point, because nobody in their right mind is going to program a game that works that way, not when they have a dedicated traditional 3D graphics chip coupled directly to Cell that deals with these issues much better than Cell could ever hope to do. Nobody's arguing that cell is the ideal choice for every computing/rendering task. Well, except some misinformed f-person cretins maybe, heh.

And just the same, this awesome dedicated GPU can't draw those volumetric fluffy clouds with nearly the efficiency of Cell... It's all about using the right tool for the job. Like if you need to build a house, you wouldn't use the handle of a revolver to hammer all your planks together. It'd work for the odd nail, but no carpenter worth his salt would agree it'd be an ideal solution. :)
 