Is the Hollywood GPU software clocked like the CPU?

Flux

Regular
Is the Wii GPU software clocked by the BIOS like the CPU is?

I mean, how could GC emulation mode work with the GPU still at 243 MHz?
 
To simplify this for you, think of it in PC terms.
You release a video game that runs on a GeForce 4 video card, with the GeForce 5 as the top of the line.
Your monitor has a 60 Hz refresh rate, so the game uses vertical sync (VSync) and displays at 60 fps.
Years later you buy a GeForce 8 card and come across this old disc.
Does the game suddenly look better or play differently with identical settings on a compatible card?
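
Here's a rough C sketch of that VSync idea. Nothing Wii-specific in it, just made-up frame costs, to show that a faster GPU only buys more idle time per frame while the output stays pinned at 60 fps:

    /* Illustration only: a faster GPU finishes each frame sooner and then
     * waits longer for vsync, so the game still presents 60 frames per
     * second. The function below is a placeholder, not any real driver API. */
    #include <stdio.h>

    #define TARGET_HZ 60

    /* Pretend rendering gets proportionally cheaper on faster hardware. */
    static int render_frame_ms(int gpu_speed_factor) {
        int base_cost_ms = 12;   /* hypothetical frame cost on the old GPU */
        return base_cost_ms / gpu_speed_factor;
    }

    int main(void) {
        int frame_budget_ms = 1000 / TARGET_HZ;  /* ~16 ms per frame at 60 Hz */
        for (int speed = 1; speed <= 4; speed++) {
            int work = render_frame_ms(speed);
            int idle = frame_budget_ms - work;   /* time spent waiting on vsync */
            printf("GPU %dx: work %2d ms, idle %2d ms, output still %d fps\n",
                   speed, work, idle > 0 ? idle : 0, TARGET_HZ);
        }
        return 0;
    }
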

The hardware specification of the Wii is very similar to the GameCube's.
The only major change is cosmetic. Because of that, all the development tools carried over, with updates for the higher clocks and a few new features. Mostly the updates were refinements, though in the last year the runtime and other things may have changed.
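
For a rough idea of the clock numbers, here's a throwaway C snippet using the commonly published figures (Wii at 729 MHz CPU / 243 MHz GPU versus GameCube at 486 MHz / 162 MHz). It's just arithmetic to show the ratio, not a description of how the BIOS or firmware actually programs the clocks:

    /* Commonly published clock figures; illustration of the ratio only. */
    #include <stdio.h>

    int main(void) {
        const double wii_cpu_mhz = 729.0, wii_gpu_mhz = 243.0;
        const double gc_cpu_mhz  = 486.0, gc_gpu_mhz  = 162.0;

        printf("CPU step-down factor: %.2f\n", wii_cpu_mhz / gc_cpu_mhz);
        printf("GPU step-down factor: %.2f\n", wii_gpu_mhz / gc_gpu_mhz);
        return 0;
    }

Both factors come out to 1.50, which is why stepping the same silicon back down for GC mode is plausible without any emulation.
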

Here is a quick link that might help. http://www.futurecto.com/4.html

Either way, since the output is locked at 60 fps and the hardware is native to the code, no emulation is taking place, just a reduced clock step and maybe some minor differences in the BIOS or firmware. Does that help make it a little clearer?
 