48 vec4s, 192 "shaders", though I don't think they called them that back then. Not to mention modern shaders have some efficiency gains.
48x Vec4 + Scalar, not 48x Vec4.
And they're unified shaders, just like on modern cards.
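The arithmetic behind those counts can be made explicit. This is just a sketch of the lane-counting convention the posts are using: the 192 figure comes from counting each vec4 lane separately, and the 240 figure is my extrapolation of the "Vec4 + Scalar" correction (5 lanes per unit), not a number from the thread.

```python
# Marketing-style "shader" counts treat every scalar ALU lane as one unit.

def scalar_lanes(units, lanes_per_unit):
    """Total scalar lanes for `units` SIMD ALUs of a given width."""
    return units * lanes_per_unit

# 48 vec4 ALUs, counted lane by lane:
print(scalar_lanes(48, 4))  # 192

# 48 vec4+scalar ALUs (a 5-wide arrangement, per the correction above):
print(scalar_lanes(48, 5))  # 240
```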
It would IMO be such a shame if Nintendo made that sort of dirty compromise for the sake of BC. Clocks at an exact 1.5x, 2x etc. versus the former hardware are only useful when the hardware is identical, or the former hardware is a subset of the new one, i.e. Wii and GameCube, Commodore 128 and 64, Game Boy Color and the monochrome Game Boy.
You drop to the older clock and feature level and have instant compatibility. But here the CPU is all new, so the timings would be wrong even if you ran it at Wii frequency.
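The "exact multiple" point is real for the GameCube-to-Wii step: Gekko ran at 486 MHz and Broadway at 729 MHz, an exact 1.5x. A quick sanity check of that ratio (the clock figures are the only facts here; the code itself is just illustration):

```python
from fractions import Fraction

# Known clocks (MHz) for the GameCube -> Wii step cited above.
GEKKO_MHZ = 486      # GameCube CPU
BROADWAY_MHZ = 729   # Wii CPU

ratio = Fraction(BROADWAY_MHZ, GEKKO_MHZ)
print(ratio)                     # 3/2, i.e. an exact 1.5x
# Because the ratio is exact and the cores are otherwise compatible,
# dropping Broadway to 486 MHz reproduces GameCube timing directly.
print(ratio == Fraction(3, 2))   # True
```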
Right.
There must be some sort of software layer involved. I'd have thought the CPU would be easier to deal with anyway, but it's odd they don't even render at higher resolution for Wii titles.
The problem with emulation is that it's difficult to get the last 5% of games to work flawlessly; console games can be far more dependent on exact timing than they should be. More often than not these issues are bugs in the original code that just never manifested in testing.
Which brings us to why "enhanced broadway" makes sense.
I'd also suggest that the Nintendo special sauce in the GPU is GC GPU emulation. Thanks for that.
Actual T&L HW? lolol "special HW for lighting" *cough*
TEVs/register combiners? XD
hm... Texture/buffer formats? I know the D3D API doesn't support RGBA6, but that's probably not limited by HW. RGB8 is fine.
Could it be that Nintendo went further, and that the justification for the 32MB of eDRAM is indeed to emulate the multiple embedded memory banks in the Wii?
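For scale on that speculation, a back-of-envelope check: the Wii's fast pools were the 24MB 1T-SRAM (MEM1) plus roughly 3MB embedded in Hollywood (2MB framebuffer, 1MB texture cache). Those figures are the only facts assumed here; whether the Wii U actually maps them into its eDRAM is the thread's open question.

```python
# Back-of-envelope check for the eDRAM speculation above (sizes in MB).
WIIU_EDRAM = 32
WII_MEM1 = 24        # Wii's internal 1T-SRAM pool
WII_EFB = 2          # embedded framebuffer in Hollywood
WII_TEX_CACHE = 1    # embedded texture cache

wii_pools = WII_MEM1 + WII_EFB + WII_TEX_CACHE
print(wii_pools)                 # 27
print(wii_pools <= WIIU_EDRAM)   # True: all of Wii's fast pools would fit
```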
The Wii U is not based on the R300. Going by what the article says, it's based on an embedded technology that is much more recent.
GPGPU debuted on the R300, a 2002 chip.
Anyone noticed Nintendo said that it would be compatible with most Wii games, not all...
I wonder if there's anything to gather from that.
Did they single out earlier games? It's more commonly games that use funky exploits that don't emulate well, which tend to come later in the console's life-cycle as devs get to learn its secrets.
That's actually a very interesting point. Can anyone here shed any light on earlier Wii games which may have been reliant on particular hardware attributes of the Wii (which may not be present in the Wii U when it's in "Wii mode")?
They said the same thing for the 3DS. It's most likely just for liability reasons.
Anyway, I believe the CPU to be Broadway/PPC 7xx based, for easier BC and because "enhanced Broadway cores" is the info they gave to developers. I see no reason they would use the term "Broadway" for cores that are very different from the Wii's Broadway CPU. I mean, those words are for devs, not PR wordplay for the public.
Care to elaborate on 'funky exploits'? I don't quite understand the process.

First games are about getting something out the door, so you tend to go by the book using whatever libraries are provided. As you learn the system, opportunities present themselves, such as using a piece of hardware to do something it wasn't really intended to do (like GPGPU, say), or using an exploit in a memory operation to perform some extra function, or sticking a bit of data in a system buffer where it shouldn't really go because it gives a good performance advantage when you do.

There are amazing things developers have found to do with hardware beyond its design. One of my all-time favourites was a hardware hack on the Amiga computer. It had a clock port on the mobo designed just for a RAM and timing part that was never released, which enterprising individuals and companies used as a general expansion port to add sound cards, USB controllers and all sorts.

The problem with code that targets specifics of the metal, even if it's just relying on the time it takes to perform an instruction to synchronise events, is that hardware changes stop it working. We've even seen that in supposedly identical hardware, such as some PS2 revisions being incompatible with some games. The official line is that if you use the system libraries, your program will work on alternative hardware. But using system libraries introduces an inefficiency, so devs wanting more from the system will sometimes try to avoid them.
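The timing reliance described above can be illustrated with a toy model (entirely hypothetical code, not from any real game): a delay "calibrated" in loop iterations is only correct if every iteration costs the same number of cycles on the new hardware.

```python
# Hypothetical model of a timing-dependent busy-wait: the game counts
# loop iterations instead of asking a timer, implicitly assuming a
# fixed cycle cost per iteration on the original CPU.

def busy_wait_us(target_us, cycles_per_iter, clock_mhz):
    """Actual elapsed time (us) of a loop calibrated on the original HW."""
    original_clock_mhz = 729       # Wii's Broadway clock
    original_cycles_per_iter = 3   # assumed cost per iteration (made up)
    iters = target_us * original_clock_mhz // original_cycles_per_iter

    # Elapsed time when the same iteration count runs elsewhere:
    return iters * cycles_per_iter / clock_mhz

# On the original CPU the wait is accurate:
print(busy_wait_us(100, 3, 729))   # 100.0 us
# On a new core at the same clock but 1 cycle/iteration, it's 3x too short:
print(busy_wait_us(100, 1, 729))   # ~33.3 us
```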
This may shed some light on whether we're looking at BC hardware or software emulation, no?

It's bound to be a bit of both. Unless they put in Wii hardware as-is, there'll be some software translation involved, I'm sure. The CPU could be anything from a perfect carbon copy of Broadway, to a custom part with support for Broadway instructions, to an unrelated PPC part running an emulator to catch and translate the unsupported instructions.
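That last option, an emulator that catches and translates unsupported instructions, is essentially trap-and-emulate. A minimal sketch under invented names: the opcode `ps_madd` is modeled on Broadway's paired-single multiply-add, but the register file and handler here are purely illustrative.

```python
# Illustrative trap-and-emulate dispatch: instructions the host core
# supports run natively; unsupported ones trap into software handlers.

NATIVE_OPS = {"add", "load", "store"}   # assumed host-supported ops

def emulate_ps_madd(state):
    # Software fallback for a Broadway-style paired-single multiply-add
    # (one lane of a toy register file; illustrative only).
    state["f0"] = state["f1"] * state["f2"] + state["f3"]

SOFTWARE_HANDLERS = {"ps_madd": emulate_ps_madd}

def step(op, state):
    if op in NATIVE_OPS:
        return "native"                 # would execute directly on the CPU
    handler = SOFTWARE_HANDLERS.get(op)
    if handler is None:
        raise RuntimeError(f"illegal instruction: {op}")
    handler(state)                      # trapped and emulated in software
    return "emulated"

state = {"f0": 0.0, "f1": 2.0, "f2": 3.0, "f3": 1.0}
print(step("add", state))       # native
print(step("ps_madd", state))   # emulated
print(state["f0"])              # 7.0
```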
For this exact reason, I don't believe that particular leak has come from where the source claims. I'm not saying it's fake, but that it's maybe just someone's opinion/educated guesswork based on using that particular dev kit. The source claimed the info came directly from warioworld (the Nintendo developer portal), but I highly doubt Nintendo would use such terminology in official documentation. It holds no value to developers and provides them with little to no info about the CPU. They are likely receiving much more detailed information (such as the first leak VGLeaks had, where some details about the CPU cache etc. were included).
It's not a POWER7 derivative. It's directly descended from the CPU core in the Wii; there are just more of them, and they are clocked a little faster. It does come out about the same as Xenon for processing power, but the clock is much, much closer to the Wii's than the X360's.