Digital Foundry Retro Discussion [2018 - 2020]

The problem was that they made the Wii hardware-BC with the GC basically by doing no work on the hardware design. Otherwise they'd have had to either create a new Flipper that was BC, or go through the effort of writing an emulator on a new architecture. A new architecture mid-cycle would have been bonkers: the cost of a new console with none of the new-gen benefits.

What I find hilarious is that they basically castrated the hardware of one of their best-selling consoles to offer BC with one of their worst. They neglected to support BC for the decade-long market leader NES on the SNES, then again neglected SNES BC on its successor, and so on. But the GameCube, their most mediocre console until then (market-wise), was the one where they felt "oh, we gotta make sure our next machine plays all its games". Historical hindsight is fun.
 
The Wii had an immense amount of impact, alright. The question is: will anyone remember its significance in casualizing games, targeting normies, and letting the phone industry take advantage of that new audience? I think the Wii also brought down the cost of the accelerometers that made their way into the iPhone and everything else after...
 

Clearly, revised GC hardware met their needs, and providing BC earned brownie points from all the gamers who see it as a gesture of goodwill. Also, weren't many early Wii games developed on GC devkits? Nintendo was clearly making use of an already existing ecosystem. A pretty low-risk way of trying a new thing, and it paid off handsomely.
 

Totally true. I understand why Nintendo went that route; that's why I said "hindsight is fun". They didn't know the Wii was gonna be the phenomenon it was. In fact, they weren't even sure it was gonna succeed at all. It was a gamble, and as such they minimized risk by keeping the investment low. Still, the Wii U proved you can make hardware that is both compatible with the GC feature set AND incorporates modern GPU features. Had Nintendo bet bigger on the Wii, they'd have requested something like that from AMD back then, and by God they would have added gyros to the standard Wiimote as well. MotionPlus can go to hell; it's what the Wii should have been from the start.
 

The Wii U's hardware benefitted greatly from the six years of CPU and GPU development between 2006 and 2012. More importantly, process advances meant that Hollywood's functionality and the GC/Wii's 1T-SRAM could be affordably integrated into a single IC. The success of the original Wii funded such ambitions, even for something as overblown as creating a triple-core PPC 750 instead of just adding one to the side and actually using a modern CPU core.
 

Yeah, they wouldn't have been able to pull off a Wii U in 2006. But maybe something more like a GC with DX8-level vertex and pixel shader functionality, at the very least.
 
Am I the only one who likes the Wii's "GameCube on steroids" hardware? I think most games could've done a lot more with it, though. Mario Galaxy and Skyward Sword blow away 99% of the rest of the library visually.


Well, it pretty much had DX8-level pixel shaders (Nintendo called it a multi-stage texture environment, lol). No vertex shaders, but at least it could push a decent amount of polys.
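For anyone wondering what that "multi-stage texture environment" (TEV) amounts to: each stage is roughly a fixed interpolate-and-add between selectable inputs (textures, rasterized colour, constants, or the previous stage's output), chained up to 16 times. Here's a minimal sketch in C; the RGB-only simplification and all the names are mine, not Nintendo's actual GX API:

typedef struct { float r, g, b; } Rgb;

static float clamp01(float x) { return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x); }

/* One simplified TEV stage: out = (d + lerp(a, b, c)) * scale, clamped.
   The real hardware adds selectable bias, subtraction and swizzles on top. */
static Rgb tev_stage(Rgb a, Rgb b, Rgb c, Rgb d, float scale)
{
    Rgb out;
    out.r = clamp01((d.r + a.r + (b.r - a.r) * c.r) * scale);
    out.g = clamp01((d.g + a.g + (b.g - a.g) * c.g) * scale);
    out.b = clamp01((d.b + a.b + (b.b - a.b) * c.b) * scale);
    return out;
}

Chain a handful of those stages and you can cover a surprising share of what early DX8 pixel shaders were actually used for, which is why the comparison keeps coming up.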
 

More like DX7-level features plus some extra TEV blending stuff, but still less feature-rich than the original Xbox. I'm not sure if that makes me think of Nintendo as terribly cowardly or extremely ballsy.
 

I'd have a lot more respect for the hardware if Nintendo had pushed the clocks further. In total, I think the entire system pulls about 15 W. Double or even triple the GC's clocks could've been possible if they had really wanted it, and it still would've been a fraction of what the 360 or PS3 consumed.

But better yet, I'm still a proponent of the idea that they should've used a completely different set of hardware, because regardless of the clocks, the feature set would still be a point of contention. The legendarily short pipeline of the PPC 750 only got you so far. Though despite the Wii U's crappy performance in a lot of ports, it still impresses me that such an old CPU architecture was able to run games built around much more robust SIMD implementations. Either games are not as SIMD-taxing as I figured they were, or Nintendo's claims of "eliminating memory latencies" really were true. Three PPC 750 cores tied together with more cache and clock speed should not have worked out "so well".

Then of course you have the goddamn Hollywood GPU and all its inherited lack of features. Nintendo themselves made some relatively nice-looking games, but the vast, vast majority of other titles came nowhere near the GPU's potential. The inability to drop in downgraded assets from the HD platforms with normal maps and materials intact certainly was not to the Wii's advantage. I can't remember a single Wii multiplatform game where the devs actually played around with the TEV features. Only exclusive titles got that sort of attention, and outside of Nintendo it was a rarity, and spotty in terms of quality.

The Conduit games, for instance, are an unbalanced mess, where you have some really impressive normal mapping on characters and particular objects, but not enough performance left to bring the environment or other assets to the same level of quality. Shin'en's Jett Rocket is the only game that comes to mind that matches Nintendo's level of visual quality and balance. Their Fun! Fun! Minigolf is really cool, though, in that it uses a global dynamic shadow-mapping system (no self-shadowing, mind you). It doesn't have Jett Rocket's scope of texture effects, though.
 

Yeah, it didn't have vertex shaders, but the pixel pipeline was essentially at the same level as the Xbox's.
 
New DF Retro extra, a "super unique" one (à la Diablo 2 :D):

https://www.eurogamer.net/articles/...at-links-tron-and-pitfall-the-mayan-adventure

Bill Kroyer, an animation veteran who learnt from the original masters at Disney, pioneered CG rendering in Tron - and collaborated with Activision during the mid-90s on the fascinating Pitfall sequel.

(....) The Mayan Adventure aimed to bring Hollywood-level animation to gaming, (....) Bill Kroyer and his team were the ideal pick, having worked both at Disney and independently.

What resulted was one of the most genuinely fascinating 16-bit 2D platformers of the era, delivered across a multitude of platforms - as DF Retro has previously reported.

the tale behind how Pitfall Harry (....) Kroyer offers up his opinions on the techniques used in CG animated features of today, and his concerns that key skill-sets in the creation of hand-drawn animation could be lost forever.

 
I don't think it was.
Imho, the GameCube was fairly close to the Xbox. Also, the Xbox couldn't run F-Zero GX at 60 fps, imho...

In most cases, the Xbox was indeed the most powerful. The GameCube could process many, many textures on a single polygon, however, so there are definitely cases where the GameCube can do things faster than the Xbox. Overall, the GameCube and the Xbox were fairly close. The PS2 lagged far behind both of them as far as real-world numbers go. I think one of the reasons we didn't see a lot of things on the Xbox at the time that would have been impossible or difficult on the other platforms is that pixel/vertex shaders weren't available elsewhere. With the Xbox not usually being the primary SKU back then, and use of the pixel/vertex shaders making a port to other platforms difficult, I think Xbox versions tended to be left by the wayside a bit, and the hardware not really exploited by developers.

The other thing that kept people from really using the Xbox's pixel/vertex shaders is that the technology itself was fairly new. That's one of the reasons I was so impressed by how John Carmack used the shaders in Doom 3 when I bought the game for the Xbox back then.

From what I understand, the artists basically made two models: a high-poly one and a low-poly one. Then there was a tool that processed those two models and created shader inputs for the low-poly model to make it look like the high-poly one. This is a great use of the shaders, and probably one of the reasons Doom 3 looks so good on the original Xbox.
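For what it's worth, that tool's job is what later became known as normal-map baking. Here's a rough sketch of the idea in C; Mesh, Hit, trace_ray() and to_tangent_space() are all hypothetical stand-ins for whatever id's internal tool actually did:

typedef struct { float x, y, z; } Vec3;
typedef struct { Vec3 position, normal; } Hit;
typedef struct Mesh Mesh;                      /* opaque high-poly mesh */

Hit  trace_ray(const Mesh *m, Vec3 origin, Vec3 dir);          /* hypothetical */
Vec3 to_tangent_space(Vec3 world_normal, Vec3 surface_normal); /* hypothetical */

/* For each texel of the low-poly model's UV layout: shoot a ray along the
   low-poly surface normal, find the matching point on the high-poly mesh,
   and record that point's normal relative to the low-poly surface. The
   renderer later reads this texture instead of the interpolated normal. */
Vec3 bake_normal_texel(Vec3 low_pos, Vec3 low_normal, const Mesh *high_poly)
{
    Hit hit = trace_ray(high_poly, low_pos, low_normal);
    return to_tangent_space(hit.normal, low_normal);
}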
 

That is normal mapping in a nutshell, my friend. Pretty much all 360-era games onwards used, and still use, the very same technique.
Carmack didn't invent it, but he certainly pioneered its practical application in games.
The shaders Doom 3 actually runs in real time are simply per-pixel point and spot lights with support for normal maps and stencil masks. Pretty advanced for its time, but still fairly standard fare. It's no surprise other games beat Doom to it on the Xbox, releasing earlier with very similar rendering pipelines and resulting graphics.
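To make that concrete, the per-fragment work is roughly this (a self-contained C stand-in for the pixel shader, with my own names and a crude linear falloff; Doom 3's actual attenuation and specular term differ):

#include <math.h>

typedef struct { float x, y, z; } Vec3;

static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static Vec3 normalize3(Vec3 v)
{
    float len = sqrtf(dot3(v, v));
    Vec3 out = { v.x / len, v.y / len, v.z / len };
    return out;
}

/* Diffuse contribution of one point light at a surface point; the normal
   comes from the baked normal map rather than the interpolated vertex
   normal, which is what makes a low-poly model look high-poly. */
static float point_light_diffuse(Vec3 surface_pos, Vec3 mapped_normal,
                                 Vec3 light_pos, float light_radius)
{
    Vec3 to_light = { light_pos.x - surface_pos.x,
                      light_pos.y - surface_pos.y,
                      light_pos.z - surface_pos.z };
    float dist  = sqrtf(dot3(to_light, to_light));
    float ndotl = dot3(mapped_normal, normalize3(to_light));
    if (ndotl < 0.0f) ndotl = 0.0f;

    float atten = 1.0f - dist / light_radius;   /* crude linear falloff */
    if (atten < 0.0f) atten = 0.0f;
    return ndotl * atten;
}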
 
Well, I'm just going by what the guy who (partially) wrote the ubershader path for Dolphin said.

Is that really an opinion thing? Heh

I mean, the Xbox most likely could've if it was reworked as needed.
It's an opinion. I mean, I'm not sure the Xbox could handle the transparencies at that speed.

On a different note, this classic song is worth sharing.


The new DuckTales series also has a version of this song.

 
Well, the Cube has higher bandwidth (7.8 GB/s to the framebuffer vs 5.3 GB/s on the Xbox), but they both have 4 pixel pipes and the Xbox is clocked almost 50% higher, so it's got a much better fill rate. So I'm not sure which is better for transparencies; probably the Xbox, I'm thinking.
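Quick sanity check on that "almost 50%" figure, using the commonly cited clocks (Flipper at 162 MHz, NV2A at 233 MHz, 4 pixel pipes each):

#include <stdio.h>

int main(void)
{
    const double flipper_mhz = 162.0;  /* GameCube/Wii GPU clock */
    const double nv2a_mhz    = 233.0;  /* Xbox GPU clock */
    const int    pipes       = 4;

    printf("GC fill rate:   %.0f Mpixels/s\n", flipper_mhz * pipes); /* 648 */
    printf("Xbox fill rate: %.0f Mpixels/s\n", nv2a_mhz * pipes);    /* 932 */
    printf("Xbox advantage: %.0f%%\n",
           (nv2a_mhz / flipper_mhz - 1.0) * 100.0);                  /* ~44% */
    return 0;
}

So with the same pipe count, the fill-rate gap is just the clock ratio, about 44%, which matches the claim above.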

Have you ever seen Otogi? That game throws amazing screen-filling particle effects all over the place, and it doesn't seem to slow down much either.

 
At the time, the console industry was a battlefield.

A real one, not the skirmishes that exist today. For my taste, the game seemed a snooze (never played it), and many people wanted to sell it as the new lost ark. PS3 fans were looking for the missing showcase title, almost at all costs, and that opinion was backed up because it does look good; but imho the game looked too gritty and colourless to me.

Like rosy cheeks: if you take all the bright and colourful away from me, a game loses a bit of its artistic touch, imho.

Uncharted 2 and KZ2 were lavishly held up by PS3 fans as the referents, the standard of what games at the time should look like, and used to reinforce their bias that the PS3 was more powerful than the X360 (it was not).

The game has its merits, though: despite its charmless colour, the physics were very good, typical of games of that era, unlike current games. Back then, physics in games were much more important.

 