So if Far Cry on Wii was given a second chance...

While the art style was great and all, I do agree that Bioshock wasn't that impressive visually. Its graphics, while some parts like the water added a lot to the game, were nowhere near as important to how the game feels to the player as I think the Far Cry games' graphics have been. Then again, I could be wrong.

I agree; things like texture detail and some of the effects aren't that good in the game, IMO. For me, the thing holding it together is the art.
 
You obviously missed the whole point of the comment, which is that unknowledgeable people can look at a game with lots of shader effects, such as Bioshock, and think there are only perhaps two or three... which just goes to show that neither of you has any idea what kinds of shaders are being executed when you look at a scene. You notice the things that really stick out, like normal maps and moving shadows, and that's about it, but there's a whole lot more going on. Take it all away, and I guarantee you would notice the difference. For example, look at this sequence of shots from Haze as they add various shaders to the original wireframe. The last image is the sort you guys would look at and say "Gosh, there aren't any shaders at all," but really, the first image is what a scene with (almost) no shaders looks like. In modern games, post-processing alone involves more shader effects than you probably think are in the entire scene.
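
To give a rough idea of what I mean by "post-processing alone" (this pass list is just a generic example I'm making up, not Haze's or Bioshock's actual pipeline), something along these lines runs every single frame after the scene itself is drawn:

[code]
#include <iostream>
#include <string>
#include <vector>

int main() {
    // Each entry is one full-screen shader pass of the kind a typical
    // 2007-era renderer runs on top of the already-rendered scene.
    std::vector<std::string> postPasses = {
        "bright-pass / bloom extraction",
        "horizontal blur",
        "vertical blur",
        "depth-of-field blur",
        "tone mapping",
        "color grading / filters",
        "film grain & vignette",
    };

    std::cout << postPasses.size()
              << " separate shader passes before the frame even reaches the screen:\n";
    for (const auto& pass : postPasses)
        std::cout << "  - " << pass << '\n';
    return 0;
}
[/code]

And that's on top of all the per-surface shaders you don't consciously notice.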

If you go here, you can see a comparison of the PC version on low quality, which removes most of the shaders, with the Xbox 360 version, and even on low quality not all of the shaders are removed.
 
Post-processing effects like AA, HDR, color filters and the like can be so annoying... however, I like depth of field :p It's just that HDR has become this bastard child of the industry. It was cool at first, but it's so overdone to me now, kinda like cel shading back in the day.

However, if you've played through the single-player part of Call of Duty 4, in particular the American missions in Act 1, it has one of the best scenes and water-cooler moments ever in a game. I won't say what it is to avoid spoiling the game, but holy mother of God, I was in awe of the sheer magnitude.
 
Post-processing effects like AA
AA isn't a post-process, and how on earth can AA be annoying?! What's bad about having better image quality?
It's just that HDR has become this bastard child of the industry. It was cool at first, but it's so overdone to me now, kinda like cel shading back in the day.
HDR isn't a post-process either, and it doesn't give a particular 'look'. You're probably confusing it with high contrast and bloom, neither of which is dependent on HDR.
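
If it helps, here's a minimal sketch of the distinction (the Reinhard-style curve and the 1.0 bright-pass threshold are just illustrative choices on my part, not how any particular game does it): HDR is about representing scene values above 1.0 and compressing them for display, while bloom is a separate blur of bright areas that doesn't need HDR at all.

[code]
#include <algorithm>
#include <cstdio>

// HDR rendering stores scene luminance above 1.0 and compresses it for display
// with a tone-map operator; this Reinhard-style curve is one common choice.
float toneMap(float hdr) { return hdr / (1.0f + hdr); }

// Bloom is a separate effect: extract pixels above a threshold, then blur them.
// Only the bright-pass is shown; it works on LDR input too, it just clamps to 0 more often.
float brightPass(float hdr, float threshold) { return std::max(0.0f, hdr - threshold); }

int main() {
    for (float scene : {0.5f, 1.0f, 4.0f, 16.0f})
        std::printf("scene %5.1f -> displayed %.2f, bloom input %.2f\n",
                    scene, toneMap(scene), brightPass(scene, 1.0f));
    return 0;
}
[/code]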
 
What fearsomepirate said. :)

Basically I mean that MP3 is undoubtedly an example of some serious effort put into optimizing for the hardware in Wii. It's an exclusive game made by one of the best dev houses working on Wii, and the previous games were certainly showcases for Cube. Take what you see however you want.

It likely started development on the Cube though, and would have had its design constrained by that.
Then again, Super Mario Galaxy is quite a nice-looking game, but it isn't doing anything the Cube didn't do either. It's just doing more of it, at a higher framerate, with bigger environments. The effects used in SMG look like a direct continuation of EAD's Pikmin engine, just on a bigger scale at a higher framerate. And I guess that was sort of the point of the Wii: a Gamecube but faster, so devs can get rolling right away, and thus an engine optimized for the Cube is already optimized for the Wii.
 
Would it be unheard of for Nintendo to sell 3rd parties their engines for Wii games?
Seeing as Nintendo seems to be king on Nintendo systems, why not just buy their tech?
 
I asked that before, and the reply I got was: no, because everything is built the way Nintendo likes it, not, like UE3, built to more or less suit everyone as far as the tools go.
 
Then again, Super Mario Galaxy is quite a nice-looking game, but it isn't doing anything the Cube didn't do either.

Yes it is. It's using special effects, geometry, and textures in ways that exceed the Cube's memory & fillrate constraints. That's why there's nothing as nice-looking as Mario Galaxy on the Gamecube.
 
Yes it is. It's using special effects, geometry, and textures in ways that exceed the Cube's memory & fillrate constraints. That's why there's nothing as nice-looking as Mario Galaxy on the Gamecube.

Hey, I said bigger scale. Gamecube x3 or whatever.
 
I guess if you used EMBM instead of normal mapping, you could get a similar look. It would look more plastic than Doom 3 though.
 
Hey, I said bigger scale. Gamecube x3 or whatever.

I definitely have my doubts about the system being 3 times the overall capability of the GC. Nothing has shown or proven it yet. At times like these, I wish Nintendo had done more to improve the hardware, in order to allow much more headroom for new techniques while keeping full backwards compatibility.

I don't know if anyone has gotten to the bottom of it, but supposedly the Wii GPU is possibly a "doubled-up" Flipper, and if that's the case I probably would've advised a tripled or quadrupled-up Flipper as well as a beefier CPU. What if Nintendo had gone with a low-clocked dual-core design for the CPU, or even a CPU with a vector processor or two, in order to pump up shader and physics capabilities? I suppose the TEVs can definitely be used for shader purposes, but *sigh*, I feel left out in the cold by Nintendo and by other devs as well. We hardcore players have been left in the dust, with only a few games to satisfy our craving for titles made for the hardcore gamer, rather than mainstream games with hardcore attributes just "tacked on" at the end. That, and I'm getting sick of games using UE3 tech; they all look the freaking same.
 
I definitely have my doubts about the system being 3 times the overall capability of the GC.

You really shouldn't take things people say excessively literally. And for the record, B3D has indeed "gotten to the bottom of it," and no, the Wii GPU is not a doubled-up Flipper. It has little if any increased functionality and the data throughput of a Flipper overclocked by 50%. The GPU and CPU in Wii are indeed Flipper and Gekko overclocked by 50% and shrunk to reduce heat. The only significant advantage is that the system has ~3.6x the main RAM of Gamecube with a ~50% bandwidth increase.
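
Quick sanity check on those ratios, using the commonly cited (unofficial) clocks of 729/486 MHz for Broadway/Gekko and 243/162 MHz for Hollywood/Flipper, and counting the 64 MB GDDR3 plus the 24 MB internal RAM against the Cube's 24 MB of main RAM:

729 MHz / 486 MHz = 243 MHz / 162 MHz = 1.5x
(64 MB + 24 MB) / 24 MB ≈ 3.7x

which is where the ~3.6x figure comes from, give or take rounding.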

There are lots of things lots of people would have advised, and no version of a fixed-function GPU was one of them. Nintendo's out in left field with the Wii's hardware, and it has cost them dearly in 3rd party support.
 
You really shouldn't take things people say excessively literally. And for the record, B3D has indeed "gotten to the bottom of it," and no, the Wii GPU is not a doubled-up Flipper. It has little if any increased functionality and the data throughput of a Flipper overclocked by 50%. The GPU and CPU in Wii are indeed Flipper and Gekko overclocked by 50% and shrunk to reduce heat. The only significant advantage is that the system has ~3.6x the main RAM of Gamecube with a ~50% bandwidth increase.

There are lots of things lots of people would have advised, and no version of a fixed-function GPU was one of them. Nintendo's out in left field with the Wii's hardware, and it has cost them dearly in 3rd party support.

So was there ever a reason given for why the GPU is much larger than it should be at 90nm? Was it just determined to be a very inefficient shrink?
I could believe SMG is running on something 3x the Cube hardware; it looks better than Super Mario Sunshine at a solid 60 fps (I think; it could be 30), and as good as Pikmin 2 but with larger environments and a better framerate. Unless the Cube was more memory-limited than processing-limited.
 
I dunno. I asked that several times in the thread about the B3D article, but no one responded. My guess is that it either didn't shrink as much as the CPU, or the margin of error is quite large when measuring things with a ruler at that scale.

Judging by various comments ERP has made about the Cube, it was indeed largely RAM limited, although fillrate was sometimes an issue as well. 50% extra fillrate is nothing to sneeze at from a qualitative standpoint; that's basically enough to add a new flashy effect like EMBM to every single surface. When you're talking at such small scales, say 4 texture/fx layers and 4 lights, adding a few layers and vertex lights can make a visible difference.
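
Some back-of-the-envelope numbers to illustrate (my own rough figures, assuming the commonly quoted 4 pixels per clock for Flipper and that the per-pixel layer budget scales with fillrate):

162 MHz x 4 px/clk ≈ 648 Mpixels/s  ->  243 MHz x 4 px/clk ≈ 972 Mpixels/s
4 layers per pixel x 1.5 = 6 layers per pixel at the same resolution and framerate

Going from 4 layers to 6 on every surface is exactly the kind of difference you can see.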
 
I dunno. I asked that several times in the thread about the B3D article, but no one responded. My guess is that it either didn't shrink as much as the CPU, or the margin of error is quite large when measuring things with a ruler at that scale.

Judging by various comments ERP has made about the Cube, it was indeed largely RAM limited, although fillrate was sometimes an issue as well. 50% extra fillrate is nothing to sneeze at from a qualitative standpoint; that's basically enough to add a new flashy effect like EMBM to every single surface. When you're talking at such small scales, say 4 texture/fx layers and 4 lights, adding a few layers and vertex lights can make a visible difference.

Wasn't the audio chip logic integrated into the Hollywood package, as well as the 24 MB of "internal" RAM? That would explain the chip size staying the same despite the smaller 90nm process. Even so, I can sit here and fantasize about a Wii with true 720p capabilities, as well as real shader support (in a DirectX frame of mind), with a dual-core CPU and a GPU as good as an ATi X1600. Ugh, this thing really just kills so many of us nerds here at B3D. :???:

Oh, and I've heard rumors of dedicated physics processing in the Hollywood package as well. Unlikely as it may be, it would be pretty cool if it were true, and so far physics on the Wii have actually been quite good, I must say.
 
The audio logic was already integrated even on Flipper.
As for the 24 MB of RAM, I have no idea. That would explain the chip size if so.
It's still annoying that GPUs the same size as Hollywood on the PC are much more powerful and fully featured.

What physics on the Wii have you seen, besides Elebits? The presence of "physics" processing on the Wii (beyond just SIMD extensions to Broadway) would imply that the GPU is capable of much more general processing than anything has let on graphically.
 
The audio logic was already integrated even on Flipper.
As for the 24 MB of RAM, I have no idea. That would explain the chip size if so.
It's still annoying that GPUs the same size as Hollywood on the PC are much more powerful and fully featured.

What physics on the Wii have you seen, besides Elebits? The presence of "physics" processing on the Wii (beyond just SIMD extensions to Broadway) would imply that the GPU is capable of much more general processing than anything has let on graphically.

LOL, Elebits is the only extreme physics I've seen, and it was impressive, I think. I'd count Red Steel and Far Cry, but they didn't do too much in terms of physics. Red Steel did get me a bit excited when you first start in the elevator: the elevator's glass had a caustics shader running on it, like the one I first saw in Half-Life 2. It made me say "whoa!" at first, but then I saw the rest of Red Steel and said meh. Though I'm very interested to see how well Broadway can handle more high-level physics.
 
The 24 MB of RAM is still separate. There was a mobo teardown a while ago, and you could clearly see it. I think it was under the same heatsink as the GPU.

Put any rumors you've heard out of your mind. The hardware is quite well-understood now. It's an overclocked Gamecube with 64MB of GDDR3 at 3.9 GB/s replacing the 16 MB of "A-RAM" at 81 MB/s. There are no physics processors, programmable shaders, additional cores, or anything of the sort.
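
To make the A-RAM gap concrete (just taking those numbers at face value):

3.9 GB/s / 81 MB/s ≈ 48x the bandwidth
64 MB / 16 MB = 4x the capacity

So the new pool is actually usable as general-purpose memory rather than a slow buffer.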

The glass shader in Red Steel is not all that different from the one in Twilight Princess. But of course, Wii has more fillrate and RAM, so it's used to greater effect in Red Steel. Also, they did some nice stuff with reflections in that game. 50% extra fillrate should allow a reasonably large number of effects combinations compared to the Gamecube.
 
The 24 MB of RAM is still separate. There was a mobo teardown a while ago, and you could clearly see it. I think it was under the same heatsink as the GPU.

Put any rumors you've heard out of your mind. The hardware is quite well-understood now. It's an overclocked Gamecube with 64MB of GDDR3 at 3.9 GB/s replacing the 16 MB of "A-RAM" at 81 MB/s. There are no physics processors, programmable shaders, additional cores, or anything of the sort.

The glass shader in Red Steel is not all that different from the one in Twilight Princess. But of course, Wii has more fillrate and RAM, so it's used to greater effect in Red Steel. Also, they did some nice stuff with reflections in that game. 50% extra fillrate should allow a reasonably large number of effects combinations compared to the Gamecube.

And the big conspiracy-like secret of the Wii was... nothing. LAME. LOL, whatever, it's nice to see it get the proper amount of RAM for the hardware, though. The GC's A-RAM, I hear, was almost useless, as it was just too slow and was mostly aimed at disc drive buffering and audio. Lots of RAM means large environments and shorter loading times! YAY!
 
I dunno. I asked that several times in the thread about the B3D article, but no one responded.

So "B3D has indeed gotten to the bottom of it," simply means that noone answered your question about the reasons for the large GPU die area when you asked the question in a thread on these public message boards. And you then concluded from this that there was no reason for it?

Wow. That's really solid reasoning, right there!

The simple fact is that there has been no good leak about the capabilities of the Wii GPU.
 