WiiGeePeeYou (Hollywood) what IS it ?

Status
Not open for further replies.
It might be running in a lower color mode when you run it in progressive scan. There are a few Gamecube titles that had no transparency dithering even on S-Video (trust me, it shows up on S-Video if it's there) but did in 480p; Eternal Darkness was one of them. The same thing's been observed in Wind Waker and Excite Truck. Also, the Cube used a deflicker filter of some sort in 480i mode that was often the only antialiasing games had, so losing it in 480p introduced aliasing.

OTOH, some people say RE4 didn't dither in 480p, although it was obviously running in the 6:6:6:6 color mode and had pretty significant banding and aliasing. And then there are games running in the high-color mode with plenty of transparencies and no dithering, like F-Zero and Metroid Prime. So there are ways around it, but obviously it's a bit of a trick.
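Since the 6:6:6:6 mode and dithering keep coming up, here's a small illustrative sketch (plain Python, not actual Flipper/Hollywood code, and the exact dither pattern the hardware uses is an assumption) of why truncating 8-bit channels to 6 bits produces banding, and how ordered (Bayer) dithering trades that banding for the checkerboard pattern people notice:

```python
# Illustrative only: banding from a 6-bit-per-channel (RGBA6) framebuffer,
# and how an ordered (Bayer) dither hides it with a fixed screen pattern.

def quantize6(v):
    """Truncate an 8-bit channel value to 6 bits of precision (RGBA6 storage)."""
    return (v >> 2) << 2

# 2x2 Bayer matrix; its values 0..3 span just under one 6-bit quantization
# step (which is 4 in 8-bit units).
BAYER_2X2 = [[0, 2], [3, 1]]

def dither6(v, x, y):
    """Add a position-dependent bias before truncating to 6 bits."""
    bias = BAYER_2X2[y % 2][x % 2]
    return quantize6(min(255, v + bias))

# A smooth 8-bit gradient collapses into flat bands of width 4 after
# truncation; dithering breaks the band edges up spatially instead.
gradient = list(range(40, 56))
plain = [quantize6(v) for v in gradient]    # only values 40, 44, 48, 52
dithered = [dither6(v, x, 0) for x, v in enumerate(gradient)]
print(plain)
print(dithered)
```

Averaged over the 2×2 pattern, the dithered output approximates the original gradient, which is exactly the checkerboard shimmer that sharp displays make so visible.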
 
I can do screen captures of this if anyone's interested, of Wii Sports or Rayman (via a composite video capture card, though). I haven't been able to determine the framebuffer resolution yet; it's outputting 576 lines, and screen captures would probably reveal whether there's any scaling going on.

I'm interested! Try to get some shots showing the best or worst of something; that'll be interesting even for a casual comparison.
 
This sounds similar to what I read in this blog:
http://kingludic.blogspot.com/2006/11/agile-wii.html

My sources are mostly snippets of forum flak, but the saturation is good enough, with the precedent of the GameCube's chip, that I'm going to stick my neck out and say it is indeed an out-of-order chip. Xenon and Cell are in-order, which is bad for gameplay innovation, according to friend of game developers everywhere, Mr. Checker:

So, as you know, graphics and physics grind on large homogenous floating point data structures in a very straight-line structured way. Then we have AI and gameplay code. Lots of exceptions, tunable parameters, indirections and often messy. We hate this code, it’s a mess, but this is the code that makes the game DIFFERENT. Here is the terrifying realization about the next generation consoles: I’m about to break a ton of NDAs here, oh well, haha, I never signed them anyway.

Gameplay code will get slower and harder to write on the next generation of consoles. Modern CPUs use out-of-order execution, which is there to make crappy code run fast. This was really good for the industry when it happened, although it annoyed many assembly language wizards in Sweden. Xenon and Cell are both in-order chips. What does this mean? It’s cheaper for them to do this. They can drop a lot of cores. One out-of-order core is about four times [did I catch that right? Alice] the size of an in-order core. What does this do to our code? It’s great for grinding on floating point, but for anything else it totally sucks. Rumours from people actually working on these chips – straight-line runs 1/3 to 1/10th the performance at the same clock speed. This sucks.

So the great strength of the PS3 and Xbox 360 come at a cost, not just in terms of manufacturing, but in terms of performance, and consequently in terms of production costs relating to programming.

Wii can do stylized graphics and multi-threaded AI, it can provide a vast range of visual aesthetics and simulations. Wii is like a Ninja, while PS3 is like a Samurai and Xbox 360 is like a Knight wearing glasses. Who do you think would win in a fight?

I would have picked better historical references, and you only made the 360 a knight because it's "Western". Anyhow, going by the technological advantages of each... well, I can't say about a ninja versus a samurai (but since real ninjas were nothing like in games and movies, I'd say the samurai, as a trained warrior with heavier equipment, would likely win), and a knight, even wearing glasses, would easily win the fight. I doubt the weapons of the samurai and ninja, typically made for cutting flesh, could easily pierce the heavy and well-developed armor of a knight, if they could at all. Not to mention that with all of Europe's wars, there's a good chance the knight would be considerably more battle-tested against varying fighting styles than the samurai or ninja. For that matter, wearing glasses isn't much of a disadvantage, as they fix the primary fault of the knight in this case (eyesight) and are another example of the knight having superior technology to his Japanese opponents.
 
Are you saying that Samurai vs Ninja vs Knight isn't relevant to the Wii's GPU?!?! :oops:

Here's a genuinely interesting tidbit, which you can verify for yourselves if you have a Wii. A friend of mine just got a Wii and said that the slowdown in Metroid Prime 2 near the portals is completely gone. If you recall, the game's framerate just died right as you entered one of those things, dropping to something absurd like 10 fps. Likewise, the slowdown when you spam the ground with all 20 proton bombs in Rebel Strike is gone. In both instances, the screen was filled up by indirect-texturing effects, eating all the Gamecube's fillrate.

Given how badly the framerate crashed in those situations, I doubt that 50% more clock cycles would eliminate the problem, although I haven't seen it for myself. My guess is that the Wii's GPU has more pipelines than Flipper, either 6 or 8.
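For rough numbers: Flipper ran at 162 MHz with 4 pixel pipelines, and Hollywood is reported at 243 MHz. A back-of-envelope sketch of the guesses above (the Hollywood pipeline counts here are pure speculation, exactly as in the post):

```python
# Back-of-envelope peak fillrate under the simplest possible model:
# one pixel per pipeline per clock. Real effective fillrate is lower,
# but the ratios are what the speculation above turns on.

def peak_fillrate_mpix(clock_mhz, pipelines):
    """Peak output in megapixels per second."""
    return clock_mhz * pipelines

flipper = peak_fillrate_mpix(162, 4)         # GameCube baseline: 648 Mpix/s
hollywood_same = peak_fillrate_mpix(243, 4)  # clock bump alone: 1.5x Flipper
hollywood_6 = peak_fillrate_mpix(243, 6)     # speculative 6 pipes: 2.25x
hollywood_8 = peak_fillrate_mpix(243, 8)     # speculative 8 pipes: 3x
print(flipper, hollywood_same, hollywood_6, hollywood_8)
```

The point being made: if a scene tanked to ~10 fps on Flipper, the 1.5× from clock alone still leaves it well short of 30 fps, while 6 or 8 pipelines plausibly would not.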
 
Portals, what portals? It was a while since I played through metroid prime, but I can't remember any portals I'm sad to say. If you give a hint of where I should look I can go check it out.

Peace.
 
We claim:

1. A graphics system, comprising a graphics chip having graphics processing circuitry and an embedded frame buffer for storing frame data prior to sending the frame data to an external location, wherein the embedded frame buffer is selectively configurable between the following pixel formats: RGB8 and 24 bit Z; RGBA6 and 24 bit Z; Three R5G6B5 color and 16 bit Z super-samples; and YUV 4:2:0; wherein in the YUV 4:2:0 configuration, a color buffer of the embedded frame buffer is partitioned to store 720×576 Y, 360×288 U and 360×288 V image planes for a YUV 4:2:0 frame and further, wherein the color buffer partitioning allocates as follows: 1024×640 8 bit Y image; 528×320 8 bit U image; and 528×320 8 bit V image.

2. In a graphics chip having pixel processing circuitry and an embedded frame buffer for storing pixel data prior to transferring the pixel data to an external destination, an improvement comprising: a reconfigurable embedded frame buffer which can be selectively configured to store any of the following pixel formats: 48 bit point sampled color and Z; 96 bit super-sampled color and Z; and YUV; wherein the embedded frame buffer is further selectively configurable to store the following 48 bit formats: RGB8 and 24 bit Z; and RGBA6 and 24 bit Z; and wherein the 96 bit super-sampled format includes three super-samples each having a R5G6B5 color and 16 bit Z format.

3. A method of using an embedded frame buffer in a graphics system, including the steps of: providing an embedded frame buffer that is selectively configurable to store image data in either RGB color format or YUV color format; providing an interface to the graphics system which controls the configuration of the embedded frame buffer; enabling the RGB color format to be configured as either a 48-bit point sampled color and Z format or a 96-bit super-sampled color and Z format; and defining the 96-bit super-sample format to include three super-samples each having a R5G6B5 color and 16 bit Z format.
http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=%2Fnetahtml%2FPTO%2Fsearch-adv.htm&r=18&f=G&l=50&d=PTXT&p=1&p=1&S1=(Nintendo.ASNM.+AND+%40AD%3E%3D20011101%3C%3D20060512)&OS=AN/Nintendo+AND+APD/11/1/2001-%3E5/12/2006&RS=(AN/Nintendo+AND+APD/20011101-%3E20060512)
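A quick sanity check of the buffer sizes those claims imply. The dimensions come straight from the patent text; the roughly 2 MB embedded-framebuffer capacity (enough for 640×528 at 48 bits per pixel) is the commonly cited Flipper figure and is used here only as an assumed bound:

```python
# Byte counts for the three EFB configurations named in the claims.
# Assumption: EFB capacity equals the commonly cited 640x528 @ 48bpp figure.

EFB_BYTES = 640 * 528 * 6                      # 2,027,520 bytes, ~1.93 MB

def bytes_48bpp(w, h):
    """RGB8 + 24-bit Z, or RGBA6 + 24-bit Z: 48 bits (6 bytes) per pixel."""
    return w * h * 6

def bytes_96bpp(w, h):
    """Three super-samples of R5G6B5 + 16-bit Z: 96 bits (12 bytes) per pixel."""
    return w * h * 12

def bytes_yuv420():
    """Plane allocations straight from claim 1: Y + U + V, 8 bits each."""
    return 1024 * 640 + 528 * 320 + 528 * 320

print(bytes_48bpp(640, 528))   # 2,027,520: point-sampled fills the EFB exactly
print(bytes_96bpp(640, 264))   # 2,027,520: supersampling halves the height to fit
print(bytes_yuv420())          # 993,280: well under capacity
```

Which is consistent with the known GameCube behavior: the supersampled antialiasing mode only worked at reduced vertical resolution, because three 96-bit samples per pixel at full height would not fit.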
 
Here's a genuinely interesting tidbit, which you can verify for yourselves if you have a Wii. A friend of mine just got a Wii and said that the slowdown in Metroid Prime 2 near the portals is completely gone. If you recall, the game's framerate just died right as you entered one of those things, dropping to something absurd like 10 fps. Likewise, the slowdown when you spam the ground with all 20 proton bombs in Rebel Strike is gone. In both instances, the screen was filled up by indirect-texturing effects, eating all the Gamecube's fillrate.

I noticed the same thing with gamecube games. I tried several and tried to slow them down and it didn't happen.
Additionally, someone ran the Datel GBA emulator on the wii in gamecube mode, and it ran at 1.5x the speed, indicating that the wii does not downclock to play cube games. (and in that case, if cube games aren't cycle dependent, did they really need to go with the same family architectures?)
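The reported 1.5× speed-up lines up exactly with the widely cited clock figures for the two CPUs:

```python
# If the Wii keeps Broadway at full clock in GameCube mode, a non-throttled
# emulator should run at exactly the ratio of the two commonly cited clocks.
gekko_mhz = 486      # GameCube CPU
broadway_mhz = 729   # Wii CPU, as widely reported
ratio = broadway_mhz / gekko_mhz
print(ratio)  # 1.5, matching the observed speed-up
```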
 
I missed that the above poster wrote Metroid Prime >2<. I didn't really see much slowdown there either, to be honest; it's a really well-coded game. I'll go check it out too some day soon.

I'm not sure we can assume the Wii runs at full Wii speed even in GC mode, given that Zelda's framerate is noticeably jerky as it is. I don't want to think what it's like on a humble Cube if the game actually runs faster on Wii.


Peace.
 

Zelda: Twilight Princess, you mean? I haven't really noticed any framerate problems so far, but that is a Wii game that runs in the Wii environment (and only supports widescreen on the Wii); I can't imagine Nintendo downclocking the system for that game.
 
Those Elebits videos are very nice, well above what we've seen on GC.
It would be very interesting to see the results if Valve tried to build something like HL2 from the ground up on Wii. On the XB they did a bad port, but the game still had the same scenes and some physics, although sometimes with very bad framerates. That makes me think one could get a very nice version of such a game on the Wii, and frankly, very little innovative (spec-exploiting) work has been done since that one (although I hope we will see a bit more). This gives me good hopes for future Wii games, although still not enough for a $250 console.

Anyway, it is really interesting that after so many games we still don't have any definitive clue about what is taking up the extra 33% and 96% more space (from the pessimistic estimates some pages ago) on the CPU and GPU.

If you read the earlier interview, one of the devs said they had trouble keeping up the frame rate with all the physics going on. I forget what their solution was; I think it was to run the game at 30 fps instead of 60 fps, or maybe just the physics loop, or to tone down the physics. It was some time ago when I read it.

There is a translation made by One of an article in PC Watch here, where the Elebits guys talked about physics. Although there isn't much more info.
 
Wii isn't significantly more powerful than Xbox. Hell, it might not be faster at all considering it's been said Wii has lesser shader capabilities. So I can pretty much see how HL2 would go.

I've played a ton of Zelda now and I have to ask about the texture filtering. It looks as if the thing isn't doing any mipmapping, because the distance has intense texture aliasing and swimming. I seem to recall this from RE4 as well. I also have to say that after playing Zelda at 480p on a friend's 50" plasma, I sure wish they'd added some always-on AA. I don't understand the non-interest in AA. And the lovely dithering in the desert was so bad that my technically non-savvy friend asked "why is the desert changing colors" as the ordered-dithered desert floor lit up at sunrise. LOL.

Hopefully "real" Wii games can do significantly better. At least fix the damned dithering.

Wii looks best on a big SDTV through component cables. Keep it off HDTVs unless you prefer sharpened aliasing and low-detail textures that are truly reminiscent of GeForce2/Radeon days. :) GeForce 2 in 16-bit color at 640x480. Display clarity is not the universal pleasure it is purported to be.
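On the missing mipmapping: the texture "swimming" described above is exactly what happens when a renderer samples the base texture level regardless of distance. A minimal sketch of the standard LOD rule (illustrative only, not GX API code):

```python
# Standard mip-level selection: pick level = log2 of how many texels fall
# under one screen pixel. Sampling the base level instead, when a pixel
# spans many texels, is what produces shimmering/"swimming" at a distance.

import math

def mip_level(texels_per_pixel):
    """LOD rule: log2 of the texel footprint per pixel, clamped at level 0."""
    return max(0.0, math.log2(max(texels_per_pixel, 1e-9)))

# Up close (1 texel per pixel) the base level is correct; far away, a pixel
# might span 8 texels and should use mip level 3 (texture reduced 8x):
print(mip_level(1))   # 0.0
print(mip_level(8))   # 3.0
```

Each mip level pre-averages the texture, so the single sample taken per pixel already represents the whole footprint; skip that and the sample jumps between unrelated texels as the camera moves.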
 


If the Wii has advanced VS/PS hardware, then you use that and design your code for it from the ground up; if you're using the old DX7-style pipeline, then of course the possibilities are limited.
And the current games are designed the DX7 way (excuse me, but I have to use the MS terminology).

If you build a full Wii game, you can use the 24 MB of 1T-SRAM for interesting effects, for example to move mipmaps quickly between the texture buffer and main memory.
 
Wii isn't significantly more powerful than Xbox. Hell, it might not be faster at all considering it's been said Wii has lesser shader capabilities.
I disagree. The CPU is probably not much faster for most stuff, but if nothing else the Wii has a lot more memory bandwidth, especially if you factor in the framebuffer BW. As that was one of the largest complaints about the original Xbox IIRC, it should help some.

At least fix the damned dithering.
HELL YES! I'm happy that I decided early on to connect my Wii to a somewhat older CRT. Give me blur over edge crawl and dithering any day. It's not like there's much fine texture detail I'm missing ;)
Radeon days? Much worse than that! I haven't seen dithering this bad ever since before I bought my Matrox G400 in - what? - 1999?

I still don't get the whole dithering thing. The memory's there, the bandwidth is there. Sure, it may not bother my parents all that much while playing Wii golf, but f*ck Nintendo, at least pretend that you have some minuscule amount of consideration for us gamers left after you slapped us in the face with the Wii specs already.
(Yes, I'm still bitter about it. Especially since, after a slow start, Zelda TP proves that they can still make absolutely indispensable games)
 
Wii isn't significantly more powerful than Xbox. Hell, it might not be faster at all considering it's been said Wii has lesser shader capabilities. So I can pretty much see how HL2 would go.

Why not? As we saw, there is a good amount of new silicon to be used; even if it can't do everything the XB can as fast as the XB can, it should be faster overall, so it should produce "better games" (from a tech POV). In some things I even think the GC is better than the XB (e.g. water in games).

I really think that in a year or two there will be games that, from a tech POV, surpass XB ones, just not by using the same kind of tech. For example, the E3 RS trailer and even the Rebirth demo were all made with the GC feature set and are way better than anything on the XB. That is why I said/think that if they built something like HL2 (or even the real game) from the ground up, we would see a pleasant surprise, instead of what we see in those almost-GC-only dev games.

I disagree. The CPU is probably not much faster for most stuff, but if nothing else the Wii has a lot more memory bandwidth, especially if you factor in the framebuffer BW. As that was one of the largest complaints about the original Xbox IIRC, it should help some.

Actually, IIRC, Faf said that Gekko in floating point is faster than the XCPU (mainly due to the memory architecture); if this one is 50% faster and has more silicon, it should easily surpass the XCPU in many things.
 
Actually, IIRC, Faf said that Gekko in floating point is faster than the XCPU (mainly due to the memory architecture); if this one is 50% faster and has more silicon, it should easily surpass the XCPU in many things.

Sure, but I don't think that's any substitute for XCPU's natural endowment...
 
Sure, but I don't think that's any substitute for XCPU's natural endowment...

Meaning? Gekko and the XCPU were pretty closely matched, and according to devs Broadway has about twice the performance of Gekko.
 