Can the Wii achieve the same level as the Xbox's Doom 3?

It seems most X360 games benefit from being rendered at 720p and downscaled to 480i/p. It's effectively supersampling, so a hypothetical SD Wii360 would need enough memory and bandwidth to handle 3x SSAA, a programmable shader GPU (so that devs won't have to put much effort into getting normal mapping and shaders working), a faster GPU for setting up more triangles, and faster still if it's gonna have X360-level physics and AI. If they'd go to the trouble of turning the Wii into an SD 360, they might as well build a whole new console.
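For reference, here's the quick pixel math behind that 3x figure, as a tiny C sketch:

```c
#include <stdio.h>

/* 720p rendered internally, 480p shown on screen: the pixel ratio is
   the effective ordered supersampling factor. */
int main(void)
{
    int rendered  = 1280 * 720;  /* 921,600 pixels */
    int displayed =  640 * 480;  /* 307,200 pixels */
    printf("effective SSAA factor: %dx\n", rendered / displayed); /* 3x */
    return 0;
}
```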
 
IMHO, NV2A has far more pixel shader power.
The register combiner in NV1x and NV2x is a 4-in/3-out ALU; it can do two simple ALU instructions per stage, like dot3-dot3 or dot3-mul and so on. But these paired instructions are not available through standard DirectX; OpenGL extensions are the only way to use them on PC. Fortunately, the Xbox's DirectX has special versions of the vertex shader and pixel shader (xvs and xps); xps contains these instructions plus the final-combiner instruction, so NV2A can unleash its true power on the Xbox.
NV2A has 4 pixel pipelines, and NV1x/NV2x can execute 2 combiner stages in 1 cycle, so in theory NV2A can do 4x2x2 = 16 dot3 instructions per cycle at most. That's far more powerful than Flipper, and than Hollywood if Hollywood is just an overclocked Flipper.
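To make the "two instructions per stage" point concrete, here is a minimal sketch of the dot3-dot3 pairing through the NV_register_combiners OpenGL extension mentioned above. The input assignments are illustrative, and a real program would also have to configure the final combiner; this just shows one stage producing two dot products at once:

```c
/* Assumes a GL context on NV hardware with NV_register_combiners and
   prototypes available (e.g. via GL_GLEXT_PROTOTYPES on Linux). */
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>

void setup_dual_dot3(void)
{
    glEnable(GL_REGISTER_COMBINERS_NV);
    glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

    /* RGB portion of stage 0: first dot product, A.B = tex0 . tex1
       (inputs expanded from [0,1] to [-1,1], as for normal maps). */
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                      GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                      GL_TEXTURE1_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);

    /* Second dot product, C.D = tex0 . primary color, same stage. */
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_C_NV,
                      GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_D_NV,
                      GL_PRIMARY_COLOR_NV, GL_EXPAND_NORMAL_NV, GL_RGB);

    /* Route both results out as dot products; the A*B + C*D sum is
       discarded because it's unavailable when either dot flag is set. */
    glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                       GL_SPARE0_NV,      /* receives A.B       */
                       GL_SPARE1_NV,      /* receives C.D       */
                       GL_DISCARD_NV,     /* no sum output      */
                       GL_NONE, GL_NONE,  /* no scale, no bias  */
                       GL_TRUE, GL_TRUE,  /* ab and cd are dot3 */
                       GL_FALSE);         /* no mux             */
}
```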
 
Of course Hollywood is nothing more than an overclocked Flipper. So in the end, having a little more pixel fillrate doesn't put Hollywood at the same level as NV2A.

Alstrong said:
NV2A's trilinear fillrate is worse.
Man, sorry I didn't see this yesterday, but this is absolutely false.
Are you comparing the Flipper T&L unit (outdated even when it was released in 2001) with the two vertex shaders the NV2A had?
Nintendo itself didn't give any trilinear fillrate spec for the GC because it knew that on that specific point it was, without any doubt, behind even the PS2.

The PS2 could push 70M polys/sec; being generous to the GC, let's say it could push 20 or even 30M polys/sec. Do you really think the Wii has 4 times the GC's trilinear fillrate? And even with 4 times that fillrate, it would stay below the Xbox's 125M polys/second!!!

Teasy said:
You seem to be trying to discount 4GB/s of memory bandwidth because an ARM9 processor has access to it for security/OS data. How much bandwidth do you think is used by a small OS (basically a fancy menu system) like the one on the Wii and by some security tasks? It's insignificant. Also, how can the Wii's extra bandwidth compared to the XBox be lost to these tasks when the XBox didn't even have its own OS...

The XBox has 6.4GB/s of memory bandwidth for everything. That includes OS, sound, game code, textures, geometry, everything. The Wii has a total of 20GB/s of memory bandwidth (not including the on-chip 1MB texture buffer), and you're telling me that the Wii OS and security are using that extra 13.6GB/s up?..
You're counting the frame buffer bandwidth when you say the Wii has 20GB/s of total bandwidth, and that shouldn't be counted, because those 3MB of eDRAM on the GPU are also cloned in main RAM.

The buffers still have to be stored in main RAM, then put into those 2MB of framebuffer the GC/Wii has, and finally drawn by the GPU.
It's the same on the Xbox: buffers stored in main RAM and then painted by the GPU, as on GC/Wii.

So you have to compare only the Wii's 7.9GB/s to the Xbox's 6.4GB/s, and remember that the Xbox OS was made by a professional OS company while Nintendo is a total beginner in this area.

But even in theoretical capabilities, the Xbox is far more powerful than the Wii.

Teasy said:
Having seen your comment in another post about the XBox having many "SD XBox 360" games, obviously the latter is true. The XBox had no games that looked as good as XBox 360 games at SD resolutions.
Of course it had. The Xbox's best games could pass for medium- or low-quality SD Xbox 360 games.
RalliSport Challenge 2, for example: play it and then tell me it can't be considered a true SD Xbox 360 game.
Ninja Gaiden Black is another example of that.

And now take the Wii's most powerful game and compare it to those games; it's obvious that the difference is huge.

Regards.

EDIT: We all know what the Wii really is, and this PROFESSIONAL PROGRAMMER REMINDS US THAT THE WII IS EVEN LESS POWERFUL THAN THE iPhone.
http://www.telltalegames.com/forums/showthread.php?t=10909

Not only inferior to the first Xbox, but also inferior to the iPhone. This time Nintendo surpassed their own limits...

Some interesting quotes from a PROFESSIONAL programmer who worked on the Wii about how it compares to the iPhone:
The extra RAM is really what makes the difference (the iPhone has 128MB of RAM). Of the Wii's 88MB of RAM, a not-insignificant chunk is always being used by the OS and is unavailable to developers (maybe more than the 32MB the X360 OS uses? Of course, it has to be more than that). The Wii's RAM is also split into two separate banks, each of which has different read/write characteristics, and you can't really spill from one to the other if you need to. So if you decide to use the 24MB bank then you can't use the 64MB bank, and the same if you decide to use the part of the 64MB bank that the OS is not using. In the end, you lose one of the two banks of memory, and of course the bandwidth for games is only 3.9 or 4GB/s depending on which bank you're using.
The italic text is my own thoughts about what he is saying.

The iPhone is responsible for 153,600 pixels, whereas the Wii is responsible for 307,200 pixels. There's a lot of work that goes into each pixel, and the Wii is outputting twice the content, requiring twice as much work and consuming bigger textures, geometry databases, etc. So in the end, the iPhone needs to do less work, yet has similar resources to the Wii.
So what he means is that if the iPhone and Wii worked at the same resolution, then the Wii would probably have better graphics.
But the Wii is working with twice the amount of pixels the iPhone is, and this is why the iPhone can show better graphics (although at half the resolution).

But in terms of CPU and GPU, they are on a similar level.
 
The PS2 could push 70M polys/sec; being generous to the GC, let's say it could push 20 or even 30M polys/sec. Do you really think the Wii has 4 times the GC's trilinear fillrate? And even with 4 times that fillrate, it would stay below the Xbox's 125M polys/second!!!

Paper specs are nice and all, but could you please explain to me why the two games with the highest poly count last generation (Rebel Strike and RE4) were both GCN titles? Surely if even the Wii couldn't hope to push as much geometry as either the PS2 or Xbox, this would simply not have been possible? These games weren't light on effects either; in fact it's safe to say that Rebel Strike matches up to anything on Xbox.

I won't even entertain your notion that the Wii can't match up to the iPhone.
 
The italic text is my own thoughts about what he is saying.
What on earth makes you think Wii could be using as much RAM for OS as XB360?! :oops: XB360 has in-game menus with many, many features. Wii has...nothing. Playing a game isolates the user from the interface. There's no need for a menu interface graphics buffer, audio buffers for chat, space reserved for personal music playlists. Nothing! 10% is a 'not insignificant' amount which would be 8MBs. As console programmers will tell you, every byte matters, and even 4 MBs would be considered significant by some. But your speculation to 32+ MBs is way, way, way off the mark! (unless Nintendo are bizarrely insane)
 
What on earth makes you think Wii could be using as much RAM for OS as XB360?! [...]
While playing you have access to an in-game menu when you press the Home button.
About the OS, we don't know how much memory is spent on it, but we know that the security takes 12MB all by itself (and the Wii's security is not exactly exceptional).
Microsoft has been making OSes for years, while Nintendo hasn't.
All this OS experience is what makes the X360 OS much better than the Wii's, while still using less memory than Nintendo's.

The Wii OS can't do what Microsoft's OS does, but in terms of computational cost it is at least at the same level as the X360 OS, or even higher.

Regards.
 
While playing you have access to an in-game menu when you press the Home button.
What happens when you press the Guide button on the 360?
Microsoft has been making OSes for years, while Nintendo hasn't.
All this OS experience is what makes the X360 OS much better than the Wii's, while still using less memory than Nintendo's.
Microsoft has been making notoriously bloated and inefficient OSes for years. Just look at the slow adoption of Vista, and the record-breaking pre-orders of the more efficient Windows 7, for proof of that.

The Wii OS can't do what Microsoft's OS does, but in terms of computational cost it is at least at the same level as the X360 OS, or even higher.
There's no way you can say that. Evidence to the contrary is that the Wii is pushing a lower res, so the OS graphics can consume less memory and bandwidth. When you hit the Home button, it doesn't give you access to anywhere near as many settings as the 360 Guide. The 360 also has an MP3 player going for custom playlists, full voice communication, friends lists, achievements, IM, and lots of other functions. The Wii Home menu has a battery gauge, rumble and speaker volume controls, and a button to exit the current game/app and load the main channel interface into memory.
 
While playing you have access to an in-game menu when you press the Home button.
About the OS, we don't know how much memory is spent on it, but we know that the security takes 12MB all by itself (and the Wii's security is not exactly exceptional). [...]

So you honestly believe as much as half of the Wii's RAM is used up by its OS whilst ingame? Rightio.
 
The Home button doesn't call a running OS. It's just a tiny bit of code and graphics that is a mandatory part of every game, like a pause menu. That's why some older games' Home menus look a bit different from the newer ones'. When the Wii boots a game, the startup software is eschewed completely in favour of the game. That's the main reason the Wii reboots every time you go back to the startup screen.

I simply don't believe what the alleged developer says on the Telltale forum about the use of RAM on the Wii, i.e. the stuff about only being able to use one pool or the other at a time. Of course you are able to use both! You just have to design the software around the two pools, so that stuff that isn't much affected by latency goes in the GDDR3 RAM and stuff that benefits from speed goes in the 1T-SRAM.
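For illustration, here's a toy sketch of what "designing around the two pools" could look like. The names, base addresses, and fallback policy are my own assumptions for the example (loosely based on the 24MB/64MB figures quoted in this thread), not any real SDK API:

```c
#include <stddef.h>
#include <stdint.h>

/* One bump arena per bank. Sizes/addresses are illustrative only. */
typedef struct { uint8_t *base; size_t size; size_t used; } Arena;

static Arena mem1 = { (uint8_t *)(uintptr_t)0x80000000u, 24u << 20, 0 }; /* 1T-SRAM: low latency  */
static Arena mem2 = { (uint8_t *)(uintptr_t)0x90000000u, 64u << 20, 0 }; /* GDDR3: high capacity  */

typedef enum { HINT_LATENCY, HINT_CAPACITY } AllocHint;

static void *pool_alloc(Arena *a, size_t n)
{
    n = (n + 31) & ~(size_t)31;           /* 32-byte align for cache lines */
    if (a->used + n > a->size) return NULL;
    void *p = a->base + a->used;
    a->used += n;
    return p;
}

/* Latency-sensitive data (work buffers, audio mix) prefers MEM1;
   bulk data (streamed textures, level geometry) prefers MEM2.
   Either way, both banks stay usable at once. */
void *game_alloc(size_t n, AllocHint hint)
{
    Arena *first  = (hint == HINT_LATENCY) ? &mem1 : &mem2;
    Arena *second = (hint == HINT_LATENCY) ? &mem2 : &mem1;
    void *p = pool_alloc(first, n);
    return p ? p : pool_alloc(second, n); /* fall back to the other bank */
}
```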

Overall, neither the TT guys nor their games give the impression of very technically skilled programming.

The iPhone's hardware isn't even near the Wii in performance. The ARM processor and MBX Lite are made for mobile use, where power consumption is the main concern. And most importantly, the iPhone does have an OS.

About the Xbox vs. Wii comparison, let's just do a real quick recap.

A "G3-plus" CPU vs. a Pentium 3 derivative.
Clock for clock, the G3 is much faster than the P3 in most multimedia tasks, sometimes by as much as 2x. The G3 Macs showed that beyond doubt.

24MB more memory, and overall much faster memory.

Far greater bandwidth: the Xbox had 6.4GB/s for everything, including the framebuffer. The Wii allegedly has ~4GB/s of bandwidth just for the GPU and CPU.

Then you have the large buffers and caches on either chip (256KB L2 vs. 128KB on the Xbox) with huge bandwidth.

The flash RAM is not to be forgotten. It can be (and has been) used as a very fast temporary buffer.

Do a search for ERP and TEV. That will explain some of the differences between the TEV combiner and a pixel shader. As I understand it, one of the main difficulties with using TEV for effects beyond the EMBM on the GC was that it would place an extra burden on the CPU. I can only conclude that the challenge has been made somewhat easier by the upgrade of the CPU. Now the CPU has the resources and the bandwidth to take on much more of the geometry transformation work.
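For a flavour of how TEV combining is driven in practice, here's a hedged sketch in the style of the homebrew libogc GX API (the texture assignments are illustrative; real EMBM-style effects would go further and use the indirect-texture stages):

```c
#include <ogc/gx.h>

/* Two chained TEV stages: stage 0 modulates the base texture by the
   rasterized (lit vertex) color, stage 1 multiplies in a second map,
   e.g. a baked light or detail texture. */
void setup_two_stage_tev(void)
{
    GX_SetNumTevStages(2);

    /* Stage 0: base texture * vertex color */
    GX_SetTevOrder(GX_TEVSTAGE0, GX_TEXCOORD0, GX_TEXMAP0, GX_COLOR0A0);
    GX_SetTevOp(GX_TEVSTAGE0, GX_MODULATE);

    /* Stage 1: previous result * second texture (no new raster color) */
    GX_SetTevOrder(GX_TEVSTAGE1, GX_TEXCOORD1, GX_TEXMAP1, GX_COLORNULL);
    GX_SetTevOp(GX_TEVSTAGE1, GX_MODULATE);
}
```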
 
Freezamite

You're mixing up fillrate with T&L power. Alstrong was talking about trilinear filtered fillrate, which has nothing to do with transform and lighting. Hollywood's pixel fillrate is higher.

The data you're using to compare T&L is a mix of real-life data (for GC) and theoretical data (for XBox and PS2), which is hardly fair. The XBox could never push 125M polys per second in-game; it's a theoretical maximum. The most it could realistically achieve in a real game would be around 30M.

Also, I bet you won't find a single person who agrees that the XBox produced games that were like a 360 game in SD, unless maybe you're comparing it to the worst-looking 360 games possible, in which case GC and Wii have also done that.

You're counting the frame buffer bandwidth when you say the Wii has 20GB/s of total bandwidth, and that shouldn't be counted, because those 3MB of eDRAM on the GPU are also cloned in main RAM.

The buffers still have to be stored in main RAM, then put into those 2MB of framebuffer the GC/Wii has, and finally drawn by the GPU.
It's the same on the Xbox: buffers stored in main RAM and then painted by the GPU, as on GC/Wii.

You don't understand how a frame is rendered. There's no need to send a frame to be stored in a second buffer before displaying it. On both the XBox and GC/Wii, the frame is displayed from main memory. However, unlike the XBox, the GC/Wii doesn't render the frame in main memory. All rendering is done first in the 2MB buffer inside the GPU. Once the frame is complete, that finished frame is sent to main memory to be displayed.

That means all the bandwidth-intensive work that goes into rendering a frame (which can take anywhere from several hundred MBs of bandwidth to GBs of bandwidth) never hits GC/Wii's main memory. I mean, why do you think the Wii even has an internal frame/Z buffer with 11.5GB/s of bandwidth in the first place? It's there to deal with the bandwidth requirements of rendering a frame, so main memory doesn't have to.
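Sketched as C (the function names are hypothetical stand-ins, not a real SDK API), the frame flow being described looks like this; note which step is the only one that touches main memory:

```c
/* Hypothetical helpers standing in for the real hardware operations. */
extern void draw_scene_into_efb(void); /* all overdraw/Z/blend traffic
                                          stays in the 2MB on-die EFB  */
extern void copy_efb_to_xfb(void *xfb);/* one finished frame copied out
                                          to main RAM per frame        */
extern void set_scanout(void *xfb);    /* video output reads the XFB   */
extern void wait_vsync(void);

void render_loop(void *xfb[2])
{
    for (unsigned frame = 0; ; frame++) {
        draw_scene_into_efb();            /* GBs/s of traffic, on-die    */
        copy_efb_to_xfb(xfb[frame & 1]);  /* ~640*480 pixels, once/frame */
        set_scanout(xfb[frame & 1]);
        wait_vsync();
    }
}
```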

Not only inferior to the first Xbox, but also inferior to the iPhone. This time Nintendo surpassed their own limits...

That's one of the most bizarre things I've ever heard on Beyond3D. Should we start comparing the Wii's best-looking games to the iPhone's best now? :runaway: If you're going to say things based on nothing other than something someone said on the internet, then at least base it on a real developer; that guy's a WiiWare developer.

So what he means is that if the iPhone and Wii worked at the same resolution, then the Wii would probably have better graphics.
But the Wii is working with twice the amount of pixels the iPhone is, and this is why the iPhone can show better graphics (although at half the resolution).

But in terms of CPU and GPU, they are on a similar level.

Yeah, the Wii has to output twice as many pixels as the iPhone. Then it's handy for the Wii that it can output over 7 times as many pixels per second as the iPhone. Granted, the iPhone uses a PowerVR-based GPU, but it's still not going to have more than half the effective fillrate of Hollywood, even in the most complex games on the system. Are we really going to discuss the Wii's tech specs vs. an iPhone on Beyond3D? Do you actually know what the iPhone's specs are? Because I do, and the comparison with the Wii isn't pretty.
 
Comparing the Wii to the iPhone is doubly silly in a Wii-compared-to-XBox thread! We can keep our discussion solely to the performance of these two machines. Comparisons based on "looks better" are not appropriate here, and everyone should stick to the facts to make valid comparisons.
 
Of course Hollywood is nothing more than an overclocked Flipper. [...]

Nintendo itself didn't give any trilinear fillrate spec for the GC because it knew that on that specific point it was, without any doubt, behind even the PS2.

The PS2 could push 70M polys/sec; being generous to the GC, let's say it could push 20 or even 30M polys/sec. Do you really think the Wii has 4 times the GC's trilinear fillrate? And even with 4 times that fillrate, it would stay below the Xbox's 125M polys/second!!! [...]

I'm not trying to be insulting here or anything, but you obviously have a severe lack of knowledge about how these technologies work. You're confusing how many polygons can be transformed/processed per second with trilinear filtering. Trilinear filtering involves the textures that are pasted onto the geometry. Not that the two are entirely independent, but if a game is limited by its poly count, trilinear filtering can still be applied in abundance as long as there is enough fillrate.

Please read up on how a graphics pipeline works before presenting yourself as an authority on the subject. Again, I'm not trying to be insulting, but you clearly don't know what you're talking about.
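For anyone mixing the two up, here's a minimal self-contained sketch of what trilinear filtering actually computes per pixel (square row-major mip levels assumed for brevity). Nothing in it depends on how many polygons were transformed; it's purely a texture-sampling cost:

```c
typedef struct { float r, g, b; } Color;

#define LERP(a, b, t) ((a) + (t) * ((b) - (a)))

/* One bilinear tap from a w-by-w, row-major mip level. */
static Color bilerp(const Color *mip, int w, float u, float v)
{
    float x = u * (float)(w - 1), y = v * (float)(w - 1);
    int x0 = (int)x, y0 = (int)y;
    int x1 = (x0 + 1 < w) ? x0 + 1 : x0;
    int y1 = (y0 + 1 < w) ? y0 + 1 : y0;
    float fx = x - (float)x0, fy = y - (float)y0;
    Color c00 = mip[y0 * w + x0], c10 = mip[y0 * w + x1];
    Color c01 = mip[y1 * w + x0], c11 = mip[y1 * w + x1];
    Color out = {
        LERP(LERP(c00.r, c10.r, fx), LERP(c01.r, c11.r, fx), fy),
        LERP(LERP(c00.g, c10.g, fx), LERP(c01.g, c11.g, fx), fy),
        LERP(LERP(c00.b, c10.b, fx), LERP(c01.b, c11.b, fx), fy),
    };
    return out;
}

/* Trilinear = two bilinear taps from adjacent mip levels, blended by
   the fractional LOD: eight texel reads per pixel. Caller must keep
   lod in range so a finer/coarser pair exists (0 <= lod < levels-1). */
static Color trilinear(const Color *const *mips, int base_w,
                       float u, float v, float lod)
{
    int   lo = (int)lod;
    float f  = lod - (float)lo;
    Color a  = bilerp(mips[lo],     base_w >> lo,       u, v);
    Color b  = bilerp(mips[lo + 1], base_w >> (lo + 1), u, v);
    Color out = { LERP(a.r, b.r, f), LERP(a.g, b.g, f), LERP(a.b, b.b, f) };
    return out;
}
```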
 
Well, as far as Xbox games that don't look too far off from 360 games in SD... at least in style and graphics technology implementation... you do have Chronicles of Riddick, the Far Cry games (though the extremely low shadow resolution looks horrendous), Halo 2, many of the racing games... Sure, at HD resolutions you'd notice how crappy they really look, but many of the shader techniques used today are there, just at much lower quality.

And as for the Wii, there is one game that I'd love to see in HD (and we have, via emulator), and that is Super Mario Galaxy, which owes its graphical prowess as much to advanced technique (for the Wii, of course) as it does to artistry. Even Rogue Squadron II and RE4 would be nice to see at 720p. However, I think at those resolutions we might start to notice the relatively low fillrates for this day ;) It's just funny how Freezamite completely ignores how relatively clean yet very detailed a lot of GC games look. Despite the lower bandwidths, I hear the GC was relatively efficient even against the brute power of the Xbox, and while I don't think all GC developers took advantage of this, many did, and ran with what the GC was made to do despite its limitations. With the Wii, many of those limitations have pretty much been eliminated relative to the theoretical performance of the system, so a lot more comes down to actual rendering theoreticals, and not memory and memory-bandwidth limits. While I have no problem accepting the Xbox as still more capable than the Wii on many fronts, I think more GC games made me go wow than Xbox titles when it came to graphics technology and artistry. It's just too bad Wii devs don't really have the incentive to push the visuals close to what the GC could do.
 
I think more GC games made me go wow than Xbox titles when it came to graphics technology and artistry. It's just too bad Wii devs don't really have the incentive to push the visuals close to what the GC could do.
I think that's the trouble with a lot of development; mindset and attitude.
If you don't believe in something and/or don't understand the underlying concept and rationale, you won't get the best out of it. Doubly so if you are pressed for time.
 
People say that the Wii's G3 is faster than the P3 the Xbox had because, on a Mac, some multimedia applications ran faster than on a PC.
But that wasn't because of the G3; that was because of the OS.
The first Xbox's OS was made specially for the console, so we don't know how well a G3 would do against a P3 in that situation.

As for what I know about GPUs, I admit I am not a professional, but things like Doom 3, with tons of bump mapping and complex lighting, can't be done on the Wii because of the ultra-low-end T&L unit the Wii has.

Regards.
 
I think that's the trouble with a lot of development; mindset and attitude.
If you don't believe in something and/or don't understand the underlying concept and rationale, you won't get the best out of it. Doubly so if you are pressed for time.

Well, like I said, the incentive for many devs isn't there. I don't think families really give a crap about bump mapping, reflections, and dynamic shadowing in their cheaply made mini-golf game. As long as the game's graphics are usable and clean, I guess that's what really matters in that situation.
 
People say that the Wii's G3 is faster than the P3 the Xbox had because, on a Mac, some multimedia applications ran faster than on a PC.
But that wasn't because of the G3; that was because of the OS.
Mod edit : This sort of sarcasm is very unbecoming in the Technology forum. We don't need it, and would rather facts and realities be debated.
The first Xbox's OS was made specially for the console, so we don't know how well a G3 would do against a P3 in that situation.
The Wii and GC OSes are specifically made for a console housing a CPU that's tailored for console gaming functions. If it stands to reason for you that a specialized OS would be more efficient, why wouldn't specialized hardware be? The only thing special about the Xbox's CPU is that the L2 cache was cut in half from a real P3.

As for what I know about GPUs, I admit I am not a professional, but things like Doom 3, with tons of bump mapping and complex lighting, can't be done on the Wii because of the ultra-low-end T&L unit the Wii has.
Regards.
You should take a look at this thread, where the Wii is doing tons of bump mapping and "complex lighting" in a homebrew hobbyist project.
 
I always thought it was funny how some people who were very serious about their Xbox just hated having their treasured box's CPU called a Celeron. :)
 
The Wii and GC OSes are specifically made for a console housing a CPU that's tailored for console gaming functions. If it stands to reason for you that a specialized OS would be more efficient, why wouldn't specialized hardware be? The only thing special about the Xbox's CPU is that the L2 cache was cut in half from a real P3.


You should take a look at this thread, where the Wii is doing tons of bump mapping and "complex lighting" in a homebrew hobbyist project.
I saw that thread, and there is nothing special about doing that. The first Quake is a very simple game; even with all that bump mapping, it can't be compared to a Riddick or anything similar.

As for the Wii and GC OSes, even though they were made specifically for a console, they weren't made by a professional but by a beginner at OS development.
Maybe while playing it doesn't take resources from the CPU, but it still takes a lot of RAM (12MB at minimum).
 
As for the Wii and GC OSes, even though they were made specifically for a console, they weren't made by a professional but by a beginner at OS development.
Maybe while playing it doesn't take resources from the CPU, but it still takes a lot of RAM (12MB at minimum).

Nintendo aren't professionals anymore? :?:
 