Nintendo's hardware philosophy: Always old, outdated tech?

IMHO, Kinect is not competing with the Wii; it's replacing it in households. Faster, harder, better, stronger. It's really almost a new console on the market that happens to use an X360 as one of its components.
How's that not competing? That's the point of competition, and they did it tremendously well. It's either Wii or Kinect in households, and Kinect dealt a large blow to Nintendo.
 
What's your point? I never said they came up with it. That was never "the" or even "a" point in what I said.

The point was that they made their product specifically to compete with the Wii as a matter of fact, and that in doing so, they were aiming directly at the Wii's market.

In other words, I was pointing out that there would be no Kinect or Move right now if not for Nintendo's success, and that Sony and Microsoft made their devices with the intention of leeching off of its immediate success. Nothing more, nothing less.

I don't have a problem with different companies competing in the same market. Why do you consider this a bad thing?

There is so much wrong here, but I fear the purposeful misconstruing that would be applied by certain parties if I followed through with an explanation.

Don't be afraid. Be strong. Tell us all what's wrong.

...??? Of course they are different; they are two different things made by two different companies at two different times. Microsoft's design was not the result of mere influence. Sega and Microsoft had been in an unofficial partnership since the Dreamcast.

MS and Sega had an *official* partnership with the Dreamcast. There was even a Windows logo on the Dreamcast. Almost all the Xbox design work was done before the DC was canned.

That is why so many Dreamcast games used Microsoft's Windows CE. Microsoft even bragged about the ease of porting PC games to the Dreamcast that came as a result. There was a lot of Windows-related tech implemented in the system.

Not many DC games actually used Win CE (at least not as a proportion of all DC games). Most used the Katana dev stuff. There was no Windows tech implemented in the DC hardware that I'm aware of.

The design of the Xbox was taken directly from the Dreamcast. Even Yu Suzuki made note of this when making Shenmue 2 for the Xbox. Both companies were fine with it.

The DC and Xbox were so different that it's simply not possible that the design of the Xbox was taken directly from the DC. I'm a big fan of Yu Suzuki - he's the greatest game designer of all time - so I'd be interested in any links you could provide about his views on the Xbox being a direct copy of the Dreamcast. I don't understand why he would say this given how different the Xbox was from the DC.

Strange thing. There was no suggestion of this from me either. The thought of them delaying their system for anything was never a factor at all.

So they aren't going to delay, they're already going for the best tech they can (Sony's stated goal), but they're going to see what Nintendo has to offer and then release something with higher specs (your prediction)?

Sounds like you're expecting something so potent from Nintendo that it'll outperform the device that Sony are planning to release sometime afterwards, and that this will spur Sony on to release something with higher specs.

This would require Sony being caught out and then having to redesign so they can be more powerful. Sounds like that would cause a significant delay.

I never even implied anything of the sort. You missed the intended point of what I wrote entirely. Please don't alter the context of what I say and then try to make an argument against it as if I said it. I will not even bother to reply to such a thing again.

I know this reply was to mrcorbo, but I really have to ask you how a company can compete in the same market as an established competitor without "copying a concept" of what that market wants?
 
GameCube's GPU, Flipper, was pretty advanced considering it was designed mainly in 1999.

[Image: the Flipper chip]


GameCube 101: Graphics
Learn all about GameCube's graphics chip with insightful quotes from ATI's Greg Buchner.
January 16, 2001


You've seen the demonstrations, and you've heard the developer hype -- GameCube's graphical abilities are astounding. Even when compared to Xbox (see the Malice demo), GameCube is a shining example of great next-generation graphics technology. The people responsible for these technological achievements can be found at Nintendo Co. LTD, Nintendo Technology Development, NOA, ArtX, ATI, NEC, and MoSys, to name a few. Many people say that a picture is worth a thousand words, but when one compares still screenshots of next-generation software it's hard to walk away with any appreciation of the technology behind it. In consideration of that, we're bringing you the best breakdown of GameCube's graphics hardware that we can offer. Some of it will be repetition of old news and some of it will be eye-opening news, but hopefully at the very least most of it will be intriguing. We want you to come away with an appreciation for the power, efficiency, and, most of all, balance that encompasses not just GameCube's graphics chip, but the entire system.

Let's start by looking at some of the best-looking titles on each respective system: Metal Gear Solid 2, Legend of Zelda, and Malice. Certainly, they all look wonderful and do a great job of showing off the potential of each system. Information on details such as texture usage, polygon performance, and lighting can also be discerned. What is often not understood by the average consumer is the technology behind it and how easy (or hard) it may be to use.

[Screenshots: Metal Gear Solid 2, Legend of Zelda (GameCube demo), Malice]


So, what makes the visuals in the pictures above so exceptional? In particular, what lies behind the high-quality graphics in the GameCube (this is IGNcube, after all) Legend of Zelda demo? There's no way to pinpoint some boiling core of power in GameCube because the entire system holds hands to get the performance it does, but if we had to pick the "hottest" spot it would definitely be the graphics chip, codenamed Flipper.



Symbols Chart

NOTE: The block diagram shown above is NOT to scale. Each respective symbol only represents a general layout.
eFB: Embedded Frame Buffer
eTM: Embedded Texture Memory
TF: Texture Filtering
TC: Texture Coordinate generation
TEV: Texture Environment
PEC: Unknown, includes motion compensation, data filtering to go outside of chip
C/Z: Color/Z-buffer
RAS0/1: Rasterizing Units 0/1
RAS2: Rasterizing Unit 2
SU: Triangle Setup
CP: Command Processing
DSP: Digital Signal Processor (sound)
NB: "North Bridge"
XF: Transform, geometry calculations


Flipper Specifications


•Developed primarily by ATI, ArtX, NCL, and NTD
•Manufacturing Process: 0.18 microns NEC Embedded DRAM Process
•Clock Frequency: 202.5MHz
•Embedded RGB Buffer: Approx. 1MB, Sustainable Latency: 5ns (1T-SRAM)
•Embedded Z Buffer: Approx. 1MB, Sustainable Latency: 5ns (1T-SRAM)
•Embedded Texture Cache: Approx. 1MB, Sustainable Latency: 5ns (1T-SRAM)
•Estimated Internal Bandwidth: 20-25GB/second (peak)
•Texture Read Bandwidth: 12.8GB/second (peak)
•Main Memory Bandwidth: 3.2GB/second (peak)
•Color, Z Buffer: Each is 24bits
•Image Processing Function: Fog, subpixel anti-aliasing, HW light x8, alpha blending, virtual texture design, multi-texture mapping/bump/environment mapping, MIPMAP, bilinear filtering, trilinear filtering, anisotropic filtering, real-time texture decompression (S3TC), and more
•Simultaneous textures: 8
•Estimated raw display capability: 90 million polygons per second
•Actual display capability: 6 million to 12 million polygons per second (display capability assuming actual game with complexity model, texture, etc.)
•Other: Real-time decompression of display list, HW motion compensation capability
(The following sound related functions are all incorporated into the system LSI)
•Sound processor: Special 16bit DSP
•Instruction memory: 8KB RAM + 8KB ROM
•Data memory: 8KB RAM + 4KB ROM
•Clock frequency: 101.25MHz
•Minimum number of simultaneously produced sounds: ADPCM: 64ch
•Sampling frequency: 48kHz
This is the Flipper chip. Made up of 51 million transistors, it is just under 110mm2 of pure engineering genius. The chip is responsible for creating high-polygon models (the building blocks of 3D), texturing large worlds (the "paint" for the polygons), adding graphical effects ("paint" accents), and real-time lighting, and it includes a separate chunk dedicated to sound, but that's another story. Some of the hardware features that allow this are embedded frame and texture memory, S3 Texture Compression, hardwired filtering effects, and a geometry and lighting unit. By this time anyone who doesn't involve themselves in the video hardware circle is probably getting overwhelmed. As such, let's start slow and try to understand some of the basics of the Flipper chip before moving into the more tech-heavy details.

Laying the Foundation
As is mentioned above, Flipper is composed of 51 million transistors. In essence a transistor is a microscopic gate that opens or closes a circuit via electricity, which is the heart of how computer devices "talk" to each other. In layman's terms open can mean "yes" and closed can mean "no." These transistors compose microprocessors, which is exactly what the Flipper chip is. At its base, though, the chip is just a very tiny set of aluminum wires. How can you fit 51 million of these "gates" into a 10.5mm (about .4") square? Well, these gates are very tiny because of the way they are carved out in the manufacturing process called fabrication. Flipper is manufactured in a .18 micron process, which is what makes it so small. In comparison, the PS2 graphics chip has 43 million transistors and is fabricated in a .25 micron process, yielding a chip size of 279mm2 -- over twice the size of Flipper. Both chips are slated to be manufactured at smaller sizes as time goes by.

Looking on the Inside
Now that you understand what Flipper is composed of, let's consider some of the components that are part of its architecture. It should be noted that every chip has a different architecture, just as houses do. In that sense, Flipper certainly has its share of unique rooms. Some of the biggest square footage behind its performance is the embedded 1T-SRAM developed by MoSys. ATI's Vice President of Research and Development, Greg Buchner, claims, "If we used DRAM, we would have been able to do a lot less. [1T-SRAM has] much less restrictions and much higher performance for the same clock rate. Everything about it is better." By utilizing this embedded memory, developers can keep information close to the graphics chip without having to access outside memory. Using the metaphor of houses again, think of it as storing your personal belongings in your own garage versus having to drive down to the local rent-and-store facility. Access to that embedded memory is that much quicker.

Another important part of Flipper's design is the S3 texture compression. There is physical metal in the chip that decompresses stored textures at a 6:1 ratio. So, while they might take up 20MB of memory, they could theoretically be equivalent to 120MB of textures. The difference is astounding, and more than evident if you watch some of the GameCube demonstrations. Furthermore, because S3TC is part of the chip and not a software program, there is no overhead, meaning there's no slowdown in the processing pipeline. In contrast, the PlayStation 2 would have to run a software program to try to mimic the same process, which would greatly hinder the system's performance. The result: GameCube gets huge textures for a very small price. Say goodbye to those blurry N64 graphics and make way for crisp, detailed visuals.
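For context, the 6:1 figure follows directly from the S3TC (DXT1-style) block layout: a 4x4 block of 24-bit RGB texels (48 bytes) is stored in just 8 bytes -- two 16-bit endpoint colors plus sixteen 2-bit palette indices. Below is a minimal sketch of decoding one such block; it is purely illustrative (the article does not detail GameCube's exact texture format or hardware path) and assumes the common opaque DXT1 case.

```c
#include <stdint.h>

/* Minimal sketch of decoding one S3TC/DXT1-style 4x4 block.
 * 48 bytes of raw 24-bit RGB become 8 compressed bytes: a 6:1 ratio.
 * Illustrative only; the actual console does this in hardware. */
typedef struct { uint8_t r, g, b; } RGB;

static RGB expand565(uint16_t c)            /* RGB565 -> RGB888 */
{
    RGB out;
    out.r = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);
    out.g = (uint8_t)(((c >> 5)  & 0x3F) * 255 / 63);
    out.b = (uint8_t)(( c        & 0x1F) * 255 / 31);
    return out;
}

void decode_dxt1_block(const uint8_t block[8], RGB out[16])
{
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));
    RGB pal[4];
    pal[0] = expand565(c0);
    pal[1] = expand565(c1);
    /* Two interpolated palette entries (opaque case, c0 > c1). */
    pal[2].r = (uint8_t)((2 * pal[0].r + pal[1].r) / 3);
    pal[2].g = (uint8_t)((2 * pal[0].g + pal[1].g) / 3);
    pal[2].b = (uint8_t)((2 * pal[0].b + pal[1].b) / 3);
    pal[3].r = (uint8_t)((pal[0].r + 2 * pal[1].r) / 3);
    pal[3].g = (uint8_t)((pal[0].g + 2 * pal[1].g) / 3);
    pal[3].b = (uint8_t)((pal[0].b + 2 * pal[1].b) / 3);

    for (int i = 0; i < 16; i++) {
        /* Four 2-bit palette indices per byte, one per texel. */
        int idx = (block[4 + i / 4] >> ((i % 4) * 2)) & 0x3;
        out[i] = pal[idx];
    }
}
```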

Next, one should consider the effects that are applied to those textures. There are a great number of effects that can be used to make objects look shiny, smooth, dirty, bumpy, etc. GameCube's graphics chip has an especially powerful advantage in this area. Not only do some effects come at a small cost in performance, but others have never been implemented in consumer graphics before. As ATI's Greg Buchner put it, "There's some cool sh-- in this chip." Exactly what that cool stuff is and how it works is beyond public knowledge. Below we'll offer some insight into some of Flipper's functions, including trilinear filtering, anisotropic filtering, subpixel anti-aliasing, multi-texture mapping/bump/environment mapping, virtual texture design, alpha blending, and HW light x8.


•Trilinear Filtering
As you get close to a 3D object it takes up more of the screen; therefore, the textures on it take up more of the screen. The same texture now covers more pixels (the tiny squares your screen is made up of). Each of those pixels is "colored" by looking at the four adjacent texels (texture pixels) and blending them together; trilinear filtering additionally blends between the two nearest mipmap levels of the texture. Some image quality is lost in the process, but it is a more accurate method than was used in past consoles.
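As a rough illustration of the idea (not Flipper's actual hardware implementation): bilinear filtering blends the four texels nearest the sample point by their fractional position, and trilinear filtering blends two such bilinear samples taken from adjacent mipmap levels. The sketch below assumes a hypothetical sample_texel() helper that fetches a single grayscale texel from a given mip level.

```c
/* Minimal sketch of bilinear + trilinear filtering on a grayscale texture.
 * sample_texel() is a hypothetical helper returning a texel from one mip level. */
extern float sample_texel(int level, int x, int y);

float bilinear(int level, float u, float v)
{
    int   x0 = (int)u, y0 = (int)v;
    float fx = u - x0, fy = v - y0;
    /* Blend the four adjacent texels by fractional position. */
    float top = sample_texel(level, x0, y0)     * (1 - fx) + sample_texel(level, x0 + 1, y0)     * fx;
    float bot = sample_texel(level, x0, y0 + 1) * (1 - fx) + sample_texel(level, x0 + 1, y0 + 1) * fx;
    return top * (1 - fy) + bot * fy;
}

float trilinear(float u, float v, float lod)
{
    int   level = (int)lod;
    float t     = lod - level;
    /* Bilinear sample from each of the two nearest mip levels, then blend. */
    float a = bilinear(level,     u,     v);
    float b = bilinear(level + 1, u / 2, v / 2);   /* next mip is half resolution */
    return a * (1 - t) + b * t;
}
```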


•Anisotropic Filtering
This is the next level of filtering after trilinear filtering. It creates an even more accurate image by obtaining more information from the surrounding textures. While it's still a costly feature on GameCube, it should be utilized by the more talented developers.


•Sub-pixel Anti-aliasing
Still not described in any detail, it is assumed to be a more accurate version of regular anti-aliasing. Regular anti-aliasing "blurs" a texture or line to make it look smoother. Sub-pixel anti-aliasing is thought to be a method of breaking down the pixels on screen into sub-pixels in memory, resulting in more available "spaces" to anti-alias your line or texture. Greg Buchner offered no solid information but said, "There's a couple of different types of algorithms, or I'd say 10,000 foot level algorithms for taking anti-aliasing, below those you get down to the 1,000 foot level, they start to get different, each of those begins to get different with its different implementations. There's many many ways to try to improve the image quality, there's some novel things that we've done that are on the 10,000 foot level similar to one of the two or three main schemes for doing anti-aliasing. And as we get down to the implementation detail there's some cool stuff that [ATI does which] nobody else does."
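One common way to picture the general idea is supersampling: shade several positions inside each screen pixel and average the results. This is only a conceptual sketch -- the article itself notes that Flipper's actual scheme is undisclosed -- and shade_sample() is a hypothetical callback returning the shaded value at a sub-pixel position.

```c
/* Minimal sketch of the supersampling idea behind sub-pixel anti-aliasing:
 * shade a grid of sub-pixel positions and average them into one screen pixel.
 * Purely conceptual; not a claim about Flipper's real algorithm. */
extern float shade_sample(float x, float y);   /* hypothetical renderer callback */

float antialiased_pixel(int px, int py, int grid)   /* grid x grid subsamples */
{
    float sum = 0.0f;
    for (int sy = 0; sy < grid; sy++)
        for (int sx = 0; sx < grid; sx++)
            sum += shade_sample(px + (sx + 0.5f) / grid,
                                py + (sy + 0.5f) / grid);
    return sum / (float)(grid * grid);
}
```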


•Multi-texture Mapping
GameCube's graphics chip offers the unparalleled ability to apply eight textures simultaneously. As an example, a developer could create a shield that applies trilinear filtering, bump mapping, dirt mapping (to "absorb" light), transparency, and a number of other techniques. The end result will prove to be far more detailed than a plainly textured object, which can appear flat. Greg Buchner comments on the ease of implementing this on GameCube: "There's two ways you could handle multi-texture. One is make people jump through hoops, render things multiple times, pass it through the hardware multiple times, and have to work hard. And there's [the other way] where you make the hardware more autonomous and deal with things for the developer."
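Conceptually, single-pass multi-texturing means the hardware performs several texture lookups and combines them for each pixel, instead of the developer re-rendering the geometry once per texture layer. The sketch below is illustrative only: the stage structure, the two combine modes, and the tex_lookup() helper are hypothetical and far simpler than the chip's actual texture-combining stages.

```c
/* Minimal sketch of single-pass multi-texturing: several texture lookups
 * combined per pixel, rather than one rendering pass per texture layer. */
#define MAX_STAGES 8

typedef enum { COMBINE_MODULATE, COMBINE_ADD } CombineOp;
typedef struct { int texture_id; CombineOp op; } TexStage;

extern float tex_lookup(int texture_id, float u, float v);  /* hypothetical */

float shade_pixel(const TexStage stages[], int num_stages, float u, float v)
{
    float color = 1.0f;                      /* start from base/vertex color */
    for (int i = 0; i < num_stages && i < MAX_STAGES; i++) {
        float t = tex_lookup(stages[i].texture_id, u, v);
        color = (stages[i].op == COMBINE_MODULATE) ? color * t : color + t;
    }
    return color;
}
```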


•Virtual Texture Design
This is another one of those terms that Nintendo is hiding behind. All we know is that it makes dealing with textures more efficient and easy. Convenient, you might say. Mr. Buchner offers up the comment, "There's a lot of stuff aimed at making the developers job easier in dealing with the textures, in terms of how they have to load them, what they don't have to load ... just making their lives easier. Rather than explicitly loading a texture, like you have to do on some systems before you can use them. The hardware may take care of things for you."


•Alpha-blending
Even the N64 featured this technique, but it would cost you a lot of your fill rate (how fast the chip can put the image on the screen). It is used to make something appear transparent. On GameCube, IGNcube has been told that it comes at a very low cost in performance. This is why, in the Luigi's Mansion demonstration, many, many transparent ghosts surround Luigi.
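The underlying operation is the standard "over" blend: each transparent pixel is combined with what is already in the frame buffer as source x alpha + destination x (1 - alpha). A minimal, purely illustrative sketch:

```c
/* Minimal sketch of standard "over" alpha blending:
 * result = source * alpha + destination * (1 - alpha).
 * Many overlapping transparent ghosts simply repeat this blend per pixel. */
typedef struct { float r, g, b; } Color;

Color alpha_blend(Color src, float alpha, Color dst)
{
    Color out;
    out.r = src.r * alpha + dst.r * (1.0f - alpha);
    out.g = src.g * alpha + dst.g * (1.0f - alpha);
    out.b = src.b * alpha + dst.b * (1.0f - alpha);
    return out;
}
```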


•Hardware Lighting x8
Hardware lighting is just like any other real-time lighting you see in games, but it comes with far less of a hit on the hardware. GameCube can handle eight lights via hardware and still pump out millions and millions of polygons. Even with this many lights the 6-12 million polygon per second number may be conservative. GameCube is that powerful. Buchner says, "This is one of those things where it's more of a performance issue than anything else. In terms of everything that is in Flipper, at least in terms of a graphics point of view, it is a tradeoff. Do you do dedicated gates for a function or do you run it on a general purpose CPU? You could do it all on a CPU and make it easy for people. You could have...a huge program take care of everything for you in software. You could still do that. That would be very easy for the developers, but it would be pretty pitifully slow. This is an area where I'd say it's more for performance issues. If you try to do all of the lighting calculations, all the matrix calculations that are needed for all these lights, if you did that all in the CPU you would chew up so much of the CPU resource."
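To make the tradeoff concrete: fixed-function per-vertex lighting evaluates a term such as N·L (the surface normal dotted with the light direction) for every light at every vertex, so eight lights means eight such evaluations per vertex on top of the transform work. The sketch below shows the per-vertex math under simple assumptions (directional lights, diffuse-only, hypothetical names); the point of dedicated hardware is doing this without burning CPU time.

```c
/* Minimal sketch of fixed-function per-vertex diffuse lighting with up to
 * 8 lights. Illustrative only; names and structure are assumptions. */
#define MAX_LIGHTS 8

typedef struct { float x, y, z; } Vec3;
typedef struct { Vec3 dir; float intensity; } Light;   /* dir assumed normalized */

static float dot3(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

float lit_vertex(Vec3 normal, const Light lights[], int num_lights, float ambient)
{
    float sum = ambient;
    for (int i = 0; i < num_lights && i < MAX_LIGHTS; i++) {
        float ndotl = dot3(normal, lights[i].dir);   /* diffuse N.L term */
        if (ndotl > 0.0f)
            sum += ndotl * lights[i].intensity;
    }
    return sum > 1.0f ? 1.0f : sum;                  /* clamp to 1.0 */
}
```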

The Visual Buzz
Everything we've been hearing tells us that GameCube is a beast in the graphics department. In terms of polygon performance, even with tons of filters, lighting, AI, and everything else, it's pushing anywhere from 15-25 million polygons per second. Some developers even say GameCube can push 25-40 million polygons per second in-game depending on the amount of effects used. In the end, though, the number of polygons won't matter as much as how the eye is fooled. Again, we've been hearing that the eye might have trouble discerning between a well-done FMV cut-scene and the in-game visuals. Also, now that Xbox hardware is slowly seeping into the hands of developers, IGNcube is being told that GameCube has a greater and greater chance of not only being just as powerful as Xbox, but possibly even more powerful than the Xbox in some areas. As for the PlayStation 2, it's really not even an issue anymore. GameCube is clearly much more powerful than the PS2, even in its first generation of software.

What one really needs to understand about the GameCube graphics chip is not technical jargon, but rather plain English. The design of the entire system, especially the graphics chip, is intelligent, efficient, and extremely balanced. While many developers fight for performance on a system like the PS2, they'll be able to express their creative freedom on GameCube. Instead of spending a month trying to implement their basic ideas, they'll spend a week implementing the basic idea they had, and the extra three weeks coming up with new creative ideas. Nintendo's vision for the GameCube was to give developers that creative freedom, and ease them of technical strife. The graphics chip plays perfectly into the hands of this vision.

So as a fan, as a technical guru, or as someone with casual GameCube interest, sit back and enjoy the simple brilliance of GameCube's design. The software is guaranteed to look amazing. All we need to hope for is lots of it.

http://cube.ign.com/articles/090/090003p1.html
 
I don't have a problem with different companies competing in the same market. Why do you consider this a bad thing?
I never said it was. I just got through explaining that.


MS and Sega had an *official* partnership with the Dreamcast. There was even a Windows logo on the Dreamcast. Almost all the Xbox design work was done before the DC was canned.

Not many DC games actually used Win CE (at least not as a proportion of all DC games). Most used the Katana dev stuff. There was no Windows tech implemented in the DC hardware that I'm aware of.


The DC and Xbox were so different that it's simply not possible that the design of the Xbox was taken directly from the DC. I'm a big fan of Yu Suzuki - he's the greatest game designer of all time - so I'd be interested in any links you could provide about his views on the Xbox being a direct copy of the Dreamcast. I don't understand why he would say this given how different the Xbox was from the DC.
...I'm not talking about the physical components that make up the machine. I'm talking about what they produce, primarily on the software side of things. The games the Xbox produced were pretty much enhanced Dreamcast games, and the UI was like an enhanced version of the Dreamcast's. Suzuki called the Xbox the successor to the Dreamcast because of this. The SNES does not have any of the tech from the NES in it, but they still follow the same basic principles. This is what I was pointing out.

So they aren't going to delay, they're already going for the best tech they can (Sony's stated goal), but they're going to see what Nintendo has to offer and then release something with higher specs (your prediction)?
... That was a circumstantial prediction. Do you read? I was saying that "IF" the circumstances occurred as I said they would, then that would be the result. And if Sony truly said that their idea was to release the best tech, then they lied or you heard wrong, because there were downgrades applied to the PS3 even before it was released. The Cell was originally supposed to be the GPU, if you remember. Even Hideo Kojima said that the system didn't do what Sony originally stated it would.

Sounds like you're expecting something so potent from Nintendo that it'll outperform the device that Sony are planning to release sometime afterwards, and that this will spur Sony on to release something with higher specs.



This would require Sony being caught out and then having to redesign so they can be more powerful. Sounds like that would cause a significant delay.
You seem to have me mistaken for someone who thinks in tandem with yourself. I don't lobby in favor of any companies. I'm a PC gamer. I'm simply stating what I understand to be fact and what I've found not to be fact.

You couldn't have possibly derived that from my post. I'm sure I said that I wouldn't be surprised if the next console from Nintendo was weaker than the PS3 and Xbox 360.

I know this reply was to mrcorbo, but I really have to ask you how a company can compete in the same market as an established competitor without "copying a concept" of what that market wants?
I really have to ask, do you know how to read? I never suggested such a thing. None of that matters. It was never a point. I only stated it to support the point.
 
... That was a circumstantial prediction. Do you read? I was saying that "IF" the circumstances occurred as I said they would, then that would be the result. And if Sony truly said that their idea was to release the best tech, then they lied or you heard wrong, because there were downgrades applied to the PS3 even before it was released. The Cell was originally supposed to be the GPU, if you remember. Even Hideo Kojima said that the system didn't do what Sony originally stated it would.

I don't remember that CELL was originally supposed to be a GPU, or that PS3 was originally conceived without a GPU.

Sony patented some design ideas for Cell-based systems, but Sony never stated that those were intended to be designs for the PS3.

PS2 combined a multi-core vector processor with a purpose-built GPU, and PS3 has the same.

The RSX may not have been what Sony originally wanted (they were supposedly talking with Toshiba about a rendering chip), but Cell was meant to replace the Emotion Engine, not the Graphics Synthesizer.
 
Hmm, I remember Sony saying the same thing: Cell also being supposed to act as a GPU. Might be wrong information, but there was quite a bit of talk about it way back.
 
I don't think Nintendo has a "hardware policy"; they have product visions.
They don't use hardware as a loss leader; they set target price points and decide what's important to the product. With Wii, the product was the controller.

I think all 3 console manufacturers are going to be looking for ways to differentiate their next generation consoles, other than bigger better faster.

Aye, if this generation has shown us anything, it's that it doesn't matter who is bigger or faster. With multiplatform games, which tend to dominate sales, it's all about platform parity, so the lowest common denominator between the two HD platforms ends up being the determining factor in graphics. Basically it isn't about how good the PS3/X360 is; it's all about what bottlenecks the PS3/X360 have with regard to each other, and then designing your game around those bottlenecks.

Regards,
SB
 
There has been quite a lot of talk about it, yes, but not from Sony.

Actually, while the Cell was in development, Sony made several statements that the CPU would be as good for graphics as the GPUs from that time (~2004). Of course, fast-forward 2 years and there's a whole different kind of performance from GPUs out there (and you have to put a current GPU in there in order to stay competitive).

But I remember watching some of the initial Cell demos at E3 2005: a bathtub with rubber ducks, where Phil Harrison told the crowd there was no GPU involved and everything was being done by the Cell.
 
Nah, Sony made it clear with their new portable. They are all about the biggest numbers, and Microsoft is all about copying what's working for everyone else and reverse-engineering it until it looks like something original.

Everyone copies everyone. Get over it. All the current consoles arguably are copies of the Atari 2600 and even the original Pong game.

Other than those superficial things all consoles are basically copies of each other. I mean they are all machines meant to play games on your TV, how much more similar can you get than that?

Once you get away from that it's obvious the design philosophy of each is far different.

The interesting thing is that, going forward, it's quite likely ALL console manufacturers will be following in MS's footsteps with the original Xbox, and arguably Sega's Dreamcast as well: using common PC components (or slightly modified PC components) to create their consoles rather than all-custom or in-house designed hardware (PS2).

Hell, Sony blatantly copied Microsoft's Freestyle Pro game controller for the Sixaxis if you go by your logic. :p

You do realize Kinect was offered to both Sony and Nintendo as well as MS; i.e., like most things from MS, it was developed by another company and they just bought the rights to it/ownership. If you've been following IT stuff as long as I have, you'll realize this is the chief modus operandi of MS: even their first OS, MS-DOS, was actually developed by someone else. MS just bought the rights to it.

The camera, sure, but as has been noted quite frequently, the skeletal tracking and image recognition that MS is doing is completely revolutionary and acknowledged as such in those fields. Hence, Kinect is far more innovative than you give it credit for. People whose entire job has been working on these things are amazed at what MS has been able to do in that field. PrimeSense themselves acknowledge that the work MS does behind the scenes is amazing.

I would say that's a mis-characterization. IMO it would be more correct to say that they aren't known to create innovative products. What products they do create, however, often have innovative features or implement their features in innovative ways.

That isn't even necessarily correct. MS had game-oriented motion controls way back in 1998 with the Freestyle Pro. Technology back then wasn't advanced enough for a really polished product and it didn't have any pointing capabilities, but it is arguably the first motion gaming product.

They've done quite a lot of innovating with regard to controllers for games. There was the Game Voice in 2003, which attempted to use voice controls in games, and the Strategic Commander (still one of my favorite innovative game controllers).

Sure, just like every company, MS incorporates other companies' ideas. After all, Apple has been copying quite a LOT of MS Windows technology ideas in the past few years.

But at the end of the day it's all about whether you can make a product that people want to use and are willing to pay for. Companies that do, profit; those that don't, sink.

Nintendo has been quite successful at it, considering they are the longest-lasting console manufacturer in the business, having long surpassed their closest rivals and brought in more years of positive cash flow than any other console maker. Until their design philosophy fails, it's a good one.

Regards,
SB
 
My biggest complaint this generation has been with Nintendo's philosophy of charging new-tech prices for old-tech equipment. As a consumer I was appalled that a company would completely break away from the mold set forth by their predecessors of releasing cutting-edge tech at a loss as a benefit for the consumer. Even worse, I was flabbergasted when consumers would stick up for this practice, stating that Nintendo's profitability is important to the consumer and that I should be happy to pay that price for that equipment.

Beyond the cost of the machine, the quality of the titles suffered greatly with this approach. A vast majority of the games on the Wii are, for all intents and purposes, extremely weak versions of N64 ports!! Once in a while a gem will appear that actually tries to use the old tech, but Nintendo's philosophy that "gamers don't care" about graphics filtered through to some awful 3rd-party games.

Now, looking at the specs for the new Wii, I'm again wondering if Nintendo simply doesn't care about the consumer's idea of "value" but only their own. I no longer expect cutting-edge technology from Nintendo; they've proven themselves unwilling to compete in that realm any further.

Yet I could settle for just "weaker" tech than their next-generation rivals. I hope this generation they realize that if they don't take the hardware of their console seriously, neither will anyone else. This time, however, I will lose all faith in humanity if another console with such outdated tech is sold to the masses at new-tech prices and consumers don't voice their frustrations about it.
 
Given how 70+ million people bought a Wii, it's hard to claim that Nintendo does not care about the consumer's idea of value. If anything, Nintendo has "proven" that your average consumer would rather buy old tech for 250 dollars than pay 600 dollars for a top-of-the-line PS3.

Also, the lack of quality 3rd-party games has little to do with Nintendo and everything to do with 3rd-party devs releasing a constant stream of utter shit. Nintendo has proven that quality titles on Wii can sell millions. Sure, they don't compete on gfx. Do they really need to? I mean, if gfx were all gamers cared about, then why isn't everybody gaming on PC?

At the end of the day it's all a matter of bang for your buck. With Wii it seems like people preferred a cheap base system, and games that are 10-20 euros cheaper than PS360 games, over an expensive high-end box. You might think a low-end box isn't worth it, but others might think otherwise. I bought a Wii at launch, and even with the lack of games I think it was well worth the money because it was cheap and the games are cheap too.
 
My biggest complaint this generation has been with Nintendo's philosophy of charging new-tech prices for old-tech equipment. As a consumer I was appalled that a company would completely break away from the mold set forth by their predecessors of releasing cutting-edge tech at a loss as a benefit for the consumer. Even worse, I was flabbergasted when consumers would stick up for this practice, stating that Nintendo's profitability is important to the consumer and that I should be happy to pay that price for that equipment.

Beyond the cost of the machine, the quality of the titles suffered greatly with this approach. A vast majority of the games on the Wii are, for all intents and purposes, extremely weak versions of N64 ports!! Once in a while a gem will appear that actually tries to use the old tech, but Nintendo's philosophy that "gamers don't care" about graphics filtered through to some awful 3rd-party games.

Now, looking at the specs for the new Wii, I'm again wondering if Nintendo simply doesn't care about the consumer's idea of "value" but only their own. I no longer expect cutting-edge technology from Nintendo; they've proven themselves unwilling to compete in that realm any further.

Yet I could settle for just "weaker" tech than their next-generation rivals. I hope this generation they realize that if they don't take the hardware of their console seriously, neither will anyone else. This time, however, I will lose all faith in humanity if another console with such outdated tech is sold to the masses at new-tech prices and consumers don't voice their frustrations about it.

I'll assume you meant GameCube there rather than N64 :LOL:

Also, you really can't compare the specs we're hearing about Project Cafe/Stream to Wii; they're far more up to date for their time.
 
Unfortunately, Teasy, I have been to friends' houses where they put in Wii games I hadn't heard of that looked like I was back on the N64!! Maybe they weren't quite that "bad", but I played a lot of N64 games that looked just as good as some of the Wii games I've been subjected to because they were "party" games. Carnival something... AAHHHHAHHHHH!!!

I agree I can be a little dramatic, possibly considered a drama queen, but I honestly felt betrayed by Nintendo this past generation. I thought the GC was a good machine and was bummed Nintendo didn't try harder to shake the "kiddy" image and attract more developers to it. I mean, it came standard in purple and had a handle -- the most emasculating console I had ever seen.

They went from being competitive hardware-wise to dropping that philosophy, focusing solely on waggle and abandoning improvements to what HELPS make games great... the tech inside the machine!!

The old adage gets thrown around that the best graphics don't necessarily make a great game. But you know what, the best control method doesn't necessarily make a great game either. Games are about balance, and when you're tipped too far toward one aspect of your machine over the other, you're basically setting yourself up to fail.
 
All fair points; I just thought that comment about N64 ports was a massive exaggeration. I'm not sure there is a single retail Wii game that doesn't look better than N64 level, let alone the vast majority of them.

Also, I don't think your dislike for Wii graphically can be thrown at Project Cafe; like I said, it seems far more up to date for its time than Wii was.
 
Also, I don't think your dislike for Wii graphically can be thrown at Project Cafe; like I said, it seems far more up to date for its time than Wii was.

We still don't know what Café has. If the "just a notch above X360" rumour is right, then it'll be 2006 console performance being sold in 2012.
Just like the Wii's "just a notch" above XBox1 performance from 2001 being sold in 2006.
 
Going too close to Xenos would make PC ports harder...

Then again, it would mean Nintendo using old tech - again.
 
We still don't know what Café has. If the "just a notch above X360" rumour is right, then it'll be 2006 console performance being sold in 2012.
Just like the Wii's "just a notch" above XBox1 performance from 2001 being sold in 2006.

I agree Wii is a notch up from Xbox in raw performance in a lot of ways (outside of geometry), but it's also using an older, less feature-rich GPU. An R7xx isn't older or less feature-rich than Xenos, quite the opposite, so I don't think it's a fitting comparison.

If you think about the architecture of Wii's graphics chip, it was nearly 8 years old when Wii released; R7xx will be less than 4 years old in early 2012, when the system is rumoured to release. I just don't see the two situations as similar at all.
 

Totally random speculation there, and quite funny that their source would rate an HD4770 as only "close to Xenos"... Let alone "would operate at a deficit to Xenos in some applications" -- like what applications?

Xenos is an R5xx with some R6xx features added, yeah? 48 unified shaders, 240 GFLOPS. The HD4770 is R7xx and 960 GFLOPS AFAIR; can't remember how many shaders it has though.
 