Editorial On "Is Wii Next-Gen?" (answer: no)

Yeah, I know what you meant. But let me clarify. My point was, the smaller you make the hardware, the more it's going to cost you if you want to make it powerful. So while it certainly is possible to have a monster under the hood like the Mac Mini, Nintendo would be losing a lot of money if they made it that powerful with such a small size.
Hope that was clear.
But aren't Apple products marked up a lot?
 
Since when has Nintendo claimed they were "Next-Gen"?

Didn't Reggie himself say, "The Wii is entirely new..."? Hence, "New Gen" would be more appropriate.

Also, as much as the editorial mentions tech, I'd have figured that the same potential for A.I. and such would get used this generation. Alas, it never really came, unless there was someone out there dedicated to making the tools for it.

In the end, it's all up to the programmers, but oftentimes we see them set aside these new capabilities in favor of something simpler and, quite ironically, more enjoyable.
 
Yeah, I know what you meant. But let me clarify. My point was, the smaller you make the hardware, the more it's going to cost you if you want to make it powerful.
While that's true, smaller == more expensive, Wii isn't sitting on the cutting edge of price/size/performance. We're talking a 16 mm² CPU die! Putting cutting-edge tech into a Mac Mini costs a lot (though don't forget Apple's markup on those RRPs too), but putting older tech in would cost a lot less.

Thus...
So while it certainly is possible to have a monster under the hood like the Mac Mini, Nintendo would be losing a lot of money if they made it that powerful with such a small size.
...putting a monster under the hood isn't what people are grumbling about. It's putting in pretty much the bare minimum that is making people wonder what's going on. They could have fit, in the same size as a Mac Mini, smaller, simpler components than those in the Mac Minis listed, and still got considerably more available performance than the Wii, all at a price that $250 could accommodate. There's a happy medium between Wii and Mac Mini that they could have (and I think should have) targeted. Wii is a great idea, but Nintendo could have done a lot more with it. Sticking to SDTV res, they could have got pretty gorgeous graphics from three-year-old tech and really balanced the system between interaction + games and graphics. Instead they've got only the games.
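To put rough numbers on that die-size point: silicon cost grows faster than die area once you account for edge loss and yield. Here's a quick back-of-envelope sketch in Python; the wafer cost, yield, and die areas are all my own made-up assumptions, not anything from Nintendo or ATI.

Code:
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Classic gross dies-per-wafer estimate: wafer area over die area,
    # minus a correction term for dies lost at the wafer edge.
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

WAFER_COST = 3000.0  # assumed cost of one 300 mm wafer, USD
YIELD = 0.85         # assumed fraction of good dies

for label, area in [("Wii-class tiny die", 16),
                    ("mid-range die", 100),
                    ("monster die", 250)]:
    n = dies_per_wafer(300, area)
    print("%s (%d mm^2): ~%d dies/wafer, ~$%.2f per good die"
          % (label, area, n, WAFER_COST / (n * YIELD)))

Even with invented numbers, the shape is the point: a ~16 mm² die works out to well under a dollar of silicon per chip, a mid-size die to a few dollars, a monster die to an order of magnitude more. There was plenty of room between "bare minimum" and "Mac Mini monster".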
 
Have you ever seen a Mac Mini (or something named close to that)? It is about the same size as a Wii, but it has a Core Duo (which beats a Pentium D) and an X1600, and up to 512 MB total (IIRC), and that is a lot for a console (especially at 480p).

The Mac Mini doesn't have anything near an X1600. It used to have a Radeon 9200; now it has a crappy integrated GMA950. Performance-wise it's somewhat below a GeForce4. The CPU is nice, though. However, the system costs about 2.5x as much as the Wii, and it doesn't come with any peripherals (Edit: it actually does ship with the Front Row remote. However, the Wiimote + Nunchuk + sensor bar would still be a couple of bucks more expensive).
 
fearsomepirate said:
And COD3 is basically a DX9 game, judging by the shots and vids. How long has PC DX9 hardware been out? Is it really 4 years now? I guess it's using SM3.0, judging by the way HDR is in everything now, so that's only what, 2 years old? But it's nice to see that DX9 software development is finally maturing, because it sure wasn't anything to crow about in 2002.
darkblu said:
and i somehow thought here that the 'technological superiority' of the 360 would allow it some conspicuous quantitative advancements over the obsolete desktop tech. stupid me.

Guys, are you talking about CoD2? CoD3 is not out yet, and AFAIK there is no PC version of that game.
 
Guys, are you talking about CoD2? CoD3 is not out yet, and AFAIK there is no PC version of that game.

yes, CoD2. my bad.

The Mac Mini doesn't have anything near an X1600. It used to have a Radeon 9200; now it has a crappy integrated GMA950. Performance-wise it's somewhat below a GeForce4. The CPU is nice, though. However, the system costs about 2.5x as much as the Wii, and it doesn't come with any peripherals (Edit: it actually does ship with the Front Row remote. However, the Wiimote + Nunchuk + sensor bar would still be a couple of bucks more expensive).

exactly. the mini is a nice cuddly animal, but don't even dream of seeing it in the same price range as a wii. btw, it supports up to 1GB of RAM .. and the G4 version is outright lovable, but the core version is not bad either.
 
Also, as much as the editorial mentions tech, I'd have figured that the same potential for A.I. and such would get used this generation. Alas, it never really came, unless there was someone out there dedicated to making the tools for it.

In the end, it's all up to the programmers, but oftentimes we see them set aside these new capabilities in favor of something simpler and, quite ironically, more enjoyable.

If Halo is any good, it is precisely because of its AI, so for me AI is one of the most enjoyable things.

Also, there is middleware like:

http://www.ai-implant.com/
http://www.spirops.com/

...putting a monster under the hood isn't what people are grumbling about. It's putting in pretty much the bare minimum that is making people wonder what's going on. They could have fit, in the same size as a Mac Mini, smaller, simpler components than those in the Mac Minis listed, and still got considerably more available performance than the Wii, all at a price that $250 could accommodate. There's a happy medium between Wii and Mac Mini that they could have (and I think should have) targeted.

That explains it perfectly.

The Mac Mini doesn't have anything near an X1600. It used to have a Radeon 9200; now it has a crappy integrated GMA950. Performance-wise it's somewhat below a GeForce4. The CPU is nice, though. However, the system costs about 2.5x as much as the Wii, and it doesn't come with any peripherals (Edit: it actually does ship with the Front Row remote. However, the Wiimote + Nunchuk + sensor bar would still be a couple of bucks more expensive).

Are you sure that never existed? I was almost certain there is a version with an X1600. Anyway, the above post explains the idea perfectly.
 
Are you sure that never existed? I was almost certain there is a version with an X1600. Anyway, the above post explains the idea perfectly.

never.

some of the higher macs used to have an x1600 integrated .. some of the imacs, and possibly the macbook pro, according to a friend of mine here.
 
never.

some of the higher macs used to have an x1600 integrated .. some of the imacs, and possibly the macbook pro, according to a friend of mine here.

My mistake. :oops:

Still, the original reply stands, as they could have done much more than they did.

I also think the size of the Wii is because they decided to take advantage of the rest of the design, as they could easily have made it 50% bigger and no one would care.
 
My mistake. :oops:

Still, the original reply stands, as they could have done much more than they did.

I also think the size of the Wii is because they decided to take advantage of the rest of the design, as they could easily have made it 50% bigger and no one would care.

There was a Mac Mini-like machine with an ATI Radeon X1400, but it was a Shuttle "mini X 100HA", not an Apple.
Curiously, it looks remarkably like a Nintendo Wii... :D

 
There was a Mac Mini-like machine with an ATI Radeon X1400, but it was a Shuttle "mini X 100HA", not an Apple.
Curiously, it looks remarkably like a Nintendo Wii... :D


I noticed that yesterday in a PC World magazine. The sad thing is that it's a lot more powerful than the Wii, but it will probably cost up to four times as much.
 
The reason is simple:
You'd end up paying through the nose for each of those chips, since ATI would never let go of their IP when the product is already on the market.

That's why MS was stuck with the deal with Nvidia for the original Xbox.
It was Nvidia who contracted TSMC to make the chips, just like they do for any other product (and they don't give their partners the IP for a GeForce, right? Otherwise, who would stop Asus from quietly "sharing" it with the competition?).


You make a lot more money by supplying customers with ready-made components than by licensing specific IP to third parties and letting them choose their fab(s), how large a specific batch of chips is, what your margin is going to be when a die shrink is needed, etc., etc.

I understand that, but at the same time couldn't they have created a modified version of the chip specifically for the Wii? (Since Sony ended up with a custom GF 7800 GTX in the RSX, IIRC, and I'm sure it didn't cost Nvidia *that* much to whip up...)

I guess, as the other guy mentioned, it could have screwed up BC, but at the prices GCs are going for nowadays, couldn't they have just banged in the GC CPU/GPU as well?
At least then they could probably work the system to somehow let devs use the GC hardware and the new processors together when developing Wii games, to get even more processing capability, couldn't they?

Then they could fire up a storming marketing campaign about how the Wii would be the first dual-CPU, dual-GPU console in the world!!

:LOL:
 
While that's true, smaller == more expensive, Wii isn't sitting on the cutting edge of price/size/performance. We're talking a 16 mm² CPU die! Putting cutting-edge tech into a Mac Mini costs a lot (though don't forget Apple's markup on those RRPs too), but putting older tech in would cost a lot less.

Thus...
...putting a monster under the hood isn't what people are grumbling about. It's putting in pretty much the bare minimum that is making people wonder what's going on. They could have fit, in the same size as a Mac Mini, smaller, simpler components than those in the Mac Minis listed, and still got considerably more available performance than the Wii, all at a price that $250 could accommodate. There's a happy medium between Wii and Mac Mini that they could have (and I think should have) targeted. Wii is a great idea, but Nintendo could have done a lot more with it. Sticking to SDTV res, they could have got pretty gorgeous graphics from three-year-old tech and really balanced the system between interaction + games and graphics. Instead they've got only the games.

Wait, are you arguing with me or agreeing? Cause that's what I was implying :p.
 
I had no idea that by "make use" of a PC video card, you were talking about personally writing tech demos. Most people use video cards for playing video games, so that's naturally what I assumed. Anyway, you can get just as much use out of a console devkit. It's just a little harder to get.
that does not tell anything about the tech level of his desktop vis-a-vis his console.
For some time now, you've been arguing against some imaginary person who said that consoles are more powerful than PCs or something. For example, this later statement:
and i somehow thought here that the 'technological superiority' of the 360
Now "technological superiority" is in quotes. Who exactly are you quoting? I'm curious. What else does this guy say? From implication, I've gathered that your imaginary argument foil thinks consoles have a technological edge on PCs, or are easier to program for, or something. I just want to clarify that I personally have said nothing of the sort.
your definitive statements on halo's origins.
You said Halo was a "port" from the desktop. No, Halo began development on the desktop, but switched over to Xbox and was completed there, and the game was changed quite a bit in the process. That's more than just a "port." Outside of tech demos, there wasn't much in the way of PC software showing off that kind of stuff.
you can run folding@home on your sm3 desktop gpu as we speak. it'd be curious to know what you can do with a non-mature xenos at the same time.
folding@home isn't any more a mature piece of sm3.0 software than Kameo, Splinter Cell, COD2, GRAW, Saints Row, Dead Rising, Oblivion, etc. I'm going to go out on a limb and say that folding@home leaves lots of features of an sm3.0 GPU dormant.
 
That didn't make Halo any more enjoyable than... say Goldeneye. (I still enjoy that game to this day)

It's all about good game design, not just good game technology.

I didn't say that. I said that Halo without AI as good as it has would be a very bad game. Of course, this is just one part of the game; its AI must be in line with the rest of the game. And actually (for that time and for the game it is), the GoldenEye AI is quite good IMO.

Personally, one of the things that usually makes me like a game more is its AI.
 
The state of AI is definitely the most disappointing and overlooked aspect of gaming. MMOs are IMO the cheap cheat, because they don't need AI. Back in the late '90s, people were spouting on about artificial life and advanced AI and it went absolutely nowhere in the mainstream gaming world. And yes, FPS is mainstream gaming these days.

Games like The Sims and Spore at least attempt advancements in AI. Ducking, crouching, and strafing, or AI running back to get the nearby AI buddies, isn't good AI. Of course, what can we really expect from the rail shooters anyway?

I'd say the most painful example is Oblivion. It makes AI technology deficiencies horribly obvious. Oblivion's pre-release hype sure was a lot more interesting than the actual game. RADIANT AI scarcely qualifies as AI. It's more like advanced hand-built scripting. The whole world is filled with non-interactive idiot NPCs and psychic telepathic guards. How is this different from Daggerfall? The game is prettier and smaller. Wow.
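To show what I mean by "hand-built scripting" rather than AI, here's a toy sketch in Python (purely my own illustration, nothing to do with Bethesda's actual code): the NPC just walks a designer-authored timetable and never perceives or decides anything.

Code:
# Invented example of a hand-authored NPC daily schedule.
# It's a lookup table, not decision-making.
SCHEDULE = [
    (6, 8, "eat breakfast at the inn"),
    (8, 18, "stand at the market stall"),
    (18, 22, "drink at the tavern"),
    (22, 24, "sleep"),
    (0, 6, "sleep"),
]

def npc_activity(hour):
    # Return whatever the table says; the NPC never reacts to the world.
    for start, end, activity in SCHEDULE:
        if start <= hour < end:
            return activity
    return "idle"

for h in (7, 12, 20, 3):
    print("%02d:00 -> %s" % (h, npc_activity(h)))

Actual AI would be the NPC deciding those things from needs and perception; a timetable like this only looks alive until you poke it, which is exactly the non-interactive idiot NPC problem.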

I too actually loaded up GoldenEye again this week, for the first time in years. I'm blasting through the levels on 00 Agent difficulty and remembering how to beat them as I go. That game was sure something special back then. I also picked up Perfect Dark from the local used games store just two days ago. What exciting times those were. Xbox, PS2, and Cube didn't do nearly as much for the industry as N64, PS1, and Saturn IMO. Those birthed the current controllers and all of the current 3D genres.

Actually I'd say that only graphics get attention in the industry. Controls don't (except from Nintendo). (Did you know Perfect Dark has control modes that use two controllers to play single player? I need to get another controller.) Sound hasn't improved since HRTF showed up in DirectX and Aureal died from mismanagement and Creative Labs. Physics sorta have improved but they are almost worthy of being put in the graphics category. Basically if you can't see it with your eyes, it seems to be believed that people won't notice/care. And maybe that really is true. It's not for me, but I would say I know a lot of people who would need such improvements demoed for them or they wouldn't notice.

The "next-gen" has so far been same-old with new graphics. 360's library is rather laughable in my opinion. PS3's library is almost 360's library lol. Next year looks like it may bring some good, interesting titles finally with Bioware and some other devs bringing their games out. I'm curious to see if Wii works out.
 