mckmas8808 said:
Comparing console games to PC is really not fair. Those games require you to upgrade your GPU more often than getting a new console. For over $1000 I better have something that's better than a $300 console. Today a great GPU may cost you over $500, whereas any of the consoles today would cost less than $150.
Go back and read what I originally wrote. Some of the consoles are using ~1999 technology. Their graphics are NOT good relative to what is available. Very few games are HD, they lack memory (32MB-64MB), they lack AA and AF, they are mostly fixed function (i.e. very few programmable shaders), normal maps are rare, they are relatively low polygon, etc...
This has NOTHING to do with comparing RE4 GCN to the PS2 port. Yeah, RE4 may look good for a console game--but that has more to do with the art direction than the technical merits of the game.
As for comparing to a PC, it is fair. Look at the consoles. They are getting top end GPU parts from PC makers. Their CPUs are better gaming CPUs than PC CPUs. They have as much memory as high end games require (I do not know of any game that requires more than 512MB).
Even more, consoles are sold in the millions of units at a loss. Every PC part, EVERY part, is sold for a profit and usually at a smaller scale. And every part has its own company backing and advertising campaign, whereas consoles are marketed as a total product. e.g. The AMD64 3800+ 90nm comes from the same bin as the 3000+ (just a different multiplier), yet there was like a $500 difference between these two chips.
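To make the binning point concrete, here is a tiny sketch. The 200 MHz reference clock and the 9x/12x multipliers are the real Athlon 64 figures as I recall them; the dollar amounts are rough recollections for illustration only, not exact launch prices:

```python
# Illustrative sketch of CPU binning: same die, price set largely by the
# multiplier the chip ships with. Prices below are rough, not exact figures.

BASE_CLOCK_MHZ = 200  # Athlon 64 reference clock

chips = {
    # model: (multiplier, rough price in USD -- illustrative)
    "Athlon 64 3000+": (9, 150),   # 9 x 200 = 1800 MHz
    "Athlon 64 3800+": (12, 650),  # 12 x 200 = 2400 MHz
}

for model, (mult, price) in chips.items():
    clock_mhz = BASE_CLOCK_MHZ * mult
    print(f"{model}: {clock_mhz} MHz, roughly ${price}")
```

Same silicon either way; the hundreds of dollars of difference is positioning, not manufacturing cost, which is exactly the margin consoles do not carry.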
Ditto GPUs. People were buying crippled NV40s as $250 6800 LEs (and note: the 3rd party card maker buys the chip, buys the memory, makes the card, markets it, distributes it, and then makes a profit + the retailer's profit... the chip itself is very little of the actual cost). On the other end of the scale people were selling NV40s as 6800 Ultras for almost $600. Same GPU chip, totally different price range.
So when you realize the economics of consoles--they are sold to the retailer at a loss and the retailer sells at cost--and then compare that to PCs--where every part is sold at a profit after all expenses (advertising, manufacturing, R&D, etc.), every card/part is sold at a profit, the system maker sells at a profit, and the retailer sells at a profit--it is very fair to compare. Especially when you have $600 retail PCs (the actual cost of making one is probably half that) with a monitor, ~3000MHz CPU, 6600 AGP card, HDD, printer, keyboard, mouse, and what not.
That PC can play the new games at HD resolutions and look substantially better than console games. And that is now, without even looking forward to the new consoles, which are 2005/2006 devices with high-end parts.
Since the new consoles are going to own the PC for a while, it is a fair comparison (there is a lot of cross-breeding in technology).
Anyhow, my point was clearly that games do not look that great in very specific areas (like texturing due to memory limitations, non-HD, etc...) and are using 1999ish technology. You can talk about how great PS2 RE4 looks, but when you look at the new consoles, or at the PC for the last 2 years, you shake your head. They do not look technically great. Great for an old console, good art, but nothing special.
mckmas8808 said:
And for the record I don't agree with the 4 year thing. 5 years is ok, 4 seems too short. And going by what you said the PS3 should have come out in the year 2004. Well if it came out in 2004 the X360 and NR would have killed anything Sony could have thrown at it. That would have put Sony in a horrible place.
Again, go back and read what was written. I was arguing that 4 is better than 6. I am fine with 5, but will take 4 over 6.
So according to ME, PS3 should have come out in... 2005! And Sony is launching in Spring 2006, so technically it would seem possible if Sony had planned as much. It is all about planning... Sony has very little of interest coming this fall in games. Ditto Nintendo (sans Zelda) and MS. The last 2 years have seen a ton of blockbuster games, but 2006 is really slowing down. And with no actual in-game playable media at E3 it seems Sony was not ready for a 2005 launch. But that all has to do with planning.
Sony was milking this generation for all it was worth. And as the market leader they should do that. But I do not think that should dictate everyone else follow a 5-6 year cycle.
mckmas8808 said:
Right now it looks like Sony might implement some kind of gyro technology in the PS3. They couldn't have done that in 2004 properly.
Says WHO? Source? Nintendo has been working with some gyro company for years. Maybe Sony could not have done that in 2004 because they never had the idea.
mckmas8808 said:
The CELL chip or RSX wouldn't have been finished.
CELL could have been ready for 2005. RSX, if they had contracted with NV sooner, could have been too, as it is an implementation of the next-gen GF (which is launching this month commercially). Both technologies are here and now and would have been ready for 2005. There was no need technologically for 2006. With XDR in production, it would seem the part in shortest supply would be BR.
mckmas8808 said:
It would have been a PS 2.5 instead. I think 4 is too short while 6 may be a little too long.
Your opinion. Like I said, a 2005 launch would have been nearly identical in HW. A 2004 launch would have been different, but we do not know how different. CELL would have been accelerated, so we may have seen a smaller CELL (like 1:4) or Sony pushing 90nm production. As for the GPU, we would have seen an NV40-class GPU--a very feature-rich chip with SM3.0 and FP16 blending. So a 2004 PS3 launch may have been less powerful than the 2006 PS3, but it would not have been a PS 2.5 at all. You are talking about a system leaps and bounds more powerful than the PS2.
mckmas8808 said:
I think it's worth waiting a little bit longer for a Blu-ray drive, Gigabit ethernet port, RSX, Ageia-like physics, etc. A 2004 PS3 couldn't have had any of this stuff. As a matter of fact, PC games like F.E.A.R. would have already surpassed the PS3 had it come out in 2004.
:Roll: FEAR hardly taps the potential of a top-end GPU. Closed boxes are way more efficient, and developers develop for the actual HW features and strengths/limitations. That is one reason you will be seeing console games look great in 2007.
And the other stuff is the crux of the debate: you would rather wait 6 years to get newfangled technology. Since BR is not an industry standard and has no media available, I could not care less. Gbit ethernet has been around for years and could have been included in 2005 or 2004. The RSX is a beefed-up NV40 + some bells and whistles, it seems, and Ageia physics? Come on now... Tim Sweeney said the Xbox 360 will have Ageia-level physics, and it is supposedly 1/2 the FP performance.
But it goes back to: is your console an all-in-one media device first, or a game system first? I would rather have a console that is games first and uses whatever technology best suits its needs at the time. Extending a life cycle to 6 years to get some unestablished formats or to milk money out of your customers is good for business but not something I want.
I would rather a company be more progressive than too slow. Look at the Xbox 360. A 4-year cycle did not make it a weak machine. 8x the memory, HD output standard, AA standard, a GPU that may rival Sony's 2006 offering, a nice CPU, online standard, etc... This is a progressive machine in features and services. If MS could do that in 2005, so could Sony.