I don't see why going with consoles that are much more powerful than Xbox 360/PS3 would be so expensive. We've had several generations of huge leaps in processor and graphics.
I wouldn't say more expensive, but manufacturers (Sony and MS, since Nintendo made money on hardware from the start) may very well choose to lose less money, if any.
If in 2002 you were told of Xbox 360 and PS3 specs, that they would be doing HD resolutions and be a large leap (not massive but large) over PS2 and Xbox, you might say that would be financially impossible. But it happened. It's cost Sony, Microsoft and developers/publishers a lot of money, but it happened.
Yes, but Sony, MS and some huge developers lose some money. The risks are getting pretty high on medium- and high-budget games.
The generation before, with PS2, GCN and Xbox, they provided, more or less, a massive 100x leap over PS1/N64 graphics at $299. You could say that should've been too expensive, but it also happened.
Are you sure about the figure?
Anyway, process technology has made huge improvements, and manufacturers like ATI/Nvidia have pushed 3D rendering pretty far this last decade, which is cool.
Next-gen won't see another increase in resolutions, except that more games will be 1080p, which is already in limited use today. New hardware will just provide more geometry detail, better/faster shaders, better/more AA, and the ability to do better framerates: fewer sub-30fps games, more games locked at 30fps, and more at 60fps than this gen. More physics, better A.I., all of that.
So I don't understand why hardware power cannot improve another 10 to 20 times, and games cannot look, say 4-5x better than they do today.
I'm not sure about your metric, how 20 times more power translates into games 4-5x better than today's.
What I can say from my limited understanding is that manufacturers have already pushed pretty important factors like power consumption and heat dissipation.
Chips used in the PC space and in the PS3 and the 360 are already almost as huge as they can be, hot, and power hungry.
Look at the current overall best GPU available on the market, the 4870 (Nvidia fans, don't crush me
). The chip is as big as it can be while sustaining acceptable yields, is super hot, and is power hungry. GPUs are impressive products in every way... CPU manufacturers would love to have that kind of tolerance for die size, heat dissipation and power draw.
The 4870 is almost the same size as Xenos (when it launched), has ~4 times the raw power, and is likely a more efficient architecture, but it runs warmer and hungrier.
Current machines are pretty huge; console manufacturers have no room to add bigger cooling solutions. Sony or MS making a machine now could not use a chip that consumes more than the whole 360.
I see the arguments for a modest improvement in power, but I also see the arguments for another large increase in power. I am torn, though I know I want another large leap beyond current-gen.
Everybody wants more,
but honestly, even if manufacturers go with something conservative, the jump in power will be more than noticeable.
Take your 4-5 times better and apply it to graphics.
A 4870 could fit the bill; it's currently too warm and power hungry, but the same chip at 40nm or 32nm would be fine and really tiny.
That kind of GPU + say a 4-core Xenon (just for the figure) could easily be packed on one chip (Fusion style); it would be pretty tiny, say between 150 and 200mm².
This kind of machine would be cheap to produce! One chip, a pool of RAM, and let's go.
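To sanity-check that 150-200mm² figure, here's a back-of-the-envelope sketch. It assumes the commonly cited ~256mm² die for RV770 (the 4870's chip) at 55nm, and ideal area scaling with the square of the feature size; real shrinks scale less than ideally, so treat the numbers as optimistic ballpark estimates, not spec-sheet facts.

```python
def scaled_area(area_mm2, node_from_nm, node_to_nm):
    """Ideal die-area scaling: area shrinks with the square
    of the feature-size ratio."""
    return area_mm2 * (node_to_nm / node_from_nm) ** 2

RV770_AREA = 256.0  # mm^2 at 55 nm (commonly cited, assumed here)

for node in (40, 32):
    area = scaled_area(RV770_AREA, 55, node)
    print(f"RV770-class GPU at {node} nm: ~{area:.0f} mm^2")
```

Ideal scaling puts an RV770-class GPU around ~135mm² at 40nm and under ~90mm² at 32nm, which would leave room in a 150-200mm² budget for a few CPU cores on the same die.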
Manufacturers can use better parts and still stick to a one-chip solution.
Imho, no matter what the manufacturers' choices are, the next generation of systems will deliver a way better experience in graphics, physics, AI, whatever.
Conservative doesn't mean a Wii; the change between the GameCube and the Wii is more like what Intel would call their tick-tock strategy, which is supposed to happen every... 18 months (if my memory is right), than something new.
I completely dismiss the possibility that all the manufacturers will go that route for their next-gen systems, but aiming for a tinier/cooler/cheaper box, well, that's another story.