Predict: The Next Generation Console Tech

In fact, a thought that just occurred to me: perhaps the reason Wii is so simple is because Nintendo were hedging their bets and had a contingency plan in case Wii was a flop. If so, it wouldn't have lost them a huge amount of investment. If they had gone with a PS360-type high-end system and the Wiimote proved a turn-off, they'd have been scuppered.

Nintendo would still have had to incur the full fixed cost of researching and developing an entire game system and then quietly eat that expense without ever selling a unit to defray it.

Besides, if a Wiimote-equipped PS360-class system had failed, what would have stopped them from going without the Wiimote (when so much Wii software does, or really should, go without it anyway)?
 
Just for the sake of wanting overall more and better graphics next gen, I truly do hope that the next-gen Xbox and PS4 both get to use a 256-bit bus. There is only so much that can be done to stretch the viability of a 128-bit bus: faster GDDR5 (or whatever comes after GDDR5) memory, larger pools of EDRAM. Unless something like true quad data rate memory comes along, I don't see a 128-bit bus supporting anything other than modest leaps in performance. I can understand consoles not using a 384-, 448- or 512-bit memory bus interface, but I do not see why they cannot use a 256-bit bus, which has been standard in PC cards since 2002.
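As a rough back-of-the-envelope illustration of why bus width matters (the 4 Gbps per-pin figure below is purely an assumption for the example, not a spec for any actual console):

# Peak theoretical bandwidth = bus width in bytes * effective per-pin data rate
def peak_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return (bus_width_bits / 8) * gbps_per_pin  # GB/s

print(peak_bandwidth_gbs(128, 4.0))  # 128-bit bus at 4 Gbps/pin ->  64 GB/s
print(peak_bandwidth_gbs(256, 4.0))  # 256-bit bus at 4 Gbps/pin -> 128 GB/s

At the same memory speed, doubling the bus doubles peak bandwidth; staying at 128-bit means chasing ever faster (and pricier) memory, or leaning on EDRAM to make up the difference.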
 
What are the chances of them using an optical connection inside the console instead? It's high bandwidth, its cost should scale really well with time, and it should lower latencies a little as well.

HDD/optical drive <SATA 3> south bridge <optical> CPU <optical> GPU <optical> RAM

It's a technology being developed by IBM; surely it would be mature enough to use by 2011/2012?
 
Something new and high-tech, rather than just more of the same, would be exciting. Optical tech, along with other new technologies that could separate next-gen console architectures from those of midrange to high-end PCs, is one thing consoles need to maintain their relevance. I don't want to see the circle complete with next-gen consoles being absolutely no different from the PC. Hopefully the course will change with the next gen.
 
I don't think that optical connections would be that useful.
They would be welcome if you want multiple CPUs or GPUs to act as one die.
It sounds way too cutting edge.

Anyway, even if manufacturers are more conservative next time around, I don't expect them to be as conservative as Nintendo, and even if they are on a tiny budget (power, heat dissipation, die size), the jump in processing power will be pretty significant.
 
I have to wonder if too much is being made of following Nintendo's lead. If you are Sony or MS, do you really want to try and emulate Nintendo and go after them on their own turf while simultaneously hamstringing yourself in power? Power is at least part of what got them the audiences they have. Do you really want to abandon them, your existing fans, and go Nintendo's route? You would lose what differentiates you from your competitors. Probably the last time a controller had as much effect as the Wiimote was the mouse, i.e. not likely to happen again soon.
 
I think there's a balance to be had, definitely. There's nothing to stop you making a next-gen console that has the power, that has the remote-styled controls, and at the same time appeals to the hardcore gamer too.

The thing is that I fully expect cost to be much more of a priority next time around, so that does put a limit on stuff like storage, GPU and CPU.

Bearing in mind what 360 and PS3 are capable of doing at 720p at the moment, would a 'next gen' console with 2x-3x the graphical power at the same resolution but at a great price really be such a disaster? My big worry would be shovelware as seen on Wii - GameCube games ported over with little done to improve the visuals. On the other hand though, familiarity with existing architecture and dev tools could see more being gleaned from the hardware sooner in the console's life cycle.
 
A disaster? Depends. I remain somewhat unimpressed by what I have seen visually so far. I fully expect the motion control, the cameras possibly, and voice command to play a much larger role. Not so sure about the 3d tech. I remain skeptical.

I'm not sure how to read "2-3x the power". Numbers as general as that don't really bring anything specific to mind. Same with listing shading power and the like. I know what I would like to see, but that is something different entirely (not radically different as in stupid computational power, but more in line with fixing some of the visual problems I have with where we currently are): draw distance, pop-in, bad LOD, repetitive textures, sterile environments (Mass Effect). The closest I have seen to what I find visually appealing is Assassin's Creed. Cut out the load times and I would be quite happy with that as a rough visual starting point. Need a day/night cycle. Throw in minimal content (COD4). It is tough to separate what is currently possible from what developers have chosen to implement.
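For what it's worth, the pop-in complaint is partly about how games pick between discrete LOD models. A purely illustrative sketch of distance-based LOD selection with a hysteresis band (the distances are made-up values, not anything from a real engine):

LOD_DISTANCES = [20.0, 60.0, 150.0]   # metres; illustrative thresholds only
HYSTERESIS = 5.0                      # metres of "stickiness" around a threshold

def pick_lod(distance, current_lod):
    # LOD 0 is the most detailed model; anything past the last threshold
    # gets the lowest-detail level (index len(LOD_DISTANCES)).
    target = len(LOD_DISTANCES)
    for lod, limit in enumerate(LOD_DISTANCES):
        if distance < limit:
            target = lod
            break
    # Only switch when we are clearly past the boundary between the two levels,
    # so a camera hovering near a threshold doesn't make the model flicker.
    if target != current_lod:
        boundary = LOD_DISTANCES[min(target, current_lod)]
        if abs(distance - boundary) < HYSTERESIS:
            return current_lod
    return target

# Example: walking away from an object
print(pick_lod(18.0, 0))   # stays at LOD 0
print(pick_lod(22.0, 0))   # still LOD 0 (inside the hysteresis band)
print(pick_lod(30.0, 0))   # switches to LOD 1

Streaming the higher-detail model in early enough, and blending or dithering between levels, is the other half of hiding the switch.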

Hmm, a way to put it: with all the different genres it is rather difficult for me to point out what I would find acceptable. A $400 initial price I find wholly acceptable. For that, next gen, I would expect an HDD, built-in wireless (preferably 802.11n, since it should be well mature by then), the usual wireless controller, and better streaming video support. But some of that is peripheral to the games.

I tend to think in terms of something more than just resolution. I much prefer a lower resolution with all the bells and whistles one can tack on over a higher resolution without them. Some of this is software and some of it is hardware.
 
I have to wonder if too much is being made of following Nintendo's lead. If you are Sony or MS, do you really want to try and emulate Nintendo and go after them on their own turf while simultaneously hamstringing yourself in power?

I've thought about this myself and I very much agree.
I think Microsoft's and Sony's best chance for success is developing or finding a new GPU architecture that is very high in cost/performance, something that can outperform the PC for six months. The current HD consoles have GPUs that are barely adequate for HD gaming. They struggle with 720p, never mind 1080p. The next-gen consoles should be able to breeze through 1080p at 60fps with graphic complexity/detail/lighting/animation that easily surpasses the best of 360/PS3. I suppose Crysis at the highest detail setting would be a starting point, but next-gen consoles should be able to go beyond that too, in the 2nd and 3rd years of their lifecycles.
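To put some rough numbers on the 1080p/60 target, here is simple pixel-count arithmetic (it ignores overdraw, AA and shader complexity, so treat it only as a lower bound on the jump being asked for):

# Pixels per second the GPU has to shade at a given resolution and frame rate
def pixels_per_second(width, height, fps):
    return width * height * fps

print(pixels_per_second(1280, 720, 30))   # ~27.6 million pixels/s
print(pixels_per_second(1280, 720, 60))   # ~55.3 million pixels/s
print(pixels_per_second(1920, 1080, 60))  # ~124.4 million pixels/s

1080p60 is roughly 4.5x the raw pixel throughput of 720p30, before you even touch the extra per-pixel work being asked for.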

Microsoft and Sony need to be very sensitive about pricing ($299 for the base SKU, and no more than $399 for the better models), and have a new control scheme that isn't gimmicky and beats what Nintendo is doing now, but without putting all their eggs in the new controller, still offering a high-powered console. On Sony's side: get better media and web browsing software on the box so that it can really compete with the PC (get MS where it hurts); on Microsoft's side: go after movie and IPTV deals that will hurt Sony's strength (the movie studios).
 
Draw distance: CPU load and memory, plus software know-how.

Pop-in: streaming know-how.

Bad LOD, repetitive textures, sterile environments (Mass Effect): art know-how.

Most of what you listed comes down to the abilities of the developer, not the hardware. If you take the developers who produced the bottom 70-80% of the current generation and give them a considerably more powerful machine, you will get bad draw distance, bad LOD, repetitive textures and sterile environments. If you take the top developers and give them two 360s duct-taped together, you won't have these problems. (Actually, you will, but you won't notice them, which is the whole point.)

Now take the "good" developers from this generation, take away the hardware knowledge they have, give them a buggy compiler, no profiler, and make them rewrite significant portions of their code and retrain artist - or let them choose to be "bad" developers and keep the current-gen engines/pipelines on the next gen.

I know this is an unpopular opinion on this forum, obsessed as it is with ALU counts and bandwidth and technologies from outer space (gee, optical interconnect? Intel has a $600 SSD which doesn't suck, therefore in two years it will be economical to equip $199 Arcade720s with one?) - but having good tools from the get-go and helping developers preserve and reuse their experience in the next gen will give better returns than cranking up the tech.

Bring the duct tape!
 
Hey, that sure isn't unpopular with me. I don't care how it is achieved. I was simply under the impression that many of the problems were hardware limitations. Draw distance and LOD: I have yet to see a game where there is not serious pop-in. That would include successful developers, but I'm not sure what the board thinks of their technical prowess.

All the high tech sounds great, but if it really is tools and devs, then maybe the money is better spent there. People around here certainly have more knowledge than I do in this respect. As a little aside: am I the only one who finds 3-4 years for a proper sequel a tad long (particularly for large games, expansion packs for the previous releases and the like notwithstanding)?
 
I think Microsoft's and Sony's best chance for success is developing or finding a new GPU architecture that is very high in cost/performance, something that can outperform the PC for six months.

Given that NV and ATI already produce the very fastest GPUs they are capable of, with virtually no regard to power draw or heat output, that would suppose that Sony or MS have considerably greater expertise at designing GPUs. So I really don't see that happening.

Even ignoring the use of dual GPUs, which are bound to be prevalent by the time the next-gen consoles launch, no console has ever had a clear-cut lead over the fastest PCs on day one of its launch, let alone for six whole months. And given how power draw and heat output are increasing, it seems even less likely to happen now than in the past.
 
I think Microsoft's and Sony's best chance for success is developing or finding a new GPU architecture that is very high in cost/performance, something that can outperform the PC for six months.

That's like telling Skoda to design a racing car that's better than what Ferrari and Porsche can come up with.

Neither Microsoft nor Sony has any expertise in developing GPUs, whereas AMD and Nvidia have been creating them for years. The chance of Sony or Microsoft developing something better than AMD/Nvidia products is slim to none.

And since all GPUs will be based on either AMD or Nvidia products, you cannot really get something that can outperform a PC. The chances that Intel or somebody else will have a GPU architecture that is better than AMD/Nvidia's are very small.

(The goal of outperforming the PC for six months is crazy; PCs have the same kind of GPU technology available, except that their tech is not gimped by cheap buses and cheap memory, and has no regard for power use.)
 
Having good tools from the get-go and helping developers preserve and reuse their experience in the next-gen will give better returns than cranking up the tech.

Bring the duct tape!

Why not bring the tech and the tools, etc.? There's no reason not to. All the components for next-gen systems will be evolutionary, not revolutionary, so it is a more than plausible proposition.

Nothing yet suggests massive investment in new tech by MS and Sony is required. A lot suggests the dollars both entities have already spent should be able to go pretty far next round. I've yet to see any reason duct tape is needed at all, to be honest.
 
From Anandtech's Jasper article:

Remember that when it was released, the Xbox 360 had a bit more graphics power than a Radeon X800 XT; today's high end GPUs are around 4x the speed of that.

Microsoft doesn't want to replace the Xbox 360 with a new console until 2011 or 2012, meaning high end PCs will probably have more than six times the graphics horsepower of what's in the Xbox 360. It's possible that once this performance gap gets wide enough we'll see more developers take advantage of the raw horsepower available on PCs, which has traditionally been the case whenever a console got far into its lifespan.

I'm actually a bit surprised that we haven't seen more focus on delivering incredible visuals on PC games given the existing performance gap, but the Xbox 360 as a platform is attractive enough to keep developers primarily focused there.


Overall, I'd like to see:

[8-core Intel Sandy Bridge CPU]
[48-core Intel Larrabee]
[upper-midrange ATI DX12 GPU] [upper-midrange ATI DX12 GPU]

An Xbox 3 with a great deal of silicon muscle in 2012 could carry through the rest of the decade, rather than a more modest upgrade over the 360 in 2011 that would need to be replaced in 2017-2018.
 

Why would you need a powerful OoO CPU, a dedicated GPU AND a Larrabee? Cutting out Larrabee and maybe going six cores (or maybe even four) on the Sandy Bridge would look quite good to me. Heavy physics can probably be handled by the GPU.
 
If MS is to use Intel as its provider, I could see them going with a Fusion-like chip:
Some huge OoO cores (two could be enough), plus a bunch of Larrabee 1/2 cores.
~300 mm².
As much RAM as possible.
 
Regarding next gen, I think the console vendors should look back at the way things have panned out in the past.

The overriding aim of your CPU should be not to cripple games performance. If you're the lead platform it doesn't matter how bad it is, if you're not the lead platform it only matters that you can easily match lead platform performance with little or no difficult work required.

The aim of a GPU should be to wow gamers for the first year or two, to differentiate your system from what's gone before, to build up buzz and draw in core gamers who are prepared to spend a lot of money. After this, your GPU needs to get cheaper, fast, because most of the users you're now trying to attract don't care much about graphics (so don't lumber it with a big fat memory bus that will need lots of memory chips for the product's life). Also try to make the graphics chip not self-destruct the system.
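A rough sketch of why a fat memory bus keeps costing money for the product's life, assuming 32-bit-wide GDDR chips (the chip width is the assumption here):

# Minimum number of memory chips needed just to populate the bus,
# no matter how cheap each individual chip gets over time.
def min_chips(bus_width_bits, chip_width_bits=32):
    return bus_width_bits // chip_width_bits

print(min_chips(128))  # 4 chips minimum
print(min_chips(256))  # 8 chips minimum

The GPU die shrinks with every process node, but a 256-bit bus never drops below eight chips (at 32 bits each), so that part of the bill of materials doesn't scale down the same way.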

Of course, if you have a fancy new controller to hook people you can get away with using the same GPU that Babbage's Difference Engine used. Most people won't care, the rest will swear it can do Toy Story graphics (just at SD resolutions).

My probably wildly inaccurate predictions:

Xbox 3: very similar to this one. IBM CPU (something like 4-8 cores, maybe with OoOE this time as they have a longer development period), ATI GPU with embedded framebuffer, 2GB RAM, a few GB of flash memory for scratch disk, OS and game saves. 2011 or 2012, probably 2011 though.

PS4: a cheaper PS3 with more RAM and a fancier GPU. They don't want to get into a hardware pissing contest with MS this time, so they should go the year before MS with very cost-effective hardware. They should aim for 2010 just to be on the safe side, but may struggle to get a new machine out before 2011, in which case they might clash and end up in a hardware pissing contest anyway.

Wii 2: No idea, but hopefully something from PowerVR (because it'd be powerful and cheap)?
 
Is it even in Sony's best interest to release a slightly upgraded console?

IMO, it is probable that a new Xbox and a redesigned Wii will come out around 2011-2012.

They are probably not going to be in the financial situation to go into a spending war with Microsoft again, so why even bother?

IMO, they should be focusing efforts on a rebranding strategy to target the Wii audience at a mass-market price, and let Microsoft take the graphics-whore sector.

Perhaps a PS3 Slim with waggle controls.

Sony has many first- and second-party developers under its wing, e.g.:

1) Polyphony Digital (Gran Turismo 5)
2) Guerrilla Games (Killzone 2)
3) Quantic Dream (Heavy Rain)
4) Team Ico (secret title)
5) Level-5 (White Knight Chronicles)
6) Sony Santa Monica (God of War)
7) Sony Liverpool (Wipeout series)
8) Insomniac (Ratchet & Clank, Resistance)
9) Sucker Punch (Infamous)
10) Naughty Dog (Uncharted)
11) Zipper Interactive (SOCOM, MAG)
12) Evolution Studios (Motorstorm)
13) Media Molecule (LittleBigPlanet)
14) Clap Hanz (Hot Shots Golf)
15) SCE Japan Studio (Project Siren, LocoRoco, Patapon)
16) SCE Bend Studio (Syphon Filter)
17) SCE Foster City Studio (Lair)
18) SCE San Diego (sports games)
19) SCE Studio Cambridge/Ninja Theory (Heavenly Sword)
20) Many PSN developers, including Dark Star and Q Entertainment.

They should be using their army of talented developers to find new ways to target the Wii audience while at the same time serving as a cheap middle road for hardcore gamers. Franchises like Gran Turismo, God of War, Syphon Filter, etc. will still attract core gamers to the system, and they wouldn't alienate loyal fans who already bought the console at 1000 Aussie dollars by releasing a new console too early.

A Wii HD wouldn't be that much more powerful than the PS3, if at all, so current-gen tech wouldn't really be obsolete.

They should be focusing on software as the differentiating factor rather than hardware, IMO.
 
Nah, the crowd that just plays Wii Sports mostly won't go for the SCE games, not even LBP.

In fact, it's questionable that most people who have the Wii will automatically buy a new Nintendo console or anything at all in a couple of years.

They seem to be content to buy the Wii and a couple of games and keep playing them to death, especially in social gatherings.
 