Predict: The Next Generation Console Tech

Is this a joke? :rolleyes:
Well, prepare yourself to be even more disappointed, because I expect to get fewer and fewer 60fps games.

Well, next gen consoles should be much more powerful, and high quality (hardware taxing) assets aren't getting cheaper at the same rate. So perhaps 1080p60 won't be all that uncommon, especially in first gen titles.
 
I don't see a problem with using 1080p resolution. The first next-gen hardware won't come until at least the end of 2011. By that time, we will easily have 5 TFLOPS GPUs on a 28nm manufacturing process.
That is 25x the performance of Xenos, and those GPUs will also be more efficient at delivering that kind of performance. We won't have realtime GI next generation, but we will be very close to CG graphics in terms of shader complexity, shadowing, texturing quality and polycounts. I also expect an excellent physics implementation, maybe even some nice fluid dynamics.
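Just as a sanity check on that ratio (a rough sketch, assuming the commonly quoted 240 GFLOPS peak rating for Xenos; the 5 TFLOPS part is of course hypothetical):

[code]
# Back-of-the-envelope ratio for a hypothetical 2011-era 5 TFLOPS GPU vs Xenos.
# The 240 GFLOPS Xenos rating is the commonly quoted peak figure; the 5 TFLOPS
# target is the assumption made in the post above, not an announced part.
xenos_gflops = 240.0
hypothetical_2011_gflops = 5000.0

ratio = hypothetical_2011_gflops / xenos_gflops
print(f"Peak-FLOPS ratio: ~{ratio:.0f}x")   # ~21x on paper
[/code]

On that rating it comes out closer to 20x than 25x, though peak FLOPS is a crude yardstick either way.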
 
Well, next gen consoles should be much more powerful, and high quality (hardware taxing) assets aren't getting cheaper at the same rate. So perhaps 1080p60 won't be all that uncommon, especially in first gen titles.
If we take a 10x increase in performance, and have to divide that by 4 to get a stereoscopic 60 fps update, we've only got an extra 2.5x increase in performance to render more and better looking stuff. That's actually very poor, like PS2 to XB or GC to Wii. A truckload more RAM would help with texture quality, but we'll have very limited progress in terms of lighting, shading, model detail, animation, etc. We won't be anywhere near the EA Madden Next-gen trailer of 2005.
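Just to spell out that budget arithmetic (a minimal sketch; the 10x jump and the full 2x costs for stereo and for 60 fps are the assumptions being made here):

[code]
# Rough frame-budget arithmetic behind the "divide by 4" argument.
# Assumptions: a 10x raw performance jump, stereo costing a full 2x
# (one complete render per eye), and 60 fps costing 2x over 30 fps.
raw_gain       = 10.0
stereo_cost    = 2.0   # two full views per frame
framerate_cost = 2.0   # 60 fps instead of 30 fps

leftover = raw_gain / (stereo_cost * framerate_cost)
print(f"Left over for better-looking pixels: {leftover:.1f}x")   # 2.5x
[/code]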
 
Twas the same argument before the 360 & PS3 were released about 720p being standard, but what we got was something between SDTV & 720p. Plus the realization that most people playing games don't even have an HDTV. I see the same thing happening next gen. 720p will probably be the sweet spot with 1080p getting a little more action.

Tommy McClain

I see 1080p as likely, but I think we'll have the same situation as this gen, with games often being rendered at ~10-20% fewer pixels than the 1080p target. For example, just pulling a number out of the air, 1740x1000.

In fact this may well be the most graphically effective way to do things, no matter what the standard resolution is. Aim a little below it to squeeze more out of your GPU, while not taking too large an IQ hit from upscaling by aiming too far below.
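For what it's worth, the pixel math on that made-up 1740x1000 example versus full 1080p:

[code]
# Pixel-count comparison for sub-native rendering. 1740x1000 is just the
# illustrative figure from the post above, not a shipping game's resolution.
native     = 1920 * 1080   # 2,073,600 pixels
sub_native = 1740 * 1000   # 1,740,000 pixels

saved = 1.0 - sub_native / native
print(f"Pixels rendered: {sub_native / native:.0%} of 1080p "
      f"(~{saved:.0%} saved before upscaling)")   # ~84%, ~16% saved
[/code]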
 
What about 720p/60fps + an upscaler to 1080p to make 3D less demanding?

And are you sure that next gen will be so centered on 3D?
 
Well, the 3D input doesn't need a 3D end. Some stuff, like the building blocks example for PS3, would definitely benefit from a 3D display, as positioning stuff like that is a bit awkward, although good contact-shadow tech can help. Whereas other tech demos like throwing the...whatever-they-were at Milo, need the 3D input to track the object's virtual trajectory for the target to interact with it. Take 3D in a sport like baseball, and it'll be a huge plus to be able to see the ball moving towards you in 3D regardless of whether you swing a bat in 3D space to hit it, or time a button press.

Wii proves the point really. You don't need 3D out to benefit from 3D in.
 
I think people are still thinking at today's standards when they consider the capabilities of next generation consoles. I assume the industry is aiming at 2012 for the launch of these consoles. That is exactly 3 years ahead of today. Besides, the true next generation games targeting new consoles will come 1 to 2 years after the first consoles are sold (2013 or 2014). Look at the examples of Gears of War and BioShock with respect to the Xbox 360's launch. Lots of things will change until then which may make the increased resolutions easier to implement.

Regardless of that, I don't think there will be strict minimum requirements in terms of resolution and framerate imposed on developers. Developers will continue to have the freedom to choose the rendering resolutions they think fit their specific needs. And I don't think today's sweet-spot resolution is anything below 720p. Most modern games are rendered at this resolution, which proves that the sweet spot is 720p today, even though there are many examples which target lower resolutions as trade-offs. So 3 to 4 years from now that sweet spot might shift to 1080p.
 
If we take a 10x increase in performance, and have to divide that by 4 to get a stereoscopic 60 fps update, we've only got an extra 2.5x increase in performance to render more and better looking stuff. That's actually very poor, like PS2 to XB or GC to Wii. A truckload more RAM would help with texture quality, but we'll have very limited progress in terms of lighting, shading, model detail, animation, etc. We won't be anywhere near the EA Madden Next-gen trailer of 2005.

Well I was counting 30fps stereoscopic as 60fps, since 60 frames are still being rendered each second. For non-3D stuff (if there still is such a thing) 60fps should not be an uncommon sight.

And is rendering the scene from two slightly different viewpoints really 2x more expensive? Is there no clever optimization to lessen the hit, considering the two frames would be nearly identical?
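One crude way to frame it (the 40/60 split between view-independent and per-eye work below is purely an illustrative assumption, just to show why the hit can land well under 2x):

[code]
# Toy cost model for stereo rendering. Some per-frame work is view-independent
# (simulation, animation, culling, shadow maps) and can be shared by both eyes;
# only the view-dependent part has to be paid twice. The 40/60 split is an
# illustrative assumption, not measured data.
shared_work  = 0.40   # fraction of a mono frame both eyes can reuse
per_eye_work = 0.60   # fraction that must be redone for the second view

stereo_cost = shared_work + 2 * per_eye_work
print(f"Stereo frame cost vs mono: {stereo_cost:.2f}x")   # 1.60x, not 2x
[/code]

Reprojection tricks that synthesise one eye's image from the other can cut the cost further still, at some image-quality risk.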

One more thing; the next gen will only be 10x more powerful? The 5870 is already like >60x Xenos in raw shader power, no? 2011 should bring us such power in a reasonably small package.
 
One more thing; the next gen will only be 10x more powerful? The 5870 is already like >60x Xenos in raw shader power, no? 2011 should bring us such power in a reasonably small package.

That kind of growth, where last year's high end becomes this year's middle ground and then next year's low end, stopped long, long ago. Try to buy a $50-60 video card and weep.
 
If we take a 10x increase in performance, and have to divide that by 4 to get a stereoscopic 60 fps update, we've only got an extra 2.5x increase in performance to render more and better looking stuff. That's actually very poor, like PS2 to XB or GC to Wii. A truckload more RAM would help with texture quality, but we'll have very limited progress in terms of lighting, shading, model detail, animation, etc. We won't be anywhere near the EA Madden Next-gen trailer of 2005.

Why divide by 4? If we go by what John Carmack said, dividing by 3 sounds about right if we work backwards from his Rage -> Doom comments with regard to onscreen action and performance. Maybe there's something I'm missing here.

Also, wouldn't shaders be more efficient following a compute model, in the same way that the Xenos shader units are more efficient than RSX's non-unified design? If early DX11 games are maybe 15% more efficient at rendering the same scene, would that not translate to higher performance on screen with the same hardware? Say, comparing DX10 and DX11 hardware with the same raw performance.

Overall, even if realisable performance is only 3x greater than current generation hardware, I don't see how anyone aside from the technophiles will be upset. However, if games can push 3D without duplicating frames, then that would be the best solution going forward for performance, cost and flexibility.
 
One more thing; the next gen will only be 10x more powerful? The 5870 is already like >60x Xenos in raw shader power, no? 2011 should bring us such power in a reasonably small package.

60 times when defined by what metric exactly?

I always laugh when I see figures like this which, more often than not, are pulled straight out of thin air...
 
60 times when defined by what metric exactly?

I always laugh when I see figures like this which, more often than not, are pulled straight out of thin air...

It has to be pulled out of thin air, as Xenos is rated at 240 GFLOPS and RV870 has 2.4 and 2.7 TFLOP SKUs. So what's that? 10-11 times? But I doubt that in real world applications you'd see a 10x difference in performance even if you were comparing both as console parts.
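For reference, here's roughly where those peak numbers come from (usual published clocks and ALU counts, peak MADD throughput only, so treat them as paper figures):

[code]
# Where the commonly quoted peak-FLOPS figures come from (paper numbers only).
# Xenos: 48 unified shader ALUs, each vec4+scalar MADD = 10 flops/clock, 500 MHz.
# HD 5870 (Cypress): 1600 stream processors, 2 flops/clock (MADD), 850 MHz.
xenos_gflops   = 48 * 10 * 0.500    # 240 GFLOPS
cypress_gflops = 1600 * 2 * 0.850   # 2720 GFLOPS, i.e. ~2.7 TFLOPS

print(f"Xenos:   {xenos_gflops:.0f} GFLOPS")
print(f"Cypress: {cypress_gflops:.0f} GFLOPS")
print(f"Ratio:   ~{cypress_gflops / xenos_gflops:.1f}x")   # ~11.3x
[/code]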
 
It has to be pulled out of thin air, as Xenos is rated at 240 GFLOPS and RV870 has 2.4 and 2.7 TFLOP SKUs. So what's that? 10-11 times? But I doubt that in real world applications you'd see a 10x difference in performance even if you were comparing both as console parts.

Running both Xenos and the HD 5870 at today's PC resolutions should show quite a disparity in performance figures.
Getting 1080p out of today's performance cards isn't an issue, and getting 1080p out of cards 3 generations away shouldn't be an issue either.
 
Running both Xenos and the HD 5870 at today's PC resolutions should show quite a disparity in performance figures.
Getting 1080p out of today's performance cards isn't an issue, and getting 1080p out of cards 3 generations away shouldn't be an issue either.

I doubt it will, but you're not comparing top-end PC cards 3 years from now; you're comparing GPUs that have to fit within a very restricted (by comparison) heat and price envelope.
 
That kind of growth, where last year's high end becomes this year's middle ground and then next year's low end, stopped long, long ago. Try to buy a $50-60 video card and weep.

It still holds mostly true for the middle ground, which is where a console GPU would fall.

It has to be pulled out of thin air, as Xenos is rated at 240 GFLOPS and RV870 has 2.4 and 2.7 TFLOP SKUs. So what's that? 10-11 times? But I doubt that in real world applications you'd see a 10x difference in performance even if you were comparing both as console parts.

Hmm, I did not pull the figures out of thin air, but my maths for Xenos were off by a factor of 10 :cry:. But why wouldn't Cypress show at least a 10x improvement over Xenos, given the difference in raw compute power? I thought it was also more efficient.
 
Hmm, I did not pull the figures out of thin air, but my maths for Xenos were off by a factor of 10 :cry:. But why wouldn't Cypress show at least a 10x improvement over Xenos, given the difference in raw compute power? I thought it was also more efficient.

Perhaps it is more efficient on a per-unit basis, but the overall architecture has scaling issues which make it harder to extract the full performance delta. The most obvious inefficiency is that Cypress has 10x the compute but obviously doesn't have 10x the memory bandwidth, for example. Keeping those fat stream-processor pipes fed is a massive undertaking in itself. This is the reason why I believe that some form of on-die framebuffer would not go astray for the next generation of consoles.
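The bandwidth side of that is easy to put rough numbers on (published peak figures, main-memory bandwidth only; Xenos's eDRAM daughter die, which is exactly the kind of on-die buffer being suggested, is left out):

[code]
# Compute vs. bandwidth scaling, Xenos -> Cypress (published peak figures,
# main-memory bandwidth only; Xenos's 10 MB eDRAM daughter die is ignored).
xenos_gflops,   xenos_bw   = 240.0,  22.4    # GB/s, 128-bit GDDR3 @ 700 MHz
cypress_gflops, cypress_bw = 2720.0, 153.6   # GB/s, 256-bit GDDR5 @ 4.8 Gbps

print(f"Compute ratio:   ~{cypress_gflops / xenos_gflops:.1f}x")   # ~11.3x
print(f"Bandwidth ratio: ~{cypress_bw / xenos_bw:.1f}x")           # ~6.9x
print(f"FLOPS per byte of bandwidth: Xenos ~{xenos_gflops / xenos_bw:.1f}, "
      f"Cypress ~{cypress_gflops / cypress_bw:.1f}")               # ~10.7 vs ~17.7
[/code]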
 