Predict: The Next Generation Console Tech

Battlefield 3 minimum PC specs:

PC MINIMUM System Requirements
• Operating System: Windows 7 or Vista with Service Pack 2
(It will work on 32 or 64-bit Windows 7 and Vista)
• Processor: Intel Core2Duo 2.4 GHz / AMD Athlon X2 2.7 GHz OR Better
• RAM: 2 GB
• Video/Graphics Card: nVidia GeForce 8800 GT / ATI Radeon HD 3870 OR Better

GeForce 8800GTX launched November 2006 says hello. Battlefield 3 on minimum PC settings is at least a match for the console version.
 
eDRAM isn't directly comparable to GPU memory bandwidth like that. The PS3 handles many of the same games as the 360 with a similar amount of bandwidth to the X1900XTX, and that bandwidth has to be shared between both the GPU and the CPU.
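
For reference, here's a rough side-by-side of the commonly quoted bandwidth figures (numbers from public spec sheets as I remember them, so treat them as approximate):

```python
# Commonly quoted memory bandwidth figures, in GB/s (approximate).
bandwidth = {
    "X1900 XTX VRAM (dedicated to GPU)":      49.6,   # 256-bit GDDR3 @ 1.55 GHz effective
    "PS3 GDDR3 (RSX video memory)":           22.4,   # 128-bit @ 1.4 GHz effective
    "PS3 XDR (shared by Cell and RSX)":       25.6,
    "360 GDDR3 (unified, CPU + GPU)":         22.4,
    "360 eDRAM internal (ROPs <-> eDRAM)":   256.0,
    "360 GPU <-> eDRAM daughter-die bus":     32.0,
}
for name, gbs in bandwidth.items():
    print(f"{name:40s} {gbs:6.1f} GB/s")
```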

And while you're likely correct that the CPU would struggle with Crysis 2 at console settings, I don't see why the X1900XTX wouldn't be able to handle them. What is it about that card that you think is weaker than Xenos? The link below shows a 9800GT (maybe twice as fast as the X1900XTX) averaging 39 fps at 1680x1050 at console settings. I don't see why a 30 fps average wouldn't be possible at sub-720p with a reasonable amount of optimisation.

Anything using a lot of bandwidth for framebuffer reads/writes would likely crawl on an X1900 XTX. While you're right that the PS3 handles multiplatform titles pretty well nowadays, multiplatform titles don't make good use of the bandwidth advantage of the 360. The Halo games do. Also, I seem to remember that many multiplatform games have reduced resolution for alpha-blended effects on the PS3, due to lack of bandwidth. Like you said in a previous post, peak vertex shading throughput is also much lower on the X1900 XTX.

The architecture isn't the question. Speed/performance are, and it simply wasn't that much faster than the X1900XTX. As for the CPU, I was running Gears perfectly fine on a 2.4 GHz Conroe back in the day. That's maybe 30% faster than the top-end X2s of 2005, but I certainly had much more than 30% spare performance to lose from a CPU point of view (I had the game locked at 60 fps).

Conroe had 128-bit SIMD units, while the Athlon 64 X2 had 64-bit SIMD units. Also, Gears of War is a 2006 title, and while it looked good at the time, it is completely obsolete compared to what is running on the Xbox 360 nowadays. Moreover, it is probably not as CPU-intensive as other games. For instance, I believe a racing game like Forza 4 benefits hugely from VMX128, and I doubt you could run something like that on an Athlon 64 X2.
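
To put rough numbers on that SIMD difference (peak figures only, based on the commonly cited execution widths; real code never gets anywhere near peak):

```python
# Peak single-precision SSE throughput per core, per the commonly cited execution widths:
# Conroe can issue a 128-bit packed add and a 128-bit packed mul each cycle -> 8 flops/cycle.
# K8 splits 128-bit SSE ops into two 64-bit halves -> effectively 4 flops/cycle.
CONROE_FLOPS_PER_CYCLE, K8_FLOPS_PER_CYCLE = 8, 4
conroe_ghz, k8_ghz = 2.4, 2.7  # the clocks mentioned in the spec list above

print(f"Conroe @ {conroe_ghz} GHz: {CONROE_FLOPS_PER_CYCLE * conroe_ghz:.1f} SP GFLOPS/core peak")
print(f"K8     @ {k8_ghz} GHz: {K8_FLOPS_PER_CYCLE * k8_ghz:.1f} SP GFLOPS/core peak")
```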
 
Anything using a lot of bandwidth for framebuffer reads/writes would likely crawl on an X1900 XTX. While you're right that the PS3 handles multiplatform titles pretty well nowadays, multiplatform titles don't make good use of the bandwidth advantage of the 360. The Halo games do. Also, I seem to remember that many multiplatform games have reduced resolution for alpha-blended effects on the PS3, due to lack of bandwidth. Like you said in a previous post, peak vertex shading throughput is also much lower on the X1900 XTX.



Conroe had 128-bit SIMD units, while the Athlon 64 X2 had 64-bit SIMD units. Also, Gears of War is a 2006 title, and while it looked good at the time, it is completely obsolete compared to what is running on the Xbox 360 nowadays. Moreover, it is probably not as CPU-intensive as other games. For instance, I believe a racing game like Forza 4 benefits hugely from VMX128, and I doubt you could run something like that on an Athlon 64 X2.

I totally agree, that's why I asked the question: did you take into account the multi-core, dual-threaded architecture of the 360? I mean, even the Wii U multiplatform launch games struggle due to the extreme parallelization of the code running on PS3/Xbox 360 CPUs.

Anyway, my example of Gears of War was a bit extreme, talking about a launch title that didn't use every inch of the features and power offered by the Xbox 360 architecture. A better example would indeed be the latest Unreal Engine 3 games (Gears of War 3) / Frostbite 2 engine (Battlefield 3) / CryEngine games (Crysis 2/3) / in-house engines optimized for the 360 (Halo 4, Forza 4), or even sophisticated open-world engines à la GTA IV and the next GTA...

I bet all the games I mentioned would struggle to run correctly on 2005 PC hardware.

But my point is: the PS1 (December 1994) / PS2 (March 2000) / Xbox 360 (November 2005) miracles (running code way ahead of what PCs could handle at the time those consoles released) won't happen for next-gen consoles, unfortunately; that's for sure, and every comment from every developer confirms this.

We are even at this stage discussing the following: how much less underpowered will those future fall 2013 next-gen consoles be compared to PCs released two years earlier (December 2011)? The more favourably those 2013 consoles compare to a PC of December 2011, the better for games and gamers... but the days of consoles being ahead of PCs in graphics gaming technology are really behind us :cry:
 
It's got enough registers that for the right workload, where you can actually accommodate the memory latency, you could hit close to 100%

But that's not really indicative of real workloads.

In my experience, effective average latency is a better first order approximation to actual performance than peak flop throughput.

That's why dual core PC CPUs of the day (Athlon X2s and Core2 duos) killed the consoles in actual CPU performance despite having only a fraction of FP peak: Better cache systems, better prefetchers and, most importantly, OOO execution enabling many more memory requests to be in flight.
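
A crude way to see why in-flight misses matter more than peak flops (the latencies and miss counts below are illustrative guesses, not measurements):

```python
# Little's law applied to memory: sustainable bandwidth ~= (misses in flight * line size) / latency.
# The point is the ratio between a narrow in-order core and an OOO core with good prefetchers.

LINE_SIZE = 128  # bytes per cache line (128 B on Xenon/Cell)

def sustainable_bw_gbs(misses_in_flight, latency_ns):
    return misses_in_flight * LINE_SIZE / (latency_ns * 1e-9) / 1e9

# In-order console core: few misses in flight, long effective latency.
print(f"in-order, 2 misses in flight, ~160 ns:  {sustainable_bw_gbs(2, 160):5.1f} GB/s")
# Desktop OOO core with prefetchers: many misses in flight, shorter effective latency.
print(f"OOO,     10 misses in flight, ~60 ns:   {sustainable_bw_gbs(10, 60):5.1f} GB/s")
```

Scale the miss counts and latencies however you like; the order-of-magnitude gap is the point.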

I've always maintained that both the 360 CPU and Cell were focused on the wrong things, designed by marketing.
Paper flops look great on paper.

Cheers
 
I think a better example is Crysis 3. Crysis 3 requires a DX11 GPU, so even the best DX10/10.1 card won't be able to play it, whereas the old 360/PS3 will.

True, but that's literally the only game out of hundreds of cross-platform games, and it's not even out yet. And even then it's almost certainly limited to DX11 for ease of development rather than for performance reasons.

i.e., if they wanted to make a DX10 renderer for this game, Crytek could do it, and it would run better on an 8800GTX than it would on the PS3.
 
Anything using a lot of bandwidth for framebuffer reads/writes would likely crawl on an X1900 XTX.

Why? It has more bandwidth than Llano for example which can easily exceed console performance.

While you're right that the PS3 handles multiplatform titles pretty well nowadays, multiplatform titles don't make good use of the bandwidth advantage of the 360. The Halo games do.

Do you have evidence to support this? I find it hard to believe that Halo is making any better use of the memory system in the 360 than, say, Crysis 2 or Battlefield 3. Just because those games aren't exclusive to the 360 doesn't mean they aren't taking full advantage of its architecture.

Also, I seem to remember that many multiplatform games have reduced resolution for alpha-blended effects on the PS3, due to lack of bandwidth.

The X1900XTX, though, has over double the graphics memory bandwidth of RSX if you look at actual graphics memory. And as I stated above, Llano, with less bandwidth, is able to take on all the effects of 360 ports and more.

Like you said in a previous post, peak vertex shading throughput is also much lower on the X1900 XTX.

But higher than RSX, as I said, and RSX still copes fine. RSX + Cell, though, and the actual real-world impact of the lower peak vertex shading throughput compared with Xenos - that's an open question.

Don't forget the overall vertex shader throughput of R580 may only be 1/4 of Xenos, but Xenos must also do pixel shading with those same resources. So unless the 360 is allocating more than 25% of its shader resources to vertex shading often enough to have a major effect on framerate, the fixed nature of the shaders in R580 wouldn't be much of a disadvantage. Maybe it is though; that's something I have no idea about, and I'm sure there are others here who could advise on that one.
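
Here's a toy model of that trade-off, using only the unit counts (48 pixel + 8 vertex for R580 vs 48 unified for Xenos) and ignoring clocks, ALU widths and scheduling efficiency entirely:

```python
# Toy model of fixed-split (R580) vs unified (Xenos) shader allocation.
# The only thing it models is how a fixed split behaves as the vertex share of work changes.

R580_PS, R580_VS, XENOS_UNIFIED = 48, 8, 48

def r580_effective(vertex_share):
    """'Units worth' of work R580 can sustain when vertex_share of the work is vertex shading."""
    pixel_share = 1.0 - vertex_share
    limits = []
    if vertex_share > 0:
        limits.append(R580_VS / vertex_share)   # vertex units become the bottleneck
    if pixel_share > 0:
        limits.append(R580_PS / pixel_share)    # pixel units become the bottleneck
    return min(limits)

for share in (0.05, 0.15, 0.25, 0.50):
    print(f"vertex share {share:4.0%}:  R580 ~{r580_effective(share):5.1f} units busy, "
          f"Xenos {XENOS_UNIFIED} units busy")
```

Past roughly a 1/7 vertex share the fixed split starts to lose; below that it's actually slightly ahead, so the question is really how often games sit on the wrong side of that line.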

Conroe had 128-bit SIMD units, while the Athlon 64 X2 had 64-bit SIMD units. Also, Gears of War is a 2006 title, and while it looked good at the time, it is completely obsolete compared to what is running on the Xbox 360 nowadays. Moreover, it is probably not as CPU-intensive as other games. For instance, I believe a racing game like Forza 4 benefits hugely from VMX128, and I doubt you could run something like that on an Athlon 64 X2.

I guess we'll never know on that one, but as has been said many times before (even in this thread), SIMD isn't everything when it comes to games. If it were, we should be seeing around double the performance of Conroe vs the Athlon X2 at the same clock speeds, and clearly that is never the case.
 
But my point is: the PS1 (December 1994) / PS2 (March 2000) / Xbox 360 (November 2005) miracles (running code way ahead of what PCs could handle at the time those consoles released) won't happen for next-gen consoles, unfortunately; that's for sure, and every comment from every developer confirms this.

Still, I am not sure this actually matters. Even if the performance of the next-generation consoles is below current high-end PCs, it will still be way ahead of the PCs - especially laptops - most people have. Considering developers are currently targeting midrange PCs and 6/7-year-old consoles and adding some eye candy for high-end PCs, I still expect a pretty big jump for the next generation. While it's pretty easy to scale graphics settings and make the code run on a wide range of performance targets, I still believe it's nearly impossible to do that with respect to what is running on the CPUs, such as AI or physics. That is exactly why I hope MS and Sony do not cheap out on the CPUs in their next-generation systems.
 
Why? It has more bandwidth than Llano for example which can easily exceed console performance.

Modern GPUs make much more efficient use of bandwidth than the X1900 XTX.

Do you have evidence to support this? I find it hard to believe that Halo is making any better use of the memory system in the 360 than, say, Crysis 2 or Battlefield 3. Just because those games aren't exclusive to the 360 doesn't mean they aren't taking full advantage of its architecture.

I have no evidence other than the fact that Halo games use a lot of transparencies for explosions and particle effects. On the other hand, do you have any evidence that an Athlon 64 X2 and a Radeon X1900 XTX could run Crysis 2 or Battlefield 3? :)

Don't forget the overall vertex shader throughput of R580 may only be 1/4 of Xenos, but Xenos must also do pixel shading with those same resources. So unless the 360 is allocating more than 25% of its shader resources to vertex shading often enough to have a major effect on framerate, the fixed nature of the shaders in R580 wouldn't be much of a disadvantage. Maybe it is though; that's something I have no idea about, and I'm sure there are others here who could advise on that one.

The whole point of the unified architecture was that gaming workloads tend to shift frequently between vertex-heavy and pixel-heavy situations. For instance during post-processing vertex shaders remain completely idle. I also seem to remember that Xenos had more flexible shader units than the X1900 XTX vertex shaders, especially when dealing with branches. So it's not just a matter of raw throughput.

I guess we'll never know on that one, but as has been said many times before (even in this thread), SIMD isn't everything when it comes to games. If it were, we should be seeing around double the performance of Conroe vs the Athlon X2 at the same clock speeds, and clearly that is never the case.

I agree with you on the fact that an Athlon 64 X2 can probably match Xenon performance in many games. What I am arguing is that there are certain gaming workloads in which Xenon is going to perform better. Moreover, even if the Athlon 64 X2 could sustain a higher average framerate than Xenon in a game, there might be scenarios in which the framerate would drop due to lack of SIMD performance. Admittedly, the opposite might also be true in other scenarios.

We could probably argue forever about this. The point is that the Xbox 360 was pretty much bleeding edge when released.
 
So PSM3 in the UK are going with a quad-core APU plus a GPU in the 7970 class and 4 GB of RAM for the PS4. Looks pretty sweet if true.
 
So PSM3 in the UK are going with a quad-core APU plus a GPU in the 7970 class and 4 GB of RAM for the PS4. Looks pretty sweet if true.
If it's true you can bet it'll be $599 again.
(not that I have any problem with this possibility)
 
Modern GPUs make much more efficient use of bandwidth than the X1900 XTX.

Llano has 25.6 GB/s with DDR3-1600, and that has to be shared between the CPU and the GPU. With this it can exceed console performance.

Do you think modern GPUs are so much more efficient at using memory bandwidth that R580, with twice that raw bandwidth dedicated to the GPU alone, would be unable to match console performance due to bandwidth limitations? That seems quite a stretch to me.
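
For what it's worth, the raw figures being compared here (standard published numbers, so approximate):

```python
# Dual-channel DDR3-1600 on Llano: 2 channels * 8 bytes * 1.6 GT/s
llano_shared_gbs = 2 * 8 * 1.6            # = 25.6 GB/s, shared between CPU and GPU
# X1900 XTX: 256-bit GDDR3 at 1.55 GHz effective, for the GPU alone
x1900xtx_vram_gbs = (256 / 8) * 1.55      # = 49.6 GB/s

print(f"Llano (shared):    {llano_shared_gbs:.1f} GB/s")
print(f"X1900 XTX (VRAM):  {x1900xtx_vram_gbs:.1f} GB/s")
```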

Anyway we've probably gone a bit too far off topic now.
 
GeForce 8800GTX launched November 2006 says hello. Battlefield 3 on minimum PC settings is at least a match for the console version.
I was just giving the release history for Battlefield 3's minimum-spec GPU; that is the 8800 GT.

http://en.wikipedia.org/wiki/GeForce_8_Series#8800_GT

The 8800 GT, codenamed G92, was released on 29 October 2007. The card is the first to transition to 65 nm process, and supports PCI-Express 2.0.[14]

EDIT: Another line mentions the GPU you are talking about.

Performance benchmarks at stock speeds place it above the 8800 GTS (640 MB and 320 MB versions) and slightly below the 8800 GTX
 
If it's true you can bet it'll be $599 again.
(not that I have any problem with this possibility)
Doubt they would go that high again, but how much would this stuff cost? I doubt a 7970 would be anywhere near the retail price for Sony, maybe £50~£100 per unit?

It's doable for much less with a profit.

If we are talking dollars, maybe $500?
 
In my experience, effective average latency is a better first order approximation to actual performance than peak flop throughput.

That's why dual core PC CPUs of the day (Athlon X2s and Core2 duos) killed the consoles in actual CPU performance despite having only a fraction of FP peak: Better cache systems, better prefetchers and, most importantly, OOO execution enabling many more memory requests to be in flight.


Paper flops look great on paper.

Cheers

Those raw GFLOPS, I think, are the very reason why the 360 still has relevance in the general gaming world. Without the VMX128 units, it would be a far more limited system in comparison to modern PCs, especially since Xenon has to process sound and games have become ridiculously physics-driven. The Cell BE, coming a year later, was itself a juggernaut (but a paper tiger). It didn't amount to being substantial enough to change the landscape, since it ended up being complicated and a crutch for the RSX, and multiplatform development became the norm.

Looking at the first quad-core CPUs, which were hugely expensive, Xenon generally still has 3/4 of the GFLOPS. In everything else it would be slaughtered, true indeed, but in a console it had what it needed to stay within a realistically competitive realm of performance for running PC ports. This was especially true when newer PC titles were still listing high-end single cores as the minimum required CPU, where Xenon could have 6x the peak GFLOPS, and dual cores, the general norm for gaming systems, sat at around 2/3 of it.

Considering how many of us still have quad-core Athlon IIs, and how many have i3s and older dual-core i5s, Xenon still has quite a measure of relevance, and it's not surprising that it can run something like BF3, GTA IV, etc. I would really like to see how Xenon is utilized for the various processes in both of those games.
 
Doubt they would go that high again, but how much would this stuff cost? I doubt a 7970 would be anywhere near the retail price for Sony, maybe £50~£100 per unit?

It's doable for much less with a profit.

If we are talking dollars, maybe $500?
It's a big chip with a 250W TDP, so everything would get expensive and very difficult to cool and keep quiet; it could be over 350W for the whole console.
 
Just curious.

We have the 7970M outputting 29 GFLOPS per watt of TDP, and the upcoming Sea Islands HD 8850/8870 are rumored to output 23-25 GFLOPS per watt.

If the next Xbox isn't going to use an off-the-shelf PC GPU, what kind of GFLOPS-per-watt numbers should we expect? Over 30?
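
For reference, the 29 figure falls straight out of the usual published 7970M specs, assuming the ~75 W TDP number that gives that ratio:

```python
# Radeon HD 7970M, commonly quoted specs: 1280 stream processors @ 850 MHz, ~75 W TDP.
shaders, clock_ghz, tdp_w = 1280, 0.850, 75

peak_gflops = shaders * 2 * clock_ghz     # 2 flops per SP per clock (multiply-add)
print(f"peak: {peak_gflops:.0f} GFLOPS  ->  ~{peak_gflops / tdp_w:.0f} GFLOPS per watt of TDP")
```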
 