Predict: The Next Generation Console Tech

Discussion in 'Console Technology' started by Acert93, Jun 12, 2006.

Thread Status:
Not open for further replies.
  1. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    The GeForce 8800 GTX, launched November 2006, says hello. Battlefield 3 on minimum PC settings is at least a match for the console version.
     
  2. Hornet

    Newcomer

    Joined:
    Nov 28, 2009
    Messages:
    120
    Likes Received:
    0
    Location:
    Italy
    Anything using a lot of bandwidth for framebuffer reads/writes would likely crawl on an X1900 XTX. While you're right that the PS3 handles multiplatform titles pretty well nowadays, multiplatform titles don't make good use of the 360's bandwidth advantage. The Halo games do. Also, I seem to remember that many multiplatform games render alpha-blended effects at reduced resolution on the PS3, due to lack of bandwidth. Like you said in a previous post, peak vertex shading throughput is also much lower on the X1900 XTX.

    Conroe had 128-bit SIMD units, while the Athlon 64 X2 had 64-bit SIMD units. Also, Gears of War is a 2006 title, and while it looked good at the time, it is completely obsolete compared to what is running on the Xbox 360 nowadays. Moreover, it is probably not as CPU-intensive as other games. For instance, I believe a racing game like Forza 4 benefits hugely from VMX128, and I doubt you could run something like that on an Athlon 64 X2.
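The width difference above translates directly into peak throughput. A minimal sketch of the idealized arithmetic, assuming one single-precision SIMD op issued per cycle and equal clocks (illustrative numbers, not measured benchmarks):

```python
# Idealized peak-throughput arithmetic for the SIMD-width comparison above.
# Assumes one single-precision SIMD op retired per cycle and equal clocks;
# these are illustrative assumptions, not measured figures.

simd_width_bits = {"Conroe": 128, "Athlon 64 X2": 64}
clock_ghz = 2.4  # assumed common clock for the comparison

for cpu, bits in simd_width_bits.items():
    lanes = bits // 32              # 32-bit floats per SIMD instruction
    gflops = lanes * clock_ghz      # one op per lane per cycle
    print(f"{cpu}: {lanes} lanes -> {gflops:.1f} GFLOP/s per core (peak)")
```

All else being equal, the 128-bit datapath doubles peak SIMD throughput per clock; real games, of course, rarely show anywhere near that gap.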
     
    #16402 Hornet, Dec 14, 2012
    Last edited by a moderator: Dec 14, 2012
  3. Svensk Viking

    Regular

    Joined:
    Oct 11, 2009
    Messages:
    627
    Likes Received:
    208
    I think a better example is Crysis 3. Crysis 3 requires a DX11 GPU, so even the best DX10/10.1 card won't be able to play it, whereas the old 360/PS3 will.
     
  4. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0
    I totally agree, that's why I asked the question: did you take into account the multi-core, dual-threaded architecture of the 360? I mean, even the Wii U multiplatform launch games struggle due to the extreme parallelization of the code running on PS3/Xbox 360 CPUs.

    Anyway, my example of Gears of War was a bit extreme, being a launch title that didn't use every inch of the features and power offered by the Xbox 360 architecture. A better example would indeed be the latest Unreal Engine 3 games (Gears of War 3), Frostbite 2 (Battlefield 3), CryEngine (Crysis 2/3), in-house engines optimized for the 360 (Halo 4, Forza 4), or even sophisticated open-world engines à la GTA IV and the next GTA...

    I bet all these games I mentioned would struggle to run correctly on 2005 PC hardware.

    But my point is: the PS1 (September 1994) / PS2 (March 2000) / Xbox 360 (November 2005) miracles (running code way ahead of what PCs could handle at the time those consoles were released) unfortunately won't happen for the next-gen consoles, that's for sure, and every comment from every developer confirms this.

    We are even, at this stage, discussing the following: how much less powerful will those future fall 2013 next-gen consoles be compared to PCs released two years earlier (December 2011)? The more favourably those 2013 consoles compare to a PC of December 2011, the better for games and gamers... but the days of consoles being ahead of PCs in gaming graphics technology are truly behind us :cry:
     
  5. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,661
    Likes Received:
    1,114
    In my experience, effective average latency is a better first order approximation to actual performance than peak flop throughput.

    That's why dual core PC CPUs of the day (Athlon X2s and Core2 duos) killed the consoles in actual CPU performance despite having only a fraction of FP peak: Better cache systems, better prefetchers and, most importantly, OOO execution enabling many more memory requests to be in flight.

    Paper flops look great on paper.
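The effect described above - dependent memory accesses serializing on latency, while independent accesses can be overlapped by an out-of-order core - can be sketched with a toy pointer-chase. This is purely an illustration of the principle (and CPython interpreter overhead mutes the cache effects a compiled language would show clearly):

```python
import random
import time

N = 1 << 20  # ~1M "nodes"

# Build one big random cycle: each slot stores the index of the next node.
# Chasing it is latency-bound - every load depends on the previous result,
# so no amount of peak throughput or prefetching hides the memory latency.
order = list(range(N))
random.shuffle(order)
next_idx = [0] * N
for a, b in zip(order, order[1:] + order[:1]):
    next_idx[a] = b

def chase(start, steps):
    """Serial dependent loads: one access must finish before the next starts."""
    i = start
    for _ in range(steps):
        i = next_idx[i]
    return i

t0 = time.perf_counter()
end = chase(0, N)                 # latency-bound: dependent accesses
t1 = time.perf_counter()
total = sum(next_idx)             # throughput-bound: independent accesses
t2 = time.perf_counter()
print(f"dependent chase: {t1 - t0:.3f}s, independent sum: {t2 - t1:.3f}s")
```

Since the table is a single cycle of length N, chasing N steps returns to the start. An OOO core with good prefetchers keeps many of the streaming accesses in flight at once, while the chase stays stuck at full memory latency - which is the point being made above.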

    Cheers
     
  6. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    True, but that's literally the only game out of hundreds of cross-platform titles, and it's not even out yet. And even then, it's almost certainly limited to DX11 for ease of development rather than for performance reasons.

    i.e., if they wanted to make a DX10 renderer for this game, Crytek could do it, and it would run better on an 8800 GTX than it would on the PS3.
     
    #16406 pjbliverpool, Dec 14, 2012
    Last edited by a moderator: Dec 14, 2012
  7. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    Why? It has more bandwidth than Llano for example which can easily exceed console performance.

    Do you have evidence to support this? I find it hard to believe that Halo is making any better use of the 360's memory system than, say, Crysis 2 or Battlefield 3. Just because those games aren't exclusive to the 360 doesn't mean they aren't taking full advantage of its architecture.

    Meanwhile, the X1900 XTX has over double the graphics memory bandwidth of RSX, if you look at actual graphics memory. And as I stated above, Llano, with less bandwidth, is able to take on all the effects of 360 ports and more.

    But it's higher than RSX's, as I said, which still copes fine. RSX + Cell, though, and the actual real-world impact of the lower peak vertex shading throughput compared to Xenos - that's an open question.

    Don't forget the overall vertex shader throughput of R580 may only be 1/4 of Xenos's, but Xenos must also do pixel shading with those same resources. So unless the 360 allocates more than 25% of its shader resources to vertex shading often enough to have a major effect on framerate, the fixed nature of the shaders in R580 wouldn't be much of a disadvantage. Maybe it is though; that's something I have no idea about, and I'm sure there are others here who could advise on that one.
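That 25% threshold can be made concrete with a toy model: give both designs the same ALU budget, split one 25/75 between vertex and pixel work, let the other rebalance freely, and see which stage paces the frame. All numbers here are illustrative assumptions, not real hardware figures:

```python
# Toy model of fixed vs unified shader allocation. ALU counts and workload
# splits are illustrative assumptions, not real hardware figures.

TOTAL_ALUS = 48
FIXED_VERTEX, FIXED_PIXEL = 12, 36   # a hard 25/75 split of the same budget

def fixed_frame_time(vertex_work, pixel_work):
    """Stages run concurrently; the more heavily loaded stage sets the pace."""
    return max(vertex_work / FIXED_VERTEX, pixel_work / FIXED_PIXEL)

def unified_frame_time(vertex_work, pixel_work):
    """Ideal dynamic balancing: ALUs split so both stages finish together."""
    return (vertex_work + pixel_work) / TOTAL_ALUS

for vertex_share in (0.10, 0.25, 0.50):
    v, p = vertex_share * 100, (1 - vertex_share) * 100
    print(f"vertex share {vertex_share:.0%}: "
          f"fixed {fixed_frame_time(v, p):.2f}, "
          f"unified {unified_frame_time(v, p):.2f}")
```

At exactly a 25% vertex share the two designs tie; the unified pool only pulls ahead when the workload drifts away from the fixed split - which is precisely the question of how often the 360's workloads actually do.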

    I guess we'll never know on that one, but as has been said many times before (even in this thread), SIMD isn't everything when it comes to games. If it were, we should be seeing around double the performance of Conroe vs the Athlon X2 at the same clock speed, and clearly that is never the case.
     
  8. Hornet

    Newcomer

    Joined:
    Nov 28, 2009
    Messages:
    120
    Likes Received:
    0
    Location:
    Italy
    Still, I am not sure this actually matters. Even if the performance of the next-generation consoles is below current high-end PCs, it will still be way ahead of the PCs - especially laptops - most people have. Considering developers are currently targeting midrange PCs and 6-7 year-old consoles and adding some eye candy for high-end PCs, I still expect a pretty big jump for the next generation. While it's pretty easy to scale graphics settings and make the code run on a wide range of performance targets, I still believe it's nearly impossible to do that for what runs on the CPU, such as AI or physics. That is exactly why I hope MS and Sony do not cheap out on the CPUs in their next-generation systems.
     
  9. Hornet

    Newcomer

    Joined:
    Nov 28, 2009
    Messages:
    120
    Likes Received:
    0
    Location:
    Italy
    Modern GPUs make much more efficient use of bandwidth than the X1900 XTX.

    I have no evidence other than the fact that Halo games use a lot of transparencies for explosions and particle effects. On the other hand, do you have any evidence that an Athlon 64 X2 and a Radeon X1900 XTX could run Crysis 2 or Battlefield 3? :)

    The whole point of the unified architecture was that gaming workloads tend to shift frequently between vertex-heavy and pixel-heavy situations. For instance, during post-processing, vertex shaders remain completely idle. I also seem to remember that Xenos had more flexible shader units than the X1900 XTX's vertex shaders, especially when dealing with branches. So it's not just a matter of raw throughput.

    I agree with you on the fact that an Athlon 64 X2 can probably match Xenon performance in many games. What I am arguing is that there are certain gaming workloads in which Xenon is going to perform better. Moreover, even if the Athlon 64 X2 could sustain a higher average framerate than Xenon in a game, there might be scenarios in which the framerate would drop due to lack of SIMD performance. Admittedly, the opposite might also be true in other scenarios.

    We could probably argue forever about this. The point is that the Xbox 360 was pretty much bleeding edge when released.
     
  10. DJ12

    Veteran

    Joined:
    Oct 20, 2006
    Messages:
    3,105
    Likes Received:
    198
    So PSM3 in the UK is going with a quad-core APU, a GPU in the 7970 class, and 4 GB of RAM for the PS4. Looks pretty sweet if true.
     
  11. MrFox

    MrFox Deludedly Fantastic
    Legend

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,996
    If it's true you can bet it'll be 599 again.
    (not that I have any problem with this possibility)
     
  12. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    Llano has 25.6 GB/s with DDR3-1600, and that has to be shared between the CPU and the GPU. With this it can exceed console performance.

    Do you think modern GPUs are so much more efficient at using memory bandwidth that R580, with twice that raw bandwidth dedicated to the GPU alone, would be unable to match console performance due to bandwidth limitations? That seems quite a stretch to me.
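For reference, the 25.6 GB/s Llano figure falls straight out of the standard dual-channel DDR3-1600 arithmetic. A quick check using only JEDEC numbers, no vendor-specific assumptions:

```python
# Quick check of the 25.6 GB/s figure: dual-channel DDR3-1600.
# Standard JEDEC parameters; no vendor-specific assumptions.

transfers_per_sec = 1600e6   # DDR3-1600 -> 1600 MT/s
bytes_per_transfer = 8       # 64-bit channel
channels = 2                 # dual channel

bandwidth_gb_s = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")
```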

    Anyway we've probably gone a bit too far off topic now.
     
  13. Lucid_Dreamer

    Veteran

    Joined:
    Mar 28, 2008
    Messages:
    1,210
    Likes Received:
    3
    I was just giving the history of the minimum specs for Battlefield 3. That is the 8800 GT.

    http://en.wikipedia.org/wiki/GeForce_8_Series#8800_GT

    EDIT: Another line mentions the GPU you are talking about.

     
  14. cal_guy

    Newcomer

    Joined:
    Jun 27, 2008
    Messages:
    217
    Likes Received:
    3
  15. anexanhume

    Veteran

    Joined:
    Dec 5, 2011
    Messages:
    2,078
    Likes Received:
    1,535
    If it's 599 that better be GDDR5.
     
  16. DJ12

    Veteran

    Joined:
    Oct 20, 2006
    Messages:
    3,105
    Likes Received:
    198
    Doubt they would go that high again, but how much would this stuff cost? I doubt a 7970 would cost Sony anywhere near the retail price - maybe £50~£100 per unit?

    It's doable for much less with a profit.

    If we are talking dollars, maybe $500?
     
  17. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,715
    Likes Received:
    293
    Those raw GFLOPS, I think, are the very reason the 360 still has relevance in the general gaming world. Without the VMX128 units it would be a far more limited system in comparison to modern PCs, especially since Xenon has to process sound and games have become ridiculously physics-driven. The Cell BE, coming a year later, was itself a juggernaut (but a paper tiger). It didn't amount to enough to change the landscape, since it ended up being complicated, a crutch for the RSX, and multiplatform development became the norm.

    Looking at the first quad-core CPUs, which were hugely expensive, Xenon generally still has about 3/4 of their GFLOPS. In everything else it would be slaughtered, true enough, but in a console it had what it needed to stay within a realistically competitive realm of performance for running PC ports. That was especially true when newer PC titles were still targeting high-end single cores as the minimum required CPU - where Xenon could have 6x the peak GFLOPS - and the dual cores that were the norm in gaming systems sat at around 2/3 of it.

    Considering how many of us still have quad-core Athlon IIs, and how many have i3s and older dual-core i5s, Xenon still has quite a measure of relevance, and it's not surprising that it can run something like BF3, GTA IV, etc. I would really like to see how Xenon is utilized for various processes in both of those games.
     
  18. MrFox

    MrFox Deludedly Fantastic
    Legend

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,996
    It's a big chip with a 250W TDP, so everything would get expensive and very difficult to cool and keep quiet; it could be over 350W for the whole console.
     
  19. ultragpu

    Banned

    Joined:
    Apr 21, 2004
    Messages:
    6,242
    Likes Received:
    2,306
    Location:
    Australia
    599 means roughly $1000 AUD here, nothing to panic about for me as long as it's got a beefy GPU.
     
  20. Rylan

    Newcomer

    Joined:
    Jun 14, 2008
    Messages:
    8
    Likes Received:
    0
    Just curious.

    We have the 7970M outputting roughly 29 GFLOPS per watt of TDP, and the upcoming Sea Islands HD 8850/8870 are rumored to output 23-25 GFLOPS per watt.

    If the next Xbox isn't going to use an off-the-shelf PC GPU, what kind of GFLOPS-per-watt numbers should we expect? Over 30?
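The ~29 figure for the 7970M can be reproduced from its commonly quoted peak-FLOPS math. Note the 75 W TDP used here is an assumption - it is the value that makes the quoted ratio come out, not a confirmed spec:

```python
# Reproducing the ~29 GFLOPS/W figure above. The 75 W TDP is an assumption
# (it's what makes the quoted number come out), not a confirmed spec.

def gflops_per_watt(gflops, tdp_watts):
    return gflops / tdp_watts

# HD 7970M peak: 1280 shaders * 2 ops (multiply-add) * 0.85 GHz = 2176 GFLOPS
hd7970m_gflops = 1280 * 2 * 0.85
ratio = gflops_per_watt(hd7970m_gflops, 75.0)
print(f"HD 7970M: {hd7970m_gflops:.0f} GFLOPS, {ratio:.1f} GFLOPS/W")
```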
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.