Console that fared best vs PCs available at launch

Discussion in 'Console Technology' started by Butta, Apr 3, 2008.

  1. Fafalada

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,773
    Likes Received:
    49
    If resolution is that relevant, I'd say PS2 should win by default. It's the only console of the 3D-accelerated era that actually ~matched the maximum supported PC resolution at its time of release. And purely on HW, it had an actual edge as well.

    Thing is though - looking at the actual launch SW offering, nothing with 3D acceleration looked that good relative to PC (a genre or two might stand out, but overall it would go to PC), outside of PS1, which competed with non-accelerated PCs.
     
  2. Extra411

    Newcomer

    Joined:
    Jan 24, 2008
    Messages:
    33
    Likes Received:
    0
    I'm not sure if I understand this.

    The majority of PS2 games are 640x480 interlaced (with only a few games supporting 480p and some other funky resolutions). Who hadn't played a game at 1024x768+ when they had a GeForce 2, which was released in the same year (I think the GeForce 2 launched around May 2000, a full 5 months before the PS2's US launch)? In fact, the GeForce 2 GTS was able to play Quake 3 and UT at 1600x1200 at very playable framerates. So how is the PS2 matching that?
     
  3. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,119
    Location:
    WI, USA
    I was playing in 1024x768 occasionally in 1998 with my Matrox G200. Banshee was around then, too, and could do 1024x768. In 1999, I was gaming almost exclusively at that resolution with a TNT and then a G400 MAX.

    In 2000, >1024x768 was totally an option. I was on a 17" CRT so I stuck with it though.
     
  4. Neb

    Neb Iron "BEAST" Man
    Legend

    Joined:
    Mar 16, 2007
    Messages:
    8,391
    Likes Received:
    3
    Location:
    NGC2264
    Huh, yeah, I was playing games at 1024*768 and higher in 1999. HD gaming since 1999 for me. :???:
     
  5. Neb

    Neb Iron "BEAST" Man
    Legend

    Joined:
    Mar 16, 2007
    Messages:
    8,391
    Likes Received:
    3
    Location:
    NGC2264
    But they lock it to 30fps because the framerate fluctuates too much. And seeing how many games run below 30fps even in 30fps-locked games, you know why they locked it. ;)

    I could ask you the same about the above. I doubt devs would lock a game at 30fps that runs at an average 50fps, instead of using the spare processing power to deliver more detail at the 30fps lock. And seeing how unstable the 30fps "locks" are, I would think that is the case! ;)

    And there are several 60fps games that would really average 50fps or lower, with the framerate bouncing between sub-30fps and 60fps. That was their choice, while others chose to add more "candy" at 30fps!
     
    #105 Neb, Apr 13, 2008
    Last edited by a moderator: Apr 13, 2008
  6. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    How is that comment useful to the discussion?
     
  7. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    But how many console games are locked at 30fps 100% of the time? I.e. have zero noticeable slowdown, ever?

    It's very few as far as I can see, at least at the higher end of the graphics scale.

    A 38fps average is more than enough to throw a 30fps cap on and stick at 30fps, or close to it, for the vast majority of the time IMO, which seems to be the norm for most "30fps" console games.
     
  8. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil
    #108 Heinrich4, Apr 13, 2008
    Last edited by a moderator: Apr 14, 2008
  9. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    But is that down to unified shaders or something else? Note again that the non-unified R580 is more than able to keep up with the unified GPUs.

    I think it's obvious that in some game types R6xx has an advantage over G71, but it seems to me that it's games that put more emphasis on SM3 shaders, which would of course be the more heavyweight console games. I don't really see why console games would try to leverage unified shaders anyway, since the PS3 lacks them.

    Lost Planet is another game where R6xx looks particularly good against G71 (although not to the same extent as CoD4 and BioShock) and again, it's a console-originated game that puts a heavy emphasis on shaders.

    http://www.firingsquad.com/hardware/radeon_hd_2600_performance_preview/page15.asp

    However, looking at the other games in that review, the 2600 XT is generally behind the 7900 GS. This actually leads me to revise my conclusion of 2600HD + 25% for Xenos, as when it comes to shader-limited games (where the 2600 XT does well vs G71) I expect it's more in line with Xenos performance than that. After all, it has 90% of Xenos's raw shader performance arranged in a more efficient design.

    In other areas vs Xenos, the 2600 may fare even worse than 80% of its performance, especially with 4xAA and in framebuffer-limited situations.

    I don't think framerates would change. Rather, devs would add noticeably more detail to the 360 version of games. After all, 50% extra raw GPU power at the same framerate is quite a bit. And given how obsessive the internet is about picking apart the tiniest differences in cross-platform games, I'm sure we would know about it by now :wink:

    Either that or a lower resolution for PS3 versions of games - again, something we would definitely know about.

    True, perhaps memory isn't the problem. However, I still don't think it's down to unified shaders. The R580 just performs too well in comparison to R6xx, especially in the FiringSquad benchmarks.

    I don't think there's anything particularly strange about R600's performance here. It performs pretty much in line with what its core/shader clocks would suggest vs the 3850 and 3870. Obviously memory bandwidth is not a limitation in this game. R580 (in the quite modest 1900 XT 256MB form) is only 15fps behind the theoretically more powerful and unified 3850.

    What does it matter if they are targeted primarily at the console or not? In fact, surely if a game is targeted primarily at one architecture over another, then it tells us very little about the relative performance of those two architectures. The fairer comparison is in games which are made with all platforms in mind, with similar levels of optimisation on each.

    And even then the PC is at a disadvantage given that, as we know very well on this forum, consoles receive much higher levels of optimisation in cross-platform games than PCs do. It's one of the regularly used arguments for why consoles don't need as much power as PCs to achieve the same results.

    I agree that a direct comparison between the 360 and PC using these benchmarks is impossible without actual performance numbers from the consoles. However, what these benchmarks do show us is that the R580 is more than capable of playing the same games, at what we can presume are similar framerates, at fairly significantly higher image quality/resolution settings.

    It's not proof, but it's certainly compelling evidence in favour of R580 being more capable. Combine that with the fact that it's newer, bigger (in terms of transistors), does not operate in such a heat/power-restricted environment, and is a fair bit more powerful on paper, and it seems to me there is a far more compelling argument for R580's performance being superior to Xenos's than the other way round.

    But what I'm not understanding is why you don't think ROP performance or texturing performance is also higher. R580 runs at a 30% higher clock speed than Xenos with the same number of texture units. Why would Xenos be faster in that area? In ROPs the situation is even more obvious, since R580 has twice as many on top of the 30% clock speed advantage. No, it doesn't have eDRAM, but as we have seen from the CoD4 benchmarks, even R600/R6xx isn't memory bandwidth limited at these levels, so it's unlikely that R580 with 64GB/s would be heavily bandwidth limited.

    But if you're saying that RV630 cannot perform even close to its ROP potential in fillrate-limited situations because of memory bandwidth, then that's obviously a memory-bandwidth-limited, and not a fillrate-limited, situation. R600 has 4x the available bandwidth, and thus there is the potential for R600 to perform 4x faster - if bandwidth is truly the limitation. Sure, percentage-wise R600 may be just as restricted in the ROPs due to bandwidth, but that wouldn't stop it performing 4x faster in what is essentially a bandwidth-limited situation.

    Of course, in reality RV670 shows that R600 isn't bandwidth limited in pretty much any situation, and so given that RV630 has, as you say, the same bandwidth per ROP/clock, doesn't that also hold true for RV630?

    I mean, if R600's ROPs aren't being limited by bandwidth and all the ratios are the same with RV630, then why would RV630 be limited?
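    The fillrate-vs-bandwidth reasoning above can be sketched numerically. This is a minimal back-of-envelope model, not anything from the thread, and the spec numbers in it are my own rough approximations for illustration only:

    ```python
    def effective_fill_gpix(rops: int, clock_ghz: float,
                            bandwidth_gbs: float, bytes_per_pixel: int = 8) -> float:
        """Effective fillrate (Gpixels/s) is the lower of the ROP-bound rate
        (units x clock) and the bandwidth-bound rate (bytes/s divided by
        bytes written per pixel, assuming ~8 bytes for colour + Z)."""
        rop_bound = rops * clock_ghz
        bw_bound = bandwidth_gbs / bytes_per_pixel
        return min(rop_bound, bw_bound)

    # Approximate specs, illustrative assumptions only:
    # RV630 (HD 2600 XT): 4 ROPs @ ~0.8 GHz, ~22 GB/s -> bandwidth-bound
    print(effective_fill_gpix(4, 0.8, 22.0))
    # R600 (HD 2900 XT): 16 ROPs @ ~0.74 GHz, ~106 GB/s -> ROP-bound
    print(effective_fill_gpix(16, 0.74, 106.0))
    ```

    Whether the min() lands on the ROP term or the bandwidth term is exactly the distinction being argued over: if the ROP term is the smaller one, extra bandwidth buys nothing.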

    I think we are each assigning different meanings to how a workload applies to a platform.

    I agree that CoD4 is a different type of workload to older games, in that it's much more shader heavy. I don't see that as making it a "console" workload as opposed to a PC workload, though. I simply see it as a more modern workload which is representative of both modern console and PC games.

    The older "PC" workloads which you refer to are simply older games, which also represented workloads for previous-generation consoles.

    Even if CoD4 did specifically lean towards unified shader exploitation (which I personally don't believe), I still wouldn't consider it a console-based workload. After all, only 1 of the 3 current-generation consoles uses unified shaders, whereas every PC GPU currently in production actually uses a more sophisticated unified shader design.

    Take a look at Crysis, for example. That has a huge vertex load (much higher than CoD4) and yet it's a PC exclusive. CoD4, BioShock, Lost Planet, Crysis, Supreme Commander etc., etc. - these are all examples of modern game workloads, not console-based workloads IMO.

    It does seem to be modern workloads that G71 suffers in comparatively, but I don't see that as being down to unified shaders, as R580 shows none of the same symptoms. My guess is that it's weak when it comes to SM3 shaders, but that's just a shot in the dark :smile:

    I think you might have misunderstood my argument a bit there; apologies if that's my fault. I'm not saying R580 ~= G71, as I completely agree with you that that doesn't seem to be the case for the more modern workloads. In older workloads, yes, G71 can keep up well with R580 and thus will probably outperform or at least perform similarly to Xenos, but in more modern workloads it does fall behind both R580 and even RV630 in some cases, suggesting there would be scenarios where G71 might perform poorly next to Xenos (assuming Xenos shares R580's and R6xx's strength in modern workloads).

    However, RV630 + 25% ~= Xenos does still seem to be a reasonable assumption in broad terms. In fact, I expect the two to be more equal when shader limitations come into play - which is most likely the very same scenario where G71 is comparatively weak next to RV630. Obviously there will be other scenarios, such as the use of 4xAA or framebuffer-bandwidth-limited situations, where Xenos could go beyond that 25%. Also, I think the benchmarks support R580 being generally faster than RV630 + 25%.

    But if those things were truly such huge performance limitations because of bandwidth restrictions, then why, with the same ROP/bandwidth ratio as RV630, doesn't R600 significantly outperform RV670, which should be even more memory bandwidth limited?

    i.e. the HD 3870 should be more bandwidth restricted for its given ROP power than RV630, and yet it shows pretty much no advantage when given more bandwidth (in the form of R600).
     
  10. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,410
    Location:
    Wrong thread
    This would depend entirely on the game and the kind of variance in different parameters - you could very, very easily have a situation in something like an FPS where a 38fps average with vsync off wouldn't give you an acceptable 30fps cap with vsync and double buffering on.

    I'm used to PC frame rates that are all over the place (e.g. Team Fortress where I go from a steady 75fps to a fairly consistent 25 fps depending on where I stand in a map) but if a console game spends more than a very, very small percentage of its time at 20 fps instead of its 30 fps cap I consider it a bit unpolished. Like most PC gamers I know, I have a bit of a double standard when it comes to frame rates...
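    The double-buffering point can be made concrete with a quick sketch: with vsync on a 60Hz display, a finished frame waits for the next vblank, so the delivered rate quantises to 60/n fps. The frame times below are hypothetical numbers of my own, purely for illustration:

    ```python
    import math

    REFRESH_HZ = 60
    VBLANK_MS = 1000 / REFRESH_HZ  # ~16.7 ms between refresh opportunities

    def vsynced_fps(render_ms: float) -> float:
        """With vsync + double buffering, a frame that misses a vblank is
        held until the next one, so the rate snaps to 60/ceil(t / 16.7ms)."""
        intervals = math.ceil(render_ms / VBLANK_MS)
        return REFRESH_HZ / intervals

    # A "38fps average" frame (~26 ms) still fits in two intervals -> 30fps...
    print(vsynced_fps(26.3))
    # ...but any frame slipping past 33.3 ms drops straight to 20fps.
    print(vsynced_fps(35.0))
    ```

    This is why an average well above 30fps with vsync off is no guarantee of a solid 30fps cap: the quantisation punishes every individual slow frame, not the average.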

    Triple buffering for great justice IMO, btw - I don't see how you can justify not spending the memory on a 3rd AA-resolved buffer if your game is suffering from horribly ugly tearing on a remotely regular basis (given the enormity of its benefit compared to anything else you could cram into that space), and I don't see why MS doesn't want you using triple buffering on the PC. Puzzling.
     
  11. Cheezdoodles

    Veteran

    Joined:
    May 24, 2006
    Messages:
    3,930
    Likes Received:
    24
    What???

    The highest resolution supported on the PS2 was 1080i, upscaled, in GT4, if I'm not wrong.

    Meanwhile, I have been playing at HD resolutions on PCs since before 1999.
     
  12. Fafalada

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,773
    Likes Received:
    49
    I fail to see how that disagrees with me - I said that PS2 was the first console with HD support, a capability which, at the time of release, was roughly on par with how high PCs could go. And that obviously wasn't the case for any other console in the 3D era.

    I made the point that it took more time for SW to showcase it, but there's been a fair amount of debate in this thread focusing purely on the HW aspects of the respective consoles.
     
  13. Neb

    Neb Iron "BEAST" Man
    Legend

    Joined:
    Mar 16, 2007
    Messages:
    8,391
    Likes Received:
    3
    Location:
    NGC2264
    You had the option to run two Voodoo2 cards together to raise the 3D resolution ceiling to 1024*768 smoothly some years before that - does it count?
     
  14. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    Okay, clearly we are the only ones interested in this tangent, so I'm going to reply to you in a PM.
     
  15. Shompola

    Newcomer

    Joined:
    Nov 14, 2005
    Messages:
    197
    Likes Received:
    40
    You could even run a few games at 800x600 on a Voodoo Rush back in 1997.
     
  16. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    I thought it was interesting. *shrug* :) Besides, it was on-topic (Xenos and PC hardware of the time etc).
     
  17. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,715
    Likes Received:
    293
    GT4 is the only PS2 game I can think of with any HD support at all.
     
  18. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    How many games actually had high-resolution support? The only major one I know of is the pseudo-1080i of GT4, and that came many years after launch. By the time the PS2 launched, we were already in the GeForce era.

    Overall I agree, but I think the Dreamcast and 360 came closest, since accelerators were available.
     
  19. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,119
    Location:
    WI, USA
    960x720 on a RIVA 128. The Voodoo1 could actually do 800x600 if the game skipped the Z-buffer, so that goes back to 1996. And to think that the Xbox and Cube were running 720x480 (or 640x480?) almost exclusively throughout their entire existence. Just omgwow... :wink: Of course, that was probably almost entirely down to the SDTVs they had to output to. I'm sure it was nice that an NV2x was only required to push that many pixels, though. Lots of fillrate to play with.
     
  20. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    Ya know... it's funny... My TV detects Max Payne 2, or I suppose the Xbox, as sending a 720x480 signal. Curious.
     