How will NVidia counter the release of HD5xxx?

Discussion in 'Architecture and Products' started by Miksu, Aug 26, 2009.

What will NVidia do to counter the release of HD5xxx-series?

  1. GT300 Performance Preview Articles: 29 votes (19.7%)
  2. New card based on the previous architecture: 18 votes (12.2%)
  3. New and Faster Drivers: 6 votes (4.1%)
  4. Something PhysX related: 11 votes (7.5%)
  5. Powerpoint slides: 61 votes (41.5%)
  6. They'll just sit back and watch: 12 votes (8.2%)
  7. Other (please specify): 10 votes (6.8%)
Thread Status:
Not open for further replies.
  1. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    It's not a very demanding game as far as GPU workload is concerned, so a single card struggling at 1600p with PhysX on isn't all that surprising. At lower resolutions, the game is entirely GPU PhysX limited.


    [benchmark chart]

    Due to the way standard SLI rendering with PhysX is set up, at lower resolutions you're actually limited by the GPU compute performance of the primary GPU: with AFR, only the primary GPU handles PhysX operations. Obviously, as resolution scales up, AFR achieves a better balance because there's more traditional GPU workload to work with.

    Shutting off SLI and just running the second card as a PhysX device at lower resolutions actually does more than SLI, and keeping SLI while running a third GPU as an independent PhysX device also helps alleviate the bottleneck. It's just important to understand that this is a GPU compute bottleneck that can be dealt with in a few ways (a rough sketch of the arithmetic follows below).
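    To make this concrete, here's a minimal back-of-envelope model (a Python sketch with hypothetical per-frame timings chosen purely for illustration, not measured numbers) of why a dedicated PhysX GPU can win at low resolutions while AFR SLI wins at high ones:

    ```python
    # Toy frame-time model. All timings are hypothetical, for illustration only.
    # Under AFR SLI, render work alternates between the GPUs, but PhysX runs on
    # the primary GPU every frame, so its compute load dominates at low resolution.

    def frame_time_afr_sli(render_ms, physx_ms):
        gpu0 = render_ms / 2 + physx_ms  # primary: half the render load + all PhysX
        gpu1 = render_ms / 2             # secondary: half the render load only
        return max(gpu0, gpu1)           # the slower GPU paces the frame

    def frame_time_dedicated_physx(render_ms, physx_ms):
        return max(render_ms, physx_ms)  # one GPU renders, the other only runs PhysX

    for label, render_ms, physx_ms in [("low res", 8.0, 12.0), ("high res", 30.0, 12.0)]:
        print(label,
              "| AFR SLI:", frame_time_afr_sli(render_ms, physx_ms), "ms",
              "| dedicated PhysX:", frame_time_dedicated_physx(render_ms, physx_ms), "ms")
    ```

    With these made-up numbers, the dedicated PhysX setup paces the "low res" frame at 12 ms versus 16 ms for AFR SLI, while the balance flips at "high res" (30 ms versus 27 ms), which is the behavior described above.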

    [benchmark charts]
     
    #681 ChrisRay, Sep 18, 2009
    Last edited by a moderator: Sep 18, 2009
  2. Miksu

    Regular

    Joined:
    Mar 9, 2003
    Messages:
    997
    Likes Received:
    10
    Location:
    Finland
    This topic wasn't titled "Is PhysX any good?". I'm sure that this discussion could be continued somewhere else.
     
  3. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    But if you don't already have an Nvidia card, you can only see PhysX in gameplay movies on YouTube and the like. Someone upgrading in the next six months then has the choice of a very fast next-gen DX11 card without PhysX, or a current-gen DX10 card with PhysX.

    Personally, it's difficult for me to get excited by something I can't see in my games when it's up against a next-gen part that promises the next version of DX, a standard all cards have been advancing with over the last few years since it became well supported. You can't see PhysX unless you are already an Nvidia customer, so it doesn't really tempt people away from other brands unless it becomes a widely supported industry standard.

    That's why I don't think marketing PhysX is enough against the marketing of a DX-next generation card; proprietary features never have been in the past. But to be fair, that's about all Nvidia have got to focus on, and 120 Hz monitors plus 3D glasses, or academic CUDA, isn't going to do it for the mainstream gaming community.
     
  4. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    And you can't see DX11 unless you're from the future. It doesn't really make sense to try to weigh future, unknown DX11 effects against current, available PhysX functionality.
     
  5. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,879
    Likes Received:
    5,330
    Speaking of physics, I've just got Red Faction 3, and the destruction is done very well and incorporated into the game quite well... er, as well ;). It's not GPU accelerated though (neither PhysX nor Havok).
    Also, in a way it's a step back from Red Faction 1, because the terrain is not destructible, only buildings and objects. But it is consistent, as all buildings are totally destructible, unlike RF1, where you would blast through 10 feet of solid rock only to be confronted with an indestructible wooden door.

    About that: were those 2 monitors (or 1 in the UK, because the non-Samsung one fails British standards and can't be sold here) created to work with 3D Vision, or is there going to be a move to 120 Hz monitors anyway?

    PS: Nvidia, at least support triplehead (you should push for this, Chris ;)). It can't compare against 24 monitors, but it's enough for nearly everyone...
     
    #685 Davros, Sep 18, 2009
    Last edited by a moderator: Sep 18, 2009
  6. LonelyMind

    Newcomer

    Joined:
    Jun 3, 2003
    Messages:
    27
    Likes Received:
    0
    Isn't the topic of this thread how nVidia will counter the release of HD5xxx? As far as I can see, the main focus of nVidia is PhysX. Why, then, is a discussion about PhysX (and nVidia's claims regarding it) not relevant to this thread?

    I've read many posts in this thread from, what I suppose are, nVidia/PhysX aficionados claiming its benefits with regard to AMD/ATI's new offerings. Those posts still stand. Why is it all of a sudden not OK anymore? Am I missing something? Please correct me if that is the case.
     
  7. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    I wonder if the majority shares your opinion.
    DX10 was pretty underwhelming, and its problematic adoption may have made people wary of new DX versions not delivering on their promises.
    Also, I wonder if other people are as closed towards PhysX as you are. It seems that lots of people respond more in the vein of "Wow, I love those effects, but why can't I get them on my AMD GPU or CPU?". That's where nVidia's marketing pitch comes in.

    Another thing: the DX11 API will actually work on nVidia's current hardware and drivers (including DirectCompute). It reminds me of the days when ATi only had SM2.0 hardware (the Radeon X800 series) versus nVidia's SM3.0 hardware (the GeForce 6 series). Since both were 'DX9', most people didn't really know or care about the difference, and ATi never really had problems selling their 'outdated' hardware. I never heard anyone here complain about it either. :)
     
    #687 Scali, Sep 18, 2009
    Last edited by a moderator: Sep 18, 2009
  8. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Yes, all may be as self-centered as usual! :cool:
     
  9. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    That kind of information does leak out from time to time in dev blogs and such. But given timelines, one does suspect that ATI's product this time around got a lot more "reference" kind of usage in the late QA testing for the DX11 release, and in early dev testing as well.
     
  10. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    I was leaning that way for sure, and then I looked at the poll options and, alas, the author (ahem!) opened the door to the relevance of this part of the discussion with option #4.
     
  11. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    But you never have been able to see the future, the same way you can't tell whether there will be any PhysX titles on the level of BAA this time next year when Nvidia have brought out their own DX11 cards, or whether it's going to fizzle out when Nvidia get interested in something else. Especially looking at the PhysX functionality available today, it's not that impressive: BAA and a few curtains in Mirror's Edge that aren't missed are about it. You can look at the past and see that DX has always trumped proprietary features.

    If you don't already have an Nvidia card and want to upgrade, you have the choice of a DX10 card with PhysX or a next-gen DX11 card. I can't see people passing up the faster next-gen card, unless they go for brand loyalty. People always want the newer stuff, not old mutton dressed as lamb. If what we've heard about the OEMs is true, they've already made that choice for their customers, and that's a lot of sales right there.
     
    #691 Bouncing Zabaglione Bros., Sep 18, 2009
    Last edited by a moderator: Sep 18, 2009
  12. leoneazzurro

    Regular

    Joined:
    Nov 3, 2005
    Messages:
    518
    Likes Received:
    25
    Location:
    Rome, Italy

    Ok, this is how it runs on a high-end setup. Could you please test how it runs on a single mainstream card, such as the GTS 250?
     
  13. DeF

    DeF
    Newcomer

    Joined:
    May 3, 2007
    Messages:
    162
    Likes Received:
    20
    If I remember correctly, ATi gained a lot of market share in the discrete segment with the HD4xxx series. That suggests the above-mentioned argument ("Wow, I love those effects, but why can't I get them on my AMD GPU or CPU?") wasn't really that convincing. That's why I don't see how Nvidia can beat "we've got much better performance, DX11 and stuff" with it. But I'm sure they will try, of course.

    Well, in my country (Poland) I remember that the DX9b vs DX9c debate was really strong, and in the end the Far Cry patch alone was enough to convince 80% of the audience that they would be better off buying DX9c hardware. The reason was that everyone knew DX is an industry standard and ATi would have to catch up eventually.
     
  14. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    Nvidia's support of 120 Hz monitors is for stereo 3D: the display alternates left- and right-eye frames, so 120 Hz gives each eye 60 Hz. It has nothing to do with pushing framerates past a 60 Hz limitation.
     
  15. thatdude90210

    Regular

    Joined:
    Aug 9, 2003
    Messages:
    937
    Likes Received:
    6
    TGDaily is saying that Nvidia is countering the HD5xxx (in addition to PhysX > all) with... press releases!
     
  16. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
    Or the more common 8800 series.

    Regards,
    SB
     
  17. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    What kind of motion blur are we actually talking about? I'm not even aware of what CryEngine2 is using.

    I recall disabling the so-called "motion blur" in some past racing sims because it was more like a full-screen blur (above a specific speed range) than actual antialiasing in the temporal dimension. The latter, as far as I'm aware, costs quite a bit in performance and shouldn't be that easy to implement properly in an interactive game.

    Of course you didn't make that comparison, yet one of the reasons a movie looks a lot closer to what the human eye perceives in real time is that, despite a typical framerate of say 24 fps, sudden motion gets antialiased in the temporal dimension, and definitely not with just a few samples. A crude illustration of the idea is below.
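    To illustrate "antialiasing in the temporal dimension", here is a minimal sketch (Python/NumPy, with a made-up one-dimensional toy scene; this is accumulation-style shutter integration, not whatever CryEngine2 actually does) that averages sub-frame samples the way a film camera's open shutter integrates motion:

    ```python
    import numpy as np

    # Toy 1D "scene": a bright one-pixel object moving at `speed` pixels/second
    # along an 8-pixel scanline. All values are made up for illustration.

    def render(t, width=8, speed=192.0):
        frame = np.zeros(width)
        frame[int(t * speed) % width] = 1.0  # object position at instant t
        return frame

    def shutter_frame(t0, exposure=1.0 / 48, samples=16):
        # Average `samples` renders spread across the shutter-open interval,
        # smearing fast motion instead of freezing it at a single instant.
        ts = t0 + np.linspace(0.0, exposure, samples)
        return np.mean([render(t) for t in ts], axis=0)

    print(render(0.0))         # single instant: one hard-edged bright pixel
    print(shutter_frame(0.0))  # integrated exposure: motion smeared over pixels
    ```

    With one sample per frame the object snaps from position to position; averaging many samples across the exposure smears fast motion over several pixels, which is why film at 24 fps can still look smooth while a game at 24 fps looks like a slideshow.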
     
  18. nyt

    nyt
    Newcomer

    Joined:
    May 14, 2003
    Messages:
    80
    Likes Received:
    0
    Location:
    Mtl
    Bold statement (FUD?): http://www.revioo.com/articles/a13159_3.html
    High-end setup, 1680x1050 with 4xAA: 9800GTX 12 fps, GTX 280 41 fps. Maybe NV will make cards with an N-1 generation GPU alongside G200 to compensate for the PhysX impact? It raises FPS back into the 70+ range. Still, I'd rather have a couple of cores from a quad do physics than an extra GPU/card.
     
    #698 nyt, Sep 18, 2009
    Last edited by a moderator: Sep 18, 2009
  19. Sxotty

    Legend

    Joined:
    Dec 11, 2002
    Messages:
    5,496
    Likes Received:
    866
    Location:
    PA USA
    A 260 is not mainstream?
     
  20. Jaaanosik

    Newcomer

    Joined:
    May 18, 2008
    Messages:
    146
    Likes Received:
    0
    If it's going to help visuals be more realistic, then yes. What's wrong with that?
     