How will NVidia counter the release of HD5xxx?

Discussion in 'Architecture and Products' started by Miksu, Aug 26, 2009.


What will NVidia do to counter the release of HD5xxx-series?

  1. GT300 Performance Preview Articles

    29 vote(s)
    19.7%
  2. New card based on the previous architecture

    18 vote(s)
    12.2%
  3. New and Faster Drivers

    6 vote(s)
    4.1%
  4. Something PhysX related

    11 vote(s)
    7.5%
  5. Powerpoint slides

    61 vote(s)
    41.5%
  6. They'll just sit back and watch

    12 vote(s)
    8.2%
  7. Other (please specify)

    10 vote(s)
    6.8%
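For reference, the percentages above follow directly from the vote tallies; a minimal Python sketch of the recount (the dictionary keys are just the poll's option labels):

```python
# Recount of the poll above; vote counts are copied from the thread.
votes = {
    "GT300 Performance Preview Articles": 29,
    "New card based on the previous architecture": 18,
    "New and Faster Drivers": 6,
    "Something PhysX related": 11,
    "Powerpoint slides": 61,
    "They'll just sit back and watch": 12,
    "Other (please specify)": 10,
}

total = sum(votes.values())  # 147 votes cast in total
for option, n in votes.items():
    print(f"{option}: {n} vote(s), {100 * n / total:.1f}%")
```

The rounded percentages match the ones shown in the poll (e.g. 61/147 = 41.5%).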
Thread Status:
Not open for further replies.
  1. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Yea, I'm trying to have a tech discussion.
    It seems that people are so brand-focused that they can't even conceive that there are people like me who don't care what the label says on the tin, but rather what's inside.
    I deliberately posted a comprehensive list of the graphics cards/platforms I've used over the years in my signature, to hopefully make people see that for me "anything goes", as long as it's cool hardware. I've never favoured any brand. Each generation I'll just pick whichever card appeals to me the most (yes, I chose a Radeon 8500 over a GeForce 3 back in the day. I bought it because it was a good GPU, actually the first good GPU that ATi ever made. The GPU that slowly started building the reputation of ATi as a good alternative to nVidia).

    In fact, if you look far enough back in my post history, you'll find me defending nVidia's shadowmapping extensions, even though at the time I was using a Radeon 9600 card myself, which supported no such extensions. Despite the fact that I owned a card of a different brand, which didn't have that technology, I didn't find it necessary to downplay the technology. Today, virtually every GPU on the market, and virtually every game uses this shadowmapping technology.

    Likewise I think accelerated physics will be in virtually every game, running on virtually every GPU in the future. However, today there is only one working solution out there, which happens to be PhysX. As I already said, I supported it long before nVidia owned it and before it ran on GPUs. I saw value in the PPU as well (even though I never actually owned one myself).

    To conclude... What I'm saying about PhysX and DX11 is simply what I expect to happen when both companies start their marketing engines.
    I don't currently use PhysX in my code, and my engine is already ported to DX11.
    In my perfect world, PhysX would work on all GPUs, and nVidia would have a DX11 part ready at the same time as AMD... Sadly this is not the case. Would *I* pick a PhysX DX10 card over a DX11 one? Probably not (especially since I already have a PhysX DX10 card anyway). But I am talking about what nVidia wants people to do (the topic of this thread), and I think they'll be able to convince people.
     
    #601 Scali, Sep 17, 2009
    Last edited by a moderator: Sep 17, 2009
  2. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
    Like what games exactly? There hasn't been a single game so far where PhysX hasn't been gimmicky; so far it's never been an integral part of the gameplay experience, and indeed it cannot be, because of the small fraction of the market that's PhysX-capable.

    Oh come on now... It's not as if DX11 is something completely nebulous and immaterial. DX11 exists right now, it's being put into boxes as we type; titles for it exist in advanced stages of development, they just haven't been released yet.

    If AMD wants to show off DX11 stuff at their hardware introduction they can surely do so, it's well within their power and ability to do so.

    You're making a bigger deal of it than actually warranted IMO. Nobody buys graphics cards because of physx, and that's a fact. DX11 is the future, and besides, AMD's card will be the fastest piece of hardware on the planet until Nvidia counters with their own chip, regardless of the status of DX11.

    Nvidia has an edge in the professional market with Cuda, but that doesn't apply to consumer 3D.
     
  3. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26

    I think my post here was misunderstood, as digi implied that Scali was acting covertly. The focus group does not run that way, and I think digi should know better, especially considering his history on the subject. Nvidia does not allow focus group members to post without labeling themselves as such, due to the initial problems with how it was perceived. I was just commenting on why I do not have one. There are not a lot of signatures on this forum, so I prefer not to try and stand out against that trend. I just decided to put it in my user title. In a nutshell, I am just tired of the witch hunts and everyone questioning everyone's agenda. Most of this should be irrelevant when discussing technology, as long as it's civil and polite. The baiting is simply not needed.

    *edit* Lots of typos and corrections.
     
    #603 ChrisRay, Sep 17, 2009
    Last edited by a moderator: Sep 17, 2009
  4. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    There we go, downplaying again.
    I don't think you get the point... What we may see as 'gimmicky' may impress Joe Schmoe, because it's nice and colourful and it moves around. It's smoke and mirrors, so to speak.

    Hello!? How often do I have to repeat myself? I actually have a DX11 engine in development myself.
    You just don't get what I'm saying, and I'm tired of trying to explain it. You just don't want to understand.

    But not with actual games, and demos just don't work that well anymore, because of what happened over the years, with all sorts of demos/benchmarks being rigged, effects/features never materializing, things not looking exactly as good as the demo did, etc.. People have become suspicious. Even hardware/game reviewers have become suspicious, and it shows in the reviews that they publish. The past few years there's been a strong focus on benchmarking with actual games, and trying to capture the actual gameplay experience, rather than just running canned demos and benchmarks.

    Your opinion is noted. Have a nice day.
     
  5. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    The next person to use "you" or "your" in a post on this thread (after this one, of course) in the next ~24 hours gets a free one week vacation. This isn't RPSC. Members will discuss the issues relevant to the thread, not each other. Or else.

    The next smart-aleck oneliner in the next ~24 hours gets a discount --just a one day vacation.

    Why the "~"? Because if it is 24 hours and two minutes or something of that carefully not exactly announced duration, no forum-lawyering will avail in defense, s'welp me!

    Edit: "you" that is clearly in the plural (or join the southerners and go for y'all), or impersonal sense, as in "well, what you can do to get around that . . ." will be tolerated.
     
  6. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    Does no one else remember the Mac Flag easteregg? Honestly!
     
  7. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    Oh please. Mirror's Edge? Let's turn on PhysX so the whole game runs worse, and in the odd case where some useless flag is actually being looked at, it acts a little differently!


    MOD EDIT: I suspect aaronspink did not see my last post, as he was responding to one from yesterday, so I edited instead. But really, I'm not kidding on avoiding personal pronouns for the next 24 hours on this thread.
     
  8. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    Right! They hire companies to do that for them! If people are skeptical of astro-turfing on the part of Nvidia, it is well-deserved skepticism.

    In fact I wouldn't be surprised if Nvidia started a large-scale astro-turf campaign to go along with their latest strategy of PhysX >> ALL. Realistically, it is the only marketing argument they can make (price cuts cannot really be considered marketing arguments, and actually have a history of acting as anti-marketing for products, in the "man that's cheap, I wonder what is wrong with it" vein). So from a biz perspective, one could hardly find fault with pushing their story of PhysX as hard as possible.

    Not that I think it will work on a large scale, but any aborted or delayed sale is a benefit at this point to nvidia.
     
    #608 aaronspink, Sep 18, 2009
    Last edited by a moderator: Sep 18, 2009
  9. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
    I wonder what people will say when ATI shows Dx11 games running with Dx11 features on a Dx11 card? Nevermind, that already happened.

    I can only imagine that reviewers will also have access to Dx11 games when reviewing the Dx11 card. Unless for some bizarre reason MS won't allow them access to Dx11 for review purposes?

    Regards,
    SB
     
  10. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    I can't really believe that so much space was wasted arguing that PhysX matters over D3D11. I mean really, that makes zero sense to even the most platform-agnostic person; heck, I could be an S3 zealot and it would still remain an absurd suggestion.

    I think the moral of this thread is that Nvidia will focus on PhysX over the 5800 and that's OK. Well, we'll find out next year..
     
  11. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,987
    Likes Received:
    3,529
    Location:
    Winfield, IN USA
    I think the moral of this thread is that nVidia will spread whatever FUD they feel they must in order to make up for their lack of a competing product. :yep2:
     
  12. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    Honestly! I have no idea what that is :oops:

    Geo, we're allowed to use "I" right? And "we"? :D
     
  13. Sound_Card

    Regular

    Joined:
    Nov 24, 2006
    Messages:
    936
    Likes Received:
    4
    Location:
    San Antonio, TX
    (fixed) assumption was one-sided. I'm simply pointing out I don't think it's that easy to say (and I know it's not true in the outside world where the noobs dwell). Where is crossfirezone.net? lol. :razz:
    Anyway, I'm sure the beta testers are far outnumbered by the other group that does so much more than us. Dare I say we're not even active anymore? I still have it in my sig because I'm proud of the fact I helped contribute what I could.
    I don't find anything ironic about it at all.
     
  14. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,502
    Likes Received:
    24,397
    I think a few people should try decaf.

    Nvidia will respond like any company in a bad position will: they will lie, cheat, and steal their way into the hands of new consumers. I expect their marketing department will be working overtime.
     
  15. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
    As promised, here are some quick Batman: AA numbers using the in-game benchmark. Test system is a Core i7 965, 6GB, GTX 285, VelociRaptor/Intel X-25M SSD, 190.38 driver set. All in-game settings maxed out, 16x AF set via the control panel (8x AA via the app). I don't have much free time tonight, so just ran the test three times at 2560x1600.

    PhysX off = 58 average fps
    PhysX Normal = 35 fps
    PhysX High = 28 fps

    So at that res, with those settings, PhysX literally cuts the frame rate in half (at least in that benchmark, which specifically focuses on scenes using it). The difference between high and normal is cloth physics (no fluttering flags or cobwebs at normal).
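    A quick sanity check of the "cuts the frame rate in half" claim, from the three averages posted above (a minimal Python sketch; the dictionary labels are mine, not the game's setting names):

```python
# Relative PhysX cost implied by the posted Batman: AA numbers
# (GTX 285, 2560x1600, in-game benchmark, all settings maxed).
fps = {"off": 58, "normal": 35, "high": 28}

baseline = fps["off"]
for setting, rate in fps.items():
    drop = 100 * (baseline - rate) / baseline
    print(f"PhysX {setting}: {rate} fps ({drop:.0f}% below PhysX off)")
```

    PhysX High at 28 fps is about a 52% drop from 58 fps, which is where the "half the frame rate" figure comes from; Normal is roughly a 40% drop.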
     
  16. Tchock

    Regular

    Joined:
    Mar 4, 2008
    Messages:
    849
    Likes Received:
    2
    Location:
    PVG
    @JR

    Since this is UE3, I suppose it's frame-capped at 62fps (hovering around that area) and you could possibly go higher.
     
  17. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,987
    Likes Received:
    3,529
    Location:
    Winfield, IN USA
    Where is the in-game benchmark?!? :shock:
     
  18. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
    Ah, forgot about that. Glad I tested at the highest res I could then.

    It's in the menu, Diggie, though your special 'digital download' copy might not have it. :cool:
     
  19. SiliconAbyss

    Newcomer

    Joined:
    Mar 28, 2004
    Messages:
    75
    Likes Received:
    0
    Location:
    Canada
    As absurd as it appears to you, and I'm sure to many others including myself, this seems to be the #1 stance Nvidia is taking. When it first became clear that AMD would beat NV to the DX11 market, I heard suggestions that Nvidia would focus on PhysX and claim it as more important, but I never actually believed they would do it.
    Of course. But Nvidia does have a big advantage over AMD: they are the market leader (let's leave Intel out of this discussion), so what they say does carry a significant amount of weight. How far that leading position will help them stop the bleeding remains to be seen.

    The thing is, if Nvidia has their DX11 part in stores around January, then all of this will be forgotten. If it stretches into Feb/March and there's still nothing, then Nvidia will be in serious trouble.
     
  20. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,987
    Likes Received:
    3,529
    Location:
    Winfield, IN USA
    I dunno, if it's not out until January I think that would hurt them plenty enough just missing the holiday season. :???:
     