How will NVidia counter the release of HD5xxx?

What will NVidia do to counter the release of HD5xxx-series?

  • GT300 Performance Preview Articles

    Votes: 29 19.7%
  • New card based on the previous architecture

    Votes: 18 12.2%
  • New and Faster Drivers

    Votes: 6 4.1%
  • Something PhysX related

    Votes: 11 7.5%
  • Powerpoint slides

    Votes: 61 41.5%
  • They'll just sit back and watch

    Votes: 12 8.2%
  • Other (please specify)

    Votes: 10 6.8%

  • Total voters
    147
Status
Not open for further replies.
Scali was talking about what he thinks about PhysX in relation to DX11. Is he biased? Maybe. Who cares. A lot of people on this forum are biased. But the labeling around here has reached a point of absurdity. This is not something that's just happening on one side of the fence. Beyond3D has lately seemed to me a very unfriendly place for tech discussions.

Yea, I'm trying to have a tech discussion.
It seems that people are so brand-focused that they can't even conceive that there are people like me who don't care what the label says on the tin, but rather what's inside.
I deliberately posted a comprehensive list of the graphics cards/platforms I've used over the years in my signature, to hopefully make people see that for me "anything goes", as long as it's cool hardware. I've never favoured any brand. Each generation I'll just pick whichever card appeals to me the most (yes, I chose a Radeon 8500 over a GeForce 3 back in the day. I bought it because it was a good GPU, actually the first good GPU that ATi ever made. The GPU that slowly started building the reputation of ATi as a good alternative to nVidia).

In fact, if you look far enough back in my post history, you'll find me defending nVidia's shadowmapping extensions, even though at the time I was using a Radeon 9600 card myself, which supported no such extensions. Despite the fact that I owned a card of a different brand, which didn't have that technology, I didn't find it necessary to downplay the technology. Today, virtually every GPU on the market supports this shadowmapping technology, and virtually every game uses it.

Likewise I think accelerated physics will be in virtually every game, running on virtually every GPU in the future. However, today there is only one working solution out there, which happens to be PhysX. As I already said, I supported it long before nVidia owned it and before it ran on GPUs. I saw value in the PPU as well (even though I never actually owned one myself).

To conclude... What I'm saying about PhysX and DX11 is simply what I expect to happen when both companies start their marketing engines.
I don't currently use PhysX in my code, and my engine is already ported to DX11.
In my perfect world, PhysX would work on all GPUs, and nVidia would have a DX11 part ready at the same time as AMD... Sadly this is not the case. Would *I* pick a PhysX DX10 card over a DX11 one? Probably not (especially since I already have a PhysX DX10 card anyway). But I am talking about what nVidia wants people to do (topic of this thread), and I think they'll be able to convince people.
 
No, I'm pointing out that nVidia has these shiny new PhysX games to show to the public...
Like what games exactly? There hasn't been a single game so far where PhysX hasn't been gimmicky; it's never been an integral part of the gameplay experience, and indeed it can't be, because of the small fraction of the market that's PhysX-capable.

Where AMD has DX11, except they can only show DX9/DX10 stuff... They can't show anything shiny and new.
Oh come on now... It's not as if DX11 is something completely nebulous and immaterial. DX11 exists right now, it's being put into boxes as we type; titles for it exist in advanced stages of development, they just haven't been released yet.

If AMD wants to show off DX11 stuff at their hardware introduction they can surely do so, it's well within their power and ability to do so.

It's an incredibly good point actually.
You're making a bigger deal of it than actually warranted IMO. Nobody buys graphics cards because of physx, and that's a fact. DX11 is the future, and besides, AMD's card will be the fastest piece of hardware on the planet until Nvidia counters with their own chip, regardless of the status of DX11.

Nvidia has an edge in the professional market with Cuda, but that doesn't apply to consumer 3D.
 
Chris, I was in the process of chastising Digi for baiting you (I agree --everyone knows who you are and your history) when I discovered you seem to have taken personally a remark aimed at someone else entirely, as the remark Digi quoted showed.


I think my post here was misunderstood, as Digi implied that Scali was acting covertly. The focus group does not run that way, and I think Digi should know better, especially considering his history on the subject. Nvidia does not allow focus group members to post without labeling themselves as such, due to the initial problems with how it was perceived. I was just commenting on why I do not have one. There are not a lot of signatures on this forum, so I prefer not to try to stand out against that trend; I just decided to put it in my user title. In a nutshell, I am just tired of the witch hunts and everyone questioning everyone's agenda. Most of this should be irrelevant when discussing technology, as long as it's civil and polite. The baiting is simply not needed.

*edit* Lots of typos and corrections.
 
Like what games exactly? There hasn't been a single game so far where PhysX hasn't been gimmicky; it's never been an integral part of the gameplay experience, and indeed it can't be, because of the small fraction of the market that's PhysX-capable.

There we go, downplaying again.
I don't think you get the point... What we may see as 'gimmicky' may impress Joe Schmoe, because it's nice and colourful and it moves around. It's smoke and mirrors, so to say.

Oh come on now... It's not as if DX11 is something completely nebulous and immaterial. DX11 exists right now, it's being put into boxes as we type; titles for it exist in advanced stages of development, they just haven't been released yet.

Hello!? How often do I have to repeat myself? I actually have a DX11 engine in development myself.
You just don't get what I'm saying, and I'm tired of trying to explain it. You just don't want to understand.

If AMD wants to show off DX11 stuff at their hardware introduction they can surely do so, it's well within their power and ability to do so.

But not with actual games, and demos just don't work that well anymore, because of what happened over the years, with all sorts of demos/benchmarks being rigged, effects/features never materializing, things not looking exactly as good as the demo did, etc.. People have become suspicious. Even hardware/game reviewers have become suspicious, and it shows in the reviews that they publish. The past few years there's been a strong focus on benchmarking with actual games, and trying to capture the actual gameplay experience, rather than just running canned demos and benchmarks.

You're making a bigger deal of it than actually warranted IMO.

Your opinion is noted. Have a nice day.
 
The next person to use "you" or "your" in a post on this thread (after this one, of course) in the next ~24 hours gets a free one week vacation. This isn't RPSC. Members will discuss the issues relevant to the thread, not each other. Or else.

The next smart-aleck oneliner in the next ~24 hours gets a discount --just a one day vacation.

Why the "~"? Because if it is 24 hours and two minutes or something of that carefully not exactly announced duration, no forum-lawyering will avail in defense, s'welp me!

Edit: "you" that is clearly in the plural (or join the southerners and go for y'all), or in the impersonal sense, as in "well, what you can do to get around that . . ." will be tolerated.
 
It might, yes. With titles like Mirror's Edge, Cryostasis and Batman AA, the customers are slowly becoming aware of the existence of PhysX, and realizing that there's some extra visuals to be had in various games.

Oh please. Mirror's Edge? Let's turn on PhysX so the whole game runs worse, and in the odd case where something is actually being looked at, some useless flag acts a little differently!


MOD EDIT: I suspect aaronspink did not see my last post, as he was responding to one from yesterday, so I edited instead. But really, I'm not kidding on avoiding personal pronouns for the next 24 hours on this thread.
 
Nvidia does not allow focus group members to post without labeling themselves as such, due to the initial problems with how it was perceived.

Right! They hire companies to do that for them! If people are skeptical of astroturfing on the part of Nvidia, it is well-deserved skepticism.

In fact I wouldn't be surprised if Nvidia started a large-scale astroturf campaign to go along with their latest strategy of PhysX >> ALL. Realistically, it is the only marketing argument they can make (price cuts cannot really be considered marketing arguments, and actually have a history of acting as anti-marketing for products, in the "man that's cheap, I wonder what's wrong with it" vein). So from a biz perspective, one could hardly find fault with pushing their story of PhysX as hard as possible.

Not that I think it will work on a large scale, but any aborted or delayed sale is a benefit at this point to nvidia.
 
I wonder what people will say when ATI shows Dx11 games running with Dx11 features on a Dx11 card? Nevermind, that already happened.

I can only imagine that reviewers will also have access to Dx11 games when reviewing the Dx11 card. Unless for some bizarre reason MS won't allow them access to Dx11 for review purposes?

Regards,
SB
 
I can't really believe that so much space was wasted arguing that PhysX matters over D3D11. I mean really, that makes zero sense to even the most platform-agnostic person; heck, I could be an S3 zealot and it would still remain an absurd suggestion.

I think the moral of this thread is that Nvidia will focus on PhysX over the 5800, and that's OK. Well, we'll find out next year...
 
I think the moral of this thread is that nVidia will spread whatever FUD they feel they must in order to make up for their lack of a competing product. :yep2:
 
Then you ask me to support that people promote a certain brand/company on the internet :)

(fixed) assumption was one-sided. I'm simply pointing out I don't think it's that easy to say (and I know it's not true in the outside world where the noobs dwell). Where is crossfirezone.net? lol. :p
Anyway, the beta testers, I'm sure, are far outnumbered by the other group that does so much more than us. Dare I say we're not even active anymore? I still have it in my sig because I'm proud of the fact that I helped contribute what I could.
I don't find anything ironic to it at all.
 
I think a few people should try decaf.

Nvidia will respond like any company in a bad position will: they will lie, cheat, and steal their way into the hands of new consumers. I expect their marketing department will be working overtime.
 
As promised, here are some quick Batman: AA numbers using the in-game benchmark. Test system is a Core i7 965, 6GB, GTX 285, VelociRaptor/Intel X-25M SSD, 190.38 driver set. All in-game settings maxed out, 16x AF set via the control panel (8x AA via the app). I don't have much free time tonight, so just ran the test three times at 2560x1600.

PhysX off = 58 average fps
PhysX Normal = 35 fps
PhysX High = 28 fps

So at that res, with those settings, PhysX literally cuts the frame rate in half (at least in that benchmark, which specifically focuses on scenes using it). The difference between high and normal is cloth physics (no fluttering flags or cobwebs at normal).
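For anyone who wants to sanity-check the "cuts the frame rate in half" claim, here is a quick sketch (assuming the three average-fps figures quoted above) that computes the relative cost of each PhysX setting against the PhysX-off baseline:

```python
# Relative PhysX cost from the quoted Batman: AA numbers
# (GTX 285, 2560x1600, in-game benchmark, average fps).
results = {"off": 58, "normal": 35, "high": 28}

baseline = results["off"]
for setting, fps in results.items():
    drop = (1 - fps / baseline) * 100  # percent below the PhysX-off baseline
    print(f"PhysX {setting}: {fps} fps ({drop:.0f}% below PhysX off)")
```

That works out to roughly a 40% hit at Normal and a 52% hit at High, which matches the "half the frame rate" observation.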
 
@JR

Since this is UE3, I suppose it's frame-capped at 62fps (hovering around that area) and you could possibly go higher.
 

Ah, forgot about that. Glad I tested at the highest res I could then.

It's in the menu, Diggie, though your special 'digital download' copy might not have it. :cool:
 
I can't really believe that so much space was wasted arguing that PhysX matters over D3D11. I mean really, that makes zero sense to even the most platform-agnostic person; heck, I could be an S3 zealot and it would still remain an absurd suggestion.
As absurd as it appears to you, and I'm sure to many others including myself, this seems to be the #1 stance Nvidia is taking. When it first became clear that AMD would beat NV to the DX11 market, I heard suggestions that Nvidia would focus on PhysX and claim it is more important, but I never actually believed they would do it.
I think the moral of this thread is that nVidia will spread whatever FUD they feel they must in order to make up for their lack of a competing product. :yep2:
Of course. But Nvidia does have a big advantage over AMD: they are the market leader (let's leave Intel out of this discussion), so what they say does carry a significant amount of weight. How far that leading position will help them stop the bleeding remains to be seen.

The thing is, if Nvidia has their DX11 part in stores around January, then all of this will be forgotten. If it stretches into Feb/March and there's still nothing, then Nvidia will be in serious trouble.
 