NVIDIA GF100 & Friends speculation

Of course, because it's not like you can disable it if you wish. Oh wait, you can!! ;)

That really wasn't in the scenario he gave, was it? Propping up the 'value' of a card by showing us how we can use a feature supported by a scant handful of games to halve our framerate isn't something I'm particularly interested in. My opinion does not equate to that of everyone else, of course, but since he posed the impromptu question, I provided my answer.

Or you can pick up a cheap 8800 and use it as a stand-alone fizzX card. :)
Only if you have another NVIDIA product to actually display the game though ;)
 
You sure about that sp/tmu combination you got there....... :)
Well he said half-quad tmus. Seems highly doubtful to me though. 3 half-quad tmus per SM? It's true half-quad tmus exist, but I've only ever seen them where the total number of tmus was less than 4... OTOH such a thing shouldn't be very difficult to implement: because you really want to operate on quads (derivatives and all), each half-quad tmu would still work on quads, but simply deliver the results over 2 clocks.
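
To make the 2-clock idea concrete, here's a toy C sketch (entirely my own illustration with a made-up fetch function, not anything from an actual GF100 design): a unit with 2 filtering lanes still accepts whole quads, so derivatives can come from the full 2x2 footprint, but it retires each quad over 2 clocks instead of 1.

Code:
/* Toy half-quad TMU sketch: 2 lanes, quad input, 2 clocks per quad. */
#include <stdio.h>

typedef struct { float u, v; } Coord;

/* stand-in for a real bilinear fetch/filter from memory */
static float fetch(Coord c) { return 0.5f * c.u + 0.5f * c.v; }

int main(void) {
    Coord quad[4] = { {0,0}, {1,0}, {0,1}, {1,1} }; /* one 2x2 pixel quad */
    float result[4];
    int clocks = 0;

    /* 2 lanes -> process the quad in two halves, one half per clock.
       Derivatives would still be taken across the whole quad's coords,
       which is why the unit takes quads rather than arbitrary pairs. */
    for (int half = 0; half < 2; ++half) {
        for (int lane = 0; lane < 2; ++lane)
            result[half * 2 + lane] = fetch(quad[half * 2 + lane]);
        ++clocks;
    }

    printf("quad filtered in %d clocks\n", clocks); /* prints 2 */
    for (int i = 0; i < 4; ++i)
        printf("texel %d = %f\n", i, result[i]);
    return 0;
}

The point is purely the scheduling: inputs still arrive as whole quads, only the filtered-output rate is halved.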
 
That really wasn't in the scenario he gave, was it? Propping up the 'value' of a card by showing us how we can use a feature supported by a scant handful of games to halve our framerate isn't something I'm particularly interested in. My opinion does not equate to that of everyone else, of course, but since he posed the impromptu question, I provided my answer.

Well yeah on an individual level you can decide that the "PhysX option" is of zero value to you. Yet it is an option and thus inherently has a value over not having that option. So in Chalnoth's scenario where the card with the option is priced the same as the one without, the former is inherently of higher value to the consumer.

It's like tessellation for me now. Games haven't done anything with it yet but I still miss the ability to play with Unigine first-hand. So there still is some value to having that particular feature even if it's just to run a meaningless demo/benchmark.
 
That really wasn't in the scenario he gave, was it? Propping up the 'value' of a card by showing us how we can use a feature supported by a scant handful of games to halve our framerate isn't something I'm particularly interested in. My opinion does not equate to that of everyone else, of course, but since he posed the impromptu question, I provided my answer.

Wrong, because being able to disable it IS part of the question.

You may not like what you see now, but what if you do eventually? You'll be stuck with something that can't provide it, while if you did buy what does, you'll be able to choose to enable it whenever you can or want.

Also, by your "half the frames" reasoning, you will never use the DX11 path or tessellation either, on current hardware, because the hit is quite significant too...
 
I would like to see a poll asking who considers physX when making a GPU purchase.

A poll here wouldn't be very useful. To be properly meaningful such a poll would have to include folks who stand in their local B&M store and read the words on the back of the packaging. By and large those people don't come here.
 
Well, if you do things official-like, I prefer to run a 5970 with my 8800GTS tbh.
That wasn't how I interpreted what you presented; however, it doesn't change my concept of value much either. I don't have any interest in physics on my GPU, because it's barely playable now on a GT200-class card, and going forward it's only going to get less performant...

Wrong, because being able to disable it IS part of the question.

You may not like what you see now, but what if you do eventually? You'll be stuck with something that can't provide it, while if you did buy what does, you'll be able to choose to enable it whenever you can or want.
Again, that's not how I interpreted his question. And even so, as I already replied above, the performance on a single GT200 is pretty crap right now when doing "dual duty" -- I don't imagine it getting any better without me buying a wholly separate card.

Also, by your "half the frames" reasoning, you will never use the DX11 path or tessellation either, on current hardware, because the hit is quite significant too...
The two are not the same. I can have a DX11-capable rendering engine (display API) that performs fantastically (and there are already several). Whereas using a video card for PhysX and display simultaneously (also how I interpreted his original hypothetical situation) is only going to get MORE demanding as games continue to evolve.

I don't see a single card having enough horsepower to do what I want with DX9, 10 or 11 and still having enough "gusto" to optionally run a bunch of physics simulations that I currently am not interested in.
 
A poll here wouldn't be very useful. To be properly meaningful such a poll would have to include folks who stand in their local B&M store and read the words on the back of the packaging. By and large those people don't come here.

Yep, you might as well make the poll question "ATi vs Nvidia?" since that's going to produce the same result. Besides, there are lots of polls about PhysX already on bigger sites. But the question isn't whether you would buy something for PhysX; that would be silly. The question is whether you would take it for free. That's essentially what's being offered to Nvidia's customers.

How do you capture the "normal people" who google for Batman reviews and come across this?

That said, for those of you wanting our honest opinion, we’ll admit that if you want to get the most out of Batman then you will need Nvidia hardware – and the extra content is worth the effort if you’re thinking of making an upgrade soon anyway.
 
How do you capture the "normal people" who google for Batman reviews and come across this?

"normal people" don't read reviews in the first place. Normal people buy based on
1) advertising
2) attractive packaging
3) word of mouth

Also, WRT the particular review cited, I happen to agree with their conclusion, as I believe most gamers who have tried Batman with and without PhysX would as well.
 
That wasn't how I interpreted what you presented; however, it doesn't change my concept of value much either. I don't have any interest in physics on my GPU, because it's barely playable now on a GT200-class card, and going forward it's only going to get less performant...


Again, that's not how I interpreted his question. And even so, as I already replied above, the performance on a single GT200 is pretty crap right now when doing "dual duty" -- I don't imagine it getting any better without me buying a wholly separate card.


The two are not the same. I can have a DX11-capable rendering engine (display API) that performs fantastically (and there are already several). Whereas using a video card for PhysX and display simultaneously (also how I interpreted his original hypothetical situation) is only going to get MORE demanding as games continue to evolve.

I don't see a single card having enough horsepower to do what I want with DX9, 10 or 11 and still having enough "gusto" to optionally run a bunch of physics simulations that I currently am not interested in.

Hey, replace "PhysX" with "Tessellation" and you have the same situation with Evergreen. :LOL:
 
Yep, you might as well make the poll question "ATi vs Nvidia?" since that's going to produce the same result. Besides, there are lots of polls about PhysX already on bigger sites. But the question isn't whether you would buy something for PhysX; that would be silly. The question is whether you would take it for free. That's essentially what's being offered to Nvidia's customers.

How do you capture the "normal people" who google for Batman reviews and come across this?

Beta FluidMark 1.2 results show that CPUs are not bad at all at PhysX (at least in this particle simulation): http://physxinfo.com/news/2390/new-physx-fluidmark-1-2-first-tests/
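
For a rough sense of the kind of workload such a particle benchmark stresses, here's a minimal C sketch (my own toy example, nothing to do with FluidMark's or PhysX's actual internals): a plain Euler integration loop over a big particle array, which is exactly the sort of cache-friendly, vectorizable work modern multi-core CPUs handle well.

Code:
/* Toy particle update loop: integrate N particles under gravity with a
   floor bounce -- the basic shape of the per-frame work a CPU-side
   particle benchmark hammers on. */
#include <stdio.h>

#define N 100000

typedef struct { float x, y, z, vx, vy, vz; } Particle;

static Particle p[N];

static void step(float dt) {
    for (int i = 0; i < N; ++i) {   /* trivially vectorizable loop */
        p[i].vy -= 9.81f * dt;      /* gravity */
        p[i].x  += p[i].vx * dt;
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
        if (p[i].y < 0.0f) {        /* bounce off the floor */
            p[i].y  = 0.0f;
            p[i].vy = -0.5f * p[i].vy;
        }
    }
}

int main(void) {
    for (int i = 0; i < N; ++i)
        p[i] = (Particle){ 0.0f, (float)(i % 100), 0.0f, 0.0f, 0.0f, 0.0f };
    for (int frame = 0; frame < 600; ++frame)  /* ~10 seconds at 60 fps */
        step(1.0f / 60.0f);
    printf("particle 0 height: %f\n", p[0].y);
    return 0;
}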
 
"normal people" don't read reviews in the first place. Normal people buy based on
1) advertising
2) attractive packaging
3) word of mouth

Of course they do. I read reviews on nearly everything I purchase, yet I don't immerse myself in the online community like I do with GPUs. I've never participated in a poll or forum on cars, digital cameras, TVs, etc., but I surely read reviews on them before buying.
 
Fermi 2 should be slated for Q4 next year on 28nm. Can't see it being any earlier than that. They probably just realized B1 wouldn't have done squat because something is fundamentally wrong. It'll be easy to tell once the down-market parts are released.
 