It's funny; there seems to be less interest at this point in what Nvidia's high-end GPU is going to be. The interest seems to revolve around the question of when it will arrive relative to AMD's high-end GPUs.
To be fair, this thread was never about the GT300 specifically. It's been a gloating thread from the start.
Pfft, holiday season isn't important. If it does matter even remotely, then Nvidia will have launched G300 in some form or another.

I dunno, if it's not out until January, I think that would hurt them plenty, just from missing the holiday season.
Honestly! I have no idea what that is.
Geo, we're allowed to use "I", right? And "we"?
As promised, here are some quick Batman: AA numbers using the in-game benchmark. Test system is a Core i7 965, 6GB, GTX 285, VelociRaptor/Intel X-25M SSD, 190.38 driver set. All in-game settings maxed out, 16x AF set via the control panel (8x AA via the app). I don't have much free time tonight, so I just ran the test three times at 2560x1600.
PhysX off = 58 average fps
PhysX Normal = 35 fps
PhysX High = 28 fps
So at that res, with those settings, PhysX literally cuts the frame rate in half (at least in that benchmark, which specifically focuses on scenes using it). The difference between high and normal is cloth physics (no fluttering flags or cobwebs at normal).
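Another way to read those numbers is per-frame cost rather than fps. A throwaway Python snippet (the math and labels are mine, just working from the averages above):

[code]
# Convert the average fps figures above into frame times.
# Frame time (ms) = 1000 / fps; the last column is how many
# milliseconds each PhysX level adds over the PhysX-off run.
results = [("PhysX off", 58), ("PhysX Normal", 35), ("PhysX High", 28)]

baseline_ms = 1000.0 / results[0][1]  # ~17.2 ms/frame with PhysX off
for label, fps in results:
    frame_ms = 1000.0 / fps
    print(f"{label:13s} {fps:2d} fps = {frame_ms:5.1f} ms/frame "
          f"(+{frame_ms - baseline_ms:4.1f} ms)")
[/code]

That works out to roughly +11 ms per frame at Normal and +18 ms at High, which is a big chunk to add on top of a ~17 ms baseline.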
Do devs have working G300s?

Would they tell you?
8xMSAA at 2560? Doesn't sound like a very demanding game to me. Two questions for when you have time: how does it fare with 4xAA instead, and what's roughly the average frame rate that still guarantees playability?
I think the moral of this thread is that nVidia will spread whatever FUD they feel they must in order to make up for their lack of a competing product.
Pfft, holiday season isn't important. If it does matter even remotely, then Nvidia will have launched G300 in some form or another.
Of course, Nvidia will soon be telling us that Christmas isn't important and Easter is what consumers actually want:
Nvidia's Sr. Vice President of Investor Relations, Mike Hara, has played down the significance of Christmas at the Deutsche Bank Securities Technology Conference. Instead, Mr. Hara insists holidays like Easter, the 4th of July, and Thanksgiving are the future.
“Christmas by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons. This is why Microsoft is working with the industry to allow more freedom and more creativity in how you build holidays, which is always good, and the new features at Christmas are going to allow people to do that. But that no longer is the only reason, we believe, consumers would want to invest in Christmas,” explains Mr. Hara.
8xMSAA at 2560? Doesn't sound like a very demanding game to me. Two questions for when you have time: how does it fare with 4xAA instead, and what's roughly the average frame rate that still guarantees playability?
Proves the point I made earlier though... Even current cards can get 60+ fps at very high resolutions.
The game is capped at 60 fps anyway, so what point is there in a faster AMD card?
It may be faster than the GTX285, but the GTX285 can already do 60+ fps at 2560 res with 8xMSAA and 16xAF. Where do you go from there?
Eyefinity?
Someone should make a generator for this, where you pick two words (e.g. GPGPU and DX11, Easter and Christmas, etc.) and it generates an Nvidia statement.
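Something like this, maybe? A quick Python sketch; the template is just the parody Hara quote above with the two nouns swapped out for placeholders, and the word pairs are the examples from this post:

[code]
# Toy "Nvidia statement generator": pick two words and they get
# slotted into a template lifted from the quote earlier in the thread.
import random

TEMPLATE = (
    "{a} by itself is not going to be the defining reason to buy a new GPU. "
    "It will be one of the reasons. But that no longer is the only reason, "
    "we believe, consumers would want to invest in {b}."
)

# Example word pairs suggested above.
PAIRS = [("GPGPU", "DX11"), ("Easter", "Christmas")]

def nvidia_statement(a: str, b: str) -> str:
    """Fill the two chosen words into the statement template."""
    return TEMPLATE.format(a=a, b=b)

if __name__ == "__main__":
    print(nvidia_statement(*random.choice(PAIRS)))
[/code]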
As promised, here are some quick Batman: AA numbers using the in-game benchmark. Test system is a Core i7 965, 6GB, GTX 285, VelociRaptor/Intel X-25M SSD, 190.38 driver set. All in-game settings maxed out, 16x AF set via the control panel (8x AA via the app). I don't have much free time tonight, so I just ran the test three times at 2560x1600.
PhysX off = 58 average fps
PhysX Normal = 35 fps
PhysX High = 28 fps
So at that res, with those settings, PhysX literally cuts the frame rate in half (at least in that benchmark, which specifically focuses on scenes using it). The difference between high and normal is cloth physics (no fluttering flags or cobwebs at normal).
What am I missing here? I thought that by buying an nVidia card (and one of the best ones, at that), you get a lot of visual extras and the card takes care of the math for them? Then why does the fps tank? The 285 is quite a capable card, no?
Looks like strange marketing to me: "Buy a PhysX card with lots of visual extras in games and get your fps halved at the same time." With fps nearly halved anyway, why couldn't these extras be run on the CPU alongside an AMD GPU instead?
Please explain as I feel very confused at the moment.