NVIDIA GF100 & Friends speculation

Only in a few games, not generally. And I bet that ATI will be late with optimizing Metro33. ^^
They (ATI) have a different mantra now; I bet they will have tweaks for Metro33 sooner than you'd expect. However, how much of a gain they squeeze out, and whether it will even matter, are the more appropriate questions.
 
So should we assume that every DX11 title released to date will be faster on AMD?

What kind of silly argument is this?

Nvidia exchanged money with this developer. Believe it was outright payments for performance, believe it was marketing money. Whatever you believe, you can bet that Nvidia didn't pay out money for a game to be slower on its hardware.
 
So should we assume that every DX11 title released to date will be faster on AMD?

That's a good default assumption. Anyone can buy a boatload of ATI DX11 cards to play with and tune for.

NV DX11 cards are in short supply. I'm sure key developers have a few, but there's a big difference between having 1-3 cards running around your office and having 10s or 100s to play with.

Given that ATI cards have been around for a lot longer, devs are just going to be a lot more familiar with them and have more motivation for tuning (what is ATI's DX11 marketshare again?).

DK
 
That's a good default assumption. Anyone can buy a boatload of ATI DX11 cards to play with and tune for.

NV DX11 cards are in short supply. I'm sure key developers have a few, but there's a big difference between having 1-3 cards running around your office and having 10s or 100s to play with.

Given that ATI cards have been around for a lot longer, devs are just going to be a lot more familiar with them and have more motivation for tuning (what is ATI's DX11 marketshare again?).

DK

100%
 
What kind of silly argument is this?

Nvidia exchanged money with this developer. Believe it was outright payments for performance, believe it was marketing money. Whatever you believe, you can bet that Nvidia didn't pay out money for a game to be slower on its hardware.

Your belief is completely divorced from reality. nVidia does not pay out any money, and when you make claims like that you expose yourself as an uninformed spreader of FUD.

Why don't you actually read what nVidia does and doesn't do for game developers?

http://www.xbitlabs.com/news/multim...e_Developers_for_Implementation_of_PhysX.html
 
Erhm?
Crysis Warhead has reduced IQ compared to Crysis; what are you talking about?

Not really; all in all it is doing more. Just because the draw distance for procedural clutter is reduced doesn't change the fact that there is increased load from longer draw distances for other things. There is also increased load from more intensive post-process/effect use and more AI entities in battle, with up to 40 entities active at the same time. And Warhead has more high-res textures and a higher memory footprint than Crysis due to the increased detail.
 
Apparently it really depends on who you ask. http://www.bit-tech.net/hardware/graphics/2009/10/01/autumn-2009-graphics-card-buyers-guide/5

How come there haven't been any early benchmarks of Metro 2033 though? The game is due out on Tuesday right? Normally by now there would've been something.

bit-tech agrees with me. The 4890 in their testing is on par with the GTX 260. The X2 is 1 fps faster than the GTX 260. The GTX 285 is a good 14 fps faster.

As for Metro, who knows.
 