Why Can't Nvidia cards do..

I guess I'll wait for a mid-range G8x to upgrade my Ti 4200, so I'd get a new card when there's really something interesting to play (Crysis and Stalker), able to run at 1024 or 1152 with FP16 HDR and 2x AA, I hope.
(for some reason I want Nvidia)
 
I have a good 17" CRT, and I'm not rich. (it's big enough considering I sit close to it, and I had a 14" before)

Maybe it's a conservative expectation for how a "GF 8600" (probably 128-bit GDDR4) would run an Unreal Engine 3 style engine at highest detail (including the most expensive soft shadows), maybe not. Anyway, 1024 or 1152 with 4x rotated-grid AA is perfect on my display.
 
Well, the e-mail itself didn't mention the G71 at all, just current GeForce 7 hardware, so I'm not 100% sure what they meant in it either. Here's a snippet from the e-mail we got. Extrapolate it however you see fit.

Expect NVIDIA's next generation architecture to support HDR+AA.

This e-mail was sent out to us after I sent Nvidia a lot of questions regarding the issue of MSAA/SSAA + HDR. So you can at least understand that Nvidia has changed its position slightly since David Kirk first made his reply on the issue.
 
Ailuros said:
Pardon for the tongue-in-cheek comment, but it's not like anyone couldn't have figured that D3D10 GPUs would support HDR/MSAA combinations.
Well, right, I think we all expected it, but more from a "nVidia would be really screwed if they didn't do it" standpoint than from any position of real knowledge.
 
ChrisRay said:
Well, the e-mail itself didn't mention the G71 at all, just current GeForce 7 hardware, so I'm not 100% sure what they meant in it either. Here's a snippet from the e-mail we got. Extrapolate it however you see fit.
I thought that at first, too. But I think we're still expecting the G71 to be called "GeForce 7900," so I think the statement that the GF7 doesn't support HDR+AA due to a hardware limitation also applies here.
 
Chalnoth said:
I thought that at first, too. But I think we're still expecting the G71 to be called "GeForce 7900," so I think the statement that the GF7 doesn't support HDR+AA due to a hardware limitation also applies here.


*Nods* I thought so too. But I try to be very careful when quoting something like this, since it's so easy to misinterpret or misread. But I pinged Nvidia directly and got confirmation that it's beyond the G70 architecture, so I guess it's pretty hard to misinterpret that.
Chris
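(For what it's worth, here's roughly how this surfaces to an application under D3D9: you ask the runtime whether a float surface format can be multisampled. A minimal sketch, not from the e-mail; the 4x sample count and D3DFMT_A16B16G16R16F are just the illustrative case. On G7x-class hardware this is the check that comes back negative, while R5xx parts accept it.)

```cpp
// Minimal D3D9 sketch: ask whether the installed GPU can multisample an
// FP16 render-target format. On GeForce 7-class hardware this fails for
// D3DFMT_A16B16G16R16F, which is the hardware limitation discussed above.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // 64-bit float (FP16 per channel) render target
        TRUE,                      // windowed
        D3DMULTISAMPLE_4_SAMPLES,  // ask for 4x MSAA
        &qualityLevels);

    std::printf("FP16 render target + 4x MSAA: %s\n",
                SUCCEEDED(hr) ? "supported" : "not supported");

    d3d->Release();
    return 0;
}
```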
 
Pretty unrealistic to expect that significant a change to the ROPs in their current architecture for a refresh anyway, regardless of market/mindshare pressure to compete against ATI's X1xxx parts.
 
Blazkowicz_ said:
I have a good 17" CRT, and I'm not rich. (it's big enough considering I sit close to it, and I had a 14" before)

Maybe it's a conservative expectation for how a "GF 8600" (probably 128-bit GDDR4) would run an Unreal Engine 3 style engine at highest detail (including the most expensive soft shadows), maybe not. Anyway, 1024 or 1152 with 4x rotated-grid AA is perfect on my display.

Mainstream D3D10 GPUs are going to arrive late enough that I'm not really that worried about the first game using UE3 ;)
 
Chalnoth said:
Well, right, I think we all expected it, but more from a "nVidia would be really screwed if they didn't do it" standpoint than from any position of real knowledge.

Ignore the competition for a second; if you'd been asked beforehand, you'd have had very little doubt about it being an option.
 
ChrisRay said:
But I try to be very careful when quoting something like this, since it's so easy to misinterpret or misread.
Unless you're talking about being careful about commenting on e-mails you're allowed to post, you shouldn't need to be careful about posting e-mails you're allowed to post.

Usually, misinterpretation or misreading (by any member of the public) of an e-mail that someone was allowed to post makes for great discussions.

You have no idea how glad I was that John allowed me to post his response to my e-mail on the id/Creative issue way back then. Now that was something I should have been very careful about when it came to posting an e-mail!

Stop being careful when there's no such need :)

Sorry for the OT!
 
Reverend said:
Unless you're talking about being careful about commenting on e-mails you're allowed to post, you shouldn't need to be careful about posting e-mails you're allowed to post.

Usually, misinterpretation or misreading (by any member of the public) of an e-mail that someone was allowed to post makes for great discussions.

You have no idea how glad I was that John allowed me to post his response to my e-mail on the id/Creative issue way back then. Now that was something I should have been very careful about when it came to posting an e-mail!

Stop being careful when there's no such need :)

Sorry for the OT!

No no. What I meant is, I'm careful about quoting because I could be wrong about my interpretation of the quote. It's mostly a personal thing where I don't want to end up looking stupid, heh. There was nothing in the e-mail I'd have to worry about, just some replies from Nvidia about the possibility of supersampling and HDR. I didn't completely get the answer I was hoping for ((mostly related to multisampling)), but I did get some more useful information about the upcoming hardware. Lose some, gain some, I guess :)

Chris
 
OK, late to the game, but is it indeed possible for G7x to do AA in the pixel shader for FP16 HDR? If so, would it be easy to implement, and what would the performance hit be compared to doing AA in the ROPs?
 
MistaPi said:
OK, late to the game, but is it indeed possible for G7x to do AA in the pixel shader for FP16 HDR? If so, would it be easy to implement, and what would the performance hit be compared to doing AA in the ROPs?

Age of Empires 3 uses, afaik, 1.5*1.5 OGSS. I doubt anything beyond that is possible (as in multisampling, for instance, if we're talking about real float HDR). Note the genre and performance requirements of AoE3; there, a small dose of SSAA shouldn't be that much of an issue.
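(Roughly what that pixel-shader/supersampling fallback looks like in D3D9 terms, for anyone wondering. This is a sketch only: the helper name and parameters are made up for illustration, the 1.5x factor is borrowed from the AoE3 example, and StretchRect stands in for what a real engine would do with a fullscreen filter + tone-mapping pass.)

```cpp
#include <d3d9.h>

// Sketch of 1.5x1.5 ordered-grid supersampling with an FP16 target:
// draw the scene into an oversized float render target (FP16 blending still
// works), then filter it back down to the 1x back buffer. The downsample is
// the "AA" step, done without any help from the ROPs.
void RenderWithOGSS(IDirect3DDevice9* device, UINT width, UINT height)
{
    const UINT ssWidth  = width  * 3 / 2;   // 1.5x in each dimension
    const UINT ssHeight = height * 3 / 2;

    IDirect3DTexture9* hdrTex = NULL;
    device->CreateTexture(ssWidth, ssHeight, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A16B16G16R16F,   // FP16 so HDR blending is kept
                          D3DPOOL_DEFAULT, &hdrTex, NULL);

    IDirect3DSurface9* hdrSurf = NULL;
    hdrTex->GetSurfaceLevel(0, &hdrSurf);
    device->SetRenderTarget(0, hdrSurf);
    // ... draw the scene here, at 1.5x resolution ...

    // Resolve to the 1x back buffer; a real engine would use a fullscreen
    // quad that filters and tone-maps in one pass instead of StretchRect.
    IDirect3DSurface9* backBuffer = NULL;
    device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backBuffer);
    device->StretchRect(hdrSurf, NULL, backBuffer, NULL, D3DTEXF_LINEAR);

    backBuffer->Release();
    hdrSurf->Release();
    hdrTex->Release();
}
```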
 
So when can ATI users take advantage of this feature, or is it the next 3Dc?

At this moment there is only ONE game with official MSAA + FP HDR support on ATI hardware where Nvidia can't do it: Serious Sam 2.

Far Cry: no official patch out yet.
AoE3: no FP-blending HDR for ATI.
SCCT: no FP-blending HDR for ATI.
Oblivion: devs already said no support for HDR+MSAA.

ATI, please spend some $$$ on marketing.
 
Chalnoth said:
The correct term would be developer relations :)

That's not correct either in a strict sense; there's not that much even devrel could do for released games when a company arrives that late with float HDR support.

Despite any possible antagonism between ATI and NVIDIA, float HDR was more like a test run for users to see partially what it's all about (both in games and in hardware support), and we'll get real float-HDR-enabled games when D3D10 GPUs arrive.
 