Which card is the king of the hill (NV40 or R420)?

  • NV40 wins
  • They are equally matched
  • Total voters: 415
DaveBaumann said:
I just think that telling people now "give up half your performance for this" isn't going to happen for a generation or so, on a widescale basis.

"This" doesn't seem to be a small thing though from the screenshots i've seen. And half the performance of the NV4X, R420 is hardly slow. Definitely not compared to my GF4 :)
 
Bjorn said:
I thought that the loss of FSAA wasn't necessarily true. Though there's of course always the SSAA option :)

And loss of fill-rate doesn't seem like a big issue with the R420/NV4X generation. The question should rather be, will the games still be playable at acceptable resolutions using those features?

For the $400-500 cards the fillrate might not be an issue, but the other 95% of the market may run into problems.
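For what it's worth, a quick back-of-the-envelope sketch of the bandwidth side of that worry (a minimal Python calculation; the resolution, overdraw and frame-rate figures are illustrative assumptions, not numbers from this thread). Moving the colour buffer from RGBA8 to an FP16 format roughly doubles the per-pixel colour traffic, which is the part that hurts most on cards with narrower memory buses:

```python
# Back-of-the-envelope sketch; resolution, overdraw and frame rate are assumed values.
width, height = 1600, 1200
fps = 60
overdraw = 3                                   # assumed average overdraw per frame

for name, bytes_per_pixel in (("RGBA8  ", 4), ("RGBA16F", 8)):
    # one read plus one write per covered pixel once blending is involved
    gb_per_s = width * height * overdraw * bytes_per_pixel * 2 * fps / 1e9
    print(f"{name}: ~{gb_per_s:.1f} GB/s of colour-buffer traffic")
```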
 
Putting it in CoD or SS:SE will drop performance significantly, but given the environments and LoD in those games, would it really suit their detail? Putting it in a more detailed environment, such as you are likely to see in D3/HL2/UE3/etc. would look much better and more correct, but then those titles are going to have significantly lower performance anyway.
 
DaveBaumann said:
No, it is the case MSAA doesn't work with NV40's FP blending.

As pointed out to you in another thread, you have the same issues when trying to use HDR on ATI without blending. The people responding to you were trying to point out that blending is a way to gain back performance compared to using shaders, and that hardware filtering runs faster than doing it in the shaders.

As for the bandwidth and fillrate implications, we're seeing some games with frame rates well in excess of 60fps at highest resolutions and games that are heavily CPU limited. There are plenty of scenarios where HDR rendering could be used which would not be a 50% performance reduction across the board. Sure, it won't work with all game engines, but did anyone suggest enabling HDR like it is some kind of "always on" FSAA-like feature? You are pooh-poohing it way beyond the level needed. Sometimes it can be used, sometimes it can't, just like lots of other 3D features (e.g. Temporal FSAA doesn't work on all games either). The 6800U and X800 certainly have enough performance to enable it in some scenarios where it is desperately needed. Can't you at least admit that it's a usable feature? Is there any exclusive feature of the NV40 that you don't dislike?
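To make the blending point above concrete, here is a minimal NumPy sketch (my own illustration, nothing posted in the thread) of the two ways to accumulate HDR contributions: hardware FP16 blending as on NV40, versus the slower shader-based route of ping-ponging between two float targets, which is what hardware without FP blending has to fall back on:

```python
import numpy as np

# Tiny stand-in framebuffer and a few lighting passes to accumulate.
height, width = 4, 4
passes = [np.random.rand(height, width).astype(np.float16) for _ in range(3)]

# Path A: hardware FP16 blending (dst = dst + src done in the blend unit).
accum_hw = np.zeros((height, width), dtype=np.float16)
for contribution in passes:
    accum_hw += contribution

# Path B: no FP blending, so emulate the blend by ping-ponging between two
# float targets: the "shader" samples the previous result as a texture, adds
# the new contribution and writes the other target, then the targets swap.
ping = np.zeros((height, width), dtype=np.float16)
pong = np.zeros_like(ping)
for contribution in passes:
    pong[:] = ping + contribution
    ping, pong = pong, ping

# Same image either way; path B just pays an extra texture fetch and a
# render-target switch on every pass.
assert np.array_equal(accum_hw, ping)
```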
 
DaveBaumann said:
Putting it in a more detailed environment, such as you are likely to see in D3/HL2/UE3/etc. would look much better and more correct, but then those titles are going to have significantly lower performance anyway.

Half the performance of the NV4X would still be in the ballpark of the 9800 Pro/XT. And I doubt that the 9800 will have problems running these games. Well, UE3 engine games will probably be problematic, but then those are 1.5-2 years in the future as well :)
 
Democoder said:
As pointed out to you in another thread, you have the same issues when trying to use HDR on ATI without blending.

I didn't say there wasn't. I was addressing Bjorn's question, which was concerning the NV40.

As for the bandwidth and fillrate implications, we're seeing some games with frame rates well in excess of 60fps at highest resolutions and games that are heavily CPU limited.

And these games are relatively low detail. My point is that increased levels of detail are going to be required, and this will impact performance - you can't have it all, at least not yet.

Can't you at least admit that it's a usable feature? Is there any exclusive feature of the NV40 that you don't dislike?

By god, you really can be a sad little Muppet at times, can't you? You're the one ascribing "dislike" to these things because I have expressed that some things aren't congruent with your point of view - and then you tar everything with your own biased brush. I shall refer you to the conclusion of the NV40 preview. Liking or disliking a feature doesn't necessarily mean that the performance will be there for it to be utilised effectively and on a widespread basis in the generation it's introduced in, either.
 
Bjorn said:
DaveBaumann said:
Putting it in a more detailed environment, such as you are likely to see in D3/HL2/UE3/etc. would look much better and more correct, but then those titles are going to have significantly lower performance anyway.

Well, half the performance of the NV4X would still be in the ballpark of the 9800 Pro/XT. And I doubt that the 9800 will have problems running these games. Well, UE3 engine games will probably be problematic, but then those are 1.5-2 years in the future as well :)

NV4x does not necessarily mean a 16-pipe card at 400 MHz. What about the 12-pipe 6800 at 300-375 MHz? What about the lower end parts?
 
Bjorn said:
Well, UE3 engine games will probably be problematic, but then those are 1.5-2 years in the future as well :)

Which will probably be a generation or so's hardware down the line! :)
 
I just thought the lack of conviction in this particular post was amusing... with all the maybes.

I was being nice and open minded. Besides, my speciality is Programme Management and large-scale finance in military projects, not 3D hardware. :)
 
AlphaWolf said:
NV4x does not necessarily mean a 16-pipe card at 400 MHz. What about the 12-pipe 6800 at 300-375 MHz? What about the lower end parts?

That's true. But HDR rendering was supposed to be usable on an R300 using much slower methods than FP blending. Now we have the NV40, which is just released, has double the performance of the last gen, and can use a much more effective way to do HDR, and all of a sudden it's close to unusable?
 
That's true. But HDR rendering was supposed to be usable on an R300 using much slower methods than FP blending. Now we have the NV40, which is just released, has double the performance of the last gen, and can use a much more effective way to do HDR, and all of a sudden it's close to unusable?

It's called 'Managing Expectations'...
 
DaveBaumann said:
Which will probably be a generation or so's hardware down the line! :)

Sure. But I sometimes think that we are very quick to judge older hardware. The R9700 is close to 2 years old and is still a mid-range card by today's standards. And we just got a doubling in performance; add to that the problems with getting benefits from moving to "smaller" manufacturing processes and you start to wonder if we'll see any doubling of GPU performance next year. And ATi is hardly in any better position than NVidia since they need to move to FP32 and SM3.0 (or better).
 
Sad little muppet? So it's silly name-calling now that you're being criticized? I just pointed out that you're in these forums beating drums for ATI, but not once have I seen you defend any features of the NV40. In fact, you've taken features you were previously enthusiastic about and declared them unusable.

Your attitude was completely different when it was NV3x being accused of not supporting HDR formats in DX9. Back then, nVidiots were raising the same concerns you are now, but you weren't raising any skepticism about performance.

DaveBaumann said:
Whilst Valve were there, they demonstrated that they had implemented higher dynamic range into the engine. HDR will be available in the game to those that support its requirements; however, it is not currently implemented in the benchmark. HDR needs a higher range, and PS1.4 only has a -8 to 8 range, which is probably not enough - the HDR effects also require float buffers, which none of the current NVIDIA drivers support. There are also dramatic visual differences between HDR being enabled and disabled.
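A quick numeric illustration of the range point in that quote (the intensity values and the Reinhard-style tone map below are my own assumptions, not anything from Valve or this thread): clamping to a fixed [-8, 8] range flattens every bright source to the same value, while a float buffer keeps the relative scale for the tone-mapping pass to work with:

```python
import numpy as np

# Hypothetical scene intensities: shadow, lit surface, a lamp, the sun.
radiance = np.array([0.5, 2.0, 40.0, 300.0], dtype=np.float32)

clamped = np.clip(radiance, -8.0, 8.0)    # fixed [-8, 8] range, PS1.4-style
hdr = radiance.astype(np.float16)         # FP16 buffer keeps the full range

tonemap = lambda x: x / (1.0 + x)         # simple Reinhard-style operator
print("clamped:", tonemap(clamped))       # the lamp and the sun end up identical
print("float  :", tonemap(hdr.astype(np.float32)))
```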

DaveBaumann said:
The other issue is that I actually don't think there are any solutions to the DX9 performance. What Gabe was saying is that as more is added to the engine this will be DX9-only content, and thus the performance difference will get worse than it is with the mixed modes. HDR is a great example of this: it will force any board that it's enabled on to run exclusively in high precision float (i.e. over the PS2.0 pipeline) - because the FX series is so lacking in FP performance in the PS, the relative performance will be what we are seeing in the current DX9 (or even worse). I think it's more of a case of building that awareness, because they just aren't that great for floating point rendering performance.


I mean, did you consider back then that HDR performance would be bad? Instead, you seemed to assume it would be good on R300, and therefore a detriment to NV3x because it would force float precision.

You practically absorbed and parroted everything Valve said back then without much skepticism, especially if directed against NV in some way.
 
I just pointed out that you're in these forums beating drums for ATI, but not once have I seen you defend any features of the NV40.

No, you are making false assumptions based on your own preferences – I’ve already made my position clear in the review, whatever else you think is your own issue.

I mean, did you consider back then that HDR performance would be bad? Instead, you seemed to assume it would be good on R300, and therefore a detriment to NV3x because it would force float precision.

You appear to be developing blind spots to known issues again – NV3x (more so with parts below NV35) does suffer from float shaders over integer, so regardless of the performance with R300, forcing a PS2.0 path would reduce performance from this alone, before we consider the extra processing for HDR – how many data points do we need before we can state it reasonably with extrapolated logic?

As for performance, I don't know what it is actually like running; however, it didn't appear to be bad from the demos shown, which I assume were running in realtime. However, do we know exactly how the HDR is rendered in HL2? Do they use full resolution float buffers, or are they just using regioned HDR areas?
 
DemoCoder said:
Sad little muppet? So it's silly name-calling now that you're being criticized? I just pointed out that you're in these forums beating drums for ATI, but not once have I seen you defend any features of the NV40. In fact, you've taken features you were previously enthusiastic about and declared them unusable.

Your attitude was completely different when it was NV3x being accused of not supporting HDR formats in DX9. Back then, nVidiots were raising the same concerns you are now, but you weren't raising any skepticism about performance.

I mean, did you consider back then that HDR performance would be bad? Instead, you seemed to assume it would be good on R300, and therefore a detriment to NV3x because it would force float precision.

You practically absorbed and parroted everything Valve said back then without much skepticism, especially if directed against NV in some way.

There is a difference between having a preference based solely on a brand name and having one based on the performance and key features one product may display over another. You don't seem to understand this. If someone chooses to pick the 6800 from a neutral viewpoint, they may be described as having a preference towards that product. If someone chooses the 6800 based on the fact that it is made by nvidia and thus must be good, they have a bias. At this point in time ATI has the upper hand from a technological standpoint; this is not an opinion, it is fact.

Valve have demonstrated HDR running on a 9800 Pro at perfectly reasonable framerates (reasonable being in the area of 35-40 fps). The NV3x, on the other hand, is incapable of HDR at these reasonable framerates due to its architecture and the way it handles DX9. Now the 6800 is another story altogether, and I see no reason why it and the X800 shouldn't be able to handle HDR rather easily.
 
At this point in time ATI has the upper hand from a technological standpoint; this is not an opinion, it is fact.

I'm sorry I don't agree.

ATI certainly made different design decisions than NVidia, but technology includes featureset as much as it encompasses performance. PS3.0 and 32-bit FP may not be important to the average consumer, but that doesn't diminish their technological worth.

Personally, I'm disappointed with the X800; it's utterly useless to me, but I have very different requirements from the average gaming user - I need to be writing PS3.0 shaders now.

Stating that something is a fact doesn't make it so.
 
ANova said:
Valve have demonstrated HDR running on a 9800 Pro at perfectly reasonable framerates (reasonable being in the area of 35-40 fps). The NV3x, on the other hand, is incapable of HDR at these reasonable framerates due to its architecture and the way it handles DX9. Now the 6800 is another story altogether, and I see no reason why it and the X800 shouldn't be able to handle HDR rather easily.

Well, then argue that with Dave. I'm not arguing against HDR being playable, he is (after asserting how much better it was, too, during that ATI demo). I happen to think the 6800U will do HDR very well, better than the X800 in fact.


As for the X800 being the technical leader, a fact? I disagree. It's opinion. That's a value judgement that is subjective, based on each person's needs. Performance leader? Maybe. All-around technological superiority? Nope.
 
ERP said:
At this point in time ATI has the upper hand from a technological standpoint; this is not an opinion, it is fact.

I'm sorry I don't agree.

ATI certainly made different design decisions than NVidia, but technology includes featureset as much as it encompasses performance. PS3.0 and 32-bit FP may not be important to the average consumer, but that doesn't diminish their technological worth.

Personally, I'm disappointed with the X800; it's utterly useless to me, but I have very different requirements from the average gaming user - I need to be writing PS3.0 shaders now.

Stating that something is a fact doesn't make it so.

I was speaking on behalf of the average consumer. Yes, the 6800 has shader model 3.0 and yes, it's better for developers, but that is the only technological feature I consider it to have over the X800. There are many other advantages to the X800 as well. And anyone who chooses the 6800 over the X800 from a gamer's perspective is doing so primarily because it is made by nvidia. That's not to say there aren't exceptions; nvidia's better support for Linux is one that comes to mind.
 