A note on percentages for you guys here.

K.I.L.E.R

We all know that after reading the previews, people started saying nVidia will deliver drivers with a 10%-30% performance increase.

In some rare cases I would believe so. The thing is, though, I doubt they could deliver that kind of percentage in games already running over 300fps *cough* Quake 3 *cough*.

What I see is that people forget that small percentages can mean massive gains.

E.g.: 10% of 100 = 10.
Want to know the calculation? 0.1 * 100 = 10.

Turn the percentage into a value first; remember, you can't work with a percentage directly, as a percentage is NOT a value on its own.

If nVidia can deliver a 30% gain in performance when playing with:
1280x960
32bpp
8x FSAA
64-tap aniso (with balanced/aggressive/etc... modes [whichever looks best])
all @ 25fps, then you will have 32.5fps.
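
Purely as illustration, here's a minimal Python sketch of that arithmetic (the function name is my own; the 25fps baseline and 30% gain are the numbers from the example above):

```python
# Turn a percentage gain into a multiplier and apply it to a baseline framerate.
def apply_gain(baseline_fps: float, gain_percent: float) -> float:
    """Framerate after a percentage performance increase."""
    return baseline_fps * (1 + gain_percent / 100)

print(apply_gain(25, 30))  # 32.5 -- the 25fps @ 30% example above
```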

On another note:
Are the framerates comparable in [H]'s benchmarks with both the NV30 and R300 at highest details? No, not even close. A driver release improving the NV30's performance by 30% in that situation will not do much to catch it up to the R300's performance level. Sure, average performance would rise from 27fps to 35.1fps, but is that comparable to the R300's 62fps? Not by a long shot.
A 129.9% increase from the drivers would bring the NV30 to roughly Ati's level of performance, which is 62fps: the NV30 would be at 62.073fps, which is exactly 62.0fps if you keep only one decimal place without rounding. Would a 129.9% performance increase from drivers ever be real with the NV30? I very much doubt it. I would say the probability of that is 0.
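
The inverse calculation, working out what gain the drivers would need to close the gap, is just as simple; another minimal sketch (again, the function name is my own; 27fps and 62fps are the figures above):

```python
# Percentage increase needed to go from one framerate to another.
def required_gain(baseline_fps: float, target_fps: float) -> float:
    """Percentage gain needed to reach target_fps from baseline_fps."""
    return (target_fps / baseline_fps - 1) * 100

print(required_gain(27, 62))   # ~129.6% -- what the NV30 needs to match the R300
print(27 * (1 + 129.9 / 100))  # 62.073fps after a 129.9% gain
```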

Thoughts:
So in the end, I would expect a 5%-10% performance increase in most benchmarks/games, and a 30% increase in very rare circumstances like playing at the best IQ settings.
 
Most of the performance increase will be in games that use lots of shaders, and in high-detail scenarios (anisotropic/FSAA).
 
Shader

I doubt they'll really come during the lifetime of the GFFX.

Having said that, though, there does seem to be something terribly wrong with the current drivers. Maybe a 40-50% increase in the NV30-specific areas isn't out of the question.
 
There may not be any gains in the drivers. Just because nVidia has done it before doesn't mean they will do it again. This could very well be a hardware bug that will get fixed in a later refresh of the core. It could be a driver bug, yes, but I wouldn't be too sure of it.
 
Re: Shader

Heathen said:
I doubt they'll really come during the lifetime of the GFFX.

Having said that, though, there does seem to be something terribly wrong with the current drivers. Maybe a 40-50% increase in the NV30-specific areas isn't out of the question.

40-50%! No driver can give a 40-50% increase; look at the Detonator Database at www.guru3d.com.

The only high-percentage gain from drivers has been in 3DMark... you need to knock some zeros off those numbers.
 
Past

The GFFX is a completely new architecture; comparing it to what's happened before is almost worthless. The chip is performing nowhere near where it should, possibly due to something completely and utterly broken within the drivers, or just plain missing. If that's it, a major increase would not be out of the question. If it's due to the hardware, well, good luck Nvidia, they're going to need it.
 
They may get even 100% increases in very specific areas, like one facet of SPECviewperf, etc. If a 30% to 40% increase in titles across the board, with and without AA and AF, happens, I would fire the driver team ;)

And NVIDIA does have the best drivers, and maybe the best driver team, on the planet. It's just that they misfire sometimes.. lol..

I wish I still had the screenshots I took of GP3 on my NVIDIA GF2MX when I enabled FSAA with the first driver set that enabled it.
 
I don't know why people are so convinced the card is underperforming. If it weren't for the 9700, this card would be far and away the fastest on the market.

I also find it difficult to believe that driver problems will cripple simple synthetic benchmarks.
 
The Detonator database doesn't go back far enough. Some of the early Detonators delivered large performance increases. For example, the first Detonators for the GF1/GF2 didn't enable S3TC, and had very poor bus usage due to the way buffers were allocated. Later Detonators delivered across-the-board 30% increases for the GF2 simply by enabling S3TC and balancing bandwidth between AGP and vidmem better.

Like the GF1, the GFFX is a new architecture, and it is highly doubtful that the driver compilation of ARB2 and DX9 shaders is optimal given the complexity of the new shader pipeline units.

I don't think anyone believes future drivers will deliver huge increases in AF/AA-limited DX7 titles, but no one knows what kind of headroom there is in the shader throughput at this point. I think it's premature to assume it is already optimal.

The mere fact that the ARB2 path is not just 1/2 speed per cycle, but 1/3 speed per cycle, indicates there could be something very wrong.
 
And NVIDIA does have the best drivers, and maybe the best driver team, on the planet. It's just that they misfire sometimes.. lol..

No, they have the most overrated, pandered-to driver team on the planet.

The single biggest driver speed increase Nvidia has ever gotten was when they implemented S3TC, which has been an IQ-reducing speed hack ever since the day it was introduced.
 
Hellbinder[CE] said:
And NVIDIA does have the best drivers, and maybe the best driver team, on the planet. It's just that they misfire sometimes.. lol..

No, they have the most overrated, pandered-to driver team on the planet.

The single biggest driver speed increase Nvidia has ever gotten was when they implemented S3TC, which has been an IQ-reducing speed hack ever since the day it was introduced.

I know they are, but what are you?

:rolleyes:

Do you ever get bored of bagging on NVIDIA?
 
RussSchultz said:
I know they are, but what are you?

:rolleyes:

Do you ever get bored of bagging on NVIDIA?
Do you ever get bored of bagging on people who bag on nVidia?
I know I don't get bored of bagging on people who bag on people who bag on nVidia.
 
Like I said, anything under a 130% improvement in speed will not allow the NV30 to catch up to the R300 at max IQ (both cards at their max). :)
 
Well

I agree with you, KILER. Nvidia's got a loooonnnnngggg way to go before the GFFX even begins to catch the 9700Pro. Not a big chance of that happening, I'll agree.

PS: Do briefcases count?
 
Re: Shader

Doomtrooper said:
Heathen said:
I doubt they'll really come during the lifetime of the GFFX.

Having said that, though, there does seem to be something terribly wrong with the current drivers. Maybe a 40-50% increase in the NV30-specific areas isn't out of the question.

40-50%! No driver can give a 40-50% increase; look at the Detonator Database at www.guru3d.com.

The only high-percentage gain from drivers has been in 3DMark... you need to knock some zeros off those numbers.
Well, I just had a look at Nvnews and guess what? A 50% increase with 4x AF, 7% with 2x AA, and 30% with 4x AF & 2x AA in Quake 3 performance. :!:

[Image: driver_comparison_1280.gif — driver comparison benchmark chart at 1280]
 
Unless nVidia can completely overhaul their AA logic, I think it's all a moot point. I would take the card with significantly higher IQ _and_ performance over the one that brings high performance without the IQ settings.

I really would like to know what it's like to sit in on some meetings within the halls of nVidia right about now... just to see what they're saying with respect to all the commentary made about their new flagship product.
 