I believe that there is a hidden potential in the nv30 and nVidia's marketing.

NV30 has complex shader support, far more than the R300.
NV30 scores far higher in the 3dmark2k1 lights benchmark.

Is nVidia getting ahead of themselves, so that when the time comes ATi will have to catch up in those areas where nVidia has progressed so much?
Nvidia could simply be laying the groundwork for a new (I think) strategy.

I am just dreaming. Tell me what you guys think of my thoughts.
 
It may score higher in the lights benchmark, but it seems to score lower in the Vertex Shader benchmark.
 
Colourless said:
It may score higher in the lights benchmark, but it seems to score lower in the Vertex Shader benchmark.

That could be because it isn't optimised for DirectX versions below 9, i.e. it's only optimised for DX9. Either that, or the card is so programmable that devs will have to rewrite their engines to be able to use the nv30's potential. ;)
 
Well, I always thought it was irrational to presume, as a given, that the GF FX would get massive performance improvements from drivers.

However, presuming that performance increases are in order after seeing actual results that seem (to me) a bit below expectations is more reasonable (though I don't think they can change the picture with regard to the 9700 Pro very much).

So I think we'll see some improvement.

But...that complex shader support seems more like PR speak than anything. I think there is potential for advantages to be shown, but it will take a lot of specific work and special cases for that to show. I can sympathize with nVidia's desire for Cg to facilitate this, but they let their arrogance dictate how they treated that initiative (too little focus on making it appealing to other vendors, and too much reliance on their perceived self-importance to succeed without doing so), and it seems likely they will be burned for it.

The lighting benchmark seems to me to be related to a thought that came to mind when looking at the Quadro FX results: it seems nVidia's approach offers an advantage in "simple" (fixed-function, simple texturing) lighting, and falls down (in comparison to the 9700) when "complex" (shader-based) lighting and transformation come into play. I tend to think the latter will be more important moving forward in games (I don't think large quantities of simply lit triangles are a concern for gaming).

As has struck me before, what we have here is the result of nVidia delivering hype and noise while ATI quietly delivered a good product, and having it turn out that the hype and noise and delay didn't actually produce anything that was really better.
 
Colourless said:
It may score higher in the lights benchmark, but it seems to score lower in the Vertex Shader benchmark.

My first guess is that it's due to lack of optimization in the vertex shader benchmark. That is, notice that the high poly test uses much, much higher-polygon models than the vertex shader test. If the pipeline is stalling often (which is potentially much more damaging performance-wise for a higher-frequency processor....), then that might account for the relatively poor performance.

Another possibility is that the GeForce FX supports 16-bit floats in the vertex shader pipeline as well. Half-floats may be used to increase performance when fixed-function lighting operations are involved (which shouldn't decrease the lighting quality in the least), and this could easily explain the excellent 8-light performance of the FX. But as far as the vertex shader is concerned, nVidia should not reduce the precision unless the programmer requests it (since the operations could be used for any number of purposes...). This could also help to explain the lower vertex shader performance.
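
To illustrate the precision point in general terms (just NumPy half vs. single precision here, nothing NV30-specific, and purely my own sketch):

```python
import numpy as np

# Simple fixed-function style diffuse lighting: normalize(N) . normalize(L).
n = np.array([0.3, 0.9, 0.2])
l = np.array([0.5, 0.7, 0.1])

def diffuse(dtype):
    nn = (n / np.linalg.norm(n)).astype(dtype)
    ll = (l / np.linalg.norm(l)).astype(dtype)
    return max(float(np.dot(nn, ll)), 0.0)

# Half and single precision give essentially the same lighting result here.
print(diffuse(np.float32), diffuse(np.float16))

# But general-purpose vertex work, e.g. accumulating many small increments
# (think skinning weights or large object-space coordinates), drifts badly
# at half precision: float16 cannot even represent 1000.1.
acc32, acc16 = np.float32(1000.0), np.float16(1000.0)
for _ in range(100):
    acc32 += np.float32(0.1)
    acc16 += np.float16(0.1)
print(acc32, acc16)  # ~1010.0 vs. 1000.0
```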

I'd like to see some truly stressful vertex shader benchmarks soon...I'm sure they'll come before too long.
 
Chalnoth, I believe it only supports 32-bit FP in the vertex pipe, at least according to Digit-Life. The NV20 didn't even use 16-bit FP calcs with its vertex shader; there must have been a reason for that.
 
Just replying to the topic, without having read ANY of the messages:

Yes, the GeForce FX has hidden potential, the potential to make a lot of noise!

;p
 
I believe that there is a hidden potential in the nv30 too...

...it's called "NV35". :)

MuFu.
 
My first guess is that it's due to lack of optimization in the vertex shader benchmark. That is, notice that the high poly test uses much, much higher-polygon models than the vertex shader test.
It's worth remembering that the vast majority of the vertex processing work in the VS test is actually done using the fixed-function pipeline. Shaders are only used for skinning the character models. The poly count per frame is around 90,000 for the VS test and 1,000,000 for the HPC test.
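
Rough numbers for a sense of scale, assuming a purely illustrative 50 fps for both tests (actual frame rates obviously differ per card and aren't implied here):

```python
# Back-of-the-envelope triangle throughput for the two 3DMark2001 tests,
# using the per-frame polygon counts quoted above and a hypothetical 50 fps.
vs_polys_per_frame = 90_000       # Vertex Shader test (mostly fixed-function work)
hpc_polys_per_frame = 1_000_000   # High Polygon Count test
fps = 50                          # illustrative only, not a measured score

for name, polys in (("VS test", vs_polys_per_frame),
                    ("HPC test", hpc_polys_per_frame)):
    print(f"{name}: {polys * fps / 1e6:.1f} M triangles/s at {fps} fps")
# ~4.5 M/s vs. ~50 M/s: the HPC test is the one that actually stresses raw
# vertex throughput (and any pipeline stalls), not the VS test.
```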
 
I think the GFFX may have some potential in future games utilizing long floating-point shaders. I feel that this is where the GFFX has its strength. I would love to see a comparison of running, say, my Phong demo on both the 9700 and the GFFX.
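
For reference, the per-pixel math boils down to roughly this (sketched in NumPy for clarity, not the demo's actual shader code):

```python
import numpy as np

def phong(n, l, v, diffuse_col, spec_col, shininess):
    """Classic Phong lighting for one surface point and one light."""
    n = n / np.linalg.norm(n)                 # surface normal
    l = l / np.linalg.norm(l)                 # direction to light
    v = v / np.linalg.norm(v)                 # direction to viewer
    ndotl = float(np.dot(n, l))
    r = 2.0 * ndotl * n - l                   # reflection of L about N
    diff = max(ndotl, 0.0)
    spec = max(float(np.dot(r, v)), 0.0) ** shininess
    return diffuse_col * diff + spec_col * spec

# One point lit by a single white light.
print(phong(n=np.array([0.0, 1.0, 0.0]),
            l=np.array([0.5, 1.0, 0.3]),
            v=np.array([0.0, 0.5, 1.0]),
            diffuse_col=np.array([0.8, 0.2, 0.2]),
            spec_col=np.array([1.0, 1.0, 1.0]),
            shininess=32))
```

Evaluated per pixel, that's several normalizes, two dot products and a pow per light: exactly the kind of long floating-point work I mean.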
 
Humus said:
I think the GFFX may have some potential in future games utilizing long floating-point shaders. I feel that this is where the GFFX has its strength. I would love to see a comparison of running, say, my Phong demo on both the 9700 and the GFFX.

This will be key, but then why the heck didn't they release a DX9 demo test to the reviewers that would at least show the shader op power over the R300?

There has been too much talk and too little action from the so-called leaders of 3D consumer graphics this time around.
 
I expect that the new 3DMark will be the DX9 shader demo of choice.

My only worry is the arguments we're likely to get if the GFFX scores considerably higher in this synthetic test than the R300! The arguments will be the same, i.e. not indicative of real-world performance, DX9 games won't be available for a long time, etc. etc.

Basically, just a rerun of the same arguments when DX8 shaders were added to 3DMark! :p
 
No, I'm afraid it just gets worse for Nvidia. Adding features that push the GF FX past DX9 would have been a smart move had the GF FX proven to be the clearly dominant card. As it stands now, Cg has little chance of being used outside of the likely one big Nvidia-funded release.
They must have hatched that scheme when Nvidia looked unstoppable, because today it looks like nothing more than a poor attempt to steal the whole market.

Another bad side effect of this move is that it has pissed off Microsoft, which puts a lot of effort into its DX programming. Rumours that Microsoft would prefer to give the Xbox 2 contract to ATi show this nicely.
 
LeStoffer said:
There has been too much talk and too little action from the so-called leaders of 3D consumer graphics this time around.


They no longer *are* the leaders. Haven't you heard?....;) Of course while they still may refer to themselves as the "leaders" they'll have to do much better than GF FX (a whale of a lot better) before they'll be perceived in that way again.

The fact is that the nv30 was never designed to compete with the R300, or even an R300-class part. nVidia designed it to compete with the nv25, and for that purpose it does very well indeed. Had there been no R300 from ATI, we'd have seen the nv30 ship at a core clock of ~300-350MHz, at a normal voltage, with a 9700 Pro-like cooling solution, with no clock-throttle or Silent Running mechanism, with 2x the pixel fill rate of the nv25 (at the same clockspeed), ~60% more bandwidth than the nv25 (DDR-II), and an FP color pipeline capable of up to 128-bit precision. Contrasted with the nv25, that's quite an improvement--impressive, even.

But the R300 totally upset the apple cart for nVidia--being completely unanticipated by them. It's a much more advanced design than anything yet conceived at nVidia and so it's no surprise they didn't see it coming. However, it is still somewhat surprising to me that it came out of ATI and *not* nVidia. But that's the way it is. Perhaps in the future nVidia will scrap incremental thinking and get down to the serious business of designing really powerful 3D chips which can stand on their own without the assistance of cheap and tawdry overclocking gimmicks. nv30 doesn't qualify--regardless of the jacked up clockspeed and voltages and the hairdryer cooler and the rest. Just like 3dfx learned the hard way--if you take your leadership role for granted--someone will come out of the blue and relieve you of it. And that's just what happened here, IMO.
 
The FX is (cooling system and price aside) arguably the best solution on the market. It has high functionality in the DX9 pipeline, and very competitive performance at quite a few settings. I wouldn't recommend upgrading if you had a 9700 Pro, but it's a toss-up if money is not a concern and you're running a GF3.

It's not like they're getting blown away the way Matrox's solution was, so at this time I wouldn't say ATI is 'clearly' the industry leader either, at least not until the R350 shows.

However, one thing is sure: Nvidia is no longer the market leader either. -shrug-
 
I have to agree with WaltC's thoughts here. I too wonder what NVidia's driver team were doing - were there that many bugs to fix that they simply couldn't optimise the drivers well?

I hate having to wait 2-3 months for the first demo or benchmark that can even reveal the future potential of this card. All that says to me is that by the time game developers adopt these capabilities, the hardware will be excellent.

Congratulations that it isn't a total dog, but at the moment it sure feels lacklustre, not bleeding edge (unless you're talking about out-there heat transfer).

I think the R300 raised the bar too high for them. It may be a generation or two of product before they get it all rocketing ahead - well done ATi.
 