NV30 light years ahead of the GF4 TI?

Pav_37

Newcomer
Just saw this over @ nvmax - http://www.nvmax.com/cgi-bin/community.pl?num=1028839419

NVIDIA Conference Results

The conference held today (I think voodooextreme were there also) was about CineFX and not hardware, although I wouldn't have been able to post anything anyway. NVIDIA's Geoff Ballew talked through it, and I have to admit I'm sold on the NV30.
Two things are still on my mind. Although the NV30 is light years ahead of the GF4 Ti in many areas, I can't foresee any programs within the next year making extensive use of its features. NVIDIA has always sold on the principle that cards are slower than the software run on them - with room for improvement.

The NV30 seems so fast that users won't need to upgrade for twice the normal time, probably more, due to the lack of hardware-intensive software. Times are changing. NVIDIA's fortune no longer relies upon its devoted followers but upon developers. Although it recognizes this with the introduction of Cg, there is not going to be a staggering change over the NV30's shelf life, and I wonder how many will stick with their GF4 Tis until games more advanced than Doom 3 become mainstream.

What do you guys think? True kick-ass card or just the PR machine on full load? :LOL:
 
The statement is correct. Once you have a fully programmable generic pipeline and high level language, then future cards only deliver higher performance, not more features, since "full" programmability means any algorithm can be encoded.

Therefore, the focus shifts from trying to get developers to use Feature X (EMBM, cubemaps, T&L, DOT3, etc) of your hardware, to all developers coding in a high level language and making sure your hardware runs the code faster than others, and that the developer tools you ship are the best.


Think about CPUs. In each new generation of Intel or AMD cpus, there isn't a real need to push developers to use CHIP specific features (like serial number, 3DNow, SSE, etc). Rather, developers still use C/C++/whatever, and the real push is to deliver compilers and high level libraries that can best take advantage of the underlying chip's performance. Auto vectorizing C compilers, high level numeric libraries that use SSE/SSE2/3DNow/etc underneath.

DX9 is close to being a universal computer. DX10 probably will be. At this stage, the GPU is a computer, and how many instructions it can process per clock becomes more important.

If one architecture can issue 4 fragment shader ops per pipeline, and another can do 8, then at the same clock frequency, the latter will deliver twice the fillrate performance.
 
There was never a good reason to upgrade your card every 6 months in the first place, so you can't say it no longer makes sense to do so (although you may be right that hardware is badly outstripping software and they may have to slow down). I know hardly anyone who ever upgraded every 6 months, although once a year doesn't seem uncommon. I actually think that if Nvidia and ATi moved to yearly product cycles it would benefit everyone, because they'd still sell just as many cards with less R&D necessary (half as many designs). It'd also allow them to make the cards more full-featured.

I don't see why you're so impressed that the NV30 is "lightyears beyond the Ti4600", though. Its main competitor isn't going to be the Ti4600; it's going to be the R9700. So unless you only buy Nvidia, the comparison is pointless. Besides, the same thing was said about the GF3, and honestly the GF2 Ultra ran fine in games and still does.
 
I'm not so impressed because I don't know much about the NV30 - those are nvmax's words in my post, not mine. Personally, I would be glad if the NV30 is so good that I can easily skip the NV35 - because cards these days cost a fortune, and I would love for a card to last at least a year and still be able to play the latest games with all the eye candy turned on, including aniso filtering and AA. :)
 
The only thing is, GPUs will continue to advance much faster in performance than CPUs, due to the nature of the architecture. GPUs will also continue to advance at about the same speed they have been for the next five to ten years.

What will result is that games will use more and more of the hardware's power for advanced effects rather than for things like AA and aniso. In other words, if you have a two-year-old card, you still won't be able to play the latest and greatest with full details turned up.
 
Err . . . I don't really see how you come from this:
The conference held today (I think voodooextreme were there also) was about CineFX and not hardware [...]

to this:
The NV30 seems so fast that users won't need to upgrade for twice the normal time, probably more, due to the lack of hardware intensive software.

Is it only me or does this seem like two mutually exclusive statements? Or rather, does the very first sentence of the initial post contradict the rest of the posting?

???

ta,
-Sascha.rb

P.S. I agree with Mize in that other thread. "3D Technology and Hardware" doesn't mean posting 2,398,472,983,489,234 consecutive threads about NV3x, or any other graphics chipset, for that matter. -.rb
 
Nagorak said:
There was never a good reason to upgrade your card every 6 months in the first place...

yep. Still trying to get rid of the Radeon AIW 32MB AGP, but with no success. 1.5 years for a GFX card is a bit longer than I usually go. (And it looks like it will be more like 2 years before changing.)
 
It will be a nice day when GPUs seem so fast that no one feels a pressing need to upgrade every 6-12 months. And as I reflected before, with video cards as fast as they are now, a further delayed upgrade will buy you a hell of a lot of power.

In 2-3 years, game graphics might be so incredible that plot, character development, and A.I. start becoming the hot new areas of focus.
 
I have just one thing to say about this whole thing.

Regarding NV30 being light-years ahead of GeForce4 Ti.

Light years are a measure of distance.

So are they saying that GF4 will pull a vanishing act like Ti500 did, and be pushed further and further away to make sure gamers don't keep buying GF4 instead? :p
 
g__day said:
It will be a nice day when GPUs seem so fast that no one feels a pressing need to upgrade every 6-12 months. And as I reflected before, with video cards as fast as they are now, a further delayed upgrade will buy you a hell of a lot of power.

In 2-3 years, game graphics might be so incredible that plot, character development, and A.I. start becoming the hot new areas of focus.

God, I hope so! :D
 