Where do you stand with the FX

  • Definitely not going to buy

    Votes: 0 0.0%
  • Going to wait for R350

    Votes: 0 0.0%
  • Pick up a cheaper 9700

    Votes: 0 0.0%
  • No need to buy anything! No software requires either.

    Votes: 0 0.0%

  • Total voters
    193
Randell said:
...I won't own an NV40...

If nVidia continue in the same vein as they are right now, then the possibility of there ever being an NV40 is fading fast.

Maybe it's up to S3 to keep the market competitive now? I certainly hope the Delta Chrome delivers the goods. Though that would be really bad news for nV as well. Talk about a rock and a hard place. Sheesh. :rolleyes:
 
First the Cg farce. Now this.

Frankly, I'm speechless and really have no clue what to say at this point.

At any rate, as long as there is competition, gamers win. And that's all that matters, really.
 
DaveBaumann said:
It has also been pointed out that, equally, R300 could provide the same level of functionality via the use of multipass – that is, it is a legitimate question to ask: “what can NV30 do via the use of its long shaders that R300 can’t do with multipass?”

Yes, but is it convenient?
There tends to be a general belief that you can easily work around every limitation by going multipass. While in this case it's usually possible to go multipass, it will often be a pain in the ass, which is the no. 1 reason why most of my demos don't include a fallback path for older cards.
Some instructions, most importantly ddx/ddy/txd, aren't even possible to do on the 9700.
 
At the moment I'd imagine that it's not at all convenient, and (as I did) question whether it's possible right now. However, as has been mentioned by ATI and Carmack, ultimately it's got to come down to the HLSL compilers to be able to do it, which should make it convenient.
 
There's no easy general solution for transferring a shader to multipass, so I think in the end we'll see it fall back to software rather than to multipass. Only in a few limited cases will we see auto-multipass.
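Humus's point about ddx/ddy is worth unpacking: the hardware computes these derivatives by differencing values across a 2x2 pixel quad within a single pass, which is exactly the kind of cross-pixel state a multipass split can't easily reproduce. A rough sketch of the idea in Python (the `shade` function and quad layout are purely illustrative, not any vendor's actual implementation):

```python
# Illustrative sketch of how ddx/ddy work on a 2x2 pixel quad.
# Real hardware evaluates the shader for all four pixels in lockstep
# and differences neighbouring registers; this just mimics that idea.

def shade(x, y):
    # Stand-in for an arbitrary per-pixel computed value.
    return x * x + 0.5 * y

def quad_derivatives(px, py):
    """Evaluate a 2x2 quad anchored at (px, py) and return
    (ddx, ddy) as coarse per-quad differences."""
    v00 = shade(px,     py)      # top-left
    v10 = shade(px + 1, py)      # top-right
    v01 = shade(px,     py + 1)  # bottom-left
    ddx = v10 - v00              # difference along x
    ddy = v01 - v00              # difference along y
    return ddx, ddy

dx, dy = quad_derivatives(4, 7)
print(dx, dy)  # ddx of x*x at x=4 is 25-16 = 9.0; ddy is 0.5
```

To emulate this across passes, the intermediate `shade` value would have to be written out at full precision and re-read with neighbour offsets, which is part of why auto-multipass is so awkward.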
 
BoardBonobo said:
If nVidia continue in the same vein as they are right now, then the possibility of there ever being an NV40 is fading fast.
I don't think so. nVidia's got a huge amount of presence in the PC market, and most of their money is not made in the high-end market.

What will really make a difference in their earnings will be how good the NV31/NV34 products come out. And S3? They suck. Now that VIA owns them, I really don't see any possibility of that changing.
 
Chalnoth, maybe BoardBonobo is referring to the fact that it would not be feasible, after the R&D troubles encountered, for Nvidia to develop another chip architecture. Alternatively, their processors may benefit more from scaling in terms of units, efficiency of implementation, etc. with the same fundamental architecture. The Pentium Pro is an example.
 
What will really make a difference in their earnings will be how good the NV31/NV34 products come out.

Not entirely true.

The only thing that defines "how good" the NV31/34 products are, is how good they are perceived to be. A good part, but not nearly all, of the perception is based on the actual product performance and features.

However, another part of that perception is the brand name. Up until YESTERDAY, nVidia was basically viewed as the technology leader. Even though the 9700 has been out for months, the "assumption" was that the competitor from nvidia would be clearly superior, and thus nVidia would not really "lose the crown".

IMO, the sellability of the NV31/34 took a significant hit yesterday, before we know anything about those products. The "word on the street" is now that ATI is the "best graphics chip maker." I'm guessing that in the low end, ATI and nVidia will reverse roles:

ATI will be able to sell their parts at a marginally higher profit margin....even if their parts end up marginally worse. All this due to the launch of a card (FX) that will have little consequence itself on sales volume.
 
Reverend said:
You're overreacting to a statement of mine. And, personally, if that statement of mine isn't truly a constructive post, especially the words before the comma, what is then, DT? Your posts?

No Rev, for 6+ months I've heard of this 'functionality improvement' yet I've seen no one back it up. I'm no programmer, and when I see someone claim they'd recommend brand X over Y due to functionality, well, all I asked for was examples...

Dave and PCchen answered my question... thank you. You, on the other hand, are going to be very hard to have a conversation with, due to the obvious opinion of me embedded in your head (Doom + ATI)... :rolleyes: . I never accused you of bias in our conversation; I expect the same in return.

It was a legit question looking for legit answers. And sorry, as for your reference to Chalnoth (no offense Chalnoth, but you will say anything), I will stick to some of the other knowledgeable people here.
I thought we were through with this... cheap pot shots blaming other members for leaving = lame :devilish:
 
Derek Smart [3000AD] said:
First the Cg farce. Now this.

Frankly, I'm speechless and really have no clue what to say at this point.

At any rate, as long as there is competition, gamers win. And that's all that matters, really.

Welcome back, and maybe you could persuade DC to come back when people like you and him have FXs and DX9 drivers to code with. Then comments on drivers, hardware functionality compared with the R300, etc. can be gauged with more insight than the initial overreaction to the 'hot hairdryer' that behaves as most of us here expected (hence the reason I already own a 9700Pro).
 
Typedef Enum said:
Wow? Derek Smart...speechless?

My God, what's next? Cats and dogs getting along?

Some kind of product to some market available at retail stores with working Bitboys hardware/technology on it? ;)

be careful while reading that sentence... it might mean something, but to most of us here it won't mean much...
 
BoardBonobo said:
Randell said:
...I won't own an NV40...

If nVidia continue in the same vein as they are right now, then the possibility of there ever being an NV40 is fading fast.

Maybe it's up to S3 to keep the market competitive now? I certainly hope the Delta Chrome delivers the goods. Though that would be really bad news for nV as well. Talk about a rock and a hard place. Sheesh. :rolleyes:

This is a huge overreaction. The FX is far from perfect, and in many ways is not as good as the 9700 Pro, but it is still quite competitive. The underlying tech of NVIDIA is there, not to mention 3dfx's and GigaPixel's tech. Look forward to more heated competition between ATI and NVIDIA. Neither company is going to go away as easily as past gfx card vendors.
 
Nappe1 said:
Typedef Enum said:
Wow? Derek Smart...speechless?

My God, what's next? Cats and dogs getting along?

Some kind of product to some market available at retail stores with working Bitboys hardware/technology on it? ;)

be careful while reading that sentence... it might mean something, but to most of us here it won't mean much...

Hmm, let me guess...mobile gaming?
 
Reverend said:
Er, PS version isn't what I was talking about. But you did state what I didn't say outright... why is a benchmark laden with PS 1.1 effects faster on a GeFX Ultra than a R9700Pro, but games with little to no shading slower overall on a GeFX?

what about the 3DMark2001 APS scores? R300 is twice as fast in those. Codecreatures is for all intents and purposes an nVidia benchmark
 
Doomtrooper said:
Reverend said:
You're overreacting to a statement of mine. And, personally, if that statement of mine isn't truly a constructive post, especially the words before the comma, what is then, DT? Your posts?

No Rev, for 6+ months I've heard of this 'functionality improvement' yet I've seen no one back it up. I'm no programmer, and when I see someone claim they'd recommend brand X over Y due to functionality, well, all I asked for was examples...

Dave and PCchen answered my question... thank you. You, on the other hand, are going to be very hard to have a conversation with, due to the obvious opinion of me embedded in your head (Doom + ATI)... :rolleyes: . I never accused you of bias in our conversation; I expect the same in return.

It was a legit question looking for legit answers. And sorry, as for your reference to Chalnoth (no offense Chalnoth, but you will say anything), I will stick to some of the other knowledgeable people here.
I thought we were through with this... cheap pot shots blaming other members for leaving = lame :devilish:
Chalnoth is right, IMO, insofar as my reference to what he said goes.

I didn't "recommend" the GeFX. I said:

It may, however, be tougher for those that do not already own a 9700Pro. And it will also depend a great deal on whether that person is an avid programmer or not. If such is the case (a programmer), I'd probably opt for the GeForceFX since it affords me a little extra over the R300 in terms of functionality.

I said "a little extra". I said "probably", not "definitely". The R300 only supports non-standard 24-bit fp but I could also say that the NV30 can't blend when rendering into 64/128-bit surfaces. It's about expressing my opinion and my preferences.

But you are right that I think of you as pro-ATI-anti-nVidia every time I read your posts. I apologize for thinking this way and for whatever cheap shots I may have levelled at you in this thread. You don't deserve it.
 
Randell said:
what about the 3DMark2001 APS scores? R300 is twice as fast in those. Codecreatures is for all intents and purposes an nVidia benchmark

Why is it an NVIDIA benchmark? When it was first programmed there was no GeForce FX. It may be optimised for the GF4 Ti cards, but if the GFFX is faster than the Radeon 9700 Pro, it is not likely for the same reason.
Or do you mean NVIDIA's GFFX runs faster in Codecreatures? Is there anything wrong with that? No! And wasn't the Radeon 9700 Pro faster than the GF4 Ti cards in Codecreatures?
They seem to use a lot of fillrate-intensive effects in that benchmark; maybe this is where the clockspeed differential does help! Or maybe Codecult secretly programmed the benchmark to turn off the efficient, fast code when running on a non-NV card and turn on the bloated, slow code instead?
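The clockspeed point lends itself to back-of-envelope fillrate arithmetic. A quick Python sketch (clocks are the commonly reported launch figures; the NV30's effective pixel pipeline count was itself debated at the time, so treating both parts as 8 pipes is an assumption, not a spec sheet):

```python
# Rough theoretical single-texturing fillrate comparison.
# Assumption: 8 pixel pipelines on both parts (NV30's effective
# pipeline count was disputed); clocks as reported at launch.
def fillrate_mpixels(clock_mhz, pipelines):
    return clock_mhz * pipelines  # theoretical Mpixels/s

gffx  = fillrate_mpixels(500, 8)   # GeForce FX 5800 Ultra, 500 MHz
r9700 = fillrate_mpixels(325, 8)   # Radeon 9700 Pro, 325 MHz
print(gffx, r9700, round(gffx / r9700, 2))  # → 4000 2600 1.54
```

On paper that is roughly a 1.5x raw fillrate edge from clockspeed alone, which is consistent with the guess that fillrate-bound tests would favour the higher-clocked part.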
 
The R300 only supports non-standard 24-bit fp but I could also say that the NV30 can't blend when rendering into 64/128-bit surfaces.

'Non-standard'?

What 'standards' are you referring to? It is, after all, the recommended minimum DX9 PS accuracy.
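The precision gap being argued over can be made concrete by rounding a value to each format's mantissa width (FP16 is s10e5, DX9's FP24 minimum is s16e7, FP32 is s23e8). A minimal Python sketch that quantizes only the mantissa and ignores exponent-range effects:

```python
import math

def quantize_mantissa(x, mantissa_bits):
    """Round x to a float with the given number of explicit
    mantissa bits (exponent range ignored for simplicity)."""
    if x == 0.0:
        return 0.0
    exp = math.floor(math.log2(abs(x)))   # power of 2 of the leading bit
    scale = 2.0 ** (mantissa_bits - exp)
    return round(x * scale) / scale

# Compare the rounding error of pi at each format's mantissa width.
for name, bits in [("fp16", 10), ("fp24", 16), ("fp32", 23)]:
    q = quantize_mantissa(math.pi, bits)
    print(name, q, abs(q - math.pi))
```

Each extra mantissa bit roughly halves the rounding error, so FP24 sits about 64x tighter than FP16 and FP32 about 128x tighter than FP24 — which is the scale of the trade-off both camps are really arguing about.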
 
My response to the poll was "No need to buy anything! No software requires either." for some time. Also, we will need lots of CPU/memory to push all these GPUs.

I will be receiving my new home computer in the next three weeks, and I will keep it low cost (it will be my first AMD, finally ;) ).
 