NVIDIA GT200 Rumours & Speculation Thread

Exactly. And to have the fastest and a fully functional DX10.1 card on the market means MONEY for them.
Why? Games don't support it and they never will unless nVidia implements DX10.1. They already stated that DX10.1 is not worth the trouble.
[quote="DegustatoR]Bottom line is -- they have to and they will (DX11, yeah).
The question is -- will they add the 10.1 support in G1xx or will we have to wait for the DX11 architecture.[/quote]
I think they'll stick with plain DX10 until DX11 comes.
DegustatoR said:
It wouldn't be wise for them to give a feature edge to AMD for such a long time.
People don't care about the feature edge for as long as nVidia has the performance edge.
DegustatoR said:
They'll NEED to support DX10.1 in the end. And from what i know about the DX11 specs i'd say that any currently implemented 10.1 features will still be useful for the DX11 architecture.
Of course they'll implement the features in their DX11 architecture. But not now. DX10 will be the next DX9.0b - just remember how long ATi held onto its R300 design and how long it took for SM3.0 supporting games to be released. I think it's gonna be the same with G80 and DX10.
CJ said:
It did? Then you haven't been paying too much attention if it caught you by surprise.
Don't be silly, I have to keep track of more things than just GPUs and I haven't been visiting B3Df before. Although I realize you guys usually know more/sooner than the rest of the world...
CJ said:
It's very likely. After all, the transistor count differences for RV610->RV620 and RV630->RV635 indicate a great deal of similarity.
I'd say RV610 and RV630 actually do support DX10.1, but ATi didn't enable it so they could use it as a buzzword for RV620 and RV635.
DegustatoR said:
Isn't NV doing something very similar to Gather4 since NV2x?
Can't tell, but when R520 came out, it was said that it supports Fetch4 and NV4x/G7x doesn't.
 
Why? Games don't support it and they never will unless nVidia implements DX10.1. They already stated that DX10.1 is not worth the trouble.
Games with 10.1 support are already here and there will be more of them soon.
As I've said, 10.1 support isn't that hard to implement; it's not a whole separate renderer and it won't break your 10.0 compatibility.
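Just to illustrate the point (a rough sketch of my own, not taken from any engine -- CreateBestDevice is a made-up name): you ask for a 10.1 device first and quietly drop back to 10.0 if the driver can't do it, and the rest of the renderer stays exactly the same.

Code:
// Minimal sketch: try for a DX10.1 device, fall back to plain DX10.0 otherwise.
// Needs d3d10_1.h / d3d10_1.lib from the DirectX SDK.
#include <d3d10_1.h>

ID3D10Device1* CreateBestDevice()
{
    ID3D10Device1* pDevice = NULL;
    D3D10_FEATURE_LEVEL1 levels[] = { D3D10_FEATURE_LEVEL_10_1, D3D10_FEATURE_LEVEL_10_0 };

    for (int i = 0; i < 2; ++i)
    {
        // Same device interface either way; only the reported feature level differs.
        HRESULT hr = D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL, 0,
                                        levels[i], D3D10_1_SDK_VERSION, &pDevice);
        if (SUCCEEDED(hr))
            return pDevice;   // 10.1 on RV620/RV635/RV670, 10.0 on G8x/G9x
    }
    return NULL;              // no DX10-class hardware at all
}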

I think they'll stick with plain DX10 until DX11 comes.
That would be a mistake for them.

People don't care about the feature edge for as long as nVidia has the performance edge.
People do care about features and 10.1 features can be used for additional performance.

Of course they'll implement the features in their DX11 architecture. But not now. DX10 will be the next DX9.0b - just remember how long ATi held onto its R300 design and how long it took for SM3.0 supporting games to be released. I think it's gonna be the same with G80 and DX10.
SM3 was quite different from SM2; that's not the case with 4.1 and 4.0. 4.1 is a minor extension of 4.0, more or less like 2.0a/b were to 2.0.

Can't tell, but when R520 came out, it was said that it supports Fetch4 and NV4x/G7x doesn't.
Well, obviously they didn't, but I thought they had supported some form of high-speed single-channel fetching since the NV2x architecture, just not as advanced as Fetch4.
 
I'd say RV610 and RV630 actually do support DX10.1, but ATi didn't enable it so they could use it as a buzzword for RV620 and RV635.
If that were the case, RS780 would as well; however, it doesn't.

However, DX support, like PCI Express, was implemented up to the level that was set at the point the design was finished.
 
Games with 10.1 support are already here and there will be more of them soon.
What games? Assassin's Creed? By the way, the performance gain in DX10.1 was actually caused by some render pass being left out (I don't know the details).
SM3 was quite different from SM2; that's not the case with 4.1 and 4.0. 4.1 is a minor extension of 4.0, more or less like 2.0a/b were to 2.0.
I remember reading an interview with a game developer (maybe someone from Crytek?) who was comparing SM2 to SM3 and he claimed that what was possible with SM3 could be done with SM2 as well, but the only limit was performance of GPUs back then. By the way, many people connect SM3 to HDR, as games began to use both at approximately the same time and some HDR implementations didn't work on ATi R3xx/R4xx, but otherwise they're separate technologies.
Well, obviously they didn't, but I thought they had supported some form of high-speed single-channel fetching since the NV2x architecture, just not as advanced as Fetch4.
Don't ask me, my knowledge doesn't stretch that far back. But one of the childhood heroes of mine around here might know more.
Dave Baumann said:
If that were the case, RS780 would as well; however, it doesn't.
Perhaps they felt they could not sell RV620 if RS780 offered almost the same performance and at the same time supported DX10.1 as well? Frankly, this has always been a mystery to me. RS780 is manufactured on 55nm, yet its IGP is different from RV620?
 
Games with 10.1 support are already here
Like what? Assassin's Creed has just withdrawn DX10.1 support. Are there any others?

and there will be more of them soon.
Like what?


I think they'll stick with plain DX10 until DX11 comes.
That would be a mistake for them.
Why? So long as Nvidia doesn't support DX10.1, no games can afford to require it. If Nvidia did support DX10.1, it would be a terrible mistake for AMD not to; but Nvidia has a de facto monopoly, here. Unless AMD is able to easily overtake Nvidia in performance and price/performance terms, they will continue to control the bulk of the market and games will continue to be written to use the features of Nvidia cards because it would be commercial suicide for a game not to run well on Nvidia hardware.
 
AMD needs to work with the devs like nVidia does with the TWIMTBP programme in the first place. Then they can make the rules like nVidia does now.
 
Like what? Assassin's Creed has just withdrawn DX10.1 support. Are there any others?

Like what?


Why? So long as Nvidia doesn't support DX10.1, no games can afford to require it. If Nvidia did support DX10.1, it would be a terrible mistake for AMD not to; but Nvidia has a de facto monopoly, here. Unless AMD is able to easily overtake Nvidia in performance and price/performance terms, they will continue to control the bulk of the market and games will continue to be written to use the features of Nvidia cards because it would be commercial suicide for a game not to run well on Nvidia hardware.

Err, supporting DX10.1 won't hurt Nvidia cards. So the argument doesn't hold.
 
Perhaps they felt they could not sell RV620 if RS780 offered almost the same performance and at the same time supported DX10.1 as well? Frankly, this has always been a mystery to me. RS780 is manufactured on 55nm, yet its IGP is different from RV620?
RS780 IGP core is lifted from RV610's core 3D design, not RV620's.

RS780 is AMD only, so irrespective of DX10.1 or not it would not have stepped on RV620's toes as that caters to all platforms.
 
I think if it's easy for NV to do they will support DX10.1 with GT200. But I do not think it's a high priority item for them. After all they didn't bother to add DX8.1 support in GF4 when R200 had it, and it clearly didn't have too much of an effect on sales. As others have mentioned NV sorta controls the market in the sense that if they don't support 10.1 many devs won't bother with it either. Feature set is always secondary to performance.

What games? Assassin's Creed? By the way, the performance gain in DX10.1 was actually caused by some render pass being left out (I don't know the details).

Hm I read it as DX10.1 allowed them to omit a pass thanks to the added functionality of 10.1 compared to 10. There was no (observable) IQ difference between 10 and 10.1.

Also I'm seeing posts suggesting that GT200 is ready and NV is not releasing it due to lack of competition from ATI. This makes NO sense. No sane company would put off selling a (quickly) depreciating product (i.e. GPUs).
 
Also I'm seeing posts suggesting that GT200 is ready and NV is not releasing it due to lack of competition from ATI. This makes NO sense. No sane company would put off selling a (quickly) depreciating product (i.e. GPUs).

Yeah I don't think Nvidia is just waiting in the wings to launch GT200 at their leisure. Even if the chip is ready there's always potential for delays due to bad yields, poor drivers etc. Honestly, who really knows what Nvidia is doing. They just renamed all their G8x based cards to G9x with no performance increase. And there's talk of a 55nm G92. I wouldn't be surprised if there's no "GT200" until fall.

At least they're out of stuff to rename. So far we have:

8800GTS (G92) -> 9800GTX (G92)
8800GT (G92) -> 9800GT (G92)
8800GS (G92) -> 9600GSO (G92)
9600GT (G94) -> 9600GT (G94)
8600GTS (G84) -> 9500GT (G96)
8500GT (G86) -> 9400GS (G98)
8400GS (G86) -> 9300GS/GE (G98)

Don't think there's anything left. The sad thing is that the only cards to get a real upgrade in terms of performance are all < $100.
 
Err, supporting DX10.1 won't hurt Nvidia cards. So the argument doesn't hold.
It certainly does hurt Nvidia cards if the game runs in a less than optimal way without DX10.1. If adding DX10.1 support gives a minor performance boost, they might do it; but if it adds significant features or turns the game from being unplayably slow to playable, they won't bother; they'll downgrade it until it can run comfortably in DX10 instead.

There's also the development effort to consider. Why bother investing a lot of development work in features that only a tiny fraction of your target audience can benefit from?
 
Hm I read it as DX10.1 allowed them to omit a pass thanks to the added functionality of 10.1 compared to 10. There was no (observable) IQ difference between 10 and 10.1.
Yeah I know, btw DX10.1 actually had better IQ (in DX10, AA was not working as well as it should). Strange.
Also I'm seeing posts suggesting that GT200 is ready and NV is not releasing it due to lack of competition from ATI. This makes NO sense. No sane company would put off selling a (quickly) depreciating product (i.e. GPUs).
You're right, it wouldn't make sense. But... there is some evidence pointing towards ATi having R600 chips in January 2007. The cards were launched in May and there certainly was not any lack of competition. :unsure:
At least they're out of stuff to rename.
Really? Yesterday I read about nVidia planning to re-brand G86 as GeForce 9400 GS or something. And also some mobile GeForce 9500's will be G84 based, if VR-Zone is correct.
Dave Baumann said:
RS780 is AMD only, so irrespective of DX10.1 or not it would not have stepped on RV620's toes as that caters to all platforms.
Why did AMD stop developing chipsets for Intel, anyway?
 
Yeah I know, btw DX10.1 actually had better IQ (in DX10, AA was not working as well as it should). Strange.

NV has a horrible track record for strong-arm marketing tactics. I don't know if you were following the industry in the NV30 days but the strategies they used to promote their underperforming part (NV30) were extremely unethical to say the least.
 
I wasn't keeping track of hardware back then, but their marketing practices don't seem to have gotten any better since. Killing the VIA K8T900 by blackmailing mobo makers is a shining example...
 
What games? Assassin's Creed? By the way, the performance gain in DX10.1 was actually caused by some render pass being left out (I don't know the details).
Yes, it was -- because DX10.1 features allowed them to.

I remember reading an interview with a game developer (maybe someone from Crytek?) who was comparing SM2 to SM3 and he claimed that what was possible with SM3 could be done with SM2 as well, but the only limit was performance of GPUs back then. By the way, many people connect SM3 to HDR, as games began to use both at approximately the same time and some HDR implementations didn't work on ATi R3xx/R4xx, but otherwise they're separate technologies.
Everything can be done via multitexturing on Voodoo 1, yeah.
Are we still doing it this way? No? Why?

Like what? Assassin's Creed has just withdrawn DX10.1 support. Are there any others?
Actually, it hasn't -- yet.
And when they do withdraw it, it'll be to enhance it, so it will return eventually.

Like what?
Like every game out there that will need Z access from the shader.
Like every game with tone mapping and MSAA.
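The MSAA depth readback case, for instance, looks roughly like this on the API side (a sketch only -- it assumes an ID3D10Device1 called pDevice created at feature level 10_1, and the sizes/formats are just typical choices, not from any particular game):

Code:
// Sketch: a multisampled depth buffer created so it can be BOTH rendered to
// and read back in the shader -- the combination 10.1 adds over 10.0.
D3D10_TEXTURE2D_DESC td = {};
td.Width = 1920;  td.Height = 1200;                     // example resolution
td.MipLevels = 1; td.ArraySize = 1;
td.Format = DXGI_FORMAT_R24G8_TYPELESS;                 // typeless so two views can reinterpret it
td.SampleDesc.Count = 4;                                // 4x MSAA
td.Usage = D3D10_USAGE_DEFAULT;
td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

ID3D10Texture2D*          pDepth = NULL;
ID3D10DepthStencilView*   pDSV   = NULL;
ID3D10ShaderResourceView* pSRV   = NULL;
pDevice->CreateTexture2D(&td, NULL, &pDepth);

D3D10_DEPTH_STENCIL_VIEW_DESC dsv = {};
dsv.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;             // view used while rendering the scene
dsv.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
pDevice->CreateDepthStencilView(pDepth, &dsv, &pDSV);

D3D10_SHADER_RESOURCE_VIEW_DESC srv = {};
srv.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;         // view the tone mapping / lighting pass samples
srv.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
pDevice->CreateShaderResourceView(pDepth, &srv, &pSRV);

On plain 10.0 hardware you can't read a multisampled depth resource from the shader like that, so a 10.0 path has to do an extra resolve or depth-only pass to get the same data -- which, as far as I understand, is the kind of pass Assassin's Creed could drop.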

Why? So long as Nvidia doesn't support DX10.1, no games can afford to require it.
It seems that no one listens... You DON'T have to require 10.1, but you can use it in the same shaders to improve speed / quality and fall back to 10.0 features if the hardware doesn't support 10.1.
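Put concretely (purely illustrative -- the two Render* functions are hypothetical stand-ins for the two code paths):

Code:
// Illustration only: one renderer, two paths, chosen at runtime from the device's level.
D3D10_FEATURE_LEVEL1 level = pDevice->GetFeatureLevel();   // pDevice is an ID3D10Device1
if (level >= D3D10_FEATURE_LEVEL_10_1)
    RenderWithMsaaDepthReadback();    // hypothetical 10.1 path: reads Z samples directly
else
    RenderWithDepthResolvePass();     // hypothetical 10.0 fallback: one extra pass, same output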

If Nvidia did support DX10.1, it would be a terrible mistake for AMD not to; but Nvidia has a de facto monopoly, here.
With G8x vs R6x0 they had one. But with G1xx vs R7x0 they'll need to do something to maintain this "monopoly". I think that they understand this.

Unless AMD is able to easily overtake Nvidia in performance and price/performance terms, they will continue to control the bulk of the market and games will continue to be written to use the features of Nvidia cards because it would be commercial suicide for a game not to run well on Nvidia hardware.
A game with 10.1 support will work fine on NV's 10.0 hardware, but on 10.1 hardware it'll run faster and/or look better.
In the end NV's 10.0 G100-based card can (and probably will) end up faster on the 10.0 path than an RV770-based card on the 10.1 path, but it's not really fair to compare these chips, and with a G1xx derivative that's more comparable to RV770 things can get messy -- and that's precisely why NV needs to support 10.1 in the G1xx line.
 