NVIDIA GT200 Rumours & Speculation Thread

I believe what you really mean is that nVidia expects to stay monolithic and that they would love to have a single powerful core that dominates all.
-But then, they really don't know what performance an R700 X2 [X3/X4] might bring, and they may be forced to make a GT200-X2 [variant].

Maybe; not impossible. And the only thing I can do now is offer you a bet. But betting is illegal in the USA. A gentleman's wager, perhaps?
[You don't need to point out that I am no gentleman; but I am certain there will be a GT200 X2 on the smaller process next year!
- Edit: How certain am I? I will apologize to all of you and leave B3D in shame forever -- even if I somehow make it as a member here to 01/01/10 =P]


The differences are cultural, imo. And some of the English-speaking tech sites do exactly the same, Lukfi.


Unless they don't plan on updating DX versions, that will be the only way a GX2 will come out again. Oh, and most likely they won't need a GX2 (performance is pretty good on GT200).
 
Unless they don't plan on updating DX versions, that will be the only way a GX2 will come out again. Oh, and most likely they won't need a GX2 (performance is pretty good on GT200).

I am sorry, I do not understand what you mean about updating DX versions.
--I am personally of the opinion that GT200, the Tesla core, is the basis for nVidia GPUs until DX11; probably for two years.

And I agree with you; performance looks outstanding with GT200's top performer! However, we still really don't know what AMD has up its sleeve. We are pretty sure of a 4870 X2, but I am saying to expect a 4870 X3 - eventually, possibly with their own shrink/bump - and that should bring on a variant of GT200 in an X2. My own opinion is that it is likely next year, '09 [within 17 months].

I think it was Jen-Hsun himself who said GF7 was a mistake; it was a kind of semi-official statement.

Excuse me? I certainly don't recall Huang stating G7x was a mistake. He did say NV30 was a mistake. Possibly Huang may have referred to the texture filtering optimizations that marred NV4x and G7x as a mistake. I can't recall him saying even that (it would be nice if he did acknowledge it); doubtful, as it involved the ongoing process of bedding down 3dfx engineers into nVidia, I think. Bob Kilgarriff, ex-3dfx, was the NV4x lead designer, wasn't he? That appeared to free up nVidia engineers to work on G80.
 
For every extra GPU there are diminishing returns. Let's wait and see what the X2 can do before we think about an X3. The main reason is power consumption, which will probably end up around 250 watts just for an X2. Can they make an X3? Would it even be possible?
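To put that diminishing-returns worry into rough numbers, here is a tiny sketch with purely made-up figures; neither the per-GPU power nor the scaling factors are measured or official, and the 125 W share is only chosen so that an X2 lands near the 250 W mentioned above:

# Purely illustrative: power tends to scale close to linearly per extra GPU,
# while the performance each additional GPU adds tends to diminish.
POWER_PER_GPU_W = 125          # assumed per-GPU share of board power, not a real spec
PERF_GAIN = [1.0, 0.7, 0.5]    # assumed performance added by the 1st, 2nd, 3rd GPU

for n in (1, 2, 3):
    perf = sum(PERF_GAIN[:n])            # performance in "one GPU" units
    power = n * POWER_PER_GPU_W          # watts, ignoring shared board overhead
    print(f"x{n}: ~{perf:.1f}x performance at ~{power} W ({perf / n:.2f}x per GPU)")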
 
-But then, they really don't know what performance an R700 X2 [X3/X4] might bring, and they may be forced to make a GT200-X2 [variant].
I am convinced that ATi will not launch a triple or quadruple RV770 as a single board. Either they would have to make the cooling system take up four slots (which would result in a card so heavy that it would tear the motherboard apart) or hire magicians (do you know how expensive they are these days?). Right now an RV670 X2 can cope very well with almost every game out there, so what would we need >200% of its power for? By the way, P45 is coming = CrossFire support for the mainstream.
But betting is illegal in the USA.
"...land of the free my *ss."
[You don't need to point out that I am no gentleman; but I am certain there will be a GT200 X2 on the smaller process next year!
If so, then it won't be a die-shrink like G71 was to G70 (yeah, they miraculously made 30 million transistors go poof, but the functionality stayed the same), but a new core based on the GT200 architecture. You can't fit a 512-bit bus on a smaller die. And you'll need some serious bandwidth for such a powerful card, so they'd have to improve the memory controller to include GDDR5 support.
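To put some rough numbers on the bandwidth side of that argument, here is a quick back-of-envelope sketch; the data rates are assumptions picked to be roughly in the ballpark of high-end GDDR3 and early GDDR5, not the specs of any particular card:

# Back-of-envelope memory bandwidth comparison (illustrative figures only).
def bandwidth_gb_s(bus_width_bits, data_rate_gtps):
    """Peak theoretical bandwidth in GB/s: bytes per transfer across the bus
    times billions of transfers per second."""
    return bus_width_bits / 8 * data_rate_gtps

# Assumed data rates: high-end GDDR3 around 2.2 GT/s, early GDDR5 around 3.6 GT/s.
print(f"512-bit GDDR3: {bandwidth_gb_s(512, 2.2):.0f} GB/s")   # ~141 GB/s
print(f"256-bit GDDR5: {bandwidth_gb_s(256, 3.6):.0f} GB/s")   # ~115 GB/s
print(f"512-bit GDDR5: {bandwidth_gb_s(512, 3.6):.0f} GB/s")   # ~230 GB/s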
The differences are cultural, imo. And some of the English-speaking tech sites do exactly the same, Lukfi.
Well, I wouldn't know. But look, some of the guys around here consider *you* a troll, because you're pretty stubborn in your opinions. But try posting some good-looking specs on the VR-Zone forums (they are officially English-speaking, btw) and observe the responses:
#1: o_O
#2: hoot dah liao!!!!!!!!!!!
#3: (no text, just a drooling emoticon)
Cultural. Yeah.
Excuse me? I certainly don't recall Huang stating G7x was a mistake.
Like no-X said on the previous page, he evidently said it at a press conference around the time G80 was launched. Though you won't find it on nVidia's website as an official statement, I guess.
Bob Kilgarriff, ex-3dfx, was the NV4x lead designer, wasn't he? That appeared to free up nVidia engineers to work on G80.
I don't know, but I think there are more factors involved in making the design decisions than just who's the lead engineer.
 
If so, then it won't be a die-shrink like G71 was to G70 (yeah, they miraculously made 30 million transistors go poof, but the functionality stayed the same), but a new core based on the GT200 architecture. You can't fit a 512-bit bus on a smaller die. And you'll need some serious bandwidth for such a powerful card, so they'd have to improve the memory controller to include GDDR5 support.
Given the differential in memory bandwidth between high-end GDDR3 and GDDR5 in that timeframe, just moving from 4 ROPs/64-bit to 8 ROPs/64-bit (i.e. a 256-bit memory bus) would likely make the most sense IMO.
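A minimal sketch of what that reorganisation could look like on paper, assuming 64-bit memory partitions in the GT200 style and the same illustrative data rates as the earlier sketch; none of this is a confirmed configuration:

# Each memory partition pairs some ROPs with a 64-bit memory channel.
# Partition counts and data rates are assumptions for illustration only.
def partition_config(partitions, rops_per_partition, bits_per_partition, data_rate_gtps):
    total_rops = partitions * rops_per_partition
    bus_width_bits = partitions * bits_per_partition
    bandwidth_gb_s = bus_width_bits / 8 * data_rate_gtps
    return total_rops, bus_width_bits, bandwidth_gb_s

# 8 partitions of (4 ROPs, 64-bit) with ~2.2 GT/s GDDR3: the wide-bus approach
print(partition_config(8, 4, 64, 2.2))   # (32 ROPs, 512-bit, ~141 GB/s)

# 4 partitions of (8 ROPs, 64-bit) with ~3.6 GT/s GDDR5: the 256-bit suggestion above
print(partition_config(4, 8, 64, 3.6))   # (32 ROPs, 256-bit, ~115 GB/s)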
Well, I wouldn't know. But look, some of the guys around here consider *you* a troll, because you're pretty stubborn in your opinions. But try posting some good-looking specs on the VR-Zone forums (they are officially English-speaking, btw) and observe the responses:
#1: o_O
#2: hoot dah liao!!!!!!!!!!!
#3: (no text, just a drooling emoticon)
Cultural. Yeah.
I don't know if I'm right, but I suspect the reason this kind of behaviour is mainstream is that technology in general is much more mainstream in many parts of Asia. So everyone wants to talk about it; but in practice they really don't know much more than the average European or American, and the average Asian isn't magically a billion times smarter than the average European or American. So naturally, the average levels of intelligence and knowledge are lower... It's not necessary to put culture or genetics or even education into the equation, IMO.
Like no-X said on the previous page, he evidently said it at a press conference around the time G80 was launched. Though you won't find it on nVidia's website as an official statement, I guess.
Well, I guess since we can't find the exact quote, what I'm curious about is what kind of mistake he was thinking of. I think you're really not interpreting it correctly; there is no way in hell he said G7x was a mistake or a failure in absolute terms. He obviously must have thought it was suboptimal, but, duh, that's not really the point (in fact, I'd even disagree completely in terms of perf/mm² and perf/transistor; it's more power efficient, but that's about it really).
I don't know, but I think there are more factors involved in making the design decisions than just who's the lead engineer.
Nowadays certainly; 10 years ago, it was obviously a much bigger factor though.
 
k, so they just don't know what they are talking about and everything they say should be ignored... :p
 
Given the differential in memory bandwidth between high-end GDDR3 and GDDR5 in that timeframe, just moving from 4 ROPs/64-bit to 8 ROPs/64-bit (i.e. a 256-bit memory bus) would likely make the most sense IMO.
So they can get the same bandwidth with a 256-bit bus and GDDR5, but they won't lower the number of RBEs. Well, if it's balanced as it is, that seems quite logical.
I don't know if I'm right, but I suspect the reason this kind of behaviour is mainstream is that technology in general is much more mainstream in many parts of Asia. So everyone wants to talk about it; but in practice they really don't know much more than the average European or American, and the average Asian isn't magically a billion times smarter than the average European or American. So naturally, the average levels of intelligence and knowledge are lower... It's not necessary to put culture or genetics or even education into the equation, IMO.
Perhaps.
Well, I guess since we can't find the exact quote, what I'm curious about is what kind of mistake he was thinking of. I think you're really not interpreting it correctly; there is no way in hell he said G7x was a mistake or a failure in absolute terms. He obviously must have thought it was suboptimal, but, duh, that's not really the point (in fact, I'd even disagree completely in terms of perf/mm² and perf/transistor; it's more power efficient, but that's about it really).
When G80 was launched, even the GeForce 7950 GX2 could no longer compete with R580. The performance degradation in newer games was quite noticeable. Some claim it was ATi's work on the drivers, but it was more that games were shifting towards newer technologies requiring a higher ALU-to-texturing ratio, and R580 was better designed for that. However, with nVidia's excellent marketing backing it, not even the GeForce FX line was a failure for them. Because they sold the chips, and that's all that matters.
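To make the ALU-to-texturing point concrete, here is a rough comparison using the commonly quoted pixel-ALU and TMU counts for these chips; it is only a coarse illustration, since the ALUs are not directly comparable across architectures and vertex shaders are ignored:

# Rough ALU : texture-unit ratios for the chips being discussed.
chips = {
    "G71 (GeForce 7900)":  (24, 24),
    "R520 (Radeon X1800)": (16, 16),
    "R580 (Radeon X1900)": (48, 16),
}

for name, (pixel_alus, tex_units) in chips.items():
    ratio = pixel_alus / tex_units
    print(f"{name}: {pixel_alus} pixel ALUs / {tex_units} TMUs = {ratio:.0f}:1")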
 
apoppin and lukfi, last time I looked I could bet on anything I wanted to here in the US, as betting is not illegal here. Hell, I could go to Las Vegas and place a bet on anything I wanted if they had the odds for it.
 
However, with nVidia's excellent marketing backing it, not even the GeForce FX line was a failure for them. Because they sold the chips, and that's all that matters.
Let's not be ridiculous; the GeForce FX line had very bad gross margins compared to what came before and after. From a financial perspective, it was a massive failure; just look at the stock price...
 
When G80 was launched, even the GeForce 7950 GX2 could no longer compete with R580. The performance degradation in newer games was quite noticeable. Some claim it was ATi's work on the drivers, but it was more that games were shifting towards newer technologies requiring a higher ALU-to-texturing ratio, and R580 was better designed for that. However, with nVidia's excellent marketing backing it, not even the GeForce FX line was a failure for them. Because they sold the chips, and that's all that matters.

I don't recall any new game where this was the case when G80 launched. Oblivion was already on the market and was/still is very ALU-happy. What game(s) are you referring to?
 
Those flashbacks to past successes or failures get tiresome at some point. In case you haven't noticed, both IHVs have had their ups and downs through the years, and no, nothing is ever perfect.

Successes like R300 and G80 are rare overall; both architectures increased IQ and performance at the same time compared to their predecessors, and both incidentally didn't have exactly worthy competitors, which helps increase the halo effect that surrounds them.

Now back to reality: NV obviously won't be able to repeat a G80 with GT200. On the other hand, despite the rumoured huge die, a single-chip solution for the time being has more advantages than disadvantages compared to a dual-chip SKU. All IMHLO of course.
 
=>XMAN26: Perhaps apoppin isn't old enough yet?
Games? Well, Oblivion was one of them, but it was the case with almost everything that came out in 2006.
 
=>XMAN26: Perhaps apoppin isn't old enough yet?
Games? Well, Oblivion was one of them, but it was the case with almost everything that came out in 2006.

Again, aside from Oblivion, which had been out for a while already before G80 launched, I do not recall any other game that showed that kind of warped ALU focus.

On another note, I think I may have found what JHH was referring to as a mistake in the G7x design:

"In addition, our processors are C-programmable processors. It’s a radically different thing. G71 had shaders and were programmable, but they were not programmable by C. G80 is unified and programmable through C. "

http://blogs.mercurynews.com/aei/2006/11/08/nvidia_launches/
 
I don't remember specific titles, but look up a test from the time the Radeon X1950 XTX launched. The AA+AF tests especially tend to show R580 in a good light.
 
Yes, especially in quite a few reviews by Computerbase, for instance; with the slight difference that anything G7x was tested with all AF optimisations switched off and anything R5x0 was left at default. Let's overlook that one for a minute: why isn't anyone switching off optimisations nowadays in G8x/9x vs. R6x0/RV6x0 comparisons? Unfortunately, I guess you can't switch them off for the latter.
 
Again, aside from Oblivion, which had been out for a while already before G80 launched, I do not recall any other game that showed that kind of warped ALU focus.
Need for Speed Carbon was one of those. It scaled pretty nicely even between X1800 and X1900. :)

FWIW, most of those ALU-happy titles seemed to be Xbox 360 ports.
 