NVIDIA GT200 Rumours & Speculation Thread

=>XMAN26: Perhaps apoppin isn't old enough yet?
Games? Well, Oblivion was one of them, but that was the case with almost everything that came out in 2006.

Baby boomer .. not as old as Bush, however
=P
i'm too old to change much, perhaps, and i apparently stopped 'trolling', just as all of us apparently stopped disrespectfully disagreeing with each other. What happened? i am still called stubborn, yet you guys have somewhat accepted my posts - what you formerly called stupid is now just improbable or fringe. i do put a lot of thought into my posts, even if they are indeed off-the-wall. That is real progress imo.

OK, we also know that there are Engineering Samples of the 3870x3. That tells me that a 4870x3 is possible; certainly with the bump/shrink:

http://www.bit-tech.net/news/2008/03/28/asus_shows_off_its_hd_3850_x3_trinity/1

That also tells me we have zero clue what an AMD X3 will do against a GT200 GTX or Ultra performance-wise. If the X3 somehow wins, then we see a GT200x2 .. that's my reasoning. That is all.

"In addition, our processors are C-programmable processors. It’s a radically different thing. G71 had shaders and were programmable, but they were not programmable by C. G80 is unified and programmable through C. "

That is all i could find, and it really does not admit any mistake; just a logical core evolution. OTOH, NV30 was an admitted mistake with consequences for nVidia of a much higher magnitude, imo. There is a difference between the so-so G7x and the big flop NV30.
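
As an aside, for anyone wondering what "programmable through C" actually means in practice, here is a minimal sketch of the kind of CUDA kernel G80 can run; the kernel name, buffer size and scale factor are just made up for illustration, nothing from the interview:

[code]
#include <stdio.h>

// Minimal CUDA sketch: each GPU thread scales one element of an array.
// Names and sizes are illustrative only.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main(void)
{
    const int n = 1024;
    float host[1024];
    for (int i = 0; i < n; ++i)
        host[i] = (float)i;

    float *dev;
    cudaMalloc((void **)&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all n elements.
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("host[10] = %f\n", host[10]); // expect 20.0
    return 0;
}
[/code]

Nothing fancy, but that is the point of the quote: G80 runs plain C code like this through CUDA, which G71's shader model could not do.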


I am still going to say "cultural" .. the US/European tech sites are just very different from the Asian ones. i can find great and bad examples everywhere. If there is the slightest fault here, it might be the extreme homogeneity of opinion and the intolerance for differing opinions.

so .. a bet on a GT200x2 [variant] by the end of '09? Is a bet legal here? And most importantly, is this fake?:

http://www.pczilla.net/en/post/13.html

We also heard that GeForce GTX 280 graphics card would be the last single chip design from NVIDIA. After GeForce GTX 280, NVIDIA will only bring graphics cards with double and multiple cores design.
 
Lukfi said:
Well I wouldn't know. But look, some of the guys around here consider *you* a troll, because you're pretty stubborn in your opinion. But try posting some good looking specs on VR-zone forums (they are officially English speaking btw) and observe the responses:
#1: o_O
#2: hoot dah liao!!!!!!!!!!!
#3: (no text, just a drooling emoticon)
Cultural. Yeah.

I don't know if I'm right, but I suspect the reason this kind of behaviour is mainstream there is that technology in general is much more mainstream in many parts of Asia. So everyone wants to talk about it; but in practice they really don't know much more than the average European/American, and the average Asian isn't magically a billion times smarter than the average European/American. So naturally, the average level of intelligence and knowledge on those forums is lower... It's not necessary to put culture or genetics or even education into the equation, IMO.

:LOL:

I found this hilarious, as Lukfi was spot on about the VR-Zone forum response impersonations.

I'm getting kind of irritated at the lack of solid news about GT200/RV770, so I'll talk about the "cultural differences" of Asian and Western tech sites.

First of all, there isn't much of a cultural difference, as both Lukfi and Arun pointed out. Arun was pretty much right when he said that tech is more mainstream in Asia, and thus more people want to talk about it (despite not having a clue).
Secondly, to be quite honest, Chiphell/VR-Zone are pretty trash when it comes to the quality of content/discussions. They're great for getting news scoops, but if you're going to VR-Zone for an intelligent discussion, you're in the wrong place. It's like going to Fudzilla, The Inquirer or HardOCP to have a meaningful conversation.

A place where you can discuss hardware on a deeper, more meaningful level would be PC Watch (a Japanese, Akihabara-based site, IIRC). Sites like that are rare gems, though (just like B3D ;) ).

EDIT: Names corrected.
 
OK, we also know that there are Engineering Samples of the 3870x3. That tells me that a 4870x3 is possible; certainly with the bump/shrink:

http://www.bit-tech.net/news/2008/03/28/asus_shows_off_its_hd_3850_x3_trinity/1

That also tells me we have zero clue what an AMD X3 will do against a GT200 GTX or Ultra performance-wise. If the X3 somehow wins, then we see a GT200x2 .. that's my reasoning. That is all.

Just out of curiosity, why would a buggy, non-working (it wasn't shown to work), watercooled AIB prototype of a 3870x3 lead you to believe there will be a production 4870x3, when it never even made it into production itself?
 
[rant]
I am experiencing a lot of noise in this thread. Would it be too extreme for the mods to use the ban stick on this new breed of massive spam posters? Or should I just use the ignore feature? Some posts read like machine-generated garbage converted to text.
[/rant]
 
A GT200 at 256-bit, maybe with a reduced cluster count, at 45nm - a GX2 from that should be easy to make.
In an interview, NV already confirmed that there will be future GX2 SKUs, always at the end of a generation.
 
Just out of curiosity, why would a buggy, non-working (it wasn't shown to work), watercooled AIB prototype of a 3870x3 lead you to believe there will be a production 4870x3, when it never even made it into production itself?
Agreed.

Just look at the cooling setup that ASUS had to use: 3 MXM modules, all connected via dual heatpipes to a waterblock.

A mass-manufactured, [stock] air-cooling setup would be unfeasible.
 
Yes, especially in quite a few reviews by ComputerBase, for instance; with the slight difference that anything G7x had been tested with all AF optimisations switched off and anything R5x0 left at default.
Perhaps that's because X1000-series boards are, by default, set to the Quality preset (with the exception of the Radeon X1300 Pro, I think). Anyway, the settings used on CB were chosen to produce the same quality on both ATi and nVidia hardware. Of course, when it comes to AF quality, nVidia was still lagging behind - their chips couldn't do AF without angle optimizations (R520 and later can) and had problems with texture shimmering, at least on lower quality settings.
Let's overlook that one for a minute: why isn't anyone switching off optimisations nowadays in G8x/9x vs. R6x0/RV6x0 comparisons? Unfortunately, I guess you can't switch them off for the latter.
I think that's because both are already set to high quality by default and the differences are very subtle.
Quasar said:
Need for Speed Carbon was one of those. Scaled pretty nicely even between X1800 and X1900.
Carbon also came to my mind, but it's not a good example. I don't know why, but while it runs very well on R5xx-based boards and scales nicely with the number of pixel shaders, on G80 it didn't run much faster than on the R580. In other games, the difference was much bigger.

=>Wesker: it's KF, not FK. Thanks.
 
Well, the GT200 doesn't need a GX2 design; its performance per watt, and per mm², is higher than any card before it, even the 9800 GX2.
 
Well, the GT200 doesn't need a GX2 design; its performance per watt, and per mm², is higher than any card before it, even the 9800 GX2.
That's just your opinion. I wonder if that will change when the next Crysis-like game comes out and not even the GT200 can produce playable framerates. The 8800 GTX was a god with just about every game (except Crysis), and they still came out with the 9800 GX2. I wasn't surprised at all - they want more money, so they whip out something people will give their left arm for. GX2 $$$$ variant.
 
=>Wesker: it's KF, not FK. Thanks.

My apologies.

Well, the GT200 doesn't need a GX2 design; its performance per watt, and per mm², is higher than any card before it, even the 9800 GX2.

With no benchmarks of GT200 out, I would say it's a bit too early to compare its perf/watt with other GPUs.

Secondly, the 9800 GX2 shouldn't be used as a reference for perf/watt or perf/mm^2.
 
With no benchmarks of GT200 out, I would say it's a bit too early to compare its perf/watt with other GPUs.

Secondly, the 9800 GX2 shouldn't be used as a reference for perf/watt or perf/mm^2.

....:?: Why are we even talking about GT200 performance if we haven't seen any true benchmarks? A GX2 would be a discussion for after the GT200 comes out.
 
[Image: 200805201808234174cz4.png]


Got this from another forum; sadly, no link to the original thread where he got it from.
 
You can't fit a 512-bit bus on a smaller die. And you'll need some serious bandwidth for such a powerful card, so they'd have to improve the memory controller to include GDDR5 support.
i don't understand why they couldn't fit it. As for GDDR5, i don't see them redoing the memory controller for a GX2 variant - they haven't before. More importantly, why would GDDR3 limit them if it doesn't limit already-existing cards? Sure, the performance boost would be great, but they don't have to; there are a lot more factors than memory bandwidth that can screw up a sandwich card.
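
To put some rough numbers on the bandwidth side of that, since it keeps coming up: bandwidth is just bus width times effective data rate, so a 256-bit bus needs roughly twice the data rate of a 512-bit one to deliver the same figure, which is exactly where GDDR5 would come in. A quick back-of-the-envelope sketch; the clock figures are example values typical of the era, not leaked GT200/RV770 specs:

[code]
#include <stdio.h>

/* Peak bandwidth in GB/s = (bus width in bits / 8) * effective data rate in GT/s.
 * The data rates below are example values, not actual GT200/RV770 specs. */
static double bandwidth_gbs(int bus_bits, double data_rate_gtps)
{
    return (bus_bits / 8.0) * data_rate_gtps;
}

int main(void)
{
    /* e.g. GDDR3 at ~1.1 GHz -> ~2.2 GT/s effective (DDR) */
    printf("512-bit GDDR3 @ 2.2 GT/s: %.0f GB/s\n", bandwidth_gbs(512, 2.2));
    /* e.g. GDDR5 at ~0.9 GHz base clock -> ~3.6 GT/s effective */
    printf("256-bit GDDR5 @ 3.6 GT/s: %.0f GB/s\n", bandwidth_gbs(256, 3.6));
    return 0;
}
[/code]

That prints roughly 141 GB/s vs. 115 GB/s, so with those example clocks a 256-bit GDDR5 card would still trail a 512-bit GDDR3 one a bit; whether that matters in practice is exactly the open question here.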
 