NVIDIA GT200 Rumours & Speculation Thread

GT200 would be cheaper to produce than the GX2 without a doubt (2 PCBs and 2 rather large dies can't be all that cheap, after all), it would perform better, and it could be priced higher. Now why wouldn't NV want to release it? They would just increase their margins while dominating the competition to an even greater degree. They could price GT200 above the GX2 while they clear out old inventory and then adjust the price later down the road. There's just no reason not to do it. IHVs always release their parts the moment they are ready.

Actually, there's been some buyer backlash to video card prices increasing at a stratospheric rate the past few years. 700 USD for a high-end video card, when you can get a high-end CPU, motherboard, and memory for the same or less (if you exclude the extremely low volume e-peen Intel Extreme procs)? Granted, if you absolutely positively have to have that extra 2-5% more performance from extremely high priced memory and motherboards, you could be looking at well over a grand. :p

Not to say it was by choice, but ATI did score some points with quite a few people for releasing relatively high performance parts for a very nice price.

Sure, they could price GT200 wherever the heck they wanted. But price it too high and you won't have to worry about supply in the channel, since very few people would be interested in buying it. At which point they'd be hurting when trying to recoup the R&D cost of the chip.

No company can ignore what consumers are willing to pay for a piece of hardware. So while they have a little latitude in pricing, they just can't pick an arbitrarily high price point and say this is it.

So GT200 would not only have to be faster, it would also have to be cost effective to produce. While there's certainly some truth to Nvidia not being pressured into releasing it, I'm also fairly certain they aren't holding it back deliberately to prolong the life of G80.

It's the other way around. The success of G80 and the lack of high end pressure from AMD/ATI has allowed them the luxury of refining and ironing out wrinkles without having to release a chip before it's optimally ready. I.e., they don't HAVE to stick to their initial release schedule if there are major or minor problems with the chip. I'm sure ATI wished they had this luxury when they were working on R600.

If GT100/200/whatever it currently is or was...had it been ready for release, Nvidia would have released it. The fact that they didn't indicates they ran into issues they felt had to be addressed.

I'm sure they don't want a repeat of past chips where advertised features didn't work even after the pre-release advertising/marketing push had promised those features. Unfortunately, while I'm sure they would have liked to postpone the release of those chips back then, they didn't have that luxury since AMD was putting pressure on them. Ergo, we ended up getting mostly working, partially defective chips that were deemed "good enough" to compete.

Regards,
SB
 
I included the DX11 thing simply as an example of an extreme forward push, having a part supporting the API before the API is even announced. :D I could just as well have said: a card that runs Crysis (as that seems to be the current favorite) with 8X AA at 2560x1600 with Very High settings.

I understood as much; I have severe doubts that even the first future D3D11 GPU would be able to hit such a high target. Some funky refresh in 2011 or later, probably yes.
 
Interesting link... what does it mean though when they say "announced"? Is that typically the time they are available as well?

I wonder what new information we'll hear between then and now, on both Nvidia and ATI fronts.
 
The thing is, from my point of view the 9800GX2 (and 3870X2) is a big failure. Really only one major game needs a power boost, and that's Crysis, and multi-GPU setups don't seem to help that game at all. Even SLI seems to have little effect. For all its brute strength, the 9800GX2 barely dents Crysis. Even if you get a slightly higher average framerate, it's decimated by very low framerate dips. I was looking into 9600GT SLI as well, which for a low price puts up some amazing performance numbers, until, again, you look at the one game that really needs it, and it makes virtually no difference. So who cares if you run COD4 at 140FPS instead of 100?
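
To illustrate that last point with some made-up numbers: the average barely registers a handful of bad frames, while the worst case is what you actually feel. A minimal Python sketch (the frame times are invented, not from any real benchmark):

Code:
# Made-up frame times in milliseconds: 95 smooth frames plus 5 nasty dips.
frame_times_ms = [10] * 95 + [100] * 5

# Average FPS = frames rendered / total time; the number reviews usually quote.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# Worst-case FPS = what the single longest frame feels like.
worst_fps = 1000 / max(frame_times_ms)

print(f"average:     {avg_fps:.0f} fps")    # ~69 fps, looks fine on a bar chart
print(f"worst frame: {worst_fps:.0f} fps")  # 10 fps, feels like a slideshow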

This is where we really need a true next-gen chip.

Also, why do AMD and Nvidia seem to know exactly what the other is doing? The 8800GT comes out... the 3870 follows. The 3870X2 comes out, the 9800GX2 follows. They shadow each other perfectly, and this stuff takes months of planning.
 
Also, why do AMD and Nvidia seem to know exactly what the other is doing? The 8800GT comes out... the 3870 follows. The 3870X2 comes out, the 9800GX2 follows. They shadow each other perfectly, and this stuff takes months of planning.

On the other hand, people with similar backgrounds, similar training, acting on the same market, targeting the same customers are somewhat likely to come up with similar solutions (within the confines of the "tools" available to them). It's not like nV and ATi are some inherently different beasts once you get down to the ppl making up both IHVs.
 
Strangely though vr-zone begs to differ:

http://www.vr-zone.com/?i=5684

Yeah, I saw that... I actually got the first link from this vr-zone article. But even if GT200 is the basis for a 9900 series, there's no indication that GT200 is their next generation monolithic chip. It could be a reference to G92b.

Look at it this way... Nvidia jumped from the 8800GTX to the 9800GTX for zero performance gain. They're now going to go from the 9800GTX to a 9900GTX with a massive jump in performance?

Edit: Sorry, nvm thought you were linking to http://forums.vr-zone.com/showthread.php?t=256538. So 9800* is just a temporary stop gap....if so I don't see the point. Why not just release a 8800 GX2 and save the 98xx series for GT200?
 
Why not just release a 8800 GX2 and save the 98xx series for GT200?
NVidia's marketing went pear-shaped when HD3870 appeared before Christmas (the whole 8800GTS mark 1/2 fiasco). So 8800/9800/9900 are scrambled and it's not very edifying trying to infer future meaning from "9900".

Jawed
 
Bah! I guess it would just be par for the course if they went and branded GT200 as another 9x00 series part.

I guess there isn't any reason to think nvidia might try and clean up their naming schemes? It's just so darn confusing trying to align all the G80s and G90s with the GT200 and NV60 when you have 8x00s and 9x00s all crisscrossing! I don't even want to get started on whether they will begin to differentiate based on node size...

Sorry for the rant, it just seems so unorganized. Maybe it's just to throw us off...
 
Edit: Sorry, nvm thought you were linking to http://forums.vr-zone.com/showthread.php?t=256538. So 9800* is just a temporary stop gap....if so I don't see the point. Why not just release a 8800 GX2 and save the 98xx series for GT200?

The whole naming scheme since G9x appeared sucks, IMHLO. First, with the advent of the 8800GT, we got an endless list of old/new 8800 derivatives, and now they've moved on to 9xxx-whatever.

Since G92, it could have gone something like this, for example:

8800GT = 8850GT
8800GTS = 8850GTS
9600GT = 8700GT
9800GX2 = 8900GX2
9800GTX = 8900GTX
9800GT = 8900GT etc etc.

Save the 9x00 stuff for whatever G92b they're planning and go for "GF10" with GT200.

Normally 9900-whatever would stand for G92b, but with the recent naming scheme stunts, who knows what is what at this point. Random rumours don't tell me personally anything either. If you check some of the links in the initial post of this thread, you'll find GT200 supposedly being a 384SP GPU named 9800GTX on 55nm.

In that vr-zone news blurb they credit CJ for the abnormally huge die claim; CJ is usually quite well informed. If the final name for GT200 turns out not to be 9900-whatever, I don't think he's responsible for the confusion.
 
8800GT = 8850GT
8800GTS = 8850GTS
9600GT = 8700GT
9800GX2 = 8900GX2
9800GTX = 8900GTX
9800GT = 8900GT etc etc.

Actually, this makes the most sense:

8800GT = 8900GT
8800GTS = 8900GTS
9600GT = 8700GT
9800GX2 = 8900GX2
9800GTX = 8900GTX
9800GT = WTF is the point of this product? I mean, seriously! Is there really that much difference between this and the 8800GT, or the 8800GTS 512? Seems incredibly pointless to me.

 
The 9800GT is supposedly capable of Tri-SLI. Other than that, I wish NVIDIA had used the above naming scheme provided by Aaron. It actually makes sense.

Guess they didn't want to call their true next generation architecture "the 9 series" but instead something else. Hopefully not GeForce X series... :rolleyes:
 
The 9800GT is supposedly capable of Tri-SLI. Other than that, I wish NVIDIA had used the above naming scheme provided by Aaron. It actually makes sense.

Guess they didn't want to call their true next generation architecture "the 9 series" but instead something else. Hopefully not GeForce X series... :rolleyes:

AFAIK no, it isn't. Tri-SLI is from the 9800GTS(?) upwards.
 
The 9800GT is supposedly capable of Tri-SLI. Other than that, I wish NVIDIA had used the above naming scheme provided by Aaron. It actually makes sense.

Guess they didn't want to call their true next generation architecture "the 9 series" but instead something else. Hopefully not GeForce X series... :rolleyes:

Tri-SLI? Man, what a useless "feature". Certainly not worth a product name change. As it is though, I'm sure some idiot will buy three and brag about it.

Wake me up when either Nvidia or ATI have some multi-GPU technology worth using. I'm surprised they haven't just copied the frame buffer a couple of times yet and counted that as FPS.

But seriously, inter-frame interpolation would get you the same effect as the current "SLI" technologies on the market.
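
Half-joking, but here's roughly what that cheap trick would look like. A minimal sketch (the function name and the NumPy stand-in frame buffers are mine, not anything either IHV actually ships): blend two already-rendered frames to mint an "extra" one that carries no new game state but still bumps the FPS counter.

Code:
import numpy as np

def interpolate_frame(prev_frame, next_frame, t=0.5):
    # Naive inter-frame interpolation: a weighted blend of two rendered
    # frames. The result contains no new game state; it only pads the FPS.
    blended = (1 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return blended.astype(np.uint8)

# Two dummy 1600x1200 RGB frame buffers standing in for real rendered frames.
prev_frame = np.zeros((1200, 1600, 3), dtype=np.uint8)
next_frame = np.full((1200, 1600, 3), 200, dtype=np.uint8)

middle = interpolate_frame(prev_frame, next_frame)  # one "free" frame per real pair
print(middle[0, 0])  # [100 100 100]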

Aaron Spink
speaking for myself inc.
 
http://www.nordichardware.com/news,7578.html

Several sources are reporting on the upcoming GT200 core from NVIDIA, which will be named GeForce 9900. The stories don't match though, when it comes to the specifics. While Expreview claims that GT200 will become the 9900GX2 and 9900GTX, VR-Zone says GeForce 9900GTX and 9900GT. None of them are very specific when it comes to the exact specifications. A user over at Chiphell is though. He says that GT200 will in fact be a dual-core GPU based on the G9x architecture.

While we're certain that GT200 is hot and will be a huge chip, it looks like it could very well be a dual-core GPU. The user says GT200 is really two G94 chips slapped together, which would mean a total of 64 TMUs, 32 ROPs and 128 shader processors. It would also result in a 512-bit memory bus and more than 900 million transistors. Hot in every sense of the word.

Dual-core makes sense on a few points; GT200 is not a proper code-name for an NVIDIA core, but would sooner indicate something like a dual-core spin. The GeForce 9900 series name also suggests that GT200 is another spin of the G9x architecture, and not a true next generation architecture.

The launch date is supposed to be early Q2.
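
For what it's worth, the quoted numbers check out against the published G94 (9600GT) figures if you take the "two G94s slapped together" rumour at face value. A quick sanity check (the ~505M transistor count is the published G94 figure; everything else simply doubles, with two 256-bit buses counted as "512-bit" the way dual-GPU marketing tends to):

Code:
# Known G94 (GeForce 9600GT) figures; doubling them reproduces the rumour.
g94 = {
    "shader processors": 64,
    "TMUs": 32,
    "ROPs": 16,
    "memory bus (bits)": 256,
    "transistors (millions)": 505,
}

rumoured_gt200 = {spec: 2 * value for spec, value in g94.items()}
for spec, value in rumoured_gt200.items():
    print(f"{spec}: {value}")

# -> 128 SPs, 64 TMUs, 32 ROPs, 512-bit bus, ~1010M transistors,
#    consistent with the "more than 900 million" in the blurb above.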

Alas, apparently still no DirectX 10.1 support.

PS GT = Twin?

;)
 