NVIDIA GT200 Rumours & Speculation Thread

Status
Not open for further replies.
On the backside it seems there are four memory chips on the right, the left and the bottom, and maybe also four at the top. So it seems all the memory is on the back side of the board, very unlike the CAD drawing.
Oops, that was just the cooling system; it seems all 16 memory chips are on the front after all.
 
I think Arun was thinking about me when he wrote his posts, and I wholeheartedly agree. After having played games @ 2560x1600 I simply don't want to go back to anything less! Of course with AA and AF maxed out.
 
Personally, I'd rather play them at 1366x768 on a huge TV. 1080p is still way too expensive IMO, and ~720p allows for some high AA on many video cards.
 
I for one can't wait for the G80 era to come to a close, the amazing investment that the 8800 GTX turned out to be notwithstanding.
 
Personally, I'd rather play them at 1366x768 on a huge TV. 1080p is still way too expensive IMO, and ~720p allows for some high AA on many video cards.

Indeed. In fact my 8800GTS 640MB struggles at 1920x1200 with 16xCSAA and 16xAF in many modern, graphically intensive games.

TBH i'm looking forward to GT200 just so I can finally use that resolution properly!
 
Indeed. In fact my 8800GTS 640MB struggles at 1920x1200 with 16xCSAA and 16xAF in many modern, graphically intensive games.

TBH i'm looking forward to GT200 just so I can finally use that resolution properly!

But Mr liverpool you're due for an AMD card! :p

OT: I'm concerned that I might want to spend more money in a couple of months.
 
Errr... ;)
I'm thinking of the market which bought a 8800 GTX near launch at the full price, or which is buying a multi-GPU configuration right now.

That's not the definition of an enthusiast though. That's the definition of somebody who bought a GTX at launch at the full price who wants to play Crysis at 2560x1600.

Anyway I'm not saying we're not due new hardware. Just saying that there isn't an overarching sense of people needing more power to play the games that they do. You're gonna have one or two people like pjb but for most it's more that they're just tired of the old toys in the pram.
 
Did NVIDIA make a G80 "X2" card? No. The G80 core is too big and too hot to stuff two of them onto a 'single' card. With G92's reduced die size and power consumption, it became much more feasible to put two cores so close together. But even with G92's higher efficiency, the 9800GX2 is only just adequately cooled.

If two GT200 cores (rumored TDP >200W each) were to be put in the same place, a dual-slot aircooler wouldn't be able to effectively dissipate that kind of heat. A triple-slot design would be cumbersome and perhaps too heavy, and there aren't enough enthusiasts with watercooling to justify having a 'watercooling only' card.

If/when GT200 is shrunk to 55nm, it might be possible, but not likely before then.

I also really doubt a monster like GT200 has "awesome margins." Yields won't be great with such a big core, and at a time when $200 cards can run most games at high settings on 1920x1200 monitors, it's going to be a hard sell. It's not like average gamers are really clamoring for more performance right now.
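A back-of-the-envelope Poisson defect model illustrates why a huge die hurts yields; the die areas and defect density below are purely illustrative assumptions, not known figures for GT200 or RV770:

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Classic Poisson yield model: Y = exp(-A * D)."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-area_cm2 * defects_per_cm2)

# Assumed numbers for illustration only: a ~576 mm^2 monster die vs a
# ~256 mm^2 performance die, both at an assumed 0.5 defects/cm^2.
big_die = poisson_yield(576, 0.5)    # roughly 6% of dice defect-free
small_die = poisson_yield(256, 0.5)  # roughly 28% defect-free
```

At the same defect density the smaller die yields several times more good dice, and you fit more candidates per wafer in the first place, which is the whole margin argument in one formula.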

actually you are right ... the shrink will be what enables GTx2

thanks for the clarification!

i should clarify .. "awesome margins" for what they sell them for. The GTX is $500, right [at least]; what is the Ultra rumored to be, $650?
- that IS a good margin in my book
"average gamers" don't buy the GTX or the Ultra; me!? i am *dying* for a GT200 GTX to play Crysis on Very High at better settings than i am playing it at now - 11x8 or 10x7 w/2xAA
:rolleyes:

AMD would die for those margins or the performance [if the rumors are to be believed]

And of course, nVidia will use cut-down versions of GT200 to make the lesser midrange GPUs that the average consumer craves, sub-$400 .. and probably good value at the $250-299 magic price point [imo]
--i hear Jen breaks into song nowadays when he thinks no one is around
=P
 
Almost sounds like a poll question:

Do gamers desire a newer-generation, higher-end product for more performance and IQ, or are they really just tired of the old toys in the pram?

I would go with the performance and IQ.
 
i should clarify .. "awesome margins" for what they sell them for. The GTX is $500, right [at least]; what is the Ultra rumored to be, $650?
- that IS a good margin in my book
You missed a logical step there. You need to know the cost before you can think about margins!
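For what it's worth, gross margin is usually quoted against the selling price, so the cost term is exactly what's missing. A trivial sketch with made-up numbers (the $250 build cost is pure assumption, not a known GT200 figure):

```python
def gross_margin(price, cost):
    """Gross margin as a fraction of the selling price."""
    return (price - cost) / price

# Hypothetical: a $649 Ultra with an assumed $250 total board cost.
m = gross_margin(649, 250)  # ~0.61, i.e. about a 61% margin
```

The same sticker price with a $450 board cost would be only about a 31% margin, which is why the cost side decides whether the margin is "awesome" or merely ordinary.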
 
Almost sounds like a poll question:

Do gamers desire a newer-generation, higher-end product for more performance and IQ, or are they really just tired of the old toys in the pram?

I would go with the performance and IQ.

Is that a trick question? When would people ever not want more performance and IQ? My point is that the performance gain is not going to have a material impact on the experience with the vast majority of games and configurations out there.

IQ is far more dependent on developers making use of future hardware performance than it is on slapping AA and AF on older titles, so faster hardware isn't going to magically make current games look better. This is a circular argument. I was simply responding to Arun's claim that people are clamouring for more single-chip performance so they can play Crysis. That's simply not the case.
 
It was also a response to PJB's view which you even offered:

You're gonna have one or two people like pjb but for most it's more that they're just tired of the old toys in the pram.

What did PJB desire?

Indeed. In fact my 8800GTS 640MB struggles at 1920x1200 with 16xCSAA and 16xAF in many modern, graphically intensive games.

Is this just Crysis?

It's about having the ability to utilize a higher resolution with some impressive levels of IQ in the more modern engines - hence the craving for a newer generation product. A fair view, and certainly more than one or two people feel this way.

So, do gamers desire a next-generation product to improve performance and IQ over their existing products, or are they really just tired of old toys in a pram?
 
You missed a logical step there. You need to know the cost before you can think about margins!

actually i didn't

i was asking for clarification from you guys when i *suggested* my prices .. they are indeed my guess; and frankly i am thinking nVidia wants near $700 for their Ultra if it stomps the 4870x2
-When i say thinking, i mean my personal speculation based on HW companies' history of over-pricing - again, imo - when the competition is not stellar compared to the performance leader, pricing tends to go up
- and whoever holds the performance crown, their midrange just needs to be priced above the midrange of the competing company if it is the better performer - even if the manufacturing costs are similar

what i guess i was asking for was - Discussion: What do you think GT200-GTX and ultra will be priced at?
-i am just saying "high" in comparison to r700's fastest
 
What did PJB desire?

To run Nvidia's maximum supported AA mode at one of the highest resolutions available, on a card that's equivalent to the $150 9600GT? That's hardly representative or reasonable. I also have a 640MB GTS, and if performance and IQ were a real issue there are products out there right now that are much faster (and appropriately more expensive) that I could trade up to today.

People will always want more, and it won't be any different the day after new hardware is released. But how many people do you actually see nowadays complaining about performance in games besides Crysis?

Check the sigs of some of the guys criticizing Nvidia's current product strategy or ATI's lack of competition....you'll see that they're running 2 and 3 year old hardware. A lot of the noise is coming from guys who don't even buy the stuff!
 
$599-$649 sounds about right.
That $499 might be the "GTS" model.

Only if it completely stomps the upcoming 4870. If the performance is close, Nvidia won't have a choice but to price the GTS near the 4870 ($399, or was it $299? I forget already) and the GTX will then naturally be priced around $499.

Significantly lower than I'm sure Nvidia wishes it could price the GTX.

For Nvidia to continue with their historical (up to G80) trend of maintaining or increasing the price premium for enthusiast-class cards, GT200 is going to have to walk all over the 4870 just as G80 walked all over R600.

A trend that was quite soundly broken with the release of the 9800 GTX due to price pressure in the midrange/performance sector from the 3870.

Regards,
SB
 
Only if it completely stomps the upcoming 4870. If the performance is close, Nvidia won't have a choice but to price the GTS near the 4870 ($399, or was it $299? I forget already) and the GTX will then naturally be priced around $499.

I haven't seen any rumours yet that suggest a die size exceeding the 240-260 square millimetres (@55nm) for RV770. Just how many units can AMD stuff inside those 50-70mm^2 more than RV670 that it would be able to beat a hypothetical new high-end GPU from NV, where rumours so far speak mostly of a monstrous die size? Or better yet, why would AMD even think of combining two chips on one PCB as a high-end solution?

For Nvidia to continue with their historical (up to G80) trend of maintaining or increasing the price premium for enthusiast-class cards, GT200 is going to have to walk all over the 4870 just as G80 walked all over R600.

I thought it had been officially admitted by AMD that they intend to address the high-end desktop market with dual-chip configs. For such a strategy to make sense, I have the feeling that combining two performance chips would be the best solution. Anything exceeding 300 square millimetres of die size per chip sounds rather like a strategy that intends to address the high-end segment with single-chip solutions.

A trend that was quite soundly broken with the release of the 9800 GTX due to price pressure in the midrange/performance sector from the 3870.

The 8800GT was released months before that one, and at even lower prices. Performance was already just a relatively small notch below the 8800GTX, and higher than or sometimes the same as the former 8800GTS (depending on memory amount).
 
$599-$649 sounds about right.
That $499 might be the "GTS" model.

I also agree. Assuming GT200 shares the same architectural traits with G80, the 240SP figure points to GT200 having 15 clusters (8x2 scalar ALUs per cluster), whereas a full-fledged GT200 probably has all 16 clusters enabled.
 
I also agree. Assuming GT200 shares the same architectural traits with G80, the 240SP figure points to GT200 having 15 clusters (8x2 scalar ALUs per cluster), whereas a full-fledged GT200 probably has all 16 clusters enabled.

IMHLO if the rumoured 240SPs are true, then they're the maximum config for the highest-end chip. And why so many clusters anyway? What speaks theoretically against something like 8*3 per cluster?
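The SP total alone really doesn't pin down the layout; several cluster shapes hit 240. A quick sketch (the alternative config here is hypothetical, not a rumoured spec):

```python
def total_sps(clusters, alus_per_cluster):
    """Total scalar ALUs for a given cluster layout."""
    return clusters * alus_per_cluster

# 15 clusters of 16 SPs (8x2) gives 240, but so would 10 clusters of
# 24 SPs (8x3) -- both layouts are consistent with the 240SP rumour.
assert total_sps(15, 16) == 240
assert total_sps(10, 24) == 240
```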
 
To run Nvidia's maximum supported AA mode at one of the highest resolutions available on a card that's equivalent to the $150 9600GT? That's hardly representative or reasonable.

16x CSAA is actually NV's 4xMSAA + 16x coverage-sample implementation (unless i'm mistaken). Anyway that's the one i'm using, which is pretty modest compared to 16xQ CSAA (or whatever they call it), which uses 8xMSAA.

The one I use is supposed to have barely any performance impact over standard 4xMSAA. Add the 16xAF in there and I don't think those image quality settings are unreasonable for an upper-level G80. I also have multisampled transparency AA turned on. Again, that's not supposed to have much performance impact at all.
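For reference, my understanding of the G8x CSAA modes (colour vs coverage sample counts; treat the numbers as recollection rather than gospel):

```python
# G8x AA modes: CSAA stores extra cheap coverage samples on top of the
# full colour/Z samples, which is why 16x costs little more than 4xMSAA.
aa_modes = {
    "4xMSAA": {"color": 4, "coverage": 4},
    "8x":     {"color": 4, "coverage": 8},
    "16x":    {"color": 4, "coverage": 16},
    "8xQ":    {"color": 8, "coverage": 8},
    "16xQ":   {"color": 8, "coverage": 16},
}
```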

The reason I want to play all games at those settings is simply convenience - I don't want to have to mess around in the CP every time I play a game. Plus I just want consistency in image quality between my games. That way I won't notice jaggies in a game that uses lower settings :smile:

So basically what i'm looking for is a card that will let me turn on those settings and forget about them - then allow me to play every game at my monitor's native resolution (something I don't have a choice about, given the size of monitor I wanted).

Examples of games in which I can't do this are....

Oblivion (modded)
ST Legacy
Bioshock
CoD4
CM: Dirt
RD: Grid
UT3
Crysis
Lost Planet

And many more that I can't think of right now. In some cases the performance is so far below what I would require (a solid 30 fps is fine) that there is no way even a 9800GTX could pull off 1920x1200 at those settings.

I do agree with your position that the GPU power available today is plenty to make most people very happy, and there certainly isn't an issue with me being able to enjoy any of those games. It's just that running at 1920x1200 with good, but not outlandish, image quality settings isn't quite there yet for a few of the most stressful games IMO (unless you use SLI of course).
 