NVIDIA GT200 Rumours & Speculation Thread

To run Nvidia's maximum supported AA mode at one of the highest resolutions available, on a card that's equivalent to the $150 9600GT? That's hardly representative or reasonable. I also have a 640MB GTS, and if performance and IQ were a real issue there are products out there right now that are much faster (and appropriately more expensive) that I could trade up to today.

People will always want more and it won't be any different the day after new hardware is released. But how many people do you actually see nowadays complaining about performance in games besides Crysis?

Check the sigs of some of the guys criticizing Nvidia's current product strategy or ATI's lack of competition... you'll see that they're running two- and three-year-old hardware. A lot of the noise is coming from guys who don't even buy the stuff!

16x AA isn't Nvidia's official maximum supported AA mode with a single GPU. 16x is 4x MSAA with the CSAA feature -- we're not talking 16xQ, which is 8x MSAA with the CSAA feature. DirectX 10 content takes a pretty substantial hit here at times, and as time passes that will become much more important. There are things like adding AA, resolution and transparency AA technologies to consider as well. Personally, I'm targeting DirectX 10 content!
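
For reference, here's how I understand G80's CSAA modes map to color/Z samples versus coverage samples. This is a sketch from memory, so treat the exact sample counts as my assumption rather than an official table:

Code:
# G80/G9x coverage-sampled AA (CSAA) modes as I recall them -- the sample
# counts are my recollection, not an official Nvidia table.
CSAA_MODES = {
    # mode: (color/Z samples, coverage samples)
    "8x":   (4, 8),    # 4x MSAA underneath, plus 8 coverage samples
    "16x":  (4, 16),   # still only 4x MSAA underneath -- the mode discussed above
    "8xQ":  (8, 8),    # true 8x MSAA
    "16xQ": (8, 16),   # true 8x MSAA plus 16 coverage samples
}

for mode, (color, coverage) in CSAA_MODES.items():
    print(f"{mode:>5}: {color} color/Z samples, {coverage} coverage samples")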

I disagree with you Trin, that's all. I understand the point of people getting tired of their hardware if they've had it for some time... but DirectX 10 needs a good kick in the ass, and I feel this may be a very important reason for gamers to consider upgrading -- not because they're tired of their old toy in the pram.
 
Oh, I'm pretty damn sure GT200, G100, NV55 and D10U are all perfectly correct codenames. I don't know if I should be saying that either, but I'm so sick and tired of people saying this or that isn't a proper codename that I just thought I had to let that out. And the latter pretty clearly indicates it won't be called 9900 anyway...

Now you're talking!!

Oh boy, I can't wait for a new shiny nVidia card.
 
I disagree with you Trin, that's all. I understand the point of people getting tired of their hardware if they've had it for some time... but DirectX 10 needs a good kick in the ass, and I feel this may be a very important reason for gamers to consider upgrading -- not because they're tired of their old toy in the pram.

No we both agree that DX10 needs a good kick. Where we disagree is on who should be doing the kicking. Faster hardware isn't going to improve the lackluster DX10 efforts we have now. The onus is on developers to get up to speed. I just got back into console gaming and the more I see what devs have done with Xenos and RSX class hardware the more demanding I've become of developers in the PC space, console budget and platform advantages notwithstanding.

This is way OT but it's really annoying that in order to get an atmospheric and thrilling game like COD4 it has to be cross-platform. I'm all for that but it's insulting that PC exclusives rarely deliver that experience. And it's not for lack of powerful hardware.
 
Oh, I'm pretty damn sure GT200, G100, NV55 and D10U are all perfectly correct codenames. I don't know if I should be saying that either, but I'm so sick and tired of people saying this or that isn't a proper codename that I just thought I had to let that out. And the latter pretty clearly indicates it won't be called 9900 anyway...

Well that's good news. A 9900 series would have simply added to the folly that is Nvidia's current lineup. Any idea whether they will keep producing the 9800GTX at $300 when the new stuff hits? I'd be surprised if they kept a "GTX" card at that market position.
 
I have a completely different POV on hardware/software timelines, always of course IMHLO. I don't consider a bunch of games with some D3D10 path a pure D3D10 game. It took eons until the first pure D3D9.0 games ended up on shelves, and even today's D3D10 GPUs seem to have a hard time with some of those. Someone would be pushing the envelope if he tried to play a game like Oblivion, for instance, on an R300 with all the bells and whistles at a halfway decent resolution; things haven't changed much since then.

By the time real D3D10 games finally appear on shelves, even a GF-Next will look like an IGP compared to the GPU hardware of that time.
 
No offense intended, but as far as I can tell, Wavey's post made perfect sense while your reply made none. Gross margin equals (Revenue − Cost of Revenue)/Revenue. Unless you know what the costs are, the price is meaningless.
I believe I DO know =P

What do you think they are?

A $650 retail Ultra would cost nVidia how much to make [figure their costs incl. R&D]?
- I see a FAT margin


And to clear things up: there is NO "GT-100".

By the time real D3D10 games finally appear on shelves, even a GF-Next will look like an IGP compared to the GPU hardware of that time.
I don't think so... Crysis is a pretty demanding game, and I think we will only maybe get "twice as demanding" by the time we go from DX10.1 to DX11. And I think the shaders are already 'there' in R700 and GT200.
 
A $650 retail Ultra would cost nVidia how much to make [figure their costs incl. R&D]?
- I see a FAT margin
Uhhh, first, gross margins don't include R&D - you're thinking of operating margins. Secondly, how much do you think a 400-600mm² chip costs? And 16 memory chips? And a mega-PCB? And a huge cooler? It's not peanuts. Their margins *might* be fat, but you really don't have enough data to claim so reliably, especially for an unreleased product.
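
To make that concrete, here's a rough back-of-the-envelope sketch in Python of how such an estimate has to be built. Every input below (wafer price, die size, yield, BOM line items, ASP split, per-unit R&D) is an invented placeholder, not a known figure; the point is only the shape of the calculation: dies per wafer and yield drive the chip cost, gross margin excludes R&D, operating margin does not.

Code:
import math

# --- All inputs are hypothetical placeholders, NOT real NVIDIA figures ---
WAFER_COST = 5000.0     # $ per 300mm wafer (assumed)
DIE_AREA = 500.0        # mm^2, mid-range of the 400-600mm^2 guess above
WAFER_DIAMETER = 300.0  # mm
YIELD = 0.50            # fraction of dies that work (assumed)

def dies_per_wafer(die_area, diameter):
    # Classic approximation: usable wafer area over die area, minus an
    # edge-loss term for partial dies at the wafer's rim.
    r = diameter / 2.0
    return math.pi * r**2 / die_area - math.pi * diameter / math.sqrt(2.0 * die_area)

good_dies = dies_per_wafer(DIE_AREA, WAFER_DIAMETER) * YIELD
chip_cost = WAFER_COST / good_dies

# Hypothetical bill of materials for one board (all numbers invented)
bom = {
    "GPU die (computed above)": chip_cost,
    "16 memory chips": 16 * 6.0,
    "mega-PCB": 40.0,
    "huge cooler": 25.0,
    "assembly, test, misc": 30.0,
}
cost_of_revenue = sum(bom.values())

asp = 650.0 * 0.75   # assume AIBs/retail keep ~25% of the $650 street price
rnd_per_unit = 60.0  # amortized R&D per unit sold (pure guess)

gross_margin = (asp - cost_of_revenue) / asp                     # excludes R&D
operating_margin = (asp - cost_of_revenue - rnd_per_unit) / asp  # includes R&D

print(f"good dies/wafer: {good_dies:.0f}, chip cost: ${chip_cost:.0f}")
print(f"gross margin: {gross_margin:.0%}, operating margin: {operating_margin:.0%}")

Nudge the yield or wafer-price guesses and the margins swing by tens of points, which is exactly the point above: without the real cost inputs, the retail price alone tells you nothing about margins.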
 
No we both agree that DX10 needs a good kick. Where we disagree is on who should be doing the kicking. Faster hardware isn't going to improve the lackluster DX10 efforts we have now. The onus is on developers to get up to speed. I just got back into console gaming and the more I see what devs have done with Xenos and RSX class hardware the more demanding I've become of developers in the PC space, console budget and platform advantages notwithstanding.

This is way OT but it's really annoying that in order to get an atmospheric and thrilling game like COD4 it has to be cross-platform. I'm all for that but it's insulting that PC exclusives rarely deliver that experience. And it's not for lack of powerful hardware.

I don't know that faster hardware won't improve the lackluster DX10 efforts we see now; I feel it will take a combined effort between developers and the IHVs' hardware to improve things moving forward, but we shall see.
 
Is it me or... GT200 --> GeForce "Ten" 200? :LOL:

By the time GF-Next hits the shelves to replace the high end, I'm guessing it's the end of the 9800GX2 and maybe the 9800GTX, with both being replaced by 55nm variants of G92 cards.
 
Uhhh, first, gross margins don't include R&D - you're thinking of operating margins. Secondly, how much do you think a 400-600mm² chip costs? And 16 memory chips? And a mega-PCB? And a huge cooler? It's not peanuts. Their margins *might* be fat, but you really don't have enough data to claim so reliably, especially for an unreleased product.
Sure I do... I will do an analysis if you require.


All I know - what is necessary to make my statement - is that GT200 costs less than G80 - everything: less R&D, less costs [except inflation]... nVidia will make *sure* the margins are fat, OK?
UNLESS there is a miracle and R700 kicks GT-10-Something's ass.

And *expect* GT-200 the moment R700 is announced
- that is when they will finalize their clocks and gear up for the retail market.

^Feel free to disagree with my prediction^
- and hold me to it [it is a good way to get rid of me - you can say I am a false profit =P]
 
What makes you think GT200 isn't still being charged for the same R&D that produced G80?
 
All I know - what is necessary to make my statement - is that GT200 costs less than G80 - everything: less R&D, less costs [except inflation]... nVidia will make *sure* the margins are fat, OK?
UNLESS there is a miracle and R700 kicks GT-10-Something's ass.

Hmmm, opinion based on made-up facts? Nice approach :)
 
Sure I do... I will do an analysis if you require.


All I know - what is necessary to make my statement - is that GT200 costs less than G80 - everything: less R&D, less costs [except inflation]... nVidia will make *sure* the margins are fat, OK?
UNLESS there is a miracle and R700 kicks GT-10-Something's ass.

And *expect* GT-200 the moment R700 is announced
- that is when they will finalize their clocks and gear up for the retail market.

^Feel free to disagree with my prediction^
- and hold me to it [it is a good way to get rid of me - you can say I am a false profit =P]

False profIt indeed.
 
This article nicely summarizes the release schedule of GT200, and ATI's a bit as well. Looks like it will be GT200 first... a welcome surprise for those looking to buy the highest-performing parts of the two (given the same card type... X2 vs. X2, etc.):

http://www.tgdaily.com/content/view/37453/135/

"highest" if assumptions going around are correct, anyway.
 
False profIt indeed.

Not accidental; I intended to make it clear it was a prediction supported not by "facts" but by insight =)
- I also offered to analyze my "margin" estimations and even include nVidia's estimated R&D costs to show GT200 will have a FAT margin
[I also know how to spell prophet]

These predictions are best analyzed after the event happens, wouldn't you say? Please look at the title of this thread and tell me I do not fit here:
"NVIDIA GT200 Rumours & Speculation Thread"
- my specialty... I am going to call it "apoppin's R&S" - as long as you know it is just that: my speculation

It is called GT200. But it might not be called GeForce 9900. (I happen to be one of the people that know more than Fudzilla & co. And no, I'm not gonna tell you :p)

We also know, and we think it is quite clever =P


Oh, I'm pretty damn sure GT200, G100, NV55 and D10U are all perfectly correct codenames. I don't know if I should be saying that either, but I'm so sick and tired of people saying this or that isn't a proper codename that I just thought I had to let that out. And the latter pretty clearly indicates it won't be called 9900 anyway...
What is G100?
... the new Tesla architecture is all GT-200
... there was no G80 and "G40"

What shall we call it except GT200 - or would nVidia appreciate having it slip out now?
- I think not

No we both agree that DX10 needs a good kick. Where we disagree is on who should be doing the kicking. Faster hardware isn't going to improve the lackluster DX10 efforts we have now. The onus is on developers to get up to speed. I just got back into console gaming and the more I see what devs have done with Xenos and RSX class hardware the more demanding I've become of developers in the PC space, console budget and platform advantages notwithstanding.
WithOUT the HW, there is no encouragement to the devs. The base of PC users that play DX10 games must GROW!
 
False profits are non-GAAP? :cool:


The issue lots of people have not recognized to this day is the inventory problem with the 8800GTS 320MB.

Currently, you can purchase an 8800GTS 320MB for less than 120~130 USD worldwide.

GT200 will be deemed the monster chip of 2008; that is unquestionable.
 