NVIDIA: Beyond G80...

Interesting... they are believable, but I actually expected a faster 8600. Also, the $30 difference between the Ultra and GT seems too small - the GT is cut down in every possible way and should be significantly slower.
 
Interesting... they are believable, but I actually expected a faster 8600. Also, the $30 difference between the Ultra and GT seems too small - the GT is cut down in every possible way and should be significantly slower.

I have to agree. The price difference between the Ultra and GT should be greater because, judging from the specs, the performance difference is a bit large.

Then again, the pricing and some of the specs could be a hoax. But I believe these are just an indication of what we can expect from the nVIDIA mid-range solutions.
 
What's the point of discussing random guesswork when many people on these very forums are likely able to make up even more believable specs themselves, though? :)

Uttar
 
True, but I don't think the focus in the first two quarters of DX10 will have anything to do with low-end and midrange... or do you actually WANT to run something like Crysis or, God forbid, Unreal Tournament on a lowly mid-range card? I think it will be a crying game for most people who assume that their mid-range DX10 cards are that much faster than their older high-end cards.

Ironically, both aforementioned games are pure D3D9.0 applications. Why would anyone not want to run games like Crysis or UT2k7 on a midrange D3D10 GPU if he can't, or doesn't want to, spend more on a GPU? You just reduce settings and/or resolution.

Going from a high-end D3D9 to a midrange D3D10 GPU would truly be somewhat of a side-grade; a real upgrade would be from a 7600GT to an 8600 Ultra (or insert whatever name for the highest mainstream GPU), for instance. Someone who usually buys high-end GPUs normally also upgrades to high-end GPUs. What am I missing exactly?
 
What am I missing exactly?

The whole focus on graphics/DX10 will be a hard pill to swallow for anyone who thinks their DX10 card will pack a real upgrade wallop. I'm just afraid the first generation of DX10 games will be a disappointment once you do exactly that, lower the settings, etc.
 
The whole focus on graphics/DX10 will be a hard pill to swallow for anyone who thinks their DX10 card will pack a real upgrade wallop. I'm just afraid the first generation of DX10 games will be a disappointment once you do exactly that, lower the settings, etc.

For the first generation of D3D10 games, even R600/G80 will be slouches. I know it's hair-splitting, but I refuse to call a D3D9 game anything else.

In any case, you can't run Oblivion on a 7600GT at 1600*1200 or above with maximum settings either, and Lord help you with any AA. "You get what you pay for" applies in any of those cases.

And again: for anyone who owns a 7900GTX/X1950XTX and wants to upgrade, there's no sensible choice other than R600/G80. Anything else would be a side- or downgrade.
 
I'm just worried that the focus is too much on graphics and that we'll all just be left with mediocre performance across the board. That is, comparing Crysis DX9 versus Crysis DX10.
 
I'm just worried that the focus is too much on graphics and that we'll all just be left with mediocre performance across the board. That is, comparing Crysis DX9 versus Crysis DX10.

As if the SM3.0 path brought any huge performance increases in Far Cry, to use a past example? How can you expect to see huge differences in a game like Crysis that was written and optimized from the get-go for D3D9.0? I wouldn't expect anything but minor performance increases from a D3D10 path.
 
I wouldn't expect anything but minor performance increases from a D3D10 path.

Me too... but the word from CES is that the DX10 path is a lot faster; somehow I can't rhyme that with what we've seen in the past, where less work meant better performance.
 
Me too... but the word from CES is that the DX10 path is a lot faster; somehow I can't rhyme that with what we've seen in the past, where less work meant better performance.

Depending on the situation, the SM3.0 path delivered from ~6% to 45% more performance over the SM2.0 path. The majority lies closer to the first percentage, yet back then I could just as easily have said "up to 45% higher performance".

I don't and can't know if it's a similar case with Crysis, but one spot where a D3D10 path would accelerate over a D3D9 path would be bump mapping. I'm not aware of what else they might have done, but my gut feeling tells me that the average difference won't be as big after all. Besides, "a lot" is quite vague.
 
"A lot" in Intel's wording would be "six to eight times as fast"...

I can imagine instructions being handled that much quicker, but I just can't see a DX9 version struggling at 10 fps and a DX10 version going happily at 60 fps with the same settings.

I'll just wait for the benches then...
 
What's the point of discussing random guesswork when many people on these very forums are likely able to make up even more believable specs themselves, though? :)

Uttar

Don't take all the fun out of this. :LOL:

It's been quite quiet these days. Alright, since you're ruining the fun, come up with believable specs and let's talk about them.
 
Going from a high-end D3D9 to a midrange D3D10 GPU would truly be somewhat of a side-grade; a real upgrade would be from a 7600GT to an 8600 Ultra (or insert whatever name for the highest mainstream GPU), for instance. Someone who usually buys high-end GPUs normally also upgrades to high-end GPUs. What am I missing exactly?

Good point.
 
Nvidia Boosts Production of DirectX 10 Graphics Chips.

Nvidia Places Additional Orders on “80nm GeForce 8000” Chip Production


Nvidia Corp., a leading developer of discrete graphics processors and core-logic sets, has reportedly placed “urgent” orders for production of its flagship GeForce 8-series lineup. The reasons behind the move are unclear, but the most likely explanation would be expectations of a tangible increase in demand for DirectX 10-compatible graphics cards.

DigiTimes web-site reports that shipments of the “urgent orders” of Nvidia’s “GeForce 8000 GTX and GTS” chips will begin in March and will be equivalent to 3 to 4 thousand 300mm wafers a month.

It is claimed that the wafers will use the “80nm process technology”. Currently Nvidia produces its GeForce 8800 GTX and GTS chips using 90nm process technology at Taiwan Semiconductor Manufacturing Corp. (TSMC); hence, it is uncertain whether the web-site actually meant a new 80nm flavour of the chip code-named G80 or additional orders of the 90nm chips. The likely scenario is that Nvidia has placed orders to manufacture products code-named G84 and G86.

Given that the die size of the G80 chip is about 420 square millimeters, it is possible to obtain 140 – 150 such chips from a single 300mm wafer, which would give 420 – 600 thousand additional high-end graphics processor candidates per month (or 1.26 – 1.8 million a quarter). The number of graphics chips that are functional is lower than the number of candidates, though; the actual yield of the G80 is unknown.

The market for graphics cards that cost from $250 to $700 accounted for 4% of the revenue – or about $200 million – of the leading add-in card suppliers in Q3 2006, according to Jon Peddie Research. The average sales price of high-end enthusiast graphics cards may be as high as $475, which means that around 420 thousand such boards are sold quarterly. Given that the yield of chips in mass production is much higher than 50%, it makes no sense for Nvidia to produce that many high-end chips.

Nvidia and TSMC did not comment on the news-story.

http://www.xbitlabs.com/news/video/display/20070116070038.html
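
For anyone who wants to check the article's back-of-envelope maths, here's a quick sketch in Python. It's purely illustrative: every figure is lifted from the quote above, and the naive gross-die estimate ignores edge loss and yield, which is why it lands a bit above the article's 140 – 150 dies per wafer.

```python
import math

# Figures taken from the X-bit labs article quoted above.
WAFER_DIAMETER_MM = 300.0   # 300mm wafers
DIE_AREA_MM2 = 420.0        # G80 die size, ~420 mm^2
HIGH_END_REVENUE = 200e6    # ~$200M for $250-$700 cards, Q3 2006
AVG_SELLING_PRICE = 475.0   # high-end ASP assumed by the article

# Naive gross dies per wafer: wafer area / die area. Ignores edge loss,
# scribe lines and defects, so it comes out above the article's 140-150.
wafer_area_mm2 = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
print(f"gross dies per wafer: ~{wafer_area_mm2 / DIE_AREA_MM2:.0f}")

# Candidate chips per month and per quarter at 3,000-4,000 wafers a month,
# using the article's 140-150 usable dies per wafer.
for wafers, dies in ((3_000, 140), (4_000, 150)):
    per_month = wafers * dies
    print(f"{wafers} wafers/month x {dies} dies -> "
          f"~{per_month / 1e3:.0f}k candidates/month, "
          f"~{per_month * 3 / 1e6:.2f}M/quarter")

# Boards the $250-$700 segment actually absorbed in a quarter.
print(f"high-end boards sold per quarter: "
      f"~{HIGH_END_REVENUE / AVG_SELLING_PRICE / 1e3:.0f}k")
```

Even at the low end of that range the candidate output (about 1.26 million a quarter) is roughly three times the ~420 thousand high-end boards the market absorbs, which is the article's argument for the extra wafers more likely being G84/G86 parts.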


At first, I considered this to be a response to the likely increased demand for "DX10" when Vista hits, mostly for integrated, low-end and mainstream GF8's, but then...:

“urgent orders” of Nvidia’s “GeForce 8000 GTX and GTS”

Could this mean a die shrink of some kind for the G81, because they know something about R600 that we don't? The GTX suffix was never used on anything but a high-end part, hence my confusion (or it could just be DigiTimes messing things up... :D).
Just speculating.
 
Well, I find myself wondering if they've got the right SKU, or possibly even the right die (market segment-wise), there.
 
May sound stupid, but will an 80nm G80 enable a GX2-style card, thermally and power-wise?
Maybe NV received some info about R600 performance and had to react "urgently" ;)?
 