NVIDIA: Beyond G80...

Funny to see how fast this almost two-month-old forum post has spread across the net; almost every big hardware forum already has a topic for it, and most users are talking about it as if it were an officially confirmed specification :smile:
If I had to choose between the two sources, VR-Zone is a better source for NV rumors about G92 than a forum member's post :smile:

Why would NV try to release a 65nm high-end GPU in November when they have no competition for it?
It's much easier to start on 65nm technology with something other than the high-end GPU. Time is on NV's side; they don't need to take any risk. The step from 90nm to 65nm is big and the risk is high; it's not a "simple" shrink.

I think G92 will be a nice 256-bit performance card with 8800 GTS performance (or slightly lower or better), but it will cost much less to make than the 8800 GTS, so it will be cheap and NV will earn more money on it.

A G9x high-end GPU at 65nm is more likely in Q1 2008.
 
Why don't you see the FX as superscalar? I'm not all that familiar with it, but I had the impression it is also superscalar.
 
Why would NV try to release a 65nm high-end GPU in November when they have no competition for it?
Because it'll be a year since G80 came out. The time when they would have started working on G92 would have been several years ago; back then there could have been no way of predicting that ATI would fail quite so miserably when it came to providing competition for G80, so planning a new high-end chip one year after the previous one would have made sense. By the time it became obvious that there wasn't any competition, it would have been too late to start working on a completely new design for release in early 2008.

I suppose Nvidia could just take 6 months holiday and do nothing at all until ATI catches up, but I can't see the shareholders liking that. :)
 
Why would NV try to release a 65nm high-end GPU in November when they have no competition for it?
It's much easier to start on 65nm technology with something other than the high-end GPU. Time is on NV's side; they don't need to take any risk. The step from 90nm to 65nm is big and the risk is high; it's not a "simple" shrink.

I think G92 will be a nice 256-bit performance card with 8800 GTS performance (or slightly lower or better), but it will cost much less to make than the 8800 GTS, so it will be cheap and NV will earn more money on it.

That's what I also think. Enthusiasts will be served by a G92 GX2 card, so everyone will be happy. :D

And so NV can concentrate more on the D3D10.1 high-end GPU, which may arrive alongside the G9x midrange in spring.
 
They (ATI) are calling it the second generation. But does it really matter? Especially when their second generation can't keep up with NV's first?
Are you comparing architectures (i.e. the way unification is implemented), or the performance of particular products? I think ATi's approach is better, because G80 has some limitations that prevent it from showing the full potential of a unified core. I mean operations with many vertices: G80 has a fast unified core, but a slow front-end keeps the advantages of unification from showing, and in many situations where G80 works with many vertices, its vertex/geometry shader performance isn't significantly better than on last-gen non-unified GPUs.
 
its vertex/geometry shader performance isn't significantly better than on last-gen non-unified GPUs.

Maybe it shouldn't be? ;) NV also wants to sell Quadro cards.

And GS performance isn't so bad in the newest tests with new drivers -> look at RM3D 2.0.
 
Because it'll be a year since G80 came out. The time when they would have started working on G92 would have been several years ago; back then there could have been no way of predicting that ATI would fail quite so miserably when it came to providing competition for G80, so planning a new high-end chip one year after the previous one would have made sense. By the time it became obvious that there wasn't any competition, it would have been too late to start working on a completely new design for release in early 2008.

I suppose Nvidia could just take 6 months holiday and do nothing at all until ATI catches up, but I can't see the shareholders liking that. :)

Shareholders know what happened to ATi when it chased the smaller process node, and shareholders see how big a lead NV has in the discrete GPU segment. I think most of them would choose the "less risk" route, not the high-risk one of a high-end GPU at 65nm.

Look at what happened to L.P.H. AMD again with their 390-million-transistor 65nm midrange GPU (it's more of a 65nm low-end upper-half GPU). I don't think NV needs to take such a big risk with a 1-billion-transistor high-end GPU. It's clearly visible from R600's performance (I think 95% of users have already realized that no RX6xx "magic driver" is coming to turn the whole picture around 180 degrees) that AMD can't do anything to catch up; they don't even have a card in the 8800 Ultra segment. A 65nm R600 won't change anything, and a 2xRV670 card won't change anything either.

Even without releasing any 65nm high-end GPU this year, NV will still hold ~80% of the DX10 discrete graphics segment at the end of the year, and almost all DX10 titles coming this year are part of the TWIMTBP program, so there is zero pressure.

(Everything that's happening is a very bad situation from the user's perspective, but that's another story.)
 
its vertex/geometry shader performance isn't significantly better than on last-gen non-unified GPUs.

Ok vertex_shader ^^, now get off your arse and start performing better! Now! ;)
 
8800GT against 2900Pro(?) in video acceleration:
[attached screenshot: video acceleration comparison chart]

http://www.hardspell.com/english/doc/showcont.asp?news_id=1111

Could this be G92?
 
8800GT sounds promising if true. That could finally be a suitable card for HTPC users like me who can't use double-slot power hogs.
 
Amusing really, since 8800GTx can't run full resolution, 1920x1080, HDCP-protected content.

Jawed

I've never heard that before.
Isn't that only of interest to those who have a monitor that requires dual-link DVI to work (i.e., a 30-inch LCD, usually with 2560 x 1600 native resolution)?
To my knowledge, a single-link DVI output with HDCP can drive a 1920 x 1200 @ 60Hz display, more than enough for a 1080p Blu-ray or HD-DVD, but I might be mistaken.
 
Amusing really, since 8800GTx can't run full resolution, 1920x1080, HDCP-protected content.

Jawed

It can.
I've never heard that before.
Isn't that only of interest to those who have a monitor that requires dual-link DVI to work (i.e., a 30-inch LCD, usually with 2560 x 1600 native resolution)?
To my knowledge, a single-link DVI output with HDCP can drive a 1920 x 1200 @ 60Hz display, more than enough for a 1080p Blu-ray or HD-DVD, but I might be mistaken.
You're right.
The source of this myth:
http://uk.theinquirer.net/?article=35966
* CORRECTION We mean "over" 1920 by 1200 rez. Sorry.
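A quick back-of-the-envelope check of the single-link claim, using the usual CEA-861 / CVT reduced-blanking timings and the nominal 165 MHz single-link TMDS clock limit (the timing figures are approximate and only for illustration):

```python
# Rough pixel-clock check against the nominal 165 MHz single-link DVI limit.
# Timings are the common CEA-861 / CVT-RB totals (visible pixels plus blanking).
SINGLE_LINK_MHZ = 165.0

# (label, horizontal total, vertical total, refresh in Hz)
modes = [
    ("1920x1080@60 (CEA-861)", 2200, 1125, 60),
    ("1920x1200@60 (CVT-RB)",  2080, 1235, 60),
    ("2560x1600@60 (CVT-RB)",  2720, 1646, 60),
]

for label, h_total, v_total, hz in modes:
    clock_mhz = h_total * v_total * hz / 1e6
    verdict = "fits single link" if clock_mhz <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{label}: ~{clock_mhz:.1f} MHz -> {verdict}")
```

That works out to roughly 148.5 MHz for 1080p and about 154 MHz for 1920x1200, both under the 165 MHz single-link ceiling; only the 30-inch 2560x1600 panels (~269 MHz) actually need dual link.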
 