The G92 Architecture Rumours & Speculation Thread

So Nvidia's high mid-range will likely be:

8700GTS = 750MHz, 64 shaders, 512MB 256bit 1000MHz GDDR3 memory (1.0ns)

8800 GTS = 500MHz, 96 shaders, 320MB 320bit 800MHz GDDR3 memory


Which is better? Do you plump for memory amount or memory bandwidth?

I assume the 8700GTS will be cheaper; if so, do you think the 8800GTS will be phased out?
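For what it's worth, here's a quick back-of-the-envelope comparison of the two rumoured configs (a minimal sketch in Python, assuming the quoted memory clocks are base clocks and doubling them for GDDR3's DDR signalling):

```python
# Back-of-the-envelope bandwidth check for the two rumoured configurations.
# Assumption: the quoted memory clocks are base clocks, doubled for DDR signalling.

def bandwidth_gb_s(bus_width_bits, mem_clock_mhz, data_rate=2):
    """Peak bandwidth in GB/s = bytes per transfer * effective transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = mem_clock_mhz * 1e6 * data_rate
    return bytes_per_transfer * transfers_per_sec / 1e9

print(f"Rumoured 8700GTS (256-bit @ 1000MHz): {bandwidth_gb_s(256, 1000):.1f} GB/s")
print(f"8800GTS          (320-bit @  800MHz): {bandwidth_gb_s(320, 800):.1f} GB/s")
```

If those clocks hold, both cards land at essentially the same peak bandwidth (~64 GB/s), so the real trade-off is memory amount versus price.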

It's rumored to go for an MSRP of $249, which is a great deal if you ask me. Single slot too! :D

However, considering that G82 and G80 have different ALU:TEX:ROP ratios, which would G92 follow?
 
So Nvidia's high mid-range will likely be:

8700GTS = 750MHz, 64 shaders, 512MB 256bit 900MHz GDDR3 memory

8800 GTS = 500MHz, 96 shaders, 320MB 320bit 800MHz GDDR3 memory


Which is better? Do you plump for memory amount or memory bandwidth?

I assume the 8700GTS will be cheaper; if so, do you think the 8800GTS will be phased out?

More memory, cheaper, similar perf, and lower power - it's a no-brainer for me.
 
I don't know, my speculation would have been 8700 GTX and GTS rather than GTS and GT ..

GTX is, in my opinion, exclusive to the ...800 series.

I think the 8700GTS will match or outdo the 8800GTS on some points, but not all, so the name is still appropriate. (Shader power is one of the points where I'd bet the 8700GTS comes out ahead of the G80 GTS.)
 
Hm... since it's all rumor at this point, I'm left wondering what exactly we will see *this year* from Nvidia:

New high end part? (Hope so, but seems 50/50)
New upper mid-range? (Seems likely)
New low end? (Seems likely)

I think Nvidia is probably capable of launching all three this year, but the question is: do they need to? The various 8800 SKUs are doing very well, and there's no guarantee that ATI will release a realistic upgrade to the 2900XT this year (1024MB 2900XTs don't count) - so why should Nvidia bother with a new high-end / top-end SKU?

The wide variety of rumors we're seeing regarding the G92 almost makes me think it's Nvidia leaking different info to different vendors to try and identify the leaks.
 
I think the 8700GTS will match or outdo the 8800GTS on some points, but not all, so the name is still appropriate. (Shader power is one of the points where I'd bet the 8700GTS comes out ahead of the G80 GTS.)

The game could be Crysis. No one will care if the 8700GTS is slower than the 8800GTS in most games as long as it's on par or slightly faster in Crysis (which comes out four days after the rumored G92 launch, with the hype already at 100%), with lower power consumption and a lower price.
I think we'll see another perfect PR strategy from NV in how games can sell new hardware that isn't really a successor to the 8800GTS.
 
There was a discussion about this in another thread, and the final conclusion was that it's illegal.

Well it would be, if the details are material to getting the partners to sign on for products.
Some of the discussed examples in the other thread were pretty big discrepancies, if Nvidia told partners different versions.

Fiddling a little with a few parameters, such as final clock speed projections that are always uncertain to begin with, would be less likely to raise objections.
 
NVIDIA G92 and G98 to debut in Nov with PCIe 2.0 support

In Nov, NVIDIA will announce two new series, G92 and G98; both are PCIe 2.0 compliant.
G92 is for the performance level, and its performance should be in between the 8800GTS and 8600GTS.
G98 is for the mainstream market and is slated to replace the 8400GS. Performance should be similar to an 8400GS.
Link
 

So, it looks like the article has been taken down. When you combine it with this, it looks like the recent trend is saying high-end again. VR-Zone is the holdout.

I'll also add that I would be shocked if Nvidia does not release some new high end card this November. They may be content to keep customers in the dark, but there is one group they won't lie to: Wall Street analysts. In Nvidia's last two earnings conference calls, they have reaffirmed that their new release schedule is high-end in Q4, followed by mid-range in the new year.

I couldn't tell you if it's G92, but there will almost certainly be a new high-end card. All other considerations aside, Crysis is too good a sales driver to pass up.
 
I agree with you.

Perhaps there is some confusion going around with the codenames. But in my eyes, it seems likely that nVidia will release a high-end solution in November. They did a great job at the same time last year, so why not do it once again with a shrunk G80? They can collect money from enthusiasts while putting pressure on AMD until their G100 is ready...

Let's see...
 
Die shrink of G80 would be my bet, too, especially given that alternating new technologies and die shrinks seems to be the current trend.

If not a straight die shrink, I'd expect the fused-off units from the GTS model to be physically removed, lowering die size further and hence further increasing yields, allowing the price to drop straight into the upper-mid-range level at around £120 (with less RAM if needed). The lower power consumption would then leave headroom to raise the operating frequencies.

The joy of such a route is that they could still fuse off parts of some chips to offer newer, lower-end options. Also, almost all of the rumours do state that the Ultra will remain at the top, yet nVidia would be foolish not to launch a high-end card in time for the holiday sales and the new round of game releases. Hence a shrink and slight reduction of G80 would meet both needs.
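To put a very rough number on the shrink idea: taking the ~480mm2 G80 die size mentioned later in the thread, assuming an ideal optical shrink from 90nm to 65nm (the nodes are my assumption), and plugging it into a simple Poisson yield model with a made-up defect density, you get something like this:

```python
import math

g80_area_mm2 = 480.0          # G80 die size figure quoted later in the thread
shrink = (65 / 90) ** 2       # ideal area scaling for an assumed 90nm -> 65nm shrink
g92_area_mm2 = g80_area_mm2 * shrink

def poisson_yield(area_mm2, defects_per_cm2=0.5):  # defect density is purely illustrative
    """Simple Poisson yield model: Y = exp(-D0 * A)."""
    return math.exp(-defects_per_cm2 * area_mm2 / 100.0)

print(f"G80:    ~{g80_area_mm2:.0f} mm^2, yield ~{poisson_yield(g80_area_mm2):.0%}")
print(f"Shrunk: ~{g92_area_mm2:.0f} mm^2, yield ~{poisson_yield(g92_area_mm2):.0%}")
```

Purely illustrative numbers (roughly 480mm2 down to ~250mm2), but they show why a smaller die makes the upper-mid-range price point so much easier to hit.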
 
So, it looks like the article has been taken down. When you combine it with this, it looks like the recent trend is saying high-end again. VR-Zone is the holdout.

I wouldn't call the INQ ("K10 scores 30000 in 3DMark06 with 2x HD2900XT") and Fruitzilla more reliable NV rumor sources than VR-Zone.

I'll also add that I would be shocked if Nvidia does not release some new high end card this November. They may be content to keep customers in the dark, but there is one group they won't lie to: Wall Street analysts. In Nvidia's last two earnings conference calls, they have reaffirmed that their new release schedule is high-end in Q4, followed by mid-range in the new year.

When a company has a big lead, it can choose from several paths. NV is now in a situation where it doesn't need to take any big risks; it can still earn a bunch of money without a new high-end GPU. The gap between the 8600GTS and the 8800GTS is as big as the Great Wall of China, so obviously they first need to place something there.
They can hold a "next-gen" high-end GPU in reserve until AMD releases something; don't forget AMD doesn't even have a card in the 8800Ultra segment.

Of course, there's also the possibility that NV has problems with their "next-gen" high-end GPU, and that's why it isn't coming this year.

One thing is a fact: AMD won't release any high-end GPU this year, so the 8800Ultra can stay at the top of the hill.
 
Rosaline: Roughly a G80 die shrink is my best guess right now too - with some minor architectural changes, but probably less than NV40->G71... As I think I said before, I'm definitely expecting the ALUs to be slightly modified to increase real-world utilization rates. Maybe same for the ROPs since you'd expect there to be fewer of them. Oh, and welcome to the forum Rosaline! :)

Now, one thing I'm very curious about is whether NV will want to use GDDR4 at all or not. If they don't, then 256-bit clearly restricts them to 8800GTS performance. If they're willing to go with 1.4GHz GDDR4, then they could match a 8800GTX in bandwidth and beat an Ultra in overall performance if they're clocked much higher...

I don't think the question is whether G92 is a mainstream chip or not. It obviously is: NVIDIA was ready to sell a 480mm2 (+50mm2 NVIO) chip for the ~$279 market segment. You'd be crazy to think they wouldn't be willing to sell a <=300mm2 chip with a cheaper PCB for <$249. So the real question in my mind is this: is it *also* a high-end chip?

My guess is that it is: it's very easy to see how it could beat an 8800Ultra on average if it was basically an 800MHz G80 (albeit with fewer ROPs) with 1.4GHz GDDR4 on a 256-bit memory bus. It would be bandwidth-limited much of the time, but that's not the point: if you can keep your PCB/memory costs constant and improve your performance, you can just increase your chip's price and make more money. Engineering and financial balances are not the same.

But if G92 didn't use GDDR4 at all, then we could only presume that it could not beat a 8800GTX in many cases. It's not impossible that NVIDIA is waiting for GDDR5... But this would be rather strange, because there were (afaict) very reliable rumours that G81 was a higher-clocked G80 on 80nm and with GDDR4 before it was canned. Unless G81 was canned precisely *because* NVIDIA decided to completely boycott GDDR4, why wouldn't they be willing to use GDDR4 now if they were (according to the rumour mill, at least) willing to back then?
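Putting rough numbers on that bandwidth argument (same sketch as earlier in the thread, base clock doubled for DDR; the GTX/Ultra memory clocks are quoted from memory, so treat them as approximate):

```python
def bandwidth_gb_s(bus_width_bits, mem_clock_mhz, data_rate=2):
    """Peak bandwidth in GB/s: (bus width in bytes) * (effective transfer rate)."""
    return (bus_width_bits / 8) * mem_clock_mhz * 1e6 * data_rate / 1e9

print(f"8800GTX   (384-bit @  900MHz GDDR3): {bandwidth_gb_s(384, 900):.1f} GB/s")
print(f"8800Ultra (384-bit @ 1080MHz GDDR3): {bandwidth_gb_s(384, 1080):.1f} GB/s")
print(f"G92?      (256-bit @ 1400MHz GDDR4): {bandwidth_gb_s(256, 1400):.1f} GB/s")
```

So 1.4GHz GDDR4 on a 256-bit bus would just edge past the GTX on paper (~89.6 vs ~86.4 GB/s) while still trailing the Ultra, which is why the GDDR4 question matters so much here.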
 