NVIDIA Kepler speculation thread

Looks like the GTX 650 Ti could really have used the full 192-bit bus. This SKU is the GK106 salvage part. If AMD can release a Pitcairn salvage SKU (the 7850) with the full 256-bit bus and a few fewer shaders, then NV really needed the 650 Ti to have the full bus and 1.5GB of memory. It might have competed better with the 7850 than it did...
There was a listing of three GK106 GeForce cards a month ago:

[image: listing of three GK106 GeForce SKUs]


So it seems that the 650 Ti is the GK106-200, and the GK106-250 is still to come.

How would a GK106 with 768 CCs at 980 MHz (maybe with Boost?) and a 192-bit bus with 5.4 Gbps GDDR5 do against the 7850?
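
Just to put rough numbers on that question, here's a back-of-the-envelope bandwidth comparison (a sketch only; the 192-bit/5.4 Gbps configuration for a GK106-250 is my assumption, not a confirmed spec):

```python
# Rough GDDR5 bandwidth: bus width (bits) / 8 bits-per-byte * effective
# data rate (Gbps per pin) = GB/s. The 192-bit entry is the hypothetical
# GK106-250 configuration, not a confirmed spec.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "GTX 650 Ti (128-bit @ 5.4 Gbps)":        (128, 5.4),
    "Hypothetical GK106-250 (192-bit @ 5.4)": (192, 5.4),
    "HD 7850 (256-bit @ 4.8 Gbps)":           (256, 4.8),
}

for name, (bits, gbps) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bits, gbps):.1f} GB/s")
# -> 86.4 GB/s, 129.6 GB/s, 153.6 GB/s respectively
```

So a full 192-bit bus at the same data rate would sit roughly 16% below Pitcairn's bandwidth instead of roughly 44% below it, which is why the question is interesting.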
 
How would a GK106 with 768 CCs at 980 MHz (maybe with Boost?) and a 192-bit bus with 5.4 Gbps GDDR5 do against the 7850?

It will fail miserably.

[chart: relative performance summary from the TechPowerUp review linked below]


http://www.techpowerup.com/reviews/MSI/GTX_650_Ti_Power_Edition/28.html

Given also that the cheapest 650Ti is this:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121669
http://www.newegg.com/Product/Produ...on=gtx 650 ti&bop=And&Order=PRICE&PageSize=20

And the cheapest Radeon HD 7850 is:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150617
http://www.newegg.com/Product/Produ...n=radeon 7850&bop=And&Order=PRICE&PageSize=20

$154.99 vs $159.99 after $20.00 rebate(s).

It might have competed better with the 7850 than it did...

No competition at all. ;)
 
So geforce.com says "Kepler Family Complete: Introducing The GeForce GTX 650 Ti."

Does that imply there won't be any more first-generation (GK10x) Keplers, at least non-OEM ones? If so then maybe that GK106-250 will be an OEM-only card.

I think they test different SKUs, clock speeds, maybe even different memory controllers (AMD does it too).. and this is where the spec list that was posted shows its limits.

To be honest, if Nvidia had had something more for the series ready by now, they would have released it as the 650 Ti and found another name for this 650 Ti ( )..
If it comes, it will not be before November now.. and it's starting to be really late for that. Maybe in the 750 lineup.

In addition, regarding the reviews, I see many that don't even mention the price announced by AIBs for the 7850 1GB: $159...
 
I always take this resolution as a showcase of the card's potential, i.e. for tomorrow's more demanding games, etc. Maybe I'm mistaken.

A card with a 128-bit memory bus has no place at that resolution. Also, how is that chart proof that a card with the GK106 die and 768 cores on a 192-bit bus wouldn't be competitive, since the 660 is faster than the 6970 on that chart? The price comparisons are somewhat "cute" too, since the 650 Ti launched today. Give it some time to find its place; you typically don't get discounts on day 1...
 
LittleJ said:
Looks like the GTX 650 Ti could really have used the full 192-bit bus. This SKU is the GK106 salvage part. If AMD can release a Pitcairn salvage SKU (the 7850) with the full 256-bit bus and a few fewer shaders, then NV really needed the 650 Ti to have the full bus and 1.5GB of memory. It might have competed better with the 7850 than it did...
Yes, the 128bit bus is just odd... I guess they felt it would be too close to the 660 otherwise, but if that was the case, I think they would have been better off just using lower clocks on the 650 Ti. As for competing with the 7850, that isn't really its job. Nvidia's 4th tier card is the 660. The pricing reflects Nvidia's belief they can charge a premium for their name. For the high end, that may be true, but I doubt they will be so lucky in the value segment.
 
This has to be purely a yield decision. They've gone with the best they can get as often as they can. I do agree with ninelven that Nvidia believes they can get by on their superior brand, however they cannot continue to have this kind of defeat and remain unscathed. The 650Ti has been released to a resounding "who gives a shit".
 
Sadly the card falls just where it should not be: between the 7770 and the 7850, and at a bad price.
 
I always take this resolution as a showcase of the card's potential, i.e. for tomorrow's more demanding games, etc. Maybe I'm mistaken.

If you look at a high-end card, looking at 2560x1600 is of course a good idea.

But little cards like this GTX 650 Ti or the HD 7770 already start to struggle with max quality + AA at 1920x1080 in many recent games.

I'll just copy Guru3d's conclusion (the parts linked to resolution)..

.... if you play your games up-to a monitor resolution of say 1600x1200 then the GeForce GTX 650 Ti is going to work out really well for you......

......... That means that you as a consumer immediately will have to forfeit on image quality settings to make the modern games perform well enough at a monitor resolution of 1920x1080. The card will run with good image quality settings up-to 1600x1200, after that resolution you'll quickly find yourself making compromises on image quality in order to gain in rendering performance...........
 
Performance projections similar to those for Sea Islands, a 15% increase.
March 2013 release "best case," but April/May is more likely.

Terrible (or good for AMD :D ). When there is no obvious competitive advantage for either of the two parties, they rely in most cases on their fan bases: if you like AMD, go for Radeon; if you like Nvidia, go for GeForce.

So boring. :devilish: I 'demand' someone to work harder and release something revolutionary. :mrgreen:
 
If you look at a high-end card, looking at 2560x1600 is of course a good idea.

But little cards like this GTX 650 Ti or the HD 7770 already start to struggle with max quality + AA at 1920x1080 in many recent games.

I'll just copy Guru3d's conclusion (the parts linked to resolution)..
I am wondering, though, why on earth you would use a monitor with less than full HD resolution? You can barely buy them anymore.
I dunno if that Guru3d conclusion makes sense. 1920x1080 vs. 1600x1200 is a tiny difference; in fact it's a smaller difference than 1920x1200 vs. 1920x1080 would be (8% vs. 11%). So chances are that if it runs OK at 1600x1200 it will run fine at 1920x1080 too. Plus, 1600x1200 monitors have died out anyway (if you compared with 1680x1050 instead, the difference would be 17%, which would be more likely to be really visible).
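To make that arithmetic explicit, a quick sketch (nothing assumed beyond the pixel counts themselves):

```python
# Pixel counts behind the percentages above.
res = {
    "1600x1200": 1600 * 1200,  # 1,920,000
    "1680x1050": 1680 * 1050,  # 1,764,000
    "1920x1080": 1920 * 1080,  # 2,073,600
    "1920x1200": 1920 * 1200,  # 2,304,000
}

def step(lo, hi):
    """Relative increase in pixel count going from resolution lo to hi."""
    return res[hi] / res[lo] - 1

print(f"1600x1200 -> 1920x1080: +{step('1600x1200', '1920x1080'):.0%}")  # +8%
print(f"1920x1080 -> 1920x1200: +{step('1920x1080', '1920x1200'):.0%}")  # +11%
print(f"1680x1050 -> 1920x1080: +{step('1680x1050', '1920x1080'):.0%}")  # +18% (the ~17% above, rounding)
```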
OTOH, using resolutions higher than full HD as a showcase for how future games will run at lower resolutions is probably not going to give accurate results. With future titles you'd expect more triangles and more complex shaders, but probably not really more pixels drawn, so the scaling wouldn't be the same as with increasing resolution.
 
I am wondering, though, why on earth you would use a monitor with less than full HD resolution? You can barely buy them anymore.
If I look at my family (parents, siblings, uncles, aunts, cousins etc.) almost nobody has a 1080p class monitor. (It's the opposite, of course, for my friends, pretty much all of them in the tech field.) Maybe they're not for sale anymore, but that's irrelevant, because they don't upgrade the monitor when they upgrade their PC. Why replace if it still works?
 
From an ergonomics point of view it's very hard to stay with a CRT or a small LCD when for 100-150 you can buy a 22-24 inch Full HD beauty.
My mom is totally happy with her 19" 1280x1024. Email, browser and Word work fine. She doesn't do a lot of 3D modeling or Visual Studio.

A 1080p screen would be less ergonomic, I think.
 