NVIDIA Kepler speculation thread

The act of making someone do, or accepting to do, something ethically wrong in exchange for money or other financially valuable favors.

"Ethically wrong" is context-dependent, so I'm not sure whose ethics we're talking about. Is it ethically wrong if you genuinely believe something and take it as true? Or is it just misguidedness (for some definition of it) that is being preyed upon?

You'd have serious trouble proving anything resembling part II, mainly because it's highly unlikely to exist in the way that it is portrayed on forums. Conspiracy theories are neat and all, but ask yourself how frequently you end up being nice to someone who's nice to you - nice as in saying "Hello buddy, you all right?" as opposed to him buying you a car. Kind, well-meaning and incredibly friendly are the staples of marketing departments... you know, the sort of best buddy that just has a helpful FYI ready at the right time? So if your buddy is kind enough to drop a hint, and that hint neatly aligns with your world-view, how much of a bastard would you be to ignore it?

To make a long story short, and to end my OT: I think everybody's barking up the wrong tree when they get upset with IHV marketing/sales/PR/whatever departments (this is apparently a heterogeneous area in and of itself) attempting to grease all grooves. It's their job, and if they're good at it they'll use you up whilst putting a smile on your face for being used. The same goes for crying about IHVs "bribing" (untrue) ISVs to use proprietary tech. People should be looking at the press and the ISVs; both actually have a choice about how to do their jobs while IHV marketing/sales/whatever people do theirs.

Of course this is all a matter of opinion (in this case my humble opinion), and it is quite unrelated to Kepler (it'd be quite nice if people stopped calling it Keplar by the way), so if there is merit in pursuing this further perhaps we should move to a separate thread - everybody in favour nudge me so that we can clear the speculation thread.
 
I think these discussions about reviews will not change anything; it's a bit of a waste of time, as it won't change their approach. And after having spoken with some of the people who have written them over the last few years, the problem will always be the same: there will always be a complaint about something.

Personally I read as many reviews as possible, try to compare the numbers on specific games between them, across resolutions and settings, and try to get a figure out of all that... the ratios at the end are often purely indicative and there are so many things to take into account. (It doesn't matter to me that card X is 25% faster in a 4-year-old game that runs at 100+ fps average on a midrange card with everything maxed, but I might start looking at the kind of graphics or the games I actually play.)

I have to say, these last 3 years it has been really complicated to get a simple figure. Some sites have completely different results on the same games for different reasons (the location used in C2 (Times Square?), or BF3 (single-player mission, but different places?)).

This is even harder when cards end up so close to each other, and turbo certainly doesn't help. (It's good when the reviewer tells you what clocks the card actually runs at while benchmarking; some reviews have two cards with a 30-40 MHz difference between them, and that's for stock cards. It's even worse with OC cards, where a consistent maximum speed isn't guaranteed; it depends on whether you got a good or a bad chip, even if the box says 1200 MHz turbo.)

Looking at the 660 Ti reviews, there again, it seems some sites have just flashed their non-reference MSI or Gigabyte cards with the stock BIOS that was given to them (hence no photos of the stock 660 Ti or a real article about it, only coverage of AIB cards, which isn't a problem since stock reference cards are surely not obtainable anyway).

Hardware Canucks did a good job trying to show the variation in clock speed in 2 titles. But sadly they don't stress the real point enough: this is not down to Asus, MSI or Gigabyte, but to the quality of the individual chip, and there again you'll get minuscule differences between samples of the same model. (Is it important for everyday users? No, they won't notice the 2-4 fps difference. Is it important for a review? Maybe a bit more, when cards are so close.)
http://www.hardwarecanucks.com/foru...roundup-asus-evga-gigabyte-galaxy-msi-21.html
 
So I stumbled upon this TPU forum post which linked to a PCHome page (translated) containing some purported specs for some of the GTX 700 series.

GTX 770: GK110, 950 MHz core, 980 MHz boost, 2 GB 6.25 GHz GDDR5, 256-bit bus, 32 ROPs.

GTX 780: GK110, 3 GB 5.2 GHz GDDR5, 384-bit bus, 32 ROPs.

GTX 790: (Apparently 2x) GK110, 6 GB 4.06 GHz GDDR5, 2x 384-bit bus, 2x 32 ROPs.

Really weird specs and I'm not really believing them (only posting them for interest/discussion/speculation). Fairly recent though—the poster says they were updated 12 days ago.

It's a nice troll; the specs are made up to make the cards look crippled. Really, a 256-bit bus on a GK110 card is insane, not using the fastest memory clock on the GTX 780 is too, and the dual card, which we have no reason to expect yet, is a piece of crap.
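
For what it's worth, here's a quick bandwidth check on those rumored configurations (my own arithmetic, using the figures quoted above plus the known GTX 680 spec for comparison, nothing leaked):

```python
# Peak GDDR5 bandwidth = effective data rate (Gb/s per pin) * bus width / 8.
# The 770/780 numbers are the rumored specs above; the GTX 680 line is the
# shipping spec, included only to show how little headroom the rumor leaves.

def bandwidth_gb_s(data_rate_gbps: float, bus_bits: int) -> float:
    return data_rate_gbps * bus_bits / 8

print("Rumored GTX 770 :", bandwidth_gb_s(6.25, 256), "GB/s")  # 200 GB/s
print("Rumored GTX 780 :", bandwidth_gb_s(5.2, 384), "GB/s")   # ~250 GB/s
print("Shipping GTX 680:", bandwidth_gb_s(6.0, 256), "GB/s")   # 192 GB/s
```

Roughly 200 and 250 GB/s respectively, which is very little extra bandwidth over a GTX 680 to feed a chip as wide as GK110.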
 
All you have to read are hardware.fr reviews. It's too bad that the English version is lagging behind (behardware.com).
They have nice tests and, in my opinion, consistency; they have some of the best thermal dissipation/power consumption measurements around.
They complied with Nvidia's request (to include the 9800GT) but made it clear that they had, and they included many cards of that generation in the review.
Good job, Tridam and your team. I will be back in France sometime in 2013; it might prove tough to find a job quickly, so I may have some time to give. I'll contact them and offer to translate some of their articles/news for free.
 
This card was in a "driver stuff" screenshot in this thread, but I thought it was a Quadro.
It would have 3 GPCs with one SMX per GPC, maybe. So, strong on geometry but with toned-down power draw and pixel throughput (and memory: what's 1 GB these days?)
 
From OBR-Hardware: "EXCLUSIVE! - SOME GEFORCE GTX 780 DETAILS ...."

OBR said:
The successor to the GeForce GTX 670/680 will NOT be based on the GK110 core. These cards will be based on a completely different, new core... more details later.

Videocardz adds some detail, claiming that the GTX 780 will have a CC count in the range of 2048-2304.

————————

Well, 2048 isn't divisible by 192; the closest multiples of 192 are 1920 and 2112. So 2048 wouldn't be possible unless it uses gcd(2048, 2304) = 256-CC SMXes (or a fraction thereof), or unless parts of an SMX can be disabled, but both seem quite unlikely.

Ever since the (true) rumors of 5 SMXes on GK106 instead of 4, I've been wondering whether that "anomaly" might mean the GK104 successor would have more than 8 SMXes. 10 SMXes would be double the GK106, and 12 SMXes would make the ("GK114" SMX count):(GK106 SMX count) ratio (2.4:1) very close to the (GK106 SMX count):(GK107 SMX count) ratio (2.5:1).
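
Spelling that out as a quick sketch (plain arithmetic on the public SMX sizes and counts, no insider information):

```python
# Check which CC counts in the rumored 2048-2304 range fit Kepler's
# 192-CC SMX granularity, then compare the SMX-count ratios above.

SMX_CC = 192
valid = [n for n in range(2048, 2304 + 1) if n % SMX_CC == 0]
print(valid)                  # [2112, 2304] -> 11 or 12 SMXes
print(2048 % SMX_CC)          # 128, so 2048 doesn't map onto whole SMXes

gk107_smx, gk106_smx = 2, 5   # shipping Kepler parts
print(gk106_smx / gk107_smx)  # 2.5  -> GK106 : GK107
print(12 / gk106_smx)         # 2.4  -> hypothetical 12-SMX "GK114" : GK106
print(10 / gk106_smx)         # 2.0  -> hypothetical 10-SMX part (double GK106)
```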
 
It's not ludicrous at all. GK110 wasn't designed with games as a primary concern, and from the beginning, even NVIDIA wasn't sure it would end up in gaming cards.

Which is not to say that OBR is not making random crap up, but it's possible.
 
A GK104 refresh is no reason to expect GK110 will not be sold to consumers.

Each GK104 SMX takes ~16.5 mm^2. One additional SMX per GPC (four GPCs, so four extra SMXes) would add ~66 mm^2, bringing the die size up to ~360 mm^2. In that case they might jump up to a 320-bit memory interface, but I wouldn't bet on it. Either way, such a GK114 shouldn't have any trouble keeping up with a 2560sp/160tmu HD89XX.

They could also just add 1 GPC, at ~340 mm^2. Such a part would probably end up faring like the GTX 680 does against the 7970 GHz Edition.
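
A rough sketch of those two options, assuming the ~294 mm^2 GK104 die as the baseline and the ~16.5 mm^2 per-SMX estimate above; the GPC overhead figure is just a guess to land near ~340 mm^2:

```python
# Back-of-the-envelope die-size estimates for a hypothetical "GK114".
# GK104 baseline: 294 mm^2, 4 GPCs, 8 SMXes; ~16.5 mm^2 per SMX as above.

GK104_DIE_MM2 = 294.0
SMX_AREA_MM2 = 16.5
GK104_GPCS, GK104_SMX = 4, 8

# Option 1: one extra SMX per GPC (4 GPCs -> +4 SMXes)
opt1 = GK104_DIE_MM2 + GK104_GPCS * SMX_AREA_MM2
print(f"+1 SMX per GPC: {GK104_SMX + GK104_GPCS} SMXes, ~{opt1:.0f} mm^2")  # ~360

# Option 2: one extra GPC (2 SMXes plus some raster/uncore overhead)
GPC_OVERHEAD_MM2 = 13.0  # rough guess, not a measured figure
opt2 = GK104_DIE_MM2 + 2 * SMX_AREA_MM2 + GPC_OVERHEAD_MM2
print(f"+1 GPC:         {GK104_SMX + 2} SMXes, ~{opt2:.0f} mm^2")           # ~340
```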

In either case, neither would come very close to the performance of GK110. And if GK110 is an HPC-only chip, then why does it have 6 GPCs and 240 TMUs?

EDIT:
Alexko said:
GK110 wasn't designed with games as a primary concern, and from the beginning, even NVIDIA wasn't sure it would end up in gaming cards.
Source?
 
It's not ludicrous at all. GK110 wasn't designed with games as a primary concern, and from the beginning, even NVIDIA wasn't sure it would end up in gaming cards.
GK110 is certainly no stranger to games. Its extensive virtualization additions are perfectly suited for remote/cloud game rendering.
 