NVIDIA Kepler speculation thread

OK, why are we doing this? I believe the reason we're even discussing these weird rumors is the sudden shift in Charlie's mood, from flaming NVIDIA and calling their GPUs a massive failure to this cautious praise and neutrality!

I think the mistake here was trusting that guy's opinion in the first place, his so-called "facts" and "technical explanations", when obviously everything he says is either a gross simplification (to the point of betraying the subject) or utter and complete crap.

We should learn to pick our sources of news and technical opinion. There's a reason this guy is unique in his approach: nobody else is foolish enough to tackle such complicated matters with the kind of baseless, bogus confidence that just shows he hasn't the slightest clue.

Fudo and Theo pale in comparison to this guy, and they both know they get things wrong a lot and that their credibility is in the toilet. But this guy is something else; it just shows how far an ignorant person can go when people are given the chance to listen to and believe his mumblings.

Let's just face it: the only reason people even follow what he says is that some of them "want" to believe him. Whether what he says is credible or not is a different matter entirely.
 
Let's assume it performs like Tahiti. What should they price it at to leave room for GK110?

The GTX 280 was 649 USD at launch for a reference card. There was also the literal 7800 GTX Ultra press edition that had a limited run at 999 USD retail. Those were quite literally review samples, produced to spoil the Radeon [edit: wrong card] X1900 XTX launch (hopefully my memory isn't failing me this time on which card it was launched against) and then never made again, but sold at retail until stock ran out to at least pretend there was an actual product to back up the reviews.

So 650-750 USD for GK110 wouldn't make me blink if Nvidia decided to go balls-out again, and 500-550 would seem like a reasonable bet for a GK104 that performed similarly to Tahiti.

Of course, I highly doubt GK104 is going to be similar in performance to Tahiti.

The source of those rumors is probably the same as the source of the rumors that Kepler features dedicated PhysX acceleration hardware, or of the supposedly Nvidia-sourced slides shown to Charlie that put GK104 performance far in excess of Tahiti.

Silly season for Kepler might actually far exceed the silly season leading up to R600.

Regards,
SB
 
Which raises the question: if they had anything good and legit to show, why aren't they showing it? Worried about Osborning 580 sales? :p
 
Of course, I highly doubt GK104 is going to be similar in performance to Tahiti.

Yeah, I think common sense will prevail on this one. But even if it does perform well, I can't see them charging more than $400 if it's near Tahiti; otherwise it would be a joke compared to (fixed) Fermi.

Silly season for Kepler might actually far exceed the silly season leading up to R600.

And it could keep going for another 5 months :)
 
Since I believe in neither pixie dust nor magic wands, for GK104 to come anywhere close to Tahiti in performance the two should have quite comparable specifications, especially since I'm personally still convinced that the hot clock is gone for Kepler.

Yes, the spec list for GK104 obviously is out there, as OBR claims; it just hasn't been published yet :LOL:
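
To put some rough numbers on the hotclock point: here's a minimal back-of-the-envelope sketch of the peak FP32 arithmetic. The GF114 and Tahiti figures are the published reference specs; the hotclock-less GK104 line is purely an illustrative guess on my part, not a leaked spec.

Code:
# Peak FP32 throughput: 2 ops per ALU per clock (one FMA).
def peak_fp32_gflops(alus, shader_clock_mhz):
    return 2 * alus * shader_clock_mhz / 1000.0

# GF114 (GTX 560 Ti): 384 ALUs on a ~1645 MHz hot clock.
print(peak_fp32_gflops(384, 1645))    # ~1263 GFLOPS

# Illustrative hotclock-less GK104: even doubling the ALUs at a ~1 GHz
# base clock only modestly beats GF114's peak rate...
print(peak_fp32_gflops(768, 1000))    # ~1536 GFLOPS

# ...while Tahiti (2048 ALUs @ 925 MHz) sits well above both.
print(peak_fp32_gflops(2048, 925))    # ~3789 GFLOPS

In other words, without a hot clock GK104 would need either a lot more ALUs or a much higher base clock to get its peak rate anywhere near Tahiti's, which is why I'd expect comparable specifications if the performance really is comparable.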
 
Hmm... so Kepler might process software-based PhysX on the hardware... I wonder if that's all it can do with software-based processing... OBR said something about Kepler being more CPU-independent in a previous post... Could PhysX be used just as one example of Kepler's ability to handle common CPU tasks?

Could a GPU with enough processing power and capability to handle traditional gaming CPU tasks avoid CPU bottlenecks, and thus achieve higher performance because of it? Could that be how Kepler beats Tahiti handily? I would say no, because from the benchies it doesn't look like Tahiti is CPU-bottlenecked, even at Eyefinity resolutions.

On that front, what would the impact on the x86 patents be, if any, if Kepler started processing software normally handled by an x86 CPU?

Just pulling ideas out of my back... :p

What do you think?
 
Charlie's latest piece doesn't make much sense to me.
If Nvidia really increased peak flops to ~3 TFLOPS (more than double a 560 Ti, nearly twice a 580), I don't think there should be any performance cliff for "unoptimized" games anywhere, as this is probably the area where their chips were most deficient (aside from limited color fillrate due to shader export issues, but clearly that can't have been much of a performance killer in the real world). Unless they got rid of texture units or something.
Also, the 1/2 DP:SP rate looks unlikely to me for the non-compute chips.
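
Quick sanity check on those multiples, using the reference Fermi shader clocks (just the standard peak FP32 arithmetic, nothing from Charlie's piece):

Code:
# Peak FP32 = 2 ops/ALU/clock at the reference shader clocks.
gtx560ti = 2 * 384 * 1645e6 / 1e12   # ~1.26 TFLOPS
gtx580   = 2 * 512 * 1544e6 / 1e12   # ~1.58 TFLOPS

print(3.0 / gtx560ti)   # ~2.4x the 560 Ti
print(3.0 / gtx580)     # ~1.9x the 580

So ~3 TFLOPS does line up with "more than double a 560 Ti, nearly twice a 580".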
 