NVIDIA Kepler speculation thread

You mean that it is not reliable?

They have similar access to industry sources (insiders at AIBs and potentially lower-level employees at AMD and Nvidia) as other tech outlets do. Take that as you will.

But consider that much of the disinformation has been spread by Nvidia themselves. So it really doesn't matter how good your sources are.

Regards,
SB
 

ok, thanx

just wondering, because I thought Digitimes would be well connected, being a Taiwanese online newspaper
 
Digitimes has a pretty poor track record, and it has gotten progressively worse over the years.

On topic, I listened to the Nvidia Q&A session after they announced their quarterly financials, and (I'm paraphrasing) Jen Hsun said that Kepler was probably "the best GPU they've ever made". I know how hyperbolic PR talk is and the spin that comes with it, but I don't think Jen Hsun has ever gone on record saying something like this about their previous architectures? Please correct me if I'm wrong. :)
 
Silent_Buddha said:
But consider that much of the disinformation has been spread by Nvidia themselves. So it really doesn't matter how good your sources are.
This has been claimed multiple times on various occasions (especially by you-know-who), but I always wonder whether it isn't just a knee-jerk claim with no solid foundation.

There will always be websites that need eyeballs, so there will always be people willing to make things up. I don't see why it would even be necessary to actively spread misinformation.
 
I listened to the Nvidia Q&A session after they announced their quarterly financials, and (I'm paraphrasing) Jen Hsun said that Kepler was probably "the best GPU they've ever made". I know how hyperbolic PR talk is and the spin that comes with it, but I don't think Jen Hsun has ever gone on record saying something like this about their previous architectures?

That's a pretty generic and easy-to-satisfy claim. If we assume that Nvidia generally improves at least a little with each new generation, it would be true in any straightforward analysis. With no metrics stated, it's even easier to say that by some set of arbitrary measures it's better than anything they've done before.

There's a node transition and years of development work behind it, so it doesn't seem unreasonable to say it's better than the previous generation, which was in turn better than the one before that.
 
Indeed - and seriously, has any company ever claimed that their new GPU/CPU/whatever product is NOT the best they've ever made? :D
 
I know it's a generic and easy-to-satisfy claim and that it's to be expected, which is why I added the disclaimer about the PR spin...

What I was actually asking is whether Jen Hsun has ever said the same about any of their previous architectures.
 

Probably. Not sure about JHH, but here's Drew Henry:

"You guys have been amazingly patient," Drew Henry, general manager of Nvidia's GeForce business, told an audience full of hardcore gamers and PC enthusiasts. "We chose to do something new and different and exciting because we wanted you guys to have a great experience with a lot of the next generation games that people are building."

Henry quickly introduced the GTX 480 graphics card to much applause. "This is without a doubt the best GPU we've ever built," Henry said. "It's got a crapload of performance. This has got to be one of the coolest GPUs we've ever done or the industry has ever had."
[emphasis mine]

http://www.crn.com/news/components-...;jsessionid=Wzn71EbcskB9OOC6JbyP2g**.ecappj02

This is a very empty statement; it just means Kepler > Fermi > anything NVIDIA has made before, which is pretty much what you'd expect in the semiconductor industry.
 
I'm not sure about the Ivy Bridge hype. Sure, it will spark a notebook refresh cycle, but is it really that big a deal for discrete? It should be even less relevant on the desktop side of things, as Sandy is plenty fast and plenty cheap already.
 
You wouldn't really expect anyone to say "we've got this new chip, but really, last gen was better"?
Even NV30 was the best chip Nvidia had ever built (when it was released) - of course, this depends a bit on your definition of "best", since if you include things like efficiency it may no longer be true.
 
Was this posted already?
http://www.xbitlabs.com/news/graphi...er_Than_Expected_Chief_Executive_Officer.html

Unfortunately for Nvidia, yields of Kepler are lower than the company originally anticipated, and therefore its costs are high. Nvidia's chief exec remains optimistic and claims that the situation with the Fermi ramp-up was even worse than this.

“We use wafer-based pricing now, when the yield is lower, our cost is higher. We have transitioned to a wafer-based pricing for some time and our expectation, of course, is that the yields will improve as they have in the previous generation nodes, and as the yields improve, our output would increase and our costs will decline,” stated the head of Nvidia.
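
For the curious, here's the arithmetic behind that statement. Under wafer-based pricing the foundry charges a fixed price per wafer, so every defective die comes straight out of Nvidia's margin. A minimal sketch, with every number invented purely for illustration (these are not real TSMC or Nvidia figures):

```python
# Wafer-based pricing: the customer pays per wafer, not per good die,
# so the cost per sellable chip scales inversely with yield.
# All numbers below are hypothetical, purely for illustration.

wafer_price = 5000.0  # USD per 300 mm wafer (made up)
gross_dies = 200      # die candidates per wafer (made up)

for yield_rate in (0.3, 0.5, 0.8):
    good_dies = gross_dies * yield_rate
    print(f"yield {yield_rate:.0%}: ${wafer_price / good_dies:,.2f} per good die")
```

Run it and the 30% case costs nearly three times as much per good die as the 80% case, which is why JHH keeps stressing that costs fall as yields improve.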
 
Didn't TSMC claim that defect density at this stage of the 28nm ramp is better than 40nm was at the corresponding stage? Or is "better than 40nm", which from what I understand was initially not good, still below what external parties were given to expect?

If there's an SI guy around with the inclination to explain, I'd be interested to hear how a company typically goes about estimating yield for a chip (if it can be presented at a layman's level without becoming disconnected from reality), and whether those estimates are usually accurate.
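
My layman's understanding of the textbook first-order answer: assume defects land randomly across the wafer at some density D0 (estimated from test structures and earlier products), so a die of area A escapes all of them with probability roughly exp(-A * D0). Real foundry models (Murphy, negative binomial) add clustering corrections, but the simple Poisson version already shows why big GPUs hurt. A sketch, with the defect density and die areas entirely made up for illustration:

```python
import math

# First-order Poisson yield model: with defects scattered randomly at
# density d0 (defects per cm^2), a die of area a (cm^2) is defect-free
# with probability exp(-a * d0). All numbers below are hypothetical.

def poisson_yield(die_area_cm2: float, d0: float) -> float:
    """Expected fraction of defect-free dies."""
    return math.exp(-die_area_cm2 * d0)

d0 = 0.5  # made-up defect density for an immature process
for area in (1.0, 2.5, 5.0):  # small chip, mid-range GPU, big GPU (roughly)
    print(f"{area:.1f} cm^2 die: ~{poisson_yield(area, d0):.0%} expected yield")
```

The estimate is only as good as the D0 forecast, which is presumably where "lower than anticipated" comes from: if the process ramps slower than the foundry's roadmap promised, the big-die products miss their yield targets first.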
 
Change that to "external party"; it's not the first time nVidia is the only one having so many problems. It happened with 40nm too: once TSMC fixed the initial problems, everyone else was happy, but nVidia continued to get bad yields.

Also, IIRC, AMD said ages ago that their 28nm yields were better than expected - or was it even better than 40nm [around the launch of the 40nm chips, obviously]?
 