It's Digitimes…
You mean that it is not reliable?
They have similar access to industry sources (insiders at AIBs and potentially lower-level employees at AMD and Nvidia) as other tech outlets. Take that as you will.
But consider that much of the disinformation has been spread by Nvidia themselves. So it really doesn't matter how good your sources are.
Regards,
SB
Silent_Buddha said: But consider that much of the disinformation has been spread by Nvidia themselves. So it really doesn't matter how good your sources are.

This has been claimed multiple times on various occasions (especially by you-know-who), but I always wonder if it is not just a knee-jerk claim with no solid foundation.
Digitimes has a pretty poor track record, and it has gotten progressively worse over the years.
On topic, I listened to the Nvidia Q&A session after they announced their quarterly financials and, paraphrasing, Jen Hsun said that Kepler was probably "the best GPU they've ever made". I know how hyperbolic PR talk is and the spin that comes with it, but I don't think Jen Hsun has ever gone on record saying something like this about their previous architectures? Please correct me if I'm wrong.
I know it's a generic, easy-to-satisfy claim and that it is to be expected, which is why I added the disclaimer about the PR spin.
What I was actually asking is whether Jen Hsun has ever said the same about any of their previous architectures.
[emphasis mine]

"You guys have been amazingly patient," Drew Henry, general manager of Nvidia's GeForce business, told an audience full of hardcore gamers and PC enthusiasts. "We chose to do something new and different and exciting because we wanted you guys to have a great experience with a lot of the next generation games that people are building."
Henry quickly introduced the GTX 480 graphics card to much applause. "This is without a doubt the best GPU we've ever built," Henry said. "It's got a crapload of performance. This has got to be one of the coolest GPUs we've ever done or the industry has ever had."
Not sure if it was Jen-Hsun Huang, but somebody from Nvidia said this about G80, too.
Every GPU that lacks competition is great.
Unfortunately for Nvidia, yields of Kepler are lower than the company originally anticipated, and therefore its costs are high. Nvidia's chief exec remains optimistic and claims that the situation with the Fermi ramp-up was even worse than this.
“We use wafer-based pricing now, when the yield is lower, our cost is higher. We have transitioned to a wafer-based pricing for some time and our expectation, of course, is that the yields will improve as they have in the previous generation nodes, and as the yields improve, our output would increase and our costs will decline,” stated the head of Nvidia.
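To make the quote above concrete: under wafer-based pricing the foundry is paid per wafer regardless of how many dies actually work, so the cost of each sellable chip moves inversely with yield. Below is a minimal sketch of that arithmetic; the wafer price, die count, and yield figures are purely illustrative assumptions, not Nvidia's or TSMC's actual numbers.

```python
# Back-of-the-envelope cost per good die under wafer-based pricing.
# All numbers below are illustrative assumptions, not real foundry figures.
wafer_price_usd = 5000.0       # assumed price of one 28nm wafer
gross_dies_per_wafer = 180     # assumed candidate dies for a large GPU on a 300mm wafer

def cost_per_good_die(yield_fraction: float) -> float:
    """Wafer cost divided by the number of functional dies it produces."""
    good_dies = gross_dies_per_wafer * yield_fraction
    return wafer_price_usd / good_dies

for y in (0.40, 0.55, 0.80):
    print(f"yield {y:.0%}: ${cost_per_good_die(y):,.2f} per good die")
```

The point of the quote falls straight out of this: with the wafer price fixed, every percentage point of yield that doesn't materialize is paid for by the customer, which is why improving yields translates directly into declining per-chip costs.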
"change that to "external party", it's not the first time nVidia is the only one having so much problems, it happened with 40nm too - once TSMC fixed the initial problems, everyone else were happy, but nVidia continued to get bad yields."

Didn't TSMC claim that defect density at this stage of the 28nm ramp is better than 40nm was at the corresponding stage? Or is "better than 40nm", which from what I understand was initially not good, still below what external parties were given to expect?
If there's an SI guy around with the inclination to explain, I'd be interested to hear how a company typically goes about estimating yield for a chip (if it can be presented at a layman's level without becoming disconnected from reality), and whether those estimates are usually accurate.
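Not a substitute for an actual silicon person answering, but for a rough sense of how such estimates are usually framed: the textbook starting point is a defect-density model, which maps the foundry's quoted defect density (defects per cm²) and the die area to an expected functional-die yield, using a Poisson form for random defects or a negative-binomial form when defects cluster. The sketch below uses purely hypothetical numbers for a large 28nm-class GPU.

```python
import math

# First-order yield estimate from defect density.
# Die area and defect density are hypothetical, for illustration only.
die_area_cm2 = 5.5        # ~550 mm^2, roughly "big GPU" territory
defect_density = 0.25     # defects per cm^2 (assumed, maturing process)

# Poisson model: assumes defects land independently and at random.
poisson_yield = math.exp(-die_area_cm2 * defect_density)

# Negative-binomial model: allows defects to cluster (alpha = clustering factor).
alpha = 2.0
neg_binomial_yield = (1 + die_area_cm2 * defect_density / alpha) ** (-alpha)

print(f"Poisson estimate:           {poisson_yield:.1%}")
print(f"Negative-binomial estimate: {neg_binomial_yield:.1%}")
```

Real planning also has to fold in parametric yield (dies that work but miss clock or power targets) and the ability to harvest partially defective dies into cut-down SKUs, which is part of why a public defect-density claim alone doesn't tell you how many sellable chips a particular design actually gets.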