NVIDIA shows signs ... [2008 - 2017]

huh?
I still remember, years ago, when Rambus was condemned for filing patents while taking part in the DDR discussions at JEDEC.
Being legally right is not the same as being morally right.

Um, read through the full history here. NO ONE involved was morally right: not rambus, not the memory vendors, and not even JEDEC!

JEDEC was all over the place with their patent policy and never enforced anything. If anything, this whole episode forced open-standards groups to actually have a well-defined, articulated, documented, and legally binding policy on patent-related matters.

The DRAM companies were effectively attempting to steal IP via proxy of JEDEC. This has been well documented and confirmed in court records.
Rambus was obviously underhanded in many ways as well.

Trying to pick which side had the moral high ground is like trying to pick which of the starving rabid dogs is least likely to bite you!
 
35% of Nvidia’s Stock from High-End Professional Graphics Cards

Strange headline, but anyway:

http://www.trefis.com/articles/1114...gh-end-professional-graphics-cards/2010-02-10

Nvidia’s main customers for discrete GPUs and integrated GPUs are PC makers like Dell, Toshiba and HP that incorporate them as components within new PCs sold. In comparison to the professional graphics card business which, we estimate constitutes 35% of Nvidia’s stock, the discrete and integrated GPU businesses constitute only about 18% and 12% of Nvidia’s stock respectively.

We estimate that Nvidia’s profit margin for professional graphics cards was close to 51% in 2009 compared to 19% for discrete graphics.

We expect Nvidia’s professional graphics card margins to increase over the forecast driven by

(i) sales of newer products like Tesla which deliver superior performance over earlier products

(ii) lower unit costs that Nvidia expects to achieve by introducing products on the more advanced and efficient 40 nm manufacturing process
 
Weird; the only way I can see their professional market selling twice as many cards as their consumer market is if their consumer market completely tanked.

I'm not putting much stock in what they say, as they give no reference other than that their analysts have deduced this from past Nvidia filings as well as "other trusted sources."

I'm sure if the professional market was that large for Nvidia, they would have mentioned it in their conference calls.

Regards,
SB
 
I guess they mean that the workstation market accounts for 35% of Nvidia's stock value.

Sure, if you just look at the financial reports, it's the market where Nvidia makes most or all of its profit (though that's of course also a question of how shared R&D costs are distributed). But it seems to be based mostly on brand recognition, and is therefore quite vulnerable: can they really keep selling Quadros at twice the price of similarly performing FirePros, not to mention the applications where gamer cards will do just as well?
Many of these overpriced cards are even sold in markets where 3D performance and drivers don't really matter (the Quadro NVS series).
Relying on your customers' unwillingness to try much cheaper alternatives seems dangerous in the long run.
 
Remember that the memory interconnect is a much more controlled environment than an ethernet cable, inches instead of meters and a nice PCB.

One of the reasons PCBs aren't "nice" is the short trace lengths. If you're talking about frequencies in the 500-1200 MHz range, you're talking about wavelengths in the ~100-250 millimeter range. Basically, all of your PCB traces turn into quarter-wave antennae for your signal to broadcast itself all over the board and the insides of your computer.

Ethernet cables work on twisted pairs to help counterbalance all the RF being emitted by the wires.
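A quick back-of-the-envelope check of the numbers above. This is only a sketch: it assumes FR4 with an effective dielectric constant around 4, i.e. propagation at roughly half the speed of light, and the constants and function names are illustrative.

```python
# Rough sanity check: wavelength = velocity / frequency. On FR4 the signal
# travels at roughly half the speed of light (assumed effective dielectric
# constant ~4), which is what puts 500-1200 MHz signals in the ~125-300 mm
# wavelength range on a PCB.
C = 299_792_458.0      # speed of light in vacuum, m/s
V_PCB = C / 2          # assumed propagation velocity on FR4

def wavelength_mm(freq_hz, velocity=V_PCB):
    """Wavelength in millimetres at the given frequency."""
    return velocity / freq_hz * 1000

for f_mhz in (500, 1200):
    lam = wavelength_mm(f_mhz * 1e6)
    # A trace near a quarter of this length starts behaving like an antenna.
    print(f"{f_mhz} MHz: wavelength ~{lam:.0f} mm, quarter-wave ~{lam / 4:.0f} mm")
```

So traces only a few centimeters long are already in quarter-wave territory at these frequencies, which is the point being made above.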
 
Yeah, so this analysis is kind of bogus. The workstation market is where all of NV's profit comes from...but only because they share costs with IGP and discrete. The workstation market is not a viable stand-alone business, since the development costs are too high.

Moreover, I'd expect NV's discrete ASPs to increase over time as IGPs gobble up more and more of the low-end discrete market.

David
 
One of the reasons PCBs aren't "nice" is the short trace lengths. If you're talking about frequencies in the 500-1200 MHz range, you're talking about wavelengths in the ~100-250 millimeter range. Basically, all of your PCB traces turn into quarter-wave antennae for your signal to broadcast itself all over the board and the insides of your computer.

Ethernet cables work on twisted pairs to help counterbalance all the RF being emitted by the wires.

HyperTransport, XDR and, if I'm not mistaken, GDDR5 use differential signaling, so I guess that buys you something.
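A toy model of why a differential pair buys you something (purely illustrative; the voltage levels and noise range here are made up): noise that couples equally onto both wires of the pair is common-mode, and subtracting the two legs at the receiver cancels it out.

```python
# Toy illustration of differential signalling: noise coupled onto both
# wires of the pair is common-mode and cancels when the receiver takes
# the difference, even when the noise dwarfs the signal swing.
import random

def transmit(bit, noise):
    """Drive the pair with complementary voltages plus shared noise."""
    v = 0.5 if bit else -0.5
    return (v + noise, -v + noise)   # both wires pick up the same noise

def receive(pair):
    """Differential receiver: sign of (V+ - V-)."""
    return (pair[0] - pair[1]) > 0

bits = [random.getrandbits(1) for _ in range(1000)]
# Noise up to 4x the signal swing, identical on both wires of the pair.
recovered = [receive(transmit(b, random.uniform(-2, 2))) for b in bits]
print(all(int(r) == b for b, r in zip(bits, recovered)))  # prints True
```

A single-ended receiver comparing one wire against a fixed threshold would get swamped by the same noise; the differential receiver never sees it.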

For the future, there's recent progress in the labs: MIT demonstrating a germanium laser that you could build on-chip. Let's hope it works out well.
http://science.slashdot.org/story/10/02/05/0133209/First-Room-Temperature-Germanium-Laser-Completed
 
Been busy lately and I haven't followed these forums for a while, but I continue reading SemiAccurate: lower volume and much more of a guilty pleasure.

And so my eye fell onto this doom and gloom article: "Nvidia's R&D spending examined", in which the case is made that Nvidia has cut R&D spending by 33% because it has given up on the upcoming GPU generation, from $300M in Q109 back to $197M last quarter.

That is a huge decline indeed, which warrants a somewhat closer inspection. Turns out the R&D spending has been pretty constant for the last 7 quarters, with that one exception of course:
(quarter ended: R&D spend, $M)
4/29/07: 158
7/29/07: 158
10/26/07: 179
1/27/08: 134 <<<< Don't know what happened here. Number derived as the 10-K total minus the three previous 10-Qs, as is the case for the 1/25/09 number.
4/27/08: 218
7/27/08: 213
10/26/08: 212
1/25/09: 212
4/27/09: 302 <<<<< WTF?
7/26/09: 193
10/25/09: 198

There's just no way a company will increase its R&D spending by $90M in one quarter and then reduce it back by $90M the quarter after that without some kind of explanation to Wall Street.

And, indeed, a quick perusal of the Q109 10Q turns up this bit of information:
Our condensed consolidated statement of operations for the three months ended April 26, 2009 includes stock-based compensation charges related to the stock option purchase (in thousands):
...
Research and development 90,456
...
Mystery solved. Sure, it's $90M in cold, hard cash, but it doesn't have any impact on actual R&D resources spent, so adjusted R&D spending for that quarter was $213M. I guess the 10% decline in R&D spending from an all-time high of $218M to $197M in the last quarter wasn't sensational enough for the investigative journalists of People Magazine...
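The arithmetic above can be sketched directly from the table. This uses the rounded per-quarter figures quoted earlier, so the adjusted number lands at ~$212M rather than the $213M in the text; the dictionary and variable names are illustrative.

```python
# Back out the one-off $90.456M stock-option purchase charge (booked to
# R&D per the Q109 10-Q) from the 4/27/09 quarter, then look at the
# decline from the adjusted peak to the latest quarter.
rd_musd = {  # quarter end -> reported R&D spend, $M (rounded, from the table above)
    "4/27/08": 218, "7/27/08": 213, "10/26/08": 212, "1/25/09": 212,
    "4/27/09": 302, "7/26/09": 193, "10/25/09": 198,
}
ONE_OFF = 90.456  # stock-based compensation charge, $M

adjusted = dict(rd_musd)
adjusted["4/27/09"] = round(rd_musd["4/27/09"] - ONE_OFF)

peak, latest = max(adjusted.values()), adjusted["10/25/09"]
print(adjusted["4/27/09"])                      # 212: right back in line with the series
print(f"{(peak - latest) / peak:.0%} decline")  # ~9% off the peak, not the claimed 33%
```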
 
This explains a lot. It seems Nvidia's marketing people might not be too bright :)

A marketing director for Santa Clara-based Nvidia is facing a misdemeanor charge after he told a flight attendant at San Francisco International Airport that he had a bomb in his jacket. Yushing Lui, 47, pleaded not guilty in San Mateo Superior Court to a false bomb threat charge on a Cathay Pacific Airways plane before takeoff Thursday.

http://www.fudzilla.com/content/view/17690/1/
 
Mystery solved. Sure it's $90M in hard cold cash, but it doesn't have any impact on actual R&D resources spent, so adjusted R&D spending for that quarter was $213M. I guess the 10% decline in R&D spending from an all-time high of $218M to $197M in the last quarter wasn't sensational enough for the investigative journalists of People Magazine...

Dear dear.... :LOL:
 
Mystery solved. Sure it's $90M in hard cold cash, but it doesn't have any impact on actual R&D resources spent, so adjusted R&D spending for that quarter was $213M. I guess the 10% decline in R&D spending from an all-time high of $218M to $197M in the last quarter wasn't sensational enough for the investigative journalists of People Magazine...

While I'm not exactly an expert on things like GAAP, I'm pretty sure things like this don't get filed under R&D expenses.
 
Cashing out options generally means you are expecting the stock to go down ... so I was suggesting they shit bricks, just being coy though.
 
Their next part does not fill me with confidence.

The GeForce FX 5800 Ultra had an easier ride to market from the whisperers... what does that tell you? I expect that in current games the latest Nvidia part will be neck and neck with the 5870, and so be a big failure in terms of creating a splash.
 
Could Fermi be a response to the original, bigger rumoured Cypress GPU? I.e., they heard it was going to be big, so they felt going a bit bigger would be effective.
 