NVIDIA GF100 & Friends speculation

Quadro is only profitable because it doesn't have to carry the R&D. As go their mainstream profits, so go their Quadro profits. And of course there is no growth, since it isn't a growth market. If anything it is a mature market that will SHRINK over time.

The Quadro business is at a lower level than last year. Why should their GeForce business increase despite competition while the Quadro segment shows no movement, even though nVidia has nearly no competition there?

A large portion of the Quadro/professional market by volume isn't actually high-end 3DCC/CAD/CAM but multi-display cards, which is effectively a DEAD market at this point with what ATI has done with Eyefinity.

Not really. "Eyefinity" exists with matrox's parhelia since 2002 . nVidia has only one or two products with this feature in their portfolio. There is no demand for more than two active displays.
 
Look up the definition of empirical; then you'll know what BS is when you guys just pull numbers from all over the place.

 
It's quite simple... nVidia claims UP TO 2x performance. Cypress is about 80% faster ON AVERAGE.

Sorry, but I don't see how nVidia will pull that kind of a trick and suddenly be 2x faster on average...
... is there some sort of magic we're not aware of yet?

Unless of course the slides were a complete understatement, intended to keep expectations low.


HAWX alone gets more than that 2x at 8x AA, so what does that "up to 2x performance" really mean? Taking numbers found out on the web and multiplying them by numbers provided from somewhere else doesn't make anything other than crap.
 
There is a big chunk of this thread talking about profits in the HPC and professional market. The general consensus is that they are in fact not significant compared to the consumer market. Then again, "worth a LOT" can be interpreted to mean almost anything.

http://www.xbitlabs.com/news/other/...tion_Market_Begins_to_Stabilize_Analysts.html

Peddie had Nv doing $214M worth of professional desktop graphics cards (Quadros) in Q2. Last quarter Nv reported professional up 11% quarter on quarter, which would put Nv's desktop professional revenues at somewhere around $240M last quarter, and margins here are 80%+; trust me, or take a look at the pricing and what you get for it yourself. Meanwhile ATI (AMD's graphics division) did a total of $306M in revenue in Q3 for their whole business.

http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1342558&highlight=

So basically ATI would need 60%+ gross margins across their entire business just to match the gross profits Nv is generating from Quadro alone. You think that's significant?
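
As a rough sanity check, here is that arithmetic written out (a Python sketch; the 11% growth and the 80% margin are the estimates from this post, not reported numbers):

Code:
# Back-of-the-envelope check of the Quadro-vs-ATI gross profit comparison.
# All inputs are the figures quoted above; the 80% gross margin is the
# poster's estimate, not a reported number.

quadro_q2_revenue = 214e6                    # Peddie's Q2 Quadro desktop estimate
quadro_revenue = quadro_q2_revenue * 1.11    # +11% q/q -> ~$238M
quadro_gross_profit = quadro_revenue * 0.80  # assumed margin -> ~$190M

ati_total_revenue = 306e6                    # AMD graphics division, whole Q3

# Gross margin ATI would need across its entire business to match that.
required_margin = quadro_gross_profit / ati_total_revenue

print(f"Quadro gross profit estimate: ${quadro_gross_profit / 1e6:.0f}M")
print(f"ATI-wide margin needed to match: {required_margin:.0%}")  # ~62%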
 
Not really. "Eyefinity" exists with matrox's parhelia since 2002 . nVidia has only one or two products with this feature in their portfolio. There is no demand for more than two active displays.

Which is why the Quadros that supported this feature were introduced just last year rather than... oh wait, 2003-2008 :rolleyes:

Now that ATI cards finally have Genlock/Framelock they can finally get into broadcast (the *most* lucrative of all DCC card sales), and into places with a 100W thermal footprint rather than a 250W one. Previously you *had* to get the top-end SDI Quadro SKUs, of all cards, just to get gen/framelock.
 
A large portion of NVIDIA's Quadro/Professional market is dead because of eyefinity? LOL, that is a ludicrous statement. Eyefinity is actually marketed towards high end gamers.

A sizable portion of the professional market is actually multi-monitor cards. These are effectively $50 cards but sold in the $400-500 range because they support 2+ (up to 6 in some cases) monitor setups that are used in a variety of fields. This market will quickly die out with the introduction of cards like the 5xxx series that support 3+ monitors out of the box. And so while eyefinity is marketed toward gamers, the functionality to drive up to 6 monitors is still there and will be available at mainstream prices.
 
A sizable portion of the professional market is actually multi-monitor cards. These are effectively $50 cards but sold in the $400-500 range because they support 2+ (up to 6 in some cases) monitor setups that are used in a variety of fields. This market will quickly die out with the introduction of cards like the 5xxx series that support 3+ monitors out of the box. And so while eyefinity is marketed toward gamers, the functionality to drive up to 6 monitors is still there and will be available at mainstream prices.


As others have stated, Matrox has had that support for many years now and they haven't taken over the Quadro market; most of the Quadro market isn't about multi-monitor support alone.
 
I agree that it will shrink, but it can't be proclaimed dead for quite some time yet. ATI has very little mindshare in this highly conservative market. Even Matrox has managed to subsist to this day on their slice of the profits there with zero product innovation.

They've subsisted because they can sell what is effectively a <$50 graphics card for $300-500 purely BECAUSE of the multi-monitor support.
 
The Quadro business is at a lower level than last year. Why should their GeForce business increase despite competition while the Quadro segment shows no movement, even though nVidia has nearly no competition there?

Because the mainstream and high-end discrete market is still a growth market, while the 3DCC/CAD/CAM market has actually been stagnant for years.



Not really. "Eyefinity" exists with matrox's parhelia since 2002 . nVidia has only one or two products with this feature in their portfolio. There is no demand for more than two active displays.

There is actually significant demand in a variety of industries for >2 displays.
 
It's quite simple... nVidia claims UP TO 2x performance. Cypress is about 80% faster ON AVERAGE.

Sorry, but I don't see how nVidia will pull that kind of a trick and suddenly be 2x faster on average...
... is there some sort of magic we're not aware of yet?

Unless of course the slides were a complete understatement, intended to keep expectations low.

You are arguing with razor. You, sir, fail at the internets. You can make a well-reasoned argument till the cows come home, but he'll keep repeating the same thing, just like he kept repeating that Fermi would be shipping by Xmas right up until Dec 26th.
 
Because the mainstream and high-end discrete market is still a growth market, while the 3DCC/CAD/CAM market has actually been stagnant for years.

There is actually significant demand in a variety of industries for >2 displays.


I agree that it has been stagnant because of market saturation.

As for the second part: there is demand in other industries, but those industries aren't all buying Quadros.
 
And so while Eyefinity is marketed toward gamers, the functionality to drive up to 6 monitors is still there and will be available at mainstream prices.
Take security, for example: lots of cameras plus a main/alarm screen, so the number of displays adds up pretty quickly. The cost of an LCD is quite low compared to a video card that has more than 2 outputs.
 
http://www.nvidia.com/object/IO_86775.html

Here is the actual nVidia white paper. If you guys wanted to see marketing numbers, they could have just shown you the tessellation performance data from there, page 15. Wow, some unknown DX11 app that shows a 600% improvement over the HD 5870.

HAWX AA performance is 233% higher at 8x on page 21; wait, that doesn't correspond with "up to 2x the performance" from the marketing slides...
 
If people here look at the Top 10 supercomputers around the world, ATI GPUs equip some of those machines.
Nvidia has none.
 
Because the mainstream and high-end discrete market is still a growth market, while the 3DCC/CAD/CAM market has actually been stagnant for years.
I wouldn't underestimate the CAD market: between Inventor and SolidWorks alone you have 2 million+ users, and then you have all the other CAD packages, Max, Maya, etc. This is a user base of many millions who seem to have not too much problem buying $2k-a-pop graphics cards every 2 years or so.

I've seen CAD departments at automotive companies buy truckloads of high-end Quadro cards ($4,000+ a pop) without hesitation, mainly because the CAD software they use is generally unsupported on anything but "professional" cards.
 
http://www.xbitlabs.com/news/other/...tion_Market_Begins_to_Stabilize_Analysts.html

Peddie had Nv doing $214M worth of professional desktop graphics cards (Quadros) in Q2. Last quarter Nv reported professional up 11% quarter on quarter, which would put Nv's desktop professional revenues at somewhere around $240M last quarter, and margins here are 80%+; trust me, or take a look at the pricing and what you get for it yourself. Meanwhile ATI (AMD's graphics division) did a total of $306M in revenue in Q3 for their whole business.

http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1342558&highlight=

So basically ATI would need 60%+ gross margins across their entire business just to match the gross profits Nv is generating from Quadro alone. You think that's significant?

No need to use third-party data when you can get the numbers from nVidia's own annual and quarterly reports, where this shows up as the PSP division.

http://phx.corporate-ir.net/phoenix.zhtml?c=116466&p=irol-reportsOther

For the last reported quarter nVidia had revenues of $116 million with operating income of $41 million, for a division operating margin of 35%.
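
For what it's worth, that margin falls straight out of the two reported figures (a trivial Python check):

Code:
# Operating margin from the reported PSP division figures above.
psp_revenue = 116e6          # last reported quarter, PSP revenue
psp_operating_income = 41e6  # PSP operating income

print(f"PSP operating margin: {psp_operating_income / psp_revenue:.0%}")  # ~35%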
 
Hi,

I've read most of the thread and while the performance speculation is interesting, I haven't seen anyone really take a stand on what kind of performance would be *acceptable* and what *good*. The way I see it, acceptable would be to put out something above the Radeon performance curve; good would be something that retains their advantage from the earlier generation. Of course, I'm not talking about raw performance here, but about performance relative to release date (since a 5670 would have blown away the competition a few years ago).

I'm going to look at 4870 and 280 since 4890 and 285 were minor updates.

So, looking at Radeons first, ATI got something like 1.8 times the performance of the 4870 out in the 5870 (is that about right?) in 15 months (4870 in June 08, 5870 in September 09). For NV to retain the advantage of the 280 (out in June 08), GF100 coming out 21 months later (this March) would have to be ~2.28 times as fast as the 280 (1.8^(21 months / 15 months)). That would be *good* performance, keeping up with ATI's development speed and retaining their advantage from 2008.

For *acceptable* they would only have to improve upon 5870 by what the six months of development are worth. That is, GF100 should have at least ~1.27 times the performance of 5870 (1.8^(6 months / 15 months)). That performance would keep them competitive, with performance between ATI's current and projected next generation. However, they would have lost the advantage they had in 2008.
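
If anyone wants to play with the assumptions, here is the same extrapolation written out (a Python sketch; the 1.8x factor and the month counts are the rough estimates above):

Code:
# Performance-curve extrapolation from the post above, written out so the
# assumptions are easy to tweak. The 1.8x Cypress-over-RV770 factor and
# the month counts are rough estimates, not measured data.

ATI_GAIN = 1.8      # HD 5870 over HD 4870, estimated average speedup
ATI_INTERVAL = 15   # months between HD 4870 (Jun 08) and HD 5870 (Sep 09)

def expected_speedup(months):
    """Speedup after `months` if progress follows the assumed ATI curve."""
    return ATI_GAIN ** (months / ATI_INTERVAL)

# "Good": GF100, 21 months after the GTX 280, stays on the same curve.
print(f"good:       {expected_speedup(21):.2f}x the GTX 280")  # ~2.28x

# "Acceptable": GF100 beats the HD 5870 by 6 months' worth of progress.
print(f"acceptable: {expected_speedup(6):.2f}x the HD 5870")   # ~1.27x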

Did I have some of my facts wrong? I took the dates from Wikipedia, so I'm not 100% on them. Also, I couldn't find a very good chart of average performance between 4870 and 5870 so the 1.8 was an estimate based on what I could find.

Your thoughts: which of these (or neither) sounds more probable, and is this a reasonable way to look at things?
 