Nvidia Editor's Day report

Xmas said:
Dio said:
By the way, anyone on Radeon 9800 should feel free to do their texture loads any way they like. 'Blocks' aren't remotely important.
Except that blocks make it easier to spot when you hit the dependent read limit. And you should balance the number of texture reads and arithmetic instructions in every dependency level (i.e. taking texture filtering settings into consideration, too), since internally the R300 seems to operate similarly to the phase model of the R200.
If you really feel like it - it's good form in general for all hardware, I'd guess - but I wouldn't waste too much time over it. Radeon 9800 doesn't care much now, and it won't care at all in the future.

I find blocks make it harder to see where the dependent reads are, myself, but there you go :)
 
DaveBaumann said:
Oh, and wrt this article, I'm surprised that none of you have picked up on the fact that apparently the performance differences everyone has seen so far are nothing short of rumours.
Colorful ones, at that. Even those rumors aren't drawing all the pixels, as they show a little red in the corner. ;)

The whole piece was a concise summary of what appears to be a totally PR-driven event. Was Shader Day as full of fluff? I mean, ONE driver a year? nV doesn't even go a year without a HARDWARE REVISION! And how does one driver a year square with their unified driver releases, especially Forceware? :rolleyes:

BTW, I trust more than a few reviewers are rechecking AM3 and/or Halo to see what nV is talking about?
 
OpenGL guy said:
Did it occur to anyone that maybe the "reddish tinge" is the correct result? No one showed images from the RefRast, right?

Lots of things occur to us, but it's all pretty much moot since it hasn't been mentioned just what the heck was being SHOWN yet, so it's not like people can investigate. Heh...

Frankly, though, I would assume people are just very jaded to marketing right now on ALL sides, and will wait to see things as they go through the review and article gauntlet on all the various and assorted sites.
 
Actually, based on firsthand experience of these Editor's Day/Shader Day type events (something Kyle does not have), I can say that they're about 10X more useful than the launch party events you get invited to -- those are 100% marketing. The nice thing about them is that you get to talk in person w/ the game devs (Valve sat around for about 45 mins after Shader Day was over talking with anyone who wanted to listen or ask additional questions; Dave can personally attest to that). I'm sure NVIDIA had actual engineers on hand for this one. This is important, as you can sometimes get a lot more out of these face-to-face conversations than you can through email; the only problem is you usually can't talk about it. I'm 100% positive that a lot more went on than is being reported -- there's always something that comes out at these types of events that's off the record. The GS report seems extremely sanitized to me. He basically skipped what Epic (who were also there) had to say; I'm certain someone must have asked them about the filtering issues in UT2K3 on NV hardware.

As far as the graphics share comments go, Intel's definitely stealing share from NV -- you don't think that $599 P4 PC you see on TV from Dell has discrete graphics, do you? Of course not! Corporations also buy integrated machines by the truckload. Also remember that the P4 went a year w/o an integrated graphics option; the 850 w/ RDRAM was the only ticket in town. So that helped NV sell more video cards.

Of course, ATI is taking share as well, so to say it's just Intel is definitely oversimplifying things.
 
Dio said:
I find blocks make it harder to see where the dependent reads are, myself, but there you go :)
Really? That's quite surprising.
If you sort your instructions like:
tex sampling without dependence -> arithmetic instructions -> 1st-level dependent texture reads -> arithmetic instructions -> 2nd-level dependent texture reads -> arithmetic instructions -> 3rd-level dependent texture reads -> arithmetic instructions,
you can easily spot where the phases begin and end, and whether you're going to hit the dependent read limit. If you just mix those instructions, there's the possibility of creating dependent texture reads where you don't actually need one. AFAIK (from what I've heard on the opengl.org forums), sampling the same texture more than once also counts as a dependent texture read (and you can read a texture assigned to a sampler no more than four times).
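
To make the bookkeeping concrete, here's a rough Python sketch of the idea (purely illustrative -- the (op, dest, sources) encoding, the register names, and the four-level cap are my own assumptions based on the ps_2_0 limit, not anything a real driver exposes):

```python
# Illustrative only: compute the dependent-read depth of each instruction in a
# toy shader, so the "phases" described above become visible at a glance.

MAX_LEVELS = 4  # assumed cap, matching the four dependency levels of ps_2_0

def dependency_levels(instructions):
    """Return {dest_register: number of texture fetches on its longest chain}."""
    level = {}
    for op, dest, sources in instructions:
        depth = max((level.get(src, 0) for src in sources), default=0)
        if op == "tex":
            depth += 1  # a fetch whose coords come from a prior fetch is dependent
        level[dest] = depth
    return level

shader = [
    ("tex", "r0", ["t0"]),              # phase 1: fetch with interpolated coords
    ("mul", "r1", ["r0", "c0"]),        # arithmetic deriving new coordinates
    ("tex", "r2", ["r1"]),              # phase 2: 1st-level dependent fetch
    ("mad", "r3", ["r2", "c1", "r0"]),  # more arithmetic
    ("tex", "r4", ["r3"]),              # phase 3: 2nd-level dependent fetch
]

levels = dependency_levels(shader)
deepest = max(levels.values())
print("deepest chain:", deepest,
      "(ok)" if deepest <= MAX_LEVELS else "(exceeds the limit)")
```

If you keep the instructions sorted the way described above, fetches with equal depth end up adjacent, so you can eyeball the same thing without any tooling.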
 
I found lots of things as reported in Gamespot's account to simply be nothing short of bizarre. The reference to "competition with Intel not ATi," and especially the "one driver a year" comments either mean that nVidia's top people have gone stark raving mad, or else that nVidia's planning to get out of the 3d-gaming market segment completely and go head to head with Intel in the "2d/lame 3d support" corporate el-cheapo market for integrated gpus...:)

The only way in hades it's feasible to do an annual driver like that is to do it in Intel's market--where Intel does much the same thing, because its 3d support is very limited and the rest is 2d support, which for those chips changes hardly at all. Intel can easily get by with 1-2 updates a year in that market. There's just no way you could even THINK about doing an annual driver if you're planning to compete in the 3d-gaming markets against IHVs like ATi. By mid-year, half of the newer 3d games would be lucky to *run* on your products...:) And according to nVidia, they have close relationships with BUGS of all types on a recurring basis.

If they'd said they were abandoning the 3d-gaming market segments for the low-end Intel integrated segment next year, and that's *why* they no longer consider that ATi is taking their market share, that would scarcely have surprised me any more than what they said about "possibly" reducing their "quarterly" driver schedule (demanding as that is) to possibly a *single* driver per year.

But I am quite sure that nVidia would feel a hefty sense of relief, not to mention joy, at doing something like that in Intel's market exclusively--no more benchmark worries, no more benchmark cheating to get caught at, no more "optimizations" to invent (Yippee!), and no more "next generation automatic shader reorganizing engines" to gush about! Heck--they might even fire the whole PR department, too! Ah, joy on top of joy!

"And nVidia found the Truth, yea, they found it in Intel's integrated value markets and put down their weary burdens at long last and set themselves free!"

I think I'm gonna be sick.
 
Maybe you ought to look at whose market share is growing (Intel), and which market both NVIDIA and ATI are targeting (motherboard chipsets w/ integrated video) before you go spouting off that NVIDIA has gone crazy.
 
Xmas said:
AFAIK (from what I've heard on the opengl.org forums), sampling the same texture more than once also counts as a dependent texture read (and you can read a texture assigned to a sampler no more than four times).
This is largely what I was referring to when I said "it doesn't care much now" - i.e. it's not quite as smart under ARB_fragment_program as it is under Direct3D (largely because ARB_fp is more complex), so then I said "It won't care at all in the future" because this will get fixed soon.
 
The way it was phrased is still ludicrous, Russ, as they are shiftily trying to suggest that ATi has made no impact at all.

Both Intel and ATi made gains this year, and nVidia lost ground. There are whole segments where Intel does not compete with them at all--what are the market share swings there? If they lost ground there--and certainly they did--then it had a negative impact on nVidia's overall market share. Would ATi's competition alone have been enough to cause them an overall decline? Immaterial. It still drags them below where they WOULD have been had that ground not been lost. How does the loss they experienced to Intel's competition compare with the loss to ATi's? <shrugs> I dunno, but if they want to accentuate that point, they should be supplying the numbers to back it up. Tons of us on here would LOVE to see detailed numbers, too! :D We can reach our own conclusions from them, kthx. Plenty of smart people good with numbers about.

I figure that's exactly what Huang WAS trying to say, actually--that losses due to ATi competition alone wouldn't have caused them an overall market share decline. Meanwhile there's still plenty of room for "creative accounting" (such as attributing all losses to Intel in sectors all three share even if ATi made gains as well, or at least counting Intel's gains first). It also avoids saying whether Intel by itself would have been enough to cause a loss, barring any other competition.

If they want to make an actual comparison, make an actual comparison. Otherwise statements like that are seen as vague marketing fluff trying to imply something we then automatically distrust.
 
Uttar said:
DaveBaumann said:
Oh, and wrt this article, I'm surprised that none of you have picked up on the fact that apparently the performance differences everyone has seen so far are nothing short of rumours.

:LOL:

Yeah, I think NVIDIA believes me, you, and Hellbinder are really the only three guys claiming the NV35 has inferior performance. And of course, we should ONLY trust industry legends like "Ex-Kyle(TM)" and "Biased Tom(TM)", right? ;)


Uttar

Let's ignore performance for a second. I recall myself drooling over the NV25 when it hit the first (p-)reviews, and there wasn't much chance back then that I would have opted for an R200. Do I need to explain why the tables turned completely with the follow-up products from both IHVs? That said, I still would have picked an R300 even if it had been slower than competing solutions.
 
RussSchultz said:
Maybe you ought to look at whose market share is growing (Intel), and which market both NVIDIA and ATI are targeting (motherboard chipsets w/ integrated video) before you go spouting off that NVIDIA has gone crazy.

Is INTEL going to develop its own DX9 integrated chipsets, or license IP? If the latter, who would be the most likely candidate, since I'm fairly sure that NVIDIA is completely allergic when it comes to selling IP?

I'm not trying to say anything else here, but that of the two IHVs, ATI has IMHO more potential for growth in the foreseeable future than NVIDIA.

As far as the PC standalone graphics market goes, if ATI isn't their biggest competitor there, then I'd really like to hear who in fact is. I expect PR/marketing to be capable of such flexible turns, since that's actually their job; the question is how much each and every one of us actually buys into it.
 
Ailuros said:
Is INTEL going to develop its own DX9 integrated chipsets, or license IP?
Does it matter? If integrated video is selling instead of your add-in board, then they're gaining market share, and you're losing it.
If the latter, who would be the most likely candidate, since I'm fairly sure that NVIDIA is completely allergic when it comes to selling IP?
Let's see... who's done integrated video with other chipset manufacturers in the past? That couldn't be NVIDIA, because they're completely allergic, of course.

I'm not trying to say anything else here, but that off the two IHVs ATI has IMHO more potential for growth in the foreseeable future than NVIDIA.
And that really wasn't what was being discussed.
As far as the PC standalone graphics market goes, if ATI isn't their biggest competitor there, then I'd really like to hear who in fact is.
Which wasn't exactly what we were talking about either. NVIDIA made the statement that Intel was their biggest competitor. Integrated video is the biggest competitor to the add-in graphics market, regardless of who else you're competing with, and Intel "owns" a huge part of that market.

And, of course, they don't want to give any kudos to ATI. ;)
 
RussSchultz said:
Does it matter? If integrated video is selling instead of your add-in board, then they're gaining market share, and you're losing it.

I have a problem with that in and of itself, actually, since in large part Intel and nVidia are not DIRECT competitors, but depend instead on the CPU platform they're attached to. Intel does not bother with licensing for AMD platforms, and nVidia has not yet licensed for Intel platforms (no one does, in any case, so far as I know--at least in the PC market), so there is "competition" within their individual classes, but overall market share would seem affected much more by the popularity of the CPUs themselves. AMD has lost ground to Intel in recent performance, so Intel's integrated chipsets get carried along with Pentium sales.

RussSchultz said:
And, of course, they don't want to give any kudos to ATI. ;)

Ah... That would be it! ;)
 
RussSchultz said:
Maybe you ought to look at whose market share is growing (Intel), and which market both NVIDIA and ATI are targeting (motherboard chipsets w/ integrated video) before you go spouting off that NVIDIA has gone crazy.

As of my last look, Intel is not a player in the 3d-gaming chip & reference design market. The company is a total zero in that market space, which is as of now dominated completely by ATi and nVidia, and of late mostly by ATi. The last time Intel was a player in this particular market segment was with its i74x/5x 3d chips and reference designs, and Intel literally had its tail handed to it on a platter: 3dfx and nVidia, along with their board OEMs, plus ATi and even Matrox at the time, were selling competing 3d-gaming products that ran rings around products built on Intel's i7xx reference designs. I actually owned an i75x-based product at the time, and have clear memories of how disappointingly slow it was compared to the competing products then available. As such, I was not in the least surprised when Intel exited that market space after being whipped so conclusively (and it has not ventured back into it since). Again, Intel does not play in what have traditionally been nVidia's strongest markets by far, the markets which have made nVidia what it is today (varying opinions on "what nVidia is today," of course).

That's what makes these comments so incredibly bizarre and strange, as the idea that "we haven't lost share to ATi, but only to Intel," is certainly completely untrue and not connected to reality. What is actually the case is that in the *discrete market segments* in which nVidia competes, among which are retail and system OEM 3d-gaming product markets, and the low-end 2d value integrated segment (requiring only minimal 3d hardware API support), nVidia has lost share to both Intel and ATi over the last year. In the first market segment here Intel isn't even present; and so ATi is certainly nVidia's strongest competitor without a doubt. It's only in the second market segment here that nVidia has lost market share to Intel, and that's an entirely different market segment from the one in which nVidia has lost market share to ATi in the past year.

The problem is that the "graphics chip market" does not exist as a single pie. It is often characterized as one, but doing so presents only a gross distortion of whatever trends and events are actually occurring in the fully discrete market segments within the market as a whole. By way of example, a pie representing companies based on the total number of chips they manufacture, regardless of market, would be just as misleading and uninformative.

The only credible, informative way to view the "graphics chip market" is with multiple pies, each pie denoting a specific market segment which differs from the other segments in terms of products, purpose, cost, profit, and volume. To spell it out: a GF5600 would not be competitive in the integrated gpu market where Intel plays--it simply could not be sold there--and an Intel integrated graphics chip designed for the corporate market could not be sold in the GF5600's market, either. Differing product lines are targeted at differing markets which are completely distinct from one another. So against nVidia's GFFX reference card product line, Intel simply does not compete at all, and is not even in the picture.
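
To see why a single pie can mislead, here's a toy calculation (all unit counts invented purely for illustration, not real market data) in which a vendor holds its share within every segment it serves, yet still loses "overall" share simply because the one segment it doesn't serve grows fastest:

```python
# Hypothetical illustration: per-segment share vs. a single aggregate pie.
# The numbers are made up; the point is only that the aggregate figure can
# move without any segment-level share changing at all.

# (segment, vendor_units, total_units) for two consecutive years
year1 = [("performance 3d", 8, 10), ("value 3d", 15, 30), ("integrated", 0, 60)]
year2 = [("performance 3d", 8, 10), ("value 3d", 15, 30), ("integrated", 0, 90)]

def shares(segments):
    per_segment = {name: vendor / total for name, vendor, total in segments}
    aggregate = sum(v for _, v, _ in segments) / sum(t for _, _, t in segments)
    return per_segment, aggregate

for label, year in (("year 1", year1), ("year 2", year2)):
    per_segment, aggregate = shares(year)
    print(label, per_segment, f"aggregate share = {aggregate:.0%}")

# Per-segment shares are identical both years (80%, 50%, 0%), yet the
# aggregate share falls from 23% to about 18% because only the integrated
# segment -- where this vendor doesn't play -- grew.
```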

Secondly, moving to the IGP segment of the market: ATi's IGP chipsets and products use the P4 bus license as of now, and nVidia's IGP products do not--unless something major has happened and I've completely missed it...;) Interestingly enough, I read a statement recently attributed to JHH in which he said he was staying away from Intel chipsets not only because of the $5-a-pop licensing fee, which he felt gave Intel an inherent cost advantage that would be difficult to overcome, but also because he feels the P4 chipset market is already far too competitive for nVidia to enter with much expectation of success. So what JHH was saying *then* was that in that market segment nVidia did not see itself as able to compete, not only with Intel but also with an already crowded field of other companies making P4 chipsets.

But there is *one way* in which all the discrete graphics-chip markets are indeed interconnected for a company like nVidia or ATi. This is something nVidia has known and understood relative to its own success for quite a while: in order to capture the low end you must first capture the high end. When you capture the high end--in this case the performance 3d-gaming API chip & reference design markets for OEM and retail sales--it then becomes an order of magnitude easier to drive your product mix down into the value market segments, and even eventually into the integrated graphics chip markets which reside at the very bottom, which is where Intel currently feeds ("bottom" here in terms of 3d API hardware functionality and performance, and production cost).

This has been the key for nVidia's overall success in the last couple of years. In fact, it has also positively affected nVidia's ability to drive its products into entirely different markets, such as the Athlon core-logic chipset market, for instance.

So what's happening is that because nVidia has *lost* its position over the last year in its top and most fundamental market, the performance 3d-gaming API chip & reference design markets for OEM and retail sales, it has not only lost market share there (to ATi, of course), but also in the integrated value market segment where it competes with Intel, and in which Intel is its largest competitor. So this is why, IMO, nVidia does not wish to publicly talk about what ATi has done to it in its traditionally strong market over the last year, but would prefer to publicly discuss only what is going on versus Intel in the low-end, integrated gpu markets. Privately, though, I'm sure nVidia understands exactly what is happening in *both* market segments. When you lose the top of your markets, losing your position in your bottom markets is sure to follow swiftly.

Which brings me to again discuss in closing what I consider to be the most interesting item in the conference as it was reported by Gamespot. That's this really strange proclamation about driver release frequency. First, I thought it very strange to even mention anything about driver release frequency in the venue of this Conference, not to mention talking about drastically reducing it to such an improbably, impossibly low figure as "one driver a year." As some of nVidia's top brass was present for the conference, and for these remarks in particular, according to the Gamespot report, there's simply no way this can be construed as some kind of error or misstatement. But if nVidia wants to continue to compete in the 3d-gaming market segments, then this statement *has to* be simply erroneous.

Something like *driver release frequency* is fundamental to a company's *practical* support of its products sold into the 3d-gaming chip markets. So, if nVidia fully intends to *lower* its frequency of driver releases from its already low quarterly cadence (in comparison with the frequency of ATi's driver releases for products sold into the same market), and is even contemplating something as non-supportive as a single driver release per year... then I would have to conclude they are seriously considering withdrawing from this market segment entirely. A single annual driver release is not sufficient to sustain product competition in the 3d-gaming gpu & reference design market it now shares with ATi. It's not even sufficient to sustain product *viability* in that market, IMO.

Conversely, I would think that in a renewed effort to regain their former position and standing in what has traditionally been their strongest market and their bread & butter, nVidia would have, if anything, announced an *increase* in the frequency of its official driver releases, since compared with its competition over the last year its current frequency is already insufficient.

Consider also the people likely to hear and think about such comments aside from you and me--like board OEMs and retail customers currently considering buying an nVidia-based product for 3d games. The negative ramifications of such a statement for sales of GFFX reference-design products could be profound, which is why I don't think such a statement would ever be made in this kind of venue by mistake, or without a full appreciation of the consequences it could well produce. It's difficult to imagine such a statement being an off-the-cuff example of mouth-engaged-before-brain PR babble, simply because the stated number of "one driver a year" is far too specific to have resulted from a generalization. But it is not impossible that for some reason the statement is in fact completely erroneous, so I'll be watching with interest to see what happens here...:)

Reading between the lines, what I see here, if it's true, is that JHH is going to drastically cut back the funds the company allocates to driver development. If you think of all the man-hours and money the company has burned in the last year producing cheats that were exposed, and optimizations that reduced image quality, and so on, I could certainly understand such a decision from him strictly from a money-spent-versus-results-produced point of view. However, such a decision would also render the company unable to compete in the 3d-gaming gpu market, it seems to me.
 
And, of course, they don't want to give any kudos to ATI.

That alone would have been enough to explain the entire PR stunt from start to finish.

Let's see... who's done integrated video with other chipset manufacturers in the past? That couldn't be NVIDIA, because they're completely allergic, of course.

NV sold IP? AFAIK Intel wants to license IP only for future integrated graphics chipsets.

Does it matter? If integrated video is selling instead of your add-in board, then they're gaining market share, and you're losing it.

No, it doesn't matter at all. Just as, of course, it doesn't matter at all that NV's growth has stopped increasing--much to the contrary when it comes to ATI.

And that really wasn't what was being discussed.

Considering the first quote above, there wouldn't have been much reason for it anyway.

Which wasn't exactly what we were talking about either. NVIDIA made the statement that Intel was their biggest competitor. Integrated video is the biggest competitor to the add-in graphics market, regardless of who else you're competing with, and Intel "owns" a huge part of that market.

Well, if Intel should license IP from ATI in the future, then in a nutshell I could expect NV to claim that neither of them is actually their biggest competitor, but rather Creative Labs for sound solutions, or whatever other weird option you care to substitute. I know it sounds crazy, but then again, if you expect me to admit that PR spin is any saner than that, you'll be waiting a long time.

It's of course completely irrelevant, too, that the only reason they ever managed to gain their share of the integrated market is that they had repeatedly won mindshare with their high-end solutions. Now where's my straitjacket, if you please? 8)
 
Ailuros said:
NV sold IP?
Yes, there were integrated TNTs in ALi chipsets way back when.

Can we say the same thing about ATI? I actually don't remember any IP that they've licensed to anybody. Even their Imageon seems to be a separate chip, just like the MediaQ processor.

Of course, past performance is not indicative of future activity, but it's disingenuous to cast NVIDIA as "completely allergic to selling IP" and ATI as not. Neither one of them is in the business of selling IP; both of them are competing in the chipset market directly. Selling IP doesn't seem to jibe with either company's current strategic direction.
 