Mercury Research on graphics market share

There were bugger all SM2.0 titles last year (woo Tomb Raider woo!) and the R3x0 still took off. Marketing departments exist to convince people they really need features that they actually don't :)
 
It's not just that: NV3x could be shown to have poor performance in anything that used the new technology, and it was also put in the shade by the 9x00 in many of the titles of the time. There is no such compelling case against NV4x right now - the performance split is fairly even and there is nothing out there to highlight that shader 3 is better than using shader 2 (ATI's at least). In fact, the only thing out there is actually counter-marketing for Nvidia - 3DMark05 should have been a banner benchmark for NV4x with shader 3 and Nvidia's shadows, and yet ATI outperforms them with shader 2; not exactly a convincing argument for NV4x (and we know OEMs use this in their evaluations).
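To make the shader 3 vs shader 2 part of that concrete (this is just an illustration, not anything taken from 3DMark05, and all the names in it are made up): the headline SM 3.0 feature is dynamic flow control, meaning a pixel shader can branch on values computed at runtime, while plain SM 2.0 shaders have to do essentially the same fixed amount of work for every pixel. A rough CPU-side sketch of the idea in C:

```c
/* Rough CPU-side sketch of the per-pixel early-out that SM 3.0's
 * dynamic branching permits; plain SM 2.0 has no dynamic flow
 * control, so the equivalent shader would be unrolled and would
 * evaluate every light for every pixel. Illustration only - the
 * names and structure are made up, not real shader code. */
#define NUM_LIGHTS 8

typedef struct { float x, y, z; } Vec3;

float shade_pixel(Vec3 normal,
                  const Vec3 light_dir[NUM_LIGHTS],
                  const float light_intensity[NUM_LIGHTS])
{
    float result = 0.0f;
    for (int i = 0; i < NUM_LIGHTS; ++i) {
        /* Skip lights that cannot contribute to this pixel instead of
         * computing a term and multiplying it by zero. */
        if (light_intensity[i] <= 0.0f)
            continue;

        float ndotl = normal.x * light_dir[i].x
                    + normal.y * light_dir[i].y
                    + normal.z * light_dir[i].z;
        if (ndotl > 0.0f)
            result += light_intensity[i] * ndotl;
    }
    return result;
}
```

Whether that early-out buys anything in practice depends on how cheaply the hardware takes the branch, which is part of why the current benchmarks don't settle the argument either way.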
 
Fodder said:
There were bugger all SM2.0 titles last year (woo Tomb Raider woo!) and the R3x0 still took off. Marketing departments exist to convince people they really need features that they actually don't :)
R3x0 had considerably more going for it than just good DX9 support. I would expect that useable AA, good IQ at a time when there really was a difference between ATI and nVidia, and unmatched speed all played a part in making the R3x0 popular.
 
whql said:
the performance split is fairly even and there is nothing out there to highlight that shader 3 is better than using shader 2
But if they're dead even and only one has a huge (if currently fairly useless) marketing dot-point, which do you think is better for business?
Ratchet said:
R3x0 had considerably more going for it than just good DX9 support.
Of course, and I wouldn't dare to argue that R3x0 wasn't a fantastic chip - my R9600XT serves me well. :)

But these are all features that Joe Public neither understands nor cares about. I'd hazard a guess that there are a ton of FX owners who don't even know their card sucks in DX9.
 
BZB said:
OEMs don't want cards they can't get and that cost more due to extra cooling and power requirements.

The only NV4x cards that have significantly more extreme power requirements than competing ATI cards are the 6800 Ultras. All other flavors are available as single-slot, single molex cards. In fact, the 6600GT on the .11 process has lower power requirements and needs a less noisy and less heavy cooling solution than the X700XT!

They don't want to pay for SM3.0 that is unused and unusable on the first generation product.

LOL, who said that OEMs will have to pay extra for SM 3.0? And since when did SM 3.0 become unused and unusable? We already know that there are at least a dozen games coming out in the near future that will make use of SM 3.0. On the other hand, support for SM 2.0b is questionable at best among some developers, for various reasons.

OEMs have already been voting with their wallets, just like the retail customers.

I think you oversimplify OEM needs and wants. Last gen, all the OEMs went for the FX 5200 because they were able to market it as a DirectX 9.0 compliant card. They will do more of the same with DirectX 9.0c when the value cards actually ramp up in production.

Ahh, the true JJ comes out - the insulting Nvidia apologist.

Realist, not apologist.

That is no longer the case, yet Nvidia still act like they have no competition.

This statement is pretty inane. Why would they pull out all the stops on a forward-looking new architecture with SM 3.0, SLI, etc if they felt they had no competition? Logically, this makes no sense.

Remember, the man who runs Nvidia doesn't consider graphics as their core business or ATI as a competitor. How's that for "forward looking"?

I thought it was an ATI rep who mentioned shifting away from graphics as the core business... ;) Seems that you have some personal issues with NV, and that is irrelevant to what is actually being discussed about a forward-looking NV4x architecture.

For someone who doesn't want to take the past into account, why are you talking about Nvidia's past accomplishments?

Read the statement again in its context. It is pretty obvious what I was trying to say there.

Yeah, it was a great response to a revitalised ATI:

1. NV30 - so far behind schedule, poor IQ, and poorly performing, it was cancelled and now Nvidia disowns any mention of it. Benchmark cheats and marketing lies. Alienates enthusiasts. Nvidia OEMs break contracts and start to use ATI products. Cg and Cinematic Computing are marketed furiously.

2. NV35 - poorly performing, poor IQ, low sales, short life. Benchmark cheats and marketing lies. Alienates enthusiasts. Even more Nvidia OEMs break contracts and start to use ATI products.

I don't think an ATI chump could have said it any better! :D

3. NV40 - massive improvement by taking the same design philosophies as ATI used since R300.

And now that's a bad thing, huh? ATI pushed the industry forward with the R3xx, and NV is doing the same thing with the NV4x.

Poor availability, shunned by OEMs due to high heat and power requirements.

Looks like you enjoy making highly simplistic arguments. Poor availability (cough, ATI X800XT PE, cough) and high heat/power requirements really only apply to the 6800 Ultra. Didn't I also mention to you that Dell is using the 6800GTO now, and that the 6600 and 6200 still need to be ramped up in production? Of course, you are not one to sweat the details ;)

The hilarious thing is that you are already trying to stamp down history when it is not even set. I know it severely pains you to see this, but like it or not, the NV4x is going to carry NV a long way because it is a very fundamentally strong and forward-looking architecture.
 
Fodder said:
But if they're dead even and only one has a huge (if currently fairly useless) marketing dot-point, which do you think is better for business?
Depends on your marketing dot-point; I'm of the firm opinion that ATi still has the high-end top spot this round, and that is still pretty good for bragging rights. :)
 
Fodder said:
But if they're dead even and only one has a huge (if currently fairly useless) marketing dot-point, which do you think is better for business?
They both have different things to market - one has shader 3, the other has 3Dc. We know there may be plenty of difference between them, but as you say, "Joe Public" probably doesn't know, so they both have a point to market.
 
The 6800GTO is the highest-performing option on the 8400 series. The lowest is the X300SE, and in the middle is the X800SE. On the XPS systems, the X800SE is the default option, with the 6800GTO and X800XT as the higher-performing options.
 
whql said:
They both have different things to market - one has shader 3, the other has 3Dc.
I think 3Dc will be a fair bit harder to plug, which is a bummer. Your average gamer is really beginning to grasp that 'shaders' make things look better, and if NVIDIA's shaders are 50% bigger than ATI's, that can only be a good thing. 3Dc really needs a 'killer app', something to demonstrate just what it can do, as it doesn't have SM3.0's buzzword-status to coast on.
 
What? A few posts ago "Joe Public" was fairly dumb and didn't know much beyond DX9, and yet now he clearly knows that 3Dc is nothing and shader 3 is something?

OK, if that is the case then it's very easy for ATI to make a thoroughly convincing argument that shader 3 is unimportant and their shader 2 is just as good, if not better because it's faster - all they have to do is run 3DMark05.
 
Heh, you guys are arguing over SM3.0 and the 6xxx series vs the X800 series... the problem for Nvidia looks like they are losing market share in a segment where they currently have no answer: the integrated solution.

They lost a total of 8 percent, but only half of that to ATI. They might get some of that back, but the rest went to Intel and the likes of SiS, S3, etc. - those are likely integrated graphics sales, and market share they aren't likely to get back anytime soon.
 
Fodder said:
3Dc really needs a 'killer app', something to demonstrate just what it can do, as it doesn't have SM3.0's buzzword-status to coast on.
Isn't HL2 supposed to support 3Dc? If it looks convincingly better with 3Dc enabled in Half-Life 2, I think that should cover that one. ;)
 
whql said:
What? A few posts ago "Joe Public" was fairly dumb and didn't know much beyond DX9, and yet now he clearly knows that 3Dc is nothing and shader 3 is something?
I knew I'd get picked up on that one. :)

Currently game mags mention that shaders are a good thing; I doubt 'DXT5' has ever graced the pages of PCPP. It's dead easy to say "our shaders are better than the last shaders"; it's a lot harder to say "our texture compression algorithm is better than the last texture compression algorithm" when the buzzword of the graphics industry for the last three years has been shaders, not compression. I stand by what I said: Joe Public is clueless as to which card is which, how well they handle certain tasks, or even what the different technologies are, but anyone who reads games mags, sites or forums knows that shaders are a good thing. They don't need to know what a shader does to want a better one. 3Dc just doesn't have the same base to work from.

I'm really not trying to plug NVIDIA or SM3.0 here, or to detract from 3Dc; I just think they can do well here if they pull off NV44 and put some decent marketing behind it.
digitalwanderer said:
Isn't HL2 supposed to support 3Dc? If it looks convincingly better with 3Dc enabled in Half-Life 2, I think that should cover that one. ;)
Indeed, here's hoping.
thatdude90210 said:
the problem for Nvidia looks like they are losing market share in a segment where they currently have no answer: the integrated solution.
Too true! NForce started out as the best integrated solution on the market, but we haven't had a significant update in that field in over 2 years. I'll have to dig out some roadmaps to see if there's even an NF4 IGP planned.
 
Fodder said:
digitalwanderer said:
Isn't HL2 supposed to support 3Dc? If it looks convincingly better with 3Dc enabled in Half-Life 2, I think that should cover that one. ;)
Indeed, here's hoping.
:thumbup: :beer: :thumbup:
 
And what does "shaders" mean to "Joe Public"? Not a lot really, I'd imagine - better shaders equals more detail? Equally, texture compression doesn't mean much, which is why it's called "3Dc" and not "Normal Map Texture Compression" - all you do is show an image with 3Dc and without, and you can see the one with 3Dc has better detail.
 
whql said:
And what does "shaders" mean to "Joe Public"? Not a lot really, I'd imagine - better shaders equals more detail? Equally, texture compression doesn't mean much, which is why it's called "3Dc" and not "Normal Map Texture Compression" - all you do is show an image with 3Dc and without, and you can see the one with 3Dc has better detail.
The point is that right now when you say shaders they think 'better graphics'. Say texture compression and the best you'll get is a question about zip files. If it was 3Dc v3 and Shader v3, then they'd be on equal footing, but right now ATI have a lot more groundwork to do. Again, I'm not trying to take anything away from 3Dc as a technology, I just think it's a hard thing to plug, perhaps akin to NVIDIA using PCF as a selling point.
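For what it's worth, the thing 3Dc actually does is easy enough to show on paper: a tangent-space normal map only needs two of its three components stored, since the third can be rebuilt from the unit-length constraint, and 3Dc block-compresses just those two channels (X and Y) instead of squeezing a normal through a colour format like DXT1/DXT5. A rough C sketch of the reconstruction step, purely as an illustration rather than ATI's actual decoder:

```c
/* Illustration of the idea behind 3Dc normal-map compression:
 * only X and Y of each unit-length normal are kept (each channel
 * block-compressed on its own), and Z is rebuilt from
 * x^2 + y^2 + z^2 = 1 when the texture is sampled. The block
 * encoding details are omitted here. */
#include <math.h>

typedef struct { float x, y, z; } Normal;

/* Rebuild a full normal from a stored (x, y) pair in [-1, 1]. */
Normal reconstruct_normal(float x, float y)
{
    Normal n;
    n.x = x;
    n.y = y;
    /* Clamp so rounding error can't push x^2 + y^2 above 1. */
    float zz = 1.0f - x * x - y * y;
    n.z = sqrtf(zz > 0.0f ? zz : 0.0f);
    return n;
}
```

Which is also why the side-by-side screenshot is the natural way to sell it: same memory footprint, visibly sharper bump detail, no new buzzword required.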
 
To the average Joe, graphics technology buzzwords mean nothing. All they care about is games. There are many stores here listing the GF FX 5700 as "best for Lineage II" or something similar. For most people playing MMORPGs, "Lineage II" means "integrated graphics are no longer enough." However, it's not really very demanding either.

Of course, Doom 3 is another issue. There are ads with words like "Doom 3 doomed your computer" shown in computer stores. Expensive cards like GeForce 6800 Ultra/GT and X800 Pro sell like hot cakes... I don't know why, although I bought one Ultra too. :p
 
Well, I said it before and I will say it again... this shows you just how GOOD the R300 was back then, if they are still riding on its merits to take a majority of the market share. I agree that NV4x will change that, but it will take time.
 
jb said:
Well, I said it before and I will say it again... this shows you just how GOOD the R300 was back then, if they are still riding on its merits to take a majority of the market share. I agree that NV4x will change that, but it will take time.

I think you've got it backwards. The true glory of R300 (R420) comes now when it's able to run with the NV40. The last few years were more a display of Nvidia's incompetence than ATI's brilliance. But I guess you could twist it either way.
 