
Contracts with big players like Dell, Compaq, and Gateway are not really made on performance, but on cost, support, the relationship between the companies, etc. If you believe that all a company has to do is release the #1 high-end performing card, and they will instantly displace all those TNT Vantas and GF2 MXs, you're sadly mistaken. ATI is not going to dominate the high end for long enough, and by enough of a margin, to make a real difference from a pure technology standpoint.

3dfx lost out more because of their bad relationships with OEMs and poor management than they did because NVidia owned the high end. SiS doesn't do too badly in the system integration market, and they've never owned the performance crown.

If for some reason, ATI had a 2x performance lead for an extended period of time (like 1 year), NVidia might start to have their relationships eroded. But the GFFX won't be beaten by a significant margin (if at all), and I doubt the R350 will do much to change it. ATI and NVidia are going to be "on par" with each other for the foreseeable future.
 
sas_simon said:
Althornin, the technology that is coming out today is about doing more for less. Making things more efficient. Imagine turning on 4xFSAA and 8x anisotropic filtering with absolutely no performance hit.
Sas_simon, this would simply mean that the card is inefficient when NOT doing 4xFSAA and 8x anisotropic, or that it is CPU limited.
Nothing is for free in 3D. And I also fail to see how what you have said is at all relevant.
What is your point?
Where did I say "efficiency is bad"? My only point is that more "power" is needed until we can run at resolutions and quality where aliasing (texture or edge) ceases to be a problem.
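
To put rough numbers on it (a toy bottleneck model with made-up figures, not measurements from any real card):

[code]
# Python sketch: frame rate is set by whichever of CPU or GPU is slower.
# All numbers are invented purely for illustration.

def fps(cpu_ms, gpu_ms):
    # CPU and GPU work overlap, so the slower unit limits the frame rate
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 20.0          # CPU needs 20 ms to prepare each frame
gpu_no_aa_ms = 4.0     # GPU renders the frame in 4 ms with no AA
gpu_4x_ssaa_ms = 16.0  # 4x supersampling roughly quadruples GPU work

print(fps(cpu_ms, gpu_no_aa_ms))    # 50.0 fps
print(fps(cpu_ms, gpu_4x_ssaa_ms))  # 50.0 fps -- the AA looks "free"
[/code]

The AA only looks "free" because the GPU was sitting 80% idle to begin with; those frames were CPU limited, which is exactly my point.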
 
demonic said:
sas_simon said:
Althornin, the technology that is coming out today is about doing more for less. Making things more efficient. Imagine turning on 4xFSAA and 8x anisotropic filtering with absolutely no performance hit.

OK, I'll bite... GFFX?

If you can tell me that the GeForce FX has 4xFSAA and anisotropic filtering with no performance hit at all, sure... :LOL:

I was talking about future cards and future architectures, i.e. the NV40, R400 and beyond.

The cards today aren't getting more powerful to churn out more and more frames in Quake 3; they are getting more and more programmable, to churn out whatever the developers like in those frames, and powerful enough to ensure that everything the programmers want gets displayed at 40+ frames per second.
 
Althornin said:
sas_simon said:
Althornin, the technology that is coming out today is about doing more for less. Making things more efficient. Imagine turning on 4xFSAA and 8x anisotropic filtering with absolutely no performance hit.
Sas_simon, this would simply mean that the card is inefficient when NOT doing 4xFSAA and 8x anisotropic, or that it is CPU limited.
Nothing is for free in 3D. And I also fail to see how what you have said is at all relevant.
What is your point?
Where did I say "efficiency is bad"? My only point is that more "power" is needed until we can run at resolutions and quality where aliasing (texture or edge) ceases to be a problem.


Would you say that the Kyro and Kyro II were very inefficient? The GeForce FX is also rumoured to have 4xFSAA with almost no performance hit; would you say that that is very inefficient?

If a graphics card can churn out hundreds of frames per second in Doom 3 but makes it look like crap and is next to impossible for developers to write for, do you think people will buy it?

Future generations of cards are going to be more programmable, with long vertex shaders and pixel shaders to pull off beautiful particle effects, environmental bump mapping, etc. Developers really don't care how many hundreds of frames per second their game gets at 1600x1200x128bpp with 4xFSAA and 8x anisotropic filtering; all developers want is to be able to be creative, and to ensure that their creative masterpieces get displayed at a workable framerate, e.g. 60 frames per second.

You didn't say that efficiency is bad; you did, however, say that more brute force is needed. Yes it is, but so is programmability. What you must realise, though, is that brute force is not the be-all and end-all of modern graphics; it was 2 years ago, but that is far in the past compared to the cards of today.

-----------------------------------------

What I am saying is not just that cards need to be more powerful, but that they need programmability to go along with that power as well. The problem is, you need to be able to strike a fine balance between the two.
 
DemoCoder said:
Contracts with big players like Dell, Compaq, and Gateway are not really made on performance, but on cost, support, the relationship between the companies, etc.

The "etc." includes brand recognition, which does have a bit to do with performance as has already been proposed.

Is it nVidia that Dell, etc., deals with, or card manufacturers? Maybe some card manufacturers that switched to ATI? If they do indeed have a relationship with a card maker who did this, wouldn't they (Dell, etc) favor continuing it based on the factors you stated? If so, isn't the switch of such a card maker to one or the other quite a bit due to performance leadership, as well as their relationship to the chip maker?

I think both chip makers, either directly (ATI at present, presumably) or through card makers (both ATI and nVidia), are likely fairly even, as far as a company such as Dell is concerned, in other factors, and perceived performance leadership therefore plays a significant role in which of them Dell and others are offering in "primary" configurations.

If you believe that all a company has to do is release the #1 high-end performing card, and they will instantly displace all those TNT Vantas and GF2 MXs, you're sadly mistaken.

Did someone state that? That is a silly statement, but as neither ATI nor nVidia is "just" doing anything of the sort I don't know that it is relevant. Which is why regaining performance leadership with the GeForce FX is likely important to nVidia (and likely why they stuck with the name "GeForce", as others have stated).

ATI is not going to dominate the high end for long enough, and by enough of a margin, to make a real difference from a pure technology standpoint.

Is "ATI is not going to dominate the high end from a pure technology standpoint long enough and by enough of a margin to make a real difference" equivalent to what you mean? The association of "pure technology standpoint" with "dominate the high end" is the only thing that makes sense to me given the discussion.

If this is indeed what you mean, I'd ask why you consider it likely. It seems to run directly counter to the trend since the 9700 was completed, both in card maker choices earlier in the year (which seem likely to have been based on some information about the 9700, as I recall some reference to future products by the card maker(s) who "switched" at the time), and in computer maker OEM deals more recently.

3dfx lost out more because of their bad relationships with OEMs and poor management than they did because NVidia owned the high end. SiS doesn't do too badly in the system integration market, and they've never owned the performance crown.

Well, it remains to be seen how well they do as competition from graphics card makers in that market increases. I'll note that they (SiS) seem to have put some emphasis on increasing the performance of their integrated graphics and graphics cards, and I think this illustrates that performance matters significantly even for them.
Note also that I tend to agree with the premise that nVidia will not follow 3dfx anytime in the foreseeable future...it just doesn't seem worth addressing, as plenty of answers have already been given.

If for some reason, ATI had a 2x performance lead for an extended period of time (like 1 year), NVidia might start to have their relationships eroded.

!?!? It would take 1 year before nVidia might start to have their relationships eroded? I don't think that really makes sense.

Hmm...how about it might take 1 year to be able to tell if they've had their relationships eroded (and the counter would already have started on that year, I'd say, depending on how the GeForce FX plays out). This strikes me as quite a reasonable expectation.

As for "2x performance lead", I really don't know why you use that figure. Does only a minimum 2x performance lead count? :eek:

But the GFFX won't be beaten by a significant margin (if at all), and I doubt the R350 will do much to change it. ATI and NVidia are going to be "on par" with each other for the foreseeable future.

Well, you started by saying performance doesn't matter, and ended by saying they are going to be on par with regard to performance. I disagree with one, but tend to think the other is quite reasonable, in case my post isn't clear. That doesn't seem consistent to me, however, so maybe the problem is that I took your meaning the wrong way somewhere.
 
Demalion already said much of what I would have in response to DemoCoder's last post, but I'd like to expand a bit on a couple of points:

DC said:
Contracts with big players like Dell, Compaq, and Gateway are not really made on performance, but on cost, support, the relationship between the companies, etc.

I think we all certainly agree that having the "performance crown" alone is not what's going to make you "the dominant" player in the industry. I'd say we all also agree that the things you mention are important. (Though they all have complex interrelationships....)

What I don't understand is that you speak as if ATI is just some newcomer on the block, with little or none of these established "relationships" with these companies, out-of-control pricing, etc. - as if all ATI has going for it is the performance crown of the flagship product.

It's actually quite the opposite, IMO. The biggest factor is that ATI has a long history of relationships with the big OEMs, both in discrete graphics solutions and in notebook solutions. (I'm talking system builders here, like you mentioned.) In fact, the one thing ATI has been missing for the past few years is relative performance. That is precisely what has kept ATI from regaining the "market leader" status they once had. ATI and nVidia basically went at the OEMs from two different perspectives:

nVidia: Basically started being serious back in the Riva 128 days. "Best" 2D/3D integrated products, but zero OEM relationships. They used their superior products to establish and build the relationships.

ATI: Started way back in the DOS and Windows 3.x days. They had highly established relationships with OEMs by the time nVidia came on the scene, which is relatively recent. ATI used those relationships to help sell their products.

At this point, I would consider ATI and nVidia on pretty much equal footing: both have well-established relationships, and both have very competitive technology / products. OEMs will be the winners here, because they will use the competition to put "the squeeze" on both ATI and nVidia for lower prices.

With two well-established players like ATI and nVidia, oddly enough it DOES come down to which product is actually better...and that's what the OEMs go for, because that's what their customers will be willing to pay for.

And it's not as if "all ATI has going for it" is the 9700 Pro. At this point, I would say that, product-wise, all nVidia has going for it is the nForce2 chipset. And that's limited to performance Athlon solutions. Product-wise, ATI has the clear lead in mobile discrete, mobile integrated, Intel chipsets (by default), and practically every segment of add-in graphics cards.

In short, I would not say or imply that "all ATI has going for it" is the #1 performance high-end card. IMO, it's more accurate to say that "all nVidia has going for it" is a slight edge in OEM relationships.

And in all honesty, I would be very hesitant to say that either company is in the overall better position going forward. On one hand, technology leadership can change on a dime. On the other hand, OEMs will not forgo better technology in favor of "established relationships" for very long.
 
Most configurable options I've seen at several companies whilst speccing out a PC for my boss have 9000s, GF4 Tis and 9700 Pros as options. Not many GF4 MXs at all. It looks like most configurable options will give you a complete choice, whilst most pre-built high-end gaming machines have 9700 Pros and most mid-priced systems still have GF4s, as the 9500 range isn't really out in numbers yet.
 
Oh, and here's a timely bit of info; funnily enough, they mention DELL specifically.

http://www.bjorn3d.com/column.php?tid=20

I took some time to do some research on graphic card makers this week and some PC makers. The add-in card guys are not happy with nVIDIA due to the paper launch of GeForce FX. They state this has all but dried up demand for the Ti4600, and even the Ti4200 sales are starting to fall off. On the PC maker front, Dell does not even offer a Ti based card in their machines anymore. They offer a 440MX, and the rest are ATi based cards. I even called a sales rep at Dell who is a friend of mine, and he said sorry, no more Ti cards from nVIDIA. I asked him did he expect us to have them again and was told no.

I don't agree with everything that is said in that article (R400 due in spring?), and I'd say lack of Ti4600 demand is more due to the introduction of the 9700 than the paper launch of the FX. But the point still stands..."relationships" can be gained and lost just like technology leadership can be. What many people don't see is that "relationships" and "technology" are very related. OEMs seek several criteria from their sellers that they establish relationships with, not the least of which is the seller having the product / technology that is in demand. It's quite remarkable what "having the product that consumers want" does to solidify such relationships. ;)
 
misae said:
As a real-world test, try putting a GeForce 4 Ti or Radeon 9700 in a K6-2 system with PC100 RAM and AGP 2x.

Hardly very meaningful. With a K6-2+R9700 setup you will generally find that since games are always CPU-limited on such a setup, you get essentially the same framerate at all resolutions, straight from 512x384x16bit to 1600x1200x32bit, even with AA/aniso enabled for the latter. There just is no CPU performance cost to enabling AA/aniso/high resolutions, only GPU performance cost. If you then upgrade to a 3GHz P4, you will find that the performance at 512x384x16bit increases by a huge amount, while the performance of 1600x1200x32+AA/aniso increases much less.
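
A quick sketch of why (toy model, invented numbers):

[code]
# Python sketch of a CPU-limited system: GPU time scales with pixel
# count, while CPU time does not care about resolution. Figures invented.

CPU_MS = 40.0            # slow CPU: 40 ms of game/driver work per frame
GPU_MS_PER_MPIXEL = 3.0  # fast GPU: ~3 ms per million pixels drawn

for w, h in [(512, 384), (1024, 768), (1600, 1200)]:
    gpu_ms = (w * h / 1e6) * GPU_MS_PER_MPIXEL
    print(w, "x", h, "->", round(1000.0 / max(CPU_MS, gpu_ms)), "fps")

# Every resolution prints ~25 fps: the GPU never becomes the bottleneck,
# so higher resolutions and AA/aniso cost nothing. With a fast CPU
# (small CPU_MS) the GPU becomes the limit instead, and the cheap
# low-resolution modes speed up far more than 1600x1200 with AA/aniso.
[/code]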
 
sas_simon said:
Would you say that the Kyro and Kyro II were very inefficient?
Did they have FREE 4xFSAA and 8x aniso?
Oh wait, no they didn't, so what's your point again?

You didn't say that efficiency is bad; you did, however, say that more brute force is needed. Yes it is, but so is programmability. What you must realise, though, is that brute force is not the be-all and end-all of modern graphics; it was 2 years ago, but that is far in the past compared to the cards of today.
So basically, you agree more power is needed.
So what's your point again?
Please cease the condescending speech. I have been hanging around here for a long time, and I am no fool. I don't need you to come along and tell me "how it is" - I know. Don't assume that I "need to realise" brute force isn't the be-all and end-all. Where did I say it was? My only point (and one that was, and is, still valid) is that more power (oh yeah, power != brute force; note that "brute force" is YOUR wording, not mine) is needed, as current cards cannot render games at such high quality (at good speeds) that aliasing is not apparent.
 
Sorry, I was under the impression that the Kyro II did have free anti-aliasing.

And you seem to be saying that graphics cards will need to carry on getting more and more powerful until you can play at 1600x1200x128bpp with 4xFSAA and 8x anisotropic filtering. Play what, exactly? The newest games? I don't think so. I don't see the point of your post; not even the GeForce FX has enough power to do what you are saying in the newest games, the NV40 won't, don't bet on the NV50, not even the NV60.

And who do you think you are? I don't care how long you have been here or how many posts you have. Do you know who I am? Do you know my background? Also, when did I call you a fool?

Please also ignore my "brute force" term and replace it with the word "power". I do apologise for my incorrect wording.
 
KYRO's AA is supersampling, so it is not fillrate-free; the architecture, though, enables it to be bandwidth-free. Combine MSAA and a tiler and you have totally free FSAA (well, the only overhead being that your pointer list size increases).
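
Back-of-the-envelope version, in case it helps (toy numbers, just to show where the costs land):

[code]
# Python sketch: where the 4x AA costs land, per frame at 1024x768.
# Supersampling shades every sub-sample; multisampling shades once per
# pixel and merely replicates coverage/Z. On a tiler the sub-sample
# colour/Z lives in on-chip tile memory, so neither scheme adds external
# framebuffer traffic for the extra samples.

PIXELS = 1024 * 768
SAMPLES = 4

ssaa_shading = PIXELS * SAMPLES  # 4x the fillrate cost (the KYRO way)
msaa_shading = PIXELS            # same fillrate cost as no AA at all

print(ssaa_shading / msaa_shading)  # 4.0 -- why KYRO's SSAA is not free

# The remaining MSAA-on-a-tiler overhead is just bookkeeping (bigger
# on-chip sample storage and pointer lists), hence "totally free" FSAA.
[/code]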
 
DaveBaumann said:
KYRO's AA is supersampling, so it is not fillrate-free; the architecture, though, enables it to be bandwidth-free. Combine MSAA and a tiler and you have totally free FSAA (well, the only overhead being that your pointer list size increases).

Hmm. Has anybody done this yet? What about the never-released Oak Technology WARP5? I remember seeing it running at E3 '97 and it was touted as having totally free FSAA. I can't remember much about it.

I got a WARP5 beta board. The bummer, though, is that I believe it was one of the few that got ruined when my storage unit flooded. That reminds me, I think I'm going to auction off all my graphics boards (working and flooded) on eBay. Probably sell them all together. Anybody interested?

Tommy McClain
 
Has anybody done this yet?

All GigaPixel's designs were built as MSAA tilers. Although none were released they did have a GPI sample running and people have at least seen and written about that.
 
nVidia isn't going to die from a few months without the performance crown. Indeed, if the current situation continues, with ATi video cards beating equally priced nVidia video cards at every price range except the Ti4200's, nVidia will start to suffer. However, a huge, well-established hardware company does not die overnight. 3dfx had been ailing since the Voodoo3; nVidia beat them not only in technology but in marketing, OEM relationships, and management. In contrast, nVidia currently suffers from a -slight- technology lag, but still has better marketing than ATi, and OEM relationships on par with ATi's. It is nowhere near as bad a situation as the one 3dfx was in.

Keep in mind that from the Rage to the Radeon 8500, ATi's video cards always slightly lagged the performance of the market leader (originally 3dfx, later nVidia). But ATi's still here - living proof that you can survive for many years even without the performance crown. Even if nVidia falls into the pattern of lagging half a product cycle behind ATi, they won't die - they'll only diminish to being the underdog, like ATi once was. Right now, the Radeon 9700 has an insurmountable performance advantage over the GeForce4 - but the same could be said for the GeForce3 over the original Radeon before the release of the Radeon 8500. GeForce3 vs. Radeon was unfair because the GF3 was a whole generation ahead - the real battle was fought between the GF3, R200, and GF4. For the same reason, you can't compare the GF4 to the R300... The real battle will come with the R350 and NV30.

It's likely that the NV30 will lag the performance of the R350 by a fair amount - but just as the Radeon 8500's performance disadvantage against the GeForce4 didn't kill ATi, a moderate performance disadvantage of the NV30 against the R350 is unlikely to kill nVidia. Like ATi with the earlier Radeons against the GeForces, nVidia might end up a few months behind their opponent's technology for a year or two - but that will not kill them any more than the GeForce3 killed ATi.
 
I think most here agree that nVidia isn't going to "die" anytime soon. ;) (I'll have to go back and check...but did anyone actually suggest that in this thread?)

But IMO, a significant change in market "leadership" can happen relatively quickly...within a year or so, if ATI maintains technology leadership and continues to execute their roadmap with consistency: significant brand-new cores every summer. In other words, I can easily see ATI becoming the "market leader" by early 2004 if ATI ships the R400 on time in Summer 2003 and the NV40 fails to ship in a similar time frame.

3dfx died because of a combination of a lack of competitive products (due to a lack of timely execution) and poor management decisions. Not one or the other. Right now, nVidia is having an execution problem. It may be temporary, or it may continue for a while. If nVidia continues to fall behind ATI in new product shipment schedules, then you'll start to see the "OEM relationships" deteriorate rapidly in favor of the more reliable vendor.

Again, I'm not predicting the "death knell" of nVidia at all...just the reality that their position of "market leader" is not nearly as solidified as some people may think....
 
DaveBaumann said:
All GigaPixel's designs were built as MSAA tilers. Although none were released they did have a GPI sample running and people have at least seen and written about that.

Only GigaPixel? Interesting. I wonder if we'll ever see that tech included in hardware from NVIDIA.

As for Oak WARP5, it looks like it used sort-independent anti-aliasing. Can somebody explain this with comparisons to the current most commonly used anti-aliasing techniques? I'm pretty well lost with most of the techniques out there.

[EDIT]
I found the press release from Oak Technology that announced the WARP5. It says it's a region-based renderer. Here's a snip from the PR:

This technique, said Oak Technology, enables the image's Z-values and anti-aliasing information to be stored on the chip at the sub-pixel level. Therefore, no external storage is required, and memory bandwidth as well as capacity requirements are reduced, eliminating bottlenecks often associated with conventional 3-D design solutions. WARP 5 delivers the processing power of more than 50 million pixels/second, according to Oak Technology.

50Mpixels/s WOW! ;)
[/EDIT]
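
If I understand the region-based approach right, the bandwidth it saves is easy to ballpark (rough arithmetic; the sample count and Z width below are my guesses, since the PR doesn't give them):

[code]
# Python sketch: external Z traffic a region-based (tiled) renderer can
# keep on chip. SUBSAMPLES and Z_BYTES are assumptions, not Oak's specs.

WIDTH, HEIGHT = 800, 600  # a typical resolution for the era
SUBSAMPLES = 4            # assumed sub-pixel samples for anti-aliasing
Z_BYTES = 4               # assumed 32-bit Z value per sample

z_bytes_per_frame = WIDTH * HEIGHT * SUBSAMPLES * Z_BYTES
print(z_bytes_per_frame / 2**20, "MB of sub-pixel Z kept off the bus")

# ~7.3 MB per frame; at 30 fps that is over 200 MB/s of Z traffic (more
# in practice, since Z is read and rewritten many times per pixel) that
# a conventional renderer would have to push through external memory.
[/code]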

Tommy McClain
 
The problem for nVidia isn't that ATI is going to crush them, but that they lost their opportunity to crush ATI. If they had been able to announce the GeForce FX in July, the 9700 and its successor boards would have had a much smaller impact on the market. Game developers would have been more likely to keep following the trend of developing specifically or primarily for nVidia cards; the resulting compatibility problems would have further diminished the appeal of ATI cards vis-à-vis nVidia.

Instead, we have "The way it was meant to be played" games that run best on ATI cards. A lot of high-end gamers bought a lot of 9700s, and probably a lot of game developers as well. Compatibility problems with ATI cards that might have escaped developers' notice before are being found now. Future games will run better on ATI cards out of the box.

nVidia's stumble (and ATI's good execution) allows ATI to continue to play as a gamer-coveted GPU vendor.
 
arjan de lumens said:
misae said:
As a real-world test, try putting a GeForce 4 Ti or Radeon 9700 in a K6-2 system with PC100 RAM and AGP 2x.

Hardly very meaningful. With a K6-2+R9700 setup you will generally find that since games are always CPU-limited on such a setup, you get essentially the same framerate at all resolutions, straight from 512x384x16bit to 1600x1200x32bit, even with AA/aniso enabled for the latter. There just is no CPU performance cost to enabling AA/aniso/high resolutions, only GPU performance cost. If you then upgrade to a 3GHz P4, you will find that the performance at 512x384x16bit increases by a huge amount, while the performance of 1600x1200x32+AA/aniso increases much less.

I think what you will in fact find is that the CPU can't feed the GPU fast enough no matter what mode you are using, be it 512x384 or 1600x1200. Applying AA and AF will have little bearing on the results, as you mention.

However, my argument is that a powerful GPU needs a powerful system feeding it all the time. I assure you that on a system like a K6-2, a GeForce 2 Ultra would be significantly faster in most situations than a GF4++ card would be.

Like I said... a real-world test needs to be real world (I have done some testing with regard to this, in fact).

Yes, you are right that it won't cost the CPU anything to enable AA or AF, but it will cost the GPU; however, I am saying that your CPU and memory subsystem will be so choked in the first place that your new supercard will perform slower than a Morris Minor packed full of 20 Morris dancers ;)
 