State of 3D Editorial

radar1200gs said:
CorwinB said:
As ATi would be well aware, any hardware vendor is free to create their own Cg backend, and nVidia actively encourages this. Of course, ATi has never taken the time to actually do this, being far too busy slagging off Cg instead...

Sure. Nvidia encourages other hardware manufacturers to write back-ends for a language of which Nvidia controls all the specifications. The upside of doing this compared to using HLSL and GLslang? None that I can think of.

To continue your reasoning, you could say that Nvidia was so busy slagging off the R300 technology ("A 256-bit bus is overkill", "You can't build a true next generation part on 0.15 microns", "I personally think 24 bits is the wrong answer") that they forgot to build a competitive part...

nVidia controlling the specifications for the Cg language makes no difference whatsoever. Every backend implementation must successfully compile the Cg program handed to it in the first place or it isn't doing its job properly... What the backend does is allow the hardware vendor to optimise the output for their own architecture and take full advantage of the features found in that architecture.

If ATi is unhappy with how Cg currently runs, they only have themselves to blame. nVidia is under no obligation to make their competitors look any better than they have to...

I think your logic is flawed a bit here. Why would ATI want to waste time making a backend to support a standard that makes Nvidia look good? My point is that there is already a standard, DX9 HLSL, so Nvidia taking it upon themselves to make another one is a step backwards. I don't want to go back to GLIDE...

So again I ask: what is the benefit to ATI of optimizing for Cg when their GPUs already run DX9 perfectly fine?

Just a note: Nvidia made Cg because they didn't feel it necessary to participate in the DX9 discussions. That, my friend, is arrogance beyond anything. Thinking you have so much influence that you can just make your own standard with no one else's input is nuts...

Unless you're MS!!
 
Well, it really does put recent comments from Dr. Kirk in a new light:

"FP16 is good enough for us"
 
andypski said:
digitalwanderer said:
JoshMST said:
Now, my question to you guys is whether we will truly see FP32 implemented across the board in the next generation of products. From reading here and elsewhere, it would seem that NVIDIA should just concentrate on FP32 and drop FP16 altogether. It was nice of MS to add the partial precision hints for NVIDIA, though FP24 is designated as the minimum. But will we see NV go total FP32? Will ATI also decide to go with FP32? If that is the case, will there be much of a performance hit? Or will any kind of hit be offset by the fact that it will be a next generation part running faster?
What does that matter? It isn't going to happen during the productive life of the FX nV3x product, so it's kind of totally irrelevant.
Perhaps he's just curious, and would like to know what people think. I think that's allowed isn't it? He is just asking a question - that question can be treated on its merits rather than trying to tie it into some other subject and label it irrelevant.
I dunno, I felt he was presenting it more as a relevant fact to consider when purchasing a card than as idle speculation... that was my only problem with it.

andypski said:
me said:
I'm glad you're taking the time to address this Josh, that article is a bit of a travesty the way it stands. :(
This being the case, perhaps witch-hunts with respect to people who are trying to improve things could be viewed as a bit counterproductive? Constructive criticism, and allowing people a chance to address the issues that we raise, seems a more appropriate approach.
Yeah, and I'm not meaning to attack/witch-hunt, and I apologize if I come across that way. I tend to be a bit passionate about video cards, hate to see misinformation spread, and have been having knee-jerk reactions to it lately.

My apologies to you, Josh, if I caused any personal offense. My criticism is intended for your article, not you personally, and if I crossed that line I do apologize. :cry:

As an aside - sorry if this seems like an attack on your post, DW, but you just happened to be the most convenient example at the bottom when I chose to reply, and I really mean my comments above as a more general impression of the way things seem to be going on these boards at the moment. I don't particularly like it when new posters turn up from other sites, apparently interested in taking the time to get feedback and information, and find themselves dragged continually over the coals for perceived mistakes.
De nada, don't worry 'bout it. I give out enough, it'd be damned small of me to be thin-skinned about being on the receiving end of some criticism... 'specially when it's valid. :)

EDITED BIT: Fixed some quote formatting. :)
 
nelg said:
Dio, join the marketing department now
Is that a compliment or an insult? :D Actually I've just been watching sireric. He really gets it right - solid engineering facts presented clearly.
 
radar1200gs said:
As ATi would be well aware, any hardware vendor is free to create their own Cg backend, and nVidia actively encourages this. Of course, ATi has never taken the time to actually do this, being far too busy slagging off Cg instead...

Why would ATI support an HLSL controlled by Nvidia? If ATI wanted changes to the language, do you actually think Nvidia would say, "Sure ATI, we will fix that for you right away"? :LOL:

Not only that, but the language is INFERIOR to DX9 HLSL... benchmarks prove it:

http://www.beyond3d.com/forum/viewtopic.php?t=6864&start=20

Let's look at the performance of both shaders with some small tests:

- On a Radeon 9800 Pro your HLSL code is 25% faster than your Cg one.

- On a GeForce FX 5600, your HLSL code is 10% slower than your Cg one.

- On a GeForce FX 5600 with _pp modifier, your HLSL code is 7% faster than your Cg one.

With AA and AF enabled, your HLSL code does even better. It is faster even on a GeForce FX 5600 without the _pp modifier.

Cg seems faster only with GeForce FX and when the bottleneck comes from register usage.

(The Radeon 9800 Pro is 10× faster than the GeForce FX 5600.)


Radeon 9800 Pro HLSL : 125 MPix/s
Radeon 9800 Pro Cg : 100 MPix/s

GeForce FX 5600 HLSL : 11.2 MPix/s
GeForce FX 5600 Cg : 12.4 MPix/s

GeForce FX 5600 HLSL_pp : 14.8 MPix/s
GeForce FX 5600 Cg_pp : 13.8 MPix/s

GeForce FX 5600 HLSL AA/AF : 7.0 MPix/s
GeForce FX 5600 Cg AA/AF : 6.1 MPix/s
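For reference, the "_pp" cases above just mean the same shader math expressed with reduced-precision (half) types, which is what lets the compiler tag its instructions with the _pp modifier. A rough, hypothetical sketch in HLSL, not the actual test shader from that thread:

// Hypothetical ps_2_0-era pixel shaders, just to show the idea.
// 'float' temporaries are full precision (FP24 minimum under DX9);
// switching them to 'half' allows the compiler to emit _pp-tagged
// instructions, which is what helps NV3x-class parts.
sampler2D baseMap;
sampler2D lightMap;

float4 ps_full(float2 uv : TEXCOORD0) : COLOR
{
    float4 base  = tex2D(baseMap, uv);
    float4 light = tex2D(lightMap, uv);
    return base * light * 2.0;    // full-precision multiply and blend
}

half4 ps_partial(half2 uv : TEXCOORD0) : COLOR
{
    half4 base  = tex2D(baseMap, uv);
    half4 light = tex2D(lightMap, uv);
    return base * light * 2.0;    // eligible for the _pp modifier
}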
 
TheTaz said:
JoshMST said:
Now, my question to you guys is whether we will truly see FP32 implemented across the board in the next generation of products. From reading here and elsewhere, it would seem that NVIDIA should just concentrate on FP32 and drop FP16 altogether. It was nice of MS to add the partial precision hints for NVIDIA, though FP24 is designated as the minimum. But will we see NV go total FP32? Will ATI also decide to go with FP32? If that is the case, will there be much of a performance hit? Or will any kind of hit be offset by the fact that it will be a next generation part running faster?

Hi Josh,

Glad you visited again.

I want to make a couple of comments about the FP32 route that you are "touting".

1. You say in your article that PS 3.0 will be in DX9.1, when it's more likely to be in DX 10. Now... whether DX9.1 PS 2.0+ (2.1?) is FP32 or not, I don't know, but I highly doubt it.

2. EVEN IF DX9.1 with PS2.1 uses FP32, there are NO cards currently that can handle it. NONE of the NV3x line is fast enough to DRIVE FP32. That is NOT future-proof. So everyone who bought an FX5200 through an FX5950 is SCREWED, because those cards will still run slowly in FP32.

3. DX9 is going to be around for QUITE a while. The difference between PS2.0 and a possible PS2.1 would be about the same, in terms of eye candy, as between a GeForce 3 using PS 1.1 and a GeForce 4 using PS 1.3. There's not going to be much difference... meaning current FP24 cards have a LONG life ahead of them for DX9, just as my GeForce 3 Ti500 lasted all the way through DX8 and a year into the DX9 era (before DX9 games started coming out). TRULY future-proof. :p

My point is... IF DX9.1 supports FP32 somehow... it's NOT going to make any difference to the CURRENT ATi line... it's not going to help the CURRENT nVidia line... and until both companies can actually drive FP32 at a decent speed, it's not going to help the NEAR FUTURE cards either.

(Now, whether NV40 and R420 can drive FP32 well enough, I don't know.)

Regards,

Taz

Sorry to quote myself... but I have more to say! :p

I don't think DX9.1 will have anything to do with FP32. I think the rumors about DX9.1 helping out nVidia cards are more about DX9.1 supporting "mixed mode" better in the HLSL compiler (closer to Cg). If you want to call supporting "mixed mode" "supporting FP32"... go for it. I was talking about PURE FP32 in my earlier statement... NOT mixed mode. ;)
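To be concrete about the distinction, here is a rough, hypothetical HLSL sketch of the idea (nothing from an actual DX9.1 spec): "mixed mode" means picking a precision per value inside one shader, whereas "pure FP32" means every value at full precision.

// Hypothetical "mixed mode" pixel shader: the texture-coordinate math
// stays at full precision (it needs the range and accuracy), while the
// colour blending runs at half precision (FP16 is plenty for colours).
sampler2D diffuseMap;
sampler2D detailMap;

float4 ps_mixed(float2 uv : TEXCOORD0) : COLOR
{
    float2 detailUV = uv * 8.0;               // full precision for coordinates
    half4  base     = tex2D(diffuseMap, uv);  // colour work at half precision
    half4  detail   = tex2D(detailMap, detailUV);
    return base * detail * 2.0;
}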

Regards,

Taz
 
Sxotty said:
Well, if Halo is anything to go by, then the R300 is not going to run DX9 applications very well at all (of course, Halo isn't really DX9). Once real DX9 games come out, not ones with a few DX9 features thrown on top, the R300 will look pathetic. That is how it always is. However, if you want to say that the R300 schooled the FX line, then you would be correct. :)

By all accounts Halo is not a very good port, and yet it still ran fine for me on my R300. There was some mention on Rage3D about forcing PS2.0 and the screen resolution in the startup shortcut to work around bugs, and after that it was very swift for me, with the occasional slowdown in very big firefights.
 
Sxotty said:
I define a real DX9 game as one that is designed to run on DX9 as a minimum, not one that is designed to run on DX8 with a few DX9 features added on top.

So do you consider HL2 to be a DX9 game? Just because a developer wants to sell as many games as possible and so provides a DX7/8 path that approximates the DX9 path, that doesn't mean their game is not a DX9 game. If there is a DX9 path, then you need a DX9 card as a minimum in order to run that path. In *my* book that makes it a DX9 game.

What you're suggesting is like saying that a DVD with Dolby Digital isn't a "true" DVD if you can play it on your TV with two speakers and get stereo instead of the 5.1 soundtrack.
 
Regarding Halo performance, I think FX owners need to worry about that vs. the R3.xx ;)

[attached image]
 
andypski said:
It's happened at least twice recently - crazipper got a similar reception when he turned up here after his recent article. I would hope that Beyond3D can be a place where enthusiastic people can turn up, debate and learn about 3D graphics without coming under continual attack. I personally post here because this is a site with highly intelligent and informed people, but at the moment knee-jerk criticism seems to come too easily to people, and consistently polite and constructive criticism is becoming much more difficult to find. I'm finding it a bit tiring, to be honest.

While I do agree it would be nice if these guys could visit here first, ask questions, and learn before writing an article, it does not do DW's heart any good to "shoot first and ask questions later".
;)
 
This isn't the first article where Nvidia's FX architecture has been touted as more sophisticated than the R3x0, and I feel that's mainly due to ATI following the principle that actions speak louder than words. While good in principle, unfortunately the press and its relatives tend not to follow the same path, relying instead on the PR departments, and this is where ATI falls short. So, once again, we see the results.
 
nelg said:
It does not do DW's heart any good to "shoot first and ask questions later".
;)
Didn't I mention? My heart is just fine. I went to the cardiologist on Wednesday and he told me it's better than before my heart attack. (I am SO dumb-lucky that way!)
 
Hi Josh. Thanks for coming over here.

I do have an issue with your "Lies, Damn Lies, and Public Relations" section.

To do this, NVIDIA was in a position where it absolutely had to insert questionable code in order to retain mindshare against ATI’s products.

No, they did not have to. We saw with the Det 50.xx series that they can do legal optimizations without cutting corners. Inserting static clip planes in 3DMark and hand-optimizing for popular benchmark demos borders on fraud.

NVIDIA is a publicly owned company, and as such it has certain responsibilities to its shareholders. Part of that responsibility is to retain value in the company by selling as many of its products as possible.

However, part of NV's responsibility is making sure the card works for the users as advertised. If I see something that gets X fps in a game that I like, and then when I get the card it gets X-Y while I play that game, I am going to be downright pissed. I know you're not taking NV's side, but it almost sounds like you're giving the OK for NV to piss all over their customers because that's what nV has to do to stay ahead. Again, I'm not saying you're siding with them. I'm just saying it would have been a more accurate touch if you had laid into them a bit more, to make sure they know how wrong it was and how much grief they have caused all of us.
 
Well, seeing that NVIDIA really only released the 5x.xx series of drivers, it appears that their driver and software technology at the time was nowhere near mature enough to adequately feed the NV3x processors. NVIDIA is in the business of selling chips and cards. If they want to sell them, they need to make them look good to the end user. Sure, they could legally optimize, but their software was just not ready. So in go the clip planes and other nasty tricks. Do I condone such behavior? Of course not. It is lying to the public about overall performance. Sure, 3DMark looks good, but once an app comes out that actually utilizes that technology, those optimizations won't work, and the user gets poor performance. These optimizations were done for the sole purpose of making their cards look good against the competition, and that in turn sells cards.

I didn't like it when they did this, and I certainly didn't appreciate being lied to and misled. However, that is the nature of the business. I think we have all learned a significant lesson here. It is these companies' business to sell chips, and they will try to sell them any way they can. In the end such behavior is counterproductive, and it earns distrust from consumers. But these companies are betting that by the time everything comes out, they will actually have real fixes and optimizations to give to the end user to help ease the pain. NVIDIA has done this to a degree, and they are still selling cards to users. And these users are in turn still buying these products.

It is a nasty business, and there are no excuses for it, but that is the way it is. If ATI finds itself in a similar situation, it would probably do the same thing. The engineers probably wouldn't like what they would be forced to do, but in the end the bigwigs at the top are authorizing their paychecks. When faced with the prospect of losing your job for not doing what management tells you, it is awfully easy to do questionable things. However, it is not like management is telling the engineers to commit genocide, so it is a lot easier to accept a few driver optimizations here and there and live with it.
 
JoshMST said:
It is a nasty business, and there are no excuses for it, but that is the way it is. If ATI finds itself in a similar situation, it would probably do the same thing.
Well, except for the fact that when ATi was in a similar situation they did nothing of the sort. :rolleyes:

It's these kinds of generalizations that I take issue with. It's not a case of "well, everybody does it when they're behind, so it's OK"; there has NEVER been this level of cheating by a graphics company before to make a product look better than it is!

nVidia took it to a whole new level, honest.
 
JoshMST said:
Well, seeing that NVIDIA really only released the 5x.xx series of drivers, it appears that their driver and software technology at the time was nowhere near mature enough to adequately feed the NV3x processors. NVIDIA is in the business of selling chips and cards. If they want to sell them, they need to make them look good to the end user. Sure, they could legally optimize, but their software was just not ready. So in go the clip planes and other nasty tricks. Do I condone such behavior? Of course not. It is lying to the public about overall performance. Sure, 3DMark looks good, but once an app comes out that actually utilizes that technology, those optimizations won't work, and the user gets poor performance. These optimizations were done for the sole purpose of making their cards look good against the competition, and that in turn sells cards.

I didn't like it when they did this, and I certainly didn't appreciate being lied to and misled. However, that is the nature of the business. I think we have all learned a significant lesson here. It is these companies' business to sell chips, and they will try to sell them any way they can. In the end such behavior is counterproductive, and it earns distrust from consumers. But these companies are betting that by the time everything comes out, they will actually have real fixes and optimizations to give to the end user to help ease the pain. NVIDIA has done this to a degree, and they are still selling cards to users. And these users are in turn still buying these products.

It is a nasty business, and there are no excuses for it, but that is the way it is. If ATI finds itself in a similar situation, it would probably do the same thing. The engineers probably wouldn't like what they would be forced to do, but in the end the bigwigs at the top are authorizing their paychecks. When faced with the prospect of losing your job for not doing what management tells you, it is awfully easy to do questionable things. However, it is not like management is telling the engineers to commit genocide, so it is a lot easier to accept a few driver optimizations here and there and live with it.

Uhm... you can't say "it's not OK, but it is OK".

When you say "I don't condone it", you are saying "It's not OK".
When you say "That's the nature of the business", you are saying "It's OK".

This is where people are frustrated with your article.

People DON'T have to accept it as "the nature of the business". They vote with their wallets. In the long run, nVidia made a TERRIBLE decision in choosing to cheat. Yes... they still sell their cards via their "spin"... but people who have jumped ship over to ATi have little reason to go back to nVidia after this mess.

If anything "should be learned"... it's NOT that consumers should "accept this as the nature of the business"... it's that the nature of the business had BETTER CHANGE, and QUICK.

Regards,

Taz
 
digitalwanderer said:
JoshMST said:
It is a nasty business, and there are no excuses for it, but that is the way it is. If ATI finds itself in a similar situation, it would probably do the same thing.
Well, except for the fact that when ATi was in a similar situation they did nothing of the sort. :rolleyes:

It's these kinds of generalizations that I take issue with. It's not a case of "well, everybody does it when they're behind, so it's OK"; there has NEVER been this level of cheating by a graphics company before to make a product look better than it is!

nVidia took it to a whole new level, honest.

That is true. People ALWAYS bring up Quack, which affected 5 textures and was fixed in the following driver with no IQ loss and better performance.
ATI never inserted clip planes, never bullied reviewers with threats of withholding hardware, and never used application-detected filtering modes that override control panel settings; Unwinder exposed over 180 application detections in Nvidia's drivers.
I've never seen Matrox, PowerVR, or any other IHV mislead its customers so much. The icing on the cake was the "free FSAA" PR on their webpage for four months after the 5800 launch, while review websites were showing 50%+ performance hits.

Now the market is flooded with inferior, sub-par DX9 cards, especially at the low end, all selling off a brand name.
 
Doomtrooper said:
digitalwanderer said:
JoshMST said:
It is a nasty business, and there are no excuses for it, but that is the way it is. If ATI finds itself in a similar situation, it would probably do the same thing.
Well, except for the fact that when ATi was in a similar situation they did nothing of the sort. :rolleyes:

It's these kinds of generalizations that I take issue with. It's not a case of "well, everybody does it when they're behind, so it's OK"; there has NEVER been this level of cheating by a graphics company before to make a product look better than it is!

nVidia took it to a whole new level, honest.

That is true. People ALWAYS bring up Quack, which affected 5 textures and was fixed in the following driver with no IQ loss and better performance.
ATI never inserted clip planes, never bullied reviewers with threats of withholding hardware, and never used application-detected filtering modes that override control panel settings; Unwinder exposed over 180 application detections in Nvidia's drivers.
I've never seen Matrox, PowerVR, or any other IHV mislead its customers so much. The icing on the cake was the "free FSAA" PR on their webpage for four months after the 5800 launch, while review websites were showing 50%+ performance hits.

Now the market is flooded with inferior, sub-par DX9 cards, especially at the low end, all selling off a brand name.
You forgot the "pretty much destroying FM's credibility with their bullying", but I agree with what you've got. ;)
 
I am taking the same stance on this as I do on the human rights violations in China and the treatment of their prisoners. Do I condone the Chinese actions? No. Will they continue to treat their prisoners in this way even if I write an article exposing it? Yes. Will they change just because I want them to? No.

NVIDIA will not change the way they do business even though it is well known that they cheated. Yes, NVIDIA took this a step beyond what ATI did in the past with its 8500. ATI corrected their "optimizations", as NVIDIA is correcting theirs now. All I am saying is that nobody has their hands clean. SiS takes massive shortcuts with the optimizations in their 3D products, but nobody talks about that (mostly because nobody really cares). It is the way it is. I am not trying to make excuses, and not trying to say it is OK; I am just saying that is the way it works. Do we have to like it? No. Do we have to deal with it? Yes. Companies have done this in the past, and they will do it again in the future. If they think they can slip this by users and reviewers, then they will.

By all means, vote with your wallets. But if there is a quick and dirty solution to a problem that a manufacturer is having, they will take it until a proper solution can be made. That is the only point I am trying to make.
 