ATi and Futuremark, 3Dc

Scali

Since Patric Ojala has given some insight into Futuremark's cooperation with ATi, I'd like to set something straight.

Patric Ojala said:
Scali's original post in this already fairly long thread has many good points, but it is not true that ATI would be quiet about using DXT5 for compressing normal maps. ATI's guys actually offered the most valuable help implementing DXT5 normal map compression.

This is what I meant. andypski focused on how ATi works with developers, and indeed, they work closely with them to support DXT5 as well as 3Dc.
But Richard Huddy's statement said that 3Dc would have been better for 3DMark05.
That brings up the question of why ATi helped Futuremark implement DXT5 compression instead of 3Dc.
Then Richard Huddy and andypski claim that ATi does not have enough information, and that they just *thought* it may have been better to use 3Dc.
Well, that also seems strange to me: if ATi actually offered help implementing DXT5 compression, they would have known how their DXT5 compression works, right?

So this was what I was annoyed about. ATi saying one thing to developers, and saying another thing to the press.
andypski seemed to take it the wrong way, and got upset, but I was not aiming at him, but at Richard Huddy. It's the press-side that I don't agree with, not the developer-side.

If Richard Huddy were to make a new, more nuanced statement, I think we can all still be friends.
 
Uhm, why would a developer need help with DXT5? You'd think people smart enough to create something like 3DMark05 would be able to compress stuff on their own.. lol
 
But Richard Huddy's statement said that 3Dc would have been better for 3DMark05.
That brings up the question of why ATi helped Futuremark implement DXT5 compression instead of 3Dc.
Because they are looking to give gamers the best quality under all circumstances using their hardware.

Unlike other IHVs, they don't abandon old user bases when a new card comes out. They sold a ton of R3x0 cores that can't do 3Dc, yet DXT5 can still be used, even if it's a little worse than 3Dc. Of course, from here on out all their new cores will have 3Dc, and in the future it will be wise to move away from DXT5 for certain things.


So this was what I was annoyed about. ATi saying one thing to developers, and saying another thing to the press.
heh.

andypski seemed to take it the wrong way, and got upset, but I was not aiming at him, but at Richard Huddy. It's the press-side that I don't agree with, not the developer-side.
this should be in the general forum.
 
Scali said:
Since Patric Ojala has given some insight into Futuremark's cooperation with ATi, I'd like to set something straight.

Patric Ojala said:
Scali's original post in this already fairly long thread has many good points, but it is not true that ATI would be quiet about using DXT5 for compressing normal maps. ATI's guys actually offered the most valuable help implementing DXT5 normal map compression.

This is what I meant. andypski focused on how ATi works with developers, and indeed, they work closely with them to support DXT5 as well as 3Dc.
But Richard Huddy's statement said that 3Dc would have been better for 3DMark05.
That brings up the question of why ATi helped Futuremark implement DXT5 compression instead of 3Dc.

Because they respected a developer's wishes and reasons?

"ATI : John Carmack, why don't you use 3Dc for Doom3?

JC : Because it means all my artists would be pissed at me if I do. We started on Doom3 years ago. Instead of pestering me to use 3Dc, will you help me with DXT5?

ATI : No John, we won't help you. "

Think that will happen?

So this was what I was annoyed about. ATi saying one thing to developers, and saying another thing to the press.
andypski seemed to take it the wrong way, and got upset, but I was not aiming at him, but at Richard Huddy. It's the press-side that I don't agree with, not the developer-side.
Perhaps you should email Richard? Andy is not in DevRel. He's most certainly not Richard (I think Andy would like this :) ).

If Richard Huddy were to make a new, more nuanced statement
He probably didn't know a person called Scali exists :)

, I think we can all still be friends.
I don't think we're enemies with each other. Discussions can get heated, but that should be all there is to it. The Internet really is cool, but it is not the best place for debates.
 
Scali, you've beaten this horse to death. 3Dc is a GOOD thing... it doesn't require a lot of developer input, it will be part of Xbox 2, and it has game dev support with Serious Sam 2, HL2, etc...

The one thing I noticed right away since you joined this forum: you know your stuff... but so do many others here. It seems you feel you need to impress everyone here. There is no better forum out there on the web that allows you to interact with other devs and IHVs, and I would say 90% of the time your approach stinks with a big fat ego. Your opinion on this subject is not shared by this forum; 3Dc should have been used for 3DMark05, but I personally don't have much use for the benchmark after the FX fiasco, how it was handled and how it was managed.
Now that Nvidia has had a group hug with Futuremark, it is back to... how much money you give them (Futuremark) to exploit your hardware.

As always, it is about $$$$; that is a fact.
 
Reverend said:
OMG, I almost agreed with DT!! :oops:

That was like his best post ever.

Quick, check the IPs, it can't be him :)

JK. Doom, good to see you again. Hope all is well.
 
Uhm, I actually agree with Futuremark on using DXT5: until all cards have 3Dc, you'd be doubling all your normal maps in an already hefty-sized package for download. Sounds like a nice little compression format for sure, and I'm not sure what the huge deal is about it, but if only one vendor can support it, then it's a lot of extra content that has to be justified. Not so bad in a game that installs off a CD, though..
 
Himself said:
Uhm, I actually agree with Futuremark on using DXT5: until all cards have 3Dc, you'd be doubling all your normal maps in an already hefty-sized package for download. Sounds like a nice little compression format for sure, and I'm not sure what the huge deal is about it, but if only one vendor can support it, then it's a lot of extra content that has to be justified. Not so bad in a game that installs off a CD, though..
Well, there are ways to get around this. A developer could store the textures in a raw, uncompressed format, R16G16, and convert them during initialization or installation. It does take a bit of extra work and computation time, but it shows that it isn't entirely necessary to have two sets of textures. Apparently, Futuremark did consider this and decided that it wasn't worthwhile, though. . .
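To make that concrete, here is a minimal sketch (D3D9, C++) of the kind of format check such a converter could start from, assuming the usual 'ATI2' FOURCC for 3Dc; the function name is purely illustrative, not anything Futuremark or ATi actually ships:

```cpp
#include <d3d9.h>

// 3Dc has no standard D3DFORMAT enum value; drivers expose it as a FOURCC.
const D3DFORMAT D3DFMT_ATI2 = (D3DFORMAT)MAKEFOURCC('A', 'T', 'I', '2');

// Pick a normal-map format at initialization: 3Dc when the driver reports it,
// otherwise the DXT5 fallback (X in alpha, Y in green) that older cards handle.
D3DFORMAT ChooseNormalMapFormat(IDirect3D9* d3d, UINT adapter, D3DFORMAT displayFormat)
{
    if (SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, displayFormat,
                                         0, D3DRTYPE_TEXTURE, D3DFMT_ATI2)))
        return D3DFMT_ATI2;
    return D3DFMT_DXT5;
}
```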
 
Ostsol said:
Well, there are ways to get around this. A developer could store the textures in a raw, uncompressed format, R16G16, and convert them during initialization or installation. It does take a bit of extra work and computation time, but it shows that it isn't entirely necessary to have two sets of textures. Apparently, Futuremark did consider this and decided that it wasn't worthwhile, though. . .

disclaimer: I have no knowledge of the decisions behind 3dmark05.

Still, I was closely involved in 3DMark2001 where the benchmark had just such a converter, i.e. all textures were converted from very high quality jpg to DXT1-5 during installation. This was done to minimize the download size.

In hindsight, I am not sure it was a good idea after all. Making such a converter 100% reliable was very hard, and it meant that even the install could fail; the converter required that you initialize Direct3D (and some texture surfaces on the graphics card), and how many installers do you know that do that? :rolleyes:
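For anyone wondering what that involves in practice, here is a hedged sketch (D3D9/D3DX, not the actual 3DMark2001 converter) of an install-time JPG-to-DXT5 conversion; note that it only works once a Direct3D device exists, which is exactly the awkward requirement described above:

```cpp
#include <d3d9.h>
#include <d3dx9.h>

// Convert one source JPG into a DXT5-compressed .dds during installation.
// Assumes 'device' was created earlier (e.g. against a hidden window).
bool ConvertJpgToDxt5(IDirect3DDevice9* device, const char* jpgPath, const char* ddsPath)
{
    IDirect3DTexture9* tex = NULL;

    // D3DX loads the JPG, builds mipmaps and compresses to DXT5 in one call.
    if (FAILED(D3DXCreateTextureFromFileExA(device, jpgPath,
            D3DX_DEFAULT, D3DX_DEFAULT, D3DX_DEFAULT, 0,
            D3DFMT_DXT5, D3DPOOL_SYSTEMMEM,
            D3DX_FILTER_TRIANGLE, D3DX_FILTER_BOX, 0, NULL, NULL, &tex)))
        return false;

    // Write the compressed result out as a .dds next to the installed files.
    bool ok = SUCCEEDED(D3DXSaveTextureToFileA(ddsPath, D3DXIFF_DDS, tex, NULL));
    tex->Release();
    return ok;
}
```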
 
It is, to my mind, sillier that they included DST+PCF.

Remember, a major reason why DirectX was created was to standardise the graphics API so that software, ideally, shouldn't need to identify the graphics card it is running on. What we can see happening equally with both ATi and nVidia is the creation of proprietary extensions to the standard in order to act as a selling point for their cards.

Futuremark, as a benchmark, should be encouraging a vendor-neutral stance, simply using the API. Indeed, it was my impression that this was their design goal. Certainly the reasons they gave for rejecting 3Dc look awfully like that to me.

Perhaps the most dangerous thing is that, even assuming the best of intentions on Futuremark's part (and we have no firm evidence to assume otherwise), by choosing to make an exception they have opened themselves up to accusations of bias. I had hoped they might have learned this lesson from the way they handled 3DMark2001 SE with its Pixel Shader 1.4 test.
 
From what I read here, it's not a matter of detecting cards, just D3D caps. You'd have more of a case with the goals of OpenGL, but then you have extensions, so really there isn't much difference between the two when it comes to one standard way of doing things. If you want one standard, you need a console; that's about the only way you are going to get it.
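For what it's worth, that is also how the vendor-specific bits get picked up. Here is a sketch of the usual D3D9 approach (my assumption, not Futuremark's code) of detecting DST support through a format check rather than a vendor ID:

```cpp
#include <d3d9.h>

// NVIDIA-style depth-stencil textures (DST) are advertised by letting a depth
// format be created as a texture; no card detection needed.
bool SupportsDepthStencilTextures(IDirect3D9* d3d, UINT adapter, D3DFORMAT displayFormat)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, displayFormat,
                                            D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_TEXTURE,
                                            D3DFMT_D24X8));
}
```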

As for converting textures at install, no need for that, just do it the first time the program is run. However, 3Dc isn't about performance, just more precise normal maps, so it might look a tad better, but you wouldn't get a different score as such. You need to add a bit to all the shaders for the little extra that 3Dc requires, and arse about with converting textures or duplicating them, all for the sake of a minor visual difference; benchmark scores would remain the same, and people would have one more reason to argue that it's not apples to apples. Doesn't seem like a big deal to me. I don't really see 3Dc catching on until all cards have it, IMO.
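That "little extra" in the shaders is essentially Z reconstruction. Shown here as plain C++ for clarity rather than actual shader code (my illustration, not anything from 3DMark):

```cpp
#include <cmath>

struct Normal { float x, y, z; };

// Two-channel normal maps (3Dc, or DXT5 with X in alpha and Y in green) only
// store X and Y; the missing Z has to be rebuilt per pixel.
Normal ReconstructNormal(float tu, float tv)   // tu, tv are the sampled channels in [0,1]
{
    Normal n;
    n.x = tu * 2.0f - 1.0f;                    // expand [0,1] to [-1,1]
    n.y = tv * 2.0f - 1.0f;
    float zz = 1.0f - n.x * n.x - n.y * n.y;   // unit length: x^2 + y^2 + z^2 = 1
    n.z = std::sqrt(zz > 0.0f ? zz : 0.0f);    // clamp to guard against rounding error
    return n;
}
```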

As for DST/PCF, well, there has always been a bit of that kind of thing in 3DMark: 1.4 shaders used on one card vs 1.1 on the rest, etc. It's their benchmark, they can do what they like; if you find it useful, use it, if not, ignore it..
 
Himself said:
As for DST/PCF, well, there has always been a bit of that kind of thing in 3DMark: 1.4 shaders used on one card vs 1.1 on the rest, etc. It's their benchmark, they can do what they like; if you find it useful, use it, if not, ignore it..

The Nature demo, which used 1.4, was not counted in the final score.
 
Yes, Pixel Shader 1.4 was never included in the scoring in Onionmark 2001, and it was part of DirectX, unlike DST. Better compression with 3Dc could lead to better performance with improved visuals.
I had many arguments with Worm in this forum years ago about that, and the answer I constantly got was that it would make comparisons on the ORB difficult.
homer.gif <-- not even this guy would buy that story.

Again, there is really no reason NOT to include 3Dc, besides laziness or, my bet... money.
 