Futuremark's technical response

Joe,

Calm down. If you don't like [H], as you have stated on this board, that's okay, and it's your problem. But I saw some comments from [H] on ATI's official statement at the link I posted here, and [H] stated that they are waiting for more comments, more technical comments, before commenting further.

Finally, I think THG was parodying Nvidia's statements, and [H]'s point of view was not on the same basis.
*edit*
No doubt there are some "on the fence" statements here that fail to address specifics inside the benchmark. We will be talking with ATI more this week about the specifics that need to be covered on this subject.
 
Oh, and one more comment on the [H] blurb about ATI's statement:

IF you have read our 3DMark03 Preview this week, you will see that we are not too keen on the benchmark and what the overall score represents.

Not surprisingly, a big part of the problem here is that [H] thinks the overall score represents something that it doesn't: "It doesn't measure performance in any 'real' games!"

Correct. It doesn't. Problem is, it's not SUPPOSED TO. The sooner [H] figures this out, the sooner they might recognize the value in it.

In all honesty, it's very difficult to ascertain EXACTLY what HardOCP believes the 3DMark score represents. All we know is that HardOCP tells us that they don't like what it represents...or what they THINK it represents.

So, HardOCP, please tell us what the 3DMark score "represents". And if FutureMark tells you it represents something else, then why don't you believe them?
 
Calm down. If you don't like [H], as you have stated on this board, that's okay, and it's your problem.

Honestly, it's Kyle that I have a problem with. Not "personally", but because of what is obvious to me (from this and prior situations like "Quack"): that he is a puppet for an IHV while proclaiming he has "us" in his best interests.

But I saw some comments from [H] on ATI's official statement at the link I posted here, and [H] stated that they are waiting for more comments, more technical comments, before commenting further.

What I don't understand is what light ATI is supposed to shed on the technical merit of 3DMark. If you want the TECHNICAL rebuttal, read FUTUREMARK'S already published response. It's all there.

All ATI needs to do is announce their support (or lack thereof) of the benchmark, and why. They've done this.
 
OT: How do I activate the easter egg in 3DMark2001? Maybe there's one in 3DMark03?

Thomas
 
I think you have already made up your mind on this, so there's no discussion.

Made up my mind on what? FYI, this is what I "have made up my mind on": that HardOCP has decided not to use the 3DMark score under the following circumstances:

1) Without conveying that they understand what the 3DMark score actually represents.

2) Made that initial (final?) decision before asking the benchmark developers for clarification of the concerns they had, relying only on their "intuition" and comments from nVidia.

3) After getting the clarifications from FutureMark, they have given no indication that those are being taken into consideration. Or, if they have already considered and rejected them...why.
 
Brent said:
well, we don't = nvidia

That comment is based on your not having identified nvidia's commentary as a factor in the viewpoints of your 3DMark03 article, yet apparently having taken their position as your own. If you want to address that equation, address this issue. Hey, maybe I'm just mistaken?

nvidia made some good points, so did futuremark and ati

That is a fine evaluation, but the problem, as I said, is that nvidia's points are not identified as such, but presented as your own. Again, repeating myself: the problem is not that ATI's points are identified as ATI's and subjected to scrutiny and careful evaluation, but that nvidia's points are not treated the same way, nor are they presented as being nvidia's points. If my impression of this is incorrect, please correct me by all means...that is a simple and direct response, and we could move forward.

i agree with points from all of them

That is not the impression conveyed when, as it appears to me, you've presented nvidia's viewpoint as your own. This doesn't mean your statement isn't true, but it does mean that this is the perception you (as a representative of the site) foster, due to the issues above.

I think it's important to take in and evaluate what everyone involved has said and draw input from that...

But your 3DMark03 article did not do that. Re-read my reasoning for why I commented "HardOCP = nvidia" and address why my impression or reasoning is invalid. I haven't had anything new demonstrated to me by your reply.
 
tb said:
My "No" was to this part of the message: "Game Test 2 and 3 don't change much between CPU vs GPU skinning because primarily they are Pixel Shader limited"

I think they are more vertex shader and fillrate limited than pixel shader calculation speed limited. The limitation shifts from the vertex shader to fillrate (and a little bit of pixel shader) as you increase the resolution. Test 1 is most of the time vertex shader limited, but fillrate comes into play at some very high resolutions. Tests 2, 3, and 4 are not that heavily limited by the vertex shader. Fillrate is the main limitation in these tests (2, 3, 4), and the pixel shader has much less impact.

Sorry, don't have a radeon 9500 / 9500 pro :(

Thomas

What makes you say it's more vertex shader/fillrate limited rather than pixel shader limited? Perhaps if you could point out what in your tests gives you this impression, it would make more sense to me. You say you "think they are more vertex shader and fillrate limited than pixel shader calculation speed limited", but I have yet to see evidence of that being the case (if it's in that article you linked, there are too many things in it I don't understand to figure out what you are referring to).

It is clear that pixel shader calculations will go up and down with resolution; vertex shader work will not. Because changes in resolution affect performance significantly, there is reason to believe it is NOT vertex shader limited. That does not mean we are not running into bandwidth or fillrate limits at some resolutions, but it DOES suggest that pixel shader limitations could be part of the cause, and that tests 2 and 3 are not vertex shader limited.
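
To put rough numbers on that scaling argument, here is a minimal back-of-the-envelope sketch. The per-vertex and per-pixel costs and the overdraw factor are entirely hypothetical placeholders, not measured 3DMark03 data; the point is only the shape of the scaling:

```python
# Hypothetical per-frame workload model: vertex work is fixed by the
# scene's geometry, while pixel work scales with the pixels shaded.

VERTICES_PER_FRAME = 200_000   # assumed scene geometry (hypothetical)
VS_CYCLES_PER_VERTEX = 20      # assumed vertex shader cost (hypothetical)
PS_CYCLES_PER_PIXEL = 10      # assumed pixel shader cost (hypothetical)
OVERDRAW = 2.0                 # assumed average shading ops per screen pixel

for width, height in [(640, 480), (1024, 768), (1600, 1200)]:
    vs_work = VERTICES_PER_FRAME * VS_CYCLES_PER_VERTEX        # constant
    ps_work = width * height * OVERDRAW * PS_CYCLES_PER_PIXEL  # grows
    print(f"{width}x{height}: VS {vs_work:.2e} cycles, PS {ps_work:.2e} cycles")

# VS work is identical at every resolution, while PS work grows ~6x from
# 640x480 to 1600x1200. A test whose frame rate falls sharply with
# resolution is therefore limited per-pixel (shading, fill, or bandwidth),
# not per-vertex.
```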
 
I am really confused by all this rhetoric about 3DMark03. It seems to me that nvidia is upset that PS1.4 is finally being used in a legitimate benchmark. But what is the big deal here? I mean, really, 3DMark2001 came out for DX8 using PS1.1 shaders that were not highly proliferated at all, and still bloody well aren't. (One could attribute that to the extremely common GeForce MX series cards. Thanks, Nvidia.)

There have always been complaints about 3DMark. Is this a whole different sort of benchmarking philosophy with "Futuremark"? (The name makes a lot more sense now than it did not too long ago.) It is really a shame that nvidia insists on using the older PS1.1 instead of moving on like ATi has.

The real problem here is that Nvidia had absolutely no problems when MadOnion used PS1.1 in 3DMark2001 and ATi had no supporting hardware for that particular shader; they loved it. It seemed that 3DMark2001 was essentially made to benchmark nvidia hardware. But now the future has arrived, and they are still stuck back in 2001, wanting the same pixel shader they have used for nearly two years, which is entirely a part of DX8, to be portrayed as being a part of DX9. (The same cannot be said for PS1.4; AFAIK, according to some article I read a while back, PS1.4 goes a long way toward being like PS2.0.) By all means correct me if I am wrong here, but to say that PS1.4 is DX8 is not quite correct: PS1.4 came with Microsoft's DX8.1 upgrade to DX8. It all seems horribly hypocritical.

To make matters worse, nvidia is putting the worst sort of pressure on Futuremark: by withdrawing from the beta program, they are pulling out financial support. They also seem to be spreading their brand of hypocrisy across popular hardware sites.

It will be a sad state of affairs if nvidia is able to force Futuremark to drop support for PS1.4 so that their hardware scores better, even though PS1.1 really isn't likely to be used as much in the future as PS1.4, which is a superior implementation from what I have read.

Consider that ATI's entire line of Radeon products from the Radeon 8500 up supports PS1.4, even their low-end cards. In my opinion this truly speaks volumes about just how far ATi has left nvidia in the dust in terms of putting newer and better technology on the market.

Nvidia needs to get their ass in gear and stop whining about technological advancement; they really aren't winning any points with the GeForceFX in light of the R300 lineup. Dare I say it... nvidia has lost its edge, big time.

EDIT: Hrm, upon looking into the matter a little harder, it appears that the benchmark does support PS1.1; they are just upset that PS1.4 is supported in the benchmark? My sakes, it's worse than I had thought. It isn't that PS1.1 isn't supported, but that the benchmark actually supports PS1.4, showing how inferior PS1.1 really is. :oops:
 
tb said:
OT: How do I activate the easter egg in 3DMark2001? Maybe there's one in 3DMark03?

Thomas

start '01, in the project box hit edit, change 'my benchmarks' to Holy Cow!

ok, then start game demo....
 
I hope I don't fail to take into account something in the quoted text I snip...

tb said:
My "No" was to this part of the message: "Game Test 2 and 3 don't change much between CPU vs GPU skinning because primarily they are Pixel Shader limited"

I think they are more vertex shader and fillrate limited than pixel shader calculation speed limited.

Based on what? The pixel shading performance hit may be the same on every card for the functionality exhibited in the test (I don't know how many clock cycles each pixel shader takes on each card, and the 9500 non-Pro versus the 9000 Pro could maybe tell us something about how this has improved), but that doesn't mean the hit is not a significant factor in the benchmark.

Your results from "no txt" show an increase, but your results from "no ps" show more of an increase when you raise the resolution to 1024x768. This supports the idea that pixel fillrate determines the absolute limit of performance, but pixel shading limitations determine how much of that can be realized.
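
The inference pattern behind that comparison can be spelled out explicitly. A minimal sketch, using invented placeholder frame rates rather than tb's actual measurements:

```python
# Feature-ablation bottleneck analysis: disable one stage at a time and
# compare against the baseline. The stage whose removal buys the largest
# speedup at a given resolution dominates the frame time there.
# All frame rates below are invented placeholders, not tb's measurements.

results = {
    (640, 480):  {"baseline": 60.0, "no txt": 66.0, "no ps": 71.0},
    (1024, 768): {"baseline": 34.0, "no txt": 40.0, "no ps": 55.0},
}

for (w, h), runs in results.items():
    base = runs["baseline"]
    for ablation in ("no txt", "no ps"):
        print(f"{w}x{h}: {ablation} -> {runs[ablation] / base:.2f}x over baseline")

# If "no ps" gains much more than "no txt" as resolution rises, per-pixel
# shader cost (not just texture fetch or raw fill) is a significant part
# of the limit at the higher resolution.
```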

BTW, does "no txt" mean no textures read, or no pixel output (i.e., "no coloring" to my mind)?

What your analysis (as I understand things, which of course may be in error) seems to ignore is that just because the 9700 can run pixel shaders fast enough not to significantly limit pixel fillrate, that doesn't necessarily mean other hardware can do the same (some questions concerning the GF FX come to mind). I.e., it doesn't mean that pixel shaders are not a limit of the benchmark, just that the 9700 performs them quickly enough that they are not a limit for it. If no cards are limited in this way, however, it does mean that the benchmark is useless for evaluating comparative performance of that factor...hence my question regarding the 9500 Pro, and my curiosity about future nv3x parts, on WHQL drivers, in regard to this benchmark.

The limitation shifts from the vertex shader to fillrate (and a little bit of pixel shader) as you increase the resolution. Test 1 is most of the time vertex shader limited, but fillrate comes into play at some very high resolutions.

But that is the case for any fillrate limited benchmark...it is fillrate limited until you lower the fillrate demands enough that their impact is less than that of the other demands. The thing is that the test scales downward in performance from 1024x768 and up, and as far as I can recall that is part of the baseline assumptions of the benchmark...so to say that it is "most of the time" vertex shader limited based on testing at 320x200 seems a distortion.

Tests 2, 3, and 4 are not that heavily limited by the vertex shader. Fillrate is the main limitation in these tests (2, 3, 4), and the pixel shader has much less impact.

On a 9700, bandwidth is not a factor in many situations, but its exhibiting this behavior for an application does not mean that the application doesn't depend on bandwidth, just that the application isn't a good comparative benchmark for comparing cards that share this behavior.

If the 9500 non-Pro versus the 9000 Pro shows equal performance under the right conditions, that illustrates that GT2 and GT3 are fillrate limited for the purposes of that comparison, which should tell us how the PS 1.4 execution characteristics of the 9000 Pro compare to those of the 9500 non-Pro (i.e., they are the same). That would be a good reason to start believing this reflects what the benchmark (as far as GT2 and GT3 go) is able to test (the usefulness of a benchmark in the real world depends on the set of applicable items in the real world to be benchmarked), pending confirmation from actually evaluating other cards as they are released.

Sorry, don't have a radeon 9500 / 9500 pro :(

Thomas

Someone who does should get cracking! ;)
 
Joe DeFuria said:
I think you have already made up your mind on this, so there's no discussion.

Made up my mind on what? FYI, this is what I "have made up my mind on": that HardOCP has decided not to use the 3DMark score under the following circumstances:

1) Without conveying that they understand what the 3DMark score actually represents.

2) Made that initial (final?) decision before asking the benchmark developers for clarification of the concerns they had, relying only on their "intuition" and comments from nVidia.

3) After getting the clarifications from FutureMark, they have given no indication that those are being taken into consideration. Or, if they have already considered and rejected them...why.

I visited NVIDIA offices last week and we discussed 3DMark03. At that time we had had the benchmark for well over a week and I think NV had had it for a day. At that time I had already made up my mind that 3DMark03 did not represent gaming. We KNEW NVIDIA's thoughts had merit when they came to us with their opinions.

This tells us mounds of information, IMO.
 
Doomtrooper said:
I visited NVIDIA offices last week and we discussed 3DMark03. At that time we had had the benchmark for well over a week and I think NV had had it for a day. At that time I had already made up my mind that 3DMark03 did not represent gaming. We KNEW NVIDIA's thoughts had merit when they came to us with their opinions.

This tells us mounds of information, IMO.

That does seem to be a terribly different sort of conclusion than what Beyond3D came up with. Funny, that.

http://www.beyond3d.com/articles/3dmark03/index.php?p=6

Conclusion
It would certainly seem that 3DMark03 is a highly comprehensive benchmarking suite for the latest generation of shader-enabled graphics boards, and yet it manages to stay close to the kind of actual gaming environment that may be seen in the coming years. It also offers a number of tests outside of standard 3D graphics performance which could assist in benchmarking many different elements of the PC that are utilised during gaming.

No doubt Beyond3D will soon be utilising elements from this benchmark in upcoming tests and reviews.
 
:oops: 8) :D

Sabastian said:
EDIT: Hrm, upon looking into the matter a little harder, it appears that the benchmark does support PS1.1; they are just upset that PS1.4 is supported in the benchmark? My sakes, it's worse than I had thought. It isn't that PS1.1 isn't supported, but that the benchmark actually supports PS1.4, showing how inferior PS1.1 really is. :oops:


:LOL:

Finally somebody understood ;)

I hope 3DMark03 gives developers a jolt of reality. Finally there is reason to consider coding for PS1.4 (more efficient than PS 1.3, PLUS more widely supported than PS2).

I think that by summer ALL cards available on the market will be DX9, and thus will be able to use PS1.4, which makes the installed base of users for 1.4 HUGE by the time a game developed now ships.
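
As a rough illustration of the "more efficient" point: PS 1.1 through 1.3 allow 4 texture-addressing and 8 arithmetic instructions per pass, while PS 1.4 runs two phases of 6 texture and 8 arithmetic instructions each, so an effect that needs multiple passes on PS 1.1 can often fit in one. A minimal sketch of the pass-count arithmetic, with an entirely hypothetical effect:

```python
import math

# Documented DX8/DX8.1 per-pass pixel shader instruction caps.
LIMITS = {
    "ps_1_1": {"tex": 4,  "arith": 8},
    "ps_1_3": {"tex": 4,  "arith": 8},
    "ps_1_4": {"tex": 12, "arith": 16},  # two phases of 6 tex + 8 arith
}

def min_passes(tex_reads: int, arith_ops: int, version: str) -> int:
    """Lower bound on the rendering passes an effect needs."""
    caps = LIMITS[version]
    return max(math.ceil(tex_reads / caps["tex"]),
               math.ceil(arith_ops / caps["arith"]))

# Hypothetical per-pixel lighting effect: 6 texture reads, 12 arithmetic ops.
for version in LIMITS:
    print(f"{version}: {min_passes(6, 12, version)} pass(es)")

# ps_1_1/ps_1_3 need 2 passes (each extra pass re-transforms the geometry
# and blends into the frame buffer); ps_1_4 fits the whole effect in one,
# which is where the efficiency win comes from.
```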
 
NVIDIA have got a real bee in their bonnet about PS1.4. I was talking to a few guys at Dusk-till-dawn about it, and they are unhappy that Futuremark have supported a technique that 'only a bunch of Canadians' have. I pointed out that "well, it is a part of DirectX after all", to which the response was "well, it shouldn't have been".

The fact is, it is in DX, it's a viable rendering option, and games are supporting it (I have a big list...).
 
'only a bunch of Canadians'

Hmm me not Canadian last time I checked ;) Very nice thought though, I thought the industry was much more caring and sharing :LOL:

Edit: er went weird on me...
P.S. Is it cos I is black? :LOL:
 
DaveBaumann said:
NVIDIA have got a real bee in their bonnet about PS1.4. I was talking to a few guys at Dusk-till-dawn about it, and they are unhappy that Futuremark have supported a technique that 'only a bunch of Canadians' have. I pointed out that "well, it is a part of DirectX after all", to which the response was "well, it shouldn't have been".

The fact is, it is in DX, it's a viable rendering option, and games are supporting it (I have a big list...).

Whoa, that's just great. Are they trying to turn this into some sort of nationalistic issue? My, my, that seems rather pathetic. Never mind that that "bunch of Canadians" worked with Microsoft to create it. What a shame. :?
 