DeltaChrome preview at ExtremeTech

YeuEmMaiMai said:
One thing that surprised me about the S8 was its DX9 performance... It is extremely good for alpha drivers and a pre-production (engineering sample) board. They even have performance that is superior to NVIDIA's in most of the DX9 benchmarks...

Thanks to app detection :rolleyes:

BTW, DeltaChrome doesn't support FP render targets either, just like the GeForce FX.
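For context, "FP render target" support means the chip can render into a floating-point surface format. The snippet below is a rough DX9 sketch (illustrative only, not taken from the preview) of how an application would probe for that capability; the function name and the FP16-per-channel format chosen here are just one common way to ask the question.

```cpp
// Illustrative DX9 capability probe (not from the article): ask the runtime
// whether a 16-bit-per-channel floating-point texture can be used as a
// render target on the default adapter.
#include <d3d9.h>

bool SupportsFpRenderTarget(IDirect3D9* d3d)
{
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,            // current display-mode format
        D3DUSAGE_RENDERTARGET,      // we want to render into the surface
        D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F);      // FP16-per-channel surface format
    return SUCCEEDED(hr);
}
```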
 
parhelia said:
YeuEmMaiMai said:
One thing that surprised me about the S8 was its DX9 performance... It is extremely good for alpha drivers and a pre-production (engineering sample) board. They even have performance that is superior to NVIDIA's in most of the DX9 benchmarks...
Thanks to app detection :rolleyes:
NVIDIA app detects too, so what's the problem? As does XGI. :rolleyes:

-FUDie
 
madshi said:
FUDie said:
NVIDIA app detects too, so what's the problem? As does XGI. :rolleyes:
So if everybody does it, it must be right? :?
Of course not, and not everyone does it, but what I am saying is that parhelia is blaming app detect for DeltaChrome's performance, while the board he supports (XGI) is also using app detect in the driver. Also, someone commented that DeltaChrome compared favorably to the 5600 in shading tests, and parhelia, again, blamed app detect. Well, duh, NVIDIA does app detect too so they should be on even ground.

-FUDie
 
FUDie said:
Of course not, and not everyone does it, but what I am saying is that parhelia is blaming app detect for DeltaChrome's performance, while the board he supports (XGI) is also using app detect in the driver. Also, someone commented that DeltaChrome compared favorably to the 5600 in shading tests, and parhelia, again, blamed app detect.
OK, I see what you mean - although XGI claims the application names found in the driver are unused leftovers that will be removed in the next revision.
FUDie said:
Well, duh, NVIDIA does app detect too so they should be on even ground.
Whether we have even ground or not, we simply do not know. One driver might make big quality tradeoffs while another makes only minor ones. We need IQ comparisons to decide...
 
madshi said:
FUDie said:
Of course not, and not everyone does it, but what I am saying is that parhelia is blaming app detect for DeltaChrome's performance, while the board he supports (XGI) is also using app detect in the driver. Also, someone commented that DeltaChrome compared favorably to the 5600 in shading tests, and parhelia, again, blamed app detect.
OK, I see what you mean - although XGI claims the application names found in the driver are unused leftovers that will be removed in the next revision.
That doesn't seem to be the case. People who have changed those names in the files have noted performance loss with image quality gains.
FUDie said:
Well, duh, NVIDIA does app detect too so they should be on even ground.
Whether we have even ground or not, we simply do not know. One driver might make big quality tradeoffs while another makes only minor ones. We need IQ comparisons to decide...
This is true. Of course, we already know that NVIDIA is sacrificing image quality (forcing _pp in some cases, brilinear filtering, etc.). It remains to be seen what S3 and XGI are up to.

-FUDie
 
FUDie said:
That doesn't seem to be the case. People who have changed those names in the files have noted performance loss with image quality gains.
That's news to me, thanks for the information.
 
parhelia said:
YeuEmMaiMai said:
One thing that surprised me about the S8 was its DX9 performance... It is extremely good for alpha drivers and a pre-production (engineering sample) board. They even have performance that is superior to NVIDIA's in most of the DX9 benchmarks...

Thanks to app detection :rolleyes:

BTW, DeltaChrome doesn't support FP render targets either, just like the GeForce FX.

Where's the evidence that S3 is doing app detection? I sure haven't seen any.
 
FUDie said:
That doesn't seem to be the case. People who have changed those names in the files have noted performance loss with image quality gains.

It was the opposite for me. Maybe it's due to a different driver revision, I really don't know.
Renaming the 3DMark exe did not result in any change in performance or image quality (tested with a single-chip V8 though, no idea about the Duo).
However, renaming the exe of Halo resulted in some rendering errors (some flickering) without any change in filtering. Trilinear seemed to have been used before and after the exe change.
 
parhelia said:
FUDie said:
That doesn't seem to be the case. People who have changed those names in the files have noted performance loss with image quality gains.

It was the opposite for me. Maybe it's due to a different driver revision, I really don't know.
Renaming the 3DMark exe did not result in any change in performance or image quality (tested with a single-chip V8 though, no idea about the Duo).
However, renaming the exe of Halo resulted in some rendering errors (some flickering) without any change in filtering. Trilinear seemed to have been used before and after the exe change.
Parhelia, I'm sure you know well enough that changing the exe is not what you have to do. You have to edit the driver file and change the name inside the driver file....

XGI is not detecting the exe filename but the application's name as it appears in memory.
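To make the mechanism under discussion concrete, here is a hypothetical sketch of how a driver-side application check of this kind could work; the lookup method, function names, and application list below are assumptions for illustration, not anything known about XGI's actual driver.

```cpp
// Hypothetical illustration of driver-side application detection (not XGI's
// actual code): a driver DLL loaded into the game's process identifies the
// host application and compares it against a list of names embedded in the
// driver binary, enabling an application-specific code path on a match.
#include <windows.h>
#include <string.h>

// Illustrative list; in practice these strings would live inside the driver file.
static const char* kKnownApps[] = { "3dmark03.exe", "halo.exe" };

// One possible way to obtain the host application's name; the exact lookup a
// real driver uses is not known from this thread.
static void host_app_name(char* out, DWORD size)
{
    char path[MAX_PATH] = "";
    GetModuleFileNameA(NULL, path, MAX_PATH);   // full path of the host .exe
    const char* base = strrchr(path, '\\');
    lstrcpynA(out, base ? base + 1 : path, (int)size);
}

bool app_profile_active(void)
{
    char name[MAX_PATH];
    host_app_name(name, MAX_PATH);
    for (size_t i = 0; i < sizeof(kKnownApps) / sizeof(kKnownApps[0]); ++i)
        if (lstrcmpiA(name, kKnownApps[i]) == 0)    // case-insensitive match
            return true;                            // enable app-specific path
    return false;
}
```

Because the match is made against strings stored inside the driver file, hex-editing those strings (rather than renaming the game's exe) is what breaks it, which is consistent with what was reported above.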
 
Hanners said:
parhelia said:
Thanks to app detection :rolleyes:

Is there any solid evidence of that?

Don't be silly. parhelia doesn't need evidence. parhelia's job is merely to post FUD about anyone who isn't XGI and to defend XGI wherever he can. Evidence isn't necessarily needed for this behaviour.

Personally, I'd urge everyone to ignore parhelia's comments until you hear something from credible sources.
 
DaveBaumann said:
Don't be silly. parhelia doesn't need evidence. parhelia's job is merely to post FUD about anyone who isn't XGI and to defend XGI wherever he can. Evidence isn't necessarily needed for this behaviour.

I'm glad someone said it. :)
 
I think there's one pleasant little surprise still missing from the puzzle of S3's DC.
 
parhelia said:
S3 forces FP16 in certain cases instead of FP24.

I assume you're referring to the screenshots from ShaderMark in The Tech Report's article - that isn't evidence of reduced precision, it's evidence of a driver issue.
 
Tridam: No, I didn't know that. I'll check it out later on.


DaveBaumann said:
Don't be silly. parhelia doesn't need evidence. parhelia's job is merely to post FUD about anyone who isn't XGI and to defend XGI wherever he can. Evidence isn't necessarily needed for this behaviour.

"Defend XGI wherever he can"?
Now tell me: saying that the Volari has rendering errors in Halo is hardly defending it, is it?
We must not have the same definition of "defending" something. :rolleyes:

Personally, I'd urge everyone to ignore parhelia's comments until you hear something from credible sources.

Proof, eh? Here you go:

Tridam said:
How do you know???
AFAIK S3 doesn't have any FP16 hardware.

Let me quote ExtremeTech, which confirmed my "FUD":
The most interesting data point by far is the DX9 floating-point precisions supported by the S8: FP16 and FP24. Recall that DX9 has what are called Partial Precision Hints, which an application can send down to a GPU driver along with a pixel shader program. These hints tell the driver that lower floating-point precision (FP16) will be adequate to correctly execute the pixel shader program, and not introduce any visual artifacts, such as banding, as a result of rounding errors.
http://www.extremetech.com/article2/0,3973,1417246,00.asp
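As a concrete illustration of what such a hint looks like from the application side (a sketch, not code from the article): in DX9 HLSL the half type marks values where FP16 is acceptable, and D3DX also offers a compile flag that requests partial precision for the whole shader. The shader source and function name below are made up for the example.

```cpp
// Illustrative only: how a DX9 application can hint that FP16 is acceptable.
// In HLSL, 'half' marks values that may be computed at partial precision;
// the compiled shader then carries the _pp modifier on those instructions,
// and the driver is free to execute them at FP16 instead of full precision.
#include <d3dx9.h>

static const char kPixelShader[] =
    "sampler2D tex : register(s0);                              \n"
    "half4 main(float2 uv : TEXCOORD0) : COLOR                  \n"
    "{                                                          \n"
    "    half4 c = tex2D(tex, uv);   // FP16 is adequate here   \n"
    "    return c * 0.5;                                        \n"
    "}                                                          \n";

HRESULT CompileWithPartialPrecision(LPD3DXBUFFER* shaderOut)
{
    LPD3DXBUFFER errors = NULL;
    // D3DXSHADER_PARTIALPRECISION additionally requests partial precision
    // for the whole shader, regardless of the HLSL types used.
    HRESULT hr = D3DXCompileShader(kPixelShader, sizeof(kPixelShader) - 1,
                                   NULL, NULL, "main", "ps_2_0",
                                   D3DXSHADER_PARTIALPRECISION,
                                   shaderOut, &errors, NULL);
    if (errors) errors->Release();
    return hr;
}
```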
 
DaveBaumann said:
Don't be silly. parhelia doesn't need evidence. parhelia's job is merely to post FUD about anyone who isn't XGI and to defend XGI wherever he can. Evidence isn't necessarily needed for this behaviour.

Personally, I'd urge everyone to ignore parhelia's comments until you hear something from credible sources.

I don't see where you get this. Maybe there is more to this idea than I see. Do you have some XGI or S3 info that hasn't been posted yet?
 