Trilinear Test (Warning 0.5MB PNG)

Which side is "Traditional Trilinear"?

  • Right
  • Both
  • Neither
  • Can't Tell

  Total voters: 449
I have blown up that image as far as I can (1600%, I think) and I CANNOT tell a difference between the two to save my life. The simple fact that ATI is being so forthcoming (albeit late) and giving Dave B. an in-house diagnostic tool leads me to believe that they have NOT degraded IQ for the sake of performance. If this optimization proves to be within +/- 2% of true trilinear AND improves performance, aren't we just splitting hairs here?

ATI's responsibility is to give us the highest IQ/performance ratio possible for our money. I am content with this optimization because it appears to be done for the right reasons. The motives behind Nvidia's application-specific optimizations were obvious... FUD... and blatantly cheating benchmark scores. I do NOT see ATI being that stupid, and their motive (similar IQ at higher performance) seems reasonable and well placed.

At some point we will be forced to compare apples to oranges no matter what we do, because of the different proprietary solutions from each company. IQ, performance and other metrics concerning GPUs will only become more subjective as time goes on.
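
For what it's worth, if someone can grab one capture with the optimization active and one with it defeated (Dave's tool apparently toggles this), a per-pixel diff would settle that "+/- 2%" question far better than squinting at a 1600% zoom. Here's a rough sketch of what I mean in Python/Pillow - the filenames are just placeholders, and this is obviously not the app ATI supplied:

Code:
# Rough sketch (not ATI's tool): difference two captures of the same frame,
# one rendered with the adaptive optimisation and one with traditional
# trilinear, then amplify the result so subtle mip-band shifts show up.
# Filenames are placeholders.
from PIL import Image, ImageChops

optimised = Image.open("capture_optimised.png").convert("RGB")
trilinear = Image.open("capture_trilinear.png").convert("RGB")

diff = ImageChops.difference(optimised, trilinear)

# Worst single-channel deviation, on a 0..255 scale.
print("max per-channel difference:", max(hi for _, hi in diff.getextrema()))

# Mean deviation as a fraction of full scale - a crude stand-in for that
# "within 2% of true trilinear" figure.
grey = diff.convert("L")
mean = sum(i * n for i, n in enumerate(grey.histogram())) / (grey.width * grey.height)
print(f"mean difference: {mean / 255:.2%} of full scale")

# Multiply the differences up so they are visible at 1:1 zoom.
diff.point(lambda v: min(255, v * 16)).save("difference_x16.png")

This only works if the two captures are of exactly the same frame, pixel for pixel; otherwise the numbers mean nothing.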
 
After staring at it for a while, I can see a moire pattern too.

I'm not sure if it is simply that the image is more prone to the moire effect, or if it is just my monitor.

For example, I know my monitor shows a more pronounced effect in one corner (noticeable even on a higher-end Sony Trinitron CRT) when using this pattern and moving it around the entire screen.

http://i.i.com.com/cnwk.1d/i/lab/0515/sm_Moire-Montage.gif

I know I have bad vertical convergence on the right-hand side of my screen... so it kind of screws up the results I mention.
 
DaveBaumann said:
Because there would have been no difference between left and right (and because this app has only just been provided to me, by ATI, something that NVIDIA probably wouldn't have done - they have test apps in house that I've seen and asked for and never got).

Anyway, the point of the application is to provide textures that will defeat the optimisation - so on the left-hand side you have filtering via the optimised mode, and on the right you have "optimisation defeated" (i.e. Traditional) trilinear.
Just keep in mind that a test application developed by ATI is much less likely to show problems, as the algorithm in question is likely to have been optimized for that test application. Without knowing the actual algorithm used, it's very hard to know the worst-case scenario for this filtering.

Anyway, it looked to me like the right side had slightly higher LOD (you can see the difference if you look along the line at the center), so that's what I voted for.
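
For illustration, if the detection works the way I'd guess - only shortcutting the blend when each mip level looks like a plain downsampled copy of the level above (that's purely my assumption, not anything ATI has confirmed) - then a texture whose mip levels are generated independently ought to force the full blend. Something along these lines, as a sketch only:

Code:
# Hypothetical "optimisation-defeating" mip chain: every level gets its own
# independent noise pattern instead of being downsampled from the base level,
# so a heuristic that assumes a standard box-filtered chain (my assumption
# about how such detection might work) no longer applies.
import random

def make_defeating_mip_chain(base_size=256, seed=1234):
    """Return mip levels as 2-D lists of 0..255 grey values, each level
    generated independently rather than filtered down from the one above."""
    rng = random.Random(seed)
    levels, size = [], base_size
    while size >= 1:
        levels.append([[rng.randrange(256) for _ in range(size)]
                       for _ in range(size)])
        size //= 2
    return levels

mips = make_defeating_mip_chain()
print(len(mips), "levels, finest is", len(mips[0]), "x", len(mips[0]))

Whether ATI's heuristic actually keys on anything like that is exactly the kind of thing we can't know without a description of the algorithm.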
 
Chalnoth said:
Anyway, it looked to me like the right side had slightly higher LOD (you can see the difference if you look along the line at the center).

I have a nice 21.3" Samsung SyncMaster 213T, and the center line, as you know, is not the same on both sides. The very middle color is gold; then it transitions to red/lavender/purple on the right and to green/aqua/blue on the left. I think you are seeing the purple more clearly than the blue because it is darker, NOT because it is clearer.
 
Chalnoth said:
Just keep in mind that a test application developed by ATI is much less likely to show problems, as the algorithm in question is likely to have been optimized for that test application. Without knowing the actual algorithm used, it's very hard to know the worst-case scenario for this filtering.

No, the application has just been built. Their testing has been done on textures frequently used in gaming environments. The point of this application is to be somewhat of a pathological worst case.
 
overclocked_enthusiasm said:
I have a nice 21.3" Samsung SyncMaster 213T, and the center line, as you know, is not the same on both sides. The very middle color is gold; then it transitions to red/lavender/purple on the right and to green/aqua/blue on the left. I think you are seeing the purple more clearly than the blue because it is darker, NOT because it is clearer.
No. As I look again, I was looking at identical colors near the center, not the exact boundary.
 
DaveBaumann said:
No, the application has just been built. Their testing has been done on textures frequently used in gaming environments. The point of this application is to be somewhat of a pathological worst case.
I would still much rather know what's being done, not be told it's the worst case.
 
I've no idea.

However, after staring at that image for 5 minutes I suddenly feel the subliminal urge to donate £100 to Dave's bank account... ;)
 
Mephisto said:
There is no sense in comparing still images (especially ones provided by ATI) - mip-map transitions are very annoying in a moving image, and that's where brilinear has its flaws.

Well, this isn't exactly brilinear (at least, not the NVIDIA way of brilinear). Supposedly it's a much better and more intelligent method of adaptive trilinear, so the whole point of this is to see if we can spot the difference. The fact that it's been active on the 9600s since day one speaks volumes towards that goal, I think.
 
My eyes must be broken, but the only moire patterns that leapt out at me are about 2/3 of the way up from the bottom, and they don't look any smaller on the right side, just... different. Slightly. I could not say which side I PREFERRED, and whether in stills or gameplay, I'd have no preference.
 
Diplo said:
However, after staring at that image for 5 minutes I suddenly feel the subliminal urge to donate £100 to Dave's bank account... ;)

Only £100? I'll upload a new version...
 
Just for grins, I thought I'd try looking at the image on a 23" Apple HD Digital Display (we've got a few of them here at the CCIT at the UofA) and honestly, I can't see a difference. I don't know if I'm not seeing what other people are, or if Apple's monitors aren't as "high quality" as they're claimed to be, but neither I nor my co-worker could spot a difference.
 
The display really makes a difference. On my 21" Vision Master Pro 450 at work, I could see the moire pattern to the left of the vertical middle line, about halfway along. I could see a slighter moire on the right, but I really had to look for it.

On my 17" ADI Microscan 5GT at home, I could only barely see the moire on the left. On the 14" LCD of my laptop, all moire is gone. None, nada.

Edit: And on my laptop, the left image appears slightly sharper (especially the edges) than the right.
 
Hmm, bit late to this thread.

I could, however, see that the right was trilinear and the left brilinear without reading the rest of the thread - you'll just have to trust me on that :p (iiyama Pro454 19" Diamondtron, btw, since this seems to be a state-your-monitor thread :))

On which is better: the left gives the impression of being sharper (which is odd since, while it doesn't start blending in the lower-res mipmaps until further back, it finishes with the higher-res ones earlier than trilinear does), but this may cause aliasing in motion. The difference is pretty small, though.


For those having trouble spotting it: try not zooming in, but instead move back from the screen a little and notice that the slightly darker band of the lower-res mipmap starts further back on the left than on the right.
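
If numbers help, here's roughly what I mean by a narrower blend band - a sketch only, with a made-up band width, not ATI's actual algorithm:

Code:
# Sketch of a generic narrowed ("brilinear"-style) blend versus standard
# trilinear. Standard trilinear blends the two mip levels across the whole
# fractional LOD range; the narrowed version stays on a single mip level for
# most of the range and only blends inside a small window around the switch
# point. The 0.25 band width is an arbitrary illustration, not ATI's value.

def trilinear_weight(lod_frac):
    """Blend factor towards the lower-res mip for standard trilinear."""
    return lod_frac

def narrowed_weight(lod_frac, band=0.25):
    """Blend only within +/- band of the midpoint; single mip elsewhere."""
    lo, hi = 0.5 - band, 0.5 + band
    if lod_frac <= lo:
        return 0.0                      # stick with the higher-res mip
    if lod_frac >= hi:
        return 1.0                      # stick with the lower-res mip
    return (lod_frac - lo) / (hi - lo)  # steeper ramp in between

for f in (0.0, 0.2, 0.4, 0.5, 0.6, 0.8, 1.0):
    print(f"lod frac {f:.1f}: trilinear {trilinear_weight(f):.2f}, "
          f"narrowed {narrowed_weight(f):.2f}")

At least in this toy version, the lower-res level doesn't contribute at all until the LOD fraction reaches the start of the band, which would be why its darker band appears to begin further back on the left.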


Just for grins, I thought I'd try looking at the image on a 23" Apple HD Digital Display (we've got a few of them here at the CCIT at the UofA) and honestly, I can't see a difference. I don't know if I'm not seeing what other people are, or if Apple's monitors aren't as "high quality" as they're claimed to be, but neither I nor my co-worker could spot a difference.

Well, LCDs still can't match CRTs for colour accuracy or contrast (and CRTs are some way from perfect themselves on the colour front).
 