Grid 2 has exclusive Haswell GPU features

There is no forcing, there are just priorities, which, as I am sure you have noticed, have changed quite a bit in recent times. Anisotropic filtering is not exactly rocket science; it actually takes more effort to do it "wrong" (but cheap) than to do it properly, if you are willing to pay the area/perf cost for it.
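For anyone following along, here is roughly what "doing it properly" means. This is just a minimal C sketch of the textbook approach (along the lines of what the EXT_texture_filter_anisotropic spec describes); trilinear_sample() is a hypothetical helper standing in for a single trilinear fetch, and the "wrong but cheap" variants mostly amount to under-sampling the footprint or letting the ratio depend on surface angle.

Code:
#include <math.h>

/* Illustrative only: textbook anisotropic filtering.
 * trilinear_sample() is a hypothetical helper for one trilinear fetch. */
typedef struct { float r, g, b, a; } texel;
texel trilinear_sample(float u, float v, float lod);

texel aniso_sample(float u, float v,
                   float dudx, float dvdx,   /* texcoord derivatives */
                   float dudy, float dvdy,
                   float max_aniso)
{
    /* Lengths of the pixel footprint along the two screen axes. */
    float px = sqrtf(dudx * dudx + dvdx * dvdx);
    float py = sqrtf(dudy * dudy + dvdy * dvdy);
    float major = fmaxf(px, py), minor = fminf(px, py);

    /* Degree of anisotropy: how elongated the footprint is. */
    float ratio = fminf(major / fmaxf(minor, 1e-6f), max_aniso);
    int   n     = (int)ceilf(ratio);            /* number of probes */

    /* LOD from the minor axis so detail along the major axis is kept.
     * Cheap-but-wrong shortcuts: derive LOD from the major axis,
     * take fewer probes than n, or clamp ratio based on surface angle. */
    float lod = log2f(fmaxf(major / ratio, 1e-6f));

    /* Average n trilinear probes spread along the major axis. */
    float sx = (px > py) ? dudx : dudy;
    float sy = (px > py) ? dvdx : dvdy;
    texel acc = {0, 0, 0, 0};
    for (int i = 0; i < n; ++i) {
        float t = (n > 1) ? ((float)i / (n - 1) - 0.5f) : 0.0f;
        texel s = trilinear_sample(u + t * sx, v + t * sy, lod);
        acc.r += s.r; acc.g += s.g; acc.b += s.b; acc.a += s.a;
    }
    acc.r /= n; acc.g /= n; acc.b /= n; acc.a /= n;
    return acc;
}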

That's all clear to me, nAo; my original message was just the Cliff's Notes version of "it was about damn time."
 
But what I meant by double standard is you're just picking one thing to be mad about (retroactively, since it's no longer an issue). To bring it back onto the topic of this thread, why not similarly criticize IHVs for not providing programmable blending for 15+ years? Seems like Haswell and some mobile parts are the only ones that have workable solutions there today. Arguably that's even more important than LOD computation accuracy.
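To make the comparison concrete: the sketch below (plain C, purely conceptual, not any vendor's actual API) shows the difference between fixed-function blending and what programmable blending lets you do. On Haswell the capability is exposed via pixel shader ordering, and on some mobile parts via framebuffer fetch; keeping a small per-pixel fragment list like this is the building block of the OIT/AVSM techniques Grid 2 uses.

Code:
/* Conceptual sketch only -- not any real API. */
typedef struct { float r, g, b, a; } rgba;

/* Fixed-function: the shader only emits src; the blend unit applies one of
 * a few preset equations, e.g. src-alpha / one-minus-src-alpha. */
static rgba fixed_function_blend(rgba src, rgba dst)
{
    rgba out;
    out.r = src.r * src.a + dst.r * (1.0f - src.a);
    out.g = src.g * src.a + dst.g * (1.0f - src.a);
    out.b = src.b * src.a + dst.b * (1.0f - src.a);
    out.a = src.a         + dst.a * (1.0f - src.a);
    return out;
}

/* Programmable: with a per-pixel ordering guarantee the shader can read and
 * update arbitrary per-pixel data. Example: keep the K nearest transparent
 * fragments sorted by depth (a k-buffer), which fixed-function blending
 * simply cannot express. pixel_list[] is assumed pre-initialized with
 * depth = +infinity. */
#define K 4
typedef struct { rgba color; float depth; } frag;

static void programmable_blend(frag pixel_list[K], frag incoming)
{
    /* Insertion-sort the new fragment into the per-pixel list,
     * dropping the farthest entry when the list is full. */
    for (int i = 0; i < K; ++i) {
        if (incoming.depth < pixel_list[i].depth) {
            frag tmp = pixel_list[i];
            pixel_list[i] = incoming;
            incoming = tmp;
        }
    }
}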

I think we've bumped into each other for enough years that we know more or less where each and every one of us comes from. See also my reply to nAo above; I've personally always been VERY finicky with texture filtering. Those with a strong memory might remember that I was part of NV's focus group from 2004 to 2006. The first thing that knocked me over when I laid hands on a GF6800 back then was the intense occasional texture shimmering. As a layman I even wrote two whole write-ups about it back then, which were my humble personal contribution to the attention the press and gaming community paid to the topic after that.

The nastiest of the AF-related optimisations originally were in the "underfiltering" ballpark; still today, in my book, if an application requests trilinear it should receive trilinear.
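For reference, here is a small C sketch (bilinear_sample() is a hypothetical helper) of the difference between honest trilinear and the classic "brilinear" shortcut, one well-known flavour of this kind of optimisation: the blend between mip levels is squeezed into a narrow band, so most fetches collapse to plain bilinear and the mip transitions become visible.

Code:
#include <math.h>

typedef struct { float r, g, b, a; } texel;
texel bilinear_sample(float u, float v, int mip);   /* hypothetical helper */

static texel lerp_texel(texel a, texel b, float t)
{
    texel o = { a.r + (b.r - a.r) * t, a.g + (b.g - a.g) * t,
                a.b + (b.b - a.b) * t, a.a + (b.a - a.a) * t };
    return o;
}

/* Proper trilinear: blend the two nearest mips across the whole fractional
 * LOD range. */
texel trilinear(float u, float v, float lod)
{
    int   m = (int)floorf(lod);
    float f = lod - m;
    return lerp_texel(bilinear_sample(u, v, m),
                      bilinear_sample(u, v, m + 1), f);
}

/* "Brilinear": only blend inside a narrow band around the mip transition.
 * band must be in (0, 1]; band = 1.0 gives true trilinear, smaller values
 * are cheaper because most fetches become plain bilinear, at the cost of
 * visible mip seams and shimmering. */
texel brilinear(float u, float v, float lod, float band)
{
    int   m = (int)floorf(lod);
    float f = lod - m;
    float t = (f - 0.5f * (1.0f - band)) / band;    /* remapped blend weight */
    t = fminf(fmaxf(t, 0.0f), 1.0f);
    if (t <= 0.0f) return bilinear_sample(u, v, m);
    if (t >= 1.0f) return bilinear_sample(u, v, m + 1);
    return lerp_texel(bilinear_sample(u, v, m),
                      bilinear_sample(u, v, m + 1), t);
}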

Programmable blending might be a fair point on your side, but under normal conditions its absence doesn't give me eye cancer, unlike all the dancing meanders/noise that crappy filtering can spread over my screen.
 
You've never seen it pre-Ivy Bridge? Andrew admitted that it "used to be terrible" and that he doubts anybody would defend the quality of the past. In fact, on the "quality" setting (supposedly the best) it created much worse shimmering than AMD ever did (it basically looked like a -1 LOD bias combined with subpar, highly angle-dependent filtering). It was clearly the worst solution at that time.
There is no need to defend it today, as Intel did their homework in that field.


Of course I have seen it; I tested it regularly. On my old HDD I should still have Filtertester videos from HD3000. No, HD3000 never had shimmering or banding issues, unlike AMD. The shimmering level on HD3000 is no different from HD4000 or Haswell; all that changed is the angle dependence. Nobody was forced to use the quality setting with the negative LOD bias. That's just a software thing, and a setting other than balanced was never the recommended option.
 
It reproduces less detail in the middle setting. Together with the strong angle dependency (which was clearly visible in-game, unlike NV's very slight angle dependence), it was the worst solution even in the default setting (which, by the way, was usually also slightly less shimmer-prone than HQ in AMD's case; Intel's HQ was shimmering hell, much worse than anything else besides point sampling). The occasional banding and shimmering bugs of AMD's old AF algorithm don't change that; it was obviously just a bug in the implementation using wrong weights for the interpolation of the samples. As embarrassing as that is, they didn't cut corners the way early implementations did, or the way Intel was doing before Ivy Bridge. One may provoke corner cases with manually set mipmaps (not correctly filtered, or not matching between levels, as is the case with one of the filter test programs out there, IIRC), but you can't blame the hardware for that IMO.

As I said already, you don't need to defend something that even Andrew Lauritzen called terrible, while admitting that Intel was the last to move in the right direction. Intel did their homework, and AMD ironed out their bugs too. Let's be happy about that. ;)
 
it was the worst solution

When it comes to shimmering and banding, AMD had the worst solution not long ago. Oh well, Trinity and Richland still suffer from shimmering.

intel's HQ was right from shimmering hell

Irrelevant, since balanced was always set as the default by the driver. Nobody used quality.

As I said already, you don't need to defend something that even Andrew Lauritzen called terrible, while admitting that Intel was the last to move in the right direction.

I don't defend anything; I just correct wrong statements from you. It is funny that it's mostly people who never tried it in games who are the biggest moaners.
 
I don't defend anything; I just correct wrong statements from you.
Then you also have to correct quite a few tests out there. :LOL:
I gave you my reasoning, which is backed up by tests from respected sites concluding that, for overall AF quality, Intel sat in last place back then. Miraculously, this also appears to be Andrew's position here in the thread. You have chosen to argue that Intel's AF wasn't that bad and to defend it by pointing at competitors' weaknesses. I can accept that you weigh the factors differently. Do what you want with that.
That's a nice example falsifying Paran's statement.
 


That has nothing to do with the AF from HD3000 itself. I tested it at the time and reported it to Intel. What happens here is that the app didn't enforce AF; in fact, AF wasn't enabled at all. If you disable AF on Nvidia and AMD it shimmers as well in this app (edit: not anymore on Nvidia and Intel, just checked with the newest Google Earth, probably the same for AMD). This bug was corrected by Google in their next major version: AF is properly enforced and there's no shimmering anymore.
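For the record, "enforcing AF from the app side" is just a sampler-state setting the application has to make; a minimal OpenGL example in C (assuming EXT_texture_filter_anisotropic is supported) looks like the sketch below. If the application leaves the defaults, it is relying on the user forcing AF through the driver control panel, which is what apparently went wrong in the Google Earth case above.

Code:
#include <GL/gl.h>

/* EXT_texture_filter_anisotropic tokens, in case the system headers are old
 * (the values are fixed by the extension spec). */
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

/* Request trilinear + maximum anisotropic filtering on a texture from the
 * application side instead of hoping the driver forces it. */
void enable_app_side_af(GLuint tex)
{
    GLfloat max_aniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);

    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);            /* trilinear base */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}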

That's a nice example falsifying Paran's statement.


That's a nice example of someone who never tried Intel graphics and has no clue about it.


Back to topic. Here is a Grid 2 test with Iris Pro 47W+55W and AMD's newest APU: http://techreport.com/review/24954/amd-a10-6800k-and-a10-6700-richland-apus-reviewed/5

In Grid 2 Iris Pro does a pretty good job.
 
Seems like you've got a lot of insight into the new Haswell stuff, nAo. You don't happen to have an explanation for why these two Haswell goodies aren't available on the "normal" Haswell versions but only on the Iris versions?

As I said, I don't think that's true. If it is, it's just a judgement call on performance in that specific engine. The extensions are available on any Haswell GPU.

nAo and I both worked on this stuff (see authors for the papers I linked), hence the "insight" :)


You were right, Andrew, it works on every Haswell. I just tried Grid 2 on a GT2 HD 4600 and the two Intel options are available, so they're not bound to Iris graphics only.
 
Performance numbers on my HD 4600. I did three runs of the integrated benchmark.

Medium preset, 720p, no Intel extensions = 60.5 fps
Medium preset, 720p, Advanced Blending + Smoke Shadows = 55.5 fps

That's an 8.3% loss with the two Intel extensions enabled.
 
It looks like they don't benchmark the Intel features and MSAA at the same time (I think, since I don't read German despite doing it in school ;)). Any reason for that?
 
It looks like they don't benchmark ... the Intel features and MSAA at the same time, any reason for that?
No real technical reason, but it would probably require lowering quality settings to maintain high performance. Note that there are also screen-space AA techniques in Grid 2 which smooth out a lot of the remaining jaggies. MSAA is fairly expensive on Haswell, so OIT + post AA is probably a better trade-off.
 