Grid 2 has exclusive Haswell GPU features

Discussion in 'Architecture and Products' started by Davros, May 29, 2013.

  1. NThibieroz

    Newcomer

    Joined:
    Jun 8, 2013
    Messages:
    31
    Likes Received:
    8
    Sorry about the font in my first post. I used Word to compose my answer and didn't realize the copy/paste would keep the (large) font.
    The initial release of TressFX had problems; clipping was one of them. A game update improved this particular issue considerably. In general, there is a lot more Crystal Dynamics and AMD would have loved to add to TressFX for this game, but we ran out of time (this is mentioned in Jason Lacroix's GDC presentation). One example is the use of a separate hair shadow map for shadow casting onto Lara's face.

    I think we have already reached the line beyond which public discussions with a competitor on the merits of our respective products become unhealthy. So I will not comment on this.

    I haven't mentioned an extension so far. It's too early to talk about this.
     
  2. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    Sure, but like I said we're just talking tech here, not pricing. I did say the latter is obviously relevant to consumers, but not to an architectural discussion.

    This is just outdated thinking... "image quality" concerns were mostly a DX9-era thing, when the APIs were very loose in their requirements. These days (DX10+ GPUs) the only significant degrees of freedom are anisotropic filtering and LOD computation. As nAo already mentioned, I think Intel's is currently the best as of Ivy Bridge, although all three are really very good at this point. Several of the Haswell reviews had image quality comparisons, and none that I saw found any noticeable differences between the modern implementations, as expected.

    Of course the way you phrased that makes me think that you just have a preconceived notion that Intel graphics has to be worse "somehow", and are just looking for ways to support that...

    Yeah that's cool, I just wanted to explain what I meant about my "misleading" comment, since you replied to that. As I stated twice, I realize you have to play "the marketing game" when speaking to the media anyways.

    Well I'm interested to hear whatever comes of it :) That said, I still don't think there's any way that you can retroactively change the DX spec and make those semantics safe to ship on "any DX11 card", so it effectively is an extension, regardless of how it might be implemented.
     
  3. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,749
    Likes Received:
    2,515
    For example, Anand mentioned some form of texture flashing in BF3 (a major game). Bugs and lacking game profiles are to be expected from Intel solutions, and those directly affect image quality as well.

    Maybe. I just have a hard time accepting the idea that Intel's recent and young offerings could be better than the old and gold solutions from NVIDIA and AMD; the difference in experience and support is vast.
     
  4. Homeles

    Newcomer

    Joined:
    May 25, 2012
    Messages:
    234
    Likes Received:
    0
    They're just as much to be expected from AMD and Nvidia solutions. Bugs happen.
    There's a difference, but it's not even remotely as pronounced as it used to be.
     
  5. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    There's a difference between driver maturity and image quality. Texture flashing is hopefully a driver bug. Texture filtering is pure hardware. Of course, it's pointless to have the best image quality if you're rendering the wrong triangles. :wink:
     
  6. Paran

    Regular Newcomer

    Joined:
    Sep 15, 2011
    Messages:
    251
    Likes Received:
    14

    That's a minor driver or game issue and has nothing to do with image quality at the hardware level. Such issues exist on AMD and Nvidia too; read a few driver threads and you will find similar reports.
     
  7. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,848
    Likes Received:
    2,267
    Well then it looks like the only way to resolve this is
    wait for it....

    IHV Deathmatch
    Nick, Andy, you both have Quake 3, right?
     
  8. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,416
    Likes Received:
    178
    Location:
    Chania
    Not that it's relevant to the topic, but Intel's engineers would deserve to be publicly lynched if they didn't deliver high-quality AF nowadays, considering the abominations found in Intel's past integrated GPU solutions years ago.

    The real question is why Intel had to be pushed into caring about such details, indirectly, by increasing competition in the first place; from some point onwards these "details" should be self-evident, and nobody should try to sell a negative LOD slider as any form of AF, for example.
     
  9. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    You're applying a double-standard here. Everyone's AF implementations used to be terrible, Intel just started to care about graphics the most recently of the three major IHVs.

    Do you understand how the math for LOD calculations works? It's no secret that all vendors used shortcuts in the past to avoid expensive operations like square roots. There's no LOD "bias" going on; by necessity, poor approximations must err on the side of blurrier LODs, or else you expose aliasing and often violate the spec's acceptable range for the LOD computation. There's no conspiracy or trickery: in the past everyone used poor approximations to this LOD-from-gradients computation, and now those approximations have been improved.
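
    To make the gradient math concrete, here is a minimal sketch (Python, with made-up gradient values; not any vendor's actual hardware path) of the textbook LOD computation next to a cheap square-root-free shortcut. Note that the shortcut can only overestimate the gradient length, so its error always lands on the blurrier side:

        import math

        def lod_exact(dudx, dvdx, dudy, dvdy):
            # Textbook LOD: log2 of the longer screen-space gradient vector.
            rho = max(math.hypot(dudx, dvdx), math.hypot(dudy, dvdy))
            return math.log2(max(rho, 1e-30))

        def lod_cheap(dudx, dvdx, dudy, dvdy):
            # Sqrt-free shortcut: bound each gradient length by the sum of the
            # absolute components. |a| + |b| >= sqrt(a^2 + b^2), so rho can only
            # be overestimated and the LOD errs blurrier, never toward aliasing.
            rho = max(abs(dudx) + abs(dvdx), abs(dudy) + abs(dvdy))
            return math.log2(max(rho, 1e-30))

        grads = (3.0, 1.0, 0.5, 2.0)                 # hypothetical texel-space gradients
        print(lod_exact(*grads), lod_cheap(*grads))  # cheap >= exact, always

    The old implementations differed mainly in how sloppy that bound was; tightening it is what improved LOD accuracy, not removing some deliberate bias.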
     
  10. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,416
    Likes Received:
    178
    Location:
    Chania
    Am I? Let's see.

    Diplomatically phrased, no doubt. Intel obviously had horrible AF for the longest time precisely because they only started to care about graphics recently. Only from IVB onwards did we get AF approximately on the level of what the competition had offered more or less by default for probably more than half a decade prior.

    No, I'm a simple layman and I can't write a single line of code, if that's good enough for you. However, my eyes as a user are trained well enough to tell a negative LOD offset (which can be verified with AF testing applications in any case) apart from real anisotropic filtering or any optimisation of the latter.

    In the meantime, again, kudos for getting on par with the competition, but I beg your pardon: Intel, or anyone associated with it, also has to take the criticism that it took TOO LONG to get this far. Better late than never, for sure.
     
  11. nAo

    nAo Nutella Nutellae
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,325
    Likes Received:
    93
    Location:
    San Francisco
    There is no forcing; there are just priorities, which, as I am sure you have noticed, have changed quite a bit in recent times. Anisotropic filtering is not exactly rocket science; it actually takes more effort to do it "wrong" (but cheap) than to do it properly, if you are willing to pay the area/perf cost for it.
     
  12. Paran

    Regular Newcomer

    Joined:
    Sep 15, 2011
    Messages:
    251
    Likes Received:
    14

    Half a decade? Prior to GCN, AMD had horrible shimmering and banding issues. In fact, the issue is still present on their pre-GCN APUs (edit: Kabini and Temash, with GCN, are of course fixed). Banding was fixed with VLIW4, but shimmering got even worse. Intel, at least, never had shimmering issues, and Nvidia indeed learned their lesson with G80. Intel's AF implementation is top notch nowadays; however, in DirectX games it can't be forced through the driver, so without application-side AF support Intel users won't get AF. I have reported this to them numerous times, but unfortunately they don't seem to care.
     
  13. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    Sure, no doubt. I'm not defending the bad quality in the past (I wasn't even at Intel then), and I doubt anyone else would either... at that time integrated graphics was just what you slapped onto the spare space on the die. I'll also note that everyone still does very "brilinear"-like optimizations, so if you're going to complain about AF quality, you really should be quite mad about that.

    But what I meant by double standard is that you're picking just one thing to be mad about (retroactively, since it's no longer an issue). To bring it back to the topic of this thread: why not similarly criticize IHVs for not providing programmable blending for 15+ years? Haswell and some mobile parts seem to be the only ones with workable solutions there today. Arguably that's even more important than LOD computation accuracy.
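
    For readers unfamiliar with the term: with fixed-function blending the shader outputs a source color and the ROP combines it with the destination using one of a small set of fixed equations, while programmable blending lets the fragment stage read the destination and run arbitrary logic on it. A rough software-rasterizer-style sketch (Python; all names are hypothetical, not any real API):

        def fixed_function_blend(src, dst):
            # Fixed-function: the ROP applies a fixed equation, e.g. classic
            # alpha blending. The shader never gets to see dst.
            a = src[3]
            rgb = tuple(s * a + d * (1 - a) for s, d in zip(src[:3], dst[:3]))
            return rgb + (1.0,)

        def programmable_blend(src, dst):
            # Programmable: the fragment stage reads dst itself and may branch
            # on it -- something no fixed blend equation can express. The cost
            # is that overlapping fragments must be serialized per pixel, which
            # is the hard part the newer hardware makes workable.
            lum = 0.2126 * dst[0] + 0.7152 * dst[1] + 0.0722 * dst[2]
            if lum > 0.5:
                return tuple(s * d for s, d in zip(src, dst))  # darken bright pixels
            return src                                         # replace dark pixels

        dst = (0.2, 0.2, 0.2, 1.0)
        src = (0.8, 0.1, 0.3, 0.5)
        print(fixed_function_blend(src, dst))
        print(programmable_blend(src, dst))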
     
  14. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    When all the rest was horrible, who cared how good the AF quality was? Hardly anyone played at high quality settings anyway.

    First fix the important stuff, then improve the rest.
     
  15. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    I think that's not exactly fair. I agree that Intel obviously didn't care much about graphics quality until recently. But then they did it right: no slow crawl toward improving some mediocre approximation, but a jump straight to the optimum, to "textbook AF" if you want to call it that. Currently they really are providing a top-notch AF solution, so I don't think it's justified to argue from the shortcomings of their older graphics.
    While I agree on the shimmering issue (it was apparently a bug in the interpolation algorithm for the samples; they didn't use fewer samples, but averaged them with wrong weights), overall it was still orders of magnitude better than Intel's earlier AF solutions.

    But that is all in the past. The latest solutions from AMD, Intel, and nV are on about the same (close to perfect) level, at least for DX10/11. In DX9, an application can request bilinear AF, where GCN still appears to have some issues. That's why I think Andrew's assertion that Intel probably provides the best AF solution on the market is likely true. While nV doesn't show GCN's bilinear-AF issues, they still have some slight angle dependency in their AF (in practice really, really close to indistinguishable), so technically it isn't as perfect as Intel's.
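
    To put a number on "textbook AF": the standard scheme derives the footprint's major and minor axis lengths from the texture-coordinate gradients, takes roughly ceil(major/minor) probes along the major axis, and lets the minor axis set the LOD. Angle dependency means the hardware's estimate of those axes degrades for some footprint orientations. A minimal sketch (Python) using the common simplification that treats the two gradient vectors themselves as the axes (the exact method uses the Jacobian's ellipse):

        import math

        def af_setup(dudx, dvdx, dudy, dvdy, max_aniso=16):
            # Approximate the footprint axes by the two gradient vectors.
            px = math.hypot(dudx, dvdx)
            py = math.hypot(dudy, dvdy)
            major, minor = max(px, py), min(px, py)
            # Probe count along the major axis, clamped to the supported maximum;
            # the per-probe footprint (major / n) then determines the LOD.
            n = min(max_aniso, max(1, math.ceil(major / max(minor, 1e-30))))
            lod = math.log2(max(major / n, 1e-30))
            return n, lod

        # A 4:1 stretched footprint, e.g. a floor seen at a glancing angle:
        print(af_setup(4.0, 0.0, 0.0, 1.0))  # -> (4, 0.0): 4 probes at base LOD

    This roughly mirrors the scheme in the EXT_texture_filter_anisotropic spec; real hardware differs mainly in how it approximates the axes, which is exactly where angle dependency creeps in.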
     
  16. Paran

    Regular Newcomer

    Joined:
    Sep 15, 2011
    Messages:
    251
    Likes Received:
    14
    Better in what way? AMD had, or still has, horrible shimmering and banding issues; Intel didn't.
     
  17. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,129
    Likes Received:
    903
    Location:
    still camping with a mauler
    From the reviews it looks like Haswell GT3e offers good performance/watt as a laptop GPU. But for the cost of a GT3e chip you could get a 'good enough' Haswell CPU plus faster dedicated graphics from NVIDIA or AMD, and once you realize that, Iris Pro becomes much less appealing.

    In fact, after checking all the vendors I know of, I can't figure out how to buy a laptop with GT3e graphics right now. My google skills have failed me :(
     
  18. DavidC

    Regular

    Joined:
    Sep 26, 2006
    Messages:
    347
    Likes Received:
    24
  19. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,129
    Likes Received:
    903
    Location:
    still camping with a mauler
    Oh, well now I don't feel so stupid :)
     
  20. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    You've never seen it pre-Ivy Bridge? Andrew admitted that it "used to be terrible" and that he doubts anybody would defend the quality of the past. In fact, on the "quality" setting (supposedly the best), it created much worse shimmering than AMD ever did; it basically looked like a -1 LOD bias combined with subpar, highly angle-dependent filtering. It was clearly the worst solution at that time.
    There is no need to defend it today, as Intel has done their homework in that field.
     