NThibieroz
Sorry about the font in my first post. I used Word to compose my answer and didn't realize the copy/paste would keep the (large) font.

ps: nick -- thanks for turning up, but the font size, for Christ's sake ;D
pps: you make TressFX sound very rosy, but it was a big hit on FPS, and that's with only a single character supporting it (from a quick Google I saw between 30 and 50%). Plus, sometimes the hair would clip through her shoulder blades and make her look like she had hairy armpits.
The initial release of TressFX had problems, and clipping was one of them. A game update improved this particular issue considerably. In general, there is a lot more Crystal and AMD would have loved to add to TressFX for this game, but we ran out of time (this is mentioned in Jason Lacroix's GDC presentation). One example is the use of a separate hair shadow map for shadow casting onto Lara's face.
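For readers unfamiliar with the idea, here is a minimal, self-contained C++ sketch of the general "separate hair shadow map" concept. This is not the TressFX or Tomb Raider implementation; the resolution, the simple point splatting, and all names here are illustrative assumptions. The idea is just: render hair depth from the light's point of view into a dedicated small depth map, then test face samples against that map and attenuate lighting where hair occludes them.

```cpp
// Conceptual sketch only: CPU-side illustration of a hair shadow map,
// not real-time shader code. All values and names are made up for illustration.
#include <array>
#include <algorithm>
#include <cstdio>

constexpr int kMapSize = 64;                        // hair shadow map resolution (assumed)
using ShadowMap = std::array<float, kMapSize * kMapSize>;

struct Point { float x, y, z; };                    // light-space position, z = depth from light

// Pass 1: splat hair sample points into the depth map, keeping the nearest depth.
void RenderHairDepth(const Point* hair, int count, ShadowMap& map) {
    map.fill(1.0f);                                 // clear to far plane
    for (int i = 0; i < count; ++i) {
        int u = static_cast<int>((hair[i].x * 0.5f + 0.5f) * (kMapSize - 1));
        int v = static_cast<int>((hair[i].y * 0.5f + 0.5f) * (kMapSize - 1));
        u = std::clamp(u, 0, kMapSize - 1);
        v = std::clamp(v, 0, kMapSize - 1);
        float& depth = map[v * kMapSize + u];
        depth = std::min(depth, hair[i].z);
    }
}

// Pass 2: when shading a face sample, compare its light-space depth against the
// hair depth map; if hair is closer to the light, darken the lighting.
float HairShadowFactor(const Point& facePoint, const ShadowMap& map, float bias = 0.01f) {
    int u = static_cast<int>((facePoint.x * 0.5f + 0.5f) * (kMapSize - 1));
    int v = static_cast<int>((facePoint.y * 0.5f + 0.5f) * (kMapSize - 1));
    u = std::clamp(u, 0, kMapSize - 1);
    v = std::clamp(v, 0, kMapSize - 1);
    return (map[v * kMapSize + u] + bias < facePoint.z) ? 0.5f : 1.0f;  // 0.5 = in hair shadow
}

int main() {
    ShadowMap map;
    Point hair[] = {{0.0f, 0.0f, 0.2f}};            // one hair sample near the light
    RenderHairDepth(hair, 1, map);
    Point cheek{0.0f, 0.0f, 0.6f};                  // face sample behind the hair
    std::printf("shadow factor: %.1f\n", HairShadowFactor(cheek, map));
}
```

The point of using a dedicated map is that hair occlusion can be resolved at a different resolution and bias than the scene's regular shadow maps, which helps keep the self-shadowing on the face soft without polluting the main shadow pass.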
I think we have already reached the line beyond which public discussions with a competitor on the merits of our respective products become unhealthy, so I will not comment on this.

But then in other parts of the interview, you basically imply that AMD APUs are somehow in a different performance category; for instance: "when it comes to high-end gaming, current Intel integrated graphics solutions usually force users to compromise between quality or performance, which is a tough choice to impose on gamers." The same statement can be made for all integrated graphics... it's sort of obvious - they run at a fraction of the power and thermal budget of discrete GPUs.
So you're saying that Intel's solutions are not fast enough to be usable, and yet Iris Pro is faster than any of the current APUs that AMD have released, as far as I can tell from reviews. So does that mean AMD APUs are magically more usable just because they say AMD on them? Invoking the PS4 is "misleading", since AMD ships nothing of that class of part for PCs currently; the current parts aren't even GCN-based yet, so the comparison is pretty far-fetched IMHO.
I haven't mentioned an extension so far. It's too early to talk about this.

I don't doubt for a second it can be made safe; what I doubt is that it can be made safe on any past, present, and future DX11+ GPU. At that point it just becomes a proprietary extension, but I thought you were arguing that developers don't like that.