Grid 2 has exclusive Haswell GPU features

Discussion in 'Architecture and Products' started by Davros, May 29, 2013.

  1. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,854
    Likes Received:
    2,272
    Just a heads up:
    GRID 2 supports AVX, and the AVX-exclusive features seem to be advanced blending and smoke shadows.
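    How the game gates these options isn't stated here, but presumably it checks the CPU's feature flags at startup. A minimal sketch of such a check (hypothetical helper, parsing Linux /proc/cpuinfo-style text; not the game's actual detection code):

    ```python
    def cpu_flags(cpuinfo_text):
        """Parse the 'flags' line of /proc/cpuinfo-style text into a set of feature names."""
        for line in cpuinfo_text.splitlines():
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
        return set()

    def supports_avx(cpuinfo_text):
        return "avx" in cpu_flags(cpuinfo_text)

    # On a real Linux box you would feed it the live file:
    #   supports_avx(open("/proc/cpuinfo").read())
    sample = "processor : 0\nflags : fpu sse sse2 avx\n"
    print(supports_avx(sample))  # True
    ```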
     
  2. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
  3. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    So to get the game looking at its best, you have to play it on a slow as hell GPU?

    Greaaaaat.
     
  4. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    Grid 2 runs great on Haswell, as demo'd at GDC, etc...
     
    #4 Andrew Lauritzen, May 29, 2013
    Last edited by a moderator: May 29, 2013
  5. zorg

    Newcomer

    Joined:
    Aug 1, 2012
    Messages:
    32
    Likes Received:
    0
    Location:
    Sweden
    With medium settings on a GT3e design, which won't be part of any Haswell that uses the LGA1150 socket. Please tell me you're joking.:smile:
     
  6. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    Yeah, I doubt it could do max settings on any Haswell GPU at anything above 1280x720, if that.

    Plus, well, Intel decided to not put the best GPU on the desktops.

    *sigh* It's like Nvidia's shenanigans again, except even worse. How so? At least with Nvidia's bullcrap, you can simply replace your video card if you absolutely must have X feature in Y game. With this? You have to buy a laptop and accept reduced frame rates and resolution.
     
  7. pcchen

    pcchen Moderator
    Moderator Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    2,743
    Likes Received:
    106
    Location:
    Taiwan
    I believe these effects require specific hardware features in Haswell's GPU. It might be possible to make similar effects using general pixel shaders, but it's not likely to be efficient.
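    The ordering problem behind this is easy to see with standard "over" compositing: blending the same two translucent fragments in different orders yields different colors, which is why per-pixel ordering guarantees matter for programmable blending. A small illustration of just the math (not code from the game):

    ```python
    def over(src_rgb, src_a, dst_rgb):
        """Standard alpha 'over' blend: src composited on top of dst."""
        return tuple(src_a * s + (1.0 - src_a) * d for s, d in zip(src_rgb, dst_rgb))

    background = (0.0, 0.0, 0.0)
    red   = ((1.0, 0.0, 0.0), 0.5)   # translucent red fragment
    green = ((0.0, 1.0, 0.0), 0.5)   # translucent green fragment

    # Red composited over green vs. green composited over red:
    a = over(*red,   over(*green, background))
    b = over(*green, over(*red,   background))
    print(a)  # (0.5, 0.25, 0.0)
    print(b)  # (0.25, 0.5, 0.0) -- a different color, from the same fragments
    ```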
     
  8. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,845
    Likes Received:
    329
    Location:
    35.1415,-90.056
    I read it in a similar way; those effects depend on hardware capability of the Haswell GPU. If you don't have (or aren't using) a Haswell GPU? That's fine, play it at the "highest available" settings for your platform.

    I see this as no different than adding in some CUDA-based effects for your game. Don't have a CUDA capable video card? No problem, feel free to play without it. Does that significantly detract from your ability to enjoy the game? That depends on you... Seemed to work OK for a slew of other games, but perhaps that's different "NVIDIA BULLCRAP" because -- uh, it's NVIDIA I guess?
     
  9. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    I just dislike the entire idea of vendor specific features, and this case is especially bad because of what I outlined in my last two posts.

    FWIW, I've been a user of Nvidia cards for seven years running. I keep using them mainly because AMD keeps having shortcomings I dislike. I highly, highly dislike Nvidia's CUDA crap, PhysX, and the Batman anti-aliasing debacle. It was all bullcrap, and the way I see it, so is this Haswell thing.
     
  10. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    High settings, 1080p, but yeah. Do you guys just make this stuff up to fit your preconceptions or do you actually have any real information? ;) And who honestly cares about whether a CPU is socket-able these days? I have not upgraded just a CPU in many, many years... because it's pointless and almost always better to just build a new system and have two working systems (or give away the old one).

    And yes, the design point is laptops where you'll find it to be very competitive. There are the "R-series" parts I believe if you want to bump the TDP a bit further, but yes - news flash - Intel doesn't make discrete GPUs.

    Like pcchen said, it requires special hardware features (pixel shader ordering). If you read the page I linked above you'd know all this, and if you read the associated papers that are now years old you'd know that you can indeed do it without the hardware feature, but it's multipass and quite inefficient. The samples we have released have both implementations fully available in source code; people usually just choose to not use the general DX11 implementation because it's too slow.

    Yes I won't be able to use it on my GTX 680 or 7970 either, but it's not like the game fundamentally depends on it. In any case I'm not going to let other folks' hardware choices hold me back from innovating in graphics...

    i.e. if you're buying a laptop/all-in-one and love GRID 2 or similar, you might want to take a look at the Intel parts. If you're on a desktop with a big discrete GPU, just ignore the settings and be happy.
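    Roughly, what pixel shader ordering guarantees is that the read-modify-write section of overlapping fragments executes in primitive-submission order instead of whatever order the hardware happens to finish them in. A toy CPU-side model of that difference (purely illustrative; nothing like how the GPU actually implements it):

    ```python
    import random

    def shade_unordered(fragments, pixel):
        """Without ordering: fragments hit the pixel in nondeterministic completion order."""
        arrival = fragments[:]
        random.shuffle(arrival)          # models unpredictable shader completion
        for color, alpha in arrival:
            pixel[:] = [alpha * c + (1 - alpha) * p for c, p in zip(color, pixel)]

    def shade_ordered(fragments, pixel):
        """With pixel shader ordering: the critical section runs in submit order."""
        for color, alpha in fragments:   # always primitive-submission order
            pixel[:] = [alpha * c + (1 - alpha) * p for c, p in zip(color, pixel)]

    frags = [((1, 0, 0), 0.5), ((0, 1, 0), 0.5)]
    px = [0.0, 0.0, 0.0]
    shade_ordered(frags, px)
    print(px)   # deterministic result regardless of shader timing
    ```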
     
    #10 Andrew Lauritzen, May 29, 2013
    Last edited by a moderator: May 29, 2013
  11. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    Couldn't this be done via compute shaders? That'd make it compatible with any DX11 card...

    As for CPU upgrades? Well, I just did one a few weeks ago. All I did was swap out the cooler and the CPU.
     
  12. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    Short answer, no. Read these:
    http://software.intel.com/en-us/articles/adaptive-volumetric-shadow-maps
    http://software.intel.com/en-us/articles/adaptive-transparency
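    The core trick in the AVSM paper linked above is compressing a per-pixel transmittance curve down to a fixed node budget by repeatedly dropping the interior node whose removal changes the area under the curve the least. A rough, simplified Python sketch of that compression step (details hedged; the real thing runs in a shader with fixed-size arrays):

    ```python
    def compress(nodes, max_nodes):
        """nodes: (depth, transmittance) pairs describing a visibility curve.
        Greedily remove the interior node whose removal perturbs the
        area under the curve the least, until the budget is met."""
        nodes = sorted(nodes)
        while len(nodes) > max_nodes:
            best_i, best_err = None, float("inf")
            for i in range(1, len(nodes) - 1):   # endpoints are always kept
                (d0, t0), (d1, t1), (d2, t2) = nodes[i - 1], nodes[i], nodes[i + 1]
                # area change if node i is removed (triangle area of the three points)
                err = abs((d2 - d0) * (t1 - t0) - (d1 - d0) * (t2 - t0)) * 0.5
                if err < best_err:
                    best_i, best_err = i, err
            nodes.pop(best_i)
        return nodes

    curve = [(0.0, 1.0), (1.0, 0.9), (2.0, 0.5), (3.0, 0.45), (4.0, 0.1)]
    print(compress(curve, 4))
    ```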

    From what to what? These days I never feel the need to upgrade CPU more than once every few years and given that cycle, it's as much time to upgrade everything else in the system as well. To each his own, but swapping CPUs holds zero interest for me.
     
    #12 Andrew Lauritzen, May 29, 2013
    Last edited by a moderator: May 29, 2013
  13. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    I went from a dual-core Conroe based CPU to a Yorkfield based one. Double the cores, faster FSB and more L2 cache were what I was gunning for and I got it.

    Still, it's not that uncommon if you look around the Internet. Given Intel changes sockets once every two and a half years (not that I blame them for it usually, though I do question why they changed from LGA 1156 to LGA 1155... surely the benefits couldn't have been THAT great?), there are more whole-system upgrades than ever, but AMD's been keeping stuff backwards and forwards compatible for a while now. If you look at folks with AM* sockets, they're more likely to just upgrade from an old CPU to a new one.

    As for the rest of my points? I retract them; it's clear I didn't have enough knowledge. That said, I'm never gonna be all that happy about vendor-specific features. I hope that in a few years GPUs will get the features Haswell has, and that Codemasters will patch GRID 2 to offer a somewhat more generic form of what Haswell users get out of it.

    Sorry if this post is a little unclear, I'm a bit light headed right now. I'll redo it in a separate post later to clarify things if needed.
     
  14. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha Subscriber

    Joined:
    May 14, 2005
    Messages:
    1,373
    Likes Received:
    242
    Location:
    NY
    About time you bums did something besides ray trace quake. :razz: :wink:

    Hopefully only the beginning...
     
  15. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    I fully agree with you here. In fact, I've personally pushed these features for standardization (starting years ago) with the major API folks, but ultimately things are moving very slowly in the API space right now. Rather than let that slow GPU innovation, it seems better to go and prove how useful a feature is so that everyone applies pressure on the other hardware vendors and API overlords to add it too. i.e. your reaction makes me very happy, and I think Codemasters would like nothing more than to have it supported everywhere too :)
     
    #15 Andrew Lauritzen, May 29, 2013
    Last edited by a moderator: May 30, 2013
  16. doob

    Regular

    Joined:
    May 21, 2005
    Messages:
    392
    Likes Received:
    4
    Hasn't Crytek implemented volumetric fog shadows in their recent Crysis 3 title? They didn't use AVX instructions for it, so was what they achieved done through some clever trick?
     
  17. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,012
    Likes Received:
    112
    The extension mechanism used looks rather hilarious to me (create specially constructed D3D11_USAGE_STAGING/D3D11_CPU_ACCESS_READ buffers; e.g. to query whether the extensions are available, put "INTCEXTNCAPSFUNC" into the initial data). Looks even funnier than the d3d9 fake formats :) though it should be much more powerful, I guess. Are other vendors doing the same (or don't they have any extensions)? I agree though, it would be very nice if extensions like that could be standardized. Well, you could with OpenGL, but I don't think Intel did?
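    To make that mechanism concrete: since D3D11 has no official extension hook, the vendor driver watches staging-buffer creation and treats a buffer whose initial bytes are a magic token as an extension call rather than ordinary data. A hedged sketch of the idea as a mock driver in Python (not the real Intel interface; only the "INTCEXTNCAPSFUNC" token comes from the post, and the reported caps are made up):

    ```python
    MAGIC = b"INTCEXTNCAPSFUNC"

    class MockDriver:
        """Toy model of a driver intercepting staging-buffer creation.
        It sniffs the buffer's initial data for a magic token and, if
        found, answers the capability query instead of allocating data."""
        def create_staging_buffer(self, initial_data):
            if initial_data.startswith(MAGIC):
                # Recognized extension query: report caps (values hypothetical).
                return {"extension_caps": {"pixel_shader_ordering": True}}
            # Otherwise behave like a normal buffer allocation.
            return {"buffer": bytearray(initial_data)}

    drv = MockDriver()
    print(drv.create_staging_buffer(MAGIC + b"\x00" * 16))
    print(drv.create_staging_buffer(b"ordinary vertex data"))
    ```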
     
  18. Wynix

    Veteran Regular

    Joined:
    Feb 23, 2013
    Messages:
    1,052
    Likes Received:
    57
    I was about to post something similar to what I.S.T. did, but damn, that is a great response.
    It addresses my issue and then some. Well done. :grin:
     
  19. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    Ha yeah, that's the reality of hacking through an interface without an official extensions mechanism ;) There are helper headers that clean it up and make it look prettier :)

    Yeah, it's pretty much unavoidable. Especially for extensions in shaders: there's no clean way to extend HLSL since it goes through Microsoft's compiler, so it always ends up being hacky.

    Yeah I believe the features will be supported in OpenGL as well (and as you note, via the official extension mechanism), just DX was the priority for most game developers.
     
  20. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,012
    Likes Received:
    112
    Yes, I noticed; I was just wondering how it found its way around DX11's "no extensions possible" stance :).
    Ok, sounds like a plan!
     