Grid 2 has exclusive Haswell GPU features

Davros

Legend
Just a heads up
Grid 2 supports AVX, and the AVX-exclusive features seem to be advanced blending and smoke shadows
 
So to get the game looking at its best, you have to play it on a slow as hell GPU?

Greaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaat.
 
Yeah, I doubt it could do max settings on any Haswell GPU at anything above 1280x720, if that.

Plus, well, Intel decided to not put the best GPU on the desktops.

*sigh* It's like Nvidia's shenanigans again, except even worse. How so? At least with Nvidia's bullcrap, you can just simply replace your video card if you absolutely must have X feature you want in Y game. With this? You have to buy a laptop and accept reduced frame rates and resolution.
 
*sigh* It's like Nvidia's shenanigans again, except even worse. How so? At least with Nvidia's bullcrap, you can just simply replace your video card if you absolutely must have X feature you want in Y game. With this? You have to buy a laptop and accept reduced frame rates and resolution.

I believe these effects require specific hardware features in Haswell's GPU. It might be possible to make similar effects using general pixel shaders, but it's not likely to be efficient.
 
I believe these effects require specific hardware features in Haswell's GPU. It might be possible to make similar effects using general pixel shaders, but it's not likely to be efficient.

I read it in a similar way; those effects depend on hardware capability of the Haswell GPU. If you don't have (or aren't using) a Haswell GPU? That's fine, play it at the "highest available" settings for your platform.

I see this as no different than adding in some CUDA-based effects for your game. Don't have a CUDA capable video card? No problem, feel free to play without it. Does that significantly detract from your ability to enjoy the game? That depends on you... Seemed to work OK for a slew of other games, but perhaps that's different "NVIDIA BULLCRAP" because -- uh, it's NVIDIA I guess?
 
I read it in a similar way; those effects depend on hardware capability of the Haswell GPU. If you don't have (or aren't using) a Haswell GPU? That's fine, play it at the "highest available" settings for your platform.

I see this as no different than adding in some CUDA-based effects for your game. Don't have a CUDA capable video card? No problem, feel free to play without it. Does that significantly detract from your ability to enjoy the game? That depends on you... Seemed to work OK for a slew of other games, but perhaps that's different "NVIDIA BULLCRAP" because -- uh, it's NVIDIA I guess?

I just dislike the entire idea of vendor specific features, and this case is especially bad because of what I outlined in my last two posts.

FWIW, I've been using Nvidia cards for seven years running. I keep using them mainly because AMD keeps having shortcomings I dislike. But I highly, highly dislike Nvidia's CUDA crap, the PhysX stuff, and the Batman anti-aliasing debacle. It was all bullcrap, and the way I see it, so is this Haswell thing.
 
With medium settings on a GT3e design, which will not be part of any Haswell that uses the LGA1150 socket.
High settings, 1080p, but yeah. Do you guys just make this stuff up to fit your preconceptions or do you actually have any real information? ;) And who honestly cares about whether a CPU is socket-able these days? I have not upgraded just a CPU in many, many years... because it's pointless and almost always better to just build a new system and have two working systems (or give away the old one).

And yes, the design point is laptops where you'll find it to be very competitive. There are the "R-series" parts I believe if you want to bump the TDP a bit further, but yes - news flash - Intel doesn't make discrete GPUs.

*sigh* It's like Nvidia's shenanigans again, except even worse
Like pchen said, it requires special hardware features (pixel shader ordering). If you read the page I linked above you'd know all this, and if you read the associated papers that are now years old you'd know that you can indeed do it without the hardware feature, but it's multipass and quite inefficient. The samples we have released have both implementations fully available in source code; people usually just choose to not use the general DX11 implementation because it's too slow.

Yes I won't be able to use it on my GTX 680 or 7970 either, but it's not like the game fundamentally depends on it. In any case I'm not going to let other folks' hardware choices hold me back from innovating in graphics...

i.e. if you're buying a laptop/all-in-one and love GRID 2 or similar, you might want to take a look at the Intel parts. If you're on a desktop with a big discrete GPU, just ignore the settings and be happy.
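To make the ordering point concrete: the programmable blending these effects rely on is order-dependent, because "over" compositing isn't commutative, so fragments that land on the same pixel have to be resolved in a well-defined order (which is exactly what pixel shader ordering guarantees). Here's a tiny standalone illustration of that non-commutativity; purely a sketch, not code from GRID 2 or our samples:

```cpp
#include <cstdio>

// Premultiplied-alpha color sample.
struct Rgba { float r, g, b, a; };

// Classic "over" operator: src composited on top of dst.
Rgba over(const Rgba& src, const Rgba& dst) {
    float k = 1.0f - src.a;
    return { src.r + dst.r * k,
             src.g + dst.g * k,
             src.b + dst.b * k,
             src.a + dst.a * k };
}

int main() {
    Rgba red  = { 0.5f, 0.0f, 0.0f, 0.5f };  // 50% opaque red
    Rgba blue = { 0.0f, 0.0f, 0.5f, 0.5f };  // 50% opaque blue

    Rgba a = over(red, blue);   // red in front of blue
    Rgba b = over(blue, red);   // blue in front of red

    // Prints two different colors: (0.50, 0.00, 0.25) vs (0.25, 0.00, 0.50),
    // which is why unordered read-modify-write blending gives wrong results.
    std::printf("red over blue: %.2f %.2f %.2f\n", a.r, a.g, a.b);
    std::printf("blue over red: %.2f %.2f %.2f\n", b.r, b.g, b.b);
    return 0;
}
```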
 
Couldn't this be done via compute shaders? That'd make it compatible with any DX11 card...

As for CPU upgrades? Well, I just did one a few weeks ago. All I did was swap out the cooler and the CPU.
 
Couldn't this be done via compute shaders? That'd make it compatible with any DX11 card...
Short answer, no. Read these:
http://software.intel.com/en-us/articles/adaptive-volumetric-shadow-maps
http://software.intel.com/en-us/articles/adaptive-transparency

As for CPU upgrades? Well, I just did one a few weeks ago. All I did was swap out the cooler and the CPU.
From what to what? These days I never feel the need to upgrade the CPU more than once every few years, and on that cycle it makes just as much sense to upgrade everything else in the system as well. To each his own, but swapping CPUs holds zero interest for me.
 
I went from a dual-core Conroe based CPU to a Yorkfield based one. Double the cores, faster FSB and more L2 cache were what I was gunning for and I got it.

Still, it's not that uncommon if you look around the Internet. Given that Intel changes sockets roughly every two and a half years (not that I blame them for it usually, though I do question why they changed from LGA 1156 to LGA 1155... surely the benefits couldn't have been THAT great?), there are more whole-system upgrades than ever, but AMD has been keeping things backwards and forwards compatible for a while now. If you look at folks with AM* sockets, they're more likely to just upgrade from an old CPU to a new one.

As for the rest of my points? I retract them; it's clear I didn't have enough knowledge. That said, I'm never gonna be all that happy about vendor-specific features. I hope that one day in a few years, other GPUs will get the features Haswell has and Codemasters will patch GRID 2 to use a somewhat more generic form of what Haswell users get out of it now.

Sorry if this post is a little unclear, I'm a bit light headed right now. I'll redo it in a separate post later to clarify things if needed.
 
That said, I'm never gonna be all that happy about vendor-specific features. I hope that one day in a few years, other GPUs will get the features Haswell has and Codemasters will patch GRID 2 to use a somewhat more generic form of what Haswell users get out of it now.
I fully agree with you here. In fact, I've personally pushed these features for standardization (starting years ago) with the major API folks, but ultimately things are moving very slowly in the API space right now. Rather than let that slow GPU innovation, it seems better to go and prove how useful a feature is so that everyone applies pressure on the other hardware vendors and API overlords to add it too. i.e. your reaction makes me very happy, and I think Codemasters would like nothing more than to have it supported everywhere too :)
 
Hasn't Crytek implemented volumetric fog shadows in their recent Crysis 3 title? They didn't use AVX instructions for it, so was what they achieved done through some clever trick?
 
The extension mechanism used looks rather hilarious to me (create specially constructed D3D11_USAGE_STAGING/D3D11_CPU_ACCESS_READ buffers, e.g. for querying whether the extensions are available you put "INTCEXTNCAPSFUNC" into the initial data). Looks even funnier than d3d9 fake formats :) though it should be much more powerful, I guess. Are other vendors doing the same (or don't they have any extensions)? I agree, though, that it would be very nice if extensions like that could be standardized. Well, you could with OpenGL, but I don't think Intel did?
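Roughly what that capability query looks like in plain D3D11 terms, as far as I can reconstruct it from that description. This is just a sketch: the blob layout (CapsBlob) is a made-up placeholder, and only the magic string plus the staging/CPU-read buffer trick come from the actual mechanism; the real definitions are in Intel's helper headers.

```cpp
#include <d3d11.h>
#include <cstring>

// Hypothetical placeholder for whatever Intel's headers actually define;
// only the leading magic string is taken from the description above.
struct CapsBlob {
    char magic[16];         // "INTCEXTNCAPSFUNC"
    unsigned int caps[16];  // assumed space for capability bits
};

// Sketch: ask the driver whether the extensions are available by creating
// a specially constructed staging buffer and reading it back.
bool QueryIntelExtensions(ID3D11Device* device, ID3D11DeviceContext* ctx) {
    CapsBlob blob = {};
    std::memcpy(blob.magic, "INTCEXTNCAPSFUNC", sizeof(blob.magic));

    D3D11_BUFFER_DESC desc = {};
    desc.ByteWidth      = sizeof(blob);
    desc.Usage          = D3D11_USAGE_STAGING;       // staging, CPU-readable
    desc.BindFlags      = 0;
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;

    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem = &blob;

    ID3D11Buffer* buf = nullptr;
    if (FAILED(device->CreateBuffer(&desc, &init, &buf)))
        return false;  // driver rejected it: no extensions

    // A cooperating driver would fill the buffer with capability bits;
    // map it back and inspect (the exact layout is driver-defined, not shown).
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    bool ok = SUCCEEDED(ctx->Map(buf, 0, D3D11_MAP_READ, 0, &mapped));
    if (ok) {
        // ... interpret mapped.pData according to the vendor headers ...
        ctx->Unmap(buf, 0);
    }
    buf->Release();
    return ok;
}
```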
 
I fully agree with you here. In fact, I've personally pushed these features for standardization (starting years ago) with the major API folks, but ultimately things are moving very slowly in the API space right now. Rather than let that slow GPU innovation, it seems better to go and prove how useful a feature is so that everyone applies pressure on the other hardware vendors and API overlords to add it too. i.e. your reaction makes me very happy, and I think Codemasters would like nothing more than to have it supported everywhere too :)
I was about to post something similar to what I.S.T did, but damn that is a great response.
It addresses my issue and then some, well done. :D
 
The extension mechanism used looks rather hilarious to me.
Ha yeah, that's the reality of hacking through an interface without an official extensions mechanism ;) There are helper headers that clean it up and make it look prettier :)

Are other vendors doing the same (or don't they have any extensions)?
Yeah, it's pretty much unavoidable. Especially for extensions in shaders: there's no clean way to extend HLSL since it goes through Microsoft's compiler, so it always ends up being hacky.

Well, you could with OpenGL, but I don't think Intel did?
Yeah, I believe the features will be supported in OpenGL as well (and as you note, via the official extension mechanism); it's just that DX was the priority for most game developers.
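For comparison, the GL side really is just the standard extension-string check; a minimal sketch, assuming a modern GL context and a loader like glad, and treating the extension name below as a placeholder until the actual spec is published:

```cpp
// Assumes a created core-profile GL 3.0+ context and an extension loader
// (glad is assumed here) that has already been initialized.
#include <glad/glad.h>
#include <cstring>

// Returns true if the driver advertises the given extension string.
bool HasGlExtension(const char* name) {
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const GLubyte* ext = glGetStringi(GL_EXTENSIONS, static_cast<GLuint>(i));
        if (ext && std::strcmp(reinterpret_cast<const char*>(ext), name) == 0)
            return true;
    }
    return false;
}

// Example: "GL_INTEL_fragment_shader_ordering" is used here as a placeholder
// name for the pixel shader ordering feature discussed above.
// bool hasOrdering = HasGlExtension("GL_INTEL_fragment_shader_ordering");
```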
 
Ha yeah, that's the reality of hacking through an interface without an official extensions mechanism ;) There are helper headers that clean it up and make it look prettier :)
Yes, I noticed; I was just wondering how it finds its way around DX11's "no extensions possible" :).
Yeah, I believe the features will be supported in OpenGL as well (and as you note, via the official extension mechanism); it's just that DX was the priority for most game developers.
Ok, sounds like a plan!
 