> Grid 2 runs great on Haswell, as demo'd at GDC, etc...
With medium settings on a GT3e design, which will not be part of any Haswell that uses the LGA1150 socket. Please tell me you're joking. :smile:
*sigh* It's like Nvidia's shenanigans again, except even worse. How so? At least with Nvidia's bullcrap, you can simply replace your video card if you absolutely must have feature X in game Y. With this? You have to buy a laptop and accept reduced frame rates and resolution.
I believe these effects require specific hardware features in Haswell's GPU. It might be possible to make similar effects using general pixel shaders, but it's not likely to be efficient.
I read it in a similar way; those effects depend on a hardware capability of the Haswell GPU. If you don't have (or aren't using) a Haswell GPU? That's fine, play it at the "highest available" settings for your platform.
I see this as no different from adding some CUDA-based effects to your game. Don't have a CUDA-capable video card? No problem, feel free to play without them. Does that significantly detract from your ability to enjoy the game? That depends on you... It seemed to work OK for a slew of other games, but perhaps that also counts as "NVIDIA BULLCRAP" because -- uh, it's NVIDIA, I guess?
> With medium settings on a GT3e design, which will not be part of any Haswell that uses the LGA1150 socket.
High settings, 1080p, but yeah. Do you guys just make this stuff up to fit your preconceptions, or do you actually have any real information? And who honestly cares about whether a CPU is socketable these days? I have not upgraded just a CPU in many, many years... because it's pointless and almost always better to just build a new system and have two working systems (or give away the old one).
> *sigh* It's like Nvidia's shenanigans again, except even worse.
Like pchen said, it requires special hardware features (pixel shader ordering). If you read the page I linked above you'd know all this, and if you read the associated papers that are now years old, you'd know that you can indeed do it without the hardware feature, but it's multipass and quite inefficient. The samples we have released have both implementations fully available in source code; people usually just choose not to use the general DX11 implementation because it's too slow.
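For readers wondering what pixel shader ordering actually buys you, here is a minimal CPU-side sketch (purely illustrative; this is not Intel's API or the GRID 2 code). A programmable blend is a read-modify-write of per-pixel data, and overlapping fragments racing on that data have to be serialized; a per-pixel lock stands in for the hardware guarantee here. The real feature additionally guarantees primitive submission order, which a plain lock does not.

```cpp
// Illustrative only: per-pixel serialization of a custom read-modify-write blend.
#include <cstdio>
#include <mutex>
#include <thread>
#include <vector>

struct Pixel { float r, g, b, a; };

int main() {
    constexpr int kWidth = 64, kHeight = 64;
    std::vector<Pixel> framebuffer(kWidth * kHeight, Pixel{0, 0, 0, 1});
    // One lock per pixel: overlapping fragments touching the same pixel are
    // serialized, so the custom blend below cannot race.
    std::vector<std::mutex> pixel_locks(kWidth * kHeight);

    auto blend_fragment = [&](int x, int y, Pixel src) {
        const int idx = y * kWidth + x;
        std::lock_guard<std::mutex> lock(pixel_locks[idx]); // "ordered section" stand-in
        const Pixel dst = framebuffer[idx];                 // read existing color
        framebuffer[idx] = Pixel{                           // programmable blend
            src.r * src.a + dst.r * (1 - src.a),
            src.g * src.a + dst.g * (1 - src.a),
            src.b * src.a + dst.b * (1 - src.a),
            1.0f};
    };

    // Two "draw calls" whose fragments overlap the same pixels, run concurrently.
    std::thread t1([&] { for (int i = 0; i < kWidth; ++i) blend_fragment(i, i, {1, 0, 0, 0.5f}); });
    std::thread t2([&] { for (int i = 0; i < kWidth; ++i) blend_fragment(i, i, {0, 1, 0, 0.5f}); });
    t1.join();
    t2.join();
    std::printf("pixel(0,0) = %.2f %.2f %.2f\n",
                framebuffer[0].r, framebuffer[0].g, framebuffer[0].b);
}
```

Without the lock, the two blends can interleave and the result becomes nondeterministic, which is exactly why the general DX11 fallback ends up multipass and slow.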
> Couldn't this be done via compute shaders? That'd make it compatible with any DX11 card...
Short answer, no. Read these:
> As for CPU upgrades? Well, I just did one a few weeks ago. All I did was swap out the cooler and the CPU.
From what to what? These days I never feel the need to upgrade a CPU more than once every few years, and given that cycle, it's time to upgrade everything else in the system as well. To each his own, but swapping CPUs holds zero interest for me.
> That said, I'm never gonna be all that happy about vendor-specific features. I hope one day in a few years, GPUs will get the features Haswell has and Codemasters patches GRID 2 to use a somewhat more generic form of what Haswell users will get out of it.
I fully agree with you here. In fact, I've personally pushed these features for standardization (starting years ago) with the major API folks, but ultimately things are moving very slowly in the API space right now. Rather than let that slow GPU innovation, it seems better to go and prove how useful a feature is so that everyone applies pressure on the other hardware vendors and API overlords to add it too. I.e., your reaction makes me very happy, and I think Codemasters would like nothing more than to have it supported everywhere too.
> I fully agree with you here. In fact, I've personally pushed these features for standardization (starting years ago now) with the major API folks, but ultimately things are moving very slowly in the API space right now. Rather than let that slow GPU innovation, it seems better to go and prove how useful a feature is so that everyone applies pressure on the other hardware vendors and API overlords to add it too. I.e., your reaction makes me very happy, and I think Codemasters would like nothing more than to have it supported everywhere too.
I was about to post something similar to what I.S.T did, but damn, that is a great response.
> The extension mechanism used looks rather hilarious to me.
Ha yeah, that's the reality of hacking through an interface without an official extensions mechanism. There are helper headers that clean it up and make it look prettier.
> Do other vendors do the same (or don't they have any extensions)?
Yeah, it's pretty much unavoidable. Especially for extensions in shaders: there's no clean way to extend HLSL, since it goes through Microsoft's compiler, so it always ends up being hacky.
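To make the "hacky" part concrete: one common pattern for vendor extensions in D3D11 is for the shader to write specially encoded values into a resource the driver recognizes, with a helper header hiding the trick behind a normal-looking call. The C++ sketch below shows only the host-side shape of that idea; every name and constant is invented for illustration and is not Intel's (or any vendor's) actual interface.

```cpp
// Hypothetical sketch only: illustrates the "magic resource" pattern, not a real API.
#include <d3d11.h>

// A value a cooperating driver *could* key off when it sees this buffer. Made up.
static const UINT kHypotheticalExtensionMagic = 0x48535744;

// "Helper header" style wrapper: application code calls this and never sees the hack.
inline HRESULT CreateHypotheticalExtensionBuffer(ID3D11Device* device, ID3D11Buffer** out) {
    D3D11_BUFFER_DESC desc = {};
    desc.ByteWidth = sizeof(UINT) * 4;                        // tiny scratch buffer
    desc.Usage     = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_UNORDERED_ACCESS;             // shaders would write "commands" here
    desc.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_ALLOW_RAW_VIEWS;

    // Seed it with the marker so a driver that knows the convention could spot it.
    const UINT magic[4] = { kHypotheticalExtensionMagic, 0, 0, 0 };
    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem = magic;

    return device->CreateBuffer(&desc, &init, out);
}
```

Since Microsoft's HLSL compiler knows nothing about any of this, the shader side has to smuggle its intent through ordinary-looking writes, which is exactly why it "always ends up being hacky."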
> Well, you could with OpenGL, but I don't think Intel did?
Yeah, I believe the features will be supported in OpenGL as well (and, as you note, via the official extension mechanism); DX was just the priority for most game developers.
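For comparison, this is roughly what the official route looks like from the application side in OpenGL: query the extension list at runtime and pick a code path. A minimal sketch, assuming the extension string GL_INTEL_fragment_shader_ordering (the name this feature eventually shipped under in GL) and that a loader such as GLEW or glad has already resolved glGetStringi.

```cpp
// Minimal sketch: runtime check for a GL extension on a core-profile (3.0+) context.
#include <cstring>
#include <GL/glew.h>  // or any other loader that provides glGetStringi

bool HasGLExtension(const char* name) {
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const GLubyte* ext = glGetStringi(GL_EXTENSIONS, i);
        if (ext && std::strcmp(reinterpret_cast<const char*>(ext), name) == 0)
            return true;
    }
    return false;
}

// e.g. bool ordered = HasGLExtension("GL_INTEL_fragment_shader_ordering");
```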
> Ha yeah, that's the reality of hacking through an interface without an official extensions mechanism. There are helper headers that clean it up and make it look prettier.
Yes, I noticed. I was just wondering how it finds its way around DX11's "no extensions possible".
> Yeah, I believe the features will be supported in OpenGL as well (and, as you note, via the official extension mechanism); DX was just the priority for most game developers.
OK, sounds like a plan!