I'm not talking about mining-client signatures, but about the kernels themselves. For a given crypto algorithm, the kernels probably have a very specific profile. But I realize another problem now: completely new algorithms would not be caught by this property of the drivers alone.
I'm not sure that's a serious problem in the short-to-medium term. ASIC-resistant algorithms generally derive their resistance by purposefully bottlenecking on some resource a dedicated ASIC cannot readily scale, like on-die capacity or local DRAM bandwidth. Popular algorithms go further and select some architectural facet that is common in client hardware and, somewhat less successfully, one that doesn't scale with more expensive setups or data centers.
That's why Ethereum and its derivatives revolve around pseudorandom access to DRAM, which blows out on-die storage and doesn't reward clustering (hence mining rigs getting away with very little PCIe bandwidth).
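To make that concrete, here's a minimal Python sketch of the kind of inner loop that produces this profile. This is not actual Ethash; the constants are invented and the buffer is tiny for demo purposes, where the real dataset is gigabytes. The point is that each iteration's address depends on the previous mix, so nearly every read is an effectively random DRAM access that no cache can absorb.

```python
import random

# Illustrative only: a memory-hard inner loop in the style of Ethash.
# The real DAG is gigabytes, far beyond any on-die cache; this small
# buffer just demonstrates the data-dependent access pattern.
random.seed(0)
dataset = [random.getrandbits(64) for _ in range(1 << 16)]  # stand-in for the ~GiB DAG

def mix_hash(nonce: int, rounds: int = 64) -> int:
    mix = nonce * 0x9E3779B97F4A7C15 & (2**64 - 1)
    for _ in range(rounds):
        idx = mix % len(dataset)   # data-dependent index -> pseudorandom DRAM hit
        mix = (mix ^ dataset[idx]) * 0x100000001B3 & (2**64 - 1)
    return mix

print(hex(mix_hash(42)))
```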
Others, like Equihash, balance that bandwidth demand out with additional compute and proof of capacity, though that still heavily exercises a subset of the architecture.
A general heuristic, aside from obvious checks like how many cards are in a system, what their PCIe link width is, and common miner tweaks like heavy GPU undervolting combined with RAM overclocking, would be heavy use of a narrow subset of the chip: a high sustained rate of non-linear misses to DRAM, simple static resource allocations, little or no use of the standard graphics hardware path, a telltale math/logic mix, and a very high level of sustained utilization.
Dedicated instructions that accelerate hashes, or an instruction mix that's heavy on integer math and logical comparisons, could show up as a clear signal as well.
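Putting those signals together, a rough classifier might look like the sketch below. To be clear, the counter names and every threshold are hypothetical stand-ins, not anything a real driver actually exposes; a real implementation would tune them against measured game and miner traces.

```python
from dataclasses import dataclass

# Hypothetical counters a driver's performance monitors might expose.
@dataclass
class KernelProfile:
    dram_random_miss_rate: float  # fraction of DRAM traffic that is non-linear
    fixed_function_util: float    # raster/ROP/video-engine utilization, 0..1
    int_logic_fraction: float     # share of issued ops that are integer/bitwise
    sustained_util: float         # compute utilization averaged over minutes, 0..1

def looks_like_mining(p: KernelProfile) -> bool:
    # All thresholds are invented for illustration.
    return (p.dram_random_miss_rate > 0.6    # pseudorandom DRAM hammering
            and p.fixed_function_util < 0.05 # graphics path essentially idle
            and p.int_logic_fraction > 0.7   # hash-style integer/logic mix
            and p.sustained_util > 0.95)     # pinned flat-out, no frame cadence

# An Ethash-style kernel would trip all four:
print(looks_like_mining(KernelProfile(0.8, 0.0, 0.85, 0.99)))  # True
```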
Limiting them outright, or duty-cycling them if they consistently hit a high utilization threshold for a period that is effectively impractical for a game, seems plausible. From the profiles shown, resource utilization in gaming appears to trail off noticeably near the tail end of a 16/33 ms frame, and at least some of the graphics pipeline takes up a measurable percentage of frame time. Gameplay-wise, full saturation seems extremely improbable for more than a few seconds, and a gamer would likely be physically incapable of sustaining a full-bore game scenario for 12 hours or more. I'm not seeing how checks for such scenarios would affect gaming broadly enough that the exceptions couldn't be handled case-by-case.
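A toy version of that duty-cycling logic might look like this. The `read_utilization` and `set_clock_limit` callbacks are hypothetical driver hooks, not a real API, and all the numbers are invented for illustration.

```python
import time

PIN_THRESHOLD = 0.95       # "fully saturated" utilization
MAX_PINNED_SECONDS = 30.0  # far beyond a few seconds of real gameplay saturation
COOLDOWN_SECONDS = 60.0

def govern(read_utilization, set_clock_limit):
    """If utilization stays pinned longer than any plausible game scenario,
    clamp clocks for a cooldown window, then restore them."""
    pinned_since = None
    while True:
        now = time.monotonic()
        if read_utilization() > PIN_THRESHOLD:
            if pinned_since is None:
                pinned_since = now
            elif now - pinned_since > MAX_PINNED_SECONDS:
                set_clock_limit(0.5)           # duty-cycle: halve clocks...
                time.sleep(COOLDOWN_SECONDS)   # ...for the cooldown window
                set_clock_limit(1.0)
                pinned_since = None
        else:
            pinned_since = None                # any frame-time dip resets the timer
        time.sleep(1.0)
```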
For a miner, getting around that could translate into tens of percent lopped off peak throughput, plus significant stretches of throttling across a 24-hour period. "Faking" the utilization checks literally means leaving hash rate on the table through deliberate underutilization, or running a decoy graphics load heavy enough to eat into utilization anyway.
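Some back-of-envelope arithmetic, with all numbers invented, on what that costs a miner:

```python
# Suppose the governor clamps clocks to 50% for 1 minute out of every 5
# once a workload stays pinned. Over a mining day:
full_rate = 100.0                  # hash rate at full clocks, arbitrary units
throttled_fraction = 1.0 / 5.0
clamp = 0.5
daily = (1 - throttled_fraction) * full_rate + throttled_fraction * clamp * full_rate
print(daily)                       # 90.0 -> 10% gone to throttling alone

# Dodging the check instead: idling just under a 95% pin threshold gives
# up hash rate directly, before counting any decoy graphics load.
faked = 0.94 * full_rate           # 6% left on the table just to duck the gate
```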
However, that's a reason to make miners pay for hardware that lifts such limits, rather than to create a mining SKU that costs them less.
A cheap mining option just raises miners' earning potential, giving them the means to buy standard GPUs on top of the mining SKUs.
Evading the checks with new algorithms also faces some back-pressure.
Since the checks are utilization-based, a new algorithm has to be either very different or less efficient.
Very different might take the algorithm out of the GPU-friendly space entirely.
Very different may compromise the appeal of the algorithm, since part of the motivation was to broaden the hardware base.
Very different may shrink the amount of money that would flow into the coin's market cap, leaving it niche.
Very different may take some time to be created and to ramp to significant numbers.
Very different means fighting the inertia of the existing market.
Up-charging may also have synergies with the miners' profit motive. They might pay more for GPUs with the limiters removed, but this also weakens the hash-rate contribution of the duty-cycled gaming cards while reducing competition for the optimized hardware.
Turing is the name of quickly deployable GPUs designed specifically for the crypto market, named after Alan Turing, who broke the Nazi "Enigma" cipher. NVIDIA will indeed block mining on consumer hardware!
https://www.techpowerup.com/241552/...ng-chip-jen-hsun-huang-made-to-save-pc-gaming
Turing was also hugely influential in formalizing computational theory: Turing machines (the model behind any truly programmable machine), Turing-complete languages, contributions to computability theory and AI, etc.
More than with the other scientists they've used so far, for a company pursuing a fully generalized programming model and AI, I'd think Nvidia wouldn't want to waste his name on something doing so little to advance humanity.