HDR is an extremely bleeding-edge feature on displays. Like, you need to peak at 1000 cd/m² or something locally (and that'd be for preliminary HDR displays), while keeping strong blacks (very high contrast ratio), while your screen is calibrated and the room is appropriately and predictably dark, or something? This has zero chance of working on a handheld LCD.
Calibration plays no part in any kind of HDR standard or certification.
1000 cd/m² is the requirement for the UHD Alliance's Premium certification, which is pretty limited and not that smart: it doesn't refer to the kind of local dimming at all, which matters a lot more unless you're planning on using the TV in very bright rooms.
For example, the mid-range Samsung KS7000 has that certification, whereas some OLED models can't get it.
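Just to put numbers on why peak brightness alone is a crude metric: contrast ratio is simply peak luminance divided by black level, so the black level the panel can hold (i.e. the quality of its local dimming) moves the result as much as the peak does. A quick back-of-the-envelope sketch with made-up figures, not numbers from any certification document:

```python
# Contrast ratio = peak luminance / black level (illustrative numbers only,
# not taken from any HDR certification document).

def contrast(peak_nits, black_nits):
    return peak_nits / black_nits

print(contrast(1000, 0.10))    # bright panel, weak local dimming  -> 10,000:1
print(contrast(600, 0.005))    # dimmer panel, much better blacks  -> roughly 120,000:1
```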
HDR is simply a term that encompasses the use of 10 bits (or more) per sample in the image data.
All they'd need for credible HDR is a 10-bit panel, or at least an 8-bit + FRC one, plus an LCD driver that accepts that data, together with local dimming.
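To give an idea of how the 8-bit + FRC route works, here's a rough sketch of 2-bit temporal dithering: the panel alternates between two adjacent 8-bit levels so the average over a short frame cycle lands on the intended 10-bit value. This is just the concept; real FRC controllers use smarter spatio-temporal patterns to hide flicker, and the function below is my own toy illustration, not anyone's actual driver code.

```python
# Toy 2-bit FRC: approximate a 10-bit sample on an 8-bit panel by showing
# base+1 on `remainder` frames out of a 4-frame cycle, and base on the rest.

def frc_frames(sample_10bit, cycle=4):
    base = sample_10bit >> 2           # nearest lower 8-bit level
    remainder = sample_10bit & 0b11    # how many frames get base + 1
    return [min(base + (i < remainder), 255) for i in range(cycle)]

# 10-bit value 601 -> 8-bit frames [151, 150, 150, 150]; their average
# (150.25) corresponds to 601/4, i.e. the eye integrates it back to ~10 bits.
print(frc_frames(601))
```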
It shouldn't be that expensive, really. Edge-lit LCDs are probably cheap-ish to make, even more so if they're small.
For the Switch, I agree that should be at the bottom of the priority list, though I feel exactly the same about expensive rumble motors, MEMS, and IR distance sensors in the Joy-Cons.
Imagine if all that money had been spent on a decent 16FF SoC instead.
I'm sure the "customization" is just setting the clocks, power curve, and voltage, i.e. no customization at all beyond tweaking firmware values. Perhaps unused blocks are "laser cut" (camera controller, PCIe?), unless firmware is enough to wall them off anyway.
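For illustration only, "tweaking firmware values" would amount to something like shipping a table of operating points that the power-management firmware picks from depending on the power budget. Every name and number below is a placeholder I made up; none of it comes from the actual Tegra X1 or Switch firmware.

```python
# Hypothetical operating-point table (clock / voltage pairs) -- placeholder
# values, not real Tegra X1 / Switch figures.

OPERATING_POINTS = [
    # (gpu_clock_mhz, gpu_voltage_mv)
    (300, 800),   # portable / battery mode
    (500, 850),
    (750, 900),   # docked mode
]

def pick_operating_point(docked):
    """Toy logic: take the highest clock the current power budget allows."""
    return OPERATING_POINTS[-1] if docked else OPERATING_POINTS[0]

print(pick_operating_point(docked=True))   # -> (750, 900)
```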
If firmware customization is all that nvidia did, then they're throwing away a substantial area of the chip.
One would think the volume for the Switch should be enough to make a new, smaller chip without that stuff.
Or maybe nvidia already knew this thing was going to tank hard so they didn't even bother.