Not sure about that, but the more relevant question, IMO, is: has any company ever released a dev kit that is significantly slower than the final hardware (a gap like Maxwell vs. Pascal)?
And I'm not aware of any, although I haven't kept close track of all the devkits through the years.
IMO, if Pascal were the target GPU of the final hardware, they wouldn't be using a TX1 for the devkits.
Regards,
SB
Yes, initial PS3 dev kits were clearly weaker, for instance. However, these things depend on the console, obviously. Modelling what is in essence a medium-power gaming PC is super easy, since something very similar, just more expensive and power hungry, is already on store shelves.
The Switch is not so easy. If they want developers to have a good idea of the capabilities of the retail device and to use early versions of the development tools, and if nVidia is to supply the whole shebang, just what did they have to choose from, really? The Tegra X1, that's what.
There are two additional indications, though. The dev kits are said to be fan cooled. And the final device has what look like rather substantial vents (compare with tablets, or even many compact laptops); since they are present on the retail device, they indicate that it will draw rather substantial power, at least when stationary.

At identical clocks, the device as a whole would draw substantially less when docked, since the screen shuts down in the dock. But it seems odd to assume that the Switch, when portable, would draw enough power to require fan cooling, as that would imply really short battery life. (Typical phablet battery capacity is roughly 10Wh. The 10" Pixel C has a 34Wh battery and lasts roughly five hours playing games; subtract the screen.) If the Switch wants reasonable battery life when mobile, the SoC+memory, screen and wireless communication can't draw more than a maximum of, say, 5W, with three watts or so going to the SoC, on the outside.
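To put rough numbers on that (a back-of-envelope sketch; the screen draw and the candidate battery capacities are my assumptions, not known figures):

```python
# Back-of-envelope power/battery arithmetic. Only the Pixel C figures
# come from the post; everything else is an assumption.

pixel_c_battery_wh = 34.0    # Pixel C battery capacity
pixel_c_gaming_hours = 5.0   # rough gaming battery life
avg_draw_w = pixel_c_battery_wh / pixel_c_gaming_hours
print(f"Pixel C average draw while gaming: {avg_draw_w:.1f} W")  # ~6.8 W

screen_w = 2.5  # assumed draw of a 10" screen at moderate brightness
print(f"Draw minus screen: {avg_draw_w - screen_w:.1f} W")       # ~4.3 W

# If the Switch carries a phablet-class battery (capacity unknown,
# so try a couple of plausible values) and the whole system draws
# ~5 W when portable:
for battery_wh in (10.0, 15.0):
    print(f"{battery_wh:.0f} Wh at 5 W: {battery_wh / 5.0:.1f} h")
```

So a ~5W portable budget lands in the two-to-three-hour range on a phablet-class battery, which is about the floor of "reasonable".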
That doesn't require forced-air cooling, or substantial vents. So it seems reasonable to assume that the device will actually draw more power when docked, even though the screen is off, which implies different clocks docked and mobile.
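As for why different clocks would change the cooling picture: CMOS dynamic power scales roughly with frequency times voltage squared, so a moderate clock bump when docked (where battery life is irrelevant) can easily double or triple SoC power. A sketch of that rule of thumb, with every clock and voltage figure purely hypothetical:

```python
# Rough CMOS dynamic-power scaling: P ~ f * V^2, with switched
# capacitance held constant. All numbers here are hypothetical,
# chosen only to illustrate the docked-vs-mobile gap.

def scaled_power(base_w, base_mhz, base_v, new_mhz, new_v):
    """Scale a baseline power figure by the f * V^2 rule of thumb."""
    return base_w * (new_mhz / base_mhz) * (new_v / base_v) ** 2

mobile_soc_w = 3.0  # the ~3 W portable SoC budget estimated above
docked_soc_w = scaled_power(mobile_soc_w,
                            base_mhz=400, base_v=0.8,  # hypothetical mobile point
                            new_mhz=800, new_v=1.0)    # hypothetical docked point
print(f"Docked SoC draw: ~{docked_soc_w:.1f} W")       # ~9.4 W, fan territory
```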
Now, Nintendo could intend to save money and order a very heavily cut-down Parker SoC, but then you have to question why on earth they would push such a SoC so hard that it requires forced-air cooling. Seems unlikely. It only really makes sense if Nintendo actually chooses the X1 itself, or a 16nm device with X1-equivalent performance (a smaller FinFET SoC pushed harder), or something faster still. But there is no way to differentiate between these three options from the choice of X1 for the dev kit, because the X1 was the only option nVidia could provide early on at that power level or above. They only unveiled Parker on August 22nd; God knows the state of its silicon right now.