If you want to limit things to "gaming as it currently stands" then it'd probably have about the same impact as "waggle" on gaming as it stood in the PS2 era. If you build in a condition of nothing being different then I guess no, nothing would be different.
I only use "gaming as it currently stands" as a qualifier because, as it currently stands, there are next to no gaming implementations using biometric feedback data to change or adjust the way a game is being played. There's nothing to go off, so if you can't sit down and think of any reasonable implementations that would meaningfully improve gaming, why would you invest millions simply to throw something out there and hope that developers find one? I know devs are the creatives of the industry, but they can't turn lead into gold. The original Wiimote proved that, and I believe we're seeing a similar thing with Kinect and hands-free gaming. Even so, both of those devices still offer more imaginable gaming applications than biometric feedback ever could.
Maybe, maybe not. You find out by making it available and seeing what gets done. Getting physiological data from millions of players and plotting it against specific in-game events would give you fantastic information as a game designer, if nothing else. A university would kill for that kind of data. Changing in-game variables (colours, behavioural states, difficulty) or triggering events that you know (normally) have a specific effect on players of various "types" could all be done at a more user-specific level. Just thinking about it for literally a few seconds already gives you more things than you could try and test in a year.
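Even a throwaway sketch shows how little machinery you'd need to get started. This is purely illustrative: the sensor read, the event names and the thresholds are all made up, since no real biometric API exists for this today.

```python
# Toy sketch: log heart-rate samples against in-game events, and nudge a
# difficulty variable per player. Everything here is hypothetical.
import random
import time


def read_heart_rate():
    """Placeholder for whatever a real sensor API would expose (bpm)."""
    return random.randint(60, 140)


class BiometricLogger:
    """Pairs sensor samples with in-game events so designers can plot them later."""

    def __init__(self):
        self.samples = []  # (timestamp, event_tag, bpm)

    def log_event(self, event_tag):
        self.samples.append((time.time(), event_tag, read_heart_rate()))


def adjusted_difficulty(base_difficulty, resting_bpm, current_bpm):
    """Ease off when the player's pulse spikes well above their own baseline."""
    arousal = (current_bpm - resting_bpm) / max(resting_bpm, 1)
    if arousal > 0.5:        # player is clearly stressed
        return max(1, base_difficulty - 1)
    if arousal < 0.1:        # player is flat, possibly bored
        return base_difficulty + 1
    return base_difficulty


# Tag a few scripted moments and watch the difficulty variable react.
log = BiometricLogger()
for event in ["jump_scare", "boss_intro", "quiet_exploration"]:
    log.log_event(event)
    print(event, "->", adjusted_difficulty(base_difficulty=3,
                                           resting_bpm=70,
                                           current_bpm=log.samples[-1][2]))
```

Aggregate that across millions of players and you have exactly the kind of dataset designers and researchers would kill for.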
But then it becomes a good idea for improving the game development process by better understanding how gameplay devices and mechanics affect the player. You won't get consumers to pay extra money for a peripheral solely to feed data back to the game developers. If you want that kind of data, you're better off doing focus-testing in a controlled environment. Gamers love gaming, and many love the devs that make the games they love, but no reasonable human being would invest in an extra peripheral simply so that game devs and pubs can capture more data. It's not a consumer product.
It was a great idea. It was a silly implementation. Clipping some crap on your finger was never going to take off.
It really wasn't. It was a silly idea and a silly implementation. Nintendo realised the error of their ways and promptly corrected it after embarrassing themselves with its reveal. It was a pulse monitor for Wii Fit and little else.
Biometrics could be interesting for affecting, say, enemy AI in horror games. But like the Vitality Sensor for fitness applications, the potential benefits they offer to gaming on the whole probably aren't worth the price of admission, and their limitations as technology at the present time are legion. I agree with you that ultimately, for stuff like full-immersion VR, biometrics would be amazing going hand-in-hand with things like advanced AI and voice recognition for natural conversations with in-game NPCs. At the moment, though, those other technologies aren't there yet, and probably won't be for a while, and the same goes for biometric feedback tech.
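To be fair, the horror-AI idea is simple enough to sketch in a few lines. This is a toy example only: the heart-rate reading, the AI states and the thresholds are all invented for illustration.

```python
# Hypothetical pacing rule for a horror-game stalker enemy: push harder when
# the player is calm, back off when they're already panicking.
from enum import Enum, auto


class StalkerState(Enum):
    LURK = auto()     # stay out of sight, build tension
    STALK = auto()    # let the player glimpse the enemy
    ATTACK = auto()   # commit to the scare


def pick_state(player_bpm, resting_bpm=70):
    """Map the player's arousal (relative to their baseline) onto an AI state."""
    arousal = (player_bpm - resting_bpm) / resting_bpm
    if arousal < 0.15:
        return StalkerState.ATTACK   # player is too comfortable, escalate
    if arousal < 0.5:
        return StalkerState.STALK
    return StalkerState.LURK         # already terrified, let them breathe


for bpm in (72, 95, 130):
    print(bpm, "bpm ->", pick_state(bpm).name)
```

But a one-trick pacing dial like that is exactly my point: neat in a demo, not something most people would buy a peripheral for.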
Nintendo should look for a new gimmick that's realistic and can add something to gaming as it is now, whilst opening things up to new forms of play. Unfortunately, I personally don't believe there's much out there technologically that fits that bill. So I firmly believe that next-gen will revolve more around deepening the more traditional gameplay experiences, as well as advancing current interface technologies like motion, pointer, voice recognition etc.
Edit:
Actually, this is getting a bit off topic, but I do think there's value in a discussion about the merits (or otherwise) of biometric feedback technologies and their potential in gaming. Perhaps a MOD can create a spin-off thread?