I can sort of buy your lowest-common-denominator argument. There's a whole PC gaming industry that revolves around games designed to run well on that class of hardware, which is fine. But then there's a whole class of games, some of which you mention, which just aren't designed for IGPs at all. Min-spec for those is somewhere else entirely, and for good reason: devs simply aren't designing around an Intel IGP any more.
I think you have it backwards: games never catered to IGPs at all. In fact, IGPs are a relatively new development. Back in the day, video cards were nearly always discrete, and when they were integrated on the motherboard, they were usually nothing more than the discrete components (including the video memory) built onto the same PCB.
So the earliest breed of PC games tended to run on Hercules and/or CGA graphics... EGA was mostly skipped, and then many games started to demand VGA graphics, which was usually a discrete ISA card.
Then we moved on to SVGA with VLB and PCI cards, and then the 3D revolution started, of course again with discrete cards like the Voodoo. Games started demanding 3D cards... and it wasn't until later that integrated video chips started offering 3D acceleration at all. Very poorly at first, but at some point IGPs from ATi and nVidia started to adopt most of the basic technology of the discrete cards, meaning they were more or less on par feature-wise, just far slower.
So until recently, IGPs simply weren't an option for gaming, because they were too far behind. Intel is the only one that hasn't quite caught up yet, but now they finally seem to be making the effort. I don't think anyone took Intel seriously in graphics anyway, and deservedly so: only a few years ago, their drivers were so bad that even Office applications weren't always rendered properly.
I'm all for affordable graphics, and it's good to see the baseline performance and image quality sitting some way above where it has historically been. But I don't think comparing IGPs to discrete cards misses the point either. There's a certain class of device where the form factor and power budget mean a $4 IGP is the only thing that can ever be considered to push the pixels.
But then there's an enormous market that currently ships nothing but an IGP, where a discrete GPU could be added to the system for little extra cost but never is; the performance expectation there is massively higher, and the consumer would get a much better gaming experience.
That's what bugs me the most. The low-end discrete market is clobbered by IGP sales in laptops and desktops alike because OEMs don't see gaming as a value addition down there.
I wonder, though... AMD is trying to kick off an 'IGP war' with their 780G chipset, which tries to close the gap with low-end discrete cards like the Radeon HD 2400 or GeForce 8400.
Intel is now pretty much 'up to date' with its drivers as well (the X3100 hardware has actually been around for quite a while, but hardware vertex processing, DX10 features, and some of the video acceleration were only enabled recently). And of course Intel is gearing up for Larrabee next year. You can think of X3000/X4000 as a 'test run' for their graphics team: they're building up experience with dynamically allocated unified shaders, and with general DX9/DX10 driver compatibility and optimization...
So on the one hand, Intel may produce a serious discrete card with Larrabee, and on the other hand, Intel's IGPs will probably make quite a leap in performance once the first Larrabee spin-offs are introduced.
So, the feature gap between IGPs and discrete cards has already been closed, and it seems the performance gap will be narrowed by AMD/ATi and Intel as well (and nVidia can't afford to stay behind, of course). DDR3 also delivers more bandwidth for IGPs, and Nehalem will have a triple-channel memory controller, for even more bandwidth goodness.
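Since an IGP shares system memory, its peak bandwidth is simply whatever the memory subsystem offers. As a rough back-of-the-envelope sketch (nominal peak rates for a few illustrative configurations, not measured numbers, and the specific speed grades are just examples I picked):

```python
# Theoretical peak memory bandwidth for some IGP-relevant configurations.
# Each DDR channel is 64 bits (8 bytes) wide, so:
#   peak GB/s = transfer rate (MT/s) * 8 bytes * number of channels / 1000

def peak_bandwidth_gbs(mt_per_s, channels, bus_bytes=8):
    """Nominal peak bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return mt_per_s * bus_bytes * channels / 1000

configs = {
    "DDR2-800, dual channel":    (800, 2),   # typical current IGP setup
    "DDR3-1333, dual channel":   (1333, 2),  # DDR3 upgrade path
    "DDR3-1066, triple channel": (1066, 3),  # Nehalem-style controller
}

for name, (rate, channels) in configs.items():
    print(f"{name}: {peak_bandwidth_gbs(rate, channels):.1f} GB/s")
```

So a triple-channel DDR3-1066 setup roughly doubles the peak bandwidth of dual-channel DDR2-800 (about 25.6 GB/s vs 12.8 GB/s on paper), which is exactly the kind of headroom a bandwidth-starved IGP benefits from.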
So I'm just saying: Intel may finally be getting serious about graphics... this X3100 is looking good so far. They seem to have the basic hardware in place, and the drivers are starting to mature as well... X4000 should bump up performance further with a higher clock speed, more memory bandwidth, and more processing units... And then we'll have to see what Larrabee is capable of.