It'll mean the difference between 30 and 33 frames per second. Or rather between 30 and 27 fps, with a smidgen of screen tearing versus none. Or 1920x1080 versus 1728x1080. Placed side by side, you might notice the higher-spec'd XB1 looks a little better than the normal-clocked XB1, but in terms of consumer experience it's not enough to make a real, notable difference. Heck, there's even debate over how much difference 50% more CUs will actually make on screen, and whether that'll be enough to sway consumers.

It's all a matter of value. If that overclock comes at negligible extra cost, we want it. But if it results in, say, an increased failure rate, would you really prefer the 10% extra framerate/resolution over a greater chance of your console dying? Or if it's the difference between silent operation and noticeable fan noise? Different folks will value it differently, but from a business POV I'm not seeing the value in spending big bucks on an upclock. I'm not really seeing the value in spending small bucks on an upclock, either.