We also have Joker telling us some games definitely got capped on XB360 to maintain parity. That's always been the double-edged sword of cross-platform titles - more software for more people, but less use of platform specifics. Given the similarities of development, and that XB1 is both less powerful and harder to use, there's probably more scope for devs to throw in a little extra for PS4, but likely only as regards a setting or three, like the...
#PS3_GRASS_LEVEL = 1
#XB360_GRASS_LEVEL = 3
...on some current-gen games.
To the casual gamer, the perceivable difference won't be significant enough to sway opinion. However, this thread is about how developers handle development rather than what the market repercussions are.
Every platform has pros and cons. If a platform has an advantage like amazing CPU power, but it's a considerable investment to exploit that advantage versus exploiting a rival's, the chances of a dev actually taking advantage of it are reduced. In the case of next-gen, given the broad similarities between architectures (PS4, XB1 and PC), it seems to me that from a business POV the cheapest, most direct development path would be 'target a 6 core Jaguar + 12 CU platform and dial up the CPU and/or GPU work for other platforms.' Getting into the specifics of tuning your engine to XB1's idiosyncrasies is likely to decrease ROI. Exclusives will probably want to do that, but I'm not sure cross-platform titles will. Of course, for games built on middleware, the middleware vendors can refine their engines per platform, improving utilisation of the more esoteric features/functions.
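To be concrete about the 'target a baseline and dial up' idea, here's a rough sketch of how a cross-platform settings table might scale from the common floor. Everything in it - platform specs, setting names, scaling factors - is invented purely for illustration, not anything from an actual engine:

```python
# Hypothetical sketch: pick settings by scaling from a common baseline target.
# All numbers and setting names are made up for illustration.

BASELINE = {"cpu_cores": 6, "gpu_cus": 12}  # roughly the weakest console target

PLATFORMS = {
    "XB1":     {"cpu_cores": 6, "gpu_cus": 12},
    "PS4":     {"cpu_cores": 6, "gpu_cus": 18},
    "PC_HIGH": {"cpu_cores": 8, "gpu_cus": 40},
}

def scaled_settings(platform):
    """Dial settings up from the baseline using spare GPU headroom."""
    spec = PLATFORMS[platform]
    gpu_headroom = spec["gpu_cus"] / BASELINE["gpu_cus"]  # 1.0 = baseline
    return {
        # cheap knob: more grass density where there are spare CUs
        "grass_level": round(2 * gpu_headroom),
        # step up shadow resolution only with substantial headroom
        "shadow_res": 1024 if gpu_headroom < 1.5 else 2048,
    }
```

The point of the sketch is that nothing here touches a platform's idiosyncrasies - the same code path runs everywhere, and only the numbers change, which is the cheap part.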