I disagree. The game varies from mediocre to good-looking, but the variance between Series S and X has to be the smallest I've seen from any game released this gen, whether cross-gen or current-gen only.
Really? A resolution 2.56x higher, plus other settings increases, is the smallest difference for any game released this gen?
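For reference, a quick back-of-envelope check of where that 2.56x figure comes from, assuming the internal resolutions are roughly 1440p on Series X vs 900p on Series S (the exact render resolutions are my assumption here, not something stated in this thread):

```python
# Back-of-envelope pixel-count comparison (assumed internal resolutions:
# ~2560x1440 on Series X vs ~1600x900 on Series S).
series_x_pixels = 2560 * 1440   # 3,686,400
series_s_pixels = 1600 * 900    # 1,440,000

ratio = series_x_pixels / series_s_pixels
print(f"Resolution ratio: {ratio:.2f}x")  # -> 2.56x
```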
Also, the term "higher" is subjective, and I expected a difference of PC Min vs PC Ultra between S and X. I'm just not seeing that here.
Series X Starfield having higher-resolution shadows, higher-res dynamic cubemaps, higher foliage settings for grass (which munches fillrate), and environmental fine detail pushed further out doesn't seem subjective, as DF have shown places where it's definitely the case.
Anyway, how is the Series X going to make the jump from PC Min-equivalent settings to PC Ultra while also maintaining a huge, and very much demanded, jump in resolution?
Series X's biggest area of advantage over the Series S is, going by the numbers, probably compute: it's about 3x higher. Everything else (BW, fillrate, cache size, cache BW) is more like 2.4 or 2.5 times higher, or lower. The 3D pipeline gets a bit more efficient as you increase resolution, but even so, a 2.56x base resolution jump plus other increased settings is pretty good.
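A rough sketch of those ratios from the publicly quoted specs (the TFLOPS and fast-pool bandwidth numbers are the official figures; the pixel fillrate values are my own estimate from ROP counts and clocks, so treat them as approximate):

```python
# Rough spec comparison, Series X vs Series S.
specs = {
    "compute_tflops":   (12.15, 4.0),    # FP32 TFLOPS (official figures)
    "bandwidth_gbps":   (560.0, 224.0),  # fast-pool GDDR6 bandwidth, GB/s (official figures)
    "pixel_fill_gpixs": (116.8, 50.1),   # est. 64 ROPs @ 1.825 GHz vs 32 ROPs @ 1.565 GHz
}

for name, (x, s) in specs.items():
    print(f"{name}: {x / s:.2f}x")  # compute ~3.0x, the rest ~2.3-2.5x
```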
If you'd have preferred them to keep the same res and bump up some other settings, fair enough, but this is what I'd expect to see from a game tuned to the strengths of the two machines. I think it's perfectly legitimate for the primary difference between the two to be resolution (and the resolution of e.g. reflections); this is what pretty much every other game does, and it's what MS marketed the consoles on.