1) The vast, vast majority of games so far have been cross-gen games. Of course Series S is fine with these.
2) The criticism isn't necessarily that Series S can't play next-gen titles, it's that it adds a bunch of extra work and headache for developers that wouldn't exist if Microsoft had spec'd the memory setup better. The whole intent of Series S was supposed to be that you'd get the exact same games as Series X, just at a lower resolution, and that for devs it'd be just as simple: turn down the resolution and bam, done.
Because anything else would be a problem. Devs already have too much on their plate. We know this, we've seen it, it's been a huge issue. Even with games taking 4-5+ years now, that's still not enough time in most cases, as games keep shipping with lots of issues. The last thing devs needed was a bunch of extra work cutting memory demands down enough to fit on Series S.
If anything, Microsoft and Sony should have been working to create LESS work for developers wherever possible. Sony seems to have done a decent job of this, with the PS5 reportedly being very easy to develop for, but Microsoft has made it so that developers now have THREE target platforms to support, each often requiring multiple flavors of quality/performance modes.
Basically, Series S is just making life more difficult for developers. We know that with enough will and resources, you can scale games down quite a lot; Witcher 3 running on Switch is a great example. But that was an optional thing the devs chose to do because there was a potential financial upside in selling on a whole new platform. Making things work on Series S is just an annoyance devs HAVE to deal with if they want to release on Xbox at all, with no extra monetary reward for the extra work.
It's a failure of the brief.