No, it really isn't.
This whole conversation about "I/O compression will mean consoles are finally faster than PCs at something!" ignores an obvious fact: consoles have, for years, been doing I/O compression the same way PCs have been able to for 20+ years: the files themselves are stored compressed. To cram more and better assets into a reasonably sized storage (and even memory) footprint, game assets have been compressed for almost as long as games have existed; certainly for as long as 3D textured games have existed.
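To make it concrete, here's a minimal sketch of what every engine's asset loader has done for decades: fewer bytes off the disk, decompress into memory. zlib stands in for whatever codec a real engine actually uses, and the file names are made up for illustration:

```python
import zlib

# Build-time: pack the asset compressed, exactly as engines have done for decades.
# (zlib is a stand-in for the engine's real codec; file names are hypothetical.)
raw_texture = open("texture.raw", "rb").read()
open("texture.pak", "wb").write(zlib.compress(raw_texture, level=9))

# Load-time: read the smaller file off storage, decompress into memory.
packed = open("texture.pak", "rb").read()
texture = zlib.decompress(packed)
assert texture == raw_texture  # identical data, smaller I/O footprint
```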
For as long as this site has existed, B3D has hosted conversations about texture compression schemes, depth compression schemes, audio compression schemes, mesh compression schemes, video compression schemes (back when those made sense, before real-time game engines took over cutscene duty), even intermediate framebuffer compression schemes, and now procedural math that generates effects instead of storing static objects. Nobody ships enormous, uncompressed assets on distribution media, because it's wasteful.
So, all this blabber about "I'm gonna totes compress the I/O" rests on a dumb assumption: that the underlying I/O is compressible to begin with. The actual news might be that there's now hardware offload for the decompression side, but that says nothing about the compression ratio getting any better, for all the reasons covered above: you can't meaningfully re-compress data that's already compressed, because the redundancy has already been squeezed out. PCs are in no particular danger of being "I/O outclassed!!one!11eleventy!!" by a console. Even today there are games that take dozens of seconds to load a level (looking at YOU, Red Dead Redemption) on arguably the lowest-latency storage hardware in existence (an Optane 905P on a 4.2 GHz Ryzen 9 3950X).
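If you doubt the "already compressed" point, try it yourself. A quick sketch, with zlib again standing in for any general-purpose codec, and random bytes standing in for pre-compressed asset data (good compressor output is statistically close to random):

```python
import os
import zlib

# Compressible data: repetitive, like raw uncompressed assets.
raw = b"some raw asset data " * 50_000
once = zlib.compress(raw, level=9)
print(f"raw -> compressed: {len(raw)} -> {len(once)} bytes")  # big win

# Feed the compressor its own output: the redundancy is already gone.
twice = zlib.compress(once, level=9)
print(f"compressed -> re-compressed: {len(once)} -> {len(twice)} bytes")  # ~no win

# Already-compressed assets look like random bytes to a codec; it can even grow.
blob = os.urandom(len(once))
print(f"random blob: {len(blob)} -> {len(zlib.compress(blob, level=9))} bytes")
```

Run it and the second and third passes save essentially nothing, which is the whole point: a hardware decompressor changes who does the work, not how much the data shrinks.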
Let's stop the insanity, shall we?