This is weird, I am of the belief you are just trolling now. Or we are really really talking about something very different.
I'm not trolling; I think you're unfamiliar with the basic principles of multiplexing in RF communications.
MIMO, to my knowledge, as implemented in Wi-Fi, is this:
https://en.wikipedia.org/wiki/MIMO
Right, MIMO is a turnkey multiplexing solution designed to work with any radio or microwave communication pair. It's deployed inside the transceiver stack and implemented point-to-point, i.e. end devices don't need to understand it. As such, it's designed to scale with whatever bandwidth and frequencies are in use, and those vary enormously. Ergo, it's not limited to 2,000+ sub-channels - going back to Shifty's post.
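If it helps, here's spatial multiplexing stripped down to a few lines of numpy (purely illustrative, nothing like a real 802.11 PHY): two streams go out at the same time on the same frequencies, and the receiver separates them using the channel matrix - no sub-channels involved.

```python
# Toy 2x2 MIMO spatial multiplexing sketch - illustrative only, not a real 802.11 PHY.
import numpy as np

rng = np.random.default_rng(0)

# Two symbols sent at the same time, on the same frequency, from two antennas.
x = np.array([1 + 1j, -1 - 1j]) / np.sqrt(2)

# Flat-fading 2x2 channel matrix H: rows = receive antennas, columns = transmit antennas.
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)

# Each receive antenna hears a mix of both streams, plus a little noise.
noise = 0.01 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
y = H @ x + noise

# Zero-forcing receiver: invert the channel to separate the spatial streams again.
x_hat = np.linalg.pinv(H) @ y

print("sent:     ", np.round(x, 3))
print("recovered:", np.round(x_hat, 3))
```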
Not directly related to sub-channels, and I'm not even sure whether you mean sub-channels = resource units as used with OFDMA. That's the closest I get to your 2,000+ sub-channels when talking about Wi-Fi.
Quick history lesson: OFDM standardised a typical Wi-Fi 5+ connection on 312.5 kHz carriers, each of which can carry four sub-channels of 78.125 kHz. This means that where bandwidth (capacity) was less important than latency, you could transmit data across the four sub-channels in parallel instead. OFDMA expanded on this by divorcing the four sub-channels from each device/router pair and opening them up to the whole spectrum and all devices on the network.
The 2,000+ sub-channels is the number of 78.125 kHz sub-channels in the 6 GHz spectrum. If you look at this and the MIMO implementation and think they look a lot alike, you are correct. This is not a new technology or approach, just the latest IEEE standard.
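Rough arithmetic if anyone wants to sanity-check those figures (raw slot counts only; guard tones, pilots and the DC null eat into them in practice, and a single 160 MHz channel works out at 2048 slots, which is probably where a 2,000+ figure comes from):

```python
# Sanity-check arithmetic on the figures quoted above.
# Raw slot counts only; guard tones, pilots and DC nulls reduce the usable number.
legacy_spacing_hz = 312.5e3      # OFDM sub-carrier spacing up to Wi-Fi 5
fine_spacing_hz = 78.125e3       # Wi-Fi 6/6E OFDMA sub-carrier spacing

print(legacy_spacing_hz / fine_spacing_hz)   # 4.0   -> four fine sub-channels per legacy carrier

channel_width_hz = 160e6                     # one 160 MHz Wi-Fi 6E channel
print(channel_width_hz / fine_spacing_hz)    # 2048  -> roughly the "2,000+" figure

band_width_hz = 7.125e9 - 5.925e9            # full 6 GHz band span
print(band_width_hz / fine_spacing_hz)       # 15360 raw slots across the whole band
```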
And it actually takes proper planning and maintenance to be really good, whatever brand of Wi-Fi solution you have. The RF environment is forever changing, so ongoing maintenance is needed. Sure, worse-quality products are worse than good-quality ones, but by itself Wi-Fi is prone to more errors than cabled. The maths that goes into these technologies is way over my head, but there is a meaningful gap in BER between Wi-Fi and cabled, which is basically my point.
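To put a number on why a BER gap matters, here's how a per-bit error rate compounds over a full frame (the BER values below are placeholders for illustration only, not measurements from any real link):

```python
# How a per-bit error rate compounds across a whole frame.
# BER values here are placeholders for illustration, not measurements.
frame_bits = 1500 * 8    # one 1500-byte payload

for label, ber in [("cabled-ish", 1e-12), ("wireless-ish", 1e-6)]:
    # Probability that at least one bit in the frame is flipped,
    # assuming independent bit errors: 1 - (1 - BER)^bits.
    frame_error_prob = 1 - (1 - ber) ** frame_bits
    print(f"{label:>12}: BER {ber:.0e} -> frame error probability ~{frame_error_prob:.2e}")
```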
Putting your router in a good spot
is important, but I do zero 'maintenance' on my wifi network other than periodic firmware updates. The moment wifi moved into the 5 GHz band, the number of available frequencies, combined with the side effect of 5 GHz not penetrating neighbouring rooms/buildings as well as 2.4 GHz does, made signal noise a non-issue. Unless you're in a massive open space or a skyscraper made out of wood and paper, the disadvantages of 5 GHz become an advantage.
What are these sub-channels? I'm trying to understand what you mean by sub-channels. And if it's mandatory, wouldn't a cheap domestic router have to use it? If not, doesn't it break the standard?
I think I answered this above, but let's dive into it as it's not complicated. Take for example Wi-Fi 6E (6 GHz), which operates in the 5.925 GHz to 7.125 GHz range (in Europe only the lower part of that band is currently open, but it's still a lot). That is an absolutely massive amount of frequency space available to consumers, so even when you, your family and friends all connect your devices to your router, there is more than enough spectrum for every device to get good performance, unlike the days of 802.11a, b and g where the RF space was narrow and very congested.
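To give a sense of scale, the raw channel arithmetic looks like this (simple division only; real channel plans reserve a little guard space, so the actual counts are slightly lower):

```python
# Rough channel-count arithmetic for the 6 GHz band mentioned above.
# Raw division only; real channel plans reserve guard space, so actual counts are a bit lower.
band_width_mhz = 7125 - 5925      # 1200 MHz of spectrum

for channel_width_mhz in (20, 40, 80, 160):
    print(f"{channel_width_mhz:>3} MHz channels: up to {band_width_mhz // channel_width_mhz}")

# For comparison, the 2.4 GHz band is only ~83 MHz wide, which is why it offers
# just three non-overlapping 20 MHz channels and got so congested.
```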
With OFDM, each device/router connection is typically built on 312.5 kHz of spectrum (more in Wi-Fi 7), but rather than being a single 312.5 kHz slice, it comprises four sub-channels of 78.125 kHz each. Where latency is more important than bandwidth (capacity), you can send data across the four sub-channels simultaneously (multiplexing!). This is why latency with OFDM is much reduced. What OFDMA removed was the notion that sub-channels are sacrosanct to one device connection; instead they can be deployed across all of the devices using the network, so sub-channels sitting idle on low-usage connections can be handed to high-usage devices.
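Here's a toy sketch of that pooling idea (this is not the real 802.11ax scheduler, just the difference between fixed per-device sub-channels and a shared pool; the device names and queue sizes are made up):

```python
# Toy illustration of the OFDM vs OFDMA difference described above.
# Not the real 802.11ax scheduler - just the pooling idea, with made-up numbers.

devices = {"laptop": 6, "phone": 1, "thermostat": 0, "tv": 5}  # sub-channels' worth of queued data

# OFDM-style: each device gets its own fixed block of 4 sub-channels,
# whether it has data to send or not.
ofdm_used = {name: min(queued, 4) for name, queued in devices.items()}
ofdm_idle = sum(4 - used for used in ofdm_used.values())

# OFDMA-style: all sub-channels go into one pool and are handed to
# whichever devices actually have data queued.
pool = 4 * len(devices)
ofdma_used = {}
for name, queued in sorted(devices.items(), key=lambda kv: -kv[1]):
    grant = min(queued, pool)
    ofdma_used[name] = grant
    pool -= grant

print("OFDM  grants:", ofdm_used, "| idle sub-channels:", ofdm_idle)
print("OFDMA grants:", ofdma_used, "| idle sub-channels:", pool)
```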
Here is a Cisco visualisation:
As for standards in domestic wifi, well, that's a bit of a mess, which isn't much different from standards in other complex protocols like HDMI. Why do some TVs support 1440p and some don't? It's not an official HDMI standard, but many TVs support it as a feature. It's the same with routers: many implement non-standard protocols to improve their performance. It's a competitive market after all.