This is as much of an assumption as anything else. A whole G80 GPU didn't need to be inside the RSX. Save for video codecs (which would be handled by the Cell anyway), the architecture was just as modular as its predecessors and successors.
PS3 was originally planned to launch 6 months (or more) before G80 appeared. Despite the modularity of GPUs, smaller G80 derivatives did not appear until well into 2007.
You're suggesting that it would have been realistic for Nvidia to accelerate their unified-shader program by at least 6 months (!?), and to do so for a
massive-volume, low-margin part, to the detriment of their first unified chip in their core market, which was a
massive-margin, trickle-volume GPU.
We have seen that neither Nvidia nor ATI / AMD has ever introduced a radical new architecture for consoles 6+ months before the PC space. And Xenos was not an example of that either - it was not an "early" 2900, it was a spinoff from an abandoned PC GPU.
Suggesting that a G80 spinoff for PS3 was as realistic an option as RSX is not reasonable.
GF3, GF4 Ti and NV2A are all part of the NV20 family. The only thing NV2A had over the existing GeForce 3 Ti 500 was a second vertex shader. The rest was pretty much identical. IIRC the performance difference between a GF3 Ti 500 and a GF4 Ti at the same clocks was under 10% in most titles.
The only truly innovative thing going into the Xbox from nvidia was Soundstorm.
PC games designed for machines with limited vertex processing abilities are a poor way to judge how much "faster" GF4 could be in heavily vertex bottlenecked games due to its second vertex shader. Unsurprisingly, games with low vertex counts won't benefit greatly from a huge increase in vertex processing capability.
You'll note that even NV2A was a spinoff from the GF3/GF4 line, and not an early introduction of a radical new architecture.
Back in 2004 Microsoft had designed zero GPUs, whereas Sony's internal teams had developed both the PS1 GPU and the PS2's Graphics Synthesizer, plus a co-processor inside each console's CPU dedicated to geometry transformation and lighting (T&L coprocessors in practice).
I'm not sure if you're really missing the point, or simply being obtuse.
MS had a mass of experience with graphics. They were responsible for DX, were working on DX10, and would have been working with Nvidia, ATI and PC engine and middleware makers on an ongoing basis in order to understand and plan for evolving technologies.
PS1 and PS2 graphics chips gave Sony
zero visibility on where the PC was going and had nothing at all to do with Xenos or RSX. They were isolated - in terms of both hardware and software evolution - from the PC space.
Assumptions...
No one except some people at nvidia can know for sure if a G80 derivative could or could not be included in the PS3. That point is moot.
All we know is that Nvidia put G80 graphics cards on the shelves 3 days after the PS3 was launched, and SCE hasn't done business with Nvidia since.
That's not all we know, but it seems it's all you're willing to concede.
RSX would have needed to be in mass production before G80 started to trickle off the line, and it was originally intended for PS3 to launch many months earlier than it did. There was no unified shader part in RSX's performance and power segment until 18 months [edit: nope, 12 months] and one node change after PS3 was originally planned to launch.
It's also possible a 90 nm G80 derivative would have performed worse given the same area and power. Some developers here actually argued that point in the past.
You'll note that in games of the time, perf/mm^2 was lower for G80 than for the 7900 GTX, and that this was especially marked at lower resolutions:
https://www.techpowerup.com/reviews/NVIDIA/G80/6.html
Yep, you heard right.
Especially in light of a vertex cull monster like Cell, it's quite possible that a similarly sized G80 derivative would have been less ideal than RSX.
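To put some rough numbers behind that perf/mm² point, here's a back-of-envelope sketch. The die sizes are commonly cited approximations for G71 and G80 (both on 90 nm), and the relative-performance figure is an assumed ballpark for games of the era, not a number from the linked review:

```python
# Back-of-envelope perf/mm^2 comparison, 8800 GTX (G80) vs 7900 GTX (G71).
# Die sizes are approximate; the performance ratio is an assumption.
G71_AREA_MM2 = 196.0   # 7900 GTX die size, approx.
G80_AREA_MM2 = 484.0   # 8800 GTX die size, approx.

# Assume the 8800 GTX averaged roughly 1.8x a 7900 GTX in titles of
# the time (less at low resolutions, more at high ones).
g80_relative_perf = 1.8

area_ratio = G80_AREA_MM2 / G71_AREA_MM2            # ~2.47x the silicon
perf_per_mm2_ratio = g80_relative_perf / area_ratio  # ~0.73x

print(f"G80 area: {area_ratio:.2f}x the 7900 GTX")
print(f"G80 perf/mm^2: {perf_per_mm2_ratio:.2f}x the 7900 GTX")
```

Under those assumptions, G80 delivers noticeably less performance per square millimetre than G71, which is the point being made: a similarly sized 90 nm G80 derivative would not automatically have beaten RSX.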
I'll reiterate -
even if a G80 derivative had been possible (and it probably wasn't) it may have been a worse fit for PS3.
But this is all getting very off topic. Suffice to say, Nintendo and Nvidia will both have their reasons for what's in Switch, and they're probably very good reasons.