My experience is anecdotal, so it should be evaluated as such, but the majority of people I know view multi-card setups as something rather exotic.
From what I've seen of chatter on the internet (a limited sample; I don't frequent a lot of the boards where members like to pimp their rigs), AMD CrossFire is frequently mentioned in the form "you can use CrossFire to equal Nvidia's single-card offering".
That's not exactly resounding praise for AMD, but I guess you have to get kudos where you can.
I'm interested in the numbers for this scheme.
Is it better for Nvidia to sell a very expensive, high-margin Ultra, or can AMD come out ahead on volume by selling multiple lower-margin RV670s?
On one hand, a single-chip or single-card solution is preferable when it's achievable, and it seems clear that one manufacturer hits the limit of what's achievable sooner than the other.
After all, why buy two cards for 1.8X scaling when you can buy a single card that is twice as capable and delivers that by default, with no scaling caveats?
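For concreteness, here's the kind of back-of-envelope math I have in mind. Every number below is pulled out of thin air; these are not actual ASPs, board costs, or margins for the Ultra or the RV670, just placeholders to show the shape of the comparison:

```python
# Back-of-envelope sketch with made-up numbers -- none of these prices,
# costs, or margins are real figures for the Ultra or the RV670.

def gross_profit(asp, unit_cost, units):
    """Gross profit from selling `units` boards at `asp` costing `unit_cost` each."""
    return (asp - unit_cost) * units

# Hypothetical single big-die card: high price, high cost, fat margin per unit.
ultra_profit = gross_profit(asp=830.0, unit_cost=480.0, units=1)

# Hypothetical pair of small-die cards: thinner margin per unit,
# but two units sold per "equivalent" system.
pair_profit = gross_profit(asp=250.0, unit_cost=170.0, units=2)

# Buyer's side: performance per dollar, using the ~1.8x scaling for the pair
# versus 2x capability for the single card mentioned above.
single_perf_per_dollar = 2.0 / 830.0
pair_perf_per_dollar = 1.8 / (250.0 * 2)

print(f"Seller gross profit -- single big card: ${ultra_profit:.0f}, card pair: ${pair_profit:.0f}")
print(f"Buyer perf/$        -- single big card: {single_perf_per_dollar:.4f}, card pair: {pair_perf_per_dollar:.4f}")
```

Who wins that comparison obviously hinges entirely on the real prices and costs, which is exactly the part I don't have.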
Then again, it is possible that Nvidia's having more difficulty playing the manufacturing game with larger die sizes.
Which manufacturer is better off depends on just where each of them sits on the manufacturing-cost and integration curve.
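The die-size point is worth making concrete. With a simple Poisson yield model (yield falling off exponentially with die area), a big die doesn't just use proportionally more silicon; it also yields worse, so its cost per good die climbs faster than its area. A rough sketch, with an entirely made-up defect density and wafer cost that don't describe anyone's actual process economics:

```python
import math

# Toy yield model: Poisson yield Y = exp(-D * A), hypothetical numbers only.
WAFER_DIAMETER_MM = 300.0
WAFER_COST = 5000.0      # made-up cost per processed wafer, in dollars
DEFECT_DENSITY = 0.003   # made-up defects per mm^2

def dies_per_wafer(die_area_mm2):
    """Gross dies per wafer (standard approximation, ignoring reticle limits)."""
    r = WAFER_DIAMETER_MM / 2.0
    return (math.pi * r * r / die_area_mm2
            - math.pi * WAFER_DIAMETER_MM / math.sqrt(2.0 * die_area_mm2))

def cost_per_good_die(die_area_mm2):
    """Wafer cost amortized over the dies that actually yield."""
    yield_fraction = math.exp(-DEFECT_DENSITY * die_area_mm2)
    return WAFER_COST / (dies_per_wafer(die_area_mm2) * yield_fraction)

# One hypothetical big die versus two hypothetical dies of half the area.
big = cost_per_good_die(480.0)
small = cost_per_good_die(240.0)
print(f"big die:        ${big:.0f} per good die")
print(f"two small dies: ${2 * small:.0f} for the pair")
```

Under those made-up assumptions, two half-size dies come out well under the cost of one big die; the real question is whether actual defect densities and wafer prices put Nvidia on the steep part of that curve.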
Perhaps AMD is better served by laying the groundwork for multi-card solutions now, though in my admittedly conservative opinion, CrossFire and SLI are at least in part just another way for the IHVs' drivers to louse things up.
Just like with multicore CPUs, you don't go for multiples unless you can't get any further with one.
It's a pain in the ass for AMD, then, that apparently someone else can.
If Nvidia can keep manufacturing issues at bay, it can keep an edge.