Their results are just what happens when they can't throw as much extra power and silicon at the problem to outweigh their disadvantage versus TBDR, the way they did in the desktop space (and, for a while, even in the mobile co-processor space).
That's not quite right: the GoForce business declined more because of the decision to integrate a 3D core *at all* than because the 3D core itself wasn't very good. Furthermore, it's not clear to me that power efficiency would be bad on these chips; the real problem would be cost, because they used a truckload of on-chip SRAM for the framebuffer to reduce power consumption.
Here's what happened. Between Q1 2003 and Q1 2004, MediaQ/NVIDIA released four chips: the MQ2100, GoForce 2150, GoForce 3000, and GoForce 4000 (all on 150nm, except perhaps the first on 180nm?).
Those were all mostly done by the time NVIDIA bought MediaQ, I suspect, and all of them were quite successful, making the MediaQ acquisition a good one overall. Then NVIDIA started dictating the roadmap: they took the GoForce 4000's multimedia and added a very expensive 3D core (1280KB of SRAM for framebuffer and textures!) on the same 150nm process, producing the GoForce 4500. The chip was very expensive, and aside from the epic fail also known as the Gizmondo handheld console, it never shipped in anything.
Then came the GoForce 4800 in 1Q05: they removed half the TMUs, shrunk it to 130nm, and added VGA video decode/encode (MPEG-4, not yet H.264). It was still clearly a high-end solution. And then in 1Q06, they released the GoForce 5500, with completely revamped (and frankly extremely good on paper) multimedia and a clock-bumped 3D core that finally used stacked DRAM. But it was still a high-end solution with an ASP around $20 IIRC, so it didn't get much traction in a market that just didn't care much about 3D.
And finally, three long years after their last non-3D chip (the GoForce 4000), they released the GoForce 5300 on 65nm, using the same multimedia architecture as the 5500 but with on-chip eDRAM instead of stacked DRAM, no Image Signal Processor, and support limited to 3MP cameras. A good chip on paper, but it no longer made sense in a market that had suddenly started caring about 3D again (iPhone!) and was still engaged in a camera megapixel race.
With 20/20 hindsight, the same architecture, and similar investment, here's a roadmap that could have done substantially better:
- GoForce 4000 shrink on 130nm instead of GoForce 4500 on 150nm
- GoForce 4800 should have used an Imageon-like memory hierarchy (some SRAM + stacked DRAM)
- GoForce 5500 should have been released ~2 quarters later but on 65nm, removing the need for the GoForce 5300
- Remaining resources used for a lower-end 65nm Tegra (720p decode, VGA encode, 1x TMU, 8MP, etc.)
And regarding Tegra1, I think the main problem, besides not jumping on Android early enough, was the lack of CPU power. There were two possible solutions there: either switch to a Cortex-A8 (I've always wondered whether the ARM11 MPCore license was Rayfield's first move or his predecessor's last move...) or, more interestingly, use a Triple Gate Oxide process to achieve higher clock rates on the CPU (and, to a lesser extent, throughout the chip). Yes, that's not cheap, but neither is a Cortex-A8, or increasing your number of TMUs to get the same performance effect.
Anyhow, what's done is done, and now the only things that really matter are the level of success of their reasonable number of major Tegra2 design wins and, even more importantly, their execution on Tegra3. We'll see - I'm not sure I care enough to do anything but wait a year and judge how things stand then. Way too many factors to consider before that.