I think this is wrong, because Adreno 200 definitely consists of only one vec4 FP32 ALU. I don't think anyone would be publicizing 2x and 4x performance increases if Adreno 205 and 220 had moved to 4 and 8 vec4 ALUs respectively.
As for the ULP GeForce, it's an embedded design that shares pieces of several NV DX9 desktop architectures; it's a mistake to expect an embedded design to be a direct shrink or copy of a desktop design cluster.
And even a cursory look at the GeForce 6 design as presented in GPU Gems shows plenty of differences from the ULP GeForce. It's hard to call the latter "derived" from it on the information given so far.
metafor said:
With a write-back, inclusive cache, I don't see this being much of a problem as evictions would occur on their own. If hardware enforces evictions-on-conflict, I don't see what routines the OS could be doing in which it relies on being able to invalidate a set/way before write-back. That seems fairly dangerous...
So what you're saying is that you don't see a problem if the flush operations fail to actually flush what you think you're flushing. It's not enough for a cache to be write-back when you need dirty data flushed NOW, for instance when other devices (not coherent with the core) need to see what changed, as sketched below. If the cache were write-through that might be another story, but it isn't always.
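To make the hazard concrete, here's a minimal sketch of my own (not from either post), assuming an ARMv7-A core with a write-back data cache and a DMA engine that is not coherent with the CPU caches; the names CACHE_LINE and clean_dcache_range are illustrative, not a real API:

```c
/*
 * Minimal sketch, assuming an ARMv7-A core with a write-back data cache
 * and a DMA engine that is NOT coherent with the CPU caches. CACHE_LINE
 * and clean_dcache_range are hypothetical names for illustration.
 */
#include <stdint.h>
#include <stddef.h>

#define CACHE_LINE 64u   /* assumption: data cache line size of the core */

void clean_dcache_range(const void *buf, size_t len)
{
    uintptr_t addr = (uintptr_t)buf & ~(uintptr_t)(CACHE_LINE - 1);
    uintptr_t end  = (uintptr_t)buf + len;

    for (; addr < end; addr += CACHE_LINE) {
        /* DCCMVAC: clean (write back) one data-cache line by virtual
         * address to the Point of Coherency, without invalidating it. */
        __asm__ volatile("mcr p15, 0, %0, c7, c10, 1" : : "r"(addr) : "memory");
    }
    /* DSB: wait for the write-backs to reach memory before the device
     * is told to start reading. */
    __asm__ volatile("dsb" : : : "memory");
}

/*
 * Typical flow: the CPU fills a buffer or descriptor ring, calls
 * clean_dcache_range() on it, then kicks the DMA engine. The device only
 * sees DRAM, so if the clean silently does nothing (or hits the wrong
 * lines), the device reads stale data -- which is exactly the failure
 * mode being argued about here.
 */
```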
metafor said:
Unless they come out with a drastically improved Atom uarch, I'd wager that point is well below the 5W range.
Maybe, but until Intel actually releases < 17W CPUs it's speculation. I don't think it's a lock that IB keeps scaling down to 10W at the specs you're describing. Someone should try measuring power consumption at full load with those hard clock limits.
metafor said:
My point isn't that IB will be advantageous in that it'll support legacy apps. My point is that IB at the high-end tablet's power envelope will be competitive in its own right. Both in terms of performance and power consumption.
Competitive while running what? How much demand do you think there currently is for vastly more performance than tablets currently provide, while still running the same programs?
metafor said:
Aren't Celerons just whatever design-of-the-year they're currently selling on the high-end, but with certain features and a bunch of cache disabled? It's a different name, but that'd be IB. And performance-wise, it'd still be on par if not ahead of A15 based SoC's.
When I say crippled, I mean a lot of the die disabled, not just cache; there are single-core Celerons, for instance. I didn't mean to imply it wouldn't be the same uarch, but we're talking about a major difference in performance level. The lack of turbo makes a really big difference too when you're talking about base clocks around 1GHz.
metafor said:
Which is what has been changing, since it's the cheap tablets that are now getting the lion's share of non-iOS sales.
Yes, there's a market for more expensive stuff, but you're losing sight of your original claim. I asked why the list only has 10" devices with a bunch of 4+1 Cortex-A15s, and you said it's because nVidia is looking to compete against Ivy Bridge. That isn't saying there's a market here; it's saying that's the ONLY market nVidia sees its next-gen tablets playing in, which I really, really doubt. What do you see nVidia speccing for those cheaper tablets? Obviously not Grey.
So far the market for more-expensive-than-Kindle-Fire tablets (that aren't an iPad) really has been pretty marginal, especially if you take out turns-into-a-laptop form factors as a side feature.