All jokes about the elaborate RHY hoax aside, are there any confirmed accounts of TK1's die size? I'm surprised it isn't easily obtainable given TK1 has been out for over a month now (with Jetson).
Yes, I wonder why no one just pops the lid off the K1 on the Jetson and takes a high-res picture. It's only $192; you would think some technical publication could swing a $192 purchase.
https://developer.nvidia.com/jetson-tk1
The memory is off-package for K1, which helps. Die size is 121 mm².
Apple's A7 is 102 mm². Isn't that extremely large for a mobile SoC? Or have they grown in recent years?
Ah, so mobile SoCs have indeed grown in recent years. Thanks.
The 5422 is about 136.6 mm².
Which might sound OT, but it brings me back to the point of perf/mm² again. Even if Samsung were to use the newer T760 Mali GPU IP, the perf/mm² ratio, especially for the GPU, would be a complete joke compared to what GK20A is doing in K1.
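For anyone wondering what that perf/mm² comparison actually measures, here's a minimal sketch of the metric (my own illustration, not vendor data): peak FP32 throughput divided by the GPU block's die area. Only GK20A's 192 Kepler ALUs at 2 FLOPs/clock is a public figure; the clock and block areas below are placeholders I made up.

```python
# Hypothetical perf/mm2 illustration: only GK20A's 192 Kepler ALUs at
# 2 FLOPs/clock is public; the clock and block areas are placeholders.

def perf_per_mm2(alus, flops_per_clock, clock_ghz, gpu_area_mm2):
    gflops = alus * flops_per_clock * clock_ghz   # peak FP32 throughput
    return gflops / gpu_area_mm2                  # GFLOPS per mm2 of GPU block

# e.g. GK20A with an assumed 850MHz clock and an assumed 25mm2 GPU block:
print(f"{perf_per_mm2(192, 2, 0.85, 25.0):.1f} GFLOPS/mm2")
# A Mali T760 configuration would be plugged in the same way for the comparison.
```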
Isn't that kind of a pointless point, since nVIDIA will not compete in the mainstream mobile phone/tablet market anymore? Whatever the T760 Mali GPU offers is good enough for the market it targets, while GK20A is just overkill for that same market.
Which ultimately was a result of terribly bad judgement and planning by nVIDIA when developing Tegra, IMO.
If you should mean DX11, for instance, I can't blame them either; IMG's marketing was shouting from the rooftops that Rogue would bounce from DX10 to DX11.x, ARM also mentioned something about tessellation in its T7xx material (albeit the word behind the curtain is that it's far from being DX11), Vivante made similar announcements, and while Qualcomm had been hammered, in a relative sense, for supposedly being late with a DX11 GPU, they're among the first to actually deliver one after all.
I'm not necessarily speaking about the feature set as much as market segmentation. Just as they have piss-poor GPU SKUs, they should have planned for a less overpowered SoC as well. That way, as the overkill SoC did not gain traction, they could at least have managed to keep a presence in the market.
A DX11 tessellation unit isn't exactly irrelevant to power consumption, even indirectly, as it consumes more die area than many would imagine, plus of course all the other DX11 requirements.
From what I had heard, Imagination skipped improved rounding in present Rogue GPUs, which accounted for a healthy die area difference in the ALUs and bounced them from DX10.1 down to DX10.0. Given that alone, I wouldn't want to imagine what all the additional requirements for DX11 would cost in hardware. I might be wrong, but I wouldn't be surprised if it were in the =/>50% region.
Take that hypothetical figure and consider one SoC GPU block that's 20 mm² and another that's 30 mm², under the same process and frequency target, and tell me what power could look like for the former vs. the latter.
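To put rough numbers on that thought experiment, here's a minimal back-of-the-envelope sketch (my own illustration, not from any vendor data). It assumes, to first order, that both switched capacitance and leakage scale roughly linearly with active GPU die area at the same process, voltage and clock; the per-mm² power densities are pure placeholders, only the ratio between the two blocks matters.

```python
# Back-of-the-envelope sketch: assumes dynamic and leakage power both scale
# roughly linearly with active die area at the same process, voltage and clock.
# The per-mm2 power densities below are placeholders, not measured values.

def gpu_power_estimate(area_mm2, dyn_w_per_mm2=0.08, leak_w_per_mm2=0.02):
    dynamic = dyn_w_per_mm2 * area_mm2    # ~ C_eff * V^2 * f, with C_eff ∝ area
    leakage = leak_w_per_mm2 * area_mm2   # leakage ∝ transistor count ∝ area
    return dynamic + leakage

small, large = gpu_power_estimate(20), gpu_power_estimate(30)
print(f"20 mm2 block: ~{small:.1f} W, 30 mm2 block: ~{large:.1f} W "
      f"({large / small:.2f}x)")  # -> 1.50x, the area ratio carries straight through
```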
GK20A is arguably overkill for how the ULP SoC market has shaped up in terms of feature set; T760, like most other GPU IP for that matter, is highly scalable. You can theoretically scale a T760 up to 16 clusters if needed. In the above 5422 example you have a 136 mm² SoC monster, compared to which Tegra K1 is a bit smaller, yet even if you clocked the K1 GPU at just 150MHz it would run circles around the former.
In that regard I'm more puzzled about why QCOM rushed to get DX11 into its Adreno 4x0 family of GPUs than anything else; NV, in its defense, had Kepler already developed, and it sounds like less work to shrink an existing design than to start from a tabula rasa with a somewhat ~DX10, fully compute-capable design.
WTF do I need tessellation for in the next Galaxy S5, exactly? Unless someone has the super dumb idea to create an Android live wallpaper with a shitload of tessellation in order to drain the phone's battery in the blink of an eye, I don't see it being of any meaningful use there.