As a Home & Mobile console all-in-one.
At the very least I would think the Tegra chip in the Switch is matched to a 128-bit memory bus, processed on 16nm FinFET, and clocks north of 1 GHz. This is one powerful portable console.
It's the standard configuration for Parker (Nvidia's 16nm FinFET Tegra X1 successor). That would require two LPDDR4 x64 packages, or four x32 ones. That's probably very expensive...
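For rough numbers, peak LPDDR4 bandwidth scales linearly with bus width. A quick sketch, assuming the 3200 MT/s LPDDR4 the Shield TV's X1 is commonly quoted with (illustrative math, not confirmed Switch specs):

```python
def bandwidth_gbps(bus_width_bits, transfer_rate_mts):
    """Peak bandwidth in GB/s: bits per transfer * transfers per second / 8 bits per byte."""
    return bus_width_bits * transfer_rate_mts / 8 / 1000

# Stock Tegra X1: a single 64-bit LPDDR4 channel.
print(bandwidth_gbps(64, 3200))   # 25.6 GB/s

# Hypothetical 128-bit bus (two x64 packages, or four x32 ones).
print(bandwidth_gbps(128, 3200))  # 51.2 GB/s
```

So doubling the bus width doubles peak bandwidth, at the cost of more memory packages, more pins, and more chip perimeter.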
Why?At the very least I would think the Tegra chip in the Switch is matched to a 128 bit memory bus, processed on 16nm FinFet, and clocks north of 1Ghz.
Why?
Seems doubtful they'd blow past the leaked dev kit specs by doubling up on the bus width or increasing clock speeds, both of which have implications for die size, TDP, and power consumption. Shifting from 20nm to 16nm is mostly a win for power consumption, with a minuscule density increase (short of massive re-architecting).
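On the clock/TDP point: the usual first-order model is dynamic power ≈ C·V²·f, so a clock bump that also requires a voltage bump costs more than linearly. The numbers below are purely illustrative, not measured Tegra values:

```python
def relative_power(v_ratio, f_ratio):
    """Dynamic power relative to baseline when voltage and frequency
    scale by the given ratios (P_dyn ~ C * V^2 * f)."""
    return v_ratio ** 2 * f_ratio

# A 25% clock bump that also needs ~10% more voltage (hypothetical):
print(relative_power(1.10, 1.25))  # ~1.51x the dynamic power
```

That quadratic voltage term is why handheld designs tend to favor wider, slower configurations over chasing clocks.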
If they shaved off the A53s, that would further save die space.
Seems doubtful they'd blow past the leaked dev kit specs
The memory bus takes up chip perimeter, so it depends. Is it impossible to match up a standard Tegra X1 to a 128-bit memory bus?
For something that goes into a vehicle... The move to 16nm FinFET allows the Tegra Parker to boost clocks up to 1.5 GHz.
Does anyone know if the Tegra X1 is bandwidth starved? On paper it looks bad, and you would think it's a huge bottleneck for a 512 GFLOPS GPU rendering games like Doom BFG and Trine 2 at 1080p, but it pulled those off on the Shield console with fantastic results, and that's running on the Android API. I know second-gen Maxwell incorporated improved compression, but holy crap.
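One way to frame the "bandwidth starved" question is roofline-style arithmetic intensity: how many FLOPs a kernel must perform per byte of DRAM traffic before compute, rather than memory, becomes the limit. A sketch using the X1's commonly quoted 25.6 GB/s figure:

```python
def flops_per_byte(gflops, gbytes_per_s):
    """Arithmetic intensity a workload must reach for the GPU
    to be compute-bound rather than bandwidth-bound."""
    return gflops / gbytes_per_s

# 512 GFLOPS GPU on a single 64-bit LPDDR4 channel (~25.6 GB/s):
print(flops_per_byte(512, 25.6))  # 20.0 FLOPs per byte of DRAM traffic
```

20 FLOPs per byte is a steep ratio for a GPU, which is why on-chip caching, tiling, and Maxwell's improved delta color compression matter so much here.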
Honestly, does anyone know if this leak appeared anywhere else other than a single GAF post from a totally random guy who basically dumped the specs of a Tegra X1?
AFAIK that leak is just as plausible as Goodtwin's speculation.
I don't think anyone disputes the idea that the TX1 was used in the development kits, but that doesn't mean it's final hardware. I remember Shin'en said that development kits became more capable as the release of Wii U drew near. I think the TX1 gives us a ballpark idea of performance, but I fail to see why Nvidia would spend hundreds of man-years slightly modifying a TX1.
I remember Shin'en said that development kits became more capable as the release of Wii U drew near.