Assuming Ubisoft continues the AC Chronicles series, I would bet it makes its way to the Switch for sure.
So Dave over at GAF has been continuing to run tests, and it seems that the CPU can throttle the GPU, and vice versa. Basically, they can't both sustain max heavy loads without throttling. It makes sense that this wasn't immediately obvious in benchmarks, since benchmarks typically only push the CPU or the GPU, not both at the same time. Shield TV acts more like a tablet/phone than a console: it throttles whenever it can. Once he was able to completely lock the CPU clocks to max speed, the GPU would then throttle between 614 MHz and 768 MHz. I would still like to see what the clocks do while gaming, but I think it's likely that the clock speeds are not at maximum when playing games. Who knows, perhaps if a game doesn't push the CPU, the GPU can stay clocked at 1 GHz, but if the CPU is being pushed, the GPU lowers its clocks. The fact that these tests keep showing a 768 MHz throttled speed under load suggests that it's the sweet spot; going higher isn't sustainable with such meager cooling.
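To make the idea concrete, here's a toy model of a shared power/thermal budget. All the wattage numbers are made up for illustration (they are not real TX1 figures); the only real values are the 614/768/1000 MHz clock steps from Dave's tests. The point is just the mechanism: when the CPU is locked at max, there's less headroom left, so the governor has to pick a lower GPU clock.

```python
# Toy sketch of a shared thermal/power budget. All wattage numbers are
# invented for illustration -- they are NOT measured TX1 power figures.

BUDGET_W = 10.0        # assumed total sustainable board budget (watts)
CPU_MAX_W = 4.0        # assumed CPU power draw with clocks locked at max
GPU_FREQS_MHZ = [1000, 768, 614]      # GPU clock steps seen in the tests
GPU_WATTS =     [8.0,  5.5,  4.0]     # assumed GPU draw at each step

def gpu_clock_under(cpu_watts):
    """Pick the highest GPU clock that fits in the remaining budget."""
    headroom = BUDGET_W - cpu_watts
    for freq, watts in zip(GPU_FREQS_MHZ, GPU_WATTS):
        if watts <= headroom:
            return freq
    return GPU_FREQS_MHZ[-1]  # floor: nothing fits, take the lowest step

print(gpu_clock_under(0.5))       # light CPU load: 1000 (GPU holds 1 GHz)
print(gpu_clock_under(CPU_MAX_W)) # CPU pegged at max: 768 (GPU throttles)
```

With light CPU load the GPU sits at 1 GHz; lock the CPU at max and the same governor lands on 768 MHz, which matches the pattern Dave keeps seeing.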
Simply clocking high isn't a big deal, but clocking high under load is another story. The custom API is only going to push utilization up, so at a given frequency the Switch is going to be doing more actual work than the Shield TV can. Android is a thick API that doesn't give developers access to the metal. I think in a few days it's going to be pretty obvious that the Tegra chip powering the Switch is the best Nvidia's got within the design parameters, and not a cut-down TX1 like some thought after hearing the clock speeds.
Dave also ran some tests for bandwidth, and they showed good performance at 1080p with 8x MSAA. The scene looked pretty simplistic, but the results didn't point to an obvious bottleneck with memory bandwidth.
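For a sense of why 8x MSAA is a decent bandwidth stress test, here's the naive back-of-envelope framebuffer math (assuming uncompressed RGBA8 color and a 32-bit depth/stencil buffer; real Maxwell-class GPUs compress color data, so actual memory traffic would be lower than this):

```python
# Naive framebuffer size for 1080p with 8x MSAA, assuming no compression.
WIDTH, HEIGHT = 1920, 1080
SAMPLES = 8
BYTES_COLOR = 4   # RGBA8, 4 bytes per sample
BYTES_DEPTH = 4   # e.g. D24S8, 4 bytes per sample

pixels = WIDTH * HEIGHT
color_mb = pixels * SAMPLES * BYTES_COLOR / (1024 * 1024)
depth_mb = pixels * SAMPLES * BYTES_DEPTH / (1024 * 1024)

print(round(color_mb, 1))  # ~63.3 MB color
print(round(depth_mb, 1))  # ~63.3 MB depth/stencil
```

So that's well over 100 MB of render targets being touched every frame in the worst case, which is exactly the kind of load that would expose a bandwidth bottleneck if there were one.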
Just a few more days......