Nintendo Switch Tech Speculation discussion

PS4 and Xbox are fully custom SoCs. AMD does not sell anything even remotely similar...

Yes they do. Do your research.
Not really. Both have notable customisations. The PS4 has new APU features, 8 ACEs, and so on. The XB1 has DX12 enhancements, a few extra gubbins (3 display planes, 2 more DMA units), a DSP, yada yada. The customisations aren't massive, but they are different from 'just cut-down versions of the originals'.

Not to mention 8 CPU cores!

My bad. But the PS4 uses a "modified" HD 7870, right?
 
IMO, if Pascal were the target GPU of the final hardware, they wouldn't be using a TX1 for the devkits.

But part of the rumors says that the NS is faster when docked. I don't think we should discount that possibility just yet. It is possible that the GPU is comparable to the TX1 in handheld mode and comparable to the TX2 or slightly higher when docked. If that's the case, it makes sense to target the TX1 level of performance with the 720p screen in mind and then use the higher capability when docked to increase the resolution (see the rough numbers below).

PS: When considering the viability of the NS as a "competitor" to the XB1/PS4, at least when it comes to third-party developers targeting the console, has anyone considered Multi-res Shading? It looks like it could help improve the perceived image quality without sacrificing a lot of performance.
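As a rough illustration of the docked/handheld split suggested above, a minimal sketch; the linear-scaling assumption and both resolutions are mine, not anything confirmed:

```python
# Back-of-envelope: how much extra GPU throughput does docked 1080p need
# over handheld 720p? Assumes per-pixel shading cost dominates and scales
# linearly with pixel count, which real workloads only approximate.

handheld = 1280 * 720    # rumoured handheld screen
docked   = 1920 * 1080   # typical TV output

ratio = docked / handheld
print(f"Docked needs ~{ratio:.2f}x the pixel throughput")  # ~2.25x
```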
 
PS4 and Xbox are fully custom SoCs. AMD does not sell anything even remotely similar...
Semi-custom.
The easiest way of seeing it is that they take base parts from both the GPU and the CPU and make varying levels of customisations to build the SoC.
So you can see where the roots are from, but that doesn't make them the same.
I'm replying to you since you said fully custom, but it's just a general statement.
 
Don't dev kits usually overshoot the specs? Like how the PS4 and Xbox One dev kits had 7970s in them when the final consoles were about half as powerful?
I think they typically have more memory for debugging.

Pascal or Maxwell doesn't really change much in terms of performance. The same core count at the same clocks will result in the same performance. Pascal is better for energy consumption, and I agree that seems like it would be important to Nintendo with a product like this. Assuming the rumored 25GB/s bandwidth is correct, I would also think the bandwidth-saving compression techniques incorporated into Pascal would be important. I lean towards Pascal for the retail units because of the energy consumption improvements. Not saying it couldn't be a Maxwell 20nm part, but with Nintendo's priorities, it seems like those qualities would be too beneficial to pass up.
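To put the rumored 25GB/s in perspective, a minimal sketch; the pass composition and the ~1.3x compression ratio are placeholder assumptions, not quoted Pascal figures:

```python
# How far does a rumoured 25 GB/s bus stretch at 60 fps, and how much
# headroom does framebuffer compression buy back? Numbers are illustrative.

bus_bytes_per_s = 25e9
frame_budget_mb = bus_bytes_per_s / 60 / 1e6
print(f"Per-frame bandwidth budget: {frame_budget_mb:.0f} MB")  # ~417 MB

# One illustrative 720p pass: RGBA8 colour + 32-bit depth, each written
# once and read once. The ~1.3x delta-compression gain is an assumption.
px = 1280 * 720
pass_mb = px * (4 + 4) * 2 / 1e6
print(f"Uncompressed pass traffic:  {pass_mb:.1f} MB")          # ~14.7 MB
print(f"With ~1.3x compression:     {pass_mb / 1.3:.1f} MB")    # ~11.3 MB
```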

 
Semi-custom.
The easiest way of seeing it is that they take base parts from both the GPU and the CPU and make varying levels of customisations to build the SoC.
So you can see where the roots are from, but that doesn't make them the same.
I'm replying to you since you said fully custom, but it's just a general statement.

Well, it's semantics at that point IMO. I just wanted to highlight the fact that they are not 8 Jaguar cores and a 7870 slapped together with tape. It's a completely different chip that just happened to have similar GPU specs.
 
I'm not sure it really matters if the Switch is Pascal or Maxwell as long as it's on 16/14nm for the power savings. The SoC will not be pushing 1+ GHz speeds, so the speed optimizations of Pascal are kind of irrelevant.

Will compute preemption or the dynamic scheduling really matter here when we have so few shaders? I doubt the VR-specific additions are needed.

The only feature that looks to be really nice is the improved delta compression. However, we don't know if the custom SoC will have any embedded RAM. An embedded RAM pool could make the gains of the compression negligible. 10-16MB of very fast embedded SRAM could probably go a long way toward helping with slow bandwidth, especially if the target resolution is 720p. It seems like a Nintendo thing to do.
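For a sense of scale, a quick sketch of common 720p render-target sizes; the formats below are typical choices, not known Switch specs:

```python
# Would everyday 720p render targets fit in a 10-16 MB embedded pool?
# Formats below are common picks, not known Switch specifications.

px = 1280 * 720

colour_mb = px * 4 / 1e6   # RGBA8 back buffer
depth_mb  = px * 4 / 1e6   # 32-bit depth/stencil
hdr_mb    = px * 8 / 1e6   # FP16 RGBA intermediate target

total = colour_mb + depth_mb + hdr_mb
print(f"Colour:    {colour_mb:.1f} MB")   # ~3.7 MB
print(f"Depth:     {depth_mb:.1f} MB")    # ~3.7 MB
print(f"FP16 HDR:  {hdr_mb:.1f} MB")      # ~7.4 MB
print(f"All three: {total:.1f} MB")       # ~14.7 MB -> tight but plausible
```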
 
Well, it's semantics at that point IMO. I just wanted to highlight the fact that they are not 8 Jaguar cores and a 7870 slapped together with tape. It's a completely different chip that just happened to have similar GPU specs.
I knew what you meant; that's why I said it's more of a general statement.
AMD calls it semi-custom; full custom is something else, that's all.
 
I knew what you meant; that's why I said it's more of a general statement.
AMD calls it semi-custom; full custom is something else, that's all.

"Semi Custom" is a term used AMD to describe that arm of its business. However, AMD doesn't describes the console products produced by that business as semi-custom apus. AMD themselves in their marketing material describes the parts it provides to Sony and AMD as "custom" processors. Yes, the core technology is AMD but there are no 8 CPUs based apus, GDDR5 based apus, esram based apus or apus sporting greater than 1 Tflops outside of current gen consoles.

Both processors employ unique designs that differ in non-trivial ways from anything else produced by AMD. Plus, when has the definition of a "custom processor" required novelty all the way down to the architectural level?
 
@sebbbi
Can you shed some light on just how much game engines are using FP16? I have heard that Unreal 4 uses it extensively for mobile games, but I honestly can't find any detailed reports of this being true. If the usefulness of FP16 operations in games has increased significantly, I suppose that could help a Tegra chip bridge the gap between itself and the Xbone/PS4.

There are reasons for optimism. Epic Games and Bethesda haven't supported Nintendo platforms in... ever? They are both listed on Nintendo's developer list, and Skyrim was shown in the trailer. I know Skyrim isn't actually announced, but I really think that's just Nintendo being coy about things.
 
@sebbbi
Can you shed some light on just how much game engines are using FP16? I have heard that Unreal 4 uses it extensively for mobile games, but I honestly can't find any detailed reports of this being true. If the usefulness of FP16 operations in games has increased significantly, I suppose that could help a Tegra chip bridge the gap between itself and the Xbone/PS4.

There are reasons for optimism. Epic Games and Bethesda haven't supported Nintendo platforms in... ever? They are both listed on Nintendo's developer list, and Skyrim was shown in the trailer. I know Skyrim isn't actually announced, but I really think that's just Nintendo being coy about things.

I see no reason for optimism. This will be a more powerful Nvidia Shield; it's not gonna run most third-party games properly, and developers are not gonna put in the extra effort to support an already risky platform. Nintendo games will look nice. They look nice on Wii U, but more powerful hardware sure would help them, and the Switch should be a 2x jump.
 
Epic is announced because the Switch supports Unreal Engine 4, like the Wii U supported UE3. On paper. Having games actually use it is another story.
 
https://twitter.com/ImageForm/status/793482754767945728

Image & Form Games said:
Can't comment on specs, I'm afraid. But Nintendo are definitely not skimping on power!

Maybe it should be considered that this dev has never released a 3D game. It doesn't mean none of its members have developed on 3D engines (or that they can't read a spec sheet and form an opinion), but as a team, 3D games aren't really their expertise.
They did release the SteamWorld games on many platforms (from 3DS to PS4), though.
 
Not sure about that, but the more relevant way to word it, IMO, is: has any company ever released a dev kit that is significantly slower (like Maxwell vs. Pascal) than the final hardware?

And I'm not really aware of any, although I haven't really kept track of all devkits through the years.

IMO, if Pascal were the target GPU of the final hardware, they wouldn't be using a TX1 for the devkits.

Regards,
SB
Yes, initial PS3 dev kits were clearly weaker, for instance. However, these things depend on the console, obviously. Modelling what are, in essence, medium-power gaming PCs is super easy, since you have something very similar but more expensive/power-hungry on store shelves already.
The Switch is not so easy. If they want developers to have a good idea of the capabilities of the retail device and to use early versions of the development tools, and nVidia is supplying the whole shebang, just what did they have to choose from, really? The Tegra X1, that's what.
There are two additional indications, though. The dev kits are said to be fan-cooled. The final device has what look like rather substantial vents (compare with tablets, for instance, or even many compact laptops), which, since they are present on the final device, indicate that it will draw rather substantial power at least when stationary. If the clocks were the same in both modes, the screen shutting down in the dock would mean the device as a whole draws substantially less when docked. But it seems odd to assume that the Switch, when portable, would draw enough power to require fan cooling, as that would imply really short battery life. (Typical phablet battery capacity is roughly 10Wh. The 10" Pixel C has a 34Wh battery and lasts roughly five hours playing games. Subtract the screen.) If the Switch wants to have reasonable battery life when mobile, SoC+memory, screen and wireless communication can't draw more than a maximum of, say, 5W, with three watts or so going to the SoC, on the outside.
That doesn't require forced air cooling, or substantial vents. So it seems reasonable to assume that the device will actually draw more power when docked, even though the screen is off, which implies different clocks docked and mobile.
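Putting rough numbers on that reasoning, a sketch; the battery capacity and per-component draws are my guesses in line with the tablets cited above, not known specs:

```python
# Rough handheld battery-life estimate from the power budget above.
# The 16 Wh capacity and per-component draws are guesses, not specs.

battery_wh = 16.0   # hypothetical, between a phablet (~10 Wh) and Pixel C (34 Wh)
soc_w      = 3.0    # SoC + memory budget from the post above
screen_w   = 1.5    # assumed 6" 720p LCD
radios_w   = 0.5    # assumed Wi-Fi / controller radios

total_w = soc_w + screen_w + radios_w
print(f"System draw:  {total_w:.1f} W")
print(f"Battery life: {battery_wh / total_w:.1f} h")  # ~3.2 h
```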

Now, Nintendo could intend to save money and order a very heavily cut-down Parker SoC, but then you have to question why on earth they would push such a SoC so hard that it requires forced air cooling. Seems unlikely. It only really makes sense if Nintendo actually chooses the X1 itself, or a 16nm device with X1-equivalent performance (a smaller FinFET SoC pushed harder) or higher. But there is no way to differentiate between these three from the choice of X1 for the dev kit, because the X1 was the only option that nVidia could provide early on at that power level or up. They unveiled Parker on August 22nd. God knows the state of silicon right now.
 
It's certainly relative to what the developer's needs are, though, so I can understand why ToTTenTranz wouldn't read too much into their comment. Skimping on power could very well be in comparison to what Nintendo had with the 3DS. For them and the games they create, the Switch may have plenty of overhead to deliver their vision.

Looks like sebbbi has commented on FP16 and its usefulness in other threads:

Sometimes it requires more work to get lower precision calculations to work (with zero image quality degradation), but so far I haven't encountered big problems in fitting my pixel shader code to FP16 (including lighting code). Console developers have a lot of FP16 pixel shader experience because of PS3. Basically all PS3 pixel shader code was running on FP16.

In this case, the increased FP16 execution resources allow the GPU to run the majority of the existing software faster while also maintaining good battery life.

We also have Mark Cerny talking up FP16 for the PS4 Pro.

Now correct me if I am wrong, but the Xbox One and the standard PS4 do not see higher throughput with FP16 operations. So while it's not a silver bullet, it does seem likely that there is a significant advantage to the Tegra chip supporting FP16. I think the 1 TFLOPS figure for the X1 was perceived to be little more than a marketing bullet point, but from what I am reading regarding PS4 Pro supporting it, that no longer seems to be the popular opinion.
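For reference, that 1 TFLOPS figure falls straight out of the double-rate FP16 math; the TX1 core count is public, but the clock here is rounded for illustration:

```python
# Where the Tegra X1's "1 TFLOPS" marketing number comes from: Maxwell's
# Tegra variant issues packed FP16 ops at twice the FP32 rate.

cuda_cores = 256    # TX1 shader cores
fma_flops  = 2      # a fused multiply-add counts as two FLOPs
clock_ghz  = 1.0    # rounded; retail clocks may differ

fp32_gflops = cuda_cores * fma_flops * clock_ghz
fp16_gflops = fp32_gflops * 2   # two FP16 values per lane (fp16x2)

print(f"FP32: {fp32_gflops:.0f} GFLOPS")  # 512
print(f"FP16: {fp16_gflops:.0f} GFLOPS")  # 1024 -> the ~1 TFLOPS figure
```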
 
It's certainly relative to what the developer's needs are, though, so I can understand why ToTTenTranz wouldn't read too much into their comment.

I only stated that 3D games aren't in that team's portfolio (which caters to their own needs, as you mentioned). However, I actually counterbalanced that with the fact that this doesn't mean their members don't have experience with 3D engines, or that they wouldn't be able to properly read a spec sheet. And as another plus, they have developed for a lot of platforms, which gives them a frame of comparison that others wouldn't have.


Don't bother, though. If you look at Picao84's posts from the past 7 days, you'll note he only posts to pursue some weird personal crusade.
All I can do is report and trust moderation to act...
 
My point was just that the fact that they may not have big needs should not be a measure of what they say. If that is how we measured anyone's opinion, then we should all just shut up, because the majority of us here likely have even less experience than they do, plus no access to privileged info. In essence, Need != Knowledge. I just thought your comment was a little bit unfair and almost patronising to them.

@ToTTenTranz - if that is how we are going to play, expect some reports from me as well.
 
I guess power is always relative. I just bought a Lumia 950 XL (for development and out of curiosity about the Continuum functionality), and that thing has a 2560x1440 resolution or something crazy like that on a 5.7" screen and a pretty powerful processor (I'm running the Continuum dock now, typing this into a browser on a 24" monitor with a regular keyboard and mouse, and it feels ... futuristic). I think the resolution Nintendo has chosen for the Switch makes sense for gaming, in that it will make the device relatively more powerful. It will be interesting to see if they manage to not disappoint for once; so far, everything they've ever made, bar one or two devices perhaps, has always been underpowered, but that generally hasn't been a problem.
 