Nintendo Switch Tech Speculation discussion

Exactly, which is why it doesn't make any sense to bring Puma into the discussion. The A57 was suggested as an alternative to the CPU in the consoles. Timelines didn't seem to line up for the A57, so it was already established that it was not really an alternative at the time (although I still think it would have been relatively easy to have it earlier if there had been interest in 64-bit ARM from the "big boys"). Next up, performance was discussed, in relation to the consoles obviously. How could you lose track?
I didn't. I just reminded everyone, including you, that if one hypothetically discusses later CPU core designs (the A57 as in the Tegra X1 in the Shield) as alternatives, one should always consider contemporary designs for comparison (and I brought up an example of a chip released significantly before the one you chose). Otherwise the discussion is pointless. If the A57 implementation in the X1 would have been an alternative, surely the same would apply to the Puma cores in Mullins, as that chip was available roughly a year earlier than the X1. If it's not, you are arguing in a meaningless vacuum. I hope you get my drift. ;)

==========================================

Anyway, what is the general opinion regarding the dock? There is some half-hearted denial from Nintendo regarding the claim that performance goes up when docked. To me, it seems pretty natural that the clock speed wouldn't be limited by battery life concerns anymore, so performance should be easily scalable by some significant percentage when docked, shouldn't it?
 
The emotive language here suggests an investment in Nintendo that'll make fair discussion impossible...

By "trash", do you mean achieve more technically or look better? Because these games you cite are definitely very pretty, but they're also very controlled. I can't think of any similar titles on PS3 to make a reasonably fair comparison to. Those going 2.5D tend to be pushing the envelope in other ways, such as LBP's novel lighting and dynamic materials. And obviously smaller indie titles don't have the budget or art talent of Nintendo.

Not at all. I can say with no reservation that Nintendo made mistakes with the Wii and Wii U, especially the latter. On the flip side, I can see some anti-Nintendo rhetoric; what you're writing makes it seem like you think the Wii U has no advantage over the PS3, when it clearly has a much more modern GPU and a superior RAM setup. With the Wii, adding enough power to make games in HD and a bigger CPU uptick would've done wonders; Nintendo themselves have said the Wii should've been HD from the start.

Both, really, although to be fair very few high-profile PS3 developers made games with a cartoony aesthetic. Lighting and material shaders are the most obvious improvements in Wii U games over PS3 software.

1) Why not play on the TV you have? 2) SD res even on a 32" looks terrible. 3) I've never heard anyone compliment the Wii's visuals, and I heard plenty of ordinary people complain about how things looked. They did the job, but had the hardware been much better, the platform might have had a longer lifespan and not been relegated to the drawer so quickly.

1. You can say that about any console. Why not play an N64 on a big HDTV? Well, that's not how the graphics were meant to be displayed.

2. Looks good to me on a 32". Sure, the jaggies are there, but real 16:9 480p is crisp enough.

3. Anecdotal; I've heard many people praise a lot of different Wii games for their visuals. I bet the same people who bash Wii/Wii U graphics will praise Xbox/PS3 graphics and how they were so impressive for their time and still look good. But neither of our statements here is objective, so let's move on.
This argument has zero relevance to the technical thread and the hardware. You're arguing subjective aesthetics and art style. Panzer Dragoon Orta recreated on PS4 would look a lot better - more power enables you to do more.

More power lets you do more, but I was just pointing out that some of the techniques developers use nowadays only make games look worse than they used to. I'm not interested in hearing arguments for effects like temporal AA, excessive lens flare and chromatic aberration. Even on the technical side of things, an original game that was made with older hardware in mind may lose some of its positive aesthetic in the transition to newer hardware. But that's another long discussion.
It doesn't need quantifying. It's obvious when a game screenshot couldn't have come from a previous generation. Star Wars Battlefront and Quantum Break are what a generational advance typically means. Wii U versus Wii is clearly a generational advance. If Nintendo were to create a next-gen TV console running Mario at 1080p with high fidelity, proper AA and AF, and fabulous dynamic GI, they should be looking at a 4 TF machine coming out next year, and it'd clearly look a generation ahead of the Wii U and be unmistakable.

The NS is probably more like a half-step upgrade, a PS4 Pro. That makes sense in the context of its portability, but it goes against your assertion that the slow progress of technology means 3x Wii U is adequate for a TV console. It's not slow technological progress preventing Nintendo from releasing a 4 TF next-gen machine.

Funny you mention Battlefront and Quantum Blur, but they happen to use the standard resolution of Xbox 360 games. Nintendo's not the only one that skimped on that 1080p high-fidelity goodness, haha.

But yeah, on the GPU side of things the Switch appears to be a PS4 Pro type of upgrade, but we'll see. Although, again, the CPU is a massive leap and will probably be better than the current consoles on that front, and the PS4 Pro only has an extra half gig of memory for games.
 
they finally swallowed their pride and their only option to get things done was to reach out to Nvidia to do the custom OS with them.
They swallowed their pride long before that when they went with Nvidia in the first place....
 
I suspect Nintendo doesn't know how to handle increased CPU and GPU performance from a software perspective, which is why they won't bother with improved performance when docked.
 
Remember, their initial launch window was well before Holiday 2016. I suspect the delay is entirely related to software issues and not due to hardware issues. That should temper people's hardware expectations.

I could completely see Nintendo underestimating the amount of time it takes them to write a custom OS for the device. Once they hit that issue and were delayed by it, they finally swallowed their pride and their only option to get things done was to reach out to Nvidia to do the custom OS with them.
Or it could've been because they needed more time to produce more hardware, due to 16nm chip yields. Even if it's closer to the TX1 than the TX2, most likely the chips won't use the flawed 20nm process.
 
I suspect Nintendo doesn't know how to handle increased CPU and GPU performance from a software perspective, which is why they won't bother with improved performance when docked.
That is pretty trivial. CPU load shouldn't vary that much, so it's not really a problem. Graphics clocks could certainly be scaled, especially if the system is overspec'd for the mobile side (720p) and downclocked. The problem is bandwidth. You aren't going to double bandwidth just by docking, which means you have to provide enough for all the AAA titles to render at 1080p while docked, and that is just not realistic (IMHO). The most likely answers are either: 1) it can provide more performance while docked, but not enough for Nintendo to blanket-say 1080p when docked, and they don't want to say "sometimes yes, sometimes no"; or 2) the extra performance wasn't deemed enough to really matter in most cases, so they avoid confusion and always provide the same experience.
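Just to put rough numbers on that squeeze (the 25 GB/s portable bandwidth figure below is purely an assumed placeholder for illustration, not any leaked spec):

```python
# Rough pixel-count arithmetic for the 720p -> 1080p jump, assuming the same
# frame rate and per-pixel work. The bandwidth figure is an invented placeholder.

def pixels(width, height):
    return width * height

p720 = pixels(1280, 720)      # 921,600 pixels per frame
p1080 = pixels(1920, 1080)    # 2,073,600 pixels per frame
ratio = p1080 / p720          # 2.25x the pixels to shade and write out

portable_bw_gbs = 25.0                      # assumed budget balanced for 720p
naive_docked_bw = portable_bw_gbs * ratio   # ~56 GB/s to scale naively

print(f"1080p needs {ratio:.2f}x the pixels of 720p")
print(f"Naive docked bandwidth target: ~{naive_docked_bw:.0f} GB/s "
      f"vs {portable_bw_gbs:.0f} GB/s portable")
# Docking can raise the GPU clock ceiling but not widen the memory bus,
# so bandwidth per pixel shrinks -- the squeeze described above.
```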
 
I suspect Nintendo doesn't know how to handle increased CPU and GPU performance from a software perspective, which is why they won't bother with improved performance when docked.

Honestly, what is with this idea that Nintendo are technically unambitious? Their programmers are among the best in the industry and they always aim for polish. I'll grant, their Wii U games have been very safe but that wasn't the case with the Wii. They quickly realized they made mistakes with Wii U and just didn't know what to do with it.

Using extra GPU power is easy: just go from 720p to 1080p. Although there's no way just docking it would give that much of a boost; more likely the games would just be downsampled on the handheld. Plus, the DX11/12-era GPU will let them use more advanced effects, although yeah, the games won't have vastly more complex geometry and such.

On the CPU side, for their platformers, yeah, they didn't really need extra power, but when working on Star Fox, Miyamoto said a lack of CPU power was the biggest issue they had, so clearly they have things in mind.

And that old CPU in the Wii U is clearly affecting Zelda, if not their platformers.
 
If they want to scale down power consumption and performance in undocked mode, it may be more efficient to cut the power entirely to a fraction of the CUDA cores.

For example, due to leakage effects, turning off half of the CUDA cores while keeping the clock speed unchanged may result in the same performance but LESS power consumption than simply halving the clock speed. Transistors consume a certain amount of leakage power regardless of the clock speed, and generally, transistors designed for speed leak more. This is why you see some SoCs with different implementations of the same CPU core optimized for different clock speeds. Don't waste the leakage of fast transistors when you are just going to clock slow anyway.
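To illustrate that trade-off with a toy model (every constant here is made up purely for illustration, not taken from any real SoC):

```python
# Toy power model: dynamic power ~ C_eff * V^2 * f per active core, plus a fixed
# leakage cost per powered-on core regardless of clock. Constants are invented.

def gpu_power(active_cores, clock_ghz, vdd, c_eff=0.02, leak_w_per_core=0.01):
    dynamic = active_cores * c_eff * vdd**2 * clock_ghz   # switching power
    leakage = active_cores * leak_w_per_core              # paid just for being powered
    return dynamic + leakage

CORES, CLOCK, VDD = 256, 1.0, 0.9

full       = gpu_power(CORES, CLOCK, VDD)        # all cores, full clock
half_clock = gpu_power(CORES, CLOCK / 2, VDD)    # all cores, half clock
half_cores = gpu_power(CORES // 2, CLOCK, VDD)   # half the cores power-gated off

print(f"full speed:           {full:.2f} W")
print(f"all cores at f/2:     {half_clock:.2f} W  (leakage of 256 cores still paid)")
print(f"half cores at full f: {half_cores:.2f} W  (leakage halved too)")
# Both throttled options target roughly half the throughput, but gating cores
# also cuts the leakage term. In practice halving the clock usually allows a
# lower voltage as well, which this toy model deliberately ignores.
```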

 
Winchester.

Ba-Dum Tss

At my place it works great: PS4 connected through gigabit Ethernet to an Asus RT-AC68U, and it runs really well.
Of course, fast-paced games can't be done, but slower ones (e.g. the Heavy Rain remaster) are great to play streamed.

What exactly is Pascal if not "custom Maxwell", then? Architecturally, it's an evolutionary update at best. Just like the PS4 Pro may be a lot closer to Vega than Polaris (2×FP16 but no HBM), the "NX1" could be a lot closer to Pascal than Maxwell without getting the full Pascal feature set, e.g. the VR stuff.

Custom could mean many things from my understanding: it could mean a cut-down version of Maxwell, or one downclocked with some other benefits. But if you watch the Eurogamer video about the leaked specs, they say at the end of the video that all sources point to Maxwell, and the link I posted about the Ubisoft developer hints at Maxwell. I hope it's Pascal, but knowing Nintendo and how they don't really care about power, I'm thinking they got Maxwell dirt cheap.
 
Custom could mean many things from my understanding: it could mean a cut-down version of Maxwell, or one downclocked with some other benefits. But if you watch the Eurogamer video about the leaked specs, they say at the end of the video that all sources point to Maxwell, and the link I posted about the Ubisoft developer hints at Maxwell. I hope it's Pascal, but knowing Nintendo and how they don't really care about power, I'm thinking they got Maxwell dirt cheap.
Pascal doesn't give much of a performance boost over Maxwell from what I've read; it's mostly power consumption improvements. 256 cores, Maxwell or Pascal, get pretty much the same performance.

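A quick sanity check on the raw-throughput side (the clock figure is just a placeholder, not a claimed spec):

```python
# Peak FP32 throughput is cores * 2 (one FMA = 2 ops) * clock, so 256 Maxwell
# or Pascal cores at the same clock land on the same number.

def peak_gflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz / 1000.0

for arch in ("Maxwell-style", "Pascal-style"):
    print(f"{arch:13s}: 256 cores @ 1000 MHz -> {peak_gflops(256, 1000):.0f} GFLOPS")
# Pascal's real-world edge comes mostly from sustaining higher clocks in the
# same power envelope (newer process), not from doing more per core per clock.
```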
 
Pascal doesn't give much of a performance boost over Maxwell from what I've read; it's mostly power consumption improvements. 256 cores, Maxwell or Pascal, get pretty much the same performance.


Well, maybe you're right. People keep saying it's a big difference, but what's the price difference? I'm guessing Maxwell is much cheaper, and Nintendo would choose that option.
 
I'm guessing Maxwell is much cheaper
There is no incentive for NV to price their tech in this manner... If the chip is truly custom, the design fee would be the same regardless. The long-term cost variables Nintendo would be concerned with are chip size (manufacturing cost), the memory interface, and the ongoing costs of design and materials. There is no sane reason, AFAICS, that Nintendo would choose Maxwell over Pascal. Pascal will be more power efficient, which means more battery life or a smaller battery (reduced cost for the life of the system). Now, Nintendo has certainly made many poor decisions in the past, but even I don't believe they are that dumb. And even if they were, I would expect Nvidia to talk them out of it, because a poor product is going to reflect poorly upon them as well.
 
Don't forget that Nintendo is risk-averse. The risk of bugs or other issues with an older silicon-proven microarchitecture may be lower than with something newer.

 
There is no incentive for NV to price their tech in this manner... If the chip is truly custom, the design fee would be the same regardless. The long-term cost variables Nintendo would be concerned with are chip size (manufacturing cost), the memory interface, and the ongoing costs of design and materials. There is no sane reason, AFAICS, that Nintendo would choose Maxwell over Pascal. Pascal will be more power efficient, which means more battery life or a smaller battery (reduced cost for the life of the system). Now, Nintendo has certainly made many poor decisions in the past, but even I don't believe they are that dumb. And even if they were, I would expect Nvidia to talk them out of it, because a poor product is going to reflect poorly upon them as well.
There is no Pascal on 28nm or 20nm, which would be a lot cheaper than 16nm. That means Maxwell is cheaper than Pascal if Nintendo wants to use cheap tech.
 
There is no incentive for NV to price their tech in this manner... If the chip is truly custom, the design fee would be the same regardless. The long-term cost variables Nintendo would be concerned with are chip size (manufacturing cost), the memory interface, and the ongoing costs of design and materials. There is no sane reason, AFAICS, that Nintendo would choose Maxwell over Pascal. Pascal will be more power efficient, which means more battery life or a smaller battery (reduced cost for the life of the system). Now, Nintendo has certainly made many poor decisions in the past, but even I don't believe they are that dumb. And even if they were, I would expect Nvidia to talk them out of it, because a poor product is going to reflect poorly upon them as well.

Please look at the Wii and Wii U; many of the choices were not sane. I expect them to use 20nm Maxwell, and if sales are good, or great, they will switch to Pascal in a future revision. Let's be realistic: Nintendo probably made this deal with Nvidia more than a year ago, and the most likely specs are Maxwell because of devkit leaks and other sources. I'm not stating this as fact, but I would put my money on it.
 
Please look at the Wii and Wii U; many of the choices were not sane. I expect them to use 20nm Maxwell, and if sales are good, or great, they will switch to Pascal in a future revision. Let's be realistic: Nintendo probably made this deal with Nvidia more than a year ago, and the most likely specs are Maxwell because of devkit leaks and other sources. I'm not stating this as fact, but I would put my money on it.

Just wanted to add this:
In the eurogamer.com tech analysis, he says all sources point to Maxwell, meaning developers are all telling him these are likely the final specs, and they haven't heard anything about Pascal.
 
Just wanted to add this:
In the eurogamer.com tech analysis, he says all sources point to Maxwell, meaning developers are all telling him these are likely the final specs, and they haven't heard anything about Pascal.
This is an example of people seeing and hearing different things from the same presentation.
The video was made before the Switch reveal; Richard states clearly that the reports he has received are about the devkit, and he speculates that the final product may use an updated SoC.
There is NO point in the video where he says what you claim about "final specs". In fact, he speculates the direct opposite!
 