Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
Comparisons between phones/tablets and games consoles are potentially hugely invalid and terribly myopic.

Consoles have to be able to run the equivalent of a power virus on both the CPU and GPU simultaneously without throttling or overheating (if you don't believe me look at the Hot Chips presentation on the 360S). And a toy maker can't have a handheld become red hot or even "really really warm".

NX isn't tuned to run a damn Android benchmark once with minimal throttling; it's built to take years of abuse from hour after hour of platform-specific, hand-tuned code without ever deviating from its performance profile.

And so you think the "toy" must scale back to half the CPU clocks and 1/3rd the GPU clocks to meet "toy" demands, compared to tablet/consumer version of what is rumored to be the same chip?

If this was remotely true, the PS4 Pro should have a 500MHz GPU.
 
There is a difference. You don't hold the PS4 in your hands, and its core demographic isn't kids holding the device in their hands. Application-wise, tablet apps rarely push the CPU/GPU to 100% usage for extended periods. It would be a large design blunder if the Switch hurt kids' hands from the heat, like tablets can if you run benchmarks that push all cores to their limits.
 
I don't get this. At the start of the gen we had games running on PS4/Xbox One and Xbox 360/PS3. The Switch in portable mode is, I believe, a quad-core CPU with a 320 gflop GPU and 4 gigs of RAM. That goes against a tri-core CPU with a 240 gflop GPU and 512 megs of RAM. The GPU in the Switch is also a modern GPU versus the one designed in 2004 for the Xbox 360.

I think ports will work just fine they will just be lower resolution with lower res textures. Like I've said in the past I believe the system will make games that look great on that 6 inch screen and will be fine in docked mode. I just wish it had a bit more power in portable mode
For comparative purposes, it's 160 gflops fp32.

I'm more worried about multi-platform game development for the next 6 years, which should be the minimum acceptable lifespan of a non-FC/non-BC new console. Since mid-gen refreshes mean all games must be made for two targets per platform, a multi-plat wanting to be on Switch, PS4/Pro, and XB1/Scorpio has this range of hardware to pass testing...

0.16TF switch mobile
0.384TF switch docked
1.3TF xb1 fat
1.4TF xb1 slim
1.84TF ps4 fat/slim
4.2TF ps4 pro
6.0TF scorpio

Making a game engine that can scale 0.16TF to 6TF without either extreme suffering compromises is going to be development hell.
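As a sanity check, the 0.16TF/0.384TF Switch figures above can be recomputed from the rumored configuration. A quick sketch, assuming the leaked (unconfirmed) 256 CUDA cores and 307.2/768 MHz GPU clocks:

```python
# Rough FP32 throughput check for the range above.
# Assumed (leaked, not confirmed): 256 CUDA cores, 307.2 MHz portable
# and 768 MHz docked GPU clocks.
def gflops_fp32(cuda_cores, clock_mhz):
    # Peak rate: one fused multiply-add (2 FLOPs) per core per cycle.
    return cuda_cores * 2 * clock_mhz / 1000.0

portable = gflops_fp32(256, 307.2)  # ~157 GFLOPS, the "0.16TF" above
docked = gflops_fp32(256, 768.0)    # ~393 GFLOPS, rounded to "0.384TF" above
span = 6000.0 / portable            # Scorpio-to-portable gap, roughly 38x
```

That roughly 38x spread between the weakest and strongest targets is what makes the scaling problem so unusual.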
 
Last edited:
And so you think the "toy" must scale back to half the CPU clocks and 1/3rd the GPU clocks to meet "toy" demands, compared to tablet/consumer version of what is rumored to be the same chip?

If this was remotely true, the PS4 Pro should have a 500MHz GPU.

That's complete crap, and you're in a state of wilful denial.

PS4 Pro runs massively below peak RX480 clocks - and RX480 throttles under games, never mind a power virus. PS4 Pro can be trusted to maintain its performance profile even under hand tuned software - while RX480 throttles even under generic DX11 stuff.

There's a huge difference between a hardware configuration that runs at its highest possible performance profile when it can (phones and graphics cards), and one that runs at a baseline it can always maintain (consoles) however badly it's tortured.

And phones and tablets get fucking hot even when running their shitty and/or poorly optimised software. Somewhere after the point where they get worryingly hot, they start to throttle. NX can neither get worryingly hot nor throttle, and it needs to be sure of this by some margin - so sure of it that it can laugh it off hour after hour after hour.

Or maybe you think there's no engineering reason for this, and Nintendo just dropped clocks so they can rub one out over fan disappointment?
 
I hear you, trust me - if anyone here would love for this to be wrong, it's me - but Eurogamer gained my trust with their dead-on leaks back in July.

The problem here is that the Eurogamer article mixed up two completely different stories, and these stories don't even align with each other. One is the Twitter rumor, which is basically just a dump of TX1 specs with apparently ignorant mistakes like claiming the presence of 14.4 ROPs. The other is the report on the clocks they got today.

The only thing they have going for the Twitter leak is some developer saying it's "uncannily similar" to the specs they heard themselves.
But the twitter leak said 2GHz CPU and 1GHz GPU. And now Eurogamer's own sources are claiming the clocks are actually halved in the CPU and 25% to 70% lower on the GPU.
So either the twitter specs are not as "uncannily similar" as Eurogamer's earlier source suggested, or the recent clock numbers are wrong.
You can't have both.

Simply picking the lowest common denominator of both news pieces and then assuming those are the final specs is, IMO, just as much wishful thinking and naivety as fanboys thinking there will be magical fairy dust that triples performance if you fondle the console while saying Reggie Fils-Aime 5 times in a row.
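The clock mismatch above is easy to work out in percentages. A sketch, assuming the Twitter rumor's 2 GHz CPU / 1 GHz GPU and a 1020 MHz CPU figure from the newer report (the CPU number is not stated in this thread, only that it was "halved"):

```python
# Comparing the Twitter-leak clocks against the newer clock report.
# All values in MHz; the 1020 CPU figure is an assumption.
twitter_cpu, twitter_gpu = 2000.0, 1000.0
new_cpu = 1020.0
new_gpu_docked, new_gpu_portable = 768.0, 307.2

cpu_cut = 1 - new_cpu / twitter_cpu                 # ~49%: "halved"
gpu_cut_docked = 1 - new_gpu_docked / twitter_gpu   # ~23% lower
gpu_cut_portable = 1 - new_gpu_portable / twitter_gpu  # ~69% lower
```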


There is a difference. You don't hold the PS4 in your hands, and its core demographic isn't kids holding the device in their hands. Application-wise, tablet apps rarely push the CPU/GPU to 100% usage for extended periods. It would be a large design blunder if the Switch hurt kids' hands from the heat, like tablets can if you run benchmarks that push all cores to their limits.

So kids can't use tablets? Or tablets being used by kids can't run games for long periods of time? And who would control that, and how?
There aren't any App Store / Play Store guidelines stopping developers from making full use of the GPU + CPU in any app. There are demanding 3D games for tablets, and kids play those when/if they want. Kids' hands aren't burning everywhere due to playing e.g. Real Racing 3 on a tablet, and tablet GPUs don't have to drop to 1/3rd their nominal clock rate to stop kids' hands from burning.

I don't mean to say there aren't additional steps needed if you want to target that demographic. Sure, there are. It's just that the gap here is so big that it's not believable.
 
The problem here is that Eurogamer article mixed up two completely different stories, and these stories don't even align with each other. One is the Twitter rumor which is basically just a dump of TX1 specs with apparently ignorant mistakes like suggesting the presence of 14.4 ROPs. The other is the report on the clocks they got today.


Maybe, just maybe, there is a 10% OS reservation of hardware, just like on Xbox One and PS4 at launch, and the numbers specified are what game developers have access to. Nah, that doesn't fit your narrative, so that can't be it.
 
If the clocks are believed, and it's based on the X1 (regardless of it being 20nm or 16nm), what does it change whether the 14.4 is a mistake or not?
 
Well, as expected, disappointment. The undocked clockspeed cut seems pretty drastic though - 1080p -> 720p reduces pixel count by a lot, but I would have expected the geometry load to be much closer to a constant. Or is the idea that you can also get away with significantly less complex models due to the fairly small screen?
 

The GPU bottleneck should most commonly be fragment shading. Certainly, if you're planning around this, it should be. LOD should mean that at 720p you cull polys too small to be seen at a closer distance, so geometry load should be reduced too.
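For reference, the fragment-load reduction from the resolution drop is easy to quantify:

```python
# Fragment count ratio between the docked and portable render targets.
docked_pixels = 1920 * 1080    # 2,073,600 fragments per full-screen pass
portable_pixels = 1280 * 720   #   921,600
ratio = docked_pixels / portable_pixels  # 2.25x fewer fragments at 720p
```

So pixel work drops 2.25x while the leaked clock drop is 2.5x (768 to 307.2 MHz), which is why the portable cut feels slightly deeper than resolution alone would justify.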
 
Maybe, just maybe, there is a 10% OS reservation of hardware, just like on Xbox One and PS4 at launch, and the numbers specified are what game developers have access to. Nah, that doesn't fit your narrative, so that can't be it.

I'm not smart enough to know if this is possible, but is that specific metric possibly brought on by the memory clock speed being reduced in portable mode? I know the memory doesn't actually limit the theoretical ROP maximum, but is it possible that the math suggests full ROP usage is impossible with LPDDR3 1300?

Eurogamer sat on the specs subject, and with only a few weeks' lead-up to the full reveal, they could simply have stayed silent if they weren't pretty confident. Why tarnish their credibility if in a few short weeks they were to be proven wrong? Not saying it's impossible, just less likely than a Nintendo product having lower specs than originally thought.
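One way to check that hunch: compare the write bandwidth full ROP output would demand against what the bus could supply. A rough sketch, where the bus width, memory clock, ROP count, and bytes-per-pixel are all assumptions (64-bit LPDDR bus at the rumored 1331 MHz portable memory clock, 16 ROPs, 32bpp):

```python
# Does memory bandwidth alone cap ROP throughput in portable mode?
# Assumed: 64-bit LPDDR bus (double data rate), 1331 MHz memory clock,
# 16 ROPs at a 307.2 MHz GPU clock, 4 bytes per written pixel.
bus_bytes = 8                                    # 64-bit bus width
bandwidth_gb_s = 1331e6 * 2 * bus_bytes / 1e9    # ~21.3 GB/s available
rop_writes_gb_s = 16 * 307.2e6 * 4 / 1e9         # ~19.7 GB/s of writes alone
# Writes would nearly saturate the bus before any texture or vertex
# reads, so sustained full-rate ROP output looks implausible under
# these assumptions, whatever the "14.4" figure means.
```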
 
Maybe, just maybe there is OS Reservation of hardware of 10% just like it was on Xbox One and PS4 on launch, and the numbers specified were what the Game

Xbone had a 10% reservation of the whole GPU. 14.4 pixels/clock would mean reserving 10% of fillrate specifically, which doesn't really make sense. Nothing is made entirely of fillrate; even the simplest 2D UI elements would need some amount of shader effort.


Nah, that doesn't fit your narrative so that can't be it.
Except I'm not the one who built a narrative to start with.
I'm not the one who made up a list of specs consisting of the lowest common denominator from two different news pieces that aren't even compatible with each other and then proceeded to move or delete posts that questioned the validity of said spec list -> despite the news source itself claiming these pieces weren't compatible and there's information still missing.
All I said - and keep saying - is that the specs between the Twitter leak and the clocks leak don't match. That's a fact. The GPU clock speeds also don't match the fan requirement and previous anecdotal experience with other TX1 devices. 14.4 pixels/clock doesn't make sense. HDMI 1.4 instead of 2.0 doesn't make much sense either, unless HDCP 2.2 is significantly more expensive than 1.x; otherwise they're just losing the chance to make a good future-proof Netflix streaming device (Netflix was available on the Wii and Wii U, BTW).

There's no narrative, just the blatantly obvious pointing out that there's missing and misaligned info. I wouldn't be pointing this out if the twitter leak made sense. Or if its clock speeds even remotely matched the newly leaked clock speeds. Or if the console wasn't so thick and had active cooling.
 
Nah, that doesn't fit your narrative so that can't be it.
I mean... it just does not make any sense. You cannot reserve 0.6 of a ROP for a single cycle, and even if you could, what would be the point without reserving any texturing, memory, or bandwidth to go along with it? The only way 14.4 makes any sense is as gigapixels per second, which would place the clock of the device at presumably 900MHz when docked (which conflicts with the provided clocks). But given that the 14.4 figure is nonsense, I don't have much faith in any of the other numbers...


I also really don't see the need to push unverified information upon the community, when real data will be available in the near future....
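To spell out the gigapixels reading: assuming TX1's 16 ROPs (an assumption, since the chip isn't confirmed), the implied clock falls straight out of the arithmetic:

```python
# If "14.4" is a fillrate in Gpixels/s rather than a ROP count,
# back out the GPU clock it implies. 16 ROPs is assumed from TX1.
rops = 16
fillrate_gpix_s = 14.4
implied_clock_mhz = fillrate_gpix_s / rops * 1000  # ~900 MHz
```

900 MHz matches neither of the leaked clocks (307.2/768 MHz), which is exactly the conflict described above.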
 
Last edited:

Nintendo will not reveal the specs anytime in the near future; this is the best we're probably gonna get. DF says they're pretty sure on the specs - they say there will not be any changes in pure compute capability or CPU. Pretty sure we will find out about the 14.4 fillrate later, just like how we found out about the Wii U GPU being 176 gflops, which made zero sense to everybody. Anyway, there are just too many sources confirming these specs not to believe them.
 
As a pure mobile device that will be gaming at sustained loads all the time, it really isn't a bad hardware specification. Things were done to achieve more than 5 minutes of mobile gaming at consistent performance. Despite the compromises made for power management reasons, it's still going to beat the living snot out of the Nintendo DS.

Things only look bad when you expect it to compete with hardware that has 15x the power draw, corded to wall outlets.
 
For comparative purposes, it's 160 gflops fp32.

I'm more worried about multi-platform game development for the next 6 years, which should be the minimum acceptable lifespan of a non-FC/non-BC new console. Since mid-gen refreshes mean all games must be made for two targets per platform, a multi-plat wanting to be on Switch, PS4/Pro, and XB1/Scorpio has this range of hardware to pass testing...

0.16TF switch mobile
0.384TF switch docked
1.3TF xb1 fat
1.4TF xb1 slim
1.84TF ps4 fat/slim
4.2TF ps4 pro
6.0TF scorpio

Making a game engine that can scale 0.16TF to 6TF without either extreme suffering compromises is going to be development hell.

I wouldn't go with 6 years. If it's on the cheaper side, they can replace it in 3 years with a new Tegra chip. Also, if needed, they can refresh the design and just run the older Switches at a higher clock speed with worse battery life. It's Nintendo, and they are all about dickish things.
 
Comparisons between phones/tablets and games consoles are potentially hugely invalid and terribly myopic.

Consoles have to be able to run the equivalent of a power virus on both the CPU and GPU simultaneously without throttling or overheating (if you don't believe me look at the Hot Chips presentation on the 360S). And a toy maker can't have a handheld become red hot or even "really really warm".

NX isn't tuned to run a damn Android benchmark once with minimal throttling; it's built to take years of abuse from hour after hour of platform-specific, hand-tuned code without ever deviating from its performance profile.

Agreed. Even though the PS4 runs at constant clocks, the heat it generates varies widely.

Uncharted 4, The Last of Us, FFXV -> the PS4 sounds like a jet taking off; with most other games it's usually quiet enough.

NX will need to accommodate that kind of sustained heavy load without throttling or making your hands uncomfortable.
 
Omg, those clockspeeds... A few days ago we were talking about "hey, at least the CPU could be better than PS4/One." With that downclock, it's not even the case anymore.

IMO, it looks like they were shooting for a portable device with Wii U power. That's it. And the dock/TV/overclock thing was a last-minute addition? I wonder if in some situations the actual 3-core Wii U CPU isn't a little more powerful, because it was pretty efficient, if I remember some dev comments. Hell, even the GPU, with a big chunk of fast eDRAM... I wonder how much this "Tegra" is customized. If it's only a downclock, then "lol"... I wonder if 720p in portable mode is even a good thing...

I wonder what the devs at Nintendo are thinking, seeing their friends at other companies work with a lot more power for years now. Big open games, full of life, crazy artwork, voice acting, etc. But them: "eh, our system sucks, so make a cartoon Zelda one more time." I know, I know, it's not that simple, I'm just frustrated :/
 
It could never be better: PS4/One have 7 cores available for games, Switch 3-3.9.
The Wii U CPU is slow and lacks good SIMD - basically an overclocked GameCube CPU x3. Devs were somewhat happy with it only because it's out-of-order.
 