Nintendo Switch Tech Speculation discussion

I doubt anyone's actually following my argument. ;)

Let's say BoxBoy doesn't need the higher clocks (it doesn't). What's in it for the dev to use the lower clocks? It improves the user's experience, increasing their battery life a bit. But no-one's going to know you've done that and that their battery lasts a bit longer playing BoxBoy than Zelda.

If you play games on the go a lot you will notice which games are hammering your battery and which aren't. That may actually impact the user's decision on which game to play when away from the wall. It certainly does for me with 3DS and Vita.
 
I doubt anyone's actually following my argument. ;)

Let's say BoxBoy doesn't need the higher clocks (it doesn't). What's in it for the dev to use the lower clocks? It improves the user's experience, increasing their battery life a bit. But no-one's going to know you've done that and that their battery lasts a bit longer playing BoxBoy than Zelda. Sales of your games aren't going to increase because you've chosen the lower clocks. And given the lower clocks will never be of benefit to the developers where higher clocks often will, why even have them as an option? Just set the base clock at 384 MHz and be done with it!

Why would a developer choose to penalize their customer, or potential customers for no reason? If your game runs just fine at the standard profile, and you choose to use the higher one lowering battery life, that is bad business. I also wouldn't be surprised to see Nintendo enforce this at the point of quality testing. If they test your game and it runs fine at the base profile, they will likely question the developer's poor choice to use the performance profile. Nintendo will police this performance profile, no question in my mind. I also wouldn't be surprised to see expected battery life for games listed on the eShop, right there with the file size. So yes, it could be a marketing feature if your game lasts 5 hours in portable mode.
 
Zelda BoTW performance is getting blown out of proportion. Did people suddenly forget that DF had 40 minutes of footage to dissect last month and only found a couple of performance dips? I believe John from DF even mentioned on GAF that they couldn't replicate the performance drops by simply going to the same area and doing the same thing. These drops are the exception, not the rule. Are we to believe Edge just gave Zelda BoTW a 10/10 if it had serious performance issues? I believe @function is right, Zelda BoTW is probably taxing memory bandwidth pretty hard at 900p, and occasionally this causes some stutter. I also believe that some of it is a streaming issue that is probably very hard to replicate, and thus will happen from time to time. It has been a thing in the Batman games for years.

You raise a good point about streaming.

If hiccups are associated in some way with streaming, they may not always be replicable. Zelda is using WiiU assets on the Switch, and it can run in 1GB of RAM, so with 3+GB on Switch there's probably a huge amount of cached data (that's already been loaded and decompressed) just sitting there. No point wasting power loading and decompressing data if you don't need to!

Under these circumstances hiccups may only occur occasionally instead of every time you hit an area. And depending on what's happening when data *is* actually loaded and decompressed, even then it may not happen every time.

BW could still be a factor in loading / streaming hiccups btw (possibly a copy in, read to CPU, write out uncompressed in a more memory hungry format). Perhaps we'll find out once the analyses of docked vs un-docked are in!
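Purely to make the caching idea concrete, here's a toy sketch in Python (nothing to do with the actual Zelda/Switch code, all names made up): anything already loaded and decompressed stays resident while spare RAM lasts, so a revisit is a cheap cache hit instead of a storage read plus decompress - which is exactly why a hiccup wouldn't always reproduce.

```python
import zlib

class DecompressedAssetCache:
    """Toy cache: keep already-decompressed assets in spare RAM to avoid repeat stalls."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes   # how much spare RAM we allow ourselves
        self.used = 0
        self.assets = {}             # asset_id -> decompressed bytes

    def get(self, asset_id, read_compressed):
        if asset_id in self.assets:
            return self.assets[asset_id]                   # warm hit: no I/O, no decompress, no stall
        raw = zlib.decompress(read_compressed(asset_id))   # cold miss: storage read + CPU + extra BW traffic
        if self.used + len(raw) <= self.budget:
            self.assets[asset_id] = raw                    # keep it around while the budget lasts
            self.used += len(raw)
        return raw
```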
 
Max memory clock in handheld mode has dropped, likely contributing to the headroom for an increase in GPU clocks.

Seems like a reasonable tradeoff - it's at max docked clocks that the highest level of BW will be most needed.

Hopefully this can put to bed the arguments about Foxconn leak clocks and a secret last minute switch to A73 cores.

I'll never understand how it got that far; one or two delusional fanboys convinced a bunch of people/Nintendo fans to believe a random test over dev docs from Nintendo and Eurogamer, lol.
 
If you play games on the go a lot you will notice which games are hammering your battery and which aren't. That may actually impact the user's decision on which game to play when away from the wall. It certainly does for me with 3DS and Vita.
But that'll be far more a factor of game style than clock speed. Honest question - if a game runs at 75% utilisation at 307.2 MHz and 60% utilisation at 384 MHz, how many hours difference will that really be? The choice for long journeys will be simple games regardless of clock-speed choice.

Why would a developer choose to penalize their customer, or potential customers for no reason? If your game runs just fine at the standard profile, and you choose to use the higher one lowering battery life, that is bad business.
Why's it bad business? It's not going to affect sales. People aren't going to avoid buying your game because it runs the battery out faster. If that was a purchasing decision, no dev would choose to use the higher clocks, would they? "This is the best looking game on mobile and it plays like a dream. However it only plays for 4 hours instead of the typical 5 you'd expect from a full charge, so we recommend you avoid this game." :p
I also wouldn't be surprised to see expected battery life for games listed on the eShop, right there with the file size.
Is it with 3DS?
So yes, it could be a marketing feature if your game lasts 5 hours in portable mode.
You could probably throw in a few quality settings like post effects to reduce render workload and increase battery life. Does anyone do that with portable games? When PSP introduced a whopping 50% increase in clock to 333 MHz, did people ask what clock speed a game ran at in order to make an informed purchasing decision? I don't recall it ever being listed on a box. If Nintendo had upped the base clock without offering the lower clock, would everyone be up in arms about the reduced battery life? Why isn't there a request for a 200 MHz mode for simple games as well?
 
You raise a good point about streaming.

If hiccups are associated in some way with streaming, they may not always be replicable. Zelda is using WiiU assets on the Switch, and it can run in 1GB of RAM, so with 3+GB on Switch there's probably a huge amount of cached data (that's already been loaded and decompressed) just sitting there. No point wasting power loading and decompressing data if you don't need to!

Under these circumstances hiccups may only occur occasionally instead of every time you hit an area. And depending on what's happening when data *is* actually loaded and decompressed, even then it may not happen every time.

BW could still be a factor in loading / streaming hiccups btw (possibly a copy in, read to CPU, write out uncompressed in a more memory hungry format). Perhaps we'll find out once the analyses of docked vs un-docked are in!

Yeah streaming data could be the culprit.

Hopefully Nintendo can smooth it out further. On my Xiaomi Redmi Note 4 (MediaTek), which has utterly shitty flash performance, I can reduce asset streaming stutter by a lot (but not eliminate it) by changing the io scheduler. Dunno if something like this is viable for Switch, or whether it can help Zelda.
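For anyone curious, on a rooted Android/Linux device "changing the io scheduler" just means writing to sysfs. A rough sketch (the block device node and scheduler names are examples only - check what your kernel actually exposes):

```python
# Needs root; paths and scheduler names vary per device/kernel, these are just examples.
SCHED_NODE = "/sys/block/mmcblk0/queue/scheduler"

with open(SCHED_NODE) as f:
    print("schedulers (current in brackets):", f.read().strip())   # e.g. "noop deadline [cfq]"

with open(SCHED_NODE, "w") as f:
    f.write("deadline")   # pick a different scheduler for the internal storage
```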
 
But that'll be far more a factor of game style than clock speed. Honest question - if a game runs at 75% utilisation at 307.2 MHz and 60% utilisation at 384 MHz, how many hours difference will that really be? The choice for long journeys will be simple games regardless of clock-speed choice.

Why's it bad business? It's not going to affect sales. People aren't going to avoid buying your game because it runs the battery out faster. If that was a purchasing decision, no dev would choose to use the higher clocks, would they? "This is the best looking game on mobile and it plays like a dream. However it only plays for 4 hours instead of the typical 5 you'd expect from a full charge, so we recommend you avoid this game." :p
Is it with 3DS?
You could probably throw in a few quality settings like post effects to reduce render workload and increase battery life. Does anyone do that with portable games? When PSP introduced a whopping 50% increase in clock to 333 MHz, did people ask what clock speed a game ran at in order to make an informed purchasing decision? I don't recall it ever being listed on a box. If Nintendo had upped the base clock without offering the lower clock, would everyone be up in arms about the reduced battery life? Why isn't there a request for a 200 MHz mode for simple games as well?
You make a lot of valid points, but I stand by my assessment that Nintendo will likely police this in some way. Utilization is independent of clock speed, is it not? So no matter what, a game utilizing 60-70% of the hardware is going to use more power at higher clocks. I also stand by the idea that the standard performance profiles work pretty well to take your 720p portable game and render it at 1080p docked. The performance portable profile makes this nearly impossible. If you're maxing out the portable performance, there's no way you can get to 1080p docked without sacrifices, and making sacrifices for the big screen is a no-no when the portable screen is more accepting of sacrifices.
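For what it's worth, the arithmetic behind that (307.2/384MHz portable and 768MHz docked are the clocks from the DF reporting; treating performance as roughly proportional to GPU clock is my simplification):

```python
pixel_ratio = (1920 * 1080) / (1280 * 720)   # 1080p is 2.25x the pixels of 720p

docked_mhz = 768.0
print(docked_mhz / 307.2)   # 2.5x the standard portable clock -> covers 2.25x with a little slack
print(docked_mhz / 384.0)   # 2.0x the boosted portable clock -> short of 2.25x, so something has to give
print(pixel_ratio)
```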

 
We probably won't know exact speeds until the end of the year, if the Wii U is anything to go by. Which kills me o_O

I wholly expect Eurogamer to be right on speeds, and that it'll have one A57 core disabled and half a gig of memory for the OS. Anything extra is gravy
 
This is on PC (Kaveri). I normally run my PC in balanced mode (which I set to limit max performance to around 3GHz) and it runs much cooler than when I set it to high performance. There's no difference in terms of performance when much of the CPU utilization is used for torrenting. I turn on high performance only for gaming. Other uses (torrents, online video streaming, basically anything other than gaming) use balanced mode. In balanced mode the CPU fan never spins up to its max speed, while on high performance, even if I'm not gaming, the CPU fan sometimes goes to its max speed.
 
You raise a good point about streaming.

If hiccups are associated in some way with streaming, they may not always be replicable. Zelda is using WiiU assets on the Switch, and it can run in 1GB of RAM, so with 3+GB on Switch there's probably a huge amount of cached data (that's already been loaded and decompressed) just sitting there. No point wasting power loading and decompressing data if you don't need to!
It could also be the media. Cartridge vs. SD card. I'm getting Zelda on cartridge but if DF conclude a fast SD card is better I'll buy future titles like this digitally.
 
Honest question - if a game runs at 75% utilisation at 307.2 MHz and 60% utilisation at 384 MHz, how many hours difference will that really be?
To answer that you'd need to give me the power draw of the screen and the processor. Nintendo's official line is that battery life is between 2.5 and 6 hours, which is pretty wide.
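Just to show the shape of the calculation (every number below is assumed, not measured, which is exactly the problem):

```python
battery_wh = 16.0            # the Switch pack is roughly 16Wh
screen_and_rest_w = 1.5      # display, RAM, radios... pure guess

def soc_watts(clock_mhz, utilisation, w_per_ghz_at_full_load=8.0):
    # crude model: active power ~ clock * utilisation, ignoring voltage and leakage
    return (clock_mhz / 1000.0) * utilisation * w_per_ghz_at_full_load

for clock, util in [(307.2, 0.75), (384.0, 0.60)]:
    hours = battery_wh / (screen_and_rest_w + soc_watts(clock, util))
    print(clock, round(hours, 2))

# Both cases come out at ~4.8h because 307.2 * 0.75 == 384 * 0.6; any real
# difference comes from the voltage bump the higher clock needs, which is why
# you can't answer the question without actual power draws.
```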

You're going to notice that if you play on the go a lot.
 
Despite saying the clocks were final last time, Eurogamer changes their clock values yet again.

http://www.eurogamer.net/articles/d...-boosts-handheld-switch-clocks-by-25-per-cent

Now memory is always 1333MHz in portable mode, and the GPU can do 384MHz after all.
The other specs were final but these are super final.

But don't worry folks, eurogamer is 100% trustworthy; they've known the exact final specs for half a year, no, 2 months, no, 24 hours, because their sources are rock solid.
:rolleyes:
 
But don't worry folks, eurogamer is 100% trustworthy; they've known the exact final specs for half a year, no, 2 months, no, 24 hours, because their sources are rock solid.
:rolleyes:
Interesting spin. DF have been and are correct, because they're using the documents Nintendo is providing developers. Some specs are subject to change, clocks being among the last to be locked down, such that when Xbox One got its upclocks (and other platforms have had downclocks) it came as no surprise. Because that's just a setting. So yeah, of course clocks can be tweaked. But the hardware doesn't change too much near the end (save maybe RAM - 8 GBs GDDR5!!! - and only because that's an easy change, switching in different chips), so we know Switch is X1, 256 CUDA cores, as DF reports.

If you want to go on believing there can be completely different hardware, X2, 512 cores, 800 GFs, Burst Processing, whatever, that's your prerogative, but there's no argument here against DF's credibility when the only thing 'wrong' about their reports is a usual clock tweak.
 
Interesting spin.
It's just a spin that tries to break away from a circlejerk. The current circlejerk is "Eurogamer can do no wrong".

Eurogamer was wrong about 2GHz CPU 1GHz GPU, but "eurogamer can do no wrong" so those were obviously just the limits on the silicon and the completely different clocks had an excuse.
Then eurogamer was wrong about 1600MHz memory and max. 300MHz GPU in handheld mode, but "eurogamer can do no wrong" so those were obviously "just clock tweaks".

Tomorrow eurogamer comes up with something else, but eurogamer can do no wrong so it's just something perfectly excusable again.
All these developers must be having a blast, with ever-changing memory bandwidth, GPU and CPU clocks that take enormous jabs at whatever low-level optimizations they're trying to achieve, up to a week from launch.

Any other outlet would have been scrutinized for these kinds of maneuvers, but eurogamer keeps changing their goalposts and somehow for you people that makes them more credible and not less.
I love Digital Foundry just as much as the next B3D'er, but this doesn't make them perfect.



If you want to go on believing there can be completely different hardware, X2, 512 cores, 800 GFs, Burst Processing, whatever, that's your prerogative, but there's no argument here against DF's credibility when the only thing 'wrong' about their reports is a usual clock tweak.

I already stated more than once that they're most probably right, but I just don't put 100% of my trust into anything. CPU clocks going from 2GHz in July to 1GHz in December is anything but a usual clock tweak.
To be honest, this wanting to believe is starting to look like a bit of an accusation.
This thread already has a resident troll eurogamer fan who only posts in this thread and half of his/her posts are about insulting anyone who dares question the Almighty Eurogamer Truth (together with typically trollish lack of punctuation and upper case efforts). That's one too many IMO.


Some specs are subject to change, clocks being among the last to be locked down, such that when Xbox One got its upclocks (and other platforms have had downclocks) it came as no surprise. Because that's just a setting.

Cutting memory bandwidth in portable mode from 25GB/s to 20GB/s (20% less) in shared GPU/CPU resources at the last minute is just a setting?
Please do explain how this wouldn't be a clusterfuck for developers trying to squeeze as much as possible from both operating modes.
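For reference, this is where those bandwidth figures come from, assuming the 64-bit LPDDR4 bus of a stock Tegra X1 (the bus width is my assumption; the clocks are the reported ones):

```python
def lpddr4_bandwidth_gb_s(clock_mhz, bus_bits=64):
    # double data rate: two transfers per clock; /8 for bits -> bytes; /1000 for GB/s
    return clock_mhz * 2 * (bus_bits / 8) / 1000.0

print(lpddr4_bandwidth_gb_s(1600))   # ~25.6 GB/s at the old portable maximum
print(lpddr4_bandwidth_gb_s(1333))   # ~21.3 GB/s at the now-fixed portable clock
```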
 
It's just a spin that tries to break away from a circlejerk. The current circlejerk is "Eurogamer can do no wrong".
It's not a circlejerk. It's a discussion about Switch's hardware that is looking for reliable sources and, through logic and consensus, has for the most part agreed upon DF (who clearly delineate their speculation from their sources).

Eurogamer was wrong about 2GHz CPU 1GHz GPU, but "eurogamer can do no wrong" so those were obviously just the limits on the silicon and the completely different clocks had an excuse.
Then eurogamer was wrong about 1600MHz memory and max. 300MHz GPU in handheld mode, but "eurogamer can do no wrong" so those were obviously "just clock tweaks".
So you haven't understood at all how clocks are liable to change...

Tomorrow eurogamer comes up with something else, but eurogamer can do no wrong so it's just something perfectly excusable again.
If anything changes (they weren't wrong - the information changed because Nintendo changed the specs) it'll be clock tweaks. That's the only thing that can change at this point.

All these developers must be having a blast, with ever-changing memory bandwidth, GPU and CPU clocks that take enormous jabs at whatever low-level optimizations they're trying to achieve, up to a week from launch.
1) Upclocks aren't a problem. 2) These changes would have happened more than a week from launch - we're just hearing about them now. 3) yeah, early launch devs can have a shit time with a lack of clarity. If you followed the development of any console you'd know this. Devs were creating PS3 games long before they knew what the actual specs were going to be. The 8 GBs RAM of PS4 was a spectacular last minute change (and a positive one, but usually they aren't so).

Any other outlet would have been scrutinized for these kinds of maneuvers, but eurogamer keeps changing their goalposts
They report on the current info and typically say, "subject to change, not final, we believe, as we understand it." Importantly they separate speculation and rumour from their sources, so we know what's guesswork and what's 'confirmed' by insiders. It's only the 'confirmed insider' info that we give additional weight to.

I already stated more than once that they're most probably right, but I just don't put 100% of my trust into anything. CPU clocks going from 2GHz in July to 1GHz in December is anything but a usual clock tweak.
To be honest, this wanting to believe is starting to look like a bit of an accusation.
This thread already has a resident troll eurogamer fan who only posts in this thread and half of his/her posts are about insulting anyone who dares question the Almighty Eurogamer Truth (together with typically trollish lack of punctuation and upper case efforts). That's one too many IMO.
You're making it personal again. You seem to do that in a lot of debates about hardware. Such that you're mostly arguing about wanting to prove DF wrong and fallible and making illogical arguments to that effect. The rest of us not emotionally invested in proving people wrong can see the information for what it is.

Cutting memory bandwidth in portable mode from 25GB/s to 20GB/s (20% less) in shared GPU/CPU resources at the last minute is just a setting?
It is. Nintendo changed a FW setting to limit devs to the minimum clock undocked where before there was a choice.
Please do explain how this wouldn't be a clusterfuck for developers trying to squeeze as much as possible from both operating modes.
Who said it wouldn't be problematic?? Do you not remember the PS3 downclocks? The devs targeting uncertain specs, creating games on PC only to have to hack them down into something that'd run? It happens. It's not unusual - hell, paper specs don't tell you much about how the hardware actually runs anyway. It's also not any report being wrong when the console company is changing the goalposts.

DF reported the hardware architecture and that is the hardware architecture. They've then updated on the clocks as and when better info is available. They have consistently been a reliable source on all the consoles as we've tracked their development. There's no need to get angry/annoyed at people finding a reliable, trustworthy source and using it appropriately.
 
You're making it personal again. You seem to do that in a lot of debates about hardware. Such that you're mostly arguing about wanting to prove DF wrong and fallible and making illogical arguments to that effect. The rest of us not emotionally invested in proving people wrong can see the information for what it is.

How is calling people a delusional fanboy over and over again somehow not personal? I gather you actually understood who I was talking about, right?

I'm lowering DF's credit to plausible levels while at the same time acknowledging they're most probably right (how this somehow keeps going past people's heads is comically incredible). This is not the same thing as trying to prove them wrong.
And if even you can't see past that and join the same old tired, overused "you're just hopeful it's something better" argument, now with a "you're emotionally invested" cherry on top (we both know this is a sugar-coated "you're not fit for this conversation" jab), then this conversation is over.
 
How is calling people a delusional fanboy over and over again somehow not personal? I gather you actually understood who I was talking about, right?
No, I don't pay that much attention. ;)
I'm lowering DF's credit to plausible levels while at the same time acknowledging they're most probably right (how this somehow keeps going past people's heads is comically incredible). This is not the same thing as trying to prove them wrong.
But the argument you raised against them makes no sense. How is this not trying to prove their fallibility?
But don't worry folks, eurogamer is 100% trustworthy; they've known the exact final specs for half a year, no, 2 months, no, 24 hours, because their sources are rock solid. :rolleyes:
Eurogamer was wrong about 2GHz CPU 1GHz GPU, but "eurogamer can do no wrong" so those were obviously just the limits on the silicon and the completely different clocks had an excuse.
Then eurogamer was wrong about 1600MHz memory and max. 300MHz GPU in handheld mode, but "eurogamer can do no wrong" so those were obviously "just clock tweaks".

Tomorrow eurogamer comes up with something else, but eurogamer can do no wrong so it's just something perfectly excusable again.
Clock changes at Nintendo's end don't discredit DF one bit. At the time of writing, the info provided was correct. If you genuinely think DF are probably right, then you wouldn't have raised complaints about them being wrong and unreliable over a clock speed, which is usual fodder for last minute changes.
 