Nintendo Switch Tech Speculation discussion

It wouldn't surprise me if the SoC has embedded RAM to help with the bandwidth issues. That has been a staple of many Nintendo designs.

I personally expect only Tegra X1 performance and customizations done to improve power efficiency.
 
Does anybody have an idea how the Switch's CPU compares to the Xbox One's CPU in the performance department? Could it be faster than the Xbox One's Jaguar microarchitecture?
According to the original Tegra rumor: Four ARM Cortex-A57 cores, max 2GHz

Xbox One has an eight-core Jaguar CPU. Twice as many cores, roughly the same clocks. IPC of the A57 and Jaguar should be pretty close to each other. So roughly half the CPU performance, if the rumors are credible.

We don't know anything about the OS CPU and memory reservations, so it's hard to make any estimates yet.
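
Quick back-of-envelope on the "roughly half" figure (rumoured Switch clocks and equal IPC assumed; OS reservations ignored):

```python
# Crude aggregate-throughput proxy: cores * clock * IPC.
# Assumes the rumoured Switch config (4x A57 @ 2.0 GHz), the known XB1
# config (8x Jaguar @ ~1.75 GHz), and roughly equal IPC -- all assumptions.

def relative_throughput(cores, clock_ghz, ipc=1.0):
    return cores * clock_ghz * ipc

switch = relative_throughput(4, 2.0)   # rumoured
xb1 = relative_throughput(8, 1.75)

print(switch / xb1)  # ~0.57 -> roughly half the multi-core throughput
```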
 
Again, I didn't say it would be better or cheaper, just that it would be possible. AMD won the designs because they provided the best possible solution in the eyes of their contractors, not because there was no other way (as in, if AMD had gone bankrupt in 2010 we'd have no PS4 or Xbone). Credit should be given where credit is due IMO.
I have no idea how much a power-optimized quad-core Power7+ at e.g. 2GHz would consume but it could certainly fit a console if needed. In 2010 IBM was making a 45nm SoC with a 3.2GHz CPU consuming less than 120W.

I spoke of MCMs simply because it's a form of integration that has been used in consoles many times, IBM designs included.

I did say in the part you originally quoted that nVidia didn't have a competitive 64-bit CPU that they could offer integrated in an SoC with their GPU.

I can't find where it's specified that a quad-core Power7 @ 3.2GHz uses < 120W, do you have a source for that? It's hard to find fine-grained information; I just know that using IBM's energy estimator you can't build a minimal system (just a quad-core CPU + 8GB RAM + backplane + 1Gbit Ethernet) on Power7 at under 300W.

Jaguar cores @ ~1.6-1.75 GHz probably consume < 30W for the 8-core cluster. I doubt quad Power7+ would be anywhere within spitting distance of that figure when scaled down, given how aggressively it's designed for high clocks and high SMT throughput. If it made sense from a power standpoint to offer SKUs with several ~2GHz Power7+ they probably would have.

8-core Jaguar is about 55.5mm^2 in XB1 (roughly similar in PS4). That includes the 4MB of L2 cache. Power7+ die is 567mm^2 for 8 cores on IBM's 32nm SOI. If you cut the cores in half and took out the accelerators and other unneeded things (SMP related) it'd still easily be over 200mm^2. And then there'd probably be a decent decrease in density moving everything else to this process.
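
For scale, the per-core area those numbers work out to (very rough, ignoring the process difference and what's lumped into each die):

```python
# Per-core silicon budget from the die sizes quoted above. Both figures
# include L2/uncore, and 28nm bulk vs 32nm SOI isn't normalised, so treat
# this as an order-of-magnitude comparison only.

jaguar_cluster_mm2 = 55.5     # 8 Jaguar cores + 4MB L2 in XB1
power7_plus_die_mm2 = 567.0   # 8-core Power7+ die

print(jaguar_cluster_mm2 / 8)                                # ~6.9 mm^2 per core
print(power7_plus_die_mm2 / 8)                               # ~70.9 mm^2 per core
print((power7_plus_die_mm2 / 8) / (jaguar_cluster_mm2 / 8))  # ~10x the area per core
```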

AMD had the best possible solution because they had a good enough CPU to offer at that exact time, but this wasn't really because of a consistent technical advantage over nVidia. It was due to circumstances that just don't apply today, and don't contribute to a cause for concern that Nintendo is using nVidia in the Switch.

I don't include MCMs on the basis that they were not on the table for the XB1 or PS4 designs... and the arrangement in the Wii U is really bad: slapping a tiny ~30mm^2 cluster of an ancient CPU next to a different main die on a different process. It would have been a big improvement just integrating that CPU die into the main die, but that must not have been trivial, as I imagine it wouldn't have been trivial for nVidia and Sony/MS/Nintendo to share a die with IBM. While the integrated Xbox 360 CPU+GPU die is a counterpoint, that was an optimization of a design that was several years old, not using anything resembling recent designs, so they had a lot of time to work out and negotiate the issues surrounding it.
 
Is this a serious question? I mean if over a decade of speculation threads on this forum doesn't answer it, I doubt I can.
Yes. If there's an answer I've missed, feel free to spell it out to me. ;) From what I recall of speculation over the past decade+, the bogus specs we get are the over-the-top ones from fanboys and troublemakers. The specs that are sane and reasonable end up being the true specs. I don't recall any 'leak' being well below the final product. And with Nintendo especially, the low-level specs and leaks were exactly what we got, despite some arguing for years afterwards that the machines were somehow even more powerful and advanced.

So the rumour is the BW limit of LPDDR4, which makes technical sense; Nintendo have a history of conservative specs (and this spec is still state-of-the-art BW for mobile devices); (Nintendo) leaks that don't talk of AI processors and raytracing processors have a history of coming out true; and I see no reason why someone would intentionally sully their reputation by claiming a fictional 25.6 GB/s BW when the real machine is capable of far more... everything points to 'Plausible' to my mind.
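
For what it's worth, the 25.6 GB/s number is exactly what a 64-bit LPDDR4-3200 interface delivers, which is the assumed configuration here:

```python
# Sanity check on the 25.6 GB/s rumour. The 64-bit bus width and
# 3200 MT/s rate are assumptions, not confirmed specs.

bus_width_bits = 64
transfer_rate = 3200e6                    # MT/s for LPDDR4-3200

bandwidth_gb_s = (bus_width_bits / 8) * transfer_rate / 1e9
print(bandwidth_gb_s)                     # 25.6
```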
 
I wonder, will their games still look good when rendering at half resolution?
If I remember correctly, the 3DS renders at half resolution when running in 2D mode. Hence the blur when not using 3D.

At that low resolution, the NX should have good enough power, right?
But I guess this will bring an annoying problem when docked to a TV -> the picture looks like shit.
 
Just like all those PS4 and One games that don't look good when rendered at less than 1080p :?

I really don't see the point in comparing the 3DS with the Switch; obviously the Switch will have modern hardware and it shouldn't have any problems outputting sharp graphics.
 
But sharp, photorealistic graphics?

Rendering at a lower resolution than native could give the performance boost needed to handle PS4 ports.
 
There are certainly things that could be added, like eDRAM or stacked RAM, to provide more BW than LPDDR4.
But sharp, photorealistic graphics?

Rendering at a lower resolution than native could give the performance boost needed to handle PS4 ports.
Something PS4-ish, at 540p, seems reasonable for Tegra X1 class. Throw in some checkerboard rendering options and I think the handheld results could be reasonable. Problem is TV output. 720p native ain't gonna hit PS4 quality. And underlying motivations from Nintendo are likely "can we make our simple, low-cost asset games look good?", which they'll be able to do. Mario Galaxies will look great - lovely shaders, simple geometry, little bit of AO and maybe even some secondary illumination - which is as much as N. want, but more visually ambitious games will struggle.
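
Rough numbers on why the lower render resolution helps so much (assuming shading cost scales roughly with pixel count, which it only approximately does):

```python
# Relative pixel counts as a crude proxy for per-frame shading cost.
# Ignores geometry, CPU and bandwidth limits.

resolutions = {"1080p": (1920, 1080), "720p": (1280, 720), "540p": (960, 540)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / base:.2f}x the pixels of 1080p")
# 540p is 0.25x -> ~4x fewer pixels to shade than a 1080p PS4 target; 720p is ~0.44x.
```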
 
Also, and this is just me, but I want a PS Vita res screen. I'm hoping for a screen that low res because I feel it's good enough for a handheld right now, and would allow for some pretty damn good looking games; easily better looking than PS3/X360/Wii U without having to worry about the performance perils of resolution too much.
 
At this point in the mobile landscape I don't consider 1080p that big an ask; it's what all mobile devices should aim for, especially when everyone is used to UHD screens on their phones.
 
At this point in the mobile landscape I don't consider 1080p that big an ask; it's what all mobile devices should aim for, especially when everyone is used to UHD screens on their phones.

I suppose, but a console is different from a cell phone. The nature of the user interface is different, plus we have to worry about decent graphics throughout the console's life. This thing won't be able to do that if it's stuck at 1080p. It'll top out quickly like the Wii U did and just stay stagnant.
 
At this point in the mobile landscape I don't consider 1080p that big an ask; it's what all mobile devices should aim for, especially when everyone is used to UHD screens on their phones.
720p at 6" probably isn't that bad. However, I'd prefer a 1080p screen even if games are rendered at 540p (or checkboard!) and upscaled (reconstructed), with a native UI.
 
The specs that are sane and reasonable end up being the true specs.
This is not true. There are countless examples of false "reasonable" specs abounding here. And of course the actual specs would end up being considered sane and reasonable...

everything points to 'Plausible' to my mind.
That's great, for you. I happen to disagree. Actually, I don't even disagree with the word plausible. Plausible, sure. Likely? No.

I see no reason why someone would intentionally sully their reputation
Bullshit.
 
It is that bad. I own and have used different 7" tablets, the original Amazon Fire and the Nexus 7 2012 model. The screen is amazingly bad on the 600p compared to the 720p. Originally it seemed alright in 2011, but once you spend 5 minutes on a 720p screen you won't ever want to go back to a 600p screen, even if it's just for watching movies. It's even worse if you attempt to read anything. The difference between 600p and 720p screens is huge.

As large as the difference is between 600p and 720p, it's even more telling going from 720p to 1080p when reading. I'm now using a Samsung 1080p tablet and want a better tablet with a UHD screen, but this one is still so functional and performant for surfing and watching videos that I can't justify upgrading.
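
The density jump roughly works out like this (screen diagonals assumed, since the exact models vary):

```python
# Pixel density for the panels being compared. The 7" diagonal matches
# the original Fire / Nexus 7 class; the 1080p tablet size is a guess.
import math

def ppi(w, h, diagonal_inches):
    return math.hypot(w, h) / diagonal_inches

print(ppi(1024, 600, 7.0))     # ~170 PPI ("600p" at 7")
print(ppi(1280, 720, 7.0))     # ~210 PPI ("720p" at 7")
print(ppi(1920, 1080, 10.1))   # ~218 PPI (1080p, assuming a ~10" tablet)
```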
 