Nintendo Switch Tech Speculation discussion

That page appears to be for the Tegra K1 (A15).
It references the K1, but it seems to describe how this is handled on both the K1 and X1 versions of Jetson; at least, it is linked from the X1 pages.
I've been looking around at X1 material, but I don't actually have access to either board or necessarily to full docs, which is why I threw this into the thread in the hope that someone with more insight could contribute something more decisive.
I don't have time to really dig into this; the Christmas event horizon is looming! :)
 
So in your imaginary specs (what are they? Are you saying it would be 4 SMs? With 128-bit LPDDR4?), how many watts would the Switch consume when docked, and how many watts in portable mode, considering the only difference is the GPU running at 40% of its docked clock, with the rest of the chip left at the same clock?
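For a rough sense of scale, here is a back-of-the-envelope sketch. The total docked power, the GPU's share of it, and the linear-with-frequency scaling are all illustrative assumptions on my part, not leaked figures:

```python
# Back-of-the-envelope docked vs. portable power estimate.
# Every number below is an assumption for illustration, not a leak:
#   - ~11 W total board power when docked under load
#   - ~6 W of that going to the GPU
#   - GPU power scaling roughly linearly with clock at a fixed voltage
#     (real DVFS would also lower the voltage and save even more)

DOCKED_TOTAL_W = 11.0       # hypothetical
DOCKED_GPU_W = 6.0          # hypothetical
PORTABLE_CLOCK_RATIO = 0.4  # portable GPU clock = 40% of docked, per the premise above

rest_of_board_w = DOCKED_TOTAL_W - DOCKED_GPU_W       # CPU, RAM, etc., unchanged by the premise
portable_gpu_w = DOCKED_GPU_W * PORTABLE_CLOCK_RATIO  # 6 W * 0.4 = 2.4 W
portable_total_w = rest_of_board_w + portable_gpu_w

print(f"Docked:   ~{DOCKED_TOTAL_W:.1f} W")
print(f"Portable: ~{portable_total_w:.1f} W (GPU down from {DOCKED_GPU_W:.1f} W to {portable_gpu_w:.1f} W)")
```

Even under those generous assumptions, all of the saving comes from the GPU, which I take to be the point of the question.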
I don't know what he has in mind

I don't even want to share what I have in mind. This forum has become horrible for people making guesses that happen to come out wrong. I still get stuff I predicted wrong years ago thrown in my face; some people here are completely obsessed trolls eager to use strawmen at the first opportunity. E.g. from time to time I still get flak for writing, sometime in 2011, that I thought stereo 3D would be the future of handheld devices. (BTW, what the hell do these people do in their lives that's so boring that they get off on searching/keeping/whatever years-old quotes from unimportant anonymous users on internet forums?!)
There's even a guy who's been accusing me these last few pages of "being a wishful fanboy who can't accept reality" even though I haven't posted any specs of my own for months.

Instead, I'll just repeat my point that the "Twitter leak specs + Eurogamer clocks" combination makes no sense. Because it doesn't, regardless of what the outcome of this whole thing will be.



Here's another one: if Nvidia is making the SoC, then the earlier dev kits are most probably using a TX1. But if said early TX1 devkits were running GPU clocks of 768MHz maximum, why were they loud when the Shield TV can be practically inaudible while maintaining solid 1GHz GPU clocks?
 
:D
Via the cloud...

Seriously, we have a respected outlet in Digital Foundry, with a history of only running with well-verified sources, stating specs that make sense given the track record of both Nvidia and Nintendo, which is also backed up by another reliable Nintendo leaker (Laura Kate Dale), but a randomer on Twitter is more believable.

More and more, I believe we will see a "MisterN" come out of this.
 
I'd be pleasantly surprised if this thing is more powerful than the rumours suggest, but as of right now I believe the rumours, if nothing else because of battery life plus Nintendo's recent tendency to be conservative when it comes to hardware power.
 
I don't even know why they're aiming for 1080p on the TV.
720p when in handheld mode.
900p when in console mode, scaled up.
Lots of Xbox One games are 900p scaled up; I think they should use the extra resources for improved fidelity if possible.
 
I'd be pleasantly surprised if this thing is more powerful than the rumours suggest, but as of right now I believe the rumours, if nothing else because of battery life plus Nintendo's recent tendency to be conservative when it comes to hardware power.
Well, 4K is no problem for Tegra. The only question is what you want to display with all those pixels. A 4K or even 8K Pacman or Snake shouldn't be a problem ;)

But I highly doubt this Twitter user has legit sources. From what we know, the Switch only has an HDMI 1.4 port; that is enough for 4K, but only at 30 Hz. Not the best thing you can do.
A resolution higher than 1080p would only make sense with an HDMI 2.0 port, because you would want to output it as a 4K signal. Everything else makes no sense for TVs.
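For anyone who wants to check the 30 Hz point, here is a rough sketch. The ~8.16 Gbit/s figure is HDMI 1.3/1.4's 10.2 Gbit/s TMDS rate minus 8b/10b overhead; the 20% blanking overhead is an approximation I picked, since real timings vary:

```python
# Rough HDMI 1.4 bandwidth check. Assumptions, not spec-exact timing math:
#   - ~8.16 Gbit/s of usable video data (10.2 Gbit/s TMDS minus 8b/10b encoding)
#   - ~20% blanking overhead on top of the active pixels (real timings vary)

HDMI_14_GBPS = 8.16
BLANKING_OVERHEAD = 1.20

def fits_hdmi14(width, height, refresh_hz, bits_per_pixel=24):
    gbps = width * height * refresh_hz * BLANKING_OVERHEAD * bits_per_pixel / 1e9
    return gbps, gbps <= HDMI_14_GBPS

for w, h, hz in [(3840, 2160, 30), (3840, 2160, 60), (2560, 1440, 60)]:
    gbps, ok = fits_hdmi14(w, h, hz)
    print(f"{w}x{h}@{hz}Hz: ~{gbps:.1f} Gbit/s -> {'fits' if ok else 'does not fit'} HDMI 1.4")
```

4K at 30 Hz comes out around 7 Gbit/s and fits; 4K at 60 Hz is roughly double that and does not, while 1440p at 60 Hz fits comfortably.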
 
I don't even know why they're aiming for 1080p on the TV.
720p when in handheld mode.
900p when in console mode, scaled up.
Lots of Xbox One games are 900p scaled up; I think they should use the extra resources for improved fidelity if possible.

I'm guessing we will see upscaling in both handheld mode and console mode. Games will probably target something like 540p in handheld and 720p-900p in console mode.
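For reference, a quick look at what those targets would mean in raw pixels and scale factors, assuming a 720p handheld panel and 1080p TV output (neither of which is confirmed):

```python
# Pixel counts and upscale factors for the guessed render targets.
# Assumes a 720p handheld panel and a 1080p TV output; both are assumptions.

modes = {
    "540p handheld target": ((960, 540), (1280, 720)),    # render -> panel
    "720p docked target":   ((1280, 720), (1920, 1080)),  # render -> TV
    "900p docked target":   ((1600, 900), (1920, 1080)),
}

for name, ((rw, rh), (ow, oh)) in modes.items():
    print(f"{name}: {rw * rh / 1e6:.2f} Mpixels rendered, "
          f"scaled {ow / rw:.2f}x per axis to {ow}x{oh}")
# Note that 960x540 also maps onto 1920x1080 by an exact 2x per axis,
# which makes for a particularly clean upscale on the TV side.
```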
 
Well, 4K is no problem for Tegra. The only question is what you want to display with all those pixels. A 4K or even 8K Pacman or Snake shouldn't be a problem ;)

But I highly doubt this Twitter user has legit sources. From what we know, the Switch only has an HDMI 1.4 port; that is enough for 4K, but only at 30 Hz. Not the best thing you can do.
A resolution higher than 1080p would only make sense with an HDMI 2.0 port, because you would want to output it as a 4K signal. Everything else makes no sense for TVs.

I'm guessing being able to market "Netflix in 4K" would be nice.
 
But I highly doubt this Twitter user has legit sources. From what we know, the Switch only has an HDMI 1.4 port; that is enough for 4K, but only at 30 Hz. Not the best thing you can do.
A resolution higher than 1080p would only make sense with an HDMI 2.0 port, because you would want to output it as a 4K signal. Everything else makes no sense for TVs.

Tegra X1 supports HDMI 2.0 and HDCP 2.2. Why would Nintendo limit it to HDMI 1.4?...

There is no real information about the hardware.
 
I'm guessing we will see upscaling in both handheld mode and console mode. Games will probably target something like 540p in handheld and 720p-900p in console mode.
Nintendo titles are not the most graphically demanding titles out there. It shouldn't be a problem to run a Zelda/Mario at 1080p on decent hardware. What really shines graphically in Nintendo titles is the artwork. Nothing else.
 
Credibility of the source and feasibility aside, you have a system supposedly capable of easily producing 3D games at 2K+, and yet when you dock it and increase the clock, instead of targeting 4K you go straight to 1080p.
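Quick arithmetic on why 1080p rather than anything higher would be the natural docked target, using the ~40% portable/docked clock ratio mentioned earlier in the thread (still a rumour, of course):

```python
# Does the rumoured docked clock bump cover the 720p -> 1080p pixel increase?
# The 40% portable/docked GPU clock ratio is the rumoured figure from earlier
# in the thread, not a confirmed spec.

portable_over_docked = 0.4
docked_over_portable = 1 / portable_over_docked   # 2.5x more GPU throughput when docked

pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160

print(f"720p -> 1080p: {pixels_1080p / pixels_720p:.2f}x the pixels")    # 2.25x
print(f"720p -> 4K:    {pixels_4k / pixels_720p:.2f}x the pixels")       # 9.00x
print(f"Docked clock:  {docked_over_portable:.2f}x the portable clock")  # 2.50x
# A 2.5x clock bump roughly covers 2.25x the pixels; it gets nowhere near 9x.
```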
 
Tegra X1 supports HDMI 2.0 and HDCP 2.2. Why would Nintendo limit it to HDMI 1.4?...
Folks are going about this "X1" base the wrong way, of course!

CLEARLY they built a custom 28nm chip that just coincidentally has similar specs to the X1, because that's just what made sense for the watt target. Just use some 28nm libraries for Maxwell, A57, and the Tegra K1 video block to come up with a custom ASIC!

:runaway::runaway::runaway::runaway:

DanK1

:confused:
:rolleyes:
o_O

I need more smileys.

:unsure::neutral:

:sleep:

(danke k1 :cool:)
 
Seriously we have a respected outlet in Digital Foundry with a history of only running with well verified sources stating specs that make sense given the track record of both Nvidia and Nintendo which is also backed up by another reliable Nintendo leaker (Laura Kate Dale) but a randomer on Twitter is more believable.

Takashi Mochizuki is the Wall Street Journal's technology correspondent in Tokyo. Let's go a little easier on the knee-jerk reactions.
I'd believe his sources over the friends of any Twitter celebrity, but in this particular case he's simply quoting a report from a Japanese analyst firm called Ace Research Institute.

The other specs he quoted point to USB-C being used to carry a DisplayPort signal into the dock, which is then converted to HDMI, hence no HDMI 2.0 / HDCP 2.2 (only available with DP 1.3, which is too recent). The DisplayPort part is actually in the patent filings.
As for the WQHD -> 1080p, he most probably just switched the order, so 1080p max for handheld and 1440p max for docked. WQHD (1440p or 1600p) would actually match the limitations of HDMI 1.3/1.4 output at 60Hz, which is 2560*1600 at 30 bits per pixel.
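As a quick sanity check on that last figure (the 268.5 MHz pixel clock below is the standard CVT reduced-blanking timing for 2560x1600@60, and 340 MHz is the HDMI 1.3/1.4 TMDS clock ceiling):

```python
# Does 2560x1600 @ 60 Hz at 30 bits per pixel squeeze under HDMI 1.3/1.4?
# 268.5 MHz is the CVT reduced-blanking pixel clock for 2560x1600@60;
# 340 MHz is the maximum TMDS clock for HDMI 1.3/1.4.

TMDS_MAX_MHZ = 340.0

pixel_clock_mhz = 268.5   # 2560x1600@60, CVT reduced blanking
bits_per_pixel = 30       # deep colour, as quoted above

# With deep colour the TMDS clock runs faster than the pixel clock by bpp/24.
tmds_clock_mhz = pixel_clock_mhz * bits_per_pixel / 24          # ~335.6 MHz
video_data_gbps = pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9  # ~8.06 Gbit/s

print(f"TMDS clock:  ~{tmds_clock_mhz:.1f} MHz (limit {TMDS_MAX_MHZ} MHz)")
print(f"Video data:  ~{video_data_gbps:.2f} Gbit/s")
# -> just inside the limit; anything bigger at 60 Hz would not fit.
```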
Again, the WQHD is something anyone could have deduced from looking at the patent:

[0457] As shown in FIG. 32, the cradle 5 includes a conversion section 131 and the monitor terminal 132. The conversion section 131 includes, for example, circuitry configured for performing video and sound conversion and is connected to the main body terminal 73 and the monitor terminal 132. The conversion section 131 converts a signal format regarding images (referred to also as video) and sound received from the main unit 2 into a format to be output to the TV 6. In the present embodiment, the main unit 2 outputs image and sound signals to the cradle 5 as a display port signal (i.e., a signal in accordance with the DisplayPort standard). In the present embodiment, communication based on the HDMI (registered trademark) standard is used for communication between the cradle 5 and the TV 6. That is, the monitor terminal 132 is an HDMI terminal, and the cradle 5 and the TV 6 are connected together by an HDMI cable. Thus, the conversion section 131 converts the display port signal (specifically, a signal representing video and sound) received from the main unit 2 via the main body terminal 73 into an HDMI signal. The converted HDMI signal is output to the TV 6 via the monitor terminal 132.


At this point I'm not inclined to believe it's a 1080p screen in the final console, as that's probably just a wrong assumption from the tech analyst in question. Though there seem to be some very relevant differences between early dev kits and production hardware, and leakers may be mixing up specs between the two, so some specs could still be a surprise (pleasant or unpleasant).
Truth be told, the price and power consumption difference between 1080p and 720p panels at >6" should be negligible nowadays, and if Nvidia implemented a good PS4 Pro-like scaler, why not go with a 1080p screen for a sharper image in simpler games?



CLEARLY they built a custom 28nm chip that just coincidentally has similar specs to the X1, because that's just what made sense for the watt target. Just use some 28nm libraries for Maxwell, A57, and the Tegra K1 video block to come up with a custom ASIC!
Still more believable than the Twitter leak claiming 14.4 ROPs.
 
Hexus said:
Any GPU engineer will tell you that a modern GPU has multiple clock domains, but for our purposes, let's consider GTX 580 to operate with a general clock of 772MHz and shader-core speed of 1,544MHz.

http://hexus.net/tech/reviews/graphics/36509-nvidia-geforce-gtx-680-2gb-graphics-card/?page=4

Shaders, ROPs, TMUs, the memory controller etc. don't actually need to run at the same speed due to some fundamental law. If it were deemed worthwhile, ROPs could be made to operate at a different clock - for example at 9x some base clock versus the shader cores' 10x.

Being able to fine-tune where your power is spent on a highly power-constrained system might be useful.
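As a concrete (and entirely hypothetical) illustration of how a figure like the mocked "14.4 ROPs" could fall out of a separate clock domain rather than fractional hardware:

```python
# Hypothetical illustration: a fractional "ROP count" can simply be a whole
# number of ROPs in a slower clock domain, quoted relative to the shader clock.
# None of these numbers are claimed to be actual Switch specs.

physical_rops = 16                            # e.g. a full TX1-style ROP complement
shader_clock_mhz = 768.0                      # the rumoured docked GPU clock
rop_clock_mhz = shader_clock_mhz * 9 / 10     # ROP domain at 9x base vs the shaders' 10x

fillrate_gpix = physical_rops * rop_clock_mhz * 1e6 / 1e9
rops_quoted_at_shader_clock = fillrate_gpix * 1e9 / (shader_clock_mhz * 1e6)

print(f"Fillrate: {fillrate_gpix:.2f} Gpixel/s")
print(f"Quoted against the shader clock, that looks like {rops_quoted_at_shader_clock:.1f} ROPs")
# -> 16 ROPs at 90% of the shader clock "look like" 14.4 ROPs on a spec sheet.
```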
 
http://www.tweaktown.com/news/55585...-high-1080p-unreal-engine-4-preset/index.html

Unreal Engine 4 display profiles discovered. It doesn't sound like the standard Unreal 4 profiles in docked mode are too different from the PS4/Xbox One ones. I'm still of the opinion that Eurogamer does have the correct clock speeds, and Laura K Dale backed that up with one of her sources, but the picture being painted here based solely on those clock speeds seems to differ greatly from the Unreal profiles and developer comments.
 
Imagination PowerVR?
Apple?

If it had been PowerVR IP, one of the more likely candidates would rather have been NEC, as one example (the Vita SoC also came from NEC). So if I were Nintendo I'd think: great GPU IP, but what about software support?

And why exactly would Apple design and sell an SoC to a third party?

TottenTranz,

The sentence of mine you quoted was just a fair warning that I'm not answering on your behalf, nothing else. Not that it has anything to do with the topic, but I like you as a person and I actually read and respect your input (albeit I don't always agree with it, which goes for pretty much anyone). Let's stick to the topic though, shall we?

By the way, it has probably already been mentioned, but for those who haven't read VentureBeat's writeup on the topic, it's worth a read: http://venturebeat.com/2016/12/14/nintendo-switch-specs-less-powerful-than-playstation-4/
 