Nintendo Switch Tech Speculation discussion

This is an example of people seeing and hearing different things from the same presentation.
The video was made before the Switch reveal. Richard states clearly that the reports he has received are about the devkit, and he speculates that the final product may use an updated SoC.
There is NO point in the video where he says what you claim about "final specs"; in fact, he is speculating the direct opposite!

Yes, he speculates at 6:25, and he also mentions that all his sources point to it being Maxwell. Think about that for a second: those sources are developers with devkits, so why didn't they mention anything to him about the final hardware probably being Pascal?
 
It can't be because they don't want to tell him; they already leaked specs to him. Most likely they haven't heard anything about Pascal and think the devkit will be the final hardware.
If that was the message they conveyed, why on earth would he finish off by speculating that the final product would be Pascal? You are making assumptions about what Richard was told that he himself obviously doesn't agree with.
For now, we just don't know. And unless Nvidia states it clearly at launch, we likely won't know for certain until Chipworks gets its hands on a retail sample.
 

Yes, he speculated at the end that it could change to Pascal, but right afterwards he states, in his exact words, that this is all speculation on his part and that everything he's heard from sources points to Tegra X1. That line, together with the link I posted where the Ubisoft developer hints it won't be Pascal (http://www.neogaf.com/forum/showpost.php?p=220889631&postcount=117), leads me to believe it won't have Pascal. Of course nothing is 100% and we don't know for sure, but people acting like Pascal is confirmed are being ridiculous.
 
We don't know, but that is why we are speculating. And saying it's Pascal is more wrong than saying it's Maxwell, because if there are already sources saying the devkit is Maxwell, and only speculation that it is Pascal, then it follows that Maxwell is the more likely option. Unless, of course, people know things they aren't disclosing.
 
So, while the speculation on which version of Tegra this thing is going to use is fun, I have a different question.

Do you guys think that the NS will be an iterative system? What I mean is, do you think we'll see an upgraded version maybe two or three years after the initial release? The market is pretty used to upgrading tablets and phones. If the system itself is cheap enough, they might just sell an upgraded tablet (the guts) every few years to upgrade the system, rather than making a whole new system.
 
I didn't. I just reminded everyone, including you, that if one hypothetically discusses later CPU core designs (the A57, as in the Tegra X1 in the Shield) as alternatives, one should always consider contemporary designs for comparison (and I brought up an example of a chip released significantly before the one you chose). Otherwise (and not only for that reason) the discussion is pointless. If the A57 implementation in the X1 would have been an alternative, surely the same would apply to the Puma cores in Mullins, as that chip was available roughly a year earlier than the X1. If it doesn't, you are arguing in a meaningless vacuum. I hope you get my drift. ;)

That still doesn't make any sense... The logic has never been "what hardware would the consoles use if they had been released in 2014/2015?". That is not what is being discussed at all. The logic has always been "if the A57 had been available at the time, would it have been fast enough?". The same question cannot be asked about Puma, because we know it wasn't available at the time, or the consoles would have used it. It's like saying "current PCs are as powerful as supercomputers from the '80s" and getting the response "but current supercomputers are several orders of magnitude faster". It doesn't make sense to bring that into the conversation, and it doesn't change the veracity of the original claim either.

It all makes even less sense when you realise that the context of all of it is "what kind of CPU does the Switch need to match/beat PS4/XB1?"
 
Not at all. I can say with no reservation that Nintendo made mistakes with the Wii and Wii U, especially the latter. On the flip side, I can see some anti-Nintendo rhetoric; what you're writing makes it seem like you think the Wii U has no advantage over the PS3, when it clearly has a much more modern GPU and a superior RAM setup.
What have I said that suggests I don't recognise the advantages of Wii U?

Lighting and material shaders are the most obvious improvements in Wii U games over PS3 software.
Shaders in some respects, although Nintendo use an unrealistic art style that makes the most of shaders and which few others would dare to try, I guess. Lighting I don't particularly agree with. Also, your examples are limited environments where more can be spent on what's displayed. Furthermore, Wii U's more modern GPU enables some features, such as the grass in Zelda. So are Nintendo really doing more with the hardware than other developers, or are they just doing the same? Because your assertion was that Nintendo are better at extracting performance from hardware than other developers. I see no evidence of that, again citing more comparable games like Zelda as a reference point.

1. You can say that about any console. Why not play an N64 on a big HDTV? Well, that's not how the graphics were meant to be displayed.
The Wii was released in 2006, when the median TV screen size was 42" (in the US, probably). The other consoles recognised the transition and went HD.

3. That's anecdotal; I've heard many people praise a lot of different Wii games for their visuals. I bet the same people who bash Wii/Wii U graphics will praise Xbox/PS3 graphics and how impressive they were.
No, I'm talking about the broad Wii audience: mums and 'non-gamers' who had never played a console game in their lives, alongside gamers. People with no experience of games beyond, perhaps, a passing glance at what someone else is playing, who found the lack of quality and the shimmering and jaggies distracting.

Also, yes, it's anecdotal evidence. Why raise it then? To counter your point, based on just your personal opinion, that Wii graphics were good enough. So I cite uncorroborated evidence that your personal opinions weren't universal and that there was/is evidence the Wii should have been higher-res. My evidence is supported by your own quote: "Nintendo themselves have said Wii should've been HD from the start." So Nintendo agree with me and disagree with you that the Wii was good enough. ;)

More power lets you do more, but I was just pointing out that some of the techniques developers use nowadays only make games look worse than they used to. I'm not interested in hearing arguments for effects like temporal AA, excessive lens flare and chromatic aberration. Even on the technical side of things, an original game that was made with older hardware in mind may lose some of its positive aesthetic in the transition to newer hardware. But that's another long discussion.
It's also not relevant to the discussion of the hardware and its power. Game aesthetics belong in other threads. You said that Nintendo are better at using hardware than other devs. The conclusion here seems to be 'you prefer the look of Nintendo's games,' which is fine, but irrelevant in the context of the discussion. Given a console with x power, all devs, including Sony's, Nintendo's and MS's, will have the capacity to do x amount of work. The aesthetics they choose are down to them. Nintendo has a clear aesthetic which is typically graphically undemanding, and the NS will be fine for achieving that at good quality in a handheld, as I've already mentioned.
 
(From pages 6, 7)

At this point in the mobile landscape, I don't consider 1080p "that big"; it's what all mobile devices should aim for, especially when everyone is used to UHD screens on their phones.

720p at 6" probably isn't that bad. However, I'd prefer a 1080p screen even if games are rendered at 540p (or checkerboarded!) and upscaled (reconstructed), with a native-resolution UI.
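For what it's worth, the pixel math behind that preference works out neatly: 540p is exactly a quarter of 1080p, so the upscale is a clean 2x integer scale, and checkerboarding shades half the target samples per frame. A quick back-of-envelope sketch (the resolutions are just the ones from the post above, not anything based on actual Switch specs):

```python
# Pixel budgets for sub-native rendering on a hypothetical 1080p panel.
native = 1920 * 1080           # 2,073,600 px
q540 = 960 * 540               # 518,400 px
cb = native // 2               # checkerboard shades half the samples per frame

print(q540 / native)           # 0.25 -> 540p is exactly 1/4 of the pixel work
print(cb / native)             # 0.5  -> checkerboarding is half the shading work
print(1920 / 960, 1080 / 540)  # 2.0 2.0 -> clean 2x integer upscale in each axis
```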

Yup, and you can pull tricks to make text rendering look better on a lower-resolution screen, but games are often drawing things into the 3D distance, where low resolution really hampers image quality.
(etc.)

I just wanted to say that the 3DS is 240p, and the DS was 192p (lower than the NES).

Also, many future buyers of the new Nintendo console won't have much experience of 3D games on phones at 1080p or even 720p. Perhaps some play little games that happen to use OpenGL and run at the phone's resolution, but those are 2D puzzle games that play like Candy Crush, except you hatch eggs or transform vegetables into other vegetables, etc.
This reminds me of playing Solitaire at 800x600 on Windows 3.1 (it didn't scale, but it was more "roomy") while, at the same time, Doom 2 at 200p was all the rage; years later, running a 3D game at 640x480 and 30 fps was state of the art.

So, 720p with high-end assets and rendering seems quite high end, at least to me?
It's a hair below typical 15.6" laptop resolution.
For advertisement, they can show it on a 2160p TV and let the TV handle the scaling from 720p.
If the console can output 2160p60 by itself (i.e. you don't have to trust some random TV scaler, or end up scaling twice), that could be a feature enabled only when you use the dock. By that I mean that running the HDMI interface at such a high bitrate may take some of the heat budget. If docked, that's not a problem: the battery won't drain, and the consumer won't notice it running slightly warmer. I am actually wondering about that. Does such an output use half a watt? A watt?
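To put a rough number on the link itself (a sketch only: the timing below is the standard CEA-861 one for 2160p60 over HDMI 2.0, while the transmitter power figure is my ballpark assumption, not a datasheet value for any specific part):

```python
# Back-of-envelope for the 2160p60 HDMI link the post above wonders about.
# Assumes 8 bpc RGB over HDMI 2.0 TMDS: 10 bits on the wire per 8 bits of
# data, three data lanes, and 4400x2250 total timing (incl. blanking).
pixel_clock = 4400 * 2250 * 60        # 594 MHz, the HDMI 2.0 maximum
wire_rate = pixel_clock * 10 * 3      # aggregate bitrate across the three lanes
print(f"{wire_rate / 1e9:.2f} Gb/s")  # 17.82
```

Transmitter PHYs driving that kind of rate are typically quoted in the low hundreds of milliwatts (again, an assumption rather than a figure for this hardware), so "half a watt" for the whole output path looks like the right order of magnitude.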
 
We don't know, but that is why we are speculating. And saying it's Pascal is more wrong than saying it's Maxwell, because if there are already sources saying the devkit is Maxwell, and only speculation that it is Pascal, then it follows that Maxwell is the more likely option. Unless, of course, people know things they aren't disclosing.
Nobody is saying it's Tegra on FinFET, only that it is possible.
Turning the argument around: back when the configuration of the devkit was decided, was a FinFET Tegra available at all? So, barring actual knowledge of the final product, the configuration of the devkit doesn't provide any information either way.
At most, we're looking at a factor of two or so in graphics capabilities.
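For anyone wanting to sanity-check that "factor of two", here is the kind of napkin math behind it (the X1 numbers are public; the Pascal clock is a guess loosely based on Parker, not a leak):

```python
# FP32 throughput: Tegra X1 (Maxwell) vs. a hypothetical 16nm Pascal Tegra.
cores = 256                      # both X1 and Parker carry 256 CUDA cores
x1_ghz, pascal_ghz = 1.0, 1.5    # GHz; ~1.5 is roughly where Parker's GPU sits
x1 = cores * 2 * x1_ghz          # 2 FLOPs/core/clock (FMA) -> 512 GFLOPS
pascal = cores * 2 * pascal_ghz  # -> 768 GFLOPS
print(pascal / x1)               # 1.5x from clocks alone; double-rate FP16
                                 # (which both chips support) is what stretches
                                 # estimates like this toward 2x
```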
 
But the advertisement shows it as a tool to play some impressive game on the wall-mounted screen, i.e. that one where you're a first-person barbarian clubbing things to death in a lush medieval forest.
So, 720p might be the minimum resolution, and hence the LCD's resolution.
 
yes he speculates at 6:25, and also mentions all sources are pointing to it being Maxwell. think about about that for a second, those are developers that have devkits, why didn't they mention to him anything about it probably being pascal in final hardware?
It really boils down to the process the SoC is manufactured on. If it's 28nm, it will be Maxwell-based; if it's 16nm, it will be Pascal-based. Why? Because Pascal is already on 16nm, and shrinking Maxwell to 16nm would mean additional work along with possible problems and bugs. The same goes for the CPU: if the SoC is 16nm, the CPU part will be closely related to Parker's CPU part.

Regarding the devkits, I stated this before and I'll say it again: I don't think anyone (outside Nintendo, possibly) has a devkit based on final hardware yet. Current devkits may not represent the final hardware. Nintendo expects to ship only 2 million Switch units by the end of their current fiscal year, ending March 2017, which means production is only just starting. If they were using a customized X1, production would probably be much further along.
 
There will be a DS Lite-style upgrade. It's quite bulky now.

Oh, I'm sure there's going to be a phone-sized version eventually, or at least something of that nature. What I mean is upgrades to the tablet, and thus the whole console. For instance, say it's using Maxwell right now; in two or three years, they could announce a new version of the tablet portion, except with more RAM and Volta. Sort of like PS4 to PS4 Pro.

So basically, you buy the new, better unit, plug it in at home and voila, you've upgraded to a more powerful console.

Do you guys think Nintendo will go down that route for the Switch?
 
It really boils down to the process the SoC is manufactured on. If it's 28nm, it will be Maxwell-based; if it's 16nm, it will be Pascal-based. Why? Because Pascal is already on 16nm, and shrinking Maxwell to 16nm would mean additional work along with possible problems and bugs. The same goes for the CPU: if the SoC is 16nm, the CPU part will be closely related to Parker's CPU part.

Or there's been a 16nm Maxwell all along in some "blueprint" form, and it was cancelled in favor of the 28nm versions (GM2xx) and 16nm Pascal. Work on the Tegra for Nintendo would have started before work on the Tegra X2 [in that alternate reality, at least]. Well, we're left to wonder.
 
Isn't the Tegra X1 manufactured on the 20nm process? It's possible that the Tegra X1 was a placeholder for development kits, but that would still mean the target performance is very much in line with the Tegra X1. It's certainly possible that the chip in the development kits is overclocked to help simulate the improvements brought by the custom chip. I would have to assume final development kits are going out by now, right?
 
Isn't the Tegra X1 manufactured on the 20nm process? It's possible that the Tegra X1 was a placeholder for development kits, but that would still mean the target performance is very much in line with the Tegra X1. It's certainly possible that the chip in the development kits is overclocked to help simulate the improvements brought by the custom chip. I would have to assume final development kits are going out by now, right?
I assume the Shield may indeed be the temporary kit until final hardware is ready...
 