Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
What about the eSRAM/eDRAM possibility, or even a Vita-style Wide I/O VRAM? This idea seems to have finally died for both Sony and Microsoft, but Nintendo might still want to play with eDRAM.

On 16/14nm, 16MB would be very small. It depends on whether Nvidia can work with a process compatible with eDRAM at that size, or whether it's worth the trouble at all.
 
The "leaks" point towards Maxwell instead of Pascal, which is disappointing, especially for a handheld (battery life). But I could see Nintendo upping clocks while in the dock, much like how laptops behave on and off battery.
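A quick back-of-envelope on what docked vs. portable clocks could mean for peak FP32 throughput. The 256-core count matches the rumored TX1 configuration; both clock figures below are purely illustrative guesses, not leaked numbers:

```python
# Peak FP32 GFLOPS = cores x clock (GHz) x 2 (one FMA = 2 ops per cycle).
# 256 cores matches the rumored TX1; both clocks are illustrative guesses.
def gflops(cores, clock_ghz):
    return cores * clock_ghz * 2

portable = gflops(256, 0.768)  # hypothetical reduced clock on battery
docked = gflops(256, 1.0)      # TX1's nominal ~1 GHz GPU clock
print(portable, docked)        # 393.216 512.0
```

Even a ~25% clock drop on battery would cut peak throughput by the same fraction, which is why a dock-dependent clock scheme is plausible.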
 
The leaks are of the devkit, which is limited to Maxwell as what was available at the time, no? We haven't had a leak of the final hardware. What have other platforms done? Quick Googlage says the PS4's final devkit only became available about 3 months before launch. The 360 was a very different beast in devkits before the final silicon taped out - what was the timeline on those?
 
This is a device designed to be cost effective, IMO. However, it will be faster than the Wii U and orders of magnitude faster than the 3DS. Even if it were close to Xbox One power, it would still be less, and that wouldn't provide the best experience for multi-platform releases, so ultimately who cares if it's 1/4 or 1/2 of Xbox One power? This is a console for Nintendo games.

Faster than Wii U means little because the Wii U has terrible hardware. Whether it's 1/4 or 1/2 of the Xbox One will determine whether the number of multiplatform games is zero or more than zero.
If Nintendo were aiming at zero multiplatform games (i.e. a console for Nintendo games), I don't think they would have devoted about a third of the reveal video to a multiplatform title running on the new console.
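For scale, here's the same kind of back-of-envelope against the Xbox One's ~1.31 TFLOPS peak. The 256-core / ~1 GHz figures are just the rumored TX1 numbers; everything here is ballpark, not a confirmed spec:

```python
# How a 256-core Maxwell part at ~1 GHz compares to the Xbox One's
# ~1.31 TFLOPS peak FP32; all figures are ballpark, not confirmed specs.
def gflops(cores, clock_ghz):
    return cores * clock_ghz * 2  # one FMA = 2 FP32 ops per cycle

switch_guess = gflops(256, 1.0)           # 512 GFLOPS
xbox_one = 1310.0                         # ~1.31 TFLOPS
print(round(switch_guess / xbox_one, 2))  # 0.39
```

So on paper a TX1-class part lands right in that "between 1/4 and 1/2 of XB1" window, before accounting for architecture and bandwidth differences.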

About being cost effective: they have already stated they will sell the $200 3DS in parallel with the Switch, meaning the Switch is probably not going to be cheap. Most probably not below $300.
 
The leaks are of the devkit, which is limited to Maxwell as what was available at the time, no? We haven't had a leak of the final hardware. What have other platforms done? Quick Googlage says the PS4's final devkit only became available about 3 months before launch. The 360 was a very different beast in devkits before the final silicon taped out - what was the timeline on those?

Shouldn't we be looking at the PS4 Pro devkits availability and not something from 3.5 years ago?
 
Shouldn't we be looking at the PS4 Pro devkits availability and not something from 3.5 years ago?
I think any console is a decent reference point for how close to launch final hardware appears. There's certainly precedent for final hardware arriving just months before launch.
 
Don't dev kits usually overshoot the specs? Like how the PS4 and Xbox One dev kits had 7970s in them when the end consoles were about half as powerful?
 
http://www.neogaf.com/forum/showpost.php?p=222110731&postcount=1032

More rumors from Emily Rogers, a very reliable source, backing Eurogamer's claim that it will be a Tegra X1.
I am not sure what to make of that, tbh.
She says she was told it was 'pretty similar to the X1'; but how much of a departure is the Drive PX2 from the original Drive, or Pascal from Maxwell? There are close similarities between those platforms and their previous generations, so it comes down to what 'pretty similar' means.
If she had come out and said it was a custom X1, then sure, but the info that was passed on to her is unfortunately just a bit too vague.
And yeah, she is one of my favourite sources/reporters for this kind of leak info as well.
Edit:
As others mentioned, the devkit was reported to be the off-the-shelf Nvidia X1 dev kit, as there was probably no 'X2' available (if it even exists) until Q3 this year, and then only as a very early sample outside of the Drive PX2. So the info could be getting confused, or the Switch could be an X1 device, but the info Emily was given is too vague to conclude anything, IMO.
Especially since a while back quite a few people were running the numbers for what to expect, comparing the X1 to a custom 'X2' (using the info provided about the Drive PX2).
Cheers
 
Don't dev kits usually overshoot the specs? Like how the PS4 and Xbox One dev kits had 7970s in them when the end consoles were about half as powerful?
True, but then you need to work out the logistics of when a custom 'X2' could have been created for Nintendo; the Drive PX2 in all its forms would have exacerbated any delay, as that would be Nvidia's priority.
I cannot see a custom version of the Tegra 'X2' platform until late Q3 at best, Q4 realistically, if they were working on it.
Cheers
 
Don't dev kits usually overshoot the specs? Like how the PS4 and Xbox One dev kits had 7970s in them when the end consoles were about half as powerful?
Those were alpha kits, right? In the case of the XB360, the PowerMac devkit was basically just to give devs something with the PPC ISA to work with. We're not talking alpha kits here but near-release devkits (though not final hardware). If the final hardware is the TX1, it makes sense to have a TX1 in the devkits. And if the final hardware is a TX2 or something else, it still makes sense to have a TX1 in the devkits until that's available.
 
But they are using none of them...

Basically what I mean is that Nintendo is not buying any existing chip. Neither are they buying GPU IP or a GPU block and implementing it on their own SoC. Nintendo asked for a custom SoC with X capabilities and a complete software stack, for a price of Y. So I'd want concrete evidence that implementing a 256-core Maxwell GPU would be more economical for Nvidia than implementing a 256-core Pascal GPU.
 
Basically what I mean is that Nintendo is not buying any existing chip. Neither are they buying GPU IP or a GPU block and implementing it on their own SoC. Nintendo asked for a custom SoC with X capabilities and a complete software stack, for a price of Y. So I'd want concrete evidence that implementing a 256-core Maxwell GPU would be more economical for Nvidia than implementing a 256-core Pascal GPU.

Short term, both Nintendo and Nvidia can get huge wins: Nintendo can get Maxwell dirt cheap in a clearance sale, and Nvidia can get rid of unwanted inventory without having to sell their new tech cheap. They probably also have a deal for a revised version with Pascal in the near future; it's pretty safe for both parties.
 
Don't dev kits usually overshoot the specs? Like how the PS4 and Xbox One dev kits had 7970s in them when the end consoles were about half as powerful?

Not sure about that, but the more relevant question, IMO, is: has any company ever released a dev kit that is significantly slower than the final hardware (like Maxwell relative to Pascal)?

And I'm not really aware of any, although I haven't really kept track of all devkits through the years.

IMO, if Pascal was the target GPU of final hardware, they wouldn't be using a TX1 for the devkits.

Regards,
SB
 
Basically what I mean is that Nintendo is not buying any existing chip. Neither are they buying GPU IP or a GPU block and implementing it on their own SoC. Nintendo asked for a custom SoC with X capabilities and a complete software stack, for a price of Y. So I'd want concrete evidence that implementing a 256-core Maxwell GPU would be more economical for Nvidia than implementing a 256-core Pascal GPU.
Why do you assume an equal number of cores?
 
Short term, both Nintendo and Nvidia can get huge wins: Nintendo can get Maxwell dirt cheap in a clearance sale, and Nvidia can get rid of unwanted inventory without having to sell their new tech cheap. They probably also have a deal for a revised version with Pascal in the near future; it's pretty safe for both parties.

But there is no inventory... it's a custom SoC...

Why do you assume an equal number of cores?

And why should I assume a different number of cores? Especially when the rumors say it is 256 cores, but it's not clear whether the final hardware will be Maxwell or Pascal.
 
Custom could mean anything, even a slight modification to an existing TX1. People really over-exaggerate the word 'custom'. The PS4 and Xbox One GPUs are custom, and they are just cut-down versions of the originals.
Not really. Both have notable customisations. The PS4 has new APU features, 8 ACEs, and so on. The XB1 has DX12 enhancements, a few extra gubbins (3 display planes, 2 more DMA units), a DSP, yada yada. The customisations aren't massive, but they are different from 'just cut down versions of the originals'.

Not to mention 8 CPU cores!
 