Nintendo announce: Nintendo NX

Add stereoscopic cameras and a 3D screen for AR; that should make the next generation of Pokemon AR very popular ;)
(OK, not as much as on mobile, since you don't need to buy a console when you already have a phone ^^)
 
Eurogamer also wrote this article:
IMO, the article makes no sense overall because it's based on the assumption that it's either a Tegra X1 or a Tegra X2.
There's never been a single console, from Nintendo or anyone else, that used an off-the-shelf SoC. The Tegra X1 isn't appropriate for a console because the Cortex A57 module in there is meant for short bursts of high power consumption on short-lived single-threaded tasks (mostly javascript?). The Tegra X2 makes even less sense with that weird Denver+A57 core mix.

Righto, so you'd have to fish for the HDMI cable every time you want to use it, or have it dangling messily off the side of your TV when you're not using it? (And the charging cable as well, btw.) Awesome suggestion! Why not simply have a small dock there which holds the cables in place, and also holds the console in place while docked, instead of having it loitering like a bum around your AV gear?
The ability to use HDMI cables shouldn't be mutually exclusive with having a proper dock with all connections in place. The Surface Pro line has dedicated USB and Mini-DP ports plus a Thunderbolt-esque proprietary connector for charging and connecting to the dock (which adds audio + video + USB ports).


Why even fuck with HDMI cords? Why not use the existing wireless Android screen-casting technology that all modern phones and tablets support?
Latency and not getting compression artifacts?
IIRC, at least until a while ago it was almost impossible to use a Chromecast to play games through an Android phone/tablet, because of the ~1s latency. Same with Intel's WiDi solutions.


Put a camera in a base that charges the console and hooks up to the TV, then let the player put the console on their head and play VR games. With IR tracking it would be able to track the controllers and the visor.
The NX won't do VR.
 
It is pretty safe to leave VR out of the discussion IMHO; it won't happen.
The performance target is pretty clear: run Wii U games at the device's screen resolution. Its purpose seems to be to unify home and handheld console performance and offer traditional Nintendo gameplay (from the retro experience to the Wii/Wii U) on the go. I suspect the split controllers will prove dangerously close to Wiimotes in their capabilities.
I can see how it makes sense for Nintendo to have three systems offering a unified experience. The thing is, the home console requires different hardware than the handheld and the NX, thanks to the rise in TV resolution. Nintendo controls the resolution on its portable device, and from there it is possible to offer last-gen (and better) types of experience with cheap hardware. Custom hardware from Nvidia could power both the NX and a hypothetical new DS.
I would appreciate something minimal but at least up-to-date.
 
We know AMD flops are not equal to Nvidia flops. What about mobile SoCs versus consoles? Can we really say that mobile SoCs already beat last-gen consoles just because of higher GFLOPS?

In this video:

Metal Gear Rising has much better performance on the Xbox 360 than on the Shield TV.

And don't forget Doom. Although it runs at 1080p/60 on the Shield TV, the image quality is not as good as on last-gen consoles.
 
IMO, the article makes no sense overall because it's based on the assumption that it's either a Tegra X1 or a Tegra X2.
There's never been a single console, from Nintendo or anyone else, that used an off-the-shelf SoC. The Tegra X1 isn't appropriate for a console because the Cortex A57 module in there is meant for short bursts of high power consumption on short-lived single-threaded tasks (mostly javascript?). The Tegra X2 makes even less sense with that weird Denver+A57 core mix.
Is it really used only for short bursts? Why would it be, in the Shield TV? There shouldn't be any thermal issue, as they can use active cooling. Power-wise it shouldn't be a problem either; there is no battery, etc.

The X1 CPU still uses the old style of big.LITTLE, where the OS can only switch between the two cluster configurations (in newer big.LITTLE implementations all cores can run at the same time and the OS balances the threads across them). This could be interesting for switching between gamepad and stationary usage.
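To make the distinction concrete, here's a rough conceptual sketch (plain illustrative Python, nothing resembling real scheduler code) of the difference between TX1-style cluster switching and newer heterogeneous multi-processing:

```python
# Conceptual sketch only, not real scheduler code: cluster switching
# (old-style big.LITTLE, as on the Tegra X1) versus heterogeneous
# multi-processing, where all cores are schedulable at once.

A57_CLUSTER = ["A57_0", "A57_1", "A57_2", "A57_3"]  # "big" cores
A53_CLUSTER = ["A53_0", "A53_1", "A53_2", "A53_3"]  # "LITTLE" cores

def schedulable_cores_cluster_switching(high_load):
    """Old scheme: the OS sees one cluster at a time and migrates
    everything over when the load profile changes."""
    return A57_CLUSTER if high_load else A53_CLUSTER

def schedulable_cores_hmp():
    """Newer scheme: all eight cores can run simultaneously and the OS
    balances individual threads between big and LITTLE cores."""
    return A57_CLUSTER + A53_CLUSTER

# The hypothetical stationary/gamepad split suggested above:
print(schedulable_cores_cluster_switching(high_load=True))   # docked / stationary
print(schedulable_cores_cluster_switching(high_load=False))  # on the gamepad screen
print(schedulable_cores_hmp())
```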
 
You trollin', mate?
We had that conversation a few pages ago (I think p98). It means the peak numbers don't give a good way to compare two different architectures. They give you a clue if the numbers differ by integer factors, but if an XB1 has 20% more GCN flops than the NX has Maxwell/Pascal flops, that's really nothing you can put your finger on to claim either of them is faster.
 
Metal Gear Rising has much better performance on the Xbox 360 than on the Shield TV.
Flops on the GPU are only a small part of the whole performance equation, which includes CPU, RAM, OS/API overheads, and title quality (cheap port or focussed development).

Furthermore, that one example isn't representative as the video itself says. There are other X1 games (Doom 3 BFG edition) that outperform their 360 counterparts. The video also mentions flops and how the X1 is a class apart.

So, in short, you shot yourself in the foot with that reference. :p
 
Flops on the GPU are only a small part of the whole performance equation, which includes CPU, RAM, OS/API overheads, and title quality (cheap port or focussed development).

Furthermore, that one example isn't representative as the video itself says. There are other X1 games (Doom 3 BFG edition) that outperform their 360 counterparts. The video also mentions flops and how the X1 is a class apart.

So, in short, you shot yourself in the foot with that reference. :p

DF also flagged in their article that Revengeance and RE5 were titles that ran worse on Shield than on 360, but put that down to shoddy DirectX ports.
 
Is it really used only for short bursts? Why would it be, in the Shield TV?

The Shield TV isn't tweaked for mobile usage, since it's always connected to the power outlet. AFAIK, its SoC has the Cortex A53 module always disconnected, the A57 module runs at the full 2GHz all the time and all the power saving features are turned off. That's why the power consumption at the wall is close to 20W (actual SoC consumption should be closer to 15W, considering ~80% PSU efficiency and 1W for peripherals).
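For what it's worth, the arithmetic behind that estimate (every figure is an assumption, as stated above):

```python
# Back-of-the-envelope check of the figures above (assumed, not measured):
# ~20 W at the wall, ~80% PSU efficiency, ~1 W for peripherals.

wall_power_w = 20.0     # draw measured at the outlet (assumption from the post)
psu_efficiency = 0.80   # assumed AC/DC conversion efficiency
peripherals_w = 1.0     # assumed draw of USB, LAN, etc.

soc_power_w = wall_power_w * psu_efficiency - peripherals_w
print(f"Estimated SoC power: ~{soc_power_w:.0f} W")  # -> ~15 W
```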

The Pixel C should be more representative of what to expect from a possible Nintendo handheld with a TX1, though I don't think the NX has a TX1.




Flops on the GPU are only a small part of the whole performance equation, which includes CPU, RAM, OS/API overheads, and title quality (cheap port or focussed development).

Furthermore, that one example isn't representative as the video itself says. There are other X1 games (Doom 3 BFG edition) that outperform their 360 counterparts. The video also mentions flops and how the X1 is a class apart.

So, in short, you shot yourself in the foot with that reference. :p

To be frank, the X360 versions probably had a lot more man-hours for low-level optimizations than the Tegra adaptations. DF's comparison may be the fairest possible, but it's probably not representative of what the TX1 should be able to produce.
 
Well, at the very least the NX should have a RAM advantage compared to last-gen consoles. I still suspect 4 GB of RAM or more.
 
We had that conversation a few pages ago (I think p98). It means the peak numbers don't give a good way to compare two different architectures. They give you a clue if the numbers differ by integer factors, but if an XB1 has 20% more GCN flops than the NX has Maxwell/Pascal flops, that's really nothing you can put your finger on to claim either of them is faster.

True.

However, for example, if the NX has 512 Maxwell/Pascal GFLOPS and the Xbox One has 1310 GCN GFLOPS, I really doubt the NX can match the Xbox One; that (2.5x) gap wouldn't be overcome by architecture differences and Nvidia-vs-AMD flops alone. As far as officially stated raw flops go, the Tegra X1 has 39% of the performance of the Xbox One GPU. Let's give Maxwell some benefit and round up to 50%.
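The ratio quoted there works out as follows (peak numbers only, so all the caveats above about comparing architectures apply):

```python
# Peak-FLOPS ratio from the figures in the post; these are theoretical
# maximums and say nothing about delivered performance.

nx_gflops = 512.0         # hypothetical Maxwell/Pascal figure from the post
xbox_one_gflops = 1310.0  # Xbox One GPU peak (12 CUs x 64 ALUs x 2 ops x 853 MHz)

print(f"Gap: {xbox_one_gflops / nx_gflops:.1f}x")                   # -> 2.6x
print(f"NX as a share of XB1: {nx_gflops / xbox_one_gflops:.0%}")   # -> 39%
```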
 
I guess, according to the latest GAF rumors, the GPU will be Pascal-based.

How much of a Pascal GPU would you need to run an Xbone/PS4 game at 540p (assuming that is the screen resolution)? It's probably doable on a 256 gigaflop handheld.
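A quick sanity check of that guess using naive pixel-count scaling (my own rough sketch; it ignores per-frame costs that don't scale with resolution, such as geometry and shadow maps):

```python
# Naive resolution scaling from an Xbox One 1080p target down to an
# assumed 960x540 handheld screen. Real workloads don't scale this cleanly.

px_1080p = 1920 * 1080
px_540p = 960 * 540
xbox_one_gflops = 1310.0

print(f"Pixel ratio: {px_1080p / px_540p:.1f}x")                                  # -> 4.0x
print(f"Scaled GPU budget: ~{xbox_one_gflops * px_540p / px_1080p:.0f} GFLOPS")   # -> ~328
```

So a 256 GFLOPS part would still sit a bit below straight pixel scaling of a 1080p Xbox One title, before accounting for architectural efficiency or FP16.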
 
Latency and not getting compression artifacts?
IIRC, at least until a while ago it was almost impossible to use a Chromecast to play games through an Android phone/tablet, because of the ~1s latency. Same with Intel's WiDi solutions.

Nintendo knows a thing or two about that ;)
Remember the Wii U: that's perhaps the most used and best form of game streaming over WiFi, but they used 480p over 5GHz.
It's a favorable case, since they control hardware and software on both ends, and they could do it again by building the WiFi hardware into the dock and upgrading to an H.265 configuration instead of an H.264 one.

Yet I can see two problems with that: it makes the dock more expensive, and unlike the Wii U you have to add the TV's lag. The lag might be more noticeable on a giant TV, and the audio might be very slightly out of sync. The 5GHz spectrum might be crowded too, whereas it was virtually unused at the Wii U's launch.
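As a very rough illustration of how that TV lag stacks on top of the wireless link (every number here is an assumption for the sake of argument, not a measurement):

```python
# Illustrative latency budget for streaming the handheld's output to a TV.
# All values are assumptions; actual encode/decode times and TV lag vary widely.

frame_ms = 1000 / 60   # one 60 Hz frame of capture delay
encode_ms = 5.0        # assumed hardware H.265 encode
wifi_ms = 3.0          # assumed 5 GHz hop, uncongested
decode_ms = 5.0        # assumed hardware decode on the dock/TV side
tv_lag_ms = 30.0       # typical TV processing outside of game mode (assumed)

total_ms = frame_ms + encode_ms + wifi_ms + decode_ms + tv_lag_ms
print(f"Added latency: ~{total_ms:.0f} ms (~{total_ms * 60 / 1000:.1f} frames at 60 Hz)")
```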


Late edit: wireless display, and plain wireless anyway, would deplete the handheld's battery.
 
How much of a Pascal GPU would you need to run an Xbone/PS4 game at 540p (assuming that is the screen resolution)? It's probably doable on a 256 gigaflop handheld.
Maybe for the 1080p games on XO. They'd probably want to alter geometry LOD though.

Shader aliasing could get fugly. :p
 
I suspect the comparisons between the NX and the Xbox One are completely out of place; one system is a 100+ watt device and the other is not. Comparing the Shield TV with the Xbox One is more or less the same: different silicon budget, power budget, OS, amount of RAM and storage. They are not in the same league.

I would be happy to see Nintendo deploying something like this:
1x A73 with 2 MB of L2 and 4x A35 with 1 MB of L2; one GPC, one SM, 128 CUDA cores able to process FP16 at twice the FP32 rate, 1 triangle every two cycles, 4 ROPs and a good amount of L2.
As for the RAM, I would be happy with 1GB of fast RAM. For a device with a single screen I would stick to 480p/FWVGA, as there is a lot of content optimized for that resolution, more than for qHD/540p or the 600x1024 that appeared on many cheap tablets.

If Nintendo could deploy that SoC in both the NX and a hypothetical new DS, it could really bring handhelds into a new performance bracket, now that Sony has given up on the Vita.
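Peak throughput of that hypothetical part, at a clock I'm assuming purely for illustration (the proposal above doesn't give one, and a handheld would likely clock lower):

```python
# Theoretical peaks for the single-SM configuration sketched above.
# The 1 GHz clock is my assumption, not part of the proposal.

cuda_cores = 128
rops = 4
tris_per_clock = 0.5     # 1 triangle every two cycles
clock_ghz = 1.0          # assumed

fp32_gflops = cuda_cores * 2 * clock_ghz   # 2 FLOPs per core per clock (FMA)
fp16_gflops = fp32_gflops * 2              # double-rate FP16
fillrate_gpix_s = rops * clock_ghz
tri_rate_mtris_s = tris_per_clock * clock_ghz * 1000

print(f"{fp32_gflops:.0f} GFLOPS FP32 / {fp16_gflops:.0f} GFLOPS FP16")
print(f"{fillrate_gpix_s:.0f} Gpix/s fill, {tri_rate_mtris_s:.0f} Mtris/s setup")
```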
 
If they went with FP16, that would add even more headaches for developers, as it is not just about porting/development HW considerations but also about moving key aspects of the game's rendering engine from FP32 to FP16.
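As a minimal illustration of the kind of issue that means in practice (a sketch using NumPy half-precision, not anything from an actual engine):

```python
import numpy as np

# FP16 carries ~10 bits of mantissa (~3 decimal digits), which is fine for
# colours and lighting terms but quickly loses precision on large values
# such as world-space positions.

pos32 = np.float32(4096.37)
pos16 = np.float16(pos32)
print(pos32, pos16)              # 4096.37 vs 4096.0 -- the fraction is gone

colour = np.float16(0.5) * np.float16(0.733)
print(colour)                    # small [0, 1] values survive fine
```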
Cheers
 
We had that conversation a few pages ago (I think p98). It means the peak numbers don't give a good way to compare two different architectures. They give you a clue if the numbers differ by integer factors, but if an XB1 has 20% more GCN flops than the NX has Maxwell/Pascal flops, that's really nothing you can put your finger on to claim either of them is faster.

It's not exactly comparable there, where on PC AMD was hampered by less well-tuned drivers for Dx11 and OGL. That's not an issue on consoles, so you could also say that console AMD FLOPs are greater than PC AMD FLOPs. However, in Dx12 and Vulkan we're seeing instances where AMD hardware is scaling closer to its peak FLOPs as well.

In other words, "Nvidia FLOPs > AMD FLOPs" is just an internet meme resulting from AMD's previously less optimized Dx11/10/9 and OGL drivers compared to Nvidia's. It wasn't that the FLOPs themselves were less capable.

Regards,
SB
 