Nintendo announce: Nintendo NX

The XO at 1.3 TF is already running at more or less 900p against the PS4's 1080p in multiplatform games, with some titles even dropping to 720p (Frostbite).
Getting PS4 fidelity at 800p in a portable with 768 GFLOPs is stretching it a bit.
Like rapso said, it would depend a bit on how well the developers could make use of the 2*FP16 feature (assuming it passes on from TX1 to NX) and on the fact that Maxwell and Pascal GPUs get better ALU utilization than GCN.
If "approaching XBone" means being within 80% of its performance, a Pascal doing 768 GFLOPs could do the trick.


I'd worry way more about memory bandwidth and CPU performance if it's really X1. (quad core ARM + 25GB/s)
Eurogamer suggested it would be a Pascal part, and the TX2's apparent focus on CPU performance doesn't make it particularly well suited to a gaming console.
Besides, when did any console maker ever use a 100% off-the-shelf component?


The main constraint on the NX is the cooling solution (even with a dock). Its main unit should be smaller than a 7-inch tablet, and so far we haven't seen any 7" tablet with a GPU better than Apple's A9X.
It doesn't have to be as thin as a tablet. That would be too restrictive, because it's not a tablet. The Wii U tablet controller isn't thin, and the original SHIELD handheld was actively cooled.


And sales are...? AFAIK the main purpose/value for Vita TV is streaming PS4 to another room.
Thousands of people are using DVD players with 1080p TVs. The Vita TV sold really well in Japan, as the platform has been really popular for visual novels and JRPGs.
Regardless, let's hope the target resolution isn't 960*540p. That does look terrible by today's standards.


I think anyone expecting more than 200 gigaflops for the NX in handheld is setting themselves up for disappointment.
But expecting the console to have 60% of the performance of a two-year-old Tegra K1 that is currently going into the $200 Shield Tablet is not?
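For reference, here is the quick peak-number arithmetic behind that 60% figure (using the K1's published specs, so treat it as an upper bound rather than delivered performance):

# Tegra K1 (32-bit): 192 Kepler CUDA cores at up to ~950 MHz
cores = 192
clock_ghz = 0.95
flops_per_core_per_clock = 2              # one fused multiply-add counts as 2 FLOPs

k1_peak_gflops = cores * flops_per_core_per_clock * clock_ghz   # ~365 GFLOPS FP32
print(200 / k1_peak_gflops)               # ~0.55, i.e. roughly the 60% mentioned above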
 
Ugh. I'm really hating this concept of AMD flops and Nvidia flops. It really needs to die. That's like saying there are UK meters and Canadian meters. They are the same meters! But people are treating this like Imperial vs. Metric while calling it the same name! In this case, flops!

At the end of the day it's just math. If the formula to calculate flops should include a variable like utilization or something, then factor it in! There needs to be some "variable" that can help translate flops, since flops are a base measurement; it's just not being calculated consistently, and it's highly confusing!


 
You're right, but I think the discussion is sanely factoring that in among those who know. Saying the flops "aren't the same" is just shorthand for differences in utilisation and work done per flop.
 
I think it's very possible for this to be a pretty standard Tegra X1 chip. Depending on what price point Nintendo is shooting for, it could make a $199 launch price very attainable. I look at this as Nintendo's way of combining its markets without really alienating either gamer. If you preferred the Wii U, the NX will give you everything the Wii U did and more, and the same goes for the 3DS. The Tegra X1 is a pretty nice step up from the Wii U hardware, and a huge step up from 3DS performance. Nintendo needs a concept where their product can appeal to everybody. If it's a product designed to be a direct competitor to the XB1/PS4, it's going to fight an uphill battle, but it can work if it's something very different from those products that still appeals to those gamers despite the fact that they already own an XB1/PS4, in similar fashion to the way a ton of Xbox 360 and PS3 gamers also owned a Wii.
 
What kind of screen will NX adopt? A glass-free 3D screen? OLED screen? Or a common LCD screen?

Personally I'd prefer an OLED panel because of the perfect contrast and wider color gamut. Since Nintendo has gone with the Tegra X1, maybe we can also expect a good screen?
 
The Shield TV draws about 20 W when gaming. The handheld would need to draw less than that, wouldn't it, unless the battery is huge?

Even if they go with a Tegra X2, wouldn't 16nm and Pascal at best only go toward lowering the power draw?
 
Ugh. I'm really hating this concept of AMD flops and Nvidia flops. It really needs to die. That's like saying there are UK meters and Canadian meters. They are the same meters! But people are treating this like Imperial vs. Metric while calling it the same name! In this case, flops!

At the end of the day it's just math. If the formula to calculate flops should include a variable like utilization or something, then factor it in! There needs to be some "variable" that can help translate flops, since flops are a base measurement; it's just not being calculated consistently, and it's highly confusing!


The other problem is that it's all based around PC. We've seen recently (Doom on Vulkan) that even on PC, when the code takes advantage of the architecture, like it would on a console, those utilization-type correction factors are pretty worthless.
 
I really hope the NX will be strong enough to run Frostbite games. I would really like a FIFA I could play both at home and on the go.

BTW, would it be possible to do a portable PS3 the size of a Vita? I think that would be a really good niche hardware market for Sony.
 
The Shield TV draws about 20 W when gaming. The handheld would need to draw less than that, wouldn't it, unless the battery is huge?
Google's Pixel C uses a Tegra X1 and its battery life is just fine. Clocks are ~15% below the Shield TV, IIRC.

The lower performance target is because it will be mobile and needs to hit the 6.5 to 8 hours of battery life if they want any chance of being accepted by consumers.
Why does it need to hit 6.5 to 8 hours? Both the Vita and the 3DS had a battery life of 3-5 hours (for gaming) when they released.
3-5 hours would be more than enough, especially if you can charge it through USB.
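To illustrate why 3-5 hours is plausible for a Tegra-class handheld, here is a back-of-the-envelope with entirely made-up, purely illustrative battery and power figures:

# Hypothetical numbers for illustration only; nothing here is a leaked or confirmed spec.
battery_wh = 16.0          # e.g. a ~4300 mAh cell at 3.7 V, a mid-sized tablet battery
handheld_draw_w = 4.0      # SoC throttled for handheld mode plus screen and backlight

print(battery_wh / handheld_draw_w)   # 4.0 hours, squarely in the Vita/3DS launch ballpark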

As for everyone's predictions based on the actual release date of March 2017, please remember that the original release target for the Nintendo NX was much earlier.
Initial rumors placed it as a Holiday 2015 release. Yes, 2015. It was delayed a year to Holiday 2016 and then pushed again to Spring 2017. I think everyone needs to temper their predictions accordingly. I expect it to be 25% less performant than the current Nvidia Shield set-top box.
The release window has always been "fiscal 2016", which effectively goes until March 2017.
I don't remember seeing anything about Holiday 2015.

Just more of my opinions... There won't be any digital distribution for the games. The local storage will only be 16 GB (or less), to hold the OS and save data.
All Nintendo consoles, handheld and home, released during the last 10 years, have SD card readers.
From Wii and DSi to Wii U and New 3DS, all of them.
 
Like rapso said, it would depend a bit on how well the developers could make use of the 2*FP16 feature (assuming it passes on from TX1 to NX) and on the fact that Maxwell and Pascal GPUs get better ALU utilization than GCN.
If "approaching XBone" means being within 80% of its performance, a Pascal doing 768 GFLOPs could do the trick.
That means the Xbox One would be nearly 70% faster, rather than 20%, if the NX had 768 GFLOPs.
Unfortunately it still does not add up, especially considering the Emily Rogers leak, whose sourcing has been correct so far about Nintendo's next console.
And just how much FP32 work can FP16 realistically replace in games running at 1080p (for TV, although I appreciate developers will use temporal techniques)?
Cheers
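For clarity, the raw-FLOPS arithmetic behind that objection:

xbone_gflops = 1310    # Xbox One GPU peak (768 ALUs * 2 FLOPs * 853 MHz)
nx_gflops = 768        # the hypothetical Pascal figure being discussed

print(xbone_gflops / nx_gflops - 1)    # ~0.71, i.e. the XB1 would be ~70% faster on paper
print(nx_gflops / xbone_gflops)        # ~0.59, i.e. the NX would have ~59% of the XB1's raw FLOPs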
 
Eurogamer suggested it would be a Pascal part, and the TX2's apparent focus on CPU performance doesn't make it particularly well suited to a gaming console.
Besides, when did any console maker ever use a 100% off-the-shelf component?
Hardware is always customized, I agree, but it's never maxed out and rarely cutting-edge tech. E.g. the PS3 shipped with a G70-ish GPU while, at pretty much the same time, NV released the G80.
I would think that, if we focus on the handheld (kind of a 3DS replacement), an X1 would be a very competitive CPU|GPU.

Ugh. I'm really hating this concept of AMD flops and Nvidia flops. It really needs to die.
What really needs to die is comparing the final performance of two different architectures solely based on GFLOPs. I agree it can give a hint about the rough magnitude of the performance difference, but a chip with 4 TFLOPs from architecture B won't simply be proportionally slower or faster than another processor with 5 TFLOPs from architecture A. There is not much value in this kind of comparison.

At the end of the day it's just math. If the formula to calculate flops should include a variable like utilization or something, then factor it in! There needs to be some "variable" that can help translate flops, since flops are a base measurement; it's just not being calculated consistently, and it's highly confusing!
That's maybe the source of the problem: flops are not "measured" here; it's a pure peak number based on hardware design specs. That's what I wanted to hint at in my earlier post. We should not judge purely on theoretical numbers. Rather, take several games that run on the X1 and on GCN and use those to guesstimate the performance (like Eurogamer did); it's way more useful than flops.
(It would be a different story if the NX were expected to have 5x more or fewer flops than an XB1, but within 50%, the numbers don't convey significant information.)
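To make the "pure peak number" point concrete, the quoted GFLOPS figures come straight from design specs, roughly like this:

def peak_gflops(alus, clock_ghz, flops_per_alu_per_clock=2):
    # 2 FLOPs per ALU per clock assumes one fused multiply-add every cycle
    return alus * flops_per_alu_per_clock * clock_ghz

print(peak_gflops(256, 1.0))     # Tegra X1 GPU: 256 ALUs at ~1 GHz  -> ~512 GFLOPS FP32
print(peak_gflops(768, 0.853))   # Xbox One GPU: 768 ALUs at 853 MHz -> ~1310 GFLOPS FP32

Nothing in those numbers says how much of that peak a real game actually extracts, which is the whole point.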



Nobody dares to comment on my SLI-ish idea of coupling several X1s? :)
 
That means the Xbox One would be nearly 70% faster, rather than 20%, if the NX had 768 GFLOPs.
Pascal gets around the same real-world performance as GCN2 with ~40% less theoretical compute throughput (as seen with e.g. a GTX 1060 vs. R9 290X comparison).
So roughly speaking those 768 GFLOPs could perform close to a 1.1 TFLOPs GCN2, which would be within 80% of the XBone's 1.3TFLOPs.

Of course, the situation changes a lot when async compute is used (another >20% towards GCN2?), but the 2*FP16 on some pixel shaders could equal things up a bit.
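Spelling that estimate out with the numbers used in this thread (the conversion factor is a rough guess drawn from benchmark comparisons, not a measured constant):

nx_pascal_gflops = 768
pascal_per_flop_advantage = 1.4        # this thread's rough guess at Pascal vs. GCN2 work per FLOP

gcn2_equivalent = nx_pascal_gflops * pascal_per_flop_advantage   # ~1075 GFLOPS
print(gcn2_equivalent / 1310)          # ~0.82 of the XBone's ~1.31 TFLOPS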
 
Pascal gets around the same real-world performance as GCN2 with ~40% less theoretical compute throughput (as seen with e.g. a GTX 1060 vs. R9 290X comparison).
So roughly speaking those 768 GFLOPs could perform close to a 1.1 TFLOPs GCN2, which would be within 80% of the XBone's 1.3TFLOPs.

Of course, the situation changes a lot when async compute is used (another >20% towards GCN2?), but the 2*FP16 on some pixel shaders could equal things up a bit.
We've known for a long time that nVidia's PC drivers are much better optimized.

http://www.startlr.com/ten-tesla-m40-could-not-surpass-eight-radeon-r9-290x-in-gpupi-1b/
 
We've known for a long time that nVidia's PC drivers are much better optimized.


Yes, they are, however:
1 - The RX 480 isn't GCN2 (which should be the GCN version closest to the XBone); it's GCN4, which brings significantly higher performance-per-clock and performance-per-GFLOP.

2 - That video shows about 25% better performance, which, despite the above, is not too far from the 20% boost I predicted (plus, async is still not being used on Pascal, IIRC?).

3 - Doom is definitely not making heavy use of FP16 for pixel shaders (probably none at all); otherwise that GTX 1060 could be hilariously stuck in single-digit FPS, because nvidia gutted the FP16 throughput on the consumer desktop Pascal GPUs.
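Rough numbers behind point 3; the 1/64 ratio is the widely reported FP16 rate for consumer Pascal parts, so treat these as approximations:

gtx1060_fp32_gflops = 4400             # approximate peak for a GTX 1060
consumer_pascal_fp16_ratio = 1 / 64    # FP16 reportedly runs at 1/64 of the FP32 rate on GP104/GP106

print(gtx1060_fp32_gflops * consumer_pascal_fp16_ratio)   # ~69 GFLOPS if the shaders actually ran in FP16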
 
It's a tablet PC, while the NX is a complete game console package.
The tablet PC requires an XB1 to stream games; the NX is a game console.
The PC+XB1 combo costs $1000+; the NX will be priced as a game console.

All of those things are true only if you are using it to play XB1 games.
If you use it to play W10 games, you don't need an XB1.
It's a PC in its own right; it requires no additional hardware to play games on.

How is a tablet PC with the controller attachment less than a "complete game console package"?

You can plug it into a TV if you like, and you can detach the controllers if you like.

I'm missing what's different, unless there are hidden features in your definition of "complete game console package" that you aren't spelling out.
 