Nintendo announce: Nintendo NX

I guess the whole question to me is if they plan to re-gussy Wii/WiiU tech for the 14th time.

If they don't, then yeah, no theoretical reason they can't do something nice and tidy with the hardware.

Of course, unless it is some kind of hybrid mobile and that also hampers it...
If it's a hybrid, then it will ensure decent performance - for a portable.
I find this comparison interesting, because that portable chip is already in full production on a less mature process, and at a yield friendly die size.
 
Well, although that was already talked about, Nintendo doesn't necessarily need to be on par spec-wise; it could go with something different. I'm thinking about PowerVR Wizard here.
I have no reason to believe they will, but it would be interesting, and it would allow them to have a system that cannot be directly compared with its competitors, freeing them/the fanboys from that war...
 
Power (Watts) would be a valid reason for considering embedded memory

Is it though, for a 2016 solution? For a mobile chip, there are LPDDR4 and Wide I/O2.
With LPDDR4 Samsung already has 12Gb chips rated at 4266MT/s. A couple of two-chip stacks at 64bit each would result in 6GB at 64GB/s. Three two-chip stacks would do 9GB at 96 GB/s.
Wide I/O2 would do 68GB/s with only one stack.
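For anyone who wants to check the arithmetic, here's a rough sketch of those configs (assuming each two-chip stack sits on its own 64-bit channel; the 64/96 GB/s figures above correspond to a slightly more conservative effective rate than the full 4266 MT/s):

```python
# Back-of-the-envelope LPDDR4 capacity and bandwidth for the stack counts above.
# Assumptions: 12Gb dies, two chips per stack, one 64-bit channel per stack.

def lpddr4_config(stacks, chips_per_stack=2, die_gbit=12, mt_s=4266, bus_bits_per_stack=64):
    capacity_gb = stacks * chips_per_stack * die_gbit / 8          # GB
    bandwidth_gbs = stacks * bus_bits_per_stack / 8 * mt_s / 1000  # GB/s
    return capacity_gb, bandwidth_gbs

for stacks in (2, 3):
    cap, bw = lpddr4_config(stacks)
    print(f"{stacks} stacks: {cap:.0f} GB at up to ~{bw:.0f} GB/s")
# 2 stacks: 6 GB at up to ~68 GB/s
# 3 stacks: 9 GB at up to ~102 GB/s
```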

Of course, this would probably end up quite a bit more expensive than just getting 32MB of eDRAM again, with the eDRAM occupying a huge chunk of the chip and leaving little space for GPU compute and rasterization units.


With regards to the XB1, I think the embedded RAM works out so badly because it's paired with a terrible configuration of external memory.
IMO, the worst factor is the die area used for embedded RAM that could have been used for more execution units. That's why the PS4 gets 50% more GPU compute units and twice the ROPs.
If the XBone had the same number of CUs and ROPs as the PS4, it would probably get the same results, but its SoC is already a tad bigger than Liverpool...


Fat DDR3 bus, lots of memory chips (now twice the number of the PS4!) that all need identical trace lengths, and low bandwidth. In a hypothetical XB1 where the APU had been paired with 128-bit GDDR5, you'd have seen reduced cost and complexity and kept the benefits of the esram while removing the biggest issue with the slow DDR3 (would probably have had less die area dedicated to memory IO too, so had a smaller chip). Obviously that would have meant only 4GB of memory, but most people aren't interested in the shit that takes up most of the other 4GB anyway. :eek:
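To put rough numbers on that hypothetical (using the shipping clocks: DDR3-2133 on a 256-bit bus for the XB1, 5.5Gbps GDDR5 on 256-bit for the PS4, and the same GDDR5 on a 128-bit bus for the imagined XB1):

```python
# Main-memory bandwidth for the configurations being compared (GB/s = MT/s * bus_bits / 8 / 1000).
def bw_gbs(mt_s, bus_bits):
    return mt_s * bus_bits / 8 / 1000

configs = {
    "XB1 as shipped (DDR3-2133, 256-bit)":       bw_gbs(2133, 256),  # ~68 GB/s, plus esram
    "PS4 as shipped (5.5Gbps GDDR5, 256-bit)":   bw_gbs(5500, 256),  # ~176 GB/s
    "Hypothetical XB1 (5.5Gbps GDDR5, 128-bit)": bw_gbs(5500, 128),  # ~88 GB/s, plus esram
}
for name, bw in configs.items():
    print(f"{name}: {bw:.0f} GB/s")
```

So the hypothetical 128-bit GDDR5 board would have had more main-memory bandwidth than the shipped DDR3 setup with half the chips, at the cost of capacity (8 x 4Gb chips in clamshell giving the 4GB mentioned above).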

I'm still hoping that down the line both Microsoft and Sony will be able to switch their memory configurations and use newer technologies. Sure, trying to make a perfect fit into the timings might be hell, but it would pave the way for getting the consoles really tiny and even achieving a handheld version.


They could probably emulate the Wii U on any remotely fast modern system, assuming they were prepared to go down that road. Nintendo seem to like maintaining hardware BC for one generation and then ditching it.

Well, they finally started doing day-one emulation for BC, with the Wii U's TeraScale GPU having to emulate the TEV shaders from the ArtX architecture.
 
The NX probably has some kind of ARM processor in the APU I would imagine. If BC was an option for NX to play Wii U games, how powerful would that ARM CPU need to be and would that even work?
 

If their Javascript performance is anything to go by, those PowerPC 750-esque cores seem to have rather weak single-threaded performance that a Cortex A53 could surpass even at the same clocks. Then again, it would still be emulating a different architecture, so higher clocks would probably be needed.
If Nintendo bothered to put some effort into it, maybe a quad Cortex A53 @ 2GHz could do the trick.
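Purely as napkin math (every factor here is a guess, not a measured number), the kind of headroom that implies looks something like this:

```python
# Very rough sketch of the clock an A53 might need to emulate the Wii U's Espresso cores.
# The only known figure is the Espresso clock; the IPC and emulation factors are guesses.
espresso_clock_ghz = 1.24  # Wii U CPU clock
ipc_advantage      = 1.5   # guess: A53 per-clock advantage over a 750-class core
emu_overhead       = 2.0   # guess: slowdown from recompiling/emulating PowerPC code

required_clock_ghz = espresso_clock_ghz * emu_overhead / ipc_advantage
print(f"~{required_clock_ghz:.2f} GHz per A53 core under these assumptions")
# ~1.65 GHz, so a quad A53 at 2GHz would have some margin if the guesses hold.
```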
 
IMO, the worst factor is the die area used for embedded RAM that could have been used for more execution units. That's why the PS4 gets 50% more GPU compute units and twice the ROPs.
If the XBone had the same number of CUs and ROPs as the PS4, it would probably get the same results, but its SoC is already a tad bigger than Liverpool...

I don't think that the die area used for the esram would automatically have been used for additional CUs if the esram hadn't been there. The opposite, actually. I think MS knew the power level they were targeting before they specified 32MB of esram, and this seems to be backed up by the leaked concept documents from a few years back.

Without the esram we'd simply have had a smaller chip and a more expensive memory subsystem. If MS had been targeting a more powerful system they could have gone for a larger chip even with esram, as it's well below reticle size - the added cost of the die area may have balanced against saving 50% on the external memory bus and chip count.

It's the cost of getting the bandwidth whether that's from embedded memory or external memory that matters IMO.

I'm still hoping that down the line both Microsoft and Sony will be able to switch their memory configurations and use newer technologies. Sure, trying to make a perfect fit into the timings might be hell, but it would pave the way for getting the consoles really tiny and even achieving a handheld version.

Yeah, I'm interested to see if they can change configuration. Sony are in a pretty good place at the moment with only 8 chips and a smaller board footprint, plus 8Gb GDDR5 chips will be in use for years to come. If they wished they might even be able to go with a single 8GB HBM2 stack, and save on power and shrink the board right down. If anything, the reduced latency should make a transition easier, I guess.
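For reference on the bandwidth side (PS4 figures as shipped, HBM2 at its 2Gbps-per-pin rated speed; whether Sony would actually run a stack that fast is anyone's guess):

```python
# A single HBM2 stack versus the PS4's 256-bit GDDR5 setup (GB/s = MT/s * bus_bits / 8 / 1000).
def bw_gbs(mt_s, bus_bits):
    return mt_s * bus_bits / 8 / 1000

print(f"PS4 GDDR5, 256-bit @ 5.5Gbps:      {bw_gbs(5500, 256):.0f} GB/s")   # 176 GB/s
print(f"One HBM2 stack, 1024-bit @ 2Gbps:  {bw_gbs(2000, 1024):.0f} GB/s")  # 256 GB/s
# An 8GB stack would match capacity and exceed bandwidth while replacing
# eight discrete chips with one stack next to the APU.
```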

MS are stuck with 16 chips over a larger board area, and DDR3 is going to be in decline. DDR4 has higher latency, at least going by JEDEC standards, which might make a transition difficult. HBM2 would seem to be overkill, unless they also replaced the esram. But if you were going to re-engineer to that point you might as well release a new console.

I don't think we'll ever see the current systems in a handheld format, the power gulf is simply too large even with optimistic power scaling below 10 nm.

What does this mean for Nintendo? I think DDR4 (possibly with some embedded ram) or GDDR5 are most likely. DDR3 is simply too old and slow and HBM2 is likely too new and expensive. I guess this speculation would be far easier if we actually knew what form factor(s?) the NX is going to come in! :D
 
I would bet a good deal of money the NX will not come out in 2016. I was reminded today that Nintendo only ever said they would release "details" about NX in 2016.

Video games are the most delay prone media ever, and Nintendo is the king of taking FOREVER.

I would go so far as to suggest 2017 is probably not set in stone either; at least, it wouldn't be if the Wii U weren't doing so poorly. I could see them aiming for 2017 and then having it slip into early '18.
 
I would bet a good deal of money the NX will not come out in 2016. I was reminded today that Nintendo only ever said they would release "details" about NX in 2016.

Given Kimishima Tatsumi only recently said "This year, [2016] I will announce the details of our next-generation game machine, NX", that is a pretty safe bet.
 
A mid 2017 launch would make 14 nm, Zen and HBM2 far more doable. AMD have a Zen APU with > Carrizo CU count and HBM2 support due for around then.

Okay, let's do this! First early prediction based on ... nothing in particular.

Quad core, 8 thread Zen APU ~2 GHz. 16 GCN CUs on chip, two disabled so 14 active. GPU clock about 700 MHz. 8 GB HBM2 and about 256GB/s BW. 14 nm, fabbed at GlobalFoundries. Power consumption of the chip package about 45 Watts, system draw about 60W at the wall. No handheld version of the chip as power draw won't scale low enough even with low clocks and disabled units.
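On paper that GPU guess works out like this (standard GCN arithmetic: 64 ALUs per CU, 2 flops per clock for FMA; the console figures are just for context):

```python
# Theoretical FP32 throughput of the guessed GPU versus the current consoles.
def gcn_tflops(cus, clock_ghz, alus_per_cu=64, flops_per_clock=2):
    return cus * alus_per_cu * flops_per_clock * clock_ghz / 1000

print(f"Predicted NX, 14 CUs @ 0.70 GHz:  {gcn_tflops(14, 0.70):.2f} TFLOPS")   # ~1.25
print(f"Xbox One,     12 CUs @ 0.853 GHz: {gcn_tflops(12, 0.853):.2f} TFLOPS")  # ~1.31
print(f"PS4,          18 CUs @ 0.80 GHz:  {gcn_tflops(18, 0.80):.2f} TFLOPS")   # ~1.84
```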
 
They can't leave it too long from announcing to launching though, surely? They don't have a decent platform to sustain them before release, unlike PS360, and Wii U sales will no doubt be hit once a new platform is a year-ish off and what limited interest remains drops away. I guess it depends on what launch software is available and whether devs are far enough along.
 
The PS4 had nine months between announcement (February) and launch (November), and while the holidays would seem like a good time to launch, the barrier is hardware manufacturing and software availability.

True we don't know what NX is yet but I can't imagine that third parties are clamouring to be among the launch lineup.
 
My prediction: a spring 2017 release, with 75% of the graphical and CPU power of the Xbox One, while being $50 more expensive than the PS4.
 
I really don't think 3rd party software availability has any bearing on when Nintendo will release. IMO, if there's a big gimmick I think most likely Nintendo is playing it close to the vest with NX and launch will be mainly first party titles. For 3rd parties to bother, I think NX would have to match X1/PS4 in power and online infrastructure. That's a big hurdle.

Given how sales YoY for both WiiU and 3DS are down, I think they have to replace one of them this year. If NX is a platform, they could release the handheld this year and the home console next year. 3DS was released in February of 2011. It's due for an upgrade. Maybe it makes more sense to release a handheld NX on cheap 28nm and wait for 14/16nm to drop price in 2017 for the home console.

I know Nintendo has traditionally announced a platform a year or more in advance of release, but they've also stopped doing E3 press conferences and have proliferated Directs instead. I'm sure they're looking at how the PS4 and X1 were announced and released (or even Apple's short time from announcement to release).
 
Do we really think 3rd parties would be lining up to develop launch titles once it's announced, such that Nintendo need to wait 2 years after announcing before they can launch? Whatever it is, 3rd parties will wait and see, save for the cheap toe-dipping options. So Nintendo will have to provide the library that sustains NX, and that could/should be in progress now such that it's ready six months after the announcement (or however long).
 
I was just going to say the reverse, it takes Nintendo FOREVER to make a new Mario or Zelda, so why would we magically expect them to have those ready for a hypothetical NX launch anytime soon? If anything the software angle probably makes delays even more likely.

Also as I have stated it's my opinion the sooner NX comes out the less powerful it's likely to be, simply because more powerful=more time+more leaks IMO.

My prediction: a spring 2017 release, with 75% of the graphical and CPU power of the Xbox One, while being $50 more expensive than the PS4.

I doubt even Nintendo can screw things up that much, that would be very difficult to do. Of course, that's assuming there isn't some external cost-raising factor like the Wuublet or, I don't know, some sort of console-with-a-docking-mobile-device-in-one shenanigans. Which, I guess, SOMETHING besides a traditional console is almost a given... but who knows how it will affect costs.
 
A 60 fps, 4K port of Zelda U at launch would be quite nice, especially seeing as Zelda U is going to sell like shit on the tiny active install base of the WiiU. Would also be a way to make a marketing bang for the system. People love Zelda, and are being conditioned nicely to know what 4K is. If you say "60 fps" at the same time they'll assume that's a good thing.

I'd also be happy to see a return for waggle controllers, as long as they were accompanied by a good standard controller. Wii Sports was the most fun you could have in a room full of people with your clothes on.
 
Well I wouldn't exactly say "expecting".... :eek:

Shouldn't need 20 times the power in practice, what with various tricks now being developed to create a larger final frame buffer from depth samples and reprojection and all that.

For the sake of argument though, what would it take to be 20 times faster in practice? VLIW5 was first used by the dinosaurs to calculate asteroid strikes, so any modern GPU should be far more efficient. Desktop Carrizo should already be 6+ times faster, memory aside ....
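For the record, the "20 times" ballpark presumably comes from pure pixel and framerate scaling (assuming Zelda U targets 720p30 on Wii U, which is itself an assumption):

```python
# Naive scale factor for taking a 720p30 title to 4K60, assuming cost scales
# linearly with pixels and framerate (the tricks mentioned above break that assumption).
base_pixels, base_fps = 1280 * 720, 30
target_pixels, target_fps = 3840 * 2160, 60

factor = (target_pixels * target_fps) / (base_pixels * base_fps)
print(f"~{factor:.0f}x the shading/fill work")  # ~18x
```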
 