Nintendo announce: Nintendo NX

Status
Not open for further replies.
A: that you/we haven't heard any complaints doesn't mean there aren't any.

B: we haven't really heard ANY devs talking about wuu hardware, off the record, in-depth. Perhaps this is because there are very few games that actually try to push the console.

C: drawing takes place in eDRAM, but textures have to come from main memory. 12.5GB/s is seriously pathetic any way you choose to look at it.

Shin'en has been forthcoming, and so was Criterion with Need for Speed: Most Wanted, where they used higher-res textures than they did on 360/PS3, and Digital Foundry showed better performance on Wii U as well. I know I don't have an overwhelming amount of developer comments backing up my point, but you have none, just your personal opinion that the memory bandwidth must be a bottleneck. I don't think I am going to change your mind, and you certainly haven't provided any conclusive evidence supporting your claim, so we should probably just agree to disagree.
 
Although this may be inappropriate for this thread, I have to ask if you have any opinion on just how common code that ran at peak fp rate was on the 360. The CPU always seemed extremely top heavy in terms of peak fp vs. general code performance and overall capabilities. But then, my coding experience is from chemical science as opposed to games.
If (close to) 30% of CPU cycles were spent in tight, full-rate loops, then you are obviously forced to find other ways to achieve your results; if on the other hand it was typically on the order of 5%, then the lack of strong SIMD capabilities per se is not necessarily a big deal when porting.
I never really saw any utilization statistics.
Those old in-order PPC cores were pathetic at running general purpose code, such as unoptimized game logic code. A memory stall cost 600 lost cycles, and the CPU didn't have any data prefetch hardware. You had to manually prefetch every single cache line (even in linear array accesses) or you suffered a 600-cycle penalty for each 128 bytes (that was the cache line size). If you didn't cache-optimize your memory access patterns and didn't manually write prefetches, that particular code ran at least 10x slower. Another big bottleneck for generic code was LHS (load-hit-store) stalls. There was no store forwarding hardware. If you accessed a memory location that was recently written, you suffered a ~40 cycle stall. The problem is that common calling conventions write parameters to the stack (= memory) and then the function reads them from the stack (= memory) at its beginning. This is a sure way to get an LHS stall. Practically every function call caused one, so people had to inline everything, and that killed the instruction cache.
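The manual-prefetch pattern described above can be sketched in portable C, using GCC's `__builtin_prefetch` as a stand-in for the PPC `dcbt` instruction. The 128-byte line size matches the post; the function name and the one-line-ahead prefetch distance are illustrative choices, not tuned values from the post:

```c
#include <stddef.h>

#define CACHE_LINE 128  /* Xenon's cache line size, per the post */

/* Sum an array, touching each upcoming cache line ahead of time.
 * On a core with no hardware prefetcher, skipping this meant a
 * ~600-cycle miss for every 128-byte line streamed in. */
float sum_with_prefetch(const float *data, size_t n)
{
    float sum = 0.0f;
    for (size_t i = 0; i < n; ++i) {
        /* At the start of each cache line, hint the next one in. */
        if ((i * sizeof(float)) % CACHE_LINE == 0)
            __builtin_prefetch(&data[i + CACHE_LINE / sizeof(float)]);
        sum += data[i];
    }
    return sum;
}
```

On a modern out-of-order CPU with hardware prefetchers this makes little measurable difference; the point is the pattern, which on Xenon-class cores had to be written by hand for essentially every streaming loop.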

Vector code could reach high IPC, but only if you unrolled loops heavily. There were 128 vector registers (per thread, so 256 per core). As said earlier, spilling data onto the stack resulted in 40-cycle LHS stalls, so you had to keep everything in registers. Moving data between scalar registers and vector registers also caused an LHS stall, because there was no direct path between the two register files; data had to move through memory (store -> load = stall). You couldn't directly move vector data to a scalar register to perform branching, so branching on float/vector data always caused a big stall. The vector pipeline was super deep (12-14 cycles for madd and AoS dot product). Because of this, you had to have lots of independent instructions in flight, and the 2x128 registers made this possible. With no register renaming, tight loops were not possible: one unrolled loop iteration had to contain enough independent instructions, using most of the registers, to fill the pipelines.
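The unrolling idea can be illustrated in plain C, with scalar floats standing in for 4-wide vectors (`dot_unrolled` is a hypothetical example, not code from the post). With a 12-14 cycle pipeline and no register renaming, a single accumulator serializes on its own previous result every iteration; several independent accumulators keep multiple results in flight:

```c
#include <stddef.h>

/* Dot product with four independent dependency chains. Real VMX-128
 * code unrolled far wider, spreading work across up to 128 vector
 * registers per thread to cover the 12-14 cycle madd latency.
 * For brevity, n is assumed to be a multiple of 4. */
float dot_unrolled(const float *a, const float *b, size_t n)
{
    float s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    for (size_t i = 0; i < n; i += 4) {
        s0 += a[i + 0] * b[i + 0];  /* chain 0 */
        s1 += a[i + 1] * b[i + 1];  /* chain 1: independent of chain 0 */
        s2 += a[i + 2] * b[i + 2];  /* chain 2 */
        s3 += a[i + 3] * b[i + 3];  /* chain 3 */
    }
    /* Combine the chains only once, at the end. */
    return (s0 + s1) + (s2 + s3);
}
```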

Practically, people were forced to separate vectorized code into big unrolled loops and use only vector registers and vector instructions inside that chunk of code (to avoid LHS stalls). Mixed vector + scalar code was not efficient at all. So if you wanted to vectorize something, you had to hand write the whole processing function with pure VMX-128 intrinsics, and it had to process lots of data to hide the pipeline latency and the LHS stalls on both sides. So VMX-128 programming was similar to offloading data to a GPGPU, but of course with much lower latency, so you didn't have to do it asynchronously (and fetch the results back next frame).
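The offload-style structure described above can be sketched as follows: instead of calling a vector routine once per element (forcing vector<->scalar traffic through memory, with an LHS stall in each direction), you hand one function a whole batch and stay in the vector domain until the end. `scale_bias` is a hypothetical kernel for illustration, not from the post:

```c
#include <stddef.h>

/* Batch kernel: one big loop over the whole data set, with no
 * per-element scalar branching on vector results. On Xenon this
 * loop body would be written entirely in VMX-128 intrinsics,
 * unrolled, so values never leave the vector register file. */
void scale_bias(float *out, const float *in, size_t n,
                float scale, float bias)
{
    for (size_t i = 0; i < n; ++i)
        out[i] = in[i] * scale + bias;
}
```

The design point is the interface, not the arithmetic: like a GPGPU dispatch, the caller submits a large buffer and reads results afterwards, rather than interleaving scalar logic with individual vector operations.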

We VMX-128 vectorized many things, and so did most studios. We had several threads (such as particle simulation) that were pretty much running pure VMX-128 code. But most of the generic game/engine code was not vectorized at all, and was thus running at very low IPC. I was frequently profiling game/engine code, adding cache hints and fixing the most crucial LHS stalls. Only a small part of the code was vectorized, but quite a big percentage of CPU time was spent in optimized VMX-128 loops. It's hard to give typical numbers since every game is different.

Update:
We had 6 (core-locked) threads like most other studios (3 cores with SMT/hyperthreading). It was crucial to have two threads running per core, as it was the only way to hide all these long stalls. Each hardware thread had its own register file, so SMT doubled the available registers. Two VMX-128 threads on the same core had in total 256 4-wide vector registers (that's a lot, even by today's standards). So it was definitely possible to keep the long pipelines fed with independent instructions.
 
So Bethesda's Pete Hines just commented on the NX:

Pete Hines said:
We talk to Nintendo all the time – we’re pretty well briefed in on what they are doing. It’s definitely something we will look at; and our philosophy is that we will put our games out on any format that supports the games as we envisage and make them. If the NX fits that from a technical standpoint, and fits the game that a developer in our stable is making, I don’t see why we would not put it out on NX. But it’s too early to say, ‘we’ll definitely be putting games out or not.’ Like with mobile, we want to have the right fit for the right formats.


Doesn't look like he's confident that the NX will be able to run Bethesda's flagship titles.
At this point, when the hardware specs are finalized, the "wait and see" approach sounds a lot like "this is too weak to run our AAAs, but if it's selling lots of units then perhaps we'll do some casual stuff for it."

And thus begins the 3rd party absence.
 
Doesn't look like he's confident that the NX will be able to run Bethesda's flagship titles.
At this point, when the hardware specs are finalized, the "wait and see" approach sounds a lot like "this is too weak to run our AAAs, but if it's selling lots of units then perhaps we'll do some casual stuff for it."

And thus begins the 3rd party absence.
That's not what I'm seeing. Well, yes, their absence, but I see it in their wait-and-see attitude. How long does it take to see if a console is a success or not? Once it has been determined that the console is successful enough, how long does it take to produce a game for it?
Western third party support for the NX is a bit of a chicken and egg problem. They won't support it unless they can see a sufficient market, but it is difficult to build volume without the major third parties on board.
 
A: that you/we haven't heard any complaints doesn't mean there aren't any.

B: we haven't really heard ANY devs talking about wuu hardware, off the record, in-depth. Perhaps this is because there are very few games that actually try to push the console.

C: drawing takes place in eDRAM, but textures have to come from main memory. 12.5GB/s is seriously pathetic any way you choose to look at it.

I would be interested in hearing more details on the Wii U hardware from devs as well. Perhaps a post mortem of sorts. Still, Wii U's bandwidth makes sense considering it has half the texture units of the 360. I'm not sure how they pull off some of the better ports (if the originals were truly pushing the Xenos hardware), but I'm betting that on-chip cache has a lot to do with it. IIRC, Xenos has only a 32 KB texture cache, while the R700-era hardware that Wii U is based on features a hierarchy of 8 KB L1 per SIMD engine (so 16 KB total) and 64 KB of L2 cache (I think it's 2 x 32 KB on Wii U, but that's based on some guesswork and SRAM counting on the die--I was really bored back then). Each likely has much higher throughput than the cache on Xenos, if we go by the improvements made in R700.
 
Bethesda want to port Fallout Shelter, a promotional metagame really, from its current 2 billion-ish mobile devices to a few million NX machines??
 
Bethesda want to port Fallout Shelter, a promotional metagame really, from its current 2 billion-ish mobile devices to a few million NX machines??
Want? No. Can do easily? Probably yes. Easier than Fallout 4 or Skyrim Remastered.
 
People put too much value in having parity with the other gaming consoles. What Nintendo needs to focus on is having a wide library of games that aren't available on those other platforms, and that means reaching out for third party exclusives. Simply being another platform that has third party multiplats won't do much. It obviously doesn't hurt, but it's not the silver bullet many people make it out to be.

Nintendo also needs to take the chains off games like Splatoon and Mario Kart, and bring them voice chat. Online multiplayer is contagious. When your friends are playing a game online, you are more likely to make the purchase to join in, but without proper voice chat, the impact is minimized. I would also like to see Nintendo reach out to some third parties to resurrect some franchises that have been dormant for years: a new exclusive Prince of Persia, Beyond Good and Evil, TimeSplitters, and even some neglected first party titles like F-Zero, Wave Race, and 1080. Nintendo also needs to expand its western development houses like Retro and NST.

Even if NX is more of a powerful portable, if there is a widespread lineup of games with broad appeal, Nintendo can find success. Success for NX, I believe, is 50 million units of hardware. We are consolidating two pieces of hardware into one, so this new platform selling a little less than the 3DS+Wii U numbers seems very obtainable, while also being realistic that Nintendo's platform will still be more niche than the PS4/X1 market.
 
People put too much value in having parity with the other gaming consoles.
No, they don't. If you don't have hardware parity, you can't receive ports of multiplatform franchise games. Development costs today are such that you can't afford to build a triple-A game that sells on only a single console, unless your name happens to be Nintendo. The reason THEY can afford it is that they don't pay royalties to anyone, and their hardware is typically underpowered and their graphics typically cartoony (and thus cheaper to realize) rather than the much more expensive photorealistic look most other big titles go for.
 
Although somewhat true, more specifically 1st parties can afford to sell to a single console. UC4 certainly didn't come cheap. So Sony and MS as well as Nintendo can target whatever hardware they have. But yes, for everyone else, to have a decent library, you need a console that can run ports without fuss.
 
If you believe that being a 4th option for playing multi platform games will make NX a success, then sure, hardware parity is important, but I personally don't see Xbox, Playstation, and PC gamers flocking to the NX to play games that they have been playing elsewhere.
 
If the only option for Nintendo is to be a Nintendo-own-brand console, they will be limiting themselves to a tiny market and probably become too small to sustain themselves as is. The market for Nintendo enthusiasts who'll buy a N. console to play Mario et al. is shrinking every generation.

Now if they had a console that could serve the COD and FIFA and GTA and Overwatch and Destiny players and also provide unique, family friendly, high quality N. games, then they'd have wide appeal and could snatch up a lot of customers who'd otherwise go with Sony or MS.

It's called 'competing'. :p
 
If the only option for Nintendo is to be a Nintendo-own-brand console, they will be limiting themselves to a tiny market and probably become too small to sustain themselves as is. The market for Nintendo enthusiasts who'll buy a N. console to play Mario et al. is shrinking every generation.

Now if they had a console that could serve the COD and FIFA and GTA and Overwatch and Destiny players and also provide unique, family friendly, high quality N. games, then they'd have wide appeal and could snatch up a lot of customers who'd otherwise go with Sony or MS.

It's called 'competing'. :p

Yes, but I believe the trick is to offer something that attracts some of those consumers in a different way. Sort of like how many 3DS gamers own an Xbox or PlayStation, because what they offer is distinctly different. Fighting a vicious battle with Sony and Microsoft by copying what they offer just doesn't seem like a recipe for success. If you're going in that direction, why not just work out an exclusive deal with Sony or Microsoft and go software only?
 
Although somewhat true, more specifically 1st parties can afford to sell to a single console.
1st party games have the advantages of generous cross-promotion by Sony, Microsoft and so on, often being included in console bundles, and while I'm not privy to the exact business deals concerning 1st party developers, I could well imagine they might get a more beneficial royalty deal as well. *shrug* :)

I personally don't see Xbox, Playstation, and PC gamers flocking to the NX to play games that they have been playing elsewhere.
The point would be for gamers to not play multiplat games on other consoles first and then again on NX, but rather first, primarily, on NX. If you're gonna poo-poo multiplat games, then you're saying NX is for Nintendo games (and waggle shovelware shite made cheaply, quick and dirty by 3rd parties), and we saw with wuu how well a Nintendo-centric strategy worked. It's not just the wacky controller for wuu which held it back, it was the lack of power and lack of 3rd party titles. You're not going to buy a console which has only one major game coming out for it per quarter (at best), because Nintendo just can't produce them any faster than that in-house.

They need 3rd parties, and to attract them, they need performance parity. Otherwise they're just gonna keep eating their own tail ouroboros-style until they disappear.
 
...and we saw with wuu how well a Nintendo-centric strategy worked. It's not just the wacky controller for wuu which held it back, it was the lack of power and lack of 3rd party titles.
This. Imagine Wii U was next-gen quality, ~XB1 power, maybe launching a bit later. It'd have been attractive to core gamers in the same way XB1 and PS4 are, and it could have continued to compete with PS4 and XB1 after they came out, even if not as powerful, by offering a unique control experience. I'll add, while I'm dreaming, that the Wuublet was a classy design and not a preschooler-style toy. That machine would have been competitive and had numerous reasons to appeal to a wider market. But N.'s anaemic design meant they left the core gamers behind, left the third parties behind, and could only offer a limited experience for their titles, relegating the console to a Nintendo-enthusiast machine. Nintendo got where they are today by appealing to a very wide range of gamers. Closing down aspirations to cling onto their dwindling loyal userbase is accepting a long, slow death.

NX may compete if it offers a decent enough core experience and gets the games. But the rumoured spec makes it sound like only high-end mobile games will be interested. Maybe PC ports? Stick nVidia's game services (Tegra Zone?) on it and at least it will have a reasonable library without having to wait for Nintendo to make stuff.
 
But it was probably that stupid Wuublet, with its crappy screen, that made it impractical to offer more CPU and GPU power.
 