Wii U hardware discussion and investigation *rename

But the difference between Nintendo doing this in 2009 - 2012 and Sony/MS doing it pre-2005 should be obvious. There's an enormous wealth of knowledge on developing for the level of hardware Nintendo is working with now - because developers have been working on this scale and with similar technology for over 6 years.
Collective knowledge != individual knowledge. The only way anyone at Nintendo has 6+ years' worth of knowledge of programming current graphical routines is if he's been recently hired on from Capcom or something. This is a case where I expect American third parties to be able to immediately produce better results than Nintendo's internal teams.
 
Maybe Nintendo could gain a lot by either purchasing a 3rd party graphics engine or reusing a single engine created by an up-to-date in-house set of tech heads, instead of every internal Ninty dev studio creating their own proprietary stuff for virtually every title or franchise they make.
 
Maybe Nintendo could gain a lot by either purchasing a 3rd party graphics engine or reusing a single engine created by an up-to-date in-house set of tech heads, instead of every internal Ninty dev studio creating their own proprietary stuff for virtually every title or franchise they make.
Licensing third party engines isn't always an easy solution, especially if the team is solely used to working with their own code base. Basically the team has to learn a new way of working. The change is usually even bigger for the content development team, as each engine/middleware has its own tools, content pipelines and editors. A lot of new learning is required to get the team back to full speed.
 
I expect 3rd party games to make much better use of the hardware. Batman AC should be making pretty good use of the GPU, and the CPU should be well used as well assuming it's an OOO flavour of Xenon. In fact there's nothing about the CPU as we know it that'd make it hard to extract performance from. Nintendo's internal studios will be challenged by both parallel CPU code and a new GPU architecture, and they'll need whole new content pipelines, and maybe even whole new skills if they haven't been normal-mapping and the like. Nintendo should be around where Sony was with PS3, though with a simpler architecture that's a lot easier to use.
 


This doesn't say anything about it being 1080p, but it does sound like the game has been cleaned up:

The graphics of the game showed off the Wii U’s horsepower and are comparable to the current generation counterparts. Environments were crisp and clean, Batman himself stood out amongst the scenery. Even in the dark, gloomy setting of Arkham City, level progression was obvious, visual cues stood out, and the textures were gorgeous.

The Wii U brings a few new features to the table that make the experience feel unique to Nintendo’s new console, the Wii U’s horsepower gives the game a slight visual boost, and the core elements of the game feel intact and preserved. Players who fit in the middle of the Venn Diagram between Nintendo fans and DC universe fans will likely find themselves interested in this port, as it stays faithful to the original while adding a twist.

http://gamerliving.net/archives/12238
 
Does Nintendo have a launch game in development that's pushing a technological presentation?


It doesn't look like it; in fact, it looks like games like Pikmin and SMBU could have been developed on Wii and ported over with some improvements.
According to a video interview I watched with Reggie, games that take full advantage of the Wii U have been in development at teams like EAD and Retro and will come later.
 
The change is usually even bigger for the content development team, as each engine/middleware has its own tools, content pipelines and editors.
Yeah, but the idea was that you'd only need to re-learn once, across the company, and then you're up to speed again. Having every dev team re-tool their current Wii engines and tools would mean a lot of redundant work across however many teams Nintendo has these days, in-house and first party included.
 
Yeah, but the idea was that you'd only need to re-learn once, across the company, and then you're up to speed again. Having every dev team re-tool their current Wii engines and tools would mean a lot of redundant work across however many teams Nintendo has these days, in-house and first party included.

Yeah, that worked so well for EA when they bought RenderWare, and the only games to ship on time in the entire year were the ones given exceptions to the "you must use RenderWare" mandate.

Devs should be left to make technology decisions, and they should be held accountable for those decisions. They should certainly evaluate 3rd party engines, but core tech decisions have to be made in the context of the project's scope and timeline.

FWIW I've never heard any dev claim that buying an engine saved engineering time; it does potentially let you start content development sooner.

The primary reason 1st parties/AAA developers develop their own engine is control. I might write an internal 3D engine in, say, 100K lines of code, while Unreal Engine might be 1M lines of code (I made both of those numbers up, but the ratio is probably in the right ballpark); making a change in the latter is exponentially more expensive, and the successful users of 3rd party engines usually work within the limits rather than trying to fix the shortcomings. In fact, being technically competent can hurt you here, as you invest a lot of time trying to fix features in the engine without ever really understanding the totality.

Engines provide general solutions and will always be less efficient and require more code. Having said that, I think it'll be hard to tell the difference between an engine like Unreal and a high quality in-house engine going forward.

The engine itself is also a small part of the technology problem; most of the code is in the tools and asset processing, and in how they are deployed, how the artists and the designers work, etc etc etc.
 
Based on the rumors that we have for Durango/PS4, the Wii U will probably be a bit closer to the PS3 than to the 720 in raw power. The Wii U's GPU and the RAM quantity available for games will be at least 2-3x greater than current gen, but Durango/PS4 may end up being 3-5x greater than the Wii U's. It may end up being closer to the other next-gen consoles in terms of graphical features, though, depending on the customizations Nintendo has done to the GPU.

Correct, which is why I used the term "transitional console" months ago when trying to establish a power baseline for the Wii U. The 2-3x power metric relative to the PS3 is, I believe, somewhat misleading, or at least problematic. This is due to what is possible simultaneously & independently on the DRC (display remote controller). Rendering two separate viewpoints or scenes requires a doubling of various workloads, most notably geometry, transformation, lighting, shader effects, etcetera. The initial 720p "native" resolution for all current 1st party software was locked in to provide very stable & predictable performance in conjunction with heavy utilization of the DRC, no matter how simplistic or complex the game may be. The system is by no means completely bound to it.

Does Nintendo have a launch game in development that's pushing a technological presentation?

Publishers don't seem all that interested in having devs change texture resolutions; all devs say is that the game will be on par with the PS360 versions.

This takes more time; publishers want to simply get the software up & running optimally as quickly as possible. Taking advantage of the architecture of a new system takes time. Batman:AC is definitely taking some advantage of the additional RAM, GPU functionality, and OoOE, as well as using the DRC creatively rather than simply as a submenu. V-sync is enabled, texture resolution is increased, post-process anti-aliasing has been added (not present in the PS3/360 versions), and additional art assets & geometry have been included in certain scenes. The 3rd party title that will demonstrate the differences most noticeably will be Gearbox's Aliens:CM, but that is a Q1 2013 release. The Wii U version of Metro:LL is still officially on development hold, though this was showing early potential as well.

The first two visual showcases from Nintendo's internal studios will be Retro's & Monolith Soft's titles, both running on proprietary engines that will demonstrate the Wii U's technical presentation properly. There are still a few announcements left regarding Japanese exclusive & multi-platform software, though nothing really exploiting the hardware (from what I've heard).
 
Yeah, that worked so well for EA when they bought RenderWare, and the only games to ship on time in the entire year were the ones given exceptions to the "you must use RenderWare" mandate.
So with one piece of anecdotal evidence you have disproven my entire postulation? I'm not quite sure I buy that kind of logic, but okay... You're a developer after all. ;)

The engine itself is also a small part of the technology problem, most of the code is in the tools and asset processing, and how they are deployed, how the artists and the designers work etc etc etc.
Just to stick with the Unreal example, that toolset ought to be the most polished in the games industry by now, bar none pretty much...but what do I know. :p
 
So with one piece of anecdotal evidence you have disproven my entire postulation? I'm not quite sure I buy that kind of logic, but okay... You're a developer after all. ;)


Just to stick with the Unreal example, that toolset ought to be the most polished in the games industry by now, bar none pretty much...but what do I know. :p

My point was more that adopting technology is in no way free; it's extremely expensive, and that expense is proportional to the size and complexity of the adopted technology. Managers often seem to think that engineers don't choose the "easy to use 3rd party solution" because they are stubborn or because of NIH; while there can be some of that, most experienced engineers aren't basing decisions on that.

For whatever reason people seem to think you install the SDK on developers' desks and just develop; it doesn't work that way when there are 100 people on the team.

The Unreal Engine is fine, the tools are fine; if I was trying to ship a game in under 18 months with no existing technology, I'd seriously consider it and other 3rd party engines.
But it doesn't define workflows or centralized build systems, or how you manage, track and distribute assets etc etc etc.
It provides a set of tools; you still need to do all the production work, and 3rd party engines are not panaceas.
 
I grok what you're saying, ERP, but considering the severely outdated tech Nintendo has right now, surely modifying it to such an extent that it can faithfully use a modern piece of hardware like the Wuu must be close to rebuilding from the ground up. Wii is OVER a decade old tech by now; that hardware project was originally started in the late 1990s. That's some seriously old shit in this business; it's like a delivery company driving trucks from the 1940s...

Surely tools and workflows and all of the stuff you mention ought to have changed somewhat as constraints and capabilities also changed.
 
Correct, which is why I used the term "transitional console" months ago when trying to establish a power baseline for the Wii U. The 2-3x power metric relative to the PS3 is, I believe, somewhat misleading, or at least problematic. This is due to what is possible simultaneously & independently on the DRC (display remote controller). Rendering two separate viewpoints or scenes requires a doubling of various workloads, most notably geometry, transformation, lighting, shader effects, etcetera. The initial 720p "native" resolution for all current 1st party software was locked in to provide very stable & predictable performance in conjunction with heavy utilization of the DRC, no matter how simplistic or complex the game may be. The system is by no means completely bound to it.



This takes more time; publishers want to simply get the software up & running optimally as quickly as possible. Taking advantage of the architecture of a new system takes time. Batman:AC is definitely taking some advantage of the additional RAM, GPU functionality, and OoOE, as well as using the DRC creatively rather than simply as a submenu. V-sync is enabled, texture resolution is increased, post-process anti-aliasing has been added (not present in the PS3/360 versions), and additional art assets & geometry have been included in certain scenes. The 3rd party title that will demonstrate the differences most noticeably will be Gearbox's Aliens:CM, but that is a Q1 2013 release. The Wii U version of Metro:LL is still officially on development hold, though this was showing early potential as well.

The first two visual showcases from Nintendo's internal studios will be Retro's & Monolith Soft's titles, both running on proprietary engines that will demonstrate the Wii U's technical presentation properly. There are still a few announcements left regarding Japanese exclusive & multi-platform software, though nothing really exploiting the hardware (from what I've heard).

It sounds like the Wii U is like 4x~5x, but the controller can limit the power that games can use... Is that right?
 
Where do you get that from? How are people even measuring 4-5x??

I'm basing that on this statement:

The 2-3x power metric relative to the PS3 is, I believe, somewhat misleading, or at least problematic. This is due to what is possible simultaneously & independently on the DRC (display remote controller). Rendering two separate viewpoints or scenes requires a doubling of various workloads, most notably geometry, transformation, lighting, shader effects, etcetera.

It sounds to me like "games look like 2-3x because the DRC drains some power".
 
If those games are also rendering full 3D screens on the DRC, then yes. If they're rendering a 2D inventory or map or similar, then no.
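For anyone wondering why the second viewpoint matters so much, here's a minimal sketch of the reasoning (a toy cost model in C++, nothing to do with the actual Wii U SDK; the scene size, cost formula and function names are all invented): feeding the GamePad a full 3D view means transforming and lighting the scene all over again, whereas a 2D map or inventory pass is a rounding error by comparison.

[code]
// Toy cost model -- NOT Wii U/GX2 SDK code. The scene size, light count and
// cost formula below are invented purely to illustrate the point above:
// a second full 3D viewpoint on the DRC redoes the geometry/lighting work,
// while a flat 2D map or inventory overlay adds comparatively little.
#include <cstdio>

struct Scene {
    long vertices; // vertices transformed per full 3D view
    long lights;   // dynamic lights evaluated per full 3D view
};

// Rough per-view cost: transform every vertex, then shade against each light.
long fullViewCost(const Scene& s) {
    return s.vertices + s.vertices * s.lights;
}

// A 2D inventory/map pass is just a handful of textured quads.
long uiOverlayCost() {
    return 10000;
}

int main() {
    Scene scene{2000000, 4}; // invented numbers, for illustration only

    long tvOnly      = fullViewCost(scene);                   // TV view alone
    long tvPlus3dDrc = 2 * fullViewCost(scene);                // TV + full 3D DRC view
    long tvPlus2dDrc = fullViewCost(scene) + uiOverlayCost();  // TV + 2D DRC overlay

    std::printf("TV only:             %ld work units\n", tvOnly);
    std::printf("TV + 3D DRC view:    %ld work units (~2x)\n", tvPlus3dDrc);
    std::printf("TV + 2D DRC overlay: %ld work units (~1x)\n", tvPlus2dDrc);
    return 0;
}
[/code]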
 