Wii U hardware discussion and investigation *rename

I'm seriously starting to think that even if the Wii U put the fastest PCs out there to shame, both speed- and tech-wise, TheChefO still wouldn't be happy with it, and Nintendo could do nothing right.

I will say that hardware tech alone isn't going to solve all their problems.

They would also need to work on dev tools and get their own internal dev teams up to snuff on cutting edge development (or even just UE3) as well in order to succeed. So yes, you are correct in that hardware alone will not be enough.

Nintendo will have to get off the pile of cash they made on the Wii and reinvest it into their company (like other modern businesses that aim for growth do). And while they're at it, hire a financial adviser to help them avoid yen/dollar and yen/euro issues. It's like a comedy of errors over there. They keep effing up, but somehow still avoid the pitfalls and come out alive.



"If Nintendo don't do what I want them to do, they're doomed."

:LOL:

Personally, I don't care either way. I didn't buy a Wii (I played other people's enough to see the glaring and frustrating pitfalls of the tech), and didn't buy an N64. I bought a GC when it was on sale for $100 to go back and play some games I missed out on, but I'm certainly not a hardcore N fan, and from what I've seen, I won't be buying a Wii U.

Overall, I just want to see a healthy competition between Sony/MS/N.

My comments of caution for Nintendo come more out of a desire to see that competition thrive, with a hint of "ooh, I hope they aren't that stupid."

I enjoy the dynamics of the console biz, but I don't want to see another Sega moment.

Which process is it on? I used to think 40nm, but that gives a performance/power disadvantage and thus a slower GPU within the Wii U's small power budget.
Then again, maybe 40nm is cheap and has high availability.

If it uses VLIW5 or VLIW4, why not? That's what is in current and future AMD APUs.
I expect low specs: if it's VLIW5, maybe 240 SPs; if it uses the Radeon 6970 architecture, then 256 SPs; either way with GDDR5 on a 64-bit bus.

It's more than good enough. For instance, I've seen a Sandy Bridge laptop with a renamed Radeon 5450 with DDR3; it's pretty good at running games at 768p (we tried Far Cry 2 at default settings), and it's an order of magnitude better than stuff like the X300 SE we had a few years ago.
The hard-to-find Radeon 6450 with GDDR5 is twice as fast. Something a bit above that compares well with PS360, I guess.
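For a rough sense of scale, here's a back-of-the-envelope sketch (mine, not from the posts above) comparing those speculated shader counts against Xenos, the Xbox 360 GPU, counting a multiply-add as 2 FLOPs per stream processor per clock. The Wii U clock used below is an assumed placeholder, not a known spec.

[code]
# Peak programmable shader throughput: SPs x 2 FLOPs (multiply-add) x clock.
# Wii U clocks are ASSUMED for illustration only; Xenos figures are the
# commonly cited 48 ALUs x 5 lanes at 500 MHz (~240 GFLOPS).
def gflops(stream_processors, clock_ghz):
    return stream_processors * 2 * clock_ghz

configs = {
    "Xenos (Xbox 360): 48 ALUs x 5 lanes @ 0.5 GHz": gflops(48 * 5, 0.5),
    "Speculated VLIW5 Wii U GPU: 240 SPs @ 0.5 GHz (assumed clock)": gflops(240, 0.5),
    "Speculated VLIW4 Wii U GPU: 256 SPs @ 0.5 GHz (assumed clock)": gflops(256, 0.5),
}

for name, value in configs.items():
    print(f"{name}: {value:.0f} GFLOPS")
[/code]

At an assumed clock around 500 MHz, either configuration lands in the same ballpark as Xenos's peak figure, which is roughly the "compares well with PS360" point made above.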

Again, if they're aiming for 2005 hardware that is on its way out, how long do they expect it to last, and how much do they reasonably expect people to pay?

"But it's got a tablet!" Who cares? The ability of the tablet to co-display will obviously be limited as one of the design bullet points was that the TV could be freed from gaming at any given moment and toss the tv image to the tablet. Thus, the interaction needs to be able to fit into a strictly tablet use case.

So this puts the interaction of WiiU at a tablet level. Now how about price?

If it is essentially a playbox360 with a tablet, I'm guessing they can squeeze the BOM in somewhere near the $250 mark by the end of the year. If they sell it at that break-even price, they might have a shot, but knowing Nintendo, they will likely try to squeeze a crazy margin out of the box on day one and mark it up to $350.

Never mind the fact that it isn't focused on motion gaming anymore, which was the entire lure of the Wii, and never mind that the casual market is being targeted heavily by the likes of Apple, Google, Facebook, and MS (Kinect).

Nintendo is stuck between a rock and a hard place here.

They have been since the Project Natal unveil. They would have been better off partnering with a big player at that time (or licensing Kinect). As is, we have a hardware-averse toy company competing in this battle from a disadvantageous position.

Their one shot at competing solo going forward was/is not with the casual crowd and a halfbreed tablet. Their one shot was with a core targeted console. A dire and slim chance, but that is really all they had here.

Otherwise, suck it up, form a strategic partnership with Apple/Google/MS/Sony and become a software developer that also happens to make portable consoles (for now).
 
"But it's got a tablet!" Who cares? The ability of the tablet to co-display will obviously be limited as one of the design bullet points was that the TV could be freed from gaming at any given moment and toss the tv image to the tablet. Thus, the interaction needs to be able to fit into a strictly tablet use case.

I hadn't read about this, and understood that a game could require the use of both the controller and TV. Have Nintendo stated this somewhere?

So this puts the interaction of WiiU at a tablet level.

Irrespective of processing power, an iPad will never have the standardised, high quality controls that the WiiU has and its games are unlikely to target dual displays. What makes the WiiU platform unique is what Nintendo are banking on for success. Comparisons to the iPad fundamentally miss what the WiiU is and what the base platform provides.

"Virtual thumsbicks" and buttons and a lack of analogue should buttons are a necessary evil when you're gaming on a device that wasn't designed for gaming. An iPad could be 1,000,000 more powerful than a real games console or a PC but I'd still have one (of each) because I want better than a view blocking touch screen interface for everything. Graphics whores aren't the only type of gamer out there, and even graphics whores can have standards elsewhere!
 
Who then?
It's impossible to support varied, unknown hardware configurations at a low level. The DX/OGL APIs are essential to supporting upgradeable or varied hardware from multiple IHVs. DX has evolved to support lots of great features that aren't being used because the PC market doesn't have a large enough, relevant enough install base of high-end GPUs to make it worthwhile for devs to target them. They target the lowest common denominator, which is DX9 integrated graphics and the like. It's worth noting that in the past the IHVs provided access to unique capabilities of their hardware, but those were never targeted by developers, for sane economic reasons: "So what if this new ATi board supports hardware-accelerated scrolling - hardly anyone owns it, so we're not going to waste our time targeting it specifically."

As Joe Public upgrades their PC, they eventually get access to later DX features simply because they're included, rather than by choice, moving the common-denominator target up a level. But it's Joe Public not wanting to invest in the latest hardware, and the resulting developer reluctance to target the latest hardware, that limits what PCs achieve. If not for DirectX, PC gaming wouldn't exist at all, so complaining about how it's holding the sector back is pretty ridiculous. And going forwards, devs are going to use middleware more and more, because writing low-level code for such monster processors is very difficult and costly. All the console manufacturers are talking about ease of development because that's what the industry needs more than ultimate performance extraction from every transistor.
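To make the lowest-common-denominator economics above concrete, here's a toy sketch; the install-base shares are made-up numbers purely for illustration, not real survey data.

[code]
# Hypothetical shares of the PC gaming install base by DirectX feature level.
# The point: requiring a newer baseline shrinks the market you can sell to,
# which is why devs keep targeting a DX9-class baseline.
order = ["DX9", "DX10/10.1", "DX11"]                     # oldest -> newest
share = {"DX9": 0.55, "DX10/10.1": 0.25, "DX11": 0.20}   # made-up numbers

for i, minimum in enumerate(order):
    reach = sum(share[level] for level in order[i:])     # this level or newer
    print(f"Require {minimum}+ as baseline: reach {reach:.0%} of the install base")
[/code]

With those (invented) shares, a DX9 baseline reaches 100% of the market while a DX11-only game reaches 20%, which is the economic argument in a nutshell.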
 
It's impossible to support varied, unknown hardware configurations at a low level. The DX/OGL APIs are essential to supporting upgradeable or varied hardware from multiple IHVs. DX has evolved to support lots of great features that aren't being used because the PC market doesn't have a large enough, relevant enough install base of high-end GPUs to make it worthwhile for devs to target them. They target the lowest common denominator, which is DX9 integrated graphics and the like. It's worth noting that in the past the IHVs provided access to unique capabilities of their hardware, but those were never targeted by developers, for sane economic reasons: "So what if this new ATi board supports hardware-accelerated scrolling - hardly anyone owns it, so we're not going to waste our time targeting it specifically."

As Joe Public upgrades their PC, they eventually get access to later DX features simply because they're included, rather than by choice, moving the common-denominator target up a level. But it's Joe Public not wanting to invest in the latest hardware, and the resulting developer reluctance to target the latest hardware, that limits what PCs achieve. If not for DirectX, PC gaming wouldn't exist at all, so complaining about how it's holding the sector back is pretty ridiculous. And going forwards, devs are going to use middleware more and more, because writing low-level code for such monster processors is very difficult and costly. All the console manufacturers are talking about ease of development because that's what the industry needs more than ultimate performance extraction from every transistor.


http://www.industrygamers.com/news/pc-developers-just-want-direct-x-to-go-away/

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1

http://games.slashdot.org/story/11/...tting-in-the-way-of-pc-game-graphics-says-amd

http://forum.beyond3d.com/showthread.php?t=59795


(he makes a small comment about it)
http://www.youtube.com/watch?v=NxFdsEtr-TY

(discusses it at the beginning, after 3 min)
http://www.youtube.com/watch?v=hapCuhAs1nA
 
I hadn't read about this, and understood that a game could require the use of both the controller and TV. Have Nintendo stated this somewhere?

It was at their product unveil at E3.


An iPad could be 1,000,000 times more powerful than a real games console or a PC, but I'd still have one (of each) because I want better than a view-blocking touch-screen interface for everything.

You and I (and all other core gamers) would have a problem being stuck without thumbsticks, but there are many more who have no problem without them. Unfortunately for Nintendo, their demographic is of the casual kind that can (and does) do without ...

Factor in the very likely scenario of an iPad/tablet having Xbox 360+ graphics in the next few years and you see a certain dilemma with aiming for such a low spec. Especially when such a low target is completely unnecessary.
 

That's four links to the same story bro. The B3D one has an interesting discussion following it btw.

I'm not trying to argue that DirectX is a bad thing - just that (like most things) it comes with a cost.

For the vast majority of developers the 'cost' of using DirectX is well worth it. For some very high-end ISVs like DICE it has become a bottleneck that they'd like to see go away.

I guess if I were to refine my wording I'd say some ISVs want to be able to render PC graphics without using an API like DirectX or OpenGL...

Personally, as a consumer, I don't want that. Back in the late '90s there was nothing glorious about having two graphics cards in my PC and having lots of stuff supersede them quickly and not work properly (or at all) on either. The great thing about DX is that games can run on your hardware long after an IHV has stopped paying pubs to target it and long after elite developers have stopped getting excited over its new features. Undermining a (relatively) hardware-agnostic API for a short-term boost for a big-budget publisher/developer/nVidia is not in my interests.
 
Personally, as a consumer, I don't want that. Back in the late '90s there was nothing glorious about having two graphics cards in my PC and having lots of stuff supersede them quickly and not work properly (or at all) on either. The great thing about DX is that games can run on your hardware long after an IHV has stopped paying pubs to target it and long after elite developers have stopped getting excited over its new features. Undermining a (relatively) hardware-agnostic API for a short-term boost for a big-budget publisher/developer/nVidia is not in my interests.

In general I agree, but looking at the current market, there are only two GPU manufacturers (until PowerVR shows their cards).

At the time of DX's invention it was the 3D wild wild west, with new cards and technology constantly on the rise.

Matrox
Intergraph
3Dfx
S3
Nvidia
Ati
3Dlabs
PowerVR
etc.

These days, with only two real vendors (Intel integrated graphics can cling lovingly to DX for dear life), it might make sense to abandon it. Though I think the performance hit at this point has more to do with PC architecture and the OS than with DX, so then again, maybe not. :p
 
In general I agree, but looking at the current market, there are only two GPU manufacturers (until PowerVR shows their cards).

You're not talking about GPUs, you're talking about graphics cards for x86 systems.
During the years following Windows 8's release, pretty much everything will change, since the OS will support GPUs from:

- nVidia
- AMD
- Intel
- PowerVR
- ARM (Mali)
- Vivante
- Qualcomm (Adreno)
- HTC/VIA (S3 Chrome)
- Probably some more that I'm not recalling right now

Eventually, you might also want to add whatever Huawei has in their SoC and Broadcom's follow-up to the fairly successful VideoCore IV.


So yeah, wild wild west all over again.
You still think we don't need high-level APIs for PCs?
 
As an avid Linux user, I'd really love it if OpenGL would become more relevant again. Well, the Khronos Group was really to blame too, for kicking OGL into the mess it's in now (no updates for LONG times, etc.)... but with 4.0 and now 4.2 it's picking up steam again, and since basically all handheld devices also use OpenGL (well, OGL ES), it might get more traction again.
 
Are you actually going to debate this at any point?

This one article is saying DX has overheads. Yep. That doesn't change any of my points though. You cannot code to the metal on unknown configurations. It's possible to target specific configurations, or even families such as GCN, for those who have that card, but what about older cards? Supporting a market larger than just a couple of million (or whatever) top-GPU owners requires a hardware abstraction layer.

And all of that is moot when faced with the cost of creating improved assets. Even with 10x the power (which PCs do have - it's not like DX is capping them to console levels of capability), developers aren't going to throw money at creating 10x the game on the PC when they can just port over the same console assets and render them at higher IQ with a couple of niceties like PhysX particles and a better lighting engine. It doesn't make economic sense for most developers to go all-out for the ultimate PC experience. A couple will give it a go, like Crytek. Very, very few devs are going to be banging their heads against DX, lamenting how little power they have to work with. Very, very many devs will be frustrated at having to test against several different hardware configurations while alienating a lot of older PCs.
 
You're not talking about GPUs, you're talking about graphics cards for x86 systems.
During the years following Windows 8's release, pretty much everything will change, since the OS will support GPUs from:

- nVidia
- AMD
- Intel
- PowerVR
- ARM (Mali)
- Vivante
- Qualcomm (Adreno)
- HTC/VIA (S3 Chrome)
- Probably some more that I'm not recalling right now

Eventually, you might also want to add whatever Huawei has in their SoC and Broadcom's follow-up to the fairly successful VideoCore IV.


So yeah, wild wild west all over again.
You still think we don't need high-level APIs for PCs?

At least the Wild West back then had more than 2 APIs, but yes, it's getting interesting again, especially with AMD and Nvidia having to take on the mobile market in some form.
 
As an avid Linux user, I'd really love it if OpenGL would become more relevant again. Well, the Khronos Group was really to blame too, for kicking OGL into the mess it's in now (no updates for LONG times, etc.)... but with 4.0 and now 4.2 it's picking up steam again, and since basically all handheld devices also use OpenGL (well, OGL ES), it might get more traction again.
Judging by the development of GLES, Khronos are perfectly capable of governing such standards. You can blame the 'graphics workstation' ISV/IHV fat cats for the state of desktop OGL - their CAD/DCC/driver codebases were too big, and they had no desire to bring them up to date with a more modern OGL. Hence the gargantuan legacy creep OGL had (and still has, to some degree) to cope with for generations.

'The fate of an open standard is not dictated by the governing committees as much as by the major users.' -- an ancient Sumerian saying.
 
These days, with only two real vendors (Intel integrated graphics can cling lovingly to DX for dear life), it might make sense to abandon it. Though I think the performance hit at this point has more to do with PC architecture and the OS than with DX, so then again, maybe not. :p

My take on it is something like this:
Thanks to AMD, Intel are starting to take graphics semi-seriously again! And they're doing it in a non-destructive way, thanks to DX. DX has really allowed ATI and, now, AMD to remain competitive - I think that without it Nvidia would have dominated high-end graphics, and mainstream PC gaming could have been stuck around Intel integrated level. DX seems to have been a focal point around which tons of money has been thrown at R&D, and that's now feeding out into console land and making tech available to people like Nintendo and MS that they wouldn't have been able to afford to develop independently.

I could be wrong of course, and this is all OT so I should probably leave it there. Suffice to say, I disagree with Stewox that IHVs and DX are primarily to blame for the way PC gaming hardware gets used; I think it's mainly down to customers (people like me), and I don't hate them (me) for it. DX10 has been around for a loooooooong time now; it's neither MS's nor AMD/Nvidia's fault that so many games are still being made for a DX9 baseline.
 
No-one. I'm making the point that it seems premature to worry about low level access when most games haven't even moved beyond DX9 limitations. But this is all getting very off topic now.


You don't understand - it's not about features, it's about performance that Carmack is talking about.

The drivers of the industry are just a few developers, and id Software is one of them. Do you think Zynga drives the industry? Who cares about those losers. We aren't talking about mainstream developers trying to use this; obviously the most talented developers, the ones that actually CARE, are the ones that want to see the API go away. Not all of them are even experienced enough to write to the metal anyway.

And he's talking for himself, not for the other companies, who don't worry about this much, probably because they're not as experienced.

As Carmack said, it's the access to the hardware that's keeping PC games from running as fast as they should, plus OS overhead, and we all know how much buggy code is in Windows compared to Linux.

The overhead issue is underestimated; it makes the whole difference. But as we can see with consoles, it can't magically fix memory insufficiency.


And my answer to "premature"... that's a subjective opinion. Some of the devs who know what they're doing want this, and who's going to stop them? I support them, because I don't see how mainstream developers should have any valid opposing input on this; they can keep using their APIs, and they aren't forced to do to-the-metal programming.

And you miss the fact that this is not about getting rid of the API or disabling it; it's about the ability to bypass and override it. Obviously you're not going to write the WHOLE game in assembly.

And for the record, whoever said DX has to-the-metal programming features - those are a joke. Ask Carmack; it's just a smokescreen, and the drivers are doing something else behind your back.

As for why "premature" is a joke: Carmack has been pushing and pressuring vendors about this for more than half a decade. I can't trace back his words on it since it's been a few years since I heard him say it; maybe I saw it in an interview.

In the end, he implies it but doesn't say it: with that access, Carmack essentially wants to write his own drivers, which means overriding and bypassing a lot, pretty much. That is obviously the best way to get maximum performance; those who write drivers for AMD/Nvidia aren't game developers to begin with, and second, they aren't really putting that much effort into making them as good as possible.

https://twitter.com/#!/ID_AA_Carmack/status/50277047104323584
https://twitter.com/#!/ID_AA_Carmack/status/50277106856370176
https://twitter.com/#!/ID_AA_Carmack/status/30655938016837632


There might as well be troll developers who spit bullshit on the web just because they think they know better, or because they don't realize what Carmack is about. Probably some noobs who work for "insert_major_publisher_here"... imo.

http://gamedev.stackexchange.com/qu...ance-and-do-windows-8-do-something-to-address
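As a rough illustration of the draw-call overhead argument (the per-draw costs below are assumed numbers for the sketch, not measurements of any actual API or driver), here's what CPU-side submission cost does to a 60 fps frame budget:

[code]
# Toy model: CPU time spent just submitting draw calls each frame.
# Per-draw costs are ASSUMED for illustration; the point is only that
# (cost per draw) x (thousands of draws) can eat a big slice of 16.7 ms.
FRAME_BUDGET_MS = 1000.0 / 60.0

def submission_cost_ms(draw_calls, per_draw_us):
    return draw_calls * per_draw_us / 1000.0

scenarios = {
    "Thick API path, 5000 draws @ 2.0 us each (assumed)": submission_cost_ms(5000, 2.0),
    "Thin low-level path, 5000 draws @ 0.4 us each (assumed)": submission_cost_ms(5000, 0.4),
    "Thick API path, batched down to 1000 draws @ 2.0 us (assumed)": submission_cost_ms(1000, 2.0),
}

for name, cost in scenarios.items():
    share = cost / FRAME_BUDGET_MS
    print(f"{name}: {cost:.1f} ms of a {FRAME_BUDGET_MS:.1f} ms frame ({share:.0%})")
[/code]

Under those assumptions the thick path burns well over half the frame on submission alone, which is the kind of gap the "40-50% wasted" claims in this thread are gesturing at; batching, or a thinner API, both attack the same term.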
 
I don't see the advantage of low-level access in the PC space. All you're going to end up with is one giant mess of games not running properly on a lot of hardware because the game is trying to do stuff written for a different GPU. My old 4870 can still play modern games. So that's the 4xxx, 5xxx, 6xxx and 7xxx series from AMD alone. Nvidia probably has four generations since then as well, so a dev would have to keep in mind at least eight different GPUs, and that's not even counting the differences within a series.

I'd rather put up with the ''inefficiencies'' of DX and have games work properly even on older hardware than have to sort out the mess of whether this particular game is going to run on my particular GPU.
 
I don't see the advantage of low-level access in the PC space. All you're going to end up with is one giant mess of games not running properly on a lot of hardware because the game is trying to do stuff written for a different GPU. My old 4870 can still play modern games. So that's the 4xxx, 5xxx, 6xxx and 7xxx series from AMD alone. Nvidia probably has four generations since then as well, so a dev would have to keep in mind at least eight different GPUs, and that's not even counting the differences within a series.

First of all, you failed to understand my point.

Second, it's your opinion, from who knows what you do for a living.

Third, your claims are complete

Fourth, those devs that don't know how to program obviously are not forced to release broken games. A giant mess is up to them; if that happens, the ones doing it are all stupid, and the few games from the devs that actually know how to use this will make awesome progress.

Fifth, the "problems" you brought up are no problem for the developers that have the experience and tools to deal with them.

I'd rather put up with the ''inefficiencies'' of DX and have games work properly even on older hardware than have to sort out the mess of whether this particular game is going to run on my particular GPU.

That's your opinion, and since it won't be like that it's not an issue to begin with.



It's up to you to view it from whatever point you want, but the fact is, the drivers of the industry don't really care about the opinion of the mainstream.

Either you are mainstream and view this in favor of the mass mainstream market, or you view it in the best interest of creative progress and innovation. No thanks, I'm for innovation and progress; those that can't keep up can go do something else if they don't like it - you can be a cook or a toilet cleaner, or whatever your ideals work best for, it's up to you.

And I will end this discussion here; it is pointless to explain to people without a fundamental understanding of the word "perfection".

So whatever you guys say, I'm just warning you before you get surprised one day: everything you say is irrelevant. Carmack and others will drive and influence the vendors to open up lower-level access whether you like it or not, and if you don't like it, then there's a big chance you have no idea what you're talking about.

The standardization that was made brought security and stability to programming; now people are much more experienced, and the standardization is in the way of innovation. Ask Carmack and you'll get the same answer.


I do not look at this as a consumer. I do not look at it in terms of how many games I will be playing next week. I don't play a lot of games anymore, except Nintendo first parties and StarCraft 2 currently; everything shitty is filtered out as a waste of time, so it's not an issue to buy hardware for the game, rather than games for the hardware.

The fact that the 4870 runs all your modern games is down to the massive RAW power that PC GPUs have over consoles, but also to the fact that PC games haven't gone up much in visual quality, which is a totally different thing and has nothing to do with

By using low-level optimizations, you could keep your card for like 3 more years, just because you would suck out all the performance it has.

Pretty much 40-50% of the performance vanishes into thin air because of the overhead.

And the thing about caring... do you think the creator of first-person shooters even cares about how other developers would have a hard time programming on the new technology? Like, "oh, those poor EA people won't be able to do this, so I will not seek innovation, to my own loss and that of my legacy, and I'll stick my head in this old mindset for 10 more years, just because the mainstream isn't experienced enough, oh"? Get out of here.

You people consider yourselves technology conservatives. Why the heck conserve something Microsoft made and then put onto their console to make a profit out of 13-year-old children (who then get on forums and start tossing BS about tech), using half-broken software? ... Obviously the untalented Zynga assholes want to capitalize, while the big industry can't get paid what they DESERVE - massive budgets but undercut results - because of a single bullshitter of a company named Microsoft, who are obviously ALLIED with the casual/mainstream MAX PROFIT FOR LESS WORK crowd.

The reason the drivers of the industry push forward is to not go out of business. The problem is larger than you think; the problem is not small. The new generations of kids are being fed this stupid casual-game crap, which makes them less smart than the '90s generation, which was presented with the hardest games.

It's the INVASION of financial interests and casual smarthats into the strong, talented and creative industry that was once stellar; these pests made it steer from its original PATH, and I will vigorously defend against these capitalist thieves.

The mainstream industry is breeding and exploiting a generation of inexperienced gamers in exchange for profits. The more young people they convert at an early age, the more will join them, because they were fed the silly games; never having seen the real hardcore stuff, they will turn out to be the bullshitters we see some of them as now, untalented and wrong, and they will drive the same shit onward. I'm not happy with this; seeing communities of stupid people playing stupid games saddens me, and I can't do more than thank Blizzard for their efforts with StarCraft 2 - I cannot express my gratitude for how Blizzard is able to operate under such hard conditions with Activision and still make it work in the hardcore space.




You might watch the GDC 2011 Iwata keynote - I agree with him that crappy iPhone games negatively affect the industry and its legacy. I can't help but agree, because it's obvious.
http://gdc2011.nintendo.com/
 