Wii U hardware discussion and investigation *rename

Your "arguments" are not making any sense. :rolleyes:

Simple -> a game runs at 30 fps, so 15 GB/s / 30 = 0.5 GB per frame.
So in every frame the CPU+GPU can use only a quarter of the main memory to construct the frame.
Considering that consecutive frames use the same assets, that means 512 MB of effective memory.

In the case of the XB360, the machine can use the whole main memory to render a frame.

So even if the Wii U has 2 GB of memory, it is worth only 512 MB from a game standpoint. The remainder is just cache.

This is the reason why there are not so many 60 fps games out there: the Xbox 360 can read only roughly 256 MB of memory within one frame in 60 fps mode.
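(For what it's worth, a minimal sketch of the arithmetic being claimed here, in Python; the 15 GB/s figure is the one used in the post above, and the ~12.8 GB/s figure is only the commonly cited Wii U DDR3 peak, assumed here for illustration.)

# Upper bound on data the main-memory bus can move during one frame, in GB.
def gb_per_frame(bandwidth_gb_s, fps):
    return bandwidth_gb_s / fps

for name, bw in [("post's figure", 15.0), ("Wii U DDR3 (assumed)", 12.8)]:
    for fps in (30, 60):
        print(f"{name}: {gb_per_frame(bw, fps):.2f} GB movable per frame at {fps} fps")

Note that this only bounds how much data can be streamed per frame, not how much memory a game can usefully occupy, which is where the later replies take issue with the argument.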
 
Only 1 GB is accessible for games.
 
http://en.wikipedia.org/wiki/Shin'en_Multimedia

Maybe because the last console they worked on was the Wii.

I'm more inclined to think that Nintendo and its partners were thinking outside the box when
designing the Wii U. Just like this development:


You may be reading this while you're slowly updating a Steam game or watching a buffering Netflix show on your PS3 even though you pay for fairly speedy Internet. Regardless of how far we've come with Internet speed, and how our phones can play television shows while we wait in line at the bank, there's always something left to be desired. The maximum speed of an Ethernet cable is often preferred to the convenience, but lower maximum speed, of a wireless signal; yet with the addition of a bit of algebra to clear up network clogging, Wi-Fi signals may become much faster without any new hardware.

Project lead and MIT professor Muriel Medard found that a standard bandwidth of one megabit per second was boosted to sixteen megabits per second, a significant increase that didn't require any new hardware.
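(The excerpt doesn't spell out the scheme, which is based on network coding; as a loose toy illustration of the "bit of algebra" idea, here is a hypothetical Python sketch in which the sender transmits an algebraic combination of packets so a single loss can be solved for instead of retransmitted. Packet names and contents are made up.)

# Toy illustration only, not the actual coded-TCP scheme from the article:
# alongside packets A and B, the sender also transmits A XOR B, so the
# receiver can reconstruct any one lost packet without a retransmission.
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

packet_a = b"HELLOWORLD"
packet_b = b"WIIUTHREAD"
coded = xor_bytes(packet_a, packet_b)      # extra "coded" packet on the wire

# Suppose packet_b is lost in transit; the receiver still has packet_a and coded.
recovered_b = xor_bytes(packet_a, coded)
assert recovered_b == packet_b             # recovered with algebra, no round trip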

The redesign of the 360 made it faster.
This is not an open-and-shut case.
What are they doing more of, with less?
 
What the hell was Nintendo thinking? This launch looks like one of the greatest disasters in console history.
Has it been mentioned that there is no support for DLC yet, either?

Also, they make their partners look bad. Judging by the hardware details it's impossible to produce a port of even a mildly complex X360/PS3 game, but all the flak from the audience will hit the developers instead of the one responsible.
 
Wow, this entire launch seems like a disaster.
Too early to tell; the reviews are far from bad. Actually I would say that, as I expected, the gamepad has a lot of win to it.
As pointed out on GAF, it's now entirely possible the Wii U will be in a worse position relative to PS4/720 than the Wii was to 360/PS3 (considering it appears the Wii U may be at or below current gen, whereas presumably the Wii was > Xbox and certainly > PS2).

I did say way back when that I figured Nintendo would find a way to badly bottleneck the thing...
On the technical front I agree. I'm now convinced that my speculations, while factually wrong, about how the GPU is connected to the eDRAM may very well be right. Looking at the kind of interconnect IBM offers for its PPC 47x, some leaks about the memory hierarchy (MEM1 and MEM2), and the lack of claims about MSAA versus the amount of eDRAM rumored, I assumed that the eDRAM was not as tightly linked to the GPU as in, say, the PS2, or to the ROPs as in the 360 (which is where you need the bandwidth the most). Under those conditions you should not expect the gobs of bandwidth both of the aforementioned systems provide for rendering (i.e. where you need it the most).
The GPU and CPU are not on the same die, that much we know for sure; still, it looks like the eDRAM is used as a not-that-specialized pool of RAM.

Overall it seems I was right to assume that the tribute the system pays to backward compatibility is too high once you consider that there are emulators around; Nintendo could have bought one and reworked it to achieve better / good-enough results (not to mention that they would still have had a PPC-based CPU). It's a bit disheartening, as the few reviews I read make clear that as far as super-casual / party gaming goes, the system is only an incremental improvement over the Wii.
Though the system adds a lot of convenience through its pad and opens new possibilities for gameplay. To me it means that the system could have reached a lot of casual/occasional gamers (not the super casual, like the 50+ crowd) and won them over with the strong points of the system (the new gamepad, Wiimote and other accessories for party games, etc.). It may still have quite some success; it is tough to predict customer reactions. One could turn out right based on false premises.
I think that Nintendo had a shot at extending its reach to more traditional gamers. IMO they will reach some of them, but it could have gone further if the system had been a tad beefier, which it could have been without either breaking the power constraints set for the system or increasing production costs.
It seems that Nintendo made some short-term savings on the RAM and used a type of RAM that is cheap but already out of fashion, and it's unclear how costs will evolve over the system's lifetime. I'm not sure; it is definitely something I expect Nintendo to have considered. The thing is, once you account for the importance of RAM, I'm not sure that was the spot to save a few bucks (or less, versus a tad better RAM). One's opinion might get worse when you consider that the design uses a more expensive process to accommodate eDRAM, is not an SoC, uses an MCM, etc.

Looking back, I was convinced that a Power A2 + GPU SoC was the way to go.
The Power A2 is far from perfect, but neither were Xenon or Cell. It is low power, it is tiny, and it is better than Xenon on every metric. It supports transactional memory, which would have made devs' lives a tad easier. For those concerned about the number of threads available, to me it is a non-issue; I mean, you don't have to use all 16 threads a simple quad-core would provide. Nintendo could have limited the number to, say, 8, end of story.
The Power A2 is meant for an IBM process that allows for eDRAM, so looking at the money Nintendo seems willing to spend, I would now discard it. PPC 470s on TSMC or Samsung / whatever foundry was the way to go, just use more of them. Six sounded right to me.

For the GPU, something akin to Trinity would have been a good place to start. Trinity runs every game at 30 fps or more at 720p (way more in some cases), and while being fed from DDR3.
Nintendo could have stuck to a dual-channel DDR3 memory controller, or investigated the following options, their feasibility (I'm not an engineer) and costs:
3 channels of DDR3 (a 192-bit bus)
1 channel of DDR3, 1 channel of GDDR5 (with a tiny amount of GDDR5, like 256 MB, so you only render there; textures are read from the main RAM / DDR3).
1 channel of DDR3, 2 channels of GDDR5 (again with a tiny amount of VRAM)

Overall (if doable) I think option 2 might have given a nice boost to performance if we use Trinity as a reference. It is not a crazy increase in bandwidth, as the numbers would remain low (something like ~45 GB/s vs 30 GB/s; a rough sketch of the channel arithmetic follows below), but it is still a 50% increase, not too shabby.
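(A rough sanity check on those figures, assuming textbook peak-bandwidth arithmetic per channel, i.e. bus width times transfer rate; the DDR3-1600 and 6 Gbps GDDR5 speed grades and the 32-bit GDDR5 channel width are my assumptions, so the totals only illustrate the relative jump, not anything confirmed for the hardware being discussed.)

# Peak theoretical bandwidth of one memory channel, in GB/s.
def channel_gb_s(bus_width_bits, transfers_per_s):
    return bus_width_bits / 8 * transfers_per_s / 1e9

ddr3 = channel_gb_s(64, 1.6e9)     # one 64-bit DDR3-1600 channel, ~12.8 GB/s
gddr5 = channel_gb_s(32, 6.0e9)    # one 32-bit 6 Gbps GDDR5 channel, ~24 GB/s

print(f"2x DDR3-1600       : {2 * ddr3:.1f} GB/s")           # Trinity-like baseline
print(f"3x DDR3-1600       : {3 * ddr3:.1f} GB/s")           # option 1
print(f"1x DDR3 + 1x GDDR5 : {ddr3 + gddr5:.1f} GB/s")       # option 2
print(f"1x DDR3 + 2x GDDR5 : {ddr3 + 2 * gddr5:.1f} GB/s")   # option 3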

They should have gone with a tad faster RAM (DDR3-1600 is worth every penny) and more of it; if they were going to lock 1 GB for the OS / non-gaming functionality, 3 GB sounded right.

I would be surprised if a single SoC (so including 6 pretty tiny cores, a sane amount of cache, a kurt-like GPU, 2 memory controllers, the few accelerators Nintendo uses, and the supposedly present ARM core(s)), even taking into account the extra they would have spent on RAM (3 GB of DDR3-1600, 256 MB of cheap GDDR5 for the VRAM), would have ended up more expensive than producing two chips on supposedly two different eDRAM-capable processes, testing both chips, having an MCM, etc., plus the related R&D costs.

For me BC is important (not to me personally, but I get that it is important), but at some point you have to stop looking backward; the constraints imposed by BC seem to me to have led Nintendo in the wrong direction. At some point you have to move forward, especially as the Wii relied on the same tech as the GameCube, and that is eleven-year-old technology.

That is more OT, but I'm still not sure what to think about the system. I like the idea of the gamepad and clearly see the benefits it brings (convenience and new gameplay), my wife loves Nintendo games, and the system should play PS360-class games, but it forces a rough arbitration, i.e. buying a system I know is no better than the 360 I sold one year ago and had bought five years before. Contemplating that choice leaves me with a bitter taste in my mouth, a sense of missed opportunity for Nintendo and for me. As it is, I will wait a long while before I make any decision (so most likely at least one year). I want to see if the system sells, and to a sane demographic, whether the genres that the pad should enable come to the system (or whether the demographic is simply not there and devs pass), etc. I'm not that concerned about next-gen performance; for all I know I might buy a new PC in 2014 (Haswell or better, and 22nm should be available for GPUs too). The big "but" is that I'm usually not pleased to go backward... As a matter of principle, I'm not sure I want something that performs no better than the 360 I owned or the laptop I'm currently playing on (and that is not top of the line). The system will have to evolve in a really nice way for the strong points I see in the concept to win me over despite what I see lacking in the hardware.

And there's a 5 GB, one-hour-plus download out of the box for basic functionality. If Microsoft launched a console in this state the internet would have burned down from the rage.
Well, Sony got its fair share of complaints on similar issues, and MSFT gets criticized on forums any time Live is down. IMO it's not really a good idea to get into that kind of consideration; the internet bubble is just that, an Internet bubble, with fanboys, haters, sane people, etc. Let's not kill the discussion with that crap :)
 
Doesn't matter.
The game can use only half a GB; the remaining half GB is for cache, and there is another gig of cache for the operating system.

Can you provide a source link for this claim that game developers can use at most 512 MB of RAM for their games?
 
Doesn't matter.
The game can use only half a GB; the remaining half GB is for cache, and there is another gig of cache for the operating system.
:LOL:
I give up.
Can you provide a source link for this claim that game developers can use at most 512 MB of RAM for their games?
No, he can't. It is his own opinion based on some questionable assumptions and dubious logic.
 
Simple -> a game runs at 30 fps, so 15 GB/s / 30 = 0.5 GB per frame.
So in every frame the CPU+GPU can use only a quarter of the main memory to construct the frame.
Considering that consecutive frames use the same assets, that means 512 MB of effective memory.
What if you have a complex world with lots going on that you want to store that isn't graphics assets? What if you want to accumulate multiple rendered frames for special effects? How about generating and storing procedurally constructed assets, say loading 100 MB of people assets and constructing 500 MB of people assets in RAM at load time to save constructing them on the fly?
 
What if you have a complex world with lots going on that you want to store that isn't graphics assets?
Bomlat's argument breaks down at his assumption that all frames in a certain period need the same assets and therefore the same memory. He basically wants to render a static scene with a fixed camera. According to his logic, we should wait 10 seconds each time we turn around in a game so the engine can load the needed stuff from disc. :LOL:
Your arguments then come on top of that.
 
Simple -> a game runs at 30 fps, so 15 GB/s / 30 = 0.5 GB per frame.
So in every frame the CPU+GPU can use only a quarter of the main memory to construct the frame.
Considering that consecutive frames use the same assets, that means 512 MB of effective memory.

In the case of the XB360, the machine can use the whole main memory to render a frame.

So even if the Wii U has 2 GB of memory, it is worth only 512 MB from a game standpoint. The remainder is just cache.

This is the reason why there are not so many 60 fps games out there: the Xbox 360 can read only roughly 256 MB of memory within one frame in 60 fps mode.

You seem to be confusing memory bandwidth with memory size.
 
Bomlat's argument breaks down at his assumption that all frames in a certain period need the same assets and therefore the same memory. He basically wants to render a static scene with a fixed camera. According to his logic, we should wait 10 seconds each time we turn around in a game so the engine can load the needed stuff from disc. :LOL:
Your arguments then come on top of that.

Check all consoles prior to the Wii U: divide the bandwidth by 30 and you will get a number bigger than the memory size.

The Wii U is the first console that doesn't follow this principle.

1500 MB is nothing else but a big cache to eliminate loading :D
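(Taking that claim at face value, here is a quick check of the "bandwidth divided by 30 versus memory size" comparison for a few consoles, using commonly quoted peak main-memory bandwidth figures; treat all numbers as approximate assumptions.)

# Rough check of the "bandwidth / 30 frames vs. total RAM" comparison.
# Bandwidth and RAM figures are commonly quoted peaks, approximate only.
consoles = [
    # (name, peak main-memory bandwidth in GB/s, main RAM in MB)
    ("PS2",      3.2,  32),
    ("Xbox 360", 22.4, 512),
    ("Wii U",    12.8, 2048),   # rumored DDR3 figure, assumed here
]

for name, bw, ram in consoles:
    per_frame_mb = bw / 30 * 1024
    print(f"{name}: ~{per_frame_mb:.0f} MB movable per frame at 30 fps vs {ram} MB of RAM")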
 
The bandwidth defines how much memory you can use to render the game.

The problem is your assumptions are flawed.
You need to have a lot more than what's on screen to render the scene. And you certainly don't have to touch all of the memory to render.

I do agree the bandwidth is on the low side, but given that Nintendo, AFAICS, wanted something on par with PS360, it's in the right ballpark.
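(To put some purely illustrative numbers on ERP's point: the data a renderer actually touches in one frame (a few render targets plus the visible subset of textures and geometry) is typically a small fraction of what a game keeps resident. Every figure below is a made-up, order-of-magnitude guess.)

# Illustrative only: every figure below is a made-up, order-of-magnitude guess.
# The point: data touched per frame << total data a game keeps resident in RAM.
MB = 1024 * 1024

render_targets   = 3 * 1280 * 720 * 4      # a few 720p targets at 4 bytes/pixel
sampled_textures = 150 * MB                # textures actually sampled this frame
drawn_geometry   = 30 * MB                 # vertex/index data actually drawn

touched = render_targets + sampled_textures + drawn_geometry
resident = 1500 * MB                       # everything kept in RAM for the level

print(f"touched per frame : {touched / MB:.0f} MB")
print(f"resident in RAM   : {resident / MB:.0f} MB")
print(f"fraction touched  : {touched / resident:.0%}")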
 
ERP said:
Nintendo, AFAICS, wanted something on par with PS360, it's in the right ballpark.
I think that is the main problem... I imagine most people were expecting something at least a little better than the 360.

Initial impression is that it is gimped by the CPU and low bandwidth...
 