Wii U hardware discussion and investigation

Status
Not open for further replies.
Not everything can be ported, but the Vita has the advantage of being capable by itself, so a lot of the communication only needs to happen at a very basic level, over Bluetooth or WLAN. Of course, it's not standardized, so there are going to be limits. And as it works quite differently from the Wii U, the number of ports of this functionality may be limited.

On the other hand, it may be pretty easy to do - Guacamelee, PlayStation All-Stars, Marvel vs. Capcom, the upcoming LBP2 DLC; there's definitely a number of upcoming titles that support this in some way or other. In particular, just using the Vita as a controller with a touch interface is fairly easy to do, and if that proves popular between Smart As, Wii U, and Vita, then all three consoles may start seeing some of that stuff.

But it seems pretty much set in stone that the Wii U will get the best support for this kind of functionality by a mile. As others have said, there's nothing like including such functionality in the box for getting support for it, let alone making it your defining feature.
 
Nor is the "more advanced GPU" part known yet.

It's known that the _base_ was R700-generation (DX10.1), and it certainly hasn't become less capable in terms of features; the only question is whether it's DX10.1-level, DX10.1-plus-then-some, DX11-level, or DX11-plus-then-some.
["then some" meaning some specific (possibly fixed-function) hardware Nintendo wanted for particular effects/features]
 
So they are lazy bastards

Hey bomlat, here's a thought for you. Go and show these lazy developers what you're made of! Take some classes in programming, buy "Computer Architecture: A Quantitative Approach", suit up with all the knowledge there is and show them who's the boss!

But if you cannot do that or you're not willing to do that, stop calling other people lazy. It's insulting. It's offensive. It's trolling. It's as far from technical as possible. It does not belong to this place.
 
Sigh. No, it does not help, because you are speaking out of your ass, as many here have repeatedly tried to tell you...

It is a technical topic, so ANSWER NUMBERS WITH NUMBERS.

So, why can't the Wii U use the free half gig for game caching?
 
Yes you can. There are clear indicators of bottlenecks in games. Nothing Nintendo does can change the fact that, e.g., higher resolution requires more pixel-shading power, or that more intelligent entities on screen require more CPU power.
No, but that doesn't mean that the limitations in frame rate are directly caused by these issues.
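The resolution point above is simple arithmetic: pixel-shader workload scales roughly linearly with pixels shaded per frame. A quick sketch (the resolutions and frame rate are illustrative, not measured Wii U figures):

```python
# Back-of-envelope: pixel-shader cost scales roughly with pixels per second.
# Figures below are illustrative assumptions, not measured console numbers.

def pixels_per_second(width, height, fps):
    """Pixels that must be shaded each second at a given resolution and frame rate."""
    return width * height * fps

p720 = pixels_per_second(1280, 720, 30)
p1080 = pixels_per_second(1920, 1080, 30)

print(f"720p30:  {p720 / 1e6:.1f} Mpix/s")
print(f"1080p30: {p1080 / 1e6:.1f} Mpix/s")
print(f"1080p needs ~{p1080 / p720:.2f}x the per-pixel shading throughput")
```

Which is why "just render at a higher resolution" is a 2.25x ask on the pixel-shading side before anything else changes.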


I'm sure they don't render straight to the memory-mapped file. That would be absurd. Same goes for rendering to an RT that's not in embedded DRAM.
Err... I'm sure I wrote that I didn't believe I was talking about the actual causes of slowdowns, but was pointing out our lack of actual knowledge, and gave examples of causes that could be real (if not likely) and possible to do something about, given motivation and time.

That said.
If you are in charge of porting a game to a new console in time for launch day, the one overriding concern is MEET THE DEADLINE. Given that it is a new console, and nobody may actually have had access to the hardware (most notably not even the people responsible for the tools you are supposed to be using), there are bound to be issues. And they need to be worked around. Not only do you have to solve the bugs and problems that go with porting in any circumstances, but you also have to deal with bugs in tools, OS services, and even hardware. Testing different approaches to a problem takes time, and time is what you haven't got. Nor do you have other groups to contact who have shipped product for the console and who might be able to help you out. It's new. Energy and time will go into getting stuff to work. Part of that is of course ensuring that performance is acceptable, but acceptable is all anyone is shooting for, and if you're getting squeezed for time, the acceptance limit will be adjusted downwards.
Spending time on spit and polish of the finer points of graphics rendering isn't necessarily at the top of the list when it is already good enough and there are issues demanding attention that are critical for actually getting the product out on time.

Needless to say, I don't believe that launch day ports are the best examples of what a console might be capable of visually, or in any other respect either for that matter.
 
But if you cannot do that or you're not willing to do that, stop calling other people lazy. It's insulting. It's offensive. It's trolling. It's as far from technical as possible. It does not belong to this place.

Hey, mate, there is a big "OR" there. Please don't read selectively, or jump the gun.
 
If you are in charge of porting a game to a new console in time for launch day, the one overriding concern is MEET THE DEADLINE. Given that it is a new console, and nobody may actually have had access to the hardware (most notably not even the people responsible for the tools you are supposed to be using), there are bound to be issues.
But you're painting a grim scenario which is most likely not the case.
1. This is not an alien architecture, so it's not like you've got to learn everything from scratch.
2. I used to write code for three architectures in parallel, one of them super crazy, and based on that experience I'd say a lot of bugs just won't surface in a 360 -> Wii U port because of the similarities.
3. The toolchain is most likely based on GCC or (mayyybeee) LLVM, so the chances of stuff not working, being purely broken, etc. are slim.
4. If you add a platform, you add resources, otherwise there's no point.
5. ATVI got resource management on CoD to the point of art (e.g. for the first BLOPS they knew very well that Treyarch couldn't complete the game in-house given the schedule, so they hired Certain Affinity to help, and it worked pretty well).

Anywayz - there are many options as to what might have happened. On one hand we've got unfinished tools, shaky HW specs till the 11th hour (which is BS - they've had plenty of time to manufacture Wii Us and create a huge patch for the firmware), and lazy devs. On the other, something as simple as HW at most as good as the current-gen consoles. Which combination of factors is more likely: three things failing, or one that saddens fanboys?
 
So because one developer didn't make the best use of the PC, you think 3rd-party developers in general are clueless about how to use DX10 and DX11 features and are approaching Wuu effectively from scratch?

I have no idea what you're saying here. How do you know Wuu is a bandwidth monster with effectively unlimited BW for particles etc.?

Right. So there's a very good chance the eDRAM isn't there to provide graphical power relative to the competition, but to get similar performance from a cheaper box with better margins.
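The cost argument has a concrete basis: framebuffer traffic kept on-die is traffic a narrow, cheap external bus doesn't have to carry. A rough sketch; the 64-bit DDR3-1600 bus and the overdraw factor are assumptions for illustration, not confirmed Wii U specs:

```python
# Why a 32 MB eDRAM pool helps a console with a narrow main-memory bus:
# framebuffer traffic alone can eat a large share of the external bus.
# Bus width/speed and overdraw below are assumptions, not confirmed specs.

ext_bus_bytes_per_s = (64 // 8) * 1600e6   # 64-bit bus at 1600 MT/s -> 12.8 GB/s

pixels = 1280 * 720          # one 720p frame
bytes_per_pixel_pass = 8     # 4-byte color + 4-byte depth
passes = 8                   # assumed average read/write traffic (overdraw, blending)
fps = 60

fb_traffic = pixels * bytes_per_pixel_pass * passes * fps

print(f"External bus:        {ext_bus_bytes_per_s / 1e9:.1f} GB/s")
print(f"Framebuffer traffic: {fb_traffic / 1e9:.2f} GB/s")
print(f"Share of bus if not kept in eDRAM: {fb_traffic / ext_bus_bytes_per_s:.0%}")
```

Under these assumptions roughly a quarter of the external bus would go to framebuffer traffic alone, which is exactly the traffic an on-die pool absorbs for free.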

So why relate the product to a tablet?


Or the developers put minimal effort into porting the game. There is a difference between knowing the capability and using it to its full potential.


There is a mid-range GPU in the machine, with a limited amount of texture input.
What else could you do with it?
The particles require a relatively low amount of bandwidth.

The eDRAM has to be there: 32 MB, by 90% chance.
The main memory hasn't got low enough latency to emulate the Wii.

The Wii U can take market share from the tablets.
Portable gaming, away from the TV.
So the wife can watch The X Factor and the man can play the same shooter, without her complaining about his absence. Deal :D Better than Angry Birds.
 
It's known that the _base_ was R700-generation (DX10.1), and it certainly hasn't become less capable in terms of features; the only question is whether it's DX10.1-level, DX10.1-plus-then-some, DX11-level, or DX11-plus-then-some.
["then some" meaning some specific (possibly fixed-function) hardware Nintendo wanted for particular effects/features]

Yes, but we don't know (yet) how advanced the Wii U GPU is.
 
Yes, but we don't know (yet) how advanced the Wii U GPU is.

Like I said, we do know the R700 generation (DX10.1-level) was the base, and it definitely hasn't become less advanced than that, while the PS3 has a DX9 chip in it. So yes, we do know it is definitely more advanced; the question is how much more.
 
Anywayz - there are many options as to what might have happened. On one hand we've got unfinished tools, shaky HW specs till the 11th hour (which is BS - they've had plenty of time to manufacture Wii U's and create a huge patch for the firmware) and lazy devs. On the other something as simple as HW at most as good as current gen consoles. Which combination of factors is more likely: three things failing or one saddening fanboys?

I'm out.
Be well.
 
Like I said, we do know the R700 generation (DX10.1-level) was the base, and it definitely hasn't become less advanced than that, while the PS3 has a DX9 chip in it. So yes, we do know it is definitely more advanced; the question is how much more.

The Xbox 360 is "DX9+"; it even has some kind of "tessellation". I'm not saying the X360 is DX10, but it is more advanced than DX9.
 
Like I said, we do know the R700 generation (DX10.1-level) was the base, and it definitely hasn't become less advanced than that, while the PS3 has a DX9 chip in it. So yes, we do know it is definitely more advanced; the question is how much more.
We know the CPU is inferior, and the memory bandwidth is also inferior, so the question is whether the GPU is advanced enough to compensate and surpass them in a significant way.
I'm thinking a WiiU -> PS3 port would likely need much less CPU time, and the Cell would be free for additional GFX work.
 
It has to be an RV710 variant, as the HD 4350 and 4550 match up nicely to the power consumption and baseline performance we have been seeing.

80 SPs
4 ROPs
8 TMUs
64-bit memory bus
242M transistors at 55 nm
73 mm² at 55 nm
25 W maximum power consumption at 55 nm

How much would the eDRAM add to the die size at both the 55 nm and 40 nm processes?
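One way to ballpark that question is from published eDRAM cell sizes. A very rough sketch; the cell size (IBM quoted roughly 0.067 µm²/bit for its 45 nm eDRAM) and the 2x array-overhead factor are both assumptions, and none of this is a confirmed Wii U figure:

```python
# Rough estimate of the die area 32 MB of eDRAM could add.
# Cell size and overhead factor are assumptions, not Wii U data.

CELL_UM2_PER_BIT = 0.067   # assumed dense eDRAM cell, 45/40 nm class
OVERHEAD = 2.0             # assumed factor for sense amps, redundancy, routing

def edram_area_mm2(megabytes):
    """Approximate macro area in mm^2 for the given eDRAM capacity."""
    bits = megabytes * 1024 * 1024 * 8
    return bits * CELL_UM2_PER_BIT * OVERHEAD / 1e6  # um^2 -> mm^2

print(f"32 MB eDRAM: ~{edram_area_mm2(32):.0f} mm^2 (40 nm-class cell assumed)")
```

So on these assumptions the eDRAM alone would be in the same ballpark as half the RV710's entire 73 mm² die; a 55 nm cell would be larger still.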
 
Great, so the half gig can't be used for assets, only for caching, and there is lag due to loading?
So they are lazy bastards, or didn't have enough time to port the game.

For the record, any further continued trolling by anyone with "lazy devs" remarks will be met with a nice vacation from the forums.

Thank you, have a nice day.
 
It has to be an RV710 variant, as the HD 4350 and 4550 match up nicely to the power consumption and baseline performance we have been seeing.

80 SPs
4 ROPs
8 TMUs
64-bit memory bus
242M transistors at 55 nm
73 mm² at 55 nm
25 W maximum power consumption at 55 nm

How much would the eDRAM add to the die size at both the 55 nm and 40 nm processes?

I would guess the Wii U GPU has at least 8 ROPs, unless it's clocked significantly faster than the 360/PS3 parts, just because it would otherwise have real issues matching resolutions with them.
But all of those components are modular; there is no need for it to be based on a particular retail GPU.
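The ROP argument comes down to peak fill rate. A sketch of the arithmetic; Xenos is 8 ROPs at 500 MHz, while the Wii U clocks here are hypothetical stand-ins (the real clock was not public at the time):

```python
# Peak pixel fill rate: ROP count x clock. The Wii U clocks below are
# hypothetical assumptions for illustration; Xenos figures are the known
# Xbox 360 configuration (8 ROPs @ 500 MHz).

def fillrate_gpix(rops, clock_mhz):
    """Peak pixels written per second, in Gpix/s."""
    return rops * clock_mhz * 1e6 / 1e9

xenos  = fillrate_gpix(8, 500)   # Xbox 360
wiiu_4 = fillrate_gpix(4, 550)   # hypothetical 4-ROP Wii U part
wiiu_8 = fillrate_gpix(8, 550)   # hypothetical 8-ROP Wii U part

print(f"Xenos:         {xenos:.1f} Gpix/s")
print(f"Wii U, 4 ROPs: {wiiu_4:.1f} Gpix/s")
print(f"Wii U, 8 ROPs: {wiiu_8:.1f} Gpix/s")
```

Under these assumed clocks, a 4-ROP part sits at barely half of Xenos's peak fill, which is the "real issues matching resolutions" point in numbers.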
 
I would guess the Wii U GPU has at least 8 ROPs, unless it's clocked significantly faster than the 360/PS3 parts, just because it would otherwise have real issues matching resolutions with them.
But all of those components are modular; there is no need for it to be based on a particular retail GPU.

Oh, they could have tweaked little things here and there, but you can't deny that just looking at the RV710, it screams Wii U :LOL:
 
Remind me not to try to type replies on my phone.

The die size of the Wii U GPU is more than double that of the RV710 you mention.

Even accounting for the eDRAM and other logic, it would be way off.
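The "way off" claim can be sanity-checked with simple area scaling. A rough sketch; taking the Wii U die at double the RV710's 73 mm², as the post above suggests, and reusing a ~36 mm² eDRAM estimate, both of which are assumptions:

```python
# Sanity check: shrink the RV710 from 55 nm to 40 nm with an ideal
# linear-dimension scale, subtract an assumed eDRAM block, and see how
# much die area is left unexplained. All inputs are assumptions except
# the RV710's published 73 mm^2 at 55 nm.

RV710_MM2_55NM = 73.0
WIIU_GPU_MM2 = 2 * RV710_MM2_55NM   # "more than double", per the post
EDRAM_MM2 = 36.0                    # rough 32 MB eDRAM estimate (assumption)

rv710_40nm = RV710_MM2_55NM * (40 / 55) ** 2   # ideal area shrink
unexplained = WIIU_GPU_MM2 - rv710_40nm - EDRAM_MM2

print(f"RV710 shrunk to 40 nm: ~{rv710_40nm:.0f} mm^2")
print(f"Unaccounted-for area:  ~{unexplained:.0f} mm^2")
```

Even on these generous assumptions, an RV710 core plus eDRAM leaves tens of mm² of silicon unexplained, which is why the straight RV710 theory doesn't add up.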
 