Wii U hardware discussion and investigation *rename

There may be something to that. The compression is probably MJPEG, which isn't going to work if you suddenly change three quarters of the framebuffer while it's being compressed. This could see Nintendo enforcing a vertical sync to ensure the FB is complete before being compressed and broadcast.

Anyone know how remote play on PSP/Vita works by comparison on games that tear?

The front buffer is probably locked while the compression routine needs it. That'd make the buffer flip block until it's done, which would also block rendering from starting in the backbuffer. The locking may only happen long enough to copy the framebuffer into another memory space if that's faster than compressing it directly. Either one should be much faster than what the screen needs to display it, so the impact here is a lot lower than being vblank locked.

If they're okay with a similar kind of tearing on the WiiU display they could just lock the framebuffer while it copies individual macroblocks or macroblock rows, which would take a negligible amount of time.

With all of the Wii U's additional memory it shouldn't be a problem to triple buffer, so long as the latency is acceptable.
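To put the lock-and-copy idea above in concrete terms, here's a minimal sketch (purely illustrative; all names are hypothetical, and zlib stands in for the real encoder) of locking the front buffer only long enough to snapshot it, so the flip never waits on a full compression pass:

```python
import threading
import zlib  # stand-in for the real (probably JPEG) encoder

# 720p, 24bpp framebuffers; the lock guards only the copy, not the encode.
front_buffer_lock = threading.Lock()
front_buffer = bytearray(1280 * 720 * 3)
staging_buffer = bytearray(len(front_buffer))

def grab_frame_for_broadcast():
    # Hold the lock only for a memcpy-sized window...
    with front_buffer_lock:
        staging_buffer[:] = front_buffer
    # ...then compress outside the lock, so a pending flip isn't blocked
    # for the (much longer) encode time.
    return zlib.compress(bytes(staging_buffer))

def flip(back_buffer):
    # The flip blocks only if a copy is mid-flight, never for a whole encode.
    with front_buffer_lock:
        front_buffer[:] = back_buffer
```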
 
The front buffer is probably locked while the compression routine needs it. That'd make the buffer flip block until it's done, which would also block rendering from starting in the backbuffer. The locking may only happen long enough to copy the framebuffer into another memory space if that's faster than compressing it directly. Either one should be much faster than what the screen needs to display it, so the impact here is a lot lower than being vblank locked.
I agree, and there's more than one system design that wouldn't impact the main rendering to TV out. However, the lack of any tearing in any Wuu games is suggestive of, perhaps, a system that is v-locked, and the Wuublet output could be a reason for that. Then again, I don't know of any Wii games that tore, so maybe it's an artefact Nintendo won't allow either by requirement or hardware design, and the Wuublet has nothing to do with it.
 
According to Digital Foundry, the Wuu delivers a full 720p signal to the Wuublet before it is downscaled there.

Am I reading wrong, or are they implying there isn't compression going on in the console? It sounds like the signal isn't touched until downscaled in the tablet.

If it really is basically a repurposed N wifi signal, shouldn't that be more than sufficient for throughput? I'm impressed there is apparently NO effect on latency when beaming to the Wuublet (LOVE that word, btw).
 
According to Digital Foundry, the Wuu delivers a full 720p signal to the Wuublet before it is downscaled there.

Am I reading wrong, or are they implying there isn't compression going on in the console? It sounds like the signal isn't touched until downscaled in the tablet.

If it really is basically a repurposed N wifi signal, shouldn't that be more than sufficient for throughput? I'm impressed there is apparently NO effect on latency when beaming to the Wuublet (LOVE that word, btw).
1280x720 x 24 bits x 60 fps = at least 1.3 Gb/s, plus wifi overhead and error correction.

Could they be using multiple channels, or 802.11ac?
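For what it's worth, the arithmetic behind that figure (and the 30 fps number quoted in the next reply) is straightforward:

```python
# Raw, uncompressed 720p stream at 24 bits per pixel, ignoring wifi
# overhead and error correction entirely.
width, height, bpp = 1280, 720, 24

for fps in (60, 30):
    bits_per_second = width * height * bpp * fps
    print(f"{fps} fps: {bits_per_second / 1e9:.2f} Gb/s")

# 60 fps: 1.33 Gb/s  -> far beyond real-world 802.11n throughput
# 30 fps: 0.66 Gb/s
```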
 
According to Digital Foundry, the Wuu delivers a full 720p signal to the Wuublet before it is downscaled there.

Am I reading wrong, or are they implying there isn't compression going on in the console? It sounds like the signal isn't touched until downscaled in the tablet.

If it really is basically a repurposed N wifi signal, shouldn't that be more than sufficient for throughput? I'm impressed there is apparently NO effect on latency when beaming to the Wuublet (LOVE that word, btw).
That is not going to work. Just take 30 frames per second at 1280x720 and you arrive at an uncompressed data rate north of 600 Mbit/s.

It makes much more sense to send a downscaled and compressed image. The controller would then just need to decompress it. And a test I've seen has already revealed some compression artifacts, so the data gets at least touched and compressed before being sent over.
 
According to Digital Foundry, the Wuu delivers a full 720p signal to the Wuublet before it is downscaled there.
They mentioned in another article that it looks like the signal is compressed with MJPEG. Pure uncompressed 720p video is far too demanding, and pointless given the destination screen size (unless Nintendo are considering a Wuu XL with a 720p screen down the line ;)). I also don't know how you could tell whether the video stream is shrunk before broadcast or in the Wuublet. It makes far more sense to do that in the console and save compression time and wireless BW. I think the real meaning here is that the 720p image is downscaled to the Wuublet screen rather than rendered at native Wuublet resolution.
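For a rough sense of what downscaling in the console buys, assuming the widely reported 854x480 Wuublet panel:

```python
# Pixel-count comparison: encode at 720p vs. downscale first.
# Assumes an 854x480 Wuublet panel (the widely reported figure).
src_pixels = 1280 * 720   # 921,600
dst_pixels = 854 * 480    # 409,920

print(f"downscaled frame: {dst_pixels / src_pixels:.0%} of the 720p frame")
# -> ~44%. For an intraframe codec like MJPEG, both encode time and
#    wireless bandwidth scale with pixel count, so shrinking before
#    encoding saves on both.
```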
 
Dated 18th November. You think we haven't seen this already? ;) AlStrong broke the RAM BW prior to this AFAIK; he was certainly quoted in a fair amount of the coverage on release day.
 
Ok, I missed that article. I wasn't really thinking about the math so much as I thought I remembered something about a proprietary use of MIMO N for the Wuublet.

Why would they use MJPEG over something like OnLive/Gaikai's use of h264/x264? If it works that decently over an internet connection, it should be pretty darn good on a 30-foot wifi N connection.
 
Could you link me to the interview that referred to Wii GPU support? Or preferably if you could give me the snippet that addresses it.

Sure ;)

http://iwataasks.nintendo.com/interviews/#/wiiu/console/0/0

Iwata: Especially since the Wii U had to be backwards compatible with Wii.

Shiota: Yes. The designers were already incredibly familiar with the Wii, so without getting hung up on the two machines' completely different structures, they came up with ideas we would never have thought of. There were times when you would usually just incorporate both the Wii U and Wii circuits, like 1+1. But instead of just adding like that, they adjusted the new parts added to Wii U so they could be used for Wii as well.

Iwata: And that made the semiconductor smaller.

Shiota: Right. What's more, power consumption fell. That was an idea that only designers familiar with Wii could have put forth. We were able to make such a small semiconductor because so much wisdom bubbled up!

---

While I would be pleased by the idea that the WiiU has a Redwood under the hood, I find it startling that the system performs so badly. I run a Redwood in the laptop I'm currently typing on, and it easily outperforms the 360.
Looking at Redwood, Turks, or AMD's APUs, the only conclusion I can draw from watching a game like CoD struggle while rendering at 880x720 is that the overall design sucks badly. It will get better, but such crappy results shouldn't happen to begin with.

Anybody else releasing such a hardware platform (not even as a console) would get mocked, for good reason. I guess Nintendo is something special, like Apple, with the difference that Apple ships good hardware; their latest CPU is as good as it gets within its power budget.

You are absolutely right, that's what I am wondering too. I mean, even a Llano shouldn't have problems with such games at that resolution, and it ONLY has DDR3, so the slow RAM can't be the reason.
 
Ok, I missed that article. I wasn't really thinking about the math so much as I thought I remembered something about a proprietary use of MIMO N for the Wuublet.

Why would they use MJPEG over something like OnLive/Gaikai's use of h264/x264? If it works that decently over an internet connection, it should be pretty darn good on a 30-foot wifi N connection.

Latency, required processing power, and licensing costs would be my guess.
 
Is it likely that the config can be guessed at from a general arrangement of components? E.g. the 710 is:

80 SPs
4 ROPs
8 TMUs

If the SP:ROP:TMU ratios are kept the same...

710 won't help you. The ratios it used weren't magic; they were just appropriate for its design targets. As you pointed out, not all of the 7xx chips have the same ratios either.

The ratios for WiiU are appropriate for WiiU, and not necessarily dependent on other instances of the family it came from.

Some developer will eventually leak the clock speed. If they're really curious they'll also write a bunch of simple test cases to work out the numbers you're asking about. Some surely have already; they just don't want to violate NDAs. But eventually it will get out.
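For what those test cases would be solving for: the usual back-of-envelope peak rates fall straight out of the unit counts and clock. A sketch with placeholder inputs (RV710-style counts, made-up clock), not claimed Wii U figures:

```python
# Theoretical peak rates from unit counts and clock. All inputs here are
# placeholders, NOT claimed Wii U specs.
def theoretical_rates(sps, rops, tmus, clock_mhz):
    clock = clock_mhz * 1e6
    return {
        "GFLOPS": sps * 2 * clock / 1e9,     # MADD = 2 flops per SP per clock
        "Gpixels/s": rops * clock / 1e9,     # pixel fillrate
        "Gtexels/s": tmus * clock / 1e9,     # texture fillrate
    }

print(theoretical_rates(sps=80, rops=4, tmus=8, clock_mhz=500))
# {'GFLOPS': 80.0, 'Gpixels/s': 2.0, 'Gtexels/s': 4.0}
# Microbenchmarks (full-screen fills, filtered texture fetches, ALU-bound
# shaders) measure the left-hand sides, letting you solve back for the
# unit counts once the clock is known.
```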
 
Shiota: Right. What's more, power consumption fell. That was an idea that only designers familiar with Wii could have put forth. We were able to make such a small semiconductor because so much wisdom bubbled up!

:cry:
 
lherre made another post about the Wii U, and this guy always makes good posts considering his position

http://www.neogaf.com/forum/showpost.php?p=44832703&postcount=10892

Who said that? I mean who with "trusted" info?

I hate the multipliers that people use here. I never gave one in those threads. When people threw out crazy multipliers, I (and arkam) tried to show people that this was wrong. In fact some people think that I hate the Wii U or something (I have some PMs telling me that).

I always said that the Wii U is more powerful than the PS3/X360 (it would be difficult not to be in 2012), but the difference is another story (and the gamepad). No more, no less. So I don't know why people are surprised at all. In fact I said that ports from the PS4/Xbox Next will be "hard" because the difference in performance is huge, so I don't understand why people are still saying it will be "easier" because the architectures are similar. It's not that easy.

It's like the mantra with the eDRAM or GPGPU, as if they're some miracle or something new that will save any system that has them.

"huge" performance difference between wii u and 720/ps4.

Also, FWIW, his statement that Wii U > PS360 might be looked at as interesting news rather than a given, given recent events. Guessing the tablet is really sapping some power.
 
Latency, required processing power, and licensing costs would be my guess.

H.264 is easier on the hardware and dedicated encode/decode is implemented in all modern SoCs and MCMs. AirPlay Mirroring is also using H.264 to stream video and it works without a hitch.

Just look at the battery life the iPad has when playing back H.264 content, the iPad 2,4 (A5r2, 32 nm) gets nearly 16 hours of playback, the iPad 4 nearly 14 hours and the iPad Mini is good for over 11 hours.
 
lherre made another post about the Wii U, and this guy always makes good posts considering his position

http://www.neogaf.com/forum/showpost.php?p=44832703&postcount=10892



"huge" performance difference between wii u and 720/ps4.

Also, FWIW, his statement that Wii U > PS360 might be looked at as interesting news rather than a given, given recent events. Guessing the tablet is really sapping some power.

It's always great to hear from lherre. It's like a big slap round everyone's face when he drops some knowledge (for both sides of the "power debate", me included). He brings everyone back from the brink :)

I'd love to know exactly what the Gamepad requires, "power" wise, for say a 2D map or inventory compared to a 3D mirroring of the main screen. Would there be a difference? Will we ever be able to find out? The only reference I've seen to it was in a dodgy rumour, apparently from a Crytek tester or something, where they said the 'Pad screen was locked to "on", and they thought they could get more out of the system if they could turn it off. I don't think it mentioned what was showing on the 'Pad screen, and the whole rumour sounded bogus to me, so I never put much thought into it.
 
I'd love to know exactly what the Gamepad requires, "power" wise, for say a 2D map or inventory compared to a 3D mirroring of the main screen. Would there be a difference?
Yes. Mirroring requires no additional drawing beyond what's appearing on the TV. Any alternative graphics on the Wuublet are going to incur additional cost in their rendering prior to output, presumably at 720p. That'll be very little impact at all in the case of a few sprites or a simple map, but it is still extra effort. Rendering a whole separate 3D scene will be the most costly option, but that's no different to rendering a complete 3D view in a rear-view mirror in a game, for example.
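In frame-loop terms the cost difference looks something like this sketch (every function is a hypothetical stub; it just shows where the extra work lands):

```python
# Conceptual only: stubs standing in for real rendering work.
def render_scene(camera):    return f"3D view from {camera}"  # expensive
def downscale(image):        return f"downscaled {image}"     # ~free
def draw_map_and_sprites():  return "2D map/HUD"              # cheap
def present(tv, pad):        print("TV:", tv, "| Pad:", pad)

def render_frame(mode, main_camera="main cam", pad_camera="pad cam"):
    tv_image = render_scene(main_camera)       # always paid

    if mode == "mirror":
        pad_image = downscale(tv_image)        # reuse the TV frame
    elif mode == "overlay":
        pad_image = draw_map_and_sprites()     # a few sprites: tiny cost
    else:  # separate 3D view, like a rear-view mirror
        pad_image = render_scene(pad_camera)   # full second scene

    present(tv_image, pad_image)

render_frame("mirror")
```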
 
Yes. Mirroring requires no additional drawing beyond what's appearing on the TV. Any alternative graphics on the Wuublet are going to incur additional cost in their rendering prior to output, presumably at 720p. That'll be very little impact at all in the case of a few sprites or a simple map, but it is still extra effort. Rendering a whole separate 3D scene will be the most costly option, but that's no different to rendering a complete 3D view in a rear-view mirror in a game, for example.

So it's unlikely any of this first lot of ports is suffering because they're doing too much with the 'Pad then, as I don't think any of them are rendering much that's different from what's on screen (not at the same time, anyway)? Does the resolution affect this? Presumably the image going to the 'Pad is 480p or something?

___________

It's interesting that DF keeps mentioning V-Sync as being a bit of a bugbear. It begs the question, really: why is it being deployed if it's causing problems? Is it possible it's required to keep the 'Pad image in sync?
 
So it's unlikely any of this first lot of ports is suffering because they're doing too much with the 'Pad then, as I don't think any of them are rendering much that's different from what's on screen (not at the same time, anyway)? Does the resolution affect this? Presumably the image going to the 'Pad is 480p or something?
DF reckons the Wuublet is receiving 720p video. It's safe to say from DF's investigation that it's displaying a downscaled 720p image, but I imagine it's being transmitted already downsampled. It doesn't make sense to send a 720p frame and shrink it on the Wuublet.

It's interesting that DF keeps mentioning V-Sync as being a bit of a bugbear. It begs the question, really: why is it being deployed if it's causing problems? Is it possible it's required to keep the 'Pad image in sync?
Mentioned before in this thread, if you've missed it. Might just be a Nintendo requirement - Wii games never tore. It may also be linked to a requirement of supporting the Wuublet by needing a full frame prior to transmission, meaning no option to change the framebuffer partway through the display transmission (a tear). Can't tell what's going on at this point.
 
H.264 is easier on the hardware and dedicated encode/decode is implemented in all modern SoCs and MCMs. AirPlay Mirroring is also using H.264 to stream video and it works without a hitch.

Just look at the battery life the iPad has when playing back H.264 content, the iPad 2,4 (A5r2, 32 nm) gets nearly 16 hours of playback, the iPad 4 nearly 14 hours and the iPad Mini is good for over 11 hours.
But there's a huge difference. The Wii U can't use a pre-compressed video stream for games; it has to compress frame by frame in real time in order to get reasonable latency.
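That constraint is easy to illustrate: an MJPEG-style stream is just an independent JPEG per frame, and each encode has to finish inside the frame budget (16.7 ms at 60 fps). A quick sketch using Pillow as a stand-in encoder (nothing to do with the actual Wii U hardware):

```python
import io
import time
from PIL import Image  # Pillow as a stand-in software JPEG encoder

frame = Image.new("RGB", (854, 480), "gray")  # assumed pad-resolution frame

start = time.perf_counter()
buf = io.BytesIO()
frame.save(buf, format="JPEG", quality=80)    # one intraframe encode
elapsed_ms = (time.perf_counter() - start) * 1e3

print(f"encoded {buf.getbuffer().nbytes} bytes in {elapsed_ms:.2f} ms")
# A movie can be compressed offline and buffered well ahead of playback;
# a game has to do this every frame, on the fly, which is why low
# per-frame latency matters more than compression efficiency here.
```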
 