Gradthrawn
They said the drive can read CD, but the system cannot play back Audio CD.

Could the CD thing be laser-related? Does the laser in the BD drive not have to be able to read CDs? Could you save money by doing that?
Not just the diode, you also significantly simplify the lens assembly. If you dropped the red-laser DVD diode too, you could make a much slimmer, much cheaper drive.

I wonder. The PS3 diode appears to have three individual diodes (one for each wavelength) grouped together in a "Common Cathode Can" container (according to that link). Dropping the infra-red diode could potentially save money, I suppose, but the problem is we know it reads CDs, therefore we know it already has the right diode for that. Doesn't make much sense with the limited info we have.
A trademark is just a trademark, i.e., a logo on the casing or elsewhere (instruction booklet, whatever).

Actually, I'm not sure what the licensing agreements are or whether that would even affect playing back CDDA. A non-Redbook-compliant/licensed player?
"It is also using almost standard 802.11n, which made things easy to experiment on a PC."
The use of custom communication protocols means we can discount previous theories that Nintendo employed Broadcom's Miracast streaming-video technology to get the Wii U GamePad working, although there are similarities.
"Video is compressed using h.264 (baseline profile, so no B frames)," Bourdon shares. "Audio is usually uncompressed, but we've found mentions of compressed audio formats in the firmware... We found mentions of [Miracast] when we started working on the GamePad, but it turned out to be false. There is no Miracast anywhere in that GamePad. Audio, video and input streaming is done over custom protocols."
Baseline profile h.264 rules out many of the more advanced compression techniques employed by the codec, but Nintendo makes up for it via sheer, raw bandwidth. A sample capture from the Wii U WiFi stream offers up 33MB of data captured across 87 seconds, which gives us an average of around 3Mbps. This is fairly lavish for an 858x480 stream at 60 frames per second, but the video captured here is only displaying the Wii U's front-end menus. Pierre Bourdon tells us that the Wii U uses variable bitrate, meaning that bandwidth scales up according to the complexity of the image it has to encode.
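For what it's worth, the quoted average checks out against the capture figures (assuming "33MB" means 33 * 10^6 bytes; the exact convention isn't stated):

```python
# Rough bitrate math for the sample capture described above.
# Assumption: decimal megabytes (33 * 10^6 bytes).
capture_bytes = 33_000_000
capture_seconds = 87

avg_bps = capture_bytes * 8 / capture_seconds
avg_mbps = avg_bps / 1_000_000
print(f"average bitrate: {avg_mbps:.2f} Mbps")  # ~3.03 Mbps
```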
"This measurement does not include audio. Here is a graph of frame size over time in these 33MB," he says.
Despite the 3Mbps average, we're seeing spikes of anything between 25 and 40Mbps, and a massive variation in bandwidth that can only be down to variable-bitrate h.264 video encoding. The more complex the image, the more information is required to maintain image quality - something the Wii U seems more than capable of successfully transmitting over its 802.11n wireless link.
"I haven't checked but I think the spikes are just the Wii U sending a large I-frame (full picture/key frame)," Bourdon explains. "If you average the bandwidth over something like 10 frames these spikes mostly disappear. In normal operation mode the Wii U sends one I-frame and then only P frames, unless the application requests to send an I-frame or a frame was not received properly by the GamePad (because of packet loss)."
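The policy Bourdon describes can be sketched out in a few lines. This is a hypothetical illustration of the logic, not code from the GamePad firmware; the function name and parameters are my own:

```python
# Hypothetical sketch of the frame-type policy described above: one I-frame
# up front, then P-frames, with a fresh I-frame whenever the application
# requests one or the GamePad reports a frame lost to packet loss.
def next_frame_type(frame_index, app_requested_iframe, packet_loss_reported):
    """Return 'I' or 'P' for the next encoded frame."""
    if frame_index == 0:          # first frame of the stream: full picture
        return "I"
    if app_requested_iframe:      # application explicitly asked for a key frame
        return "I"
    if packet_loss_reported:      # receiver missed a frame; resync with an I-frame
        return "I"
    return "P"                    # steady state: predicted frames only

print(next_frame_type(0, False, False))   # I
print(next_frame_type(10, False, False))  # P
print(next_frame_type(11, False, True))   # I
```

In steady state every frame is a P-frame, which is why the bandwidth graph is mostly flat with isolated spikes where a large I-frame goes out.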
As everything got moved over there, it's worth mentioning in this hardware thread that the timing information from the profiler shown in the Killzone presentation confirms beyond doubt that the CPU cores in the current devkits run at 1.6GHz. There may be a bit of movement for the final version going on sale at the end of the year, depending on the yields and power consumption AMD/Sony experience (and they have to decide about that pretty soon if it's not yet fixed), but one probably shouldn't expect 2+GHz or anything like that.

Killzone discussion here.
Some interesting info from the Wii U GamePad reverse engineering. Hardware-wise, it appears to use what is "almost standard" 802.11n.
As for the communication protocols used:
Since Off-TV Play is generally reported as being very responsive, I thought this was encouraging news for the potential of Remote Play with the PSV and PS4. The hardware used is much the same (802.11n plus dedicated video encode/decode hardware), which means what's left is the software stack. I think this helps clear up what was described as "Gaikai tech" for PS4 Remote Play during the February PS Meeting. I now suspect it's entirely on the software side (which would make sense).
Of course, some of the remaining questions revolve around how (or if) they plan to optimize Remote Play for access across a WAN. Some of the bandwidth spikes seen with the Wii U's Off-TV Play can't be accommodated by typical home broadband upload speeds:
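A back-of-the-envelope number shows why those spikes matter over a WAN. The 5Mbps uplink here is my own illustrative assumption, not a figure from the thread:

```python
# How late does one I-frame spike arrive over a home uplink?
# Assumptions (mine): a 30 Mbps spike at 60 fps, a 5 Mbps upload link,
# decimal units throughout, and no other traffic on the link.
spike_mbps = 30
fps = 60
uplink_mbps = 5

iframe_bits = spike_mbps * 1_000_000 / fps        # bits in that one frame
send_time_ms = iframe_bits / (uplink_mbps * 1_000_000) * 1000
print(f"I-frame size: {iframe_bits / 8 / 1000:.1f} kB")    # 62.5 kB
print(f"time to push it upstream: {send_time_ms:.0f} ms")  # 100 ms, ~6 frame periods at 60fps
```

A single frame that takes six frame periods to transmit is exactly the kind of latency spike a local 802.11n link absorbs but a home upload pipe does not.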
The answer to this most likely lies in the engineering knowledge they gained from Gaikai as well. I'm curious to see how their approach differs, however.
Clocks tend to go down when chips are put in their final resting places on the motherboards. It's not likely this time, as 1.6GHz is well within Jaguar's thermal range, but pushing it higher is definitely not going to happen.
Physical defects generally don't care how awesome a core is. The number of cores raises the chance that at least one is going to be hit, but without more data we don't know what Sony has decided for those chips.
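The "more cores, more chances to get hit" point is just probability. With a made-up per-core defect rate (the 2% is purely illustrative), it looks like this:

```python
# Why more cores raise the odds that at least one is defective.
# The per-core defect probability p is an illustrative assumption,
# not a real yield figure for any of these chips.
p = 0.02      # assumed chance a given core has a defect
cores = 8     # core count discussed in this thread

p_any_defect = 1 - (1 - p) ** cores
print(f"chance at least one of {cores} cores is defective: {p_any_defect:.1%}")
```

Even at a modest per-core rate, nearly one die in seven would lose a core if none were spared, which is the usual argument for shipping with a core reserved or disabled.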
I can deal with the possibility that Sony may reserve a core for the OS. But if they are disabling a CPU core -- on a tiny, low-power CPU -- for yields, that is pathetic. I could understand it if the CPU were even midrange, but it is a wimp already.