3D Gaming

Was it 720p or something else? And what 3D encoding format did they use? (Side-by-side?)
The Avatar Xbox 360 version supports the following stereo 3d modes: side-by-side, full checkerboard, split checkerboard (RealD and Sensio encoding) and scanline interleave.

The PC version supports all the modes listed above, and additionally supports NVIDIA 3D Vision.

If necessary, something small can be clipped on to help tracking. Johnny Lee's IR glasses experiment should be available on the PlayStation developer network too; the guy who ported the idea to PS3 said he made it available there.
I doubt that all the current and future glasses manufacturers would agree to put standardized visual aids in their glasses just to support a few console games. If head tracking also helped Blu-ray stereo movies, it would likely be standardized, and some kind of head tracking system could be made part of the standard.

It uses anaglyph-type glasses, right?
No, anaglyph is a thing of the past.

Avatar uses all technologies supported by the various stereo 3d compatible HDTV sets and stereo 3d compatible DLP projectors. Basically all these modes work by encoding the data for both eyes into a single 1080p image. For example, every other pixel belongs to the right/left eye. This way, when the TV/projector receives one 1080p image at 60 frames per second, it can split each image into a left-eye and a right-eye image and show them at 120 Hz with shutter glasses. Alternatively, the HDTV can polarize every other scanline differently and use polarized (passive) glasses to direct the correct scanlines to the correct eyes. Both shutter glasses and passive polarized glasses are also used in modern stereo 3d theaters (to show Avatar the movie, for example).
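To make the "every other pixel/scanline" idea concrete, here is a minimal sketch of how a game could pack left and right eye images into one 1080p frame for a scanline-interleaved display. The Frame type and function name are made up for illustration; real titles do this on the GPU, not pixel by pixel on the CPU.

#include <cstdint>
#include <vector>

const int W = 1920, H = 1080;
using Frame = std::vector<uint32_t>;              // W*H pixels, row-major, 32-bit color

// Scanline interleave: even rows carry the left eye, odd rows the right eye.
// Each eye keeps full horizontal resolution but only half of its rows;
// the TV routes even/odd rows to the correct eye (shutter or polarized).
Frame packScanlineInterleaved(const Frame& left, const Frame& right) {
    Frame out(W * H);
    for (int y = 0; y < H; ++y) {
        const Frame& src = (y & 1) ? right : left;
        for (int x = 0; x < W; ++x)
            out[y * W + x] = src[y * W + x];
    }
    return out;
}

int main() {
    Frame left(W * H, 0xFFFF0000u), right(W * H, 0xFF0000FFu);
    Frame packed = packScanlineInterleaved(left, right);
    return (packed[0] == left[0] && packed[W] == right[W]) ? 0 : 1;
}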

So how is it that you can play Avatar in 3D on PS3 if the PS3 isn't 3D capable yet and you can't buy a 3D TV? Anaglyph doesn't require anything other than those special glasses. Polarized requires a special screen, so that's out. If it's active shutter and Avatar works right now, in 3D (and looks good according to you), why do we need 3D TVs at all?
Avatar supports both 3d stereo TV sets and projectors, with polarized and shutter glasses. These have been available for a few years already.
A list of stereo 3d compatible projectors (using active shutter glasses): http://www.3dmovielist.com/projectors.html
A list of stereo 3d compatible HDTV sets (both polarized and shutter glasses): http://www.3dmovielist.com/3dhdtvs.html

Stereo 3d does not require any hardware support. If the console can output 1080p at 60 fps, that's enough. The stereo image is encoded so that both eyes' views fit into the one frame. For example, in side-by-side mode you can fit two 960x1080 images (left and right eye) into one 1080p image. 960x1080 is more pixels than 1280x720, so the current generation of games does not need any hardware support beyond that. The new HDMI standard allows full 1920x1080 in stereo, but current generation consoles do not have enough horsepower to feed this many pixels (unless the game is really simple).
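A quick back-of-the-envelope check of that pixel count claim (plain arithmetic, nothing console specific):

#include <cstdio>

int main() {
    // Per-eye resolution in side-by-side 1080p versus a normal 720p frame.
    const long sbsPerEye = 960L * 1080;   // 1,036,800 pixels per eye
    const long hd720     = 1280L * 720;   //   921,600 pixels
    std::printf("side-by-side per eye: %ld pixels\n", sbsPerEye);
    std::printf("720p frame:           %ld pixels\n", hd720);
    std::printf("ratio:                %.3fx\n", double(sbsPerEye) / double(hd720));  // = 1.125
    return 0;
}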
 
The Avatar Xbox 360 version supports the following stereo 3d modes: side-by-side, full checkerboard, split checkerboard (RealD and Sensio encoding) and scanline interleave.

What are the HDMI bandwidth requirements for these modes?

The PC version supports all the modes listed above, and additionally supports NVIDIA 3D Vision.

Does NVIDIA 3D Vision require more resources to implement?

I doubt that all the current and future glasses manufacturers would agree to put standardized visual aids in their glasses just to support a few console games. If head tracking also helped Blu-ray stereo movies, it would likely be standardized, and some kind of head tracking system could be made part of the standard.

To do 3D glasses with head tracking, we'd only need one vendor to start first, specifically Sony (if they want to).
 
So the side-by-side encoding technique halves the res (bandwidth)? ...since 1.2 has about half the bandwidth of 1.3 and 1.4 (if I remember correctly). I am a little sleep deprived these few days, so don't trust my memory.

The Xbox 360 can run 1080p60, so it easily has the bandwidth to sustain 2x720p. However, the fact that HANA doesn't seem able to cope with 1920x1200 output suggests that its ability to handle vertical frequencies above 1080p is open to question.

However, there is nothing to stop the 360 resizing the output to two 960x1080 or 1920x540 images as sebbbi suggested earlier. Indeed, this may be how some of the PS3 titles operate, as the 2x720p image has only been confirmed for Stardust.
 
The Xbox 360 can run 1080p60, so it easily has the bandwidth to sustain 2x720p. However, the fact that HANA doesn't seem able to cope with 1920x1200 output suggests that its ability to handle vertical frequencies above 1080p is open to question.

However, there is nothing to stop the 360 resizing the output to two 960x1080 or 1920x540 images as sebbbi suggested earlier. Indeed, this may be how some of the PS3 titles operate, as the 2x720p image has only been confirmed for Stardust.

OK, I checked the specs. I see where the confusion is now. HDMI 1.2 can support 1920x1200 at 24-bit. The increased video bandwidth in HDMI 1.3 and above is for deep color support (48-bit).

I have no idea about the claim that HANA doesn't support 1920x1200.
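Rough arithmetic behind that, for what it's worth. The blanking figures below are approximate CVT reduced-blanking numbers and the 165/340 MHz values are the single-link TMDS clock limits as I remember them, so treat this as a sanity check rather than gospel:

#include <cstdio>

int main() {
    // 1920x1200 @ 60 Hz with CVT reduced blanking is roughly 2080 x 1235 total pixels.
    const double pixelClockMHz  = 2080.0 * 1235.0 * 60.0 / 1e6;   // ~154 MHz
    const double hdmi12LimitMHz = 165.0;   // single-link TMDS clock limit up to HDMI 1.2
    const double hdmi13LimitMHz = 340.0;   // raised TMDS clock limit in HDMI 1.3

    std::printf("1920x1200@60 pixel clock: %.1f MHz\n", pixelClockMHz);
    std::printf("24-bit on HDMI 1.2: %s\n", pixelClockMHz <= hdmi12LimitMHz ? "fits" : "too fast");
    // 48-bit deep color runs the TMDS clock at twice the pixel clock,
    // which is what the extra bandwidth in HDMI 1.3+ is for.
    std::printf("48-bit on HDMI 1.2: %s\n", 2.0 * pixelClockMHz <= hdmi12LimitMHz ? "fits" : "too fast");
    std::printf("48-bit on HDMI 1.3: %s\n", 2.0 * pixelClockMHz <= hdmi13LimitMHz ? "fits" : "too fast");
    return 0;
}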
 
How do the active LC glasses sync up without hardware support?
I answered it before... but here it is again:
1. The game encodes both eyes into one 1920x1080 image (using a stereo encoding supported by the TV set), and the console sends that image normally over HDMI to the TV/projector. The console does not even know that you are doing stereo 3d rendering.
2. The TV gets the 1920x1080 image over HDMI. If 3d mode is not enabled in the television's own setup, the 1920x1080 image is shown normally. If 3d mode is enabled, the television set splits the image into two images (every other pixel belonging to the right/left eye).
3. The TV set shows the split images at 120 Hz. When it's showing the first image it tells the shutter glasses to block the left eye; when it's showing the second image it tells the shutter glasses to block the right eye. So both images are shown during the time period that it normally takes to show one image.

So the TV/projector synchronizes the glasses. The console (or any other video source, like your Blu-ray/DVD player) is not responsible for this. This way only the TV set needs to support stereo 3d. The image sources (Blu-ray/DVD players, gaming consoles, and so on) do not need any special hardware support for stereo 3d.

If the glasses synchronization were handled by the console, the console would need to know the exact latency of the HDMI transport (which varies by cable length) and the latency of the TV set (which varies a lot: scaling takes time, frame interpolation takes time, color correction takes time, and so on). It would be basically impossible for the console to synchronize the glasses, so the synchronization needs to be implemented in the TV set.
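Here is a tiny sketch of that division of labour, with completely made-up function names (real TV sets do this in dedicated video hardware, not in a software loop). The point is only that the display owns the glasses timing and the video source just sends ordinary frames:

#include <cstdint>
#include <cstdio>
#include <vector>

const int W = 1920, H = 1080;
using Frame = std::vector<uint32_t>;

// All of these are hypothetical stand-ins for the TV's internal video pipeline.
Frame receiveHdmiFrame()                                { return Frame(W * H, 0); }  // one packed 1080p frame, 60/s
Frame extractEye(const Frame& packed, bool /*leftEye*/) { return packed; }           // undo the chosen 3d encoding
void  showForOne120thSecond(const Frame&)               {}                           // drive the panel at 120 Hz
void  tellGlassesToBlock(bool leftEye) {                                             // IR/RF emitter lives in the TV
    std::printf("block %s eye\n", leftEye ? "left" : "right");
}

int main() {
    for (int frame = 0; frame < 2; ++frame) {             // two incoming 60 Hz frames
        Frame packed = receiveHdmiFrame();
        tellGlassesToBlock(false);                         // shut the right eye...
        showForOne120thSecond(extractEye(packed, true));   // ...while the left image is on screen
        tellGlassesToBlock(true);                          // then swap
        showForOne120thSecond(extractEye(packed, false));
    }
    return 0;
}

Because the glasses are toggled in step with whatever the panel is actually showing, the cable and processing latencies in front of the TV simply do not matter.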

I read the link, and I'm not sure I'm understanding this correctly. The article states that the PS3 needs a firmware update in order to play 3D games. This update is also needed to support 3D TVs.
This is incorrect information. Avatar on PS3 already supports stereo 3d rendering (on currently available stereo 3d televisions and projectors). Your game just has to encode the stereo image, and it will work. The PS3 stereo support most likely just implements the stereo encoding inside the SDK, so developers do not need to program all the encoding methods themselves (there are at least 5 competing encoding methods currently).

But it might be that the future LCD/LED 3d televisions support some kind of new stereo encoding that requires hardware support (frame tagging to the left/right eye, for example). However, none of these new 3d televisions are yet available to customers, so it is impossible to say whether the new TV sets will support the existing standards or a new stereo encoding standard.
 
So the TV/projector synchronizes the glasses. The console (or any other video source, like your Blu-ray/DVD player) is not responsible for this. This way only the TV set needs to support stereo 3d. The image sources (Blu-ray/DVD players, gaming consoles, and so on) do not need any special hardware support for stereo 3d.

That's what I thought. I guess the confusion on my part has been that I've always considered the display device as hardware. Thanks for clearing that up.
 
Related 3D movement on the cinema side:
http://latimesblogs.latimes.com/ent...ing-to-pay-for-rollout-of-digital-cinema.html

Lifting a roadblock to the rollout of 3-D in theaters, investment firm JP Morgan has raised nearly $700 million to finance the digital conversion of thousands of screens around the country, three people familiar with the matter said Friday.

The funding, delayed for longer than a year due to the credit crunch, would pay for the installation of digital projectors for about 12,000 screens, easing a bottleneck caused by an abundance of 3-D movies competing for not enough screens. There are currently only about 3,500 digital 3-D screens in the country.
 
Finally got my hands on the preliminary v1.4 HDMI specs (a free download for everyone since last week). The forthcoming HDMI has some new metadata packets and new info stored in the frames for 3d formats. It seems to support most of the same formats as current stereo 3d TVs/projectors, and additionally full-res side-by-side and frame packing (a slightly different full-res up/down encoding with an extra "v-blank"). The biggest change in the specification is that the HDMI link is now aware that stereo 3d data is transferred over it. This naturally requires the console to output the metadata bits in the HDMI packets. The good thing is that in the future the TV set (or projector) can automatically enable its 3d mode when it receives frames that are tagged as stereo 3d encoded data. On current TV sets (and projectors) the user has to enable it manually (choose the same 3d format from the game settings menu and from the TV settings menu, a complicated process for a mainstream user).

The bad thing about the future HDMI standard is that the console hardware has to support it, and there needs to be some SDK support to tell the console the stereo encoding type your game is using (so it can encode the right stereo bits in the HDMI packet metadata). It might not be possible to software update the HDMI chips inside the consoles to support this. However, even without HDMI v1.4 support in the video source (console or Blu-ray player), the TV sets can still likely be manually set up to decode the stereo 3d encoded data (just like the current stereo 3d ready TV sets). This way at least the side-by-side and scanline encodings should work fine (and those are the most efficient choices for game developers). But nobody really knows if this works in practice until the HDMI v1.4 compliant HDTV sets start to hit the market... though I am pretty sure the TV manufacturers will test their sets with "Avatar the Game" on both Xbox 360 and PS3 to guarantee it works perfectly. It's the poster child of stereo 3d gaming, after all.
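For the curious, the 3d signalling sits in the HDMI Vendor Specific InfoFrame as a small "3D_Structure" code. Something like the sketch below; the numeric values are from my reading of the preliminary spec (double-check them against the final document), and the helper function is purely illustrative, not a real SDK call:

#include <cstdint>
#include <cstdio>

// 3D_Structure codes carried in the HDMI 1.4 Vendor Specific InfoFrame.
// Values as I read them in the preliminary spec -- verify before relying on them.
enum class HdmiStereoFormat : uint8_t {
    FramePacking    = 0x0,
    LineAlternative = 0x2,   // scanline interleave
    SideBySideFull  = 0x3,
    TopAndBottom    = 0x6,
    SideBySideHalf  = 0x8,
};

// Illustrative only: in a real SDK some call like this would tag the video
// output so a 1.4 TV can switch to its 3d mode without the user doing it.
void tagOutputAsStereo(HdmiStereoFormat fmt) {
    std::printf("VSI 3D_Structure = 0x%X\n", static_cast<unsigned>(fmt));
}

int main() {
    tagOutputAsStereo(HdmiStereoFormat::SideBySideHalf);   // what an SBS-encoded game would use
    return 0;
}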
 
OK, I checked the specs. I see where the confusion is now. HDMI 1.2 can support 1920x1200 at 24-bit. The increased video bandwidth in HDMI 1.3 and above is for deep color support (48-bit).

I have no idea about the claim that HANA doesn't support 1920x1200.

For a device to be rated as HDMI 1.4 compliant, it must support a frame-packed (two frames plus metadata) 1080p signal at 60 Hz (as well as 24 Hz, and 720p at 60 Hz and 50 Hz). The TV takes the two frames and displays them at 120 Hz, so you have two sequential frames in the time it takes to show one 60 Hz frame. The shutter glasses alternate (on-off) (off-on) at this rate (120 Hz). This results in very little flicker and, with good glasses, almost no reduction in brightness or contrast. The frame-packing method can vary. A device can support some of the HDMI 1.4 required standards but cannot call itself compliant.

The output from the 1.4 source device (the PS3, for instance) is 60 Hz or less, not 120 Hz.

The HDMI 1.4 device outputs a 60 Hz or lower video signal. When 3-D is active, the framebuffer in the PS3 doubles in size and two frames are packed, at double the transfer rate, into the timing window that represents one 60 Hz frame on the TV. The HDMI 1.4 TV recognizes this, pulls the two frames out of this "window" and displays them alternately at 120 Hz, so the two 60 Hz (right and left) images for 3-D appear to occur simultaneously because of persistence of vision. The double 1080p resolution also supported by HDMI 1.4 is possible because the frame buffers in an HDMI 1.4 device have to be twice as large for 3-D anyway, so why not make those buffers available for double resolution when you are not doing 3-D.
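Rough numbers for frame packing, using the standard 1080p24 and 720p60 timings I'm familiar with; if a 1080p60 packed mode is also mandated, the same arithmetic applies with the refresh rate swapped in. The helper function is just for illustration:

#include <cstdio>

// Frame packing stacks the two eyes vertically inside one "tall" frame,
// separated by an active-space gap equal to the normal vertical blanking.
void framePackedTiming(const char* name, int hTotal, int vActive, int vBlank, double hz) {
    int tallActive = 2 * vActive + vBlank;          // left eye + gap + right eye
    int tallTotal  = 2 * (vActive + vBlank);        // total lines per packed frame
    double pixelClockMHz = double(hTotal) * tallTotal * hz / 1e6;
    std::printf("%s: active height %d lines, pixel clock ~%.1f MHz\n",
                name, tallActive, pixelClockMHz);
}

int main() {
    // 1080p24: 1125 total lines (1080 active + 45 blanking), 2750 total pixels per line at 24 Hz.
    framePackedTiming("1080p24 frame packed", 2750, 1080, 45, 24.0);   // ~148.5 MHz
    // 720p60: 750 total lines (720 active + 30 blanking), 1650 total pixels per line.
    framePackedTiming("720p60 frame packed", 1650, 720, 30, 60.0);     // ~148.5 MHz
    return 0;
}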

The Xbox has some limitations due to its fixed 10 MB video buffer and cannot display HD 3-D at 720p. The data transfer rate is also limited by the HDMI 1.2 port. The Xbox GPU also cannot do 1080p 60 Hz; it can do 1080i. What you see at the timings representing 1080p is a lower-res image filled in by AA dots, because the 10 MB buffer is too small to hold a 1920x1080 24-bit image. Front and back buffers would in this case be 6 MB + 6 MB = 12 MB, which is larger than the available 10 MB. You either have to reduce the color depth or reduce the resolution in some other way.

How did the Xbox do the 3-D games Avatar and Gen Tao? Those were done at half resolution 720p. Another interesting point is that the game Avatar rendered at 30 Hz on both the Xbox and PS3 but packed two frames at half resolution into 60 Hz frames, depending on the display device. My DLP required a 60 Hz checkerboard 1080p double frame (packed), and the DLP pulled the two frames out and displayed them at 120 Hz.

With checkerboard DLP there is only one frame, but it consists of alternating (checkerboard) video from the right and left images. The DLP TV pulls the alternating video apart to produce the two frames for 3-D. This results in slightly better than half resolution due to processing (smooth motion) in the TV.
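A toy version of that checkerboard un-packing, just to show the pixel shuffle (the function and buffer names are made up; a real DLP set also does the interpolation mentioned above to fill in the missing pixels):

#include <cstdint>
#include <vector>

const int W = 1920, H = 1080;
using Image = std::vector<uint32_t>;   // row-major 32-bit pixels

// Checkerboard: pixels alternate between the eyes like the squares of a chess board.
// Pulling them apart leaves each eye with half of its pixels; the missing ones have
// to be reconstructed by the TV, hence the "slightly better than half res" result.
void splitCheckerboard(const Image& packed, Image& left, Image& right) {
    left.assign(W * H, 0);
    right.assign(W * H, 0);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            bool isLeft = ((x + y) & 1) == 0;             // "white" squares go to the left eye
            (isLeft ? left : right)[y * W + x] = packed[y * W + x];
        }
}

int main() {
    Image packed(W * H, 0xFFFFFFFFu), left, right;
    splitCheckerboard(packed, left, right);
    return (left[0] != 0 && right[1] != 0) ? 0 : 1;       // pixel (0,0) -> left, (1,0) -> right
}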

MS has stated that the Xbox can do 3-D; the whole truth is that it can only do something like half-resolution 720p 3-D. This will be enough for the streaming 3-D being proposed for current set-top boxes. Some of the proposed standards for HDMI 1.4 will probably be supported by the Xbox, but only the PS3 can be called HDMI 1.4 compliant, as it supports the requirement: double frame-packed 1080p at 60 Hz.

The PS3 can do full HD 1080p 3-D because it has 256 MB of GDDR3 (shared) memory that can be allocated in software for video frame buffers or CPU use. Blu-ray at 1080p 3-D is possible for the PS3. 720p is the accepted standard for games in 3-D, and the PS3 can support that at full 720p resolution, but again the Xbox cannot do true HD 1080p or 720p 3-D because of the 10 MB video buffer size limit.
 
And I must assume the (DAC->)VideoOut (what's it called? scan out?) is never running from the EDRAM. That is, the front buffer is never in the EDRAM. Or you couldn't modify it :LOL: (without seeing all the interesting cubemaps, shadowmaps and fp16 tiles that the engine is rendering, all mixed in the same frame).
So can the Xbox 360 decode (from, say, a network H.264 stream) 1080p to a UMA buffer and scan out real 1080p?
And what are you smoking?
 
It sounds like the Xbox 360 needs 'another upgrade'!

Anyway, so all you need is an HDMI 1.3 port on the console with an HDMI 1.4 TV, and the TV will pick up that it's 3D and display it appropriately?
 
The PS3 can do full HD 1080p 3-D because it has 256 MB of GDDR3 (shared) memory that can be allocated in software for video frame buffers or CPU use. Blu-ray at 1080p 3-D is possible for the PS3. 720p is the accepted standard for games in 3-D, and the PS3 can support that at full 720p resolution, but again the Xbox cannot do true HD 1080p or 720p 3-D because of the 10 MB video buffer size limit.

Are you saying that tiling is not available as an option for the 360 in 3D scenarios? You'd think it was very well suited for it ...
 
The Xbox has some limitations due to its fixed 10 MB video buffer and cannot display HD 3-D at 720p. The data transfer rate is also limited by the HDMI 1.2 port. The Xbox GPU also cannot do 1080p 60 Hz; it can do 1080i. What you see at the timings representing 1080p is a lower-res image filled in by AA dots, because the 10 MB buffer is too small to hold a 1920x1080 24-bit image. Front and back buffers would in this case be 6 MB + 6 MB = 12 MB, which is larger than the available 10 MB. You either have to reduce the color depth or reduce the resolution in some other way.

Um.... the Xbox 360 *can* do 1080p60. Here's a nice picture. The 10 MB RAM issue is overcome by tiling into main RAM, as any Xbox 360 game running at native 720p resolution with anti-aliasing has to do anyway.

So, yes it is 3D capable.
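Rough numbers behind the tiling point (straightforward arithmetic; exact per-title figures depend on the render target formats and MSAA level used):

#include <cmath>
#include <cstdio>

const double EDRAM_BYTES = 10.0 * 1024 * 1024;   // Xenos daughter die: 10 MB of eDRAM

// 32-bit color + 32-bit depth/stencil per sample (a common setup, not the only one).
double renderTargetBytes(int w, int h, int msaaSamples) {
    return double(w) * h * msaaSamples * (4 + 4);
}

void report(const char* name, int w, int h, int msaa) {
    double bytes = renderTargetBytes(w, h, msaa);
    int tiles = int(std::ceil(bytes / EDRAM_BYTES));      // predicated tiling passes needed
    std::printf("%-16s %5.1f MB -> %d tile(s)\n", name, bytes / (1024.0 * 1024.0), tiles);
}

int main() {
    report("720p, no MSAA",  1280, 720, 1);    //  ~7.0 MB, fits in one tile
    report("720p, 2xMSAA",   1280, 720, 2);    // ~14.1 MB, needs tiling
    report("1080p, no MSAA", 1920, 1080, 1);   // ~15.8 MB, needs tiling
    return 0;
}

The finished frame is resolved out to main RAM and scanned out from there, which is why the 10 MB figure doesn't cap the output resolution.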
 
Um.... the Xbox 360 *can* do 1080p60. Here's a nice picture. The 10 MB RAM issue is overcome by tiling into main RAM, as any Xbox 360 game running at native 720p resolution with anti-aliasing has to do anyway.

So, yes it is 3D capable.

What about the older 360 models that lack HDMI; are they also capable of 1080p60? If I remember correctly, the component output has some bandwidth restrictions?

Anyways, I guess 3D output will be restricted to the newer models with HDMI.
 
What about the older 360 models that lack HDMI; are they also capable of 1080p60? If I remember correctly, the component output has some bandwidth restrictions?

Anyways, I guess 3D output will be restricted to the newer models with HDMI.


Component can handle 1080p60 no problem; what it can't handle is the AACS!
 