Why did Sony partner with Nvidia instead of ATI for the PS3?

Shifty Geezer said:
Are you (blakjedi) talking about separate backbuffers/frontbuffers for each screen vs. one buffer shared between two screens?

Which is currently used?
Both. You can usually choose in the driver panel whether to use span or dual view. This affects how windows maximize, and whether DirectX fullscreen mode will use one or both screens.

DirectX "fullscreen mode" (which is slightly faster than a screen-covering window in windowed mode) only uses one "adapter", so in span view your 3D scene spans both screens, in dual view it's only on one screen.
DX fullscreen is exclusive to the foreground application/window. Other applications or windows can appear using 3D on the other screen in dual view, even covering the whole screen. But as soon as the window that is set in fullscreen mode loses the focus, it will hide.
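
To make the distinction concrete, here is a minimal Direct3D 9 sketch of that choice: the Windowed flag in D3DPRESENT_PARAMETERS is what selects exclusive fullscreen versus a windowed swap chain. CreateDevice9 is just an illustrative wrapper I made up, and the 1280x720 mode is an arbitrary example.

```cpp
#include <d3d9.h>

// Sketch: a device created with Windowed = FALSE owns its adapter exclusively,
// which is why exclusive fullscreen can only cover one adapter's screens.
IDirect3DDevice9* CreateDevice9(IDirect3D9* d3d, HWND hWnd, bool exclusive)
{
    D3DPRESENT_PARAMETERS pp = {};          // zero everything first
    pp.Windowed      = exclusive ? FALSE : TRUE;
    pp.SwapEffect    = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = hWnd;
    if (exclusive) {
        // Exclusive fullscreen must name an explicit display mode;
        // 1280x720 / X8R8G8B8 is just an illustrative choice.
        pp.BackBufferWidth  = 1280;
        pp.BackBufferHeight = 720;
        pp.BackBufferFormat = D3DFMT_X8R8G8B8;
    } else {
        pp.BackBufferFormat = D3DFMT_UNKNOWN;  // match the current desktop mode
    }

    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);
    return dev;   // NULL on failure; error handling omitted for brevity
}
```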

In DX windowed mode or OpenGL, your 3D scene can span the screens regardless of whether you set span or dual view in the driver panel. However, when you set dual view, the driver will transparently create and manage two render targets. This is far slower, but can get you around render target size limitations. For example, you can actually get a 3D window that's more than 2048 pixels wide on an ATI card this way.
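
Roughly what that dual-view case amounts to is sketched below, under stated assumptions: render the scene twice into two half-width targets, shifting an off-center projection so the halves meet seamlessly. RenderSpanned and DrawScene are hypothetical names, and the two render-target surfaces are assumed to have been created elsewhere (e.g. via CreateTexture with D3DUSAGE_RENDERTARGET).

```cpp
#include <d3d9.h>
#include <d3dx9.h>

void DrawScene(IDirect3DDevice9* dev);   // hypothetical: issues the actual draw calls

// Render one logical wide view (e.g. 4096 pixels) into two 2048-wide targets
// by shifting the projection frustum for each half.
void RenderSpanned(IDirect3DDevice9* dev,
                   IDirect3DSurface9* leftRT, IDirect3DSurface9* rightRT)
{
    IDirect3DSurface9* halves[2] = { leftRT, rightRT };
    for (int i = 0; i < 2; ++i)
    {
        dev->SetRenderTarget(0, halves[i]);
        dev->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
                   D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);

        // Off-center frustum: the full view spans x in [-1, 1] on the near
        // plane, the left target sees [-1, 0], the right target [0, 1].
        D3DXMATRIX proj;
        float l = (i == 0) ? -1.0f : 0.0f;        // left edge of this half
        D3DXMatrixPerspectiveOffCenterLH(&proj,
                                         l, l + 1.0f,    // left, right
                                         -0.5f, 0.5f,    // bottom, top
                                         1.0f, 1000.0f); // near, far
        dev->SetTransform(D3DTS_PROJECTION, &proj);

        dev->BeginScene();
        DrawScene(dev);
        dev->EndScene();
    }
}
```

Drawing the whole scene twice is where the "far slower" comes from: every draw call and most per-frame state work is duplicated, even though each half clips away what the other shows.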

Generally speaking, a single buffer is faster even when you use it in split-screen fashion with multiple viewports. OTOH, it's less flexible because it means identical settings (AA, bit depth, resolution) for both screens. Separate buffers allow separate settings per screen (a low-res, no-AA stats display on an SDTV as the second screen, maybe ;)). And on a console they should only be a little slower than a single buffer.
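
For comparison, the single-buffer split-screen case is just two viewports into the same back buffer, which is exactly why all settings end up shared. A minimal D3D9 sketch, assuming a 1280x480 back buffer and a hypothetical per-player draw function DrawPlayerView:

```cpp
#include <d3d9.h>

void DrawPlayerView(IDirect3DDevice9* dev, int player);  // hypothetical

// Two viewports into one 1280x480 back buffer: a single render target, so
// both halves necessarily share its resolution, format, and AA mode.
void DrawSplitScreen(IDirect3DDevice9* dev)
{
    dev->BeginScene();

    D3DVIEWPORT9 vp = { 0, 0, 640, 480, 0.0f, 1.0f };    // left half
    dev->SetViewport(&vp);
    DrawPlayerView(dev, 0);

    vp.X = 640;                                          // right half
    dev->SetViewport(&vp);
    DrawPlayerView(dev, 1);

    dev->EndScene();
}
```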
 