Anyone who tried to take advantage of the HD titles available for the original XBox is certainly appreciative of Microsoft's decision this generation to use an outboard video display chip with its scaling capabilities. Its inclusion came with some definite trade-offs, though. I wanted to get a pulse from the techies / videophiles in the room to get a sense of whether it was the right decision.
XBox 1 supported 480i, 480p, 720p, and 1080i, and developers were free to choose which of those they wanted to target / support. The problem is obvious: each game is different, and may not be compatible with your TV. For example, some titles only supported 480i, which looked terrible on progressive sets. Most of the HD games only supported 720p, while most TVs only accepted 1080i as an HD input, meaning even if you had an HDTV and an HD game, you would likely only be able to play it at 480p. Such was the case for me and many others. And there were other issues that weren't as apparent. Even among TVs that support all the resolutions, many don't keep a separate screen-position memory for each input rate, so as you switch between a 720p game, a 1080i game, and a 480p game, you may be forced to reposition, center, and resize the picture each time. Or the TV might not re-sync properly and you have to power it off and on. While I have no idea how many support calls and disappointed customers this created, it certainly factored into the 360 design. The 360 simply scales any developer-defined frame-buffer size to any consumer-selected output resolution.
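To make the distinction concrete, here's a rough sketch of the concept in C: the game renders into whatever frame-buffer size the developer picked, and a separate resampling step maps it to whatever output mode the user selected. This is purely illustrative; the actual 360 scaler lives in the display hardware and uses far better filtering than the nearest-neighbor pass shown here, and all the names and resolutions below are just examples I made up.

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    int width, height;
    uint32_t *pixels;   /* packed 32-bit RGBA */
} Framebuffer;

/* Resample src (developer-chosen size) into dst (user-chosen output mode).
 * Nearest-neighbor sampling, used only to keep the sketch short. */
static void scale_nearest(const Framebuffer *src, Framebuffer *dst)
{
    for (int y = 0; y < dst->height; y++) {
        int sy = y * src->height / dst->height;
        for (int x = 0; x < dst->width; x++) {
            int sx = x * src->width / dst->width;
            dst->pixels[y * dst->width + x] = src->pixels[sy * src->width + sx];
        }
    }
}

int main(void)
{
    /* Hypothetical example: the game renders at 1024x600, the user picked 1080. */
    Framebuffer game = { 1024, 600,  calloc(1024 * 600,   sizeof(uint32_t)) };
    Framebuffer out  = { 1920, 1080, calloc(1920 * 1080,  sizeof(uint32_t)) };

    scale_nearest(&game, &out);
    printf("scaled %dx%d -> %dx%d\n", game.width, game.height, out.width, out.height);

    free(game.pixels);
    free(out.pixels);
    return 0;
}

The point is simply that the compatibility problem moves out of the game and into one fixed output stage: the developer never has to know or care which modes your particular TV accepts.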
This design decision seems to have taken many hits. I'm assuming that in its current implementation the scaling chip works in the analog domain, which prevents digital output in the form of DVI or HDMI, and it currently lacks the ability to run at 1080p. The output of the chip is good, better than the encoder in the XBox 1 or PS2. Since the PS3 appears to be going the XBox 1 route and giving developers the control (perhaps with a minimum spec of 720p, anyone know?), do you guys think Microsoft made a bad choice?