Actually, if you're a regular in the 3D Architecture & Chips section, you'll have seen quite a lot of discussion of ROPs over the years across various GPU architectures: the number of ROPs versus the bandwidth available to feed them, how effectively those ROPs are used, how effectively they can use the available bandwidth, and how their capabilities compare to the competition's. ROPs have grown more capable over the years, taking on more dedicated functions.
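To make the ROPs-versus-bandwidth point concrete, here's a rough back-of-envelope sketch in Python. The ROP count, clock, and framebuffer format are made-up example numbers, not any particular GPU:

# Fill rate vs. bandwidth, back of the envelope (example numbers only).
rops = 32                  # colour write units
clock_hz = 800e6           # core clock
bytes_per_pixel = 4        # 32-bit RGBA8 render target

# Peak colour fill if every ROP retires one pixel per clock.
pixels_per_sec = rops * clock_hz

# Write-only traffic; alpha blending also reads the destination,
# roughly doubling the bandwidth the ROPs demand.
write_bw = pixels_per_sec * bytes_per_pixel
blend_bw = write_bw * 2

print(f"Peak fill: {pixels_per_sec / 1e9:.1f} Gpix/s")
print(f"Write-only bandwidth: {write_bw / 1e9:.1f} GB/s")
print(f"Blended bandwidth: {blend_bw / 1e9:.1f} GB/s")

With those numbers the ROPs alone could demand roughly 100-200 GB/s, which is why ROP count has to be balanced against the memory system rather than scaled up in isolation.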
Of course, I may be biased coming from the PC side of things, but higher graphics performance doesn't have to mean higher resolution when you're pushing the 3D graphics envelope. True, a game can be made to run at 60 fps at 2560x1600, but it will never look as good as a game built to run at 60 fps at 1920x1200 or 1920x1080 on the same hardware.
PC graphics hardware is many times faster now than it was in 2005. But with a game that pushes the boundaries (fairly rare) running at max settings on hardware that is current at the game's release, smooth gameplay is generally only achievable at 1920x1080. Even then it may chug at times and require turning down some settings. In that sense, 1920x1080 or 1920x1200 is the PC equivalent of 720p on consoles.
So basically what it comes down to is this: a console game optimized for 720p at 30 or 60 fps will almost always look better than one optimized for 1080p at 30 or 60 fps, assuming the same genre, i.e. open world versus open world or corridor versus corridor, not open world versus corridor shooter.
Absolutely nothing stopped developers from making 1080p games on PS3 and X360, except for one thing: a 1080p 30 fps game would not look as good as a 720p 30 fps game.
The same will likely be true for Orbis and Durango. 1080p 30 fps may look a bit better than 720p 30 fps from the past generation, but it won't look better than 720p 30 fps on the same hardware.
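To put rough numbers on that trade-off (just pixel counts; everything else held equal):

# Per-pixel budget at a fixed frame rate: same GPU, same frame time,
# so each pixel's share of GPU time shrinks as resolution grows.
pixels_720p = 1280 * 720     #   921,600 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

ratio = pixels_1080p / pixels_720p
print(f"1080p has {ratio:.2f}x the pixels of 720p")  # 2.25x

In other words, at the same frame rate a 1080p game has roughly 2.25x less GPU time to spend on each pixel than a 720p game on the same hardware.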
I fully expect that some developers, maybe more than last generation, will target 1080p. But I believe that all the best-looking games will be 720p or perhaps slightly higher.
Not necessarily. It just means there is no dedicated hardware support for it, so it can still be done; it just requires more GPU resources.
Regards,
SB
I'm a PC gamer too. The performance hit of rendering above 1080p (and the cost of the screens) just doesn't interest me. I'd need a very compelling reason to buy such a screen and render at its native resolution, and right now I'm kind of 'meh' towards it.
I fully understand that rendering at a lower resolution provides performance benefits. But I think the console makers and the developers would very much like to hit 1080p for fidelity reasons, and to make the most of the HDTVs their owners already have.
I think 1080p was a bit out of reach for the current-generation consoles for memory and bandwidth reasons. In that situation it just didn't make a whole lot of sense, since 720p provided an adequate resolution jump from prior generations while still letting them get good mileage out of the hardware.
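As a rough sketch of the memory side of that argument (assuming a plain RGBA8 colour buffer plus a 32-bit depth buffer, no MSAA, no extra render targets):

# Raw framebuffer footprint per resolution: 4-byte colour + 4-byte depth.
def framebuffer_mib(width, height, bytes_per_pixel=8):
    return width * height * bytes_per_pixel / (1024 * 1024)

for name, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080)]:
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB")

That comes out to about 7 MiB for 720p versus about 16 MiB for 1080p, before MSAA or any extra render targets; on the X360, for example, only the former fit comfortably in the 10 MB of eDRAM without tiling.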
I don't see why there wouldn't be a similar jump with this new hardware.