I'm having a hard time understanding what the big deal is with ROP "efficiency"; it's never been talked about much for graphics cards. ROP count has always been balanced on an as-needed basis, and the 78xx has 32 ROPs for 153 GB/s, which was supposed to be a good balance. Is Durango expected to perform like a 77xx card? It has a LOT more bandwidth available than a 77xx, unless the DMEs are wasting half of it.
Actually, if you were a regular in the 3D Architecture & Chips section you'd have seen quite a lot of discussion of ROPs over the years for various GPU architectures: the number of ROPs versus the available bandwidth to feed them, how effectively those ROPs are used, how effectively they can use the available bandwidth, and their capabilities compared to the competition. ROPs have grown in capability over the years, taking on more dedicated functions.
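To make the bandwidth angle concrete, here's a rough back-of-the-envelope sketch of how much memory traffic the ROPs alone can generate. The clocks and configs below are illustrative assumptions (the 16 ROP / 800 MHz case is just the rumored Durango setup), not confirmed specs:

```python
# Back-of-the-envelope: bandwidth the ROPs alone can demand for color traffic.
# All figures below are illustrative assumptions, not confirmed specs.

def rop_color_bandwidth_gbps(num_rops, clock_ghz, bytes_per_pixel, blending=True):
    """Peak GB/s of color traffic the ROPs can generate.

    num_rops        -- number of color write units
    clock_ghz       -- GPU core clock (ROPs run at core clock on GCN)
    bytes_per_pixel -- 4 for RGBA8, 8 for FP16 HDR render targets
    blending        -- alpha blending also reads the destination, doubling traffic

    Ignores depth/stencil traffic and compression, so real demand differs,
    but it shows the scale of the problem.
    """
    gpix_per_sec = num_rops * clock_ghz           # peak fill rate, Gpixels/s
    traffic = gpix_per_sec * bytes_per_pixel      # GB/s of color writes
    return traffic * 2 if blending else traffic

# 78xx-class part: 32 ROPs at 1.0 GHz against ~153.6 GB/s of GDDR5.
print(rop_color_bandwidth_gbps(32, 1.0, 4, blending=False))  # 128.0 -- writes alone nearly saturate memory
print(rop_color_bandwidth_gbps(32, 1.0, 4, blending=True))   # 256.0 -- blending can outrun it badly

# Hypothetical 16 ROPs at 0.8 GHz (roughly the rumored Durango config):
print(rop_color_bandwidth_gbps(16, 0.8, 4, blending=True))   # 102.4 GB/s
```

The point being: ROP count in isolation tells you very little; it only means something relative to the bandwidth available to feed it.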
I kind of doubt that the consoles will be targeting anything but 1080p. Or, to put it another way, most games will be at 720p+ resolutions. Whether or not every game actually hits 1920x1080 is another story, though; I'm willing to bet a good portion won't quite get there.
But I think both MS and Sony will apply pressure for devs to target higher resolutions.
Didn't MS, back in the day, have some kind of certification requirement that 360 games be 720p? Obviously a lot of games came out that weren't, but were those special exceptions, or was there no resolution requirement for the platform at all?
Of course, I may be biased coming from the PC side of things. Higher graphics performance doesn't equal higher resolution when you're talking about pushing the 3D graphics envelope. True, a game can be made to run at 60 fps at 2560x1600, but it won't ever look as good as a game made to run at 60 fps at 1920x1200 or 1920x1080.
Graphics hardware on the PC is many times faster now than it was in 2005. But for a game that pushes the boundaries (fairly rare) on hardware that is current when the game is released, smooth gameplay at max settings is generally only achievable at 1920x1080 or 1920x1200. And even then it may chug at times and require turning down some settings. On the PC, 1920x1080 or 1920x1200 is the equivalent of what 720p is for consoles.
So basically what it comes down to is this: a console game optimized for 720p at 30 or 60 fps will almost always look better than a game optimized for 1080p at the same frame rate, assuming the same genre (i.e. open world versus open world, or corridor versus corridor, not open world versus corridor shooter).
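To put rough numbers behind that, here's a quick sketch using the resolutions discussed above, assuming the same GPU and the same target frame rate:

```python
# Per-pixel GPU budget scales inversely with pixel count at a fixed frame rate.
resolutions = {
    "1280x720  (720p)":  1280 * 720,
    "1920x1080 (1080p)": 1920 * 1080,
    "1920x1200":         1920 * 1200,
    "2560x1600":         2560 * 1600,
}

baseline = resolutions["1280x720  (720p)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} Mpixels, "
          f"{baseline / pixels:.2f}x the per-pixel budget of 720p")
```

Flip the 1080p figure around and a 720p game has roughly 2.25x the per-pixel budget of a 1080p game on the same hardware, which is why the 720p game can afford better per-pixel lighting, shading and effects.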
Absolutely nothing stopped developers from making 1080p games on PS3 and X360, except for one thing: a 1080p 30 fps game would not look as good as a 720p 30 fps game.
The same will likely be true for Orbis and Durango. 1080p 30 fps may look a bit better than 720p 30 fps from the past generation, but it won't look better than 720p 30 fps on the same hardware.
I fully expect that some developers, maybe more than last generation, will target 1080p. But I believe all the best-looking games will be 720p or perhaps slightly higher.
So that basically means PlayStation games will have to use the same resolution for the interface and the game, while Durango can render the interface and the game at different resolutions?
Not necessarily. It just means there is no dedicated hardware support for it, so it can still be done; it just requires more GPU resources.
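For illustration, here's roughly what that extra GPU work amounts to. This is just a CPU-side numpy sketch of the scale-and-composite pass; in practice it would be a fullscreen shader pass, and everything here is made up for the example, not actual Durango behavior:

```python
import numpy as np

def nearest_upscale(img, out_h, out_w):
    """Nearest-neighbor upscale (a hardware scaler or shader pass would filter nicely)."""
    ys = np.arange(out_h) * img.shape[0] // out_h
    xs = np.arange(out_w) * img.shape[1] // out_w
    return img[ys][:, xs]

def composite(game_rgba, ui_rgba):
    """Scale a sub-native game frame up to the UI's resolution, then alpha-blend
    the UI over it ('over' operator). With dedicated display plane hardware this
    composition happens at scan-out for free; without it, the GPU spends a
    fullscreen pass plus the bandwidth to read both buffers and write the result."""
    out_h, out_w = ui_rgba.shape[:2]
    game = nearest_upscale(game_rgba, out_h, out_w)
    a = ui_rgba[..., 3:4]
    out = np.empty_like(ui_rgba)
    out[..., :3] = ui_rgba[..., :3] * a + game[..., :3] * (1.0 - a)
    out[..., 3] = 1.0
    return out

# Example: game rendered at 1280x720 to save fill rate, HUD/text at native 1080p.
game = np.random.rand(720, 1280, 4).astype(np.float32)
hud = np.zeros((1080, 1920, 4), dtype=np.float32)  # mostly transparent overlay
frame = composite(game, hud)
print(frame.shape)  # (1080, 1920, 4)
```

The cost without hardware planes is that read-scale-blend-write pass every frame, which is GPU time and bandwidth taken away from rendering the game itself.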
Regards,
SB