Heh, fire up good ol' Duke Nukem at 640*480 versus 1024*768... DN3D probably uses textures that are like 64*64 pixels or so (maybe at most) and blows them up across an entire wall sometimes, and it still makes a heck of a difference in-game. Of course, it doesn't mipmap, so everything distant beyond a certain limit turns into a mess of crawly ants; increasing res pushes that distance back... And don't higher resolutions on the same screen have a huge impact on texture quality as well?
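(For the curious: the mipmapping DN3D skips is conceptually just a chain of pre-filtered, progressively smaller copies of each texture, and the renderer picks whichever level is roughly one texel per screen pixel. A minimal sketch in Python/NumPy, assuming square power-of-two textures; the 64*64 size is only my guess from above:)

[code]
import numpy as np

def build_mipmaps(texture):
    """Build a mipmap chain by repeated 2x2 box-filter averaging.

    texture: square power-of-two HxWxC float array.
    Returns [level0, level1, ...] down to 1x1.
    """
    levels = [texture]
    while min(levels[-1].shape[:2]) > 1:
        prev = levels[-1]
        h, w = prev.shape[0] // 2, prev.shape[1] // 2
        # Each texel of the next level is the average of a 2x2 block above it.
        levels.append(prev.reshape(h, 2, w, 2, -1).mean(axis=(1, 3)))
    return levels

wall = np.random.rand(64, 64, 3)   # stand-in for a 64*64 wall texture
print([lvl.shape for lvl in build_mipmaps(wall)])
# (64, 64, 3), (32, 32, 3), ..., (1, 1, 3)
[/code]

Sampling the pre-averaged levels in the distance is what stops far-away texels from turning into that shimmering mess of crawly ants.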
If you have higher resolution but same screen size you will see less aliasing. If you move from 720p to 1080p on the same TV you will see less aliasing.
I have a 24 inch 120Hz display for my PC that has a feature called "Smart Scaling", which allows a 1:1 ratio between screen resolution and render resolution. I connected my PlayStation to it and used the 1:1 ratio and it looked absolutely great! The image was razor-sharp; it looked just as if it were rendered on a high-end PC. The downside: the image was tiny, with huge black bars at the sides of it...
I want that for next gen (the razor-sharpness, not the tiny image). I'm not going to buy a console if it isn't playing games in 1080p, and I'm not going to buy a single game that is not rendered in 1080p (except maybe for dynamic-resolution rendering techniques). It's 2013, I've had 720p for 8 years...
It's just supersampling: render at twice the screen resolution and shrink back down to native res.

How about TXAA and the new AA Timothy Lottes is creating?
Yes, that's the whole point of my post. Just saying "high resolution" is meaningless and rather misleading. Is a modern-day, top-of-the-line 1080p 80" TV going to have less noticeable aliasing than a top-of-the-line 720p 40" TV from 2003? It has more resolution, no? But how far you sit from the TV will actually determine whether it appears more aliased or not.
There's a difference between sitting 1-2 feet away from a desktop monitor and 3-4 meters from a living room TV. In the first case, I can quite easily notice whether I'm running at native resolution or not. On my 55" TV that is 3-4 meters away in my living room? Absolutely no visible difference between 720p and 1080p (and this is with PC games where I can certainly crank up the detail).
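To put a rough number on why distance matters so much, here's a back-of-the-envelope pixels-per-degree check (the ~60 px/deg threshold is the usual 20/20-acuity rule of thumb; the sizes and distances are just the ones mentioned in this thread):

[code]
import math

def pixels_per_degree(diagonal_inches, horizontal_pixels, distance_m, aspect=16 / 9):
    """How many pixels fit into one degree of the viewer's field of vision."""
    # Screen width from the diagonal and aspect ratio.
    width_m = diagonal_inches * 0.0254 * aspect / math.hypot(aspect, 1)
    pixel_m = width_m / horizontal_pixels
    # Angle subtended by a single pixel at the given viewing distance.
    pixel_deg = math.degrees(2 * math.atan(pixel_m / (2 * distance_m)))
    return 1 / pixel_deg

# 55" TV at 3.5 m: both land at or above ~60 px/deg, so the step up is hard to see.
print(pixels_per_degree(55, 1280, 3.5))   # ~64
print(pixels_per_degree(55, 1920, 3.5))   # ~96
# 24" monitor at 0.6 m: both are well below it, so the difference is obvious.
print(pixels_per_degree(24, 1280, 0.6))   # ~25
print(pixels_per_degree(24, 1920, 0.6))   # ~38
[/code]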
As to your second paragraph. I'd be prepared to not play a lot of games on Orbis and Durango then. Especially not the ones that try to push graphics to the next generation.
Basically pick one.
Current gen level of graphics at 1080p. Or next gen graphics at 720p.
Regards,
SB
Basically pick one.
Current gen level of graphics at 1080p. Or next gen graphics at 720p.
(...) while 720p titles will use PPAA/MSAA combos a la Horizon/American Nightmare for better IQ.
AA doesn't have to add blur. MSAA and SSAA don't introduce any blur. In the case of 1080p 2xMSAA vs 720p 8xMSAA+16xAF, 720p will have the better IQ, but if viewed with a wide FOV, the higher resolution may be preferable.

Image quality is not only anti-aliasing. A 720p game with a more potent AA algorithm may have less aliasing, but it will have a blurry image with blurry textures compared to the 1080p game.
Downsampling is antialiasing, known technically as supersampling. This is the highest quality AA because it samples all values, so it includes texture filtering too, but it comes at considerable cost. MSAA uses selective supersampling on geometry edges, but doesn't filter textures/shaders. However, high levels of MSAA will reduce the amount of visible steps in an aliased edge by more than a 2x supersampled image will. Similarly, high levels of texture filtering will reduce texture aliasing far better than 2 samples per pixel. That's where development of more targeted, more efficient techniques is important, and supersampling is left to PCs with too much power on their hands and nothing else to do except rendering everything 5x bigger.

So we have one extreme on one side (low res + high AA) and one on the other (downsampling).
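As a concrete sketch of that "render big, shrink back down" resolve (ordered-grid supersampling with a plain box filter; the resolutions here are just example numbers):

[code]
import numpy as np

def supersample_resolve(render, factor=2):
    """Average factor x factor fully shaded samples into each output pixel.

    Because every sample is fully shaded, geometry edges, textures and shader
    aliasing all get filtered -- at factor^2 times the shading cost.
    """
    h, w, c = render.shape
    return render.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

frame = np.random.rand(1440, 2560, 3)        # rendered at 2x the 720p target
print(supersample_resolve(frame, 2).shape)   # (720, 1280, 3)
[/code]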
Most 2D games are sprite based and hence don't have polygons, which happen to be the main source of aliasing in 3D games. The ones that do (Street Fighter 4, Marvel vs Capcom 2) have obvious aliased edges.

I have a silly question, I've just thought of it and realise I don't really know.
How do 2/2.5D XBLA/PSN/PC type games do AA?
I'm playing the excellent Don't Starve on my Mac, it's running in Chrome and the IQ is excellent, does it render using vectors or something?
MSAA uses selective supersampling on geometry edges, but doesn't filter textures/shaders. However, high levels of MSAA will reduce the amount of visible steps in an aliased edge by more than a 2x supersampled image will. Similarly, high levels of texture filtering will reduce texture aliasing far better than 2 samples per pixel. That's where development of more targeted, more efficient techniques is important, and supersampling is left to PCs with too much power on their hands and nothing else to do except rendering everything 5x bigger.
The reason why games with 2D art usually do not have aliasing is the pre-filtering of the art.

Most 2D games are sprite based and hence don't have polygons, which happen to be the main source of aliasing in 3D games. The ones that do (Street Fighter 4, Marvel vs Capcom 2) have obvious aliased edges.
Considering that the game is a 2D game, you can easily use whatever method you like for it (including many which require no polygons, like distance fields). However, I'd like to know what they do in the case of games like Rayman Origins, where they have something like a fluid that moves (the water) but looks sharp with no aliasing; it's still a shader, and I suppose it's made of polygons too, considering it reacts and deforms to the character's movements.
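On the distance-field option: a minimal sketch of the idea, where per-pixel coverage comes straight from a signed distance function with a roughly one-pixel falloff (a circle is used here purely as a stand-in shape):

[code]
import numpy as np

def smoothstep(edge0, edge1, x):
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def render_sdf_circle(width, height, cx, cy, radius):
    """Shade a 2D shape from its signed distance field with a soft 1-pixel edge."""
    y, x = np.mgrid[0:height, 0:width].astype(float)
    d = np.hypot(x - cx, y - cy) - radius    # signed distance: <0 inside, >0 outside
    return 1.0 - smoothstep(-0.5, 0.5, d)    # coverage fades over ~1 pixel of distance

img = render_sdf_circle(128, 128, 64.0, 64.0, 40.0)
print(img.shape, img.min(), img.max())       # (128, 128) 0.0 1.0
[/code]

Because the edge width is defined in screen pixels rather than baked into the art, the shape stays sharp but alias-free at any resolution.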
Dropping 8xMSAA should help as well.

Next gen should be able to get pretty close, without the PC overhead and with the better CPU/APU/GPU integration.
Basically pick one.
Current gen level of graphics at 1080p. Or next gen graphics at 720p.
Really?? Are things looking that bad for next gen?
The jump from PS2 to PS3 was quite enormous: we got higher-res graphics and better IQ at the same time as a level of detail that was many times better - think of any game really, but Uncharted comes to mind.
Surely a PS4 will be able to increase the IQ as well as increase the level of detail, without having to choose between the two?
I would be very, very disappointed if that were not the case. They're many years apart after all.
It may not be that bad. But it's what my expectations are. If it manages to improve on that, then that'll be a nice surprise for me. But, given the specs released for each console thus far, that's about what I'm expecting.
Consider that on PC it takes a rather significant increase in rendering power to drive increased graphics rendering technology combined with increased resolution. Usually with a generation-to-generation upgrade you expect minor increases in one or the other, but not both. It takes multiple generations to see a significant increase in one or the other - again, not both - and potentially more than an order of magnitude increase to have both better graphics rendering technology and an increase in playable rendering resolution.
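For a sense of scale, the resolution half of that equation alone is already a 2.25x jump in shaded pixels before any new rendering tech is layered on top:

[code]
pixels_720p = 1280 * 720      # 921,600
pixels_1080p = 1920 * 1080    # 2,073,600
print(pixels_1080p / pixels_720p)   # 2.25
[/code]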
The benefit on PC however is that if the developer offers comprehensive control over the rendering tech used, you choose your own compromises between resolution, rendering speed, and the level of graphics technology used.
For consoles, the developer offers one setting that everyone uses. Hence you'll likely either get high resolution at similar to slightly better quality than current gen, or current gen resolution (maybe slightly higher, i.e. no sub-720p) with greatly enhanced graphics technology/performance.
Regards,
SB
They use MLAA with more refined edge searching.

They should go with the GoW:A/Beyond/LastOfUs post-processing AA solution or something similar. Those Sony games have fantastic IQ.
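For anyone curious what "edge searching" means in practice, here's a very stripped-down sketch of the detection stage a morphological AA pass starts from; the pattern classification and blend-weight steps that give MLAA its quality are only noted in the comments, not implemented:

[code]
import numpy as np

def find_luma_edges(frame, threshold=0.1):
    """Flag horizontal/vertical luma discontinuities for a post-process AA pass.

    A full MLAA-style pass would then walk along these runs of flagged pixels,
    classify the edge shapes (L/Z/U patterns) and blend neighbouring pixels by
    the coverage each pattern implies; only the detection step is shown here.
    """
    luma = frame @ np.array([0.299, 0.587, 0.114])
    horiz = np.zeros(luma.shape, dtype=bool)
    vert = np.zeros(luma.shape, dtype=bool)
    horiz[:-1, :] = np.abs(luma[1:, :] - luma[:-1, :]) > threshold  # edge below each pixel
    vert[:, :-1] = np.abs(luma[:, 1:] - luma[:, :-1]) > threshold   # edge to the right
    return horiz, vert

h_edges, v_edges = find_luma_edges(np.random.rand(720, 1280, 3))
print(h_edges.sum(), v_edges.sum())
[/code]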