Will next gen consoles focus on improving IQ at current HD resolutions?

Yes, but my point is comparing scaling on 720p and 1080p displays. It's pretty obvious when a game is lower resolution. There are quite a lot of "720p" displays that still upscale to a 1366x768 panel, and I'm not sure that's significantly better in practice (perception and in motion) than upscaling to 1080p (in order to call "720p upscaled to 1080p" horrendous).

CoD still looks like a blurry mess regardless of the display type, but I can't say it's any better on a 720p/768p display than a 1080p one.
 
IF a blurry 600p mess = 30 million sold, expect more. It obviously hasn't been a major impediment to sales for some titles. It's a tradeoff: if you add 20+% more pixels, they are going to have to take something away, perhaps something you'd actually prefer to the pixels. Would COD have sold better if it were 720p with AA and ran at 30FPS? I doubt it.
 
Well, part of the CoD resolution does have to do with fitting into eDRAM in a single tile whilst taking advantage of MSAA to give better edge sampling. Would we have seen a different resolution if there were enough eDRAM? They would still target 60fps, and as should be obvious, the quality of shading or # of effects would have scaled.
 
I still haven't seen a single game that looks more realistic than amateur shot VHS home videos. And VHS resolution is 333x480 (luma resolution, chroma is only 40x480). If we could get that good pixel quality, I would be willing to go that low in resolution :)

Although I visit this forum daily, it's been ages since I last posted, but I couldn't help seconding this, because it's exactly what I think. IMHO some people are focusing way too much on resolution, and I think clean IQ is even more important than that. I think that camera effects (such as DOF and subtle blurring/motion blur) and a PERFECT AA method would help a lot.

Besides this, I think that lighting is what provides the most realism to graphics: in real life you can have a single cube in an empty four-walled room, that cube painted in one colour, and you will still perceive that image as real life mostly because of lighting and shadows. I mean, the cube has "a low polygon count", and the same goes for the room. They don't even have "textures"! But you can see they're real. In 3D graphics it's almost the same. When you're working with a high-polygon model (not for a game, but for some art piece, for instance) and you see a real-time viewport preview, it might look like shit because of the lighting, no matter how many polygons it has or how high-res the textures are: you won't get a realistic image until you render it with the proper light/ambience properties.
 
It shouldn't matter if you are playing COD on a 720/768 or 1080 set; it's going to be blurry either way because of the 600p rendering. Upscaling just spreads the image across more pixels, which if anything should lead to smoother rounded edges and less of a stairstep/sawtooth look.

On the topic of 720p TVs, 1366 x 768 is good, better than 1280 x 720.

Example >

You have a 720p image that you need to blow up to 32"

You can either blow it up @ 1280 x 720 and deal with blockier pixels

Or you can scale it to 1366 x 768, and then blow that up to 32", for smoother edges

That's the one part geeks seem to forget: they worry so much about it being native res that they forget such a thing barely exists in the TV world due to overscan and other issues, and they also forget that the image is being blown up 200, 300, 400% to fit on their big-screen TV. More pixels is always better.
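A quick back-of-the-envelope sketch (in Python) of the numbers involved. The 32" diagonal and 16:9 panel shape are assumptions just for illustration:

[code]
import math

# Rough comparison of what happens to a 1280x720 source on different panels.
DIAGONAL_IN = 32.0
PANEL_WIDTH_IN = DIAGONAL_IN * 16 / math.hypot(16, 9)  # about 27.9 inches wide

panels = {
    "1280x720 panel": 1280,
    "1366x768 panel": 1366,
    "1920x1080 panel": 1920,
}
for name, panel_w in panels.items():
    scale = panel_w / 1280                       # how much the 720p source is stretched
    pitch_mm = PANEL_WIDTH_IN * 25.4 / panel_w   # physical width of one panel pixel
    print(f"{name:16s} scale {scale:.3f}x, pixel pitch ~{pitch_mm:.3f} mm")
# Any scale other than 1.0 means the source pixels get interpolated, but the
# finer pixel pitch is what makes the blown-up image look less blocky at a
# given screen size.
[/code]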
 
I still haven't seen a single game that looks more realistic than amateur shot VHS home videos. And VHS resolution is 333x480 (luma resolution, chroma is only 40x480). If we could get that good pixel quality, I would be willing to go that low in resolution :)
I can't agree with that. VHS sucks! I know it was okay at the time but with TVs getting bigger, and LCDs being far less forgiving, there's no way that degree of blur would be workable nowadays. My eyes would get strained trying to focus on objects that cannot be focussed!

There is the possibility of smart upscaling, if a game could be rendered at a lower resolution and upscaled in resolution and framerate successfully. Whatever happened to Toshiba's Cell-powered super upscaler? Did that ever work well? Those techs have potential I think as a compromise. And upscaling rendered games rather than video feeds means different game data could be used, which could lead to a much smarter upscale than a TV working on 2D colour data (and temporal changes) alone. Basically the same sort of idea as AA reconstruction. Taking a photorealistic 960x540 or lower game at 30fps, using GPU power to render awesome pixels, and then upscaling that cleverly to 1920x1080 60fps in a way that works well enough could be a workable solution, as long as upscaling has performance benefits and doesn't cost more than just rendering at the higher res!
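As a purely hypothetical sketch of the "use game data" idea, here is a toy depth-guided upsample in Python: the low-res colour buffer is upscaled while a full-res depth buffer decides which neighbouring samples to trust. Every name, size and parameter here is made up for illustration, and it is nothing like any shipping implementation:

[code]
import numpy as np

def depth_guided_upsample(low_color, high_depth, sigma_d=0.1):
    """Upsample low_color (h x w x 3) to the size of high_depth (2h x 2w),
    weighting nearby low-res samples by how well their depth matches."""
    H, W = high_depth.shape
    h, w = low_color.shape[:2]
    # Depth at the low-res sample positions (nearest sample).
    low_depth = high_depth[::2, ::2][:h, :w]
    out = np.zeros((H, W, 3), dtype=np.float32)
    for y in range(H):
        for x in range(W):
            ly, lx = min(y // 2, h - 1), min(x // 2, w - 1)
            ys = (ly, min(ly + 1, h - 1))
            xs = (lx, min(lx + 1, w - 1))
            weights, colours = [], []
            for sy in ys:
                for sx in xs:
                    # Samples across a depth edge get almost no weight, so
                    # object silhouettes stay crisp instead of smearing.
                    dd = high_depth[y, x] - low_depth[sy, sx]
                    weights.append(np.exp(-(dd * dd) / (2 * sigma_d ** 2)) + 1e-6)
                    colours.append(low_color[sy, sx])
            out[y, x] = np.average(colours, axis=0, weights=weights)
    return out

# Toy sizes to keep the pure-Python loop quick; a real version would be a shader.
low = np.random.rand(135, 240, 3).astype(np.float32)   # quarter-res colour
depth = np.random.rand(270, 480).astype(np.float32)    # full-res depth
print(depth_guided_upsample(low, depth).shape)          # (270, 480, 3)
[/code]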
 
Well, part of the CoD resolution does have to do with fitting into eDRAM in a single tile whilst taking advantage of MSAA to give better edge sampling. Would we have seen a different resolution if there were enough eDRAM? They would still target 60fps, and as should be obvious, the quality of shading or # of effects would have scaled.

I don't want to turn this thread into a CoD framerate/resolution discussion, but I don't know that eDRAM was the only reason for that decision, as the PS3 version runs at a lower resolution (I suppose that compromise was for taking advantage of horizontal scaling) at a generally poorer framerate (talking Black Ops, using Digital Foundry as the source). So even if the 360 had more eDRAM, I'm still not sure you'd have seen the game running at a higher resolution. I didn't really want to get into specifics because it doesn't really suit the topic; different developers may have made different decisions given the same resources, and it's hard to know how those choices would affect the end product.

Obviously people have different responses to rendering resolution and I don't think I'd use the term blurry mess to describe COD on my 55" TV from 10', and I expect most of the reviewers chose against that terminology as well. I don't expect developers to lock into 1080p just because almighty has great eyesight, or plays 3' from his display or whatever else makes him sensitive to the lower resolution. Developers need to make the best choice for their whole target audience and it's going to mean compromises that seem offensive to certain people, but that doesn't make it a wrong choice as the alternatives are almost always going to be an unknown to the end user. (I'm quite certain they didn't choose 600p over 720p because they were lazy.) Hardware will certainly impact the direction of those compromises, and 3rd party developers may often need to choose some sort of middle ground to offer the best similar experience (provided it's reasonably possible).
 
As for 720p games looking "horrid on pretty much all 1080p TV's", I just don't believe that to be true; most 1080p TVs display 720p as well as 720p TVs do. That being said, on my little 32" Panasonic LCD, the image quality at 720p is outstanding.

This will vary wildly depending on your TV. All my plasma TVs upscale horribly, for example, so 360 games looked fine since the console has a hardware scaler, but all PS3 games looked blurry until I used an external scaler. Kinda moot though, since the next consoles will almost certainly both have good hardware scalers this time around.


What a lot of people don't know is that even your 1080p set is scaling the image... all TVs scale the image because all TVs have overscan. Even your precious 1080p has pixels that aren't being used :) everything is scaled :) isn't technology fun???

Yes, this is very important to mention! This, for example, is why some people don't see the blur caused by post-process anti-aliasing, don't notice QAA blur as much, don't see the huge sharpness difference between 1920x1080 and 1280x720, etc. Their TV is always blurring the image on its own, even when fed a "native resolution" signal, so they don't notice other blurs like FXAA or QAA as easily, even when watching 1920x1080 material. On the other hand, set your TV to 1:1 pixel mapping so that it bypasses the TV processing and the sharpness of a 1920x1080 feed will jump out at you; the difference is big! In that situation you will more easily notice anything that blurs the image, like lower res or post-process AA. It may be why some PC gamers are far more sensitive to it: if they game at 1920x1080 all the time with 1:1 mapping set on their TV, then the blur from dropping down to console res and post AA is significant.
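As a rough illustration (in Python) of what overscan does to a "native" feed; the overscan percentages here are just assumptions, real values vary by set and picture mode:

[code]
# With overscan the TV crops the edges of the source and then stretches the
# remaining area back out to fill the full panel, so nothing maps 1:1.
def overscan_effect(width, height, overscan_pct):
    crop_w = width * (1 - 2 * overscan_pct)   # crop on both left and right
    crop_h = height * (1 - 2 * overscan_pct)  # crop on both top and bottom
    scale = width / crop_w                    # effective upscale after the crop
    return crop_w, crop_h, scale

for pct in (0.025, 0.05):
    cw, ch, s = overscan_effect(1920, 1080, pct)
    print(f"{pct:.1%} overscan per edge: ~{cw:.0f}x{ch:.0f} of the source kept, "
          f"then rescaled by ~{s:.3f}x")
[/code]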


Personally, I think that if a game runs at 1920x1080, on most TVs at normal sitting distances, the game doesn't need AA and will still look better than 1280x720 with 4xAA.

Depends on the game's color palette as well. A game like Mirror's Edge will still look very aliased at 1920x1080, whereas Batman won't be that bad.
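For a sense of the raw numbers behind that comparison (pixel and sample counts only; content, shader aliasing and filter quality obviously matter too), a small Python sketch:

[code]
# With MSAA, coverage/depth is sampled N times per pixel but shading is still
# done once per pixel, so the two columns below are worth separating.
configs = {
    "1920x1080, no AA":  (1920 * 1080, 1),
    "1280x720, 4x MSAA": (1280 * 720, 4),
}
for name, (pixels, samples) in configs.items():
    print(f"{name:20s} shaded pixels: {pixels:>10,}  "
          f"edge/coverage samples: {pixels * samples:>10,}")
# 1080p shades ~2.07M pixels vs ~0.92M at 720p, while 720p 4xMSAA has more
# raw coverage samples (~3.69M) for geometry edges, hence "it depends on
# the content".
[/code]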


I still haven't seen a single game that looks more realistic than amateur shot VHS home videos. And VHS resolution is 333x480 (luma resolution, chroma is only 40x480). If we could get that good pixel quality, I would be willing to go that low in resolution :)

I was waiting for someone to bring that up :) Although truth be told, VHS looks blurry as hell. The example I would use is ESPN or ABC, which broadcast their HD channels at 1280x720, yet I don't see people complaining. I wonder how many even know that their shiny new 1080p TV is actually being fed 720p while they watch those channels.
 
I fully expect 1080p next gen to be as common as 720p is this gen, if not more; just take a look at the benchmarks for modern GPUs, they cut through 1080p like a knife through warm butter. But personally I still think a 720p game looks good on a 1080p set, so it really depends on your own tolerance.

I hope HDMI updates to handle 4096x2160 resolution (update edit: it seems current cables may be able to handle it). While impractical for most games, I could see some less demanding PSN and Live games using it.
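Some rough pixel-rate arithmetic (Python) for that resolution; 24 bits per pixel and the listed refresh rates are assumptions, and blanking/protocol overheads are ignored:

[code]
def raw_video_rate_gbps(width, height, bpp, hz):
    # Raw, uncompressed pixel data rate in gigabits per second.
    return width * height * bpp * hz / 1e9

for hz in (24, 30, 60):
    rate = raw_video_rate_gbps(4096, 2160, 24, hz)
    print(f"4096x2160 @ {hz} Hz, 24 bpp: ~{rate:.1f} Gbit/s of raw pixel data")
# ~5.1, ~6.4 and ~12.7 Gbit/s respectively, which is roughly why 4K at
# film/video frame rates is far less demanding on the link than 4K at 60 Hz.
[/code]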

I don't know if it can be done, but it would be interesting to see whether different elements in the scene can be rendered at different resolutions. We know that some things can be upscaled with negligible difference from native higher res, other things not so much. If we could render portions of the screen at different resolutions and upscale the simpler portions, we'd likely be able to handle far higher resolutions in many scenes. There may be reasons why this is difficult if not impossible; someone with more knowledge may know. (We know that the retina only has a small high-resolution area in the fovea, yet we perceive most of the entire view as high res, implying the lower-res portions are somehow upscaled.)
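A minimal Python sketch of the compositing side of that idea, assuming a split into a full-res opaque layer plus a half-res effects layer (the buffers, their contents and the nearest-neighbour upscale are illustrative placeholders):

[code]
import numpy as np

FULL_W, FULL_H = 1280, 720

# Full-resolution opaque layer (RGB) and half-resolution effects layer (RGBA).
opaque = np.random.rand(FULL_H, FULL_W, 3).astype(np.float32)
effects = np.random.rand(FULL_H // 2, FULL_W // 2, 4).astype(np.float32)

# Cheap 2x nearest-neighbour upscale of the low-res layer.
effects_up = effects.repeat(2, axis=0).repeat(2, axis=1)

# Standard "over" alpha blend of the upscaled layer onto the full-res one.
alpha = effects_up[..., 3:4]
frame = effects_up[..., :3] * alpha + opaque * (1 - alpha)

print(frame.shape)  # (720, 1280, 3): full-res output, quarter the shading
                    # work for whatever was pushed into the half-res layer
[/code]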
 
I don't want to turn this thread into a CoD framerate/resolution discussion, but I don't know that eDRAM was the only reason for that decision, as the PS3 version runs at a lower resolution at a generally poorer framerate (talking Black Ops, using Digital Foundry as the source).

Sure, it's not the only reason. The point I'm trying to make is that it has certainly been important (since CoD3) in picking a particular rendering resolution, since that choice is otherwise essentially arbitrary.

In trying to hit 60fps, it's pretty clear that a higher resolution makes it harder, and on one of the platforms you don't want to do redundant geometry work (360 tiling). We know that MSAA can be done in a single cycle (2x on PS3, 4x on 360). While there are memory and memory bandwidth considerations, 2x is free for the ROPs (note: because of said bandwidth considerations, it's not free overall. I'm just talking about hardware efficiency, not multi-cycle costs. Lowering the resolution will clearly reduce the bandwidth requirements anyway).

So the question is then: what resolution with 2xAA? Well, if you look at the 10MB of eDRAM available for a single tile on 360, there's your answer. It should be clear that 2xAA 600p gives better edge resolution and stability than 720p with no AA, not just an overall higher number of samples.
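A back-of-the-envelope Python version of that single-tile check, assuming a 32-bit colour target plus a 32-bit depth/stencil target per sample (real surface formats and any compression are ignored):

[code]
EDRAM_BYTES = 10 * 1024 * 1024  # 10 MB of eDRAM on 360

def tile_footprint(width, height, msaa, bytes_color=4, bytes_depth=4):
    # Colour + depth storage for every multisample in the framebuffer.
    samples = width * height * msaa
    return samples * (bytes_color + bytes_depth)

for label, (w, h, aa) in {
    "1024x600, 2xAA":  (1024, 600, 2),
    "1280x720, 2xAA":  (1280, 720, 2),
    "1280x720, no AA": (1280, 720, 1),
}.items():
    size = tile_footprint(w, h, aa)
    fits = "fits in one tile" if size <= EDRAM_BYTES else "needs tiling"
    print(f"{label:17s} ~{size / 2**20:5.2f} MB -> {fits}")
# 1024x600 with 2xAA comes in just under 10 MB, while 1280x720 with 2xAA
# does not, which is the single-tile argument above.
[/code]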

What happened on PS3 for Black Ops a few years later is irrelevant since that has more to do with pushing the envelope for other effects and shaders on the other platforms and trying to maintain parity. The point is that the 360 versions are still set to 2xAA @ 1024x600. That configuration sets the bar for their pixel shaders and alpha throughput, and if the other platform can't cope, then that's too bad.

The PS3 SKU also has less usable RAM, and the gap widens when you consider that resolves on 360 result in a single sample per pixel (unless you purposely alias to a larger texture to get access to the multisamples). Also see Crysis 2's 1024x720 saving them 14MB of RAM, for example.

Even so, we are again back to 1024x600 for both versions in MW3. It remains to be seen what the next title will do. If/when the 360 version goes below 1024x600 2xAA, then it is fairly safe to say that the eDRAM isn't the main concern - the per pixel effects are. I'm not leaving room for a deferred setup since that makes it much more difficult to hit 60fps for this sort of title.
 
I was waiting for someone to bring that up :) Although truth be told, VHS looks blurry as hell. The example I would use is ESPN or ABC, which broadcast their HD channels at 1280x720, yet I don't see people complaining. I wonder how many even know that their shiny new 1080p TV is actually being fed 720p while they watch those channels.

While it's a good point, I have now seen many instances where games look better than real-life footage at those resolutions. No, not always more realistic, but I wouldn't make the trade in all circumstances. Sometimes sharpness is more valuable.

Of course, a car in a game like Gran Turismo 5 already looks better than on VHS on many occasions (maybe even most), especially with the variable day/night and weather settings. That is still an exception now, but it will become true more and more as we go on. But yes, we are certainly not there quite yet.

 
Well, I think that 720p as a baseline isn't going anywhere.
This is what Rein said mid gen:
Mark Rein said:
Over half the users who played Gears of War 2 so far do not have HDTVs.

Ouch, and so easily forgotten. The interview is here, from summer 2009.
Yes, a core gamers' game, M-rated: the nail in the coffin of what most believe about the typical core gamer's playing environment.
Lots of people can't access the best TV in the house. It's used too much, so when they can, they prefer to connect to whatever old TV they have and be free of any gaming schedule. That's why I believe the WiiUmote has a place in securing the console a spot in the main living room.

Things are getting way better and old TVs should soon be replaced by HD-ready sets, but there is still SD TV ground to cover, even in the US; I've seen quite a few already, just not in the main living room.

For me 720p is still a sane base, but devs should be given the choice, with no arbitrary policies; as with 30/60fps, let them choose depending on what they want to achieve.
 
Developers need to make the best choice for their whole target audience and it's going to mean compromises that seem offensive to certain people, but that doesn't make it a wrong choice as the alternatives are almost always going to be an unknown to the end user.

Exactly.

Also, what we on the forums don't know exactly is what the statistics say about the number of consoles even set to high-def vs SD; that idea goes back to Mark Rein's comment about more than half of the Gears 2 players not even playing in HD.

edit: ha, liolio already mentioned that tidbit.

At any rate, I'd love to see the stats for Call of Duty gamers considering just how many more people have bought it. I'm not saying devs should just go with 480p and call it a day, but it's a bit naive to hate on them for choosing a sub-720p resolution, when 720p itself doesn't even match the native resolution of a lot of "720p displays" (back to the 768p thing).
 
Well, I hope they put some good scaling hardware in there. Assuming they can eliminate IQ issues at 1080p or 720p, it'd be nice to have a quad HD upscaler in there. The ability to do native quad HD would also be welcome for browsing, a future YouTube, and maybe even a next-gen quad HD Blu-ray standard.

100-inch-plus next-gen TVs would benefit, and so would next-gen projectors.
 
I would add, for those scared about possibly underwhelming specs, that if this time around no extra power is spent feeding a resolution increase, then a smaller increase in raw power may translate into a greater jump in power per pixel than the previous generation provided.
Say 4x the raw power means 4x more operations per pixel (plus all the architectural improvements, a big jump in texturing power, etc.), for example. I'm not sure how much of a per-pixel jump the move from PS2 to PS360 provided once the jump in resolution is taken into account.

Still not perfect, but it should sweeten the pill for those who scream at really under-specced systems.
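Some rough per-pixel arithmetic (Python) to make that concrete; the 4x power figure and the resolutions are purely illustrative assumptions:

[code]
def pixels(w, h):
    return w * h

power_ratio = 4.0                                       # assumed raw power jump
res_same    = pixels(1280, 720) / pixels(1280, 720)     # stay at 720p
res_1080p   = pixels(1920, 1080) / pixels(1280, 720)    # move to 1080p (2.25x pixels)
res_lastgen = pixels(1280, 720) / pixels(640, 480)      # last gen's SD -> HD jump (3x)

print(f"next gen at 720p : {power_ratio / res_same:.2f}x more work per pixel")
print(f"next gen at 1080p: {power_ratio / res_1080p:.2f}x more work per pixel")
print(f"for comparison, a 4x jump spent on a 3x pixel increase (SD -> HD) "
      f"leaves only {power_ratio / res_lastgen:.2f}x per pixel")
[/code]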
 
No it doesn't really ease anything. IF both MS and Sony release 100W or less boxes, it just means I'll be back to PC gaming exclusively. And there still is a resolution jump because they didn't exactly get to 1080p in the last gen.

<edit> The console experience may be approaching diminishing returns, but consoles are in constant competition with the PC for my money. It doesn't need to be better, but I need enough that I feel my purchase will carry me more than a couple of years. If next gen comes out with the performance you get from a $700-or-less PC, I just don't see the justification for buying in that low.
 
In terms of architectural improvements, I'd really like to see how GCN ALUs compare to the VLIW4 or VLIW5 ones. Ideally the clocks and ALU counts would be the same for a heavy shader test. The closest paper spec'd hardware seems to be 4770 vs 7770 (640 ALUs, 32 vs 40 TMUs, 16 ROPs). I wonder how that compares back to Xenos.

I'd put changes to things like L2 under architectural improvements.

-----
I was hoping to see a 1536 ALU part in the Southern Islands family to compare directly to Cayman, but that doesn't seem to be happening.
 
I agree too that it was pretty farcical that current consoles didn't downsample to SD resolutions... but then again... they did want to get you all to buy the new HD TVs :devilish:

There are 360 games that downscale to SD, giving you "free" super-sampling, and they look really good, even at SD. It's probably a different situation with the scaler-less PS3.
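A minimal Python sketch of why that downscale behaves like supersampling. The 2x2 box filter and the 640x360 output size are assumptions chosen just to keep an integer ratio; a real SD output and scaler kernel would differ:

[code]
import numpy as np

hd = np.random.rand(720, 1280, 3).astype(np.float32)  # rendered HD frame (placeholder)

# 1280x720 -> 640x360 via a simple 2x2 box filter: four rendered samples
# contribute to every output pixel, exactly like 4x ordered-grid SSAA.
sd = hd.reshape(360, 2, 640, 2, 3).mean(axis=(1, 3))

print(sd.shape)  # (360, 640, 3)
[/code]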
 
Simple downscaling is easy and undemanding. No reason not to support it on a console, with or without a scaler.
 