The Next-gen Situation discussion *spawn

It's pretty straightforward. The lens of your eye projects an image onto your retina. The rods and cones in your retina sense the light and transmit the info to your brain. You cannot see details of the image that are smaller than your photoreceptors. Didn't know this was controversial.

I explained it pretty thoroughly. The top left plot quantifies the returns in image quality you get from doubling the resolution of the image, measured with the L^2 norm, clearly illustrating the diminishing returns that come with increasing screen resolution.
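
For concreteness, here's a minimal sketch of that kind of L^2 comparison, assuming Python with Pillow and NumPy; the file name, the bilinear resampling, and the particular resolution steps are just my illustration, not the actual script behind the plot:

```python
# Minimal sketch of the kind of L^2 comparison described above (not the
# original script): render the reference at a lower resolution, scale it
# back up, and measure how far it lands from the full-resolution original.
import numpy as np
from PIL import Image

def l2_error(reference: Image.Image, scale: float) -> float:
    """L^2 norm of the per-pixel difference after a round trip through
    a lower resolution (bilinear resampling both ways)."""
    w, h = reference.size
    low = reference.resize((max(1, int(w * scale)), max(1, int(h * scale))),
                           Image.BILINEAR)
    back = low.resize((w, h), Image.BILINEAR)
    a = np.asarray(reference, dtype=np.float64)
    b = np.asarray(back, dtype=np.float64)
    return float(np.sqrt(((a - b) ** 2).sum()))

if __name__ == "__main__":
    ref = Image.open("reference.png").convert("RGB")  # hypothetical test image
    for scale in (1 / 8, 1 / 4, 1 / 2, 1.0):
        # Error shrinks as scale rises, but by smaller and smaller amounts.
        print(f"{scale:5.3f} of full res -> L2 error {l2_error(ref, scale):12.1f}")
```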

So it's only about resolution? I thought you had some kind of program dealing with "graphics" as well.

Anyways, I don't know what to say, except that it's obvious to me the leap from 720P to 1080P is a big one, and well worthwhile. And, I'm sure at some point the leap to 4K will be desirable. After that, I don't know.

Is 720>1080 less of a leap than 480>720? Probably, sure. But it's still big, and we still lack the power to do it. I would love to, gasp, play console games in the native resolution of my HDTV. They would look so much sharper.

Yet, there is real debate over whether next gen consoles will even have the power to go to 1080P. That's how pathetic our power levels are! And we're talking about diminishing returns???

I would claw my eyes out if I had to play PC games in 720P upscaled on my 1080P PC monitor. It's really horrendous.

The whole debate started as diminishing returns. I guess my position is that whether or not those exist (they probably (surely?) do), I don't think we're near them anytime soon. I can't see the future, but I am pretty sure we are not there now.

It's the same as console gens. I am not sure there will be another after the next round, but I am sure there will be at least the next (Durango, Orbis) round, right? Which in itself many predicted would not happen. Whether there's a gen after that, I will only be able to get a feel for towards the end of next gen. But, I already have a sneaking suspicion there will be. The death of consoles has been predicted for so long, going back 20 years, but it hasn't happened yet.

As you can see, the gains begin diminishing right away, and really taper off at around 1/4 resolution (670 x 1001). The next step up is 947x1415, but the gain is pretty small, so if this were a game development context, it might make more sense to do something else with your pixels.

Your first resolution is something less than but close to 720P in pixel terms; the second is something above 720P but well below 1080p.

So you definitely lose me there, because there's clearly a large benefit in increasing from 720p to 1080p.

edit: saw this this evening, 1080p handsets are here http://www.engadget.com/2012/10/17/htc-j-butterfly-htl21-440ppi/
 
So it's only about resolution? I thought you had some kind of program dealing with "graphics" as well.
He does. He's just showing the influence of resolution, but it's not limited to that.

Consider another example, applying fearsomepirate's metric to artists' works. The intention is to recreate a photograph as closely as possible. Drawing on a 6"x4" sheet, each image is scanned at 300 dpi and compared pixel-to-pixel with a scan of the 6"x4" photograph. The artist who gets the smallest deviation from the photo source is the closest. An amateur will score a large deviation. Using a comparison of each pixel, we can score how close to 'real' the artwork is with a scientific metric.

Now I don't think this metric is a perfect solution in itself, but it does address the issue of subjective 'looks better' and trying to understand the effect of polys and textures and shaders. Rather than look at how the image is made, fearsome's notion is just to compare the end results with a perfect reference and measure how close the rendering gets. If a console can get closer to the real image with fewer polys or lower resolution or higher resolution (and framerate, as we're not just dealing with stills here), it would show in his quantitative analysis.
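
To make the scoring concrete, here's a minimal sketch of that per-pixel comparison, assuming Python with Pillow and NumPy; the file names and the choice of an RMS deviation (rather than a raw L^2 sum) are my own illustration of the idea, not a prescribed metric:

```python
# Sketch of the artwork-vs-photo scoring in the analogy above. A 6"x4"
# sheet scanned at 300 dpi gives 1800x1200 pixels; 0 means a perfect
# recreation, larger values mean the artwork is further from 'real'.
import numpy as np
from PIL import Image

SCAN_SIZE = (1800, 1200)  # 6" x 4" at 300 dpi

def deviation_score(artwork_path: str, photo_path: str) -> float:
    """Per-pixel RMS deviation between a scanned artwork and the photo scan."""
    art = np.asarray(Image.open(artwork_path).convert("RGB").resize(SCAN_SIZE),
                     dtype=np.float64)
    ref = np.asarray(Image.open(photo_path).convert("RGB").resize(SCAN_SIZE),
                     dtype=np.float64)
    return float(np.sqrt(np.mean((art - ref) ** 2)))

# A skilled artist's scan would score lower than an amateur's:
# deviation_score("artist_scan.png", "photo_scan.png")
```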
 
1080P is an obvious target I would have thought. It seems reasonable that the console would be designed to best display at that target resolution; like people said before, 1080P @ 30 or 720P @ 60 ought to be a bare minimum. Surely something in between the HD 77xx and 78xx ought to pull it off nicely?
 
1080P is an obvious target I would have thought.

We can hope! And, I'm cautiously optimistic.

The rumored PS4 specs, if they pan out, are absolutely tailor-made for 1080P. No reason not to on a 256-bit bus.

Durango is more of an odd duck with its low-bandwidth DDR3 and the leaked docs speaking of 4-6X power, which is borderline, but as it seems to have more GPU power than expected in recent leaks, it's looking hopeful imo as well.

Heck, I wouldn't be surprised if Durango was originally planned at 4-6X, and then MS engineers stepped in and said "we need to give this thing a little more boost to target 1080P comfortably, which we feel is necessary next gen" or something.
 
Yet, there is real debate over whether next gen consoles will even have the power to go to 1080P. That's how pathetic our power levels are! And we're talking about diminishing returns???

I am sure they have the power. I think the discussion is more about, as it has been for this gen as well, whether to have more pixels or prettier pixels. You can do 1080p now as well, but most have opted for fewer but prettier pixels. It is similar to the 60 vs 30 fps debate. It is not that the consoles are so low on power that they can't handle 60fps, just that most developers go for fewer but prettier.

Well next gen will be no different I guess, unless they do impose that games have to do 1080p, or 60 fps and so on. Developers will still have to choose whether they go for more or prettier. I am in the camp that it should be up to them. I am biased though as I have only a 720p TV.

I would claw my eyes out if I had to play PC games in 720P upscaled on my 1080P PC monitor. It's really horrendous.

Well, when playing so close to the screen, then yes, resolution becomes even more important. But let's be fair though. If your GPU is not up to the task, you're out of luck. It is not that PC games automagically will render at 1080p. Most of the time you will have to choose yourself what you like better. Will you run the game with full effects, sacrificing fps/resolution, or the other way around? In that sense PC gaming is great because it lets you choose the trade-offs and not the developer. The other thing is that fps/resolution is one of the easiest ways as a PC developer to make your game look better. Do you target the absolute latest enthusiast GPU out there, and add all the bells, whistles and polys it can barely manage at 30fps/720p, or do you adjust your game for the more "reasonable" cards out there and let the enthusiasts "just" enjoy better fps/resolution? Weren't there a lot of complaints from the PC crowd that consoles have inhibited the progress of PC games? I guess those that complain might feel like the fps/resolution advantage on top of the visual advantage they have over the consoles is not enough.

The whole debate started as diminishing returns. I guess my position is that whether or not those exist (they probably (surely?) do), I don't think we're near them anytime soon. I can't see the future, but I am pretty sure we are not there now.

I agree with you. We are not at the diminishing returns yet. However, a 10x power jump now does not yield the same perceived visual difference as it once did in the "old" days. Going from a 300 poly character to a 3k poly character will most likely give a bigger visual difference than going from 3k to 30k, and even more so than from 30k to 300k. The other thing is, I am not sure whether resolution would be the nr 1 priority for making games look like they took a true generational jump. I am not disputing that resolution plays a role, and maybe a big one at that, just that it might not be the one that needs to be addressed above other factors. I am certain that most people would say that a 720p prerendered movie (the top ones at least) looks much better than basically any 1080p game. No doubt the 480p to 720p jump should be classified as generational, something that benefited almost everyone and that almost everyone could see. The 720p to 1080p jump, sure it will yield differences, however then you need to start considering viewing distance, screen size and so on to see where this difference will actually translate into visual fidelity.

Don't get me wrong, it is not that I can't see jaggies here and there, and sure, in some games they are just horrendous. But I can also see abundant polygon edges all over the place as well, I can see flickering shadows, crappy lighting, grass and leaves like they are cut out from cardboard, and so on. In the end it's all about trade-offs. I don't see things changing next gen; no matter how powerful those consoles might turn out to be, we will still be discussing how developers have chosen the wrong trade-offs to improve...
 
He does. He's just showing the influence of resolution, but it's not limited to that.

Consider another example, applying fearsomepirate's metric to artists' works. The intention is to recreate a photograph as closely as possible. Drawing on a 6"x4" sheet, each image is scanned at 300 dpi and compared pixel-to-pixel with a scan of the 6"x4" photograph. The artist who gets the smallest deviation from the photo source is the closest. An amateur will score a large deviation. Using a comparison of each pixel, we can score how close to 'real' the artwork is with a scientific metric.

Now I don't think this metric is a perfect solution in itself, but it does address the issue of subjective 'looks better' and trying to understand the effect of polys and textures and shaders. Rather than look at how the image is made, fearsome's notion is just to compare the end results with a perfect reference and measure how close the rendering gets. If a console can get closer to the real image with fewer polys or lower resolution or higher resolution (and framerate, as we're not just dealing with stills here), it would show in his quantitative analysis.

True, but getting to my KZ3/C2 point, one reason I think we can't measure those is this: is either really trying to recreate a simple photograph (especially maybe not in KZ's case)?

If so, they're probably pretty different photos. The games just do not look alike.
 
Fearsome's evaluation is, as I understand it, based on the machine's capabilities and game rendering targets. Clearly games not going for photorealism can't be directly compared to a photo, but I dare say they could be compared to fine art. Edit: or an offline CG render that recreates exactly the look they're going for. We'd see resolution is still important and also a diminishing return, as would shader power and texture fidelity etc. We'd also see that we'd actually hit the limits on some genres, like cel-shaded, probably next gen. Taking something like Rogue Galaxy on PS2, that game with higher polys, higher res, and AA'd edges would be pretty close to a real cartoon. Ni No Kuni seems pretty much there, so next gen should have that quality nailed. Although for full comparison, animation would need to improve too.
 
Durango is more of an odd duck with its low-bandwidth DDR3 and the leaked docs speaking of 4-6X power, which is borderline, but as it seems to have more GPU power than expected in recent leaks, it's looking hopeful imo as well.

Heck, I wouldn't be surprised if Durango was originally planned at 4-6X, and then MS engineers stepped in and said "we need to give this thing a little more boost to target 1080P comfortably, which we feel is necessary next gen" or something.

I wouldn't discount sheer misinformation at this point. It is easier for them to get out the information they want us to know than for us to know which sources are real. The rumours suggest, however, that both machines are heavily GPU-oriented, which doesn't gel with the concept of them not meeting a 1080P spec. I would personally pick DDR4 for Durango, given it is likely to have greater longevity and performance as well as packaging advantages, either at the start or further into the generation.

The thing which actually concerns me is what the recent troubles at AMD might be doing to the PS4/Xb3 teams and their ability to deliver on time. Although perhaps the reason why the rumoured job losses are happening now is that the console projects could be considered finished?
 
Current-gen consoles can already draw 1080p images. Not sure where anyone's getting the idea that next-gen consoles won't be able to. The debate isn't whether or not next-gen consoles will be able to draw 1080p frame buffers, it's whether or not next-gen, we'll hit the point in both hardware and software technology that the gains from rendering in 1080p will be bigger than using those pixels for other things.

My guess is not. Even if some launch titles run at 1080p, unless MS & Sony enforce 1080p for certification, by the end of the generation, developers will have invented a lot more interesting uses for the available fill rate than just maxing out the resolution, and you'll see a lot of games at sub-1080p, though 600p may be gone for good.
 
The thing which actually concerns me is what the recent troubles at AMD might be doing to the PS4/Xb3 teams and their ability to deliver on time. Although perhaps the reason why the rumoured job losses are happening now is that the console projects could be considered finished?

My guess is that both consoles are finished as far as the system and architectural design. Both are probably in bug fix/validation mode. If the rumors of a September launch are true, the CPU/GPU/SOC probably has to be ready for mass production around March. That's only 5 months or so, probably not enough time to make any significant changes, just tweaks.

What's interesting is how the IP ownership is shaping up. If the CPU is x86, can MS outright own the CPU IP without Intel getting involved? Or is AMD the only one who can maintain and tweak it for future process shrinks? I'm assuming MS wouldn't have touched a CPU from AMD if they couldn't have owned the IP and reused it as they please.
 
My guess is that both consoles are finished as far as the system and architectural design. Both are probably in bug fix/validation mode. If the rumors of a September launch are true, the CPU/GPU/SOC probably has to be ready for mass production around March. That's only 5 months or so, probably not enough time to make any significant changes, just tweaks.

What's interesting is how the IP ownership is shaping up. If the CPU is x86, can MS outright own the CPU IP without Intel getting involved? Or is AMD the only one who can maintain and tweak it for future process shrinks? I'm assuming MS wouldn't have touched a CPU from AMD if they couldn't have owned the IP and reused it as they please.

AMD can't license out x86 IP. MS or Sony would have to buy the parts from AMD. Now MS/Sony and AMD can come to some general understanding with generous terms but AMD would still have to act as the chip supplier. MS or Sony could choose the chip manufacturer and dictate things like process shrinks, but that probably would be done through AMD.
 
My guess is that both consoles are finished as far as the system and architectural design. Both are probably in bug fix/validation mode. If the rumors of a September launch are true, the CPU/GPU/SOC probably has to be ready for mass production around March. That's only 5 months or so, probably not enough time to make any significant changes, just tweaks.

What's interesting is how the IP ownership is shaping up. If the CPU is x86, can MS outright own the CPU IP without Intel getting involved? Or is AMD the only one who can maintain and tweak it for future process shrinks? I'm assuming MS wouldn't have touched a CPU from AMD if they couldn't have owned the IP and reused it as they please.

I would assume that AMD still owns the rights to the I.P. but the contract is something like cost plus royalties? AMD already contracts their production to a separate company so I suspect that this arrangement allows some flexibility. AMD would probably be contracted for die shrinks like they were when Microsoft had the 360 chips shrunk to 45nm and combined onto the one die.

Current-gen consoles can already draw 1080p images. Not sure where anyone's getting the idea that next-gen consoles won't be able to. The debate isn't whether or not next-gen consoles will be able to draw 1080p frame buffers, it's whether or not next-gen, we'll hit the point in both hardware and software technology that the gains from rendering in 1080p will be bigger than using those pixels for other things.

It is a pretty obvious design target to ensure that a console is technically designed for 1080P rendering. With current generation consoles it is a huge stretch to render at 1080P because the bandwidth/fillrate/embedded memory isn't large/powerful enough to do it comfortably. The easiest way to improve graphical fidelity without throwing money at the problem for developers is to increase the rendering resolution, so if the performance is available then the incentive will also be there to render at a higher resolution.
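
Rough numbers behind "comfortably", as a back-of-envelope sketch rather than anything from the leaks: 1080p is 2.25x the pixels of 720p, so fill rate and frame-buffer bandwidth have to scale by roughly the same factor at a fixed frame rate.

```python
# Back-of-envelope arithmetic only (not figures from the thread): the pixel
# count ratio between 720p and 1080p, and the raw shaded-pixel throughput a
# fixed frame rate implies, ignoring overdraw and post-processing.
p720 = 1280 * 720      # 921,600 pixels
p1080 = 1920 * 1080    # 2,073,600 pixels
print(f"1080p / 720p pixel ratio: {p1080 / p720:.2f}x")  # 2.25x

for fps in (30, 60):
    # Fill rate and frame-buffer bandwidth have to scale with this number.
    print(f"1080p @ {fps} fps ~ {p1080 * fps / 1e6:.0f} Mpixels/s written")
```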
 
With current generation consoles it is a huge stretch to render at 1080P because the bandwidth/fillrate/embedded memory isn't large/powerful enough to do it comfortably.
Really? They couldn't do, say, Quake III at 1080p? Maybe you're right.
The easiest way to improve graphical fidelity without throwing money at the problem for developers is to increase the rendering resolution
Taking your old, dated graphics engine and old, dated asset generation process and just upping the resolution, AA, and AF might be the easiest way to increase fidelity, but it is by far the least competitive. That's why, despite God of War Collection running comfortably at 720p and 60 fps, current-gen developers have typically opted to run at 600p and 30 fps. Normal mapping, SSAO, HDR, and other techniques might be a lot harder to do than drawing single-textured, Gouraud-shaded polygons at 720p and 60fps, but they're necessary to be competitive.

I wouldn't be surprised if the same is true about 1080p next-gen.
 
Yeah, it really is a question of whether you want to use the processing power for something other than 1080p (especially since the majority of console owners would get negligible benefits from Full HD over 720p given their screen sizes and viewing distances).

I'm sure we'll see quite a few native 1080p games though, probably as many as the 720p games we've seen this gen. The Durango alpha hardware should be able to run BF3 at 1080p 60fps on Ultra settings.
 
I think the next generation will end up more like the DC/PS2/GC/XBOX gen with regards to resolution. Surely any eDRAM added will be specced to fit a 1080p 4xAA frame buffer this time around, and any system without eDRAM will have quite a bit of memory bandwidth to the main memory. Because of this there will be no steep drop-off in performance when going 1080p. So it will be like the DC-XB gen, where most games used the full resolution of typical TVs at the time, whereas the generation before supported this but hardly any games used it. Of course one console may end up using tricks or resolutions lower in one axis, just as the PS2 did when competing with the XBOX. The difference won't be as large, so I'd expect it to be more like this gen but with even less of a delta between the two. I expect there will be as many 1080p games as there were 720p this gen, and most others will be slightly below 1080p to squeeze out performance. Of course, if there is a good enough difference in performance (it surely won't be anything like the current gen), one studio may test the waters by releasing a 720p game, and if consumers don't notice on the whole, others may follow. I just don't see it being too likely, as the performance curve between 720p and 1080p will be nowhere near as steep as it is on current hardware.
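
A quick sizing check on that "1080p 4xAA frame buffer" target, assuming 32-bit colour plus 32-bit depth/stencil per sample and no compression; real GPUs compress MSAA data, so treat these as rough upper bounds rather than a spec:

```python
# Frame-buffer footprint at different resolutions and MSAA levels, assuming
# 8 bytes per sample (32-bit colour + 32-bit depth/stencil), uncompressed.
def framebuffer_mib(width: int, height: int, samples: int,
                    bytes_per_sample: int = 8) -> float:
    return width * height * samples * bytes_per_sample / (1024 * 1024)

for label, w, h, aa in [("720p 4xAA  ", 1280, 720, 4),
                        ("1080p no AA", 1920, 1080, 1),
                        ("1080p 4xAA ", 1920, 1080, 4)]:
    print(f"{label}: {framebuffer_mib(w, h, aa):5.1f} MiB")
# ~28 MiB for 720p 4xAA (why the 360's 10 MB eDRAM needed tiling),
# ~16 MiB for 1080p without AA, ~63 MiB for 1080p 4xAA uncompressed.
```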
 
People are acting like "most" games weren't 720P this gen, but I believe the majority were. Don't have a source for that offhand but, yeah.

The exceptions can be high profile but that doesn't make them the majority, they are still exceptions.

To the 1080P debate, I still think it'll happen. The same things about prettier pixels applied this gen, but we still moved to 720P cause "we" thought the overall result was better.

Also the tech. Besides obviously more power, the extra bandwidth, at least 32MB of EDRAM instead of 10, and more ROPs will all make 1080P more of a natural fit, where the hit may be relatively smaller, so it becomes more compelling to use it.

I can actually see a lot of games doing some half rez, like 1280X1080, though. That's still a big jump on 720P, while allowing prettier pixels. A compromise.

Guess we will see how it goes.
 
Really? They couldn't do, say, Quake III at 1080p? Maybe you're right.

I don't see how improvements in fidelity and resolution are mutually exclusive, especially when considering the relatively low base we're improving from. If developers don't render at a higher resolution, then will we see an improvement in game fidelity?
 
People are acting like "most" games weren't 720P this gen, but I believe the majority were. Don't have a source for that offhand but, yeah.
That may be true, but stable 30 fps is definitely lacking. Most disc-release games have drops and/or tearing (at least, UE3 games on PS3). Personally I'd take 720p at a constant 30 fps or better yet 60 fps over 1080p with the sorts of low framerates and tearing that I've experienced this gen. That all comes down to the targets devs set themselves though. If framerate was ever considered important, they'd reduce the rest of the graphics, even resolution, to hit it, and that'll be true next-gen. If they can render a game at 1080p but with slower framerate, they have the option to reduce scene complexity, and/or reduce resolution to hit the solid framerate. That's true of PC too, only there the devs can choose to hit a visual complexity target and leave the customers to decide with their wallet what quality of resolution and framerate they'll play it at.

It all comes down to dev choices though, not hardware. Resolution, framerate, detail, are in balance and the devs shift the balance for their own preference.
 