*downscaled* Resolving Resolution

I feel a bit sad to find the difference between native & upscaled so small; I'm starting to think it might not really be worth the extra computation.
On the other hand it's nice; it means I should be happy with a 720p projector ;p

Are you joking? The difference in the shots I posted is immense...
 
They'll have to move to 1080p eventually... And the same could be said for current consoles; in fact, why even bother with HD in the first place, let's all move back to 480p.

I don't understand what you're saying here, sorry.

And what's close enough?

Where people would weigh up many/all aspects of the game and decide that the upscaler blur wouldn't put them off buying the game. Or perhaps that they wouldn't actually notice in the first place.

The PS3 has basically gotten away with having the balance of lower-res games and worse or non-existent upscaling, after all.
 
Yeah, it's just that most of that difference isn't resolution.

PS1 to PS2 = Resolution jump

PS2 to PS3 = Resolution jump

PS3 to PS4 = Resolution jump

I'm honestly shocked that you think upscaled, blurry games are acceptable on what would be 2013/2014 hardware.

If they're going to stick with low resolutions and upscaling then there's really no point in having higher-resolution textures; all that extra detail would just get lost in the upscaling process and the low screen resolution.
 
Very subtle difference in texture quality, but even little details like the hair and eyelashes are blurred out and lost.

[Image: 360_1.jpg (Xbox 360 screenshot)]

[Image: pc_1.jpg (PC screenshot)]


Whatever time and effort developers put into improving the little details and such will be destroyed by upscaling.
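
To put a number on that "detail lost" intuition, here's a minimal NumPy/Pillow sketch (a synthetic test image, not a real game capture, and not any engine's actual pipeline) that compares a full-resolution frame against the same content rendered at 720p and scaled back up, and measures the residual fine detail thrown away by the round trip:

```python
# Minimal sketch: compare a "native 1080p" frame against the same content
# rendered at 720p and upscaled back, and measure the detail lost on the way.
import numpy as np
from PIL import Image

# Synthetic stand-in for a detailed native 1080p frame (noise = worst-case fine detail).
rng = np.random.default_rng(0)
native = (rng.random((1080, 1920)) * 255).astype(np.uint8)

img = Image.fromarray(native)
low = img.resize((1280, 720), Image.LANCZOS)        # pretend the game rendered at 720p
scaled = low.resize((1920, 1080), Image.BICUBIC)    # the scaler brings it back to 1080p

residual = np.abs(np.asarray(scaled, dtype=np.float32) - native.astype(np.float32))
print(f"mean detail lost per pixel: {residual.mean():.1f} / 255")
```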
 
PS1 to PS2 = Resolution jump

PS2 to PS3 = Resolution jump

PS3 to PS4 = Resolution jump

I'm honestly shocked that you think upscaled, blurry games are acceptable on what would be 2013/2014 hardware.

If they're going to stick with low resolutions and upscaling then there's really no point in having higher-resolution textures; all that extra detail would just get lost in the upscaling process and the low screen resolution.

Is this true? If so, I must have missed it.

As far as I'm aware, most if not all PS1 & PS2 games were standard sub-HD 480p resolution.

I'm personally not interested in wasting CPU and GPU cycles to push game resolution up to 1080p, but only if there's an easy, non-resource-intensive way to do AA at 720p. I want perfect IQ next-gen as much as the next guy, and if next-gen consoles provide enough memory bandwidth to do fast and cheap AA, then I'd rather they spend the remaining cycles on prettier pixels or on improving framerates in games where it makes sense.
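
For a rough sense of the trade-off being weighed here, the arithmetic below (a sketch assuming a generic 32-bit colour + 32-bit depth target per sample, not any particular console's memory layout) shows the 2.25x pixel gap between 720p and 1080p and what MSAA does to framebuffer size at 720p:

```python
# Back-of-the-envelope numbers behind the 720p+AA vs. native-1080p trade-off.
# Byte counts assume a generic 32-bit colour + 32-bit depth target per sample;
# real console render targets differ.

def framebuffer_mib(width, height, msaa=1, bytes_per_sample=8):
    """Rough size of a colour+depth render target in MiB."""
    return width * height * msaa * bytes_per_sample / (1024 ** 2)

px_720p, px_1080p = 1280 * 720, 1920 * 1080
print(f"1080p has {px_1080p / px_720p:.2f}x the pixels of 720p")    # 2.25x
print(f"720p, no AA:  {framebuffer_mib(1280, 720):.1f} MiB")        # ~7.0 MiB
print(f"720p, 4xMSAA: {framebuffer_mib(1280, 720, msaa=4):.1f} MiB")
print(f"1080p, no AA: {framebuffer_mib(1920, 1080):.1f} MiB")       # ~15.8 MiB
```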
 
Is this true? If so, I must have missed it.

As far as I'm aware, most if not all PS1 & PS2 games were standard sub-HD 480p resolution.

I'm personally not interested in wasting CPU and GPU cycles to push game resolution up to 1080p, but only if there's an easy, non-resource-intensive way to do AA at 720p. I want perfect IQ next-gen as much as the next guy, and if next-gen consoles provide enough memory bandwidth to do fast and cheap AA, then I'd rather they spend the remaining cycles on prettier pixels or on improving framerates in games where it makes sense.

PS1 was 320x240.

It's amazing how people were excited and amazed when Sony announced the PS3 was 1080p native; now all of a sudden they don't want it and it's a bad thing for games.
 
Pure 720p and 1080p are both wasteful (a simple regular sampling grid). Brute-force processing of a million (or two) pixels isn't the most clever thing to do. If you could position the sampling points perfectly (where it matters the most), you would need fewer samples than 720p to provide more quality than 1080p.

Blu-ray movies look good. But Blu-ray still stores chroma at half resolution. And it stores most frame data with a low-resolution (compressed) approximation. Bandwidth is utilized in the areas that need it the most. It's all about human perception. You don't need to render more than maybe 10%-20% of the pixels of each frame at full resolution to make the image look like real 1080p. The real question is: which pixels? And how to generate the others efficiently?
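
The Blu-ray chroma point can be sketched in a few lines. This is only the 4:2:0-style subsampling arithmetic with a crude luma/chroma split (an illustration, not an actual Blu-ray/AVC encoder); it shows that storing chroma at half resolution in each axis halves the total sample count:

```python
# Minimal sketch of the 4:2:0-style idea mentioned above: keep luma at full
# resolution, store chroma at half resolution in each axis.
import numpy as np

h, w = 1080, 1920
rgb = np.random.default_rng(1).random((h, w, 3)).astype(np.float32)

# Crude RGB -> luma/chroma split (BT.709 luma weights).
y  = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
cb = rgb[..., 2] - y
cr = rgb[..., 0] - y

# 4:2:0: average each 2x2 block of chroma down to a single sample.
cb_sub = cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
cr_sub = cr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

full_samples = 3 * h * w
stored = y.size + cb_sub.size + cr_sub.size
print(f"samples stored: {stored / full_samples:.2f} of full 4:4:4")   # 0.50
```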
 
This is a very interesting thread. I wonder if games like Doom 3 for the PS3 and the X360 are doing something similar to achieve 60 fps at 720p. I mean, using the true HD resolution where it matters the most.

Despite being a 2004 game it has aged very well, and it just shows how disappointing games like Half Life 2 on the PS3 and X360 are.

Half Life 2 is a game with Jurassic technology and runs at 30 fps at most on PS360, when both consoles are way beyond the capabilities of that game in every possible way.
 
Half Life 2 is a game with Jurassic technology and runs at 30 fps at most on PS360, when both consoles are way beyond the capabilities of that game in every possible way.

That is because Valve does not really give a shit about how the game runs. Just look at CS:GO, the highest input lag ever in an FPS!
 
This is a very interesting thread. I wonder if games like Doom 3 for the PS3 and the X360 are doing something similar to achieve 60 fps at 720p. I mean, using the true HD resolution where it matters the most.

Despite being a 2004 game it has aged very well, and it just shows how disappointing games like Half Life 2 on the PS3 and X360 are.

Half Life 2 is a game with Jurassic technology and runs at 30 fps at most on PS360, when both consoles are way beyond the capabilities of that game in every possible way.

That is because Valve does not really give a shit about how the game runs. Just look at CS:GO, the highest input lag ever in an FPS!

That's because HL2 on consoles uses Episode 2's uprated Source engine, which is much harder on the system than the engine HL2 used when it was originally released.

If you look at benchmarks for Episode 2 you'll see that the console performance and frame rate are right where they should be.
 
That's because HL2 on consoles uses Episode 2's uprated Source engine, which is much harder on the system than the engine HL2 used when it was originally released.

If you look at benchmarks for Episode 2 you'll see that the console performance and frame rate are right where they should be.
Even so, I could understand this somehow in a game like HL2, but like tuna I have CS:GO (I seldom play it though; I had never played a Counter-Strike game and got caught up in the hype) and I wonder how a game with such average graphics and small maps can't be rendered at 60 fps.

Nobody would miss the "extra detail" lost when the game doesn't look good at all even at 30 fps, tbh.

It's one of the reasons I don't like Valve on consoles. Valve doesn't treat PS3 and X360 owners well when it comes to optimization.
 
lol, wut?

A 7870 will enable them to do so many things, like OpenCL physics for one; tessellation is also a massive thing to have.

And it will ultimately lead to the same game engines that have existed for the last 10 years? COD runs on the Quake 3 engine.... Current consoles use old engine tech.

The next generation should finally see the death of flat textures, replaced by nice ones with depth like this:

[Image: vpt4.jpg]

One could only estimate; there's a difference between a benchmark used for graphics and a game. Years before the 360 and PS3 we already had subsurface scattering and parallax mapping and so forth as guides. It should be noted that a game has to sustain such fidelity throughout; it can't show off realistic environments and then make drastic compromises when a whole town has to be rendered, with hundreds of people needing the same attention to detail at the asking resolution.

You hear from developers all over about cutting back here and there because of the unnatural levels of detail changing throughout their games. If it can render a war on that terrain, or a town with just as much detail on the citizens, at 1080p, I would be a bit more confident in the presentation. So far we only have current-gen games to benchmark, nothing like what LucasArts will be releasing in the years to come.
 
I doubt upscaling of framebuffers is done bicubically (or with Lanczos); you completely kill the texture-sampling hardware/caches with it. DC's thread-shared "buffering" may help, but you don't want new powerful hardware just to do decent post-filtering and nothing else.

Of course not if it's done in software (which is why the PS3 is terrible at upscaling), but the 360 has a dedicated scaler DSP chip (HANA) which allows you to do resampling with a variety of filters (including bicubic and Lanczos) for free and without introducing input lag.
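
As a rough software stand-in (Pillow on the CPU, not the 360's HANA path), the sketch below upscales a 720p frame to 1080p with bilinear, bicubic and Lanczos kernels and prints a crude edge-contrast proxy, just to illustrate the kind of filter choice being discussed:

```python
# Quick software stand-in for the filter choice being discussed: this is Pillow
# on the CPU, not the 360's scaler hardware, but the resampling kernels are analogous.
import numpy as np
from PIL import Image

rng = np.random.default_rng(2)
frame_720p = Image.fromarray((rng.random((720, 1280)) * 255).astype(np.uint8))

for name, filt in [("bilinear", Image.BILINEAR),
                   ("bicubic",  Image.BICUBIC),
                   ("lanczos",  Image.LANCZOS)]:
    out = np.asarray(frame_720p.resize((1920, 1080), filt), dtype=np.float32)
    # Crude sharpness proxy: mean horizontal pixel-to-pixel contrast after scaling.
    sharpness = np.abs(np.diff(out, axis=1)).mean()
    print(f"{name:8s} edge contrast: {sharpness:.2f}")
```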


I do not care one bit about that graph, EVERYONE IS DIFFERENT....

Visual acuity in humans is objective, and as that article says, the graph is based on:
Based on the resolving ability of the human eye (with 20/20 vision it is possible to resolve 1/60th of a degree of an arc), it is possible to estimate when the differences between resolutions will become apparent.

So while you might have superhuman visual acuity (unlikely), most people with 20/20 vision or worse will not be able to tell the difference between resolutions when watching content on a given screen size from a given distance, simply as a function of the resolving power of the human eye.
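
The arithmetic behind that one-arcminute figure is easy to check; the sketch below uses example screen sizes (not values from the article) to estimate the distance beyond which a single pixel falls below what 20/20 vision can resolve:

```python
# The arithmetic behind the one-arcminute (1/60 degree) figure: beyond what
# viewing distance does a single pixel fall below what 20/20 vision resolves?
# The 50" screen size below is an example value, not a figure from the article.
import math

def max_resolvable_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # screen width in inches
    pixel_pitch_in = width_in / horizontal_px                 # size of one pixel
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin) / 12         # inches -> feet

for px, label in [(1280, "720p"), (1920, "1080p")]:
    d = max_resolvable_distance_ft(50, px)
    print(f'50" {label}: individual pixels blend together beyond ~{d:.1f} ft')
```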

And, as Gubbi pointed out, comparing PC and 360 Face-Off shots is not a valid comparison, since there are other differences affecting IQ (like antialiasing) between the 360 and PC; it's not simply a matter of one rendering at a higher resolution than the other.

I'm honestly shocked that you think upscaled, blurry games are acceptable on what would be 2013/2014 hardware.

If they're going to stick with low resolutions and upscaling then there's really no point in having higher-resolution textures; all that extra detail would just get lost in the upscaling process and the low screen resolution.

Firstly, as I've mentioned numerous times now, the next Xbox should be able to run BF3 at 1080p at 60fps on Ultra settings (given what we know of the devkit specs).

So really I would expect quite a lot of games to be rendering at 1080p next gen.
However, for devs who want to trade off resolution for improved visuals or higher framerates, 720p will be an option, and when paired with a decent scaler it will make very little difference to the majority of gamers, who don't have big enough displays or sit close enough to discern the difference between real 1080p and upscaled 720p (and let's not forget that over half of gamers who played Gears of War 2 did so on SDTVs!).
 
Pure 720p and 1080p are both wasteful (a simple regular sampling grid). Brute-force processing of a million (or two) pixels isn't the most clever thing to do. If you could position the sampling points perfectly (where it matters the most), you would need fewer samples than 720p to provide more quality than 1080p.

Blu-ray movies look good. But Blu-ray still stores chroma at half resolution. And it stores most frame data with a low-resolution (compressed) approximation. Bandwidth is utilized in the areas that need it the most. It's all about human perception. You don't need to render more than maybe 10%-20% of the pixels of each frame at full resolution to make the image look like real 1080p. The real question is: which pixels? And how to generate the others efficiently?


Given how much DOF is used nowadays, they could also consider rendering stuff that falls within the Z focus range at full res, while everything whose Z falls into the out-of-focus areas is rendered at lower res. That would especially help with cutscenes, which tend to be DOF-heavy; it would let them bump up the render quality there.
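
A toy version of that idea might look like the sketch below: shade pixels inside the focus range at full resolution and fill the out-of-focus regions from a half-resolution buffer. The depth map, focus range and both "renders" here are synthetic stand-ins, not a real DOF pass:

```python
# Toy sketch: composite a full-resolution render (in-focus pixels) with a
# half-resolution render (out-of-focus pixels). All inputs are synthetic.
import numpy as np
from PIL import Image

h, w = 1080, 1920
rng = np.random.default_rng(3)
full_res = rng.random((h, w)).astype(np.float32)   # pretend full-res shading result
depth = rng.random((h, w)).astype(np.float32)      # per-pixel depth in [0, 1)

# Half-resolution version of the frame, upscaled back for compositing.
half = Image.fromarray((full_res * 255).astype(np.uint8)).resize((w // 2, h // 2))
cheap = np.asarray(half.resize((w, h), Image.BILINEAR), dtype=np.float32) / 255.0

focus_z, focus_range = 0.5, 0.1
in_focus = np.abs(depth - focus_z) < focus_range   # pixels inside the focal band
frame = np.where(in_focus, full_res, cheap)        # composite the two sources

print(f"{in_focus.mean() * 100:.0f}% of pixels shaded at full resolution")
```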
 
Isn't there still a bunch of 720p broadcast?
There's a lot of 720p broadcast in the US.

Very subtle difference in texture quality, but even little details like the hair and eyelashes are blurred out and lost.

[Image: 360_1.jpg (Xbox 360 screenshot)]

[Image: pc_1.jpg (PC screenshot)]


Whatever time and effort developers put into improving the little details and such will be destroyed by upscaling.
The hair behind the eyelash is a bigger deal to me than texture resolution. Like others have said, you're really comparing apples and oranges when comparing 360 and PC images. The only useful comparison was PC to PC.
 
Pure 720p and 1080p are both wasteful (a simple regular sampling grid). Brute-force processing of a million (or two) pixels isn't the most clever thing to do. If you could position the sampling points perfectly (where it matters the most), you would need fewer samples than 720p to provide more quality than 1080p.
Agreed completely; properly decoupled shading would be nice.
http://cg.ibds.kit.edu/ShadingReuse.php

For actual scaling, I'm quite sure that we will see more intelligent ways to 'scale' and reconstruct the final frame next generation.
Building on edge-detecting post-AA methods, it should be possible to scale the image to whatever resolution is needed while retaining sharp anti-aliased edges.
FXAA4 has/had one of the first attempts at this, and I'm sure it will get better as the idea gets more research.
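
In the same spirit (and explicitly not FXAA4 or the decoupled-shading work linked above), here's a crude sketch: bilinear upscale, estimate edge strength from image gradients, then apply sharpening only where edges are, so edges stay crisper than plain bilinear while flat areas are left alone:

```python
# Crude sketch of edge-aware reconstruction: bilinear upscale, estimate edge
# strength from image gradients, then sharpen only where the edges are.
import numpy as np
from PIL import Image

src = np.zeros((720, 1280), dtype=np.uint8)
src[:, 640:] = 255                                  # one hard vertical edge as test content
up = np.asarray(Image.fromarray(src).resize((1920, 1080), Image.BILINEAR),
                dtype=np.float32)

# Edge strength from central differences on the upscaled image (0 = flat, 1 = strong edge).
gx = np.zeros_like(up); gx[:, 1:-1] = up[:, 2:] - up[:, :-2]
gy = np.zeros_like(up); gy[1:-1, :] = up[2:, :] - up[:-2, :]
edge = np.clip(np.hypot(gx, gy) / 255.0, 0.0, 1.0)

# Unsharp mask weighted by edge strength: edges get crisper, flat areas stay untouched.
blur = np.asarray(Image.fromarray(up.astype(np.uint8))
                  .resize((960, 540), Image.BILINEAR)
                  .resize((1920, 1080), Image.BILINEAR), dtype=np.float32)
sharpened = np.clip(up + edge * (up - blur), 0.0, 255.0)

print("pixels treated as edges:", int((edge > 0.25).sum()))
```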
 