*split* multiplatform console-world problems + Image Quality Debate

*spinning* The Image Quality Debate (2013 IQ is unacceptable!)

There's no such thing as "enough to do 1080p"; you'd need a much closer definition of the workload.
A single 1080p frame could take days to render on a high-end multicore PC, or it could be handled by something like a TNT2 graphics card, depending on its contents.

As for the X1 having enough resources to render at 1080p with a reasonable level of detail, we'll have to wait and see. Forza 5 can obviously do it at 60fps, but it has no characters, very few effects and few active entities; FPS and open-world TPS games tend to be seen as better benchmarks for rendering complex graphics. Also, both COD and BF4 aim for 60fps; halving the framerate would definitely help there, but it's unknown by how much (if the issue is the small size of the ESRAM).

In the end though, what matters more: prettier pixels at 900p, which you wouldn't even notice if no one told you, or 1080p at an inferior quality?

Do you honestly think that all the latest debate about resolutions has anything to do with logic and what is practically good or not? I've seen a good number of people on various forums saying that they won't buy/play a game if it's running at 720p.

The funny thing, as you already said, is that most of them wouldn't have a clue about the resolutions of these games if DF or some dude here on Beyond3D hadn't told them.

Higher resolution doesn't mean better IQ, IMO. Just look at GTAV: it runs at 720p but has some of the worst IQ I've seen this gen, same with Portal 2 on 360. Personally I'd take a softer look over a sharper-looking game with jaggies and shimmering all over the place.
 
I've read a post written by a knowledgeable person about the Xbox One's scaler, and he suspects the scaler is pretty advanced and capable of letting the Xbox One render at 720p with pixel quality that matches or exceeds native 1080p. That screams BS to me; there has to be a limit, and I think 2.25x the pixels is past it. But I wanted to get your opinions on this train of thought?

This person should learn how rasterisation works.
 
It's not even about rasterisation in this case. It's basic information theory, which has been posted about before. In an uncompressed image (and consoles tend to output exactly that), an upscaled image can never reach the information content of the native resolution. It doesn't matter if it's a photo, rasterized or raytraced.

HOWEVER, a good upscaler can make it less distinguishable. The question is, how much power they want to use for such a solution. If they already have to render at lower resolution to reach the framerate target, I would guess it's not comparable to a good GIMP implementation, by a long shot.
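To put a number on that, here is a minimal sketch in Python/NumPy (purely illustrative, not how any console scaler is implemented): a 1280x720 buffer holds 2.25x fewer samples than a 1920x1080 one, and a naive bilinear upscale only mixes samples that already exist, so it cannot recreate the missing information.

```python
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Naive bilinear upscale of an (H, W, 3) float image.
    Every output pixel is a weighted mix of existing input pixels,
    so no new information is created."""
    in_h, in_w = img.shape[:2]
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

frame_720p = np.random.rand(720, 1280, 3)        # stand-in for a rendered frame
frame_1080p = bilinear_upscale(frame_720p, 1080, 1920)
print(1920 * 1080 / (1280 * 720))                # 2.25 -- the pixel-count gap
```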
 
IF you have very high quality antialiasing (which usually means supersampling, as in offline CG), then you can usually upscale the image by a pretty large amount without making it obvious. We've successfully done this many times over, and it just creates a slightly blurred image.

You can also use certain algorithms to try to enhance the image using some sort of fractal patterns. The results can... vary.

However, if you have significant amounts of aliasing in the original image, then the upscaling will have a very, very hard time dealing with it. But just as it's possible to smooth out aliasing on a basically point sampled image using the various morphological algorithms, it should be theoretically possible to write a scaler that can do the same thing.
In both cases you are adding extra information based on 2D clues, trying to guess at the lost detail. People are usually quite OK with the post-AA solutions, so such a scaler should work pretty well too. The question is, does the X1 scaler have enough power and programmability to develop this kind of solution?
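As a rough illustration of that last idea, and assuming nothing about the actual X1 hardware, here is a toy morphological-style pass in Python/NumPy: where local luma contrast is high, the pixel is nudged toward its neighbours, which is exactly the kind of 2D-clue guessing described above.

```python
import numpy as np

def luma(img):
    # Rec. 709 luma from an RGB float image in [0, 1]
    return img @ np.array([0.2126, 0.7152, 0.0722])

def edge_blend(img, threshold=0.1):
    """Very rough morphological-AA-style smoothing: where the local luma
    contrast is high, blend the pixel toward the average of its 4 neighbours."""
    l = luma(img)
    p = np.pad(l, 1, mode='edge')
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]
    contrast = np.maximum.reduce([n, s, w, e]) - np.minimum.reduce([n, s, w, e])
    weight = np.clip((contrast - threshold) / (1.0 - threshold), 0.0, 1.0)[..., None]
    pi = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode='edge')
    neighbour_avg = (pi[:-2, 1:-1] + pi[2:, 1:-1] + pi[1:-1, :-2] + pi[1:-1, 2:]) / 4.0
    return img * (1.0 - weight) + neighbour_avg * weight
```

A real scaler would fold something like this into the resampling filter itself rather than running it as a separate full-resolution pass.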
 
I've read a post written by a knowledgeable person about the Xbox One's scaler, and he suspects the scaler is pretty advanced and capable of letting the Xbox One render at 720p with pixel quality that matches or exceeds native 1080p. That screams BS to me; there has to be a limit, and I think 2.25x the pixels is past it. But I wanted to get your opinions on this train of thought?

The only way something like that would be achievable would be to take multiple frames and reconstruct from them. There are super-resolution techniques that use multiple images to reconstruct one big one that looks pretty close to what it would have been had it been taken at the larger size originally.

In a camera, either those images have to come from different sensors or they're staggered in time, thus all are slightly different so you have much more information to generate one big image.

Unfortunately, that's not really applicable to gaming, as I think the latency penalty of using multiple frames would be too big, or you'd need to render at a much higher frame rate (and if you can do that, why waste the cycles on this instead of upping the res).
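For reference, the multi-frame idea looks roughly like this as a toy Python/NumPy sketch. The function name and the assumption of known sub-pixel offsets are mine; a camera or a jittered game projection would have to supply those offsets.

```python
import numpy as np

def super_resolve(frames, offsets, scale=2):
    """Toy multi-frame super-resolution: place each low-res sample onto a
    finer grid at its (known) sub-pixel offset and average where samples land.
    frames: list of (H, W) arrays; offsets: list of (dy, dx) in input pixels."""
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    yy, xx = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, offsets):
        ty = np.clip(np.round((yy + dy) * scale).astype(int), 0, h * scale - 1)
        tx = np.clip(np.round((xx + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (ty, tx), frame)
        np.add.at(cnt, (ty, tx), 1)
    cnt[cnt == 0] = 1     # cells no sample hit stay black in this toy version
    return acc / cnt
```

With four frames jittered by half a pixel each way this fills a 2x grid quite well, but as the post says, in a game you either pay several frames of latency or have to render faster than the display rate to gather them.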
 
There was a post here in the forum about some 2D pixel graphics game that used some advanced scaling technique to smooth everything out. The results were pretty amazing.
 
There's been a lot of research into that, yes, and the various post-AA solutions are actually based on similar principles. My point is that most upscaling solutions in A/V land so far were not based on using an aliased (basically point sampled) source image so they were probably not the best for video games.

However, the various post-AA algorithms are already quite capable, so maybe it is possible to combine post-AA and upscaling to produce a relatively aliasing-free, upscaled image in a single pass, creating higher quality results.
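In the toy terms of the sketches earlier in this thread (and only in those terms; this is nobody's shipping pipeline), that single-pass idea amounts to chaining the edge smoothing and the resample, which a real shader would fuse into one kernel:

```python
# Hypothetical combination of the two toy passes sketched above:
# smooth the point-sampled aliasing first, then resample to the output size.
# A fused single-pass version would read the neighbourhood once and do both.
smoothed = edge_blend(frame_720p, threshold=0.1)
frame_1080p = bilinear_upscale(smoothed, 1080, 1920)
```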

I don't really have any idea about what the current console scalers are actually doing, though.
All I can tell is that Ryse seems to do something pretty well as there's very little aliasing in the imagery and pixel counters seem to be unable to calculate the original resolution from them. We of course know that it's 900p as it's been officially announced, but the overall image quality is rather nice, if a little bit blurry.
It'll be very interesting to see what a military FPS game will look like using a similar approach. Although I also wouldn't be surprised if any PS4 CryEngine games were rendering at full 1080p and only the X1 versions had to use scaling...
 
SB's case most closely replicates actual experience. How often will the same game be released for the same system at two resolutions? If you're playing a game on a console, and can't tell it's not 1080p without comparing against the game running at 1080p somewhere else, have you really lost anything?
Bingo.

This is why A/B tests are stupid. There are some things I can tell without any reference image, like plasma vs. LCD black levels. As long as the picture has a few swaths of black and I'm not in a very bright room, I can tell without a side-by-side comparison. But for things like the difference between dE2000 scores of 3 and 6 in a color test, I'd have no idea which is which without a side-by-side comparison.

The same is true with audio. If you take 100 different random samples of music and play 50 on one decent speaker set and 50 on another, then I'm unlikely to prefer one equalized system over the other unless there are obvious differences in frequency response. The variance in music overwhelms the variance in accuracy.

For resolution? 720p vs 900p, with both having a 1080p HUD, isn't easy to identify in isolation. If it was something like 720p no AA vs 720p 4xAA, or 30fps vs 20fps, then I'd see the difference clearly without needing a reference.
 
I've read a post written by a knowledgeable person about the Xbox One's scaler, and he suspects the scaler is pretty advanced and capable of letting the Xbox One render at 720p with pixel quality that matches or exceeds native 1080p. That screams BS to me; there has to be a limit, and I think 2.25x the pixels is past it. But I wanted to get your opinions on this train of thought?

Does anyone know if the hardware scalers on the two machines are different? I'd expect them to be the same, really, but who knows. In any case, the only way to get what you describe, that of 720p being "perceived" as better quality than 1080p, is if they did more than just a 720p->1080p upscale and instead also threw in some post-processing using the GPU cycles saved by rendering fewer pixels to begin with. That assumes everything up to that point was identical and the only difference is that one box renders at 1080p, while the other renders at 720p and then upscales + post-processes up to 1080p.

I say "perceived" in quotes because you won't be getting more detail this way, of course, but you can potentially achieve a look that is more visually pleasing to the end user. For example, I'd bet anyone here that if you did the above test and simply hit the 720p version with a 10% contrast and sharpen boost in post-process, 51%+ of people in a blind test would prefer the 720p version.
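For what it's worth, that kind of "perceived quality" nudge is trivial to sketch. Here is a minimal Python/NumPy version, with the 10% figure taken straight from the post above and the sharpen strength picked arbitrarily:

```python
import numpy as np

def punch_up(img, contrast=1.10, sharpen=0.5):
    """Cheap post-process 'pop': stretch contrast around mid-grey, then apply
    an unsharp mask. img is an (H, W, 3) float image in [0, 1]."""
    out = (img - 0.5) * contrast + 0.5                      # ~10% contrast boost
    # 3x3 box blur via padding and neighbour averaging (no SciPy dependency)
    p = np.pad(out, ((1, 1), (1, 1), (0, 0)), mode='edge')
    blur = sum(p[dy:dy + out.shape[0], dx:dx + out.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    out = out + sharpen * (out - blur)                      # unsharp mask
    return np.clip(out, 0.0, 1.0)
```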
 
Does anyone know if the hardware scalers on the two machines are different? I'd expect them to be the same, really, but who knows. In any case, the only way to get what you describe, that of 720p being "perceived" as better quality than 1080p, is if they did more than just a 720p->1080p upscale and instead also threw in some post-processing using the GPU cycles saved by rendering fewer pixels to begin with. That assumes everything up to that point was identical and the only difference is that one box renders at 1080p, while the other renders at 720p and then upscales + post-processes up to 1080p.

I say "perceived" in quotes because you won't be getting more detail this way, of course, but you can potentially achieve a look that is more visually pleasing to the end user. For example, I'd bet anyone here that if you did the above test and simply hit the 720p version with a 10% contrast and sharpen boost in post-process, 51%+ of people in a blind test would prefer the 720p version.


I doubt that; the 1080p version would have reduced shimmering vs the 720p image.
 
I don't really have any idea about what the current console scalers are actually doing, though.
All I can tell is that Ryse seems to do something pretty well as there's very little aliasing in the imagery and pixel counters seem to be unable to calculate the original resolution from them.
It's a (I guess aggressive) post-AA that's eliminating the jaggies so well. The softness works towards realism, so Crytek could get away with it. And upscaling 900p to 1080p, the results aren't as disappointing as early post-AA efforts on the current consoles. Overall I think it's a very effective compromise to get better pixels and still retain a perceptually sharp, HD image. Crytek's solution is done in-game though, and doesn't use the XB1's hardware upscaler, so it doesn't give a useful reference point. I guess COD does use it.
 
Hmm, they aren't using the built-in scaler? That's interesting. Then again, there's good reason to merge various post processes into a single pass whenever possible.
 
Does anyone know if the hardware scalers on the two machines are different? I'd expect them to be the same, really, but who knows. In any case, the only way to get what you describe, that of 720p being "perceived" as better quality than 1080p, is if they did more than just a 720p->1080p upscale and instead also threw in some post-processing using the GPU cycles saved by rendering fewer pixels to begin with. That assumes everything up to that point was identical and the only difference is that one box renders at 1080p, while the other renders at 720p and then upscales + post-processes up to 1080p.

I say "perceived" in quotes because you won't be getting more detail this way, of course, but you can potentially achieve a look that is more visually pleasing to the end user. For example, I'd bet anyone here that if you did the above test and simply hit the 720p version with a 10% contrast and sharpen boost in post-process, 51%+ of people in a blind test would prefer the 720p version.
AFAIK, the scalers on the PS4 and Xbox One are basically the same; the only difference is that the PS4 has two display planes while the Xbox One has three.

These articles explain how display planes work on both consoles:

http://www.vgleaks.com/orbis-displayscanout-engine-dce/

http://www.vgleaks.com/durango-display-planes/

Microsoft seem to treat the hardware upscaler as one of the 15 coprocessors they listed in their Hot Chips presentation, though, and they also mention it is a higher quality upscaler than the Xbox 360's:

http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

"Imagine you've rendered to a depth buffer there in ESRAM. And now you're switching to another depth buffer. You may want to go and pull what is now a texture into DDR so that you can texture out of it later, and you're not doing tons of reads from that texture so it actually makes more sense for it to be in DDR. You can use the Move Engines to move these things asynchronously in concert with the GPU so the GPU isn't spending any time on the move. You've got the DMA engine doing it. Now the GPU can go on and immediately work on the next render target rather than simply move bits around."

Other areas of custom silicon are also designed to help out the graphics performance.

"We've done things on the GPU side as well with our hardware overlays to ensure more consistent frame-rates," Goossen adds.

"We have two independent layers we can give to the titles where one can be 3D content, one can be the HUD. We have a higher quality scaler than we had on Xbox 360. What this does is that we actually allow you to change the scaler parameters on a frame-by-frame basis."
 
Bingo.

This is why A/B tests are stupid. There are some things I can tell without any reference image, like plasma vs. LCD black levels. As long as the picture has a few swaths of black and I'm not in a very bright room, I can tell without a side-by-side comparison. But for things like the difference between dE2000 scores of 3 and 6 in a color test, I'd have no idea which is which without a side-by-side comparison.

The same is true with audio. If you take 100 different random samples of music and play 50 on one decent speaker set and 50 on another, then I'm unlikely to prefer one equalized system over the other unless there are obvious differences in frequency response. The variance in music overwhelms the variance in accuracy.

For resolution? 720p vs 900p, with both having a 1080p HUD, isn't easy to identify in isolation. If it was something like 720p no AA vs 720p 4xAA, or 30fps vs 20fps, then I'd see the difference clearly without needing a reference.
I find them very entertaining, and this one could be the most interesting generation in that sense yet.

In fact, it is the most interesting generation for me, because my first console ever was the original Xbox and I was used to having the most powerful console back then, before things began to evolve.

We have two consoles built around different memory interfaces but similar CPUs/GPUs (not factoring in the extra power of the PS4's GPU here; I mean they are both based on AMD GCN GPUs), with some extra touches here and there, and it will be interesting to see how both fare when it comes to performance.

This relatively old article from Edge ended up being almost spot on, and two comments from developers stick out to me: that the PS4 would have technically better launch games, and that the eSRAM is a pain to work with (maybe because of the render targets?).

http://www.edge-online.com/news/pow...erences-between-ps4-and-xbox-one-performance/

Developers themselves are being very contradictory on the matter, with some of them saying there is an obvious difference in capabilities, and others like Shinji Mikami, John Carmack and Keiji Inafune saying they are basically the same:

http://www.extremetech.com/gaming/1...cs-are-essentially-the-same-says-john-carmack

I don't know who to trust anymore, to be honest, and the most trustworthy thing for me right now is time. The test of time will tell the truth in the end.
 
I learned so much about scaling in general, thanks for the responses. I always assumed both platforms' scalers were just part of AMD's GPU. Has AMD talked about the scaler on GCN cards in any detail? And is it known that DICE used a software scaler for the PS4 version of Battlefield, or is that a logical assumption from Leadbetter?

While on the subject of Digital Foundry, is there an article with Crytek that explains why they decided to go with a software scaler over the hardware scaler? I'd love to read the reasoning. Could it be that the XB1's hardware scaler sharpens the image, and that's something Crytek didn't want?
 
They are basically the same in many respects so I don't see the problem.

One has more power than the other but 'basically the same' covers more aspects than it disagrees with.
 
I don't think Crytek ever said they're using a software scaler. They only said they're using their own hybrid SMAA 1TX which is morphological + temporal AA.
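A minimal sketch of the temporal half of that, in Python/NumPy and with no claim to match Crytek's actual SMAA 1TX: blend the current frame with the accumulated history, and reject the history wherever the two disagree strongly to limit ghosting.

```python
import numpy as np

def temporal_blend(current, history, feedback=0.8, reject=0.15):
    """Toy temporal AA accumulation: mix the current frame with the history
    buffer, but fall back to the current frame where they disagree strongly
    (a crude stand-in for reprojection + neighbourhood clamping)."""
    diff = np.abs(current - history).max(axis=-1, keepdims=True)
    w = np.where(diff > reject, 0.0, feedback)   # drop history on big changes
    return current * (1.0 - w) + history * w

# the history buffer would be the previous output, fed back every frame:
# history = temporal_blend(current, history)
```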
 

You need to know how Carmack thinks to properly interpret his statement. His analysis is based on what systems can be implemented in general, like virtual texturing, virtual geometry and so on.
The exact parameters of the final image, like resolution or AA levels, are secondary issues to him at best, but probably even further behind on the list. He's also known to put great emphasis on control response times and other latency related features, and quite possibly online features as well.

So he'd probably call the two systems "basically" the same even if the XB1 was only able to render at 540p as long as it was running from the same dataset and had the same feature list in rendering and gameplay and online and such. He could still write the same code and just tweak a "few" parameters for the final output at the end.
 
Bingo.

This is why A/B tests are stupid. There are some things I can tell without any reference image, like plasma vs. LCD black levels. As long as the picture has a few swaths of black and I'm not in a very bright room, I can tell without a side-by-side comparison. But for things like the difference between dE2000 scores of 3 and 6 in a color test, I'd have no idea which is which without a side-by-side comparison.

The same is true with audio. If you take 100 different random samples of music and play 50 on one decent speaker set and 50 on another, then I'm unlikely to prefer one equalized system over the other unless there are obvious differences in frequency response. The variance in music overwhelms the variance in accuracy.

For resolution? 720p vs 900p, with both having a 1080p HUD, isn't easy to identify in isolation. If it was something like 720p no AA vs 720p 4xAA, or 30fps vs 20fps, then I'd see the difference clearly without needing a reference.
What concerns me, though, assuming the resolution difference comes from a performance difference rather than a lack of optimization, is that it could show in future games, where one console will be able to output better visuals than the other altogether. This is of course assuming that exclusive games on the PS4 won't at some point mind targeting below 1080p, or that multiplatform games will target equal resolution on both, or that the performance difference is significant enough that more sacrifices will have to be made besides pixel density. If asset detail, physics and framerate are retained on both platforms and the difference is always isolated to resolution, not many will notice or care as much.
 
I don't think Crytek ever said they're using a software scaler. They only said they're using their own hybrid SMAA 1TX which is morphological + temporal AA.

I was just going by Shifty's comment that they were using their own in-game scaler. Sorry.
 