Digital Foundry Article Technical Discussion Archive [2014]

From the article:
post-process FXAA solution that has minimal impact on texture quality
I am still not sold on the FXAA claim. It looks too clean; the AA leaves none of the nasty artifacts typical of FXAA (and I know them all too well). Be sure to view those pics at native resolution.


[Attached screenshots: Tombraider_Gamerside_Anti_Aliasing_detail_PS4_c.png, Tomb_Raider_Factory_PS4_aliased_detail.png, Tomb_Raider_Factory_PS4_aliased_detai_arml.png]


I have never seen such a clean implementation of FXAA; maybe it's a derivative? It looks almost exactly like the Uncharted 3 and TLOU post-process implementation, which slightly blurs only what needs to be blurred (a tiny part of the aliased edge, around 4 pixels) and doesn't leave any unwanted artifacts or blur.

What do you think? Have you ever seen such a clean FXAA implementation?
 
Then perhaps it's not your typical FXAA implementation? FXAA has several settings that can be tweaked, right down to numeric thresholds for edge coverage and sharpness.

Have you done a comparison to the PC shots (considering that the PC options do mention FXAA) :?:
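For context on what those settings control, here is a rough, illustrative sketch (Python, not the game's actual shader; the names and values are hypothetical, loosely modelled on the kind of knobs stock FXAA exposes): an edge-detection threshold deciding which pixels get touched at all, and a sub-pixel strength deciding how soft the blend is.

```python
# Illustrative sketch only - NOT the game's shader. Names/values are hypothetical,
# loosely modelled on the tunables the reference FXAA source exposes.

EDGE_THRESHOLD     = 0.125   # relative contrast required to treat a pixel as an edge
EDGE_THRESHOLD_MIN = 0.0312  # absolute floor, keeps dark noise from being filtered
SUBPIX_STRENGTH    = 0.75    # 0 = sharpest (no sub-pixel blur), 1 = softest

def needs_aa(center, n, s, e, w):
    """True if local luma contrast around a pixel exceeds the thresholds."""
    lo, hi = min(center, n, s, e, w), max(center, n, s, e, w)
    contrast = hi - lo
    # Lower thresholds = more coverage (softer image); higher = sharper, more shimmer.
    return contrast >= max(EDGE_THRESHOLD_MIN, hi * EDGE_THRESHOLD)

print(needs_aa(0.9, 0.1, 0.9, 0.9, 0.9))       # True  -> stair-step edge gets blended
print(needs_aa(0.52, 0.50, 0.51, 0.53, 0.50))  # False -> fine texture detail left alone
```

A "clean" result like the one in those shots would be consistent with a fairly conservative edge threshold and low sub-pixel strength, so only obvious edges get touched - though that is speculation, not something the article confirms.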
 
Well, that comparison video certainly puts things in a new light for me.
What a massacre. It'll all come out in the wash, I hope, once they get a handle on the Xbox One.

I checked again and Lara's geometry seems to be the same on both versions; no more polygons on the Xbox One version. I could have sworn I'd seen less angular/polygonal fingers and shoulders.
 
What a massacre. It'll all come out in the wash, I hope, once they get a handle on the Xbox One.

Now that would be an interesting topic for an interview with these developers. How deep did they actually dig into each console? With the 18 vs 12 CU difference, could they utilise the extra 6, and did they use the ESRAM at all?

From this analysis it seems they had to compromise on the XB1 version, and imho they did a very good job to make it run at a steady framerate.
 
The in-game areas where both games drop frame rate (so you can attempt to meaningfully compare them) seem to show the PS4 being about 50% ahead.

That sort-of fits the CU difference between the two platforms (as opposed to CPU, ROP, memory BW etc). Not saying the relative performance is solely down to GPU ALU/CU differences, but it does kind of stand out.
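For what it's worth, a quick back-of-the-envelope check of that ~50% figure against the commonly cited GPU configurations (18 CUs @ 800 MHz for PS4, 12 CUs @ 853 MHz for XB1 - the usual public numbers, not anything from this game's developers):

```python
# Rough ALU-throughput comparison using public GCN specs (64 ALUs per CU,
# 2 FLOPs per ALU per clock). Not a claim about where this game's frame time goes.
def gflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1000.0

ps4 = gflops(18, 800)   # ~1843 GFLOPS
xb1 = gflops(12, 853)   # ~1310 GFLOPS

print(f"CU count alone:       {18 / 12:.2f}x")    # 1.50x -> the ~50% gap quoted above
print(f"With clocks factored: {ps4 / xb1:.2f}x")  # ~1.41x
```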

Neither of these games seems to make particularly good performance tradeoffs. The PS4 frame rate is spectacularly unstable, and the B0ne chases the PS4's resolution instead of focusing on maintaining 30 fps - which it could kinda do with doing.

Once again, the 1080p fans failed to notice that something they were examining frozen on a monitor - while smearing their noses against the screen and mouth-breathing - was actually running at "900p" and not "1080p". Again. It happened ... again.

Both of these games should be using dynamic framebuffers. God damn 1080p and the god damn 1080 crowd worshipping that number. God damn. Don't they realise that a lower resolution at a higher frame rate can still give them about the same number of pixels to hoot and rub over?
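To put numbers on the "same pixels per second" point, and to sketch what a dynamic framebuffer controller actually does, here is a purely illustrative snippet - nothing in it comes from either game, and the budget, step and clamp values are made up:

```python
# Pixel throughput at the two targets being argued about:
print(1920 * 1080 * 30)  # 62,208,000 px/s at a locked 1080p30
print(1600 *  900 * 43)  # 61,920,000 px/s -> 900p at ~43 fps pushes roughly the same pixels

# Minimal dynamic-resolution controller: shrink the render target when the GPU
# misses its frame budget, grow it back when there is headroom.
def next_render_scale(scale, gpu_frame_ms, budget_ms=33.3, step=0.05):
    if gpu_frame_ms > budget_ms:           # over budget -> drop resolution next frame
        scale -= step
    elif gpu_frame_ms < budget_ms * 0.85:  # comfortable headroom -> raise it again
        scale += step
    return max(0.75, min(1.0, scale))      # clamp between ~810p and full 1080p

print(next_render_scale(1.0, 38.0))   # ~0.95 -> backs off after a heavy frame
print(next_render_scale(0.85, 25.0))  # ~0.90 -> creeps back up when the load eases
```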
 
Once again, the 1080p fans failed to notice that something they were examining frozen on a monitor - while smearing their noses against the screen and mouth-breathing - was actually running at "900p" and not "1080p". Again. It happened ... again.

Both of these games should be using dynamic framebuffers. God damn 1080p and the god damn 1080 crowd worshipping that number. God damn. Don't they realise that a lower resolution at a higher frame rate can still give them about the same number of pixels to hoot and rub over?

Calm down. 900p on the bone is used in "some" cut scenes to maintain frame rate, and many people pointed that out as looking more blurry. Just because the developers chose the targets they did (TressFX hair was a stupid decision) doesn't mean 1080p is bad, as you are ranting. What is the point of having more detail in games if there isn't enough resolution to resolve it?
 
Calm down. 900p on the bone is used in "some" cut scenes to maintain frame rate, and many people pointed that out as looking more blurry. Just because the developers chose the targets they did (TressFX hair was a stupid decision) doesn't mean 1080p is bad, as you are ranting. What is the point of having more detail in games if there isn't enough resolution to resolve it?

I've never said 1080p is bad.
 
I'd prefer you not use the word 'bone' for that console, thank you very much.
I officially concur. XB1 is quick, easy, and obvious without any possible negative connotations. No need for xbone or bone, and a simpler name than the official XboxOne or abbreviated XBOne.
 
The in-game areas where both games drop frame rate (so you can attempt to meaningfully compare them) seem to show the PS4 being about 50% ahead.

That sort-of fits the CU difference between the two platforms (as opposed to CPU, ROP, memory BW etc). Not saying the relative performance is solely down to GPU ALU/CU differences, but it does kind of stand out.

Neither of these games seems to make particularly good performance tradeoffs. The PS4 frame rate is spectacularly unstable, and the B0ne chases the PS4's resolution instead of focusing on maintaining 30 fps - which it could kinda do with doing.

Once again, the 1080p fans failed to notice that something they were examining frozen on a monitor - while smearing their noses against the screen and mouth-breathing - was actually running at "900p" and not "1080p". Again. It happened ... again.

Both of these games should be using dynamic framebuffers. God damn 1080p and the god damn 1080 crowd worshipping that number. God damn. Don't they realise that a lower resolution at a higher frame rate can still give them about the same number of pixels to hoot and rub over?
You didn't pay enough attention then. There were certain screens discussed here that looked noticeably blurrier on X1, and some suspected a lower res. It's not like the whole game is 900p, only some of the cutscenes AFAIK. People at GAF mentioned it before the DF article, too. Not to mention there wasn't any high quality direct feed footage comparing the two... only compressed youtube videos.
 
I officially concur. XB1 is quick, easy, and obvious without any possible negative connotations. No need for xbone or bone, and a simpler name than the official XboxOne or abbreviated XBOne.

I like B0ne and 4Bone (for both consoles). Then again, I like the "Xbox One" and would buy one if there was genuinely good Kinect stuff, so I don't see it as a problem.

If it'll help the forum then rules about acceptable names might be a good idea. I'll stick to XB1 from now on.
 
I officially concur. XB1 is quick, easy, and obvious without any possible negative connotations. No need for xbone or bone, and a simpler name than the official XboxOne or abbreviated XBOne.

I never took Xbone or Bone to be a negative; most people use it as a nickname and it's practically pop culture at this point, but whatever, I am not going to piss off the mods.
 
You didn't pay enough attention then. There were certain screens discussed here that looked noticeably blurrier on X1, and some suspected a lower res. It's not like the whole game is 900p, only some of the cutscenes AFAIK. People at GAF mentioned it before the DF article, too.

Not to mention there wasn't any high quality direct feed footage comparing the two... only compressed youtube videos.

Of the small number of people who talked about blur, most chalked it up to AA. The vast majority didn't notice.

The talk of resolution was practically non-existent next to the talk exclusively of "1080p" and of frame rate. After being told about "900p", notice the difference in the amount of talk about resolution ...
 
I officially concur. XB1 is quick, easy, and obvious without any possible negative connotations. No need for xbone or bone, and a simpler name than the official XboxOne or abbreviated XBOne.

I shall 1up that with XO. 33% optimization! Extra points for being only letters too.
 
Of the small number of people who talked about blur, most chalked it up to AA. The vast majority didn't notice.

The talk of resolution was practically non-existent next to the talk exclusively of "1080p" and of frame rate. After being told about "900p", notice the difference in the amount of talk about resolution ...
When the developer says both versions are native 1080p, when we only have limited low-quality compressed footage to compare, AND when it only applies to some of the cutscenes, it wouldn't be easy to tell 900p from 1080p. And yet, some STILL suspected a variable resolution on X1, mostly because the guys at GAF made their own comparisons with the Gamersyde footage, which took some effort. The difference between 900p and 1080p isn't huge to begin with. So no, I don't think the 1080p fans failed at anything. If anything, it proves that the difference isn't that easy to spot.
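For reference, the raw pixel-count gap (assuming the sub-native cutscenes render at 1600x900, which the thread suspects but the developer hasn't confirmed) is real but not huge, which squares with it only being caught on certain shots:

```python
p1080 = 1920 * 1080   # 2,073,600 pixels
p900  = 1600 *  900   # 1,440,000 pixels
print(p900 / p1080)   # ~0.69 -> 900p renders about 69% of a 1080p frame
print(p1080 / p900)   # 1.44  -> 1080p has 44% more pixels; noticeable, but nothing like a 2x jump
```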
 
I played the whole game on my PC at 900p. Shock! Horror!

If you played the whole game, can you tell me if this moment (when Lara has her hands cuffed behind her back) is during a cutscene or in gameplay?

Because the XB1 image is clearly sub-1080p in this one, not just lower shadow/alpha resolution (check the woods at the upper left and the board just to the right of Lara: we can count only 11 "steps" instead of 13 on the PS4 version, which is almost a perfect 900p/1080p ratio). Images provided by Eurogamer.
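Quick sanity check on that step-counting argument, using nothing but the numbers in the post:

```python
print(11 / 13)          # ~0.846 -> ratio of visible "steps" on the XB1 shot vs the PS4 shot
print(900 / 1080)       # ~0.833 -> vertical ratio of a 900p to a 1080p framebuffer
print(13 * 900 / 1080)  # ~10.8  -> at 900p you'd expect to resolve roughly 11 of the 13 steps
```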

They seem to use an additional strong blur to hide the obviously lower 900p resolution:

[Eurogamer comparison screenshots]
 
I'm 90% sure it's gameplay; it's been a good few months and at least a couple of full games since I finished it, but I'm pretty sure I remember that part of the game.

EDIT: scratch that, make it 100%. I remember now: you have to hide from the searchlights behind walls, fences, etc...
 