Digital Foundry Article Technical Discussion Archive [2014]

I'm 90% sure it's gameplay, it's been a good few months and at least a couple of full games since I finished it. I'm pretty sure I remember that part of the game though.

EDIT: scratch that, make it 100% sure, I remember now, you have to hide from the search lights behind walls, fences etc...

Well then, if that's true, the XB1 version uses dynamic 900p/1080p resolution in gameplay too, I guess.

Digital Foundry must have missed it. But their article was nonetheless really detailed, fair and in-depth. I really liked it.
 
EDIT: scratch that, make it 100% sure, I remember now, you have to hide from the search lights behind walls, fences etc...
Yep, gameplay.

I have to agree with function, the race for 1080p is stupid IMO. It consumes valuable and much-needed GPU resources on both consoles, and it also puts bigger pressure on memory capacity, which doesn't seem to be in as great abundance as we once thought. Consoles still use lower texture resolution than PCs; that could also be attributed to memory bandwidth, which reinforces the point even further. An upscaled 900p with higher-quality lighting, shadowing, geometry and textures will trounce a native 1080p with a lesser quality of them.
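
For what it's worth, the raw pixel arithmetic behind that trade-off (my own numbers, nothing from the DF article):

Code:
full_hd = 1920 * 1080            # 2,073,600 pixels
p900    = 1600 * 900             # 1,440,000 pixels
print(round(full_hd / p900, 2))  # 1.44 -> roughly 44% more pixels to shade per frame at 1080p

That extra ~44% has to come out of lighting, shadows, geometry or texture work somewhere if the GPU budget is fixed.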
 
When the developer says both versions are native 1080p, when we have only limited, low-quality compressed footage to compare, AND when it only happens in some of the cutscenes, it wouldn't be easy to tell 900p from 1080p. And yet some STILL suspected a variable resolution on X1, mostly because the guys at GAF made their own comparisons with the Gamersyde footage, which took some effort to make. The difference between 900p and 1080p isn't huge to begin with. So no, I don't think the 1080p fans failed anything. If anything, it proves that the difference is that easy to spot.

"Suspected" ... "made their own comparisons, which took some effort" ... "The difference between 900p and 1080p isn't huge to begin with" ... "If anything, it proves that the difference is that easy to spot."

Well apparently the difference can't be that easy to spot, because a few people suspected, then went to some effort to make comparisons to try and see a difference that isn't that huge to begin with... and they *still* couldn't work it out for sure.

So no. I disagree. The people who do know what they're looking at are few and far between, and they are almost always smart enough not to be "1080p" fans, i.e. they understand that resolution (and uniform resolution at that) is only one factor amongst many.
 
Yep, gameplay.

I have to agree with function, the race for 1080p is stupid IMO. It consumes valuable and much-needed GPU resources on both consoles, and it also puts bigger pressure on memory capacity, which doesn't seem to be in as great abundance as we once thought. Consoles still use lower texture resolution than PCs; that could also be attributed to memory bandwidth, which reinforces the point even further. An upscaled 900p with higher-quality lighting, shadowing, geometry and textures will trounce a native 1080p with a lesser quality of them.

And you know what the worst thing is? My outstanding, possibly sexist joke about TressFX's effect on frame times ("Now you can wait for a woman's hair to be ready in a game, just like in real life") has been completely overlooked!

DF can hire me as a joke writer if they want, and for minimum wage too!
 
"Suspected" ... "made their own comparisons, which took some effort" ... "The difference between 900p and 1080p isn't huge to begin with" ... "If anything, it proves that the difference is that easy to spot."
You misunderstood my points. It took effort to make the comparisons, not to see the differences. They had to dig through two non-like-for-like videos to make a somewhat proper comparison.

Not being hugely different and being easy to spot aren't mutually exclusive.

And again, are you forgetting the fact that the resolution doesn't dip that often? That makes it even harder to spot the difference.

All I'm saying is, when proper comparisons are made, it's pretty easy to see the difference IMO. But I agree that the difference between 1080p and 900p is not that big. While the PS4 has noticeably better IQ overall, I think that most people won't really see much of a difference or won't care much. The differences in framerate are much more significant, though.
 
When the developer says both versions are native 1080p, when we have only limited, low-quality compressed footage to compare, AND when it only happens in some of the cutscenes, it wouldn't be easy to tell 900p from 1080p. And yet some STILL suspected a variable resolution on X1, mostly because the guys at GAF made their own comparisons with the Gamersyde footage, which took some effort to make. The difference between 900p and 1080p isn't huge to begin with. So no, I don't think the 1080p fans failed anything. If anything, it proves that the difference is that easy to spot.

The GAF guys literally comb over XO media looking for, ahem, "differences", like a game of gotcha, though. So they will find any XO shortcoming, no matter how minor.

I doubt anyone on the net other than GAF even noticed.
 
Yep, gameplay.

I have to agree with function, the race for 1080p is stupid IMO. It consumes valuable and much-needed GPU resources on both consoles, and it also puts bigger pressure on memory capacity, which doesn't seem to be in as great abundance as we once thought. Consoles still use lower texture resolution than PCs; that could also be attributed to memory bandwidth, which reinforces the point even further. An upscaled 900p with higher-quality lighting, shadowing, geometry and textures will trounce a native 1080p with a lesser quality of them.

Maybe a bit OT, but what's sad is that we have to make any choice at all. By the diminishing-returns theorem, we should just have oodles of extra power sitting around, so 1080P would be a "why not".

Instead as always, we struggle, searching for more power. Carmack was wrong!
 
I'm 90% sure it's gameplay, it's been a good few months and at least a couple of full games since I finished it. I'm pretty sure I remember that part of the game though.

EDIT: scratch that, make it 100% sure, I remember now, you have to hide from the search lights behind walls, fences etc...

Alstrong's deleted post suggests the 900P shot may be from a transition back to 1080P coming out of a 900P cutscene. Is that particular moment directly after a cutscene?

Staying consistent, the few GRAPHICAL differences on XBO bother me more than the 30-60 FPS thing...I assumed they'd be relatively identical.

I guess it could be a weird kind of progress for XBO: from, say, BF4 with a full res drop, to now kinda struggling but mostly holding 1080P with a few gfx cutbacks.

But the internet seems to have reacted WORSE to the latter, which I find odd. In other words, the net seemed more accepting of 720P/same gfx than 1080P/a few cutbacks. Noted.
 
Maybe a bit OT, but what's sad is that we have to make any choice at all. By the diminishing-returns theorem, we should just have oodles of extra power sitting around, so 1080P would be a "why not".

Instead as always, we struggle, searching for more power. Carmack was wrong!
It's the way they chose to handle the specs of the current consoles. From the get-go, we were hearing rumors of an HD 6670 GPU! Ridiculous and absurd... there were even some people arguing that would be more than enough, which sounded even more ridiculous. Now we have relatively better GPUs, but paired with a castrated, barely-holding-together CPU. In this round, console makers chose the cheap route over the expensive one that would actually advance the technology. What's worse is that they keep chasing fancy, expensive checkbox dreams like 1080p.

I guess it could be a weird kind of progress for XBO: from, say, BF4 with a full res drop, to now kinda struggling but mostly holding 1080P with a few gfx cutbacks.
The XO had three full res drops: Ghosts and BF4 @720p, and AC4 @900p, all three demanding games. Then we have NFS Rivals, Tomb Raider, FIFA 14 and NBA 14, all at 1080p just like the PS4. Obviously the latter games are less demanding than the first three, and the PS4 would probably have commanded a frame rate advantage in the rest of them too.

It is really astonishing how far the XO lags behind given a mere 50% GPU advantage, but my bet is it's the lacking memory bandwidth. They should stick to lower resolutions and keep pace with the PS4 on the rest of the visual settings and frame rates; the PS3 did that too (albeit with a far narrower resolution/image quality gap). I believe this is the best possible strategy.
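
Purely as a back-of-the-envelope on the bandwidth angle (every number below is a placeholder assumption of mine, not measured data):

Code:
def rt_traffic_gb_per_s(width, height, bytes_per_pixel=16, passes=4, fps=30):
    # bytes_per_pixel and passes are made-up averages for full-screen read/write work
    return width * height * bytes_per_pixel * passes * fps / 1e9

print(round(rt_traffic_gb_per_s(1920, 1080), 1))  # ~4.0 GB/s under these assumptions
print(round(rt_traffic_gb_per_s(1600, 900), 1))   # ~2.8 GB/s; the gap scales linearly with pixel count

Whatever the real per-frame numbers are, render-target traffic scales with resolution, and on the XO it has to be juggled between DDR3 and the 32MB of ESRAM.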

And you know what the worst thing is? My outstanding, possibly sexist joke about TressFX's effect on frame times ("Now you can wait for a woman's hair to be ready in a game, just like in real life") has been completely overlooked!
It hasn't; it's just that some people don't want to stir up trouble, because maybe their accounts are being monitored by other people with long hair! :D
 
I have to agree with function, the race for 1080p is stupid IMO

What makes it particularly dumb is that, unlike frame rate, there are so many visual aspects of the render pipeline that work against 1080p in many circumstances. You have things like weak texture filtering, low-res textures, motion blur, post-process AA blur, post-process softening, low-res post-process buffers, etc: all sorts of steps in the rendering stages that each reduce the bang for the buck that 1080p offers, compared to, say, frame rate, which is still noticeable in spite of all of the above.
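
To put a number on the low-res post-process point, a quick sketch (the half-res buffer is a typical example I'm assuming, not this game's actual setup):

Code:
native        = 1920 * 1080
half_res_post = (1920 // 2) * (1080 // 2)  # e.g. bloom or DoF rendered at half res per axis
print(half_res_post / native)              # 0.25: that pass only ever sees a quarter of the output pixels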

Having said that... Tomb Raider is one of those games that actually does benefit from high resolution if you have a big enough TV. I think I posted in the PC forum back when I was playing it that it was the first game to make me want to buy a 4K display, because I'd often stand there looking far into the distance, planning what to do or where to go, and notice how blocky things got at "only" 1080p in this particular game.


And you know what the worst thing is? My outstanding, possibly sexist joke about TressFX's effect on frame times ("Now you can wait for a woman's hair to be ready in a game, just like in real life") has been completely overlooked!

DF can hire me as a joke writer if they want, and for minimum wage too!

Not bad, but your faffing comment was better :)


It is really astonishing how far the XO lags behind given a mere 50% GPU advantage, but my bet is it's the lacking memory bandwidth. They should stick to lower resolutions and keep pace with the PS4 on the rest of the visual settings and frame rates; the PS3 did that too (albeit with a far narrower resolution/image quality gap). I believe this is the best possible strategy.

I dunno, I think it's more than just memory bandwidth; there's a pretty big GPU gulf between the two machines. This game really reflects that, given how the hair simulation taxes what few GPU compute resources the new machines have to begin with. PC benchmarks show how light this game is on the CPU, so it's all about GPU compute, and the fact that the console textures are lower resolution than the PC textures tells me it's less about memory bandwidth in this case and more about straight-up GPU computational grunt.
 
PC benchmarks show how light this game is on the CPU, so it's all about GPU compute, and the fact that the console textures are lower resolution than the PC textures tells me it's less about memory bandwidth in this case and more about straight-up GPU computational grunt.

I disagree; the game is light for something like a regular Core i5, but for a 1.6GHz Jaguar!?

see this
http://forums.anandtech.com/showpost.php?p=35371206&postcount=7

The built-in benchmark doesn't reflect the game's performance well in CPU-bound areas.


But yes, TressFX AND 1080p on 768 GCN SPs sounds optimistic, so it's no surprise they had to cut back in other areas.
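
For context, a quick peak-compute sanity check using the commonly quoted GPU figures (GCN does 2 FLOPs per SP per clock; the clocks are the widely reported ones, so treat the results as approximate):

Code:
def peak_tflops(shader_cores, clock_ghz):
    return shader_cores * 2 * clock_ghz / 1000

print(round(peak_tflops(768, 0.853), 2))   # ~1.31 TFLOPS (Xbox One)
print(round(peak_tflops(1152, 0.800), 2))  # ~1.84 TFLOPS (PS4), ~40% more peak compute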
 
I disagree; the game is light for something like a regular Core i5, but for a 1.6GHz Jaguar!?

I saw CPU benchmarking for the game in this article: http://www.techhum.com/tomb-raider-benchmark35-graphics-cards-and-15-cpus/, which had this CPU chart:

[Image: TR-CPU.jpg — CPU benchmark chart from the linked article]


Seems like even a Core i3 had no trouble with it, but if the built-in benchmark is misleading for CPU, then perhaps the above is invalid.
 
I saw CPU benchmarking for the game in this article: http://www.techhum.com/tomb-raider-benchmark35-graphics-cards-and-15-cpus/, which had this CPU chart:


Seems like even a Core i3 had no trouble with it.

Well, 6 available Jaguar cores at 1.6GHz = 3 cores at 3.2GHz, but of course it's actually even less efficient than that. It might compare to, or even fall below, the X2 550 in that chart, which notches barely above 60 FPS.

Then there's console optimization, though. But I bet the low absolute clock rate is a problem.
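
Spelling out that core-GHz arithmetic (it ignores IPC and threading efficiency entirely, so it's a loose upper bound at best, and the X2 550 clock is from memory):

Code:
jaguar    = 6 * 1.6  # 9.6 "core-GHz" available to games
as_quoted = 3 * 3.2  # 9.6 "core-GHz", the "3 cores at 3.2GHz" framing above
x2_550    = 2 * 3.1  # 6.2 "core-GHz" if that X2 550 is the 3.1GHz Phenom II
print(jaguar, as_quoted, x2_550)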
 
I saw CPU benchmarking for the game in this article: http://www.techhum.com/tomb-raider-benchmark35-graphics-cards-and-15-cpus/, which had this CPU chart:



Seems like even a Core i3 had no trouble with it, but if the built-in benchmark is misleading for CPU, then perhaps the above is invalid.

I have the very same i3 2100 from the chart. On the built-in benchmark, sure, 78fps looks OK,
but actually playing through the entire game at maximum level of detail, I had sub-30FPS moments with extremely low GPU usage, caused by CPU performance.

The link I posted from the Anandtech forum clearly shows how the built-in benchmark is useless; also, that Haswell i5 at 800MHz is not too far off what you can expect Jaguar at 1.6GHz to achieve, IMO (if the consoles were running Windows and the PC version of the game).
 
I've decided, after looking at the Tomb Raider XB1 & PS4 comparison video, that in motion (which matters most when playing a game) I can't tell the difference between the 2 systems. If I had a 60" 1080p TV I might be able to tell from 1'-2' away, but not 10' away where my recliner sits. I give up caring about the graphical differences between these 2 systems. Guys, have fun arguing for the next 6-8 years over every little pixel difference that 80-90% of the market won't be able to see. Xbox has the exclusives I like, & the multiplatform titles look just as good as the other guy's. The value proposition is a whole other issue that shouldn't be discussed here anyway, but as far as the graphics go, it looks like a wash to me.
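
For reference, a rough acuity check on the 60-inch-at-10-feet scenario (my own arithmetic, assuming the usual ~60 pixels-per-degree threshold for 20/20 vision):

Code:
import math

def pixels_per_degree(diag_in, horiz_res, distance_in, aspect=16 / 9):
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # screen width from the diagonal
    pixel_in = width_in / horiz_res                      # physical width of one pixel
    return distance_in * math.tan(math.radians(1)) / pixel_in

print(round(pixels_per_degree(60, 1920, 24)))   # ~15 px/deg at 2 ft: individual pixels easy to resolve
print(round(pixels_per_degree(60, 1920, 120)))  # ~77 px/deg at 10 ft: already past the ~60 px/deg limit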

Tommy McClain
 
I've decided, after looking at the Tomb Raider XB1 & PS4 comparison video, that in motion (which matters most when playing a game) I can't tell the difference between the 2 systems. If I had a 60" 1080p TV I might be able to tell from 1'-2' away, but not 10' away where my recliner sits. I give up caring about the graphical differences between these 2 systems. Guys, have fun arguing for the next 6-8 years over every little pixel difference that 80-90% of the market won't be able to see. Xbox has the exclusives I like, & the multiplatform titles look just as good as the other guy's. The value proposition is a whole other issue that shouldn't be discussed here anyway, but as far as the graphics go, it looks like a wash to me.

Tommy McClain

That is cute :)


@function: you did not immediately see that there was a difference in clarity between the X1 and PS4 version of Tomb Raider in certain scenes?!?

IMO, after all... this comparison dramatically shows how easy it is to spot the difference between 900p and 1080p... it is quite astounding IMO, and must be related to this particular game, as I can't remember the difference being this huge when I switched resolutions back and forth on my PC.

But I do wonder why the devs did not implement dynamic resolution on the PS4 to stabilize the framerate. Again, I think a lot of this DF analysis points to a very limited time frame for the devs who did the port... and I still wonder whether DF can somehow give a hint as to whether the most demanding scenes where the PS4 drops frames are CPU or GPU limited... maybe by using the PC version side by side for comparison.
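
For anyone wondering what such a dynamic resolution scheme might look like, a minimal sketch (toy logic of my own, assuming a GPU frame-time readback; not how the actual port handles anything):

Code:
def update_render_scale(scale, gpu_ms, budget_ms=33.3,
                        min_scale=900 / 1080, max_scale=1.0, step=0.05):
    # Shrink the render scale when a frame nearly blows the budget,
    # creep back towards full resolution when there is clear headroom.
    if gpu_ms > budget_ms * 0.95:
        scale = max(min_scale, scale - step)
    elif gpu_ms < budget_ms * 0.80:
        scale = min(max_scale, scale + step)
    return scale

print(update_render_scale(1.0, 36.0))  # a heavy 36 ms frame: 1.0 -> 0.95, heading towards ~900p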
 
I've decided, after looking at the Tomb Raider XB1 & PS4 comparison video, that in motion (which matters most when playing a game) I can't tell the difference between the 2 systems. If I had a 60" 1080p TV I might be able to tell from 1'-2' away, but not 10' away where my recliner sits. I give up caring about the graphical differences between these 2 systems. Guys, have fun arguing for the next 6-8 years over every little pixel difference that 80-90% of the market won't be able to see. Xbox has the exclusives I like, & the multiplatform titles look just as good as the other guy's. The value proposition is a whole other issue that shouldn't be discussed here anyway, but as far as the graphics go, it looks like a wash to me.

Tommy McClain

Yup. It's amazing, all the hubbub, when if you actually go look at any two pics from the game they're more than likely IDENTICAL (see if you can determine which pic sports 66+% more power) (1, 2). Whereas from reading enough you'd think one was the Commodore 64 version.

While there are real and disappointing technical differences, there can be a big disconnect between reality and the web (really, GAF) narrative IMO. Heck, sometimes it's hard to discern a huge difference from the 360/PS3 version. While we can, I don't think "Mom" could.
 
Yup. It's amazing, all the hubbub, when if you actually go look at any two pics from the game they're more than likely IDENTICAL (see if you can determine which pic sports 66+% more power) (1, 2). Whereas from reading enough you'd think one was the Commodore 64 version.

While there are real and disappointing technical differences, there can be a big disconnect between reality and the web (really, GAF) narrative IMO. Heck, sometimes it's hard to discern a huge difference from the 360/PS3 version. While we can, I don't think "Mom" could.

Yeah, let's just ignore the 10-20 fps difference.
 