Digital Foundry Article Technical Discussion Archive [2014]

Yup. It's amazing, all the hubbub, and if you actually go look at any two pics from the game they're more than likely IDENTICAL (see if you can determine which pic sports 66+% more power) (1, 2). Whereas from reading enough you'd think one was the Commodore 64 version.

While there are real and disappointing technical differences, there can be a big disconnect between reality and the web (really, GAF) narrative, IMO. Heck, sometimes it's hard to discern a huge difference from the 360/PS3 version. While we can, I don't think "Mom" could.

DF didn't start providing analysis at the beginning of this new generation of consoles; we've been reading and discussing these differences for years now...

Personally I'm impressed with the fact that the game is running at 1080p basically at launch, and I'm encouraged that we see developers pushing for >30 fps and 1080p when it's possible. And when you consider what COD looked like on the 360 at launch (I still own it) versus what we see now, this is going to be an exciting generation for both platforms. And this is backed up by MS's recent GPU reservation reduction: 8% more resources will translate into higher frame rates, which is a good thing.
 
I've decided, after looking at the Tomb Raider XB1 & PS4 comparison video, that in motion (which matters most when playing a game) I can't tell the difference between the two systems. If I had a 60" 1080p TV I might be able to tell from 1-2 feet away, but not the 10 feet away where my recliner sits. I give up caring about the graphical differences between these two systems. Guys, have fun arguing over every little pixel for the next 6-8 years, when 80-90% of the market won't be able to tell the difference. Xbox has the exclusives I like, and the multiplatform titles look just as good as the other guy's. The value proposition is a whole other issue that shouldn't be discussed here anyway, but as far as the graphics go, it looks like a wash to me.

Tommy McClain
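For what it's worth, the recliner point can be put in numbers. A minimal sketch, assuming a 60" 16:9 panel and the usual ~1 arcminute ballpark for 20/20 visual acuity (assumed figures, nothing measured by DF):

[code]
# At what distance does a single pixel on a 60" 1080p panel shrink below
# ~1 arcminute, the usual ballpark for 20/20 acuity? (Assumed figures.)
import math

DIAGONAL_IN = 60.0
H_PIXELS = 1920

# 16:9 panel: width = diagonal * 16 / sqrt(16^2 + 9^2) ~= 52.3"
width_in = DIAGONAL_IN * 16 / math.sqrt(16**2 + 9**2)
pixel_pitch_in = width_in / H_PIXELS  # ~0.027" per pixel

for distance_ft in (1, 2, 5, 10):
    distance_in = distance_ft * 12
    # Angle subtended by one pixel, converted to arcminutes.
    arcmin = math.degrees(math.atan2(pixel_pitch_in, distance_in)) * 60
    print(f"{distance_ft:>2} ft: one pixel spans {arcmin:.2f} arcmin")
[/code]

At 1-2 feet a pixel spans several arcminutes and is easily resolvable; at 10 feet it is ~0.8 arcmin, right at the edge of normal acuity, so a 900p-vs-1080p gap really would be hard to spot from the recliner.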

GET THESE TECHNICAL ARGUMENTS OUT OF MY TECHNICAL FORUM
 
Guys, have fun arguing over every little pixel for the next 6-8 years, when 80-90% of the market won't be able to tell the difference.

So glad we have members who can pull statistical data out of their own internalized universe and declare it fact. "If I don't care about it then most consumers won't!" is not an objective point for discussion.

This thread is to discuss Digital Foundry findings, not people's personal cases of sour grapes.
 
Yeah, I don't get it; these discussions have gone on since I've been here... what has changed? This is a tech forum, and a thread specifically meant to discuss tech articles from DF, who specialize in technical comparisons, which we here at B3D like to discuss. Even if that 'stat' were true, who cares what the majority of the market thinks? There have been multiplatform games that were far closer that were dissected at far greater length than this.
 
I have the very same i3-2100 from the graph, and on the built-in benchmark, sure, 78 fps looks OK,
but if you actually play the entire game at the maximum level of detail, I had sub-30 fps moments with extremely low GPU usage, caused by CPU performance.

The link I posted from the Anandtech forum shows clearly how the built-in benchmark is useless; also, that Haswell i5 at 800 MHz is not too far off what you can expect Jaguar at 1.6 GHz to achieve, IMO (if the consoles were running Windows and the PC version of the game).

Ah OK, well scratch that then; I didn't realize the CPU benchmark was bogus.


...and I still wonder if DF can somehow give a hint as to whether the most demanding scenes, where the PS4 drops frames, are CPU- or GPU-limited... maybe by using the PC version side by side for comparison.

They could do what the Anandtech link HMBR posted did: run the game on a CPU-underclocked PC with a really fast GPU, and see if the frame rate drops on that version at similar times to when it drops on the PS4, to see if there is a CPU limit in play.
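For what it's worth, here is roughly what that test could look like once you had per-frame timing logs from both captures. The log format, file names, and 33.3 ms budget are invented for illustration, and the two runs would have to follow the same walkthrough to line up at all:

[code]
# Sketch of the proposed test: flag frames over a 30 fps budget in two
# per-frame time logs (CPU-underclocked PC vs PS4 capture) and see how
# often the drops coincide. One frame time in milliseconds per line;
# the format and file names are hypothetical.
def load_frame_times(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def drop_mask(frame_times_ms, budget_ms=33.3):
    return [t > budget_ms for t in frame_times_ms]

def overlap_ratio(mask_a, mask_b):
    """Fraction of mask_a's dropped frames that are also drops in mask_b."""
    n = min(len(mask_a), len(mask_b))
    both = sum(1 for i in range(n) if mask_a[i] and mask_b[i])
    a_drops = sum(mask_a[:n])
    return both / a_drops if a_drops else 0.0

pc = drop_mask(load_frame_times("pc_underclocked.log"))
ps4 = drop_mask(load_frame_times("ps4_capture.log"))
print(f"PS4 drops also seen on the CPU-limited PC: {overlap_ratio(ps4, pc):.0%}")
[/code]

A high overlap would hint that the PS4's dips are CPU-bound in those scenes; a low one would point at the GPU, or simply at differences between the two versions.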
 
You can't really compare the two versions like that, since many things have changed. You can see more geometry/effects in some scenes and less in others, so it would not be a fair comparison.
 
Ah OK, well scratch that then; I didn't realize the CPU benchmark was bogus.

They could do what the Anandtech link HMBR posted did: run the game on a CPU-underclocked PC with a really fast GPU, and see if the frame rate drops on that version at similar times to when it drops on the PS4, to see if there is a CPU limit in play.

Yeah, this is exactly what I thought about. This would be a really interesting test, IMO! Although of course it would still be difficult to extrapolate to the consoles... but just to get an idea which scenes are stressful for CPUs...
 
Yeah, this is exactly what I thought about. This would be a really interesting test, IMO! Although of course it would still be difficult to extrapolate to the consoles... but just to get an idea which scenes are stressful for CPUs...

Just wondering: once developers start getting the hang of the CUs, wouldn't that free up the CPU from the rendering and physics-based tasks that stress it?
 
I've decided, after looking at the Tomb Raider XB1 & PS4 comparison video, that in motion (which matters most when playing a game) I can't tell the difference between the two systems. If I had a 60" 1080p TV I might be able to tell from 1-2 feet away, but not the 10 feet away where my recliner sits. I give up caring about the graphical differences between these two systems. Guys, have fun arguing over every little pixel for the next 6-8 years, when 80-90% of the market won't be able to tell the difference. Xbox has the exclusives I like, and the multiplatform titles look just as good as the other guy's. The value proposition is a whole other issue that shouldn't be discussed here anyway, but as far as the graphics go, it looks like a wash to me.

Tommy McClain

OK, so in motion, which would be up to 60 fps on the PS4 and never more than 30 on the XB1, you can't tell the difference... which could be because YouTube is limited to 30 fps...

in motion -->which matters most when playing a game<--

What you can actually see in the DF video is just how much difference there REALLY is in motion. Check the slowed version: the PS4 looks like high-speed footage slowed down, while the XB1 looks like old stuttery slow motion... THAT is what the YT video shows: better motion on the PS4.

You are making a strong case for the PS4 version of this game, and if you actually meant it, the PS4 looks to be the platform for you.

When it comes to resolution and textures, I would argue that the difference so far isn't that great, mostly thanks to both consoles having the same amount of easily accessible memory... That is, no weak PS3 versions...
 
I've decided, after looking at the Tomb Raider XB1 & PS4 comparison video, that in motion (which matters most when playing a game) I can't tell the difference between the two systems. If I had a 60" 1080p TV I might be able to tell from 1-2 feet away, but not the 10 feet away where my recliner sits. I give up caring about the graphical differences between these two systems. Guys, have fun arguing over every little pixel for the next 6-8 years, when 80-90% of the market won't be able to tell the difference. Xbox has the exclusives I like, and the multiplatform titles look just as good as the other guy's. The value proposition is a whole other issue that shouldn't be discussed here anyway, but as far as the graphics go, it looks like a wash to me.

Tommy McClain


All of this sounds like you're trying to justify your choice of home console.
And the fact that you feel the need to justify your choice is, by itself, very revealing.

But I do wonder why the devs did not implement dynamic resolution on the PS4 to stabilize framerate.


Dynamic resolution on the xbone was only used for cinematic cutscenes and not during gameplay.
I think they only used it in scenes where the camera gets close to Lara's TressFX-ed hair, which causes big performance drops (at least it did when I played the game on my HTPC with a GTX 660 Ti).
Perhaps the PS4 - which has ~45% more compute power than the xbone - simply wouldn't need to lower the rendering resolution in such scenes in order to maintain framerates well above 30 fps.
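As a side note, the "~45%" figure can be sanity-checked against the commonly cited GPU specs (18 CUs at 800 MHz vs 12 CUs at 853 MHz; 64 lanes and 2 ops per clock per CU is standard for GCN):

[code]
# Peak single-precision throughput from the commonly cited console specs.
def gpu_tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12  # lanes * FMA ops * clock

ps4 = gpu_tflops(18, 800)  # ~1.84 TFLOPS
xb1 = gpu_tflops(12, 853)  # ~1.31 TFLOPS
print(f"PS4 {ps4:.2f} vs XB1 {xb1:.2f} TFLOPS -> {ps4 / xb1 - 1:.0%} more")
[/code]

That prints roughly 41%, so "~45%" is in the right ballpark (it would be exactly 50% if both GPUs ran at the same clock).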
 
Dynamic resolution on the xbone was only used for cinematic cutscenes and not during gameplay.

In the Eurogamer downloadable high-quality 1080p PlayStation 4 vs Xbox One video (172.2 MB), at 4:55 even the "COVER" HUD text displayed during gameplay is clearly upscaled on the XB1 (right), while the PS4 "COVER" hint is native:

[Image: Tomb_Raider_Cuffed_hand_PS4_1080p_X1_900p_HUD_det.png]


At this precise moment Lara is running; it's 100% gameplay, confirmed. The textures, geometry and assets are also obviously upscaled; it's easily visible on the board right next to Lara.

Here is another HUD text comparison during a normal XB1 native image; the text on the XB1 is perfectly identical and looks as native as the PS4 text:

[Image: Tomb_Raider_Cuffed_hand_PS4_X1_1080p_HUD.png]


It's just about being exact. The game dynamically changes the resolution on the XB1 version during gameplay as well, and Digital Foundry missed it.
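For anyone who wants more than eyeballing: one crude way to back up an "it's upscaled" call is to measure high-frequency detail in a crop around the HUD text, since an image scaled up from 1600x900 to 1920x1080 has weaker column-to-column variation than a native render. A rough sketch; the file names and crop box are hypothetical, and video compression will muddy the numbers:

[code]
# Crude numeric version of the eyeball test: mean absolute horizontal
# gradient over a crisp region (e.g. the "COVER" HUD text). A frame
# upscaled from 1600x900 scores lower than a native 1920x1080 one.
import numpy as np
from PIL import Image

def horizontal_sharpness(path, box):
    """Mean absolute column-to-column difference; higher = sharper."""
    gray = np.asarray(Image.open(path).convert("L").crop(box), dtype=np.float64)
    return np.abs(np.diff(gray, axis=1)).mean()

crop = (900, 80, 1200, 140)  # (left, top, right, bottom) around the HUD text
for name in ("ps4_frame.png", "xb1_frame.png"):  # hypothetical frame grabs
    print(name, horizontal_sharpness(name, crop))
[/code]

A single frame proves little, but a consistently lower score on one platform across many frames is a reasonable hint that it was rendered below 1080p and scaled up.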
 
Dynamic resolution is smart. I thought we'd see it on almost every title this gen. Curious why they're not doing the same on the PS4. If you could keep 60 fps locked, why not?
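For anyone wondering what the technique actually involves, here is a toy sketch of a dynamic-resolution controller; the 60 fps target, thresholds, and step size are invented for illustration, not taken from any shipping engine:

[code]
# Toy dynamic-resolution controller: watch the previous frame's GPU time
# and nudge the render width down before the 60 fps budget is missed,
# back up when there is headroom. All numbers are illustrative.
TARGET_MS = 16.7     # 60 fps budget
HEADROOM_MS = 14.0   # only scale back up when comfortably under budget
MIN_W, MAX_W = 1280, 1920
STEP = 64            # coarse steps to avoid oscillating every frame

render_width = MAX_W

def adjust_resolution(gpu_frame_ms):
    """Call once per frame; returns the render width for the next frame."""
    global render_width
    if gpu_frame_ms > TARGET_MS and render_width > MIN_W:
        render_width -= STEP   # about to miss vsync: shed pixels
    elif gpu_frame_ms < HEADROOM_MS and render_width < MAX_W:
        render_width += STEP   # plenty of headroom: restore quality
    return render_width        # height follows at 16:9; HUD stays native

# e.g. a heavy TressFX close-up pushing GPU frame times up, then easing off:
for ms in (13.0, 15.5, 17.2, 18.0, 16.0, 13.5):
    w = adjust_resolution(ms)
    print(f"{ms:4.1f} ms -> {w}x{w * 9 // 16}")
[/code]

The HUD is composited at native resolution after the scaled 3D frame, which is exactly why the upscaled "COVER" text in the earlier screenshots is the giveaway.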
 
The in-game areas where both games drop frame rate (so you can attempt to meaningfully compare them) seem to show the PS4 being about 50% ahead.

That sort of fits the CU difference between the two platforms (as opposed to CPU, ROP, memory BW etc). I'm not saying the relative performance is solely down to GPU ALU/CU differences, but it does kind of stand out.

Neither of these games seems to make particularly good performance tradeoffs. The PS4 frame rate is spectacularly unstable, and the B0ne chases the PS4's resolution instead of focusing on maintaining 30 fps - which it could kinda do with doing.

Once again, the 1080p fans failed to notice that something they were examining frozen on a monitor - while smearing their noses against the screen and mouth-breathing - was actually running at "900p" and not "1080p". Again. It happened... again.

Both of these games should be using dynamic framebuffers. God damn 1080p and the god damn 1080 crowd worshipping that number. God damn. Don't they realise that a lower resolution at a higher frame rate can still give them about the same number of pixels to hoot and rub over?
Nuff said!!

I'd bet 100€ that people who played Ryse: Son of Rome would never, ever notice the game runs at 900p instead of 1080p if they weren't told. Such a clean-looking game...
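The arithmetic behind both the CU claim and the pixel-throughput claim above is easy to check (the CU counts are the commonly cited specs):

[code]
# 1) The CU ratio behind "about 50% ahead": 18 CUs vs 12 CUs.
print(f"CU ratio: {18 / 12:.2f}x")  # 1.50x

# 2) A lower resolution at a higher frame rate really can push more
#    pixels per second than "1080p":
p900_60 = 1600 * 900 * 60     # 86.4M pixels/s
p1080_30 = 1920 * 1080 * 30   # 62.2M pixels/s
print(f"900p60 vs 1080p30: {p900_60 / p1080_30:.2f}x the pixels per second")
[/code]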
 
Dynamic resolution is smart. I thought we'd see it on almost every title this gen. Curious why they're not doing the same on the PS4. If you could keep 60 fps locked, why not?


The dynamic res doesn't seem to actually be working even during the heaviest moments of gameplay, though (the framerate drops under 30).
 
The dynamic res doesn't seem to actually be working even during the heaviest moments of gameplay, though (the framerate drops under 30).
I think we would need John Carmack for that. He did an amazing job on both Doom 3 for Xbox 360 and Rage.

I could notice the resolution drops in Doom 3, but they weren't annoying at all, nor did they hinder the experience in any way; I could understand when they happened.

The result: a game with no framerate hiccups. Give me intelligent solutions like that any time of the day over compromised 1080p.
 
I've decided, after looking at the Tomb Raider XB1 & PS4 comparison video, that in motion (which matters most when playing a game) I can't tell the difference between the two systems. If I had a 60" 1080p TV I might be able to tell from 1-2 feet away, but not the 10 feet away where my recliner sits. I give up caring about the graphical differences between these two systems. Guys, have fun arguing over every little pixel for the next 6-8 years, when 80-90% of the market won't be able to tell the difference. Xbox has the exclusives I like, and the multiplatform titles look just as good as the other guy's. The value proposition is a whole other issue that shouldn't be discussed here anyway, but as far as the graphics go, it looks like a wash to me.

Tommy McClain

You were looking at the wrong videos. Please take a look at the Gamersyde videos: one is running at 60 fps most of the time, and the Xbox One is at half of that most of the time. Visually, that makes the PS4 version much more appealing, because motion fidelity is part of visual quality. As for still image quality, they are both very close, but you are getting double the amount of temporal information, and you'll easily notice that in the Gamersyde videos (but don't use VLC, as it has problems playing 60 fps videos on most systems).

The number of people who notice 60 fps is not as low as 10-20%... people do feel it. The iOS interface runs at 60 fps for a reason. Many phone makers ignored this, and their high-end phones had awful UIs running at 15-30 fps.
 
The number of people who notice 60 fps is not as low as 10-20%... people do feel it. The iOS interface runs at 60 fps for a reason. Many phone makers ignored this, and their high-end phones had awful UIs running at 15-30 fps.

Actually, iOS/iPhones have much less lag than all other phones (half!); that is probably what people perceive as low frame rate.
 
Nuff said!!

I'd bet 100€ that people who played Ryse: Son of Rome would never, ever notice the game runs at 900p instead of 1080p if they weren't told. Such a clean-looking game...

I'm sure if they had a 1080p version of the game to compare it to, they would notice. It may look great as it is, but it would look even better at 1080p - albeit not by a great amount, since the resolutions are incredibly close. The biggest benefit would come not from the higher number of pixels but from removing the need to scale and thus introduce blurring. Both resolutions would probably look near identical on a 720p display.
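Putting numbers on how close 900p and 1080p are (plain arithmetic):

[code]
# The total pixel count differs noticeably, but the per-axis gap is modest.
p900 = 1600 * 900    # 1.44 megapixels
p1080 = 1920 * 1080  # 2.07 megapixels
print(f"1080p has {p1080 / p900 - 1:.0%} more pixels")          # 44% more
print(f"...but only {1920 / 1600 - 1:.0%} more on each axis")   # 20% per axis
[/code]

Which is why the visible cost of 900p is mostly the scaler's blur rather than a dramatic loss of detail.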
 