Digital Foundry Article Technical Discussion Archive [2014]

If the PS4 version has a steadier framerate, why use trilinear filtering? I mean, what's the performance hit of anisotropic filtering vs. trilinear? I can't imagine it being very big.
 
If the PS4 version has a steadier framerate, why use trilinear filtering? I mean, what's the performance hit of anisotropic filtering vs. trilinear? I can't imagine it being very big.

It doesn't. PS4 version uses Anisotropic filtering:

PS4_XB1_Thief_textures.png


PS4 version has a texture streaming problem, yes, not a texture filtering one. When the textures are correctly loaded they are just as sharp as on the XB1 version.

Digital Foundry are wrong on this:

"the PS4 uses trilinear filtering to treat floor and wall textures"

and on the POM supposedly missing on XB1:

Both platforms benefit from POM, which was already noticed by someone on this forum.
 

Better late than never, Eidos Montreal's rebooted Thief has been lying in wait since as early as 2008, where multiple redesigns and staff switch-ups have cast a shadow over its eventual quality. Our behind-doors preview of the game at E3 2013 showed a solid Unreal Engine 3 stealth title running at 1080p30 on PS4 - albeit with detailing and effects-work reminiscent of a last-gen title. But with the retail codes of Xbox One, PS4 and PC versions to hand, we see that this is the least of the game's problems.

All versions fu**ed at birth...
 
Apparently, the texture streaming problem doesn't occur at all for some PS4 users when the game is fully installed on a new 1TB HDD.

It occurs mainly during cutscenes, in fact, so the texture pop-in is easily verifiable.

Could be a PS4 HDD-related deficiency. Is the included HDD the Achilles' heel of the PS4?
 
Apparently, the texture streaming problem doesn't occur at all for some PS4 users when the game is fully installed on a new 1TB HDD.

It occurs mainly during cutscenes, in fact, so the texture pop-in is easily verifiable.

Could be a PS4 HDD-related deficiency. Is the included HDD the Achilles' heel of the PS4?

Disk fragmentation? As far as we know the HDD is on a USB 3.0 bus for some (f*cked up?) reason.
But those with a PS4 would know that the console constantly records gameplay and doesn't really clean up, AFAIK, maybe not until it hits the limit. Anyway, there is ample room for fragmentation.
 
So basically AF > 1080p, a better framerate and more comprehensive use of POM? Is this a new low for DF?

Think about it: what's the point of having a very small pixel if all the surrounding pixels are the same color? Resolution loses a chunk of its purpose if it doesn't have good texture filtering, etc., to go along with it. For POM, it's possible they use it more selectively on the XB1 version, so maybe they disabled it on some surfaces that take up a lot of screen space on XB1 but left it on others, hence why DF thinks it's there on both versions.
 
I don't buy it. Some distant textures at certain angles, versus all screens having more aliasing (bird cages, thin structures, fences, etc.), scaling blur and lower frame rates? Pick your poison: they are both crappy ports of a last-gen game that was in dev hell. Maybe DF (Leadbetter) should stop editorializing, let the data speak for itself and stop feeding the console wars, but I know he likes the clicks.
 
with normal maps and shadows popping in later on the Sony platform especially
http://abload.de/img/thief28slmz.gif

Watch until the end of the gif; there are indeed shadow pop-ins on... XB1!

This whole article is a mess and should be completely rewritten, with Thief already pre-installed on a clean, non-fragmented HDD if possible on both consoles :D
 
So why exactly is this game such a problem running on XB1 and PS4? A lot of these issues seem like bad porting and less to do with the actual hardware. Nothing in Thief seems taxing enough to prevent at least a consistent 30fps.

I can understand that the devs may not have optimized for multithreading on CPUs, but this is ridiculous. I hope there's a patch or something.
 
I don't buy it. Some distant textures at certain angles versus all screens having more aliasing (bird cages, thin structure, fences, etc.), scaling blur and lower frame rates? Pick your poison, they are both crappy ports of a last gen game that was in dev hell.

No need for me to pick any poison, I'm going to play it on PC :) But depending on circumstances AF can definitely be more important than resolution. The color palette of Thief seems to imply that aliasing won't be as much of a detriment as texture filtering, given that as a thief you will be hugging walls and other surfaces, hence texture detail becomes important. Specular aliasing will suck regardless of resolution until new software techniques are employed. Both console versions seem to have a crappy frame rate, so whether it's bad or badder, either way it's bad. In any case I'd expect good AF to be very important to this particular game. It's an interesting test case, because AF is a function of the texture samplers, so sacrificing some resolution can let a machine go with stronger AF. Which is better? It'll depend on the game.
 
Nixxes did a great job on the PC and PS3/PS4 versions of TR... I wonder why they weren't put on the PS4 version again; they are doing the PC version after all.

It just seems like a lot of these devs weren't prepared to actually get optimal performance out of the next-gen consoles, and it's not a good situation to have. Hopefully everything gets straightened out.
 
Motion was decoupled from animation in the 2D sprite days: commonly, characters would move at a fixed step while the animation played at a lower rate, often holding different frames for different periods.

There was also a lot less input latency, because the screen wasn't double-buffered; you updated the sprites and scroll positions in the vblank.

I wrote a lot of Genesis titles and they all ran at 60Hz; it was uncommon for that not to be the case. I used to hate supporting NTSC, because it meant you got 3.3ms less frame time and, probably more importantly, less time in the vblank.
Some of the early arcade games' code is a work of art.

Things have changed a lot: back then it was about programming, now it's more about software engineering.

What games did you work on?
 
No need for me to pick any poison, I'm going to play it on PC :) But depending on circumstances AF can definitely be more important than resolution. The color palette of Thief seems to imply that aliasing won't be as much of a detriment as texture filtering, given that as a thief you will be hugging walls and other surfaces, hence texture detail becomes important. Specular aliasing will suck regardless of resolution until new software techniques are employed. Both console versions seem to have a crappy frame rate, so whether it's bad or badder, either way it's bad. In any case I'd expect good AF to be very important to this particular game. It's an interesting test case, because AF is a function of the texture samplers, so sacrificing some resolution can let a machine go with stronger AF. Which is better? It'll depend on the game.
But what's the performance penalty of AF? Again, the PS4 version runs better than the XB1 version, so why not include it on the PS4 version? It just doesn't make sense to me if the PS4 truly does lack AF.
 
It just seems like a lot of these devs weren't prepared to actually get optimal performance out of the next-gen consoles, and it's not a good situation to have. Hopefully everything gets straightened out.
Exactly, it seems next gen was just an afterthought, considering the game has been in development for more than 3 years. I could even go out on a limb and say the next-gen versions were direct ports from the PC version with limited optimization.

But what's the performance penalty of AF? Again, the PS4 version runs better than the XB1 version, so why not include it on the PS4 version? It just doesn't make sense to me if the PS4 truly does lack AF.
Penalty for AF is next to 0 on modern GPU architectures.
 
But what's the performance penalty of AF? Again, the PS4 version runs better than the XB1 version, so why not include it on the PS4 version? It just doesn't make sense to me.

It's purely a function of the texture samplers and cache. It's not really something a coder can do inefficiently; it's simply a state you enable or disable on the GPU, usually per material. So even if a machine has a much better CPU, more GPU compute, more RAM, etc., none of that will matter if it runs out of texture sampling power. In this case, them having to drop AF on PS4 implies they are running out of texture samplers... or it could imply that texture sampling is being used inefficiently in the rest of the chain. What makes judging AF really difficult is that its benefit is usually lost on craptacular, over-compressed YouTube videos, but if you see 16x AF and 4x AF in person on your own TV the difference is huge. On internet videos it will be tougher to see.
 