Digital Foundry Article Technical Discussion Archive [2013]

Entropy said:
Current iPhone/iPad is roughly half the GFlops, in a more flexibly programmable GPU with better bandwidth/ALU ratio while using an extremely bandwidth efficient architecture.

Next year will see a transition to 20nm (which is where iOS devices will exceed PS360 graphics capabilities), and the year after that (perhaps more significantly) a transition to 16nm FinFET.

And? Even if the current iPad sported 360/PS3 performance, you wouldn't see 360/PS3-quality games for years.

iPad/Android doesn't provide a market for $60 titles. And there aren't many iOS/Android titles that have sold enough multiples of a typical AAA console title to justify the budgets needed to produce those kinds of games.
 
But if the renderer -> HDMI path is at fault, why doesn't all the other footage on the web have this issue? Does HDMI provide colour-range sensing? (I mean, maybe the Elgatos do report that they're limited RGB, the HDMI link tells the renderer it's limited and everything is fine... and the other way around, the HDMI port still uses limited but reports full range?)

Is there any analysis of other footage if it's limited or full range RGB?
I suspect it's a simple bug in the X1 firmware. If you set the range to auto or limited, the renderer and HDMI driver both use 16-235, and everything works as expected. If you set the range to full, then the renderer uses full, but the HDMI driver mistakenly negotiates 16-235.

I'd be very surprised if it isn't fixed by launch.
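
To illustrate what I mean about the mismatch, here's a minimal Python sketch (made-up helper names, nothing from the actual firmware) of what happens when full-range output gets interpreted as limited range:

Code:
# Minimal sketch (assumed behaviour, not the actual X1 firmware): what happens
# when a renderer outputs full-range RGB (0-255) but the link/display treats
# the signal as limited range (16-235).

def full_to_limited(v):
    """Compress a full-range value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    """Expand a limited-range value (16-235) back to full range, clamping."""
    return max(0, min(255, round((v - 16) * 255 / (235 - 16))))

# Correct pipeline: renderer uses limited, display expands it -> roughly lossless round trip.
print(limited_to_full(full_to_limited(128)))   # ~128

# Suspected bug: renderer already sent full range, but the sink still expands it
# as if it were limited -> blacks below 16 crush to 0, whites above 235 clip.
for v in (0, 16, 128, 235, 255):
    print(v, "->", limited_to_full(v))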
 
Wow. Given the circumstances and time constraints, what happened with DF should be expected from time to time.

Shit happens. You can't expect DF to be perfect, especially considering that they are producing analyses on the web for gamers, not academic journals.
 
Wow. Given the circumstances and time constraints, what happened with DF should be expected from time to time.

Shit happens. You can't expect DF to be perfect, especially considering that they are producing analyses on the web for gamers, not academic journals.

Yeah, but this is life and death. What would happen if people actually had to wait for a product to be released and usable before they decided if they were going to spend their hard earned money on it?!
 
http://www.eurogamer.net/articles/digitalfoundry-resolutiongate-the-fallout



It's worth remembering that Yukon was supposed to have >32MB EDRAM... whilst most targets were "matched" to varying degrees, that one was simply missed.


In the article, they seem to make the same fundamental mistake that everyone writing about BF4 is making. Their knowledge that PS4 has higher resolution has somehow convinced them that the PS4 version LOOKS better when that is blatantly not the case. In every comparison that I could find, the XB1 LOOKS substantially better. My only explanation is some sort of confirmation bias. To avoid that mistake, I made my choice before I knew which version was which.
 
Sure. My point is that the X1 histogram lacked the lower end of the PC.

http://www.eurogamer.net/articles/2013-10-29-battlefield-4-next-gen-vs-pc-comparison-gallery

Used images 46 to 48.
I think that's a poor choice. The textures and possibly the lighting are different, and it's hard to match gameplay segments. Images 124 to 126 are from about the same frame in a cutscene, although the difference with the PC could be a sudden light hitting the scene. The fact they haven't got exactly the same image to compare is a disappointment. Still, in this example the PS4's histogram matches the PC's more closely than the XB1's, although it's pretty much capped at 220. That's remarkably like a limited RGB range normalised to zero, but there are odd brighter pixels confusing the situation.

I wonder if the game is applying effects (tone mapping etc.) in post in the wrong colour space? I reckon there's about a 0.75-pixel gaussian blur or equivalent on the PS4, the same as I see in Ryse, also at 900p. :???:
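
For what it's worth, this is the kind of quick check I mean: a rough Python sketch (the filename is a placeholder) that loads a capture and looks at whether its values span 0-255 or sit inside a 16-235-style window.

Code:
# Rough sketch of the range check being discussed: load a captured frame and
# see whether its pixel values use the full 0-255 range or a limited window.
# "capture.png" is a placeholder filename.

import numpy as np
from PIL import Image

img = np.asarray(Image.open("capture.png").convert("RGB"))

lo, hi = int(img.min()), int(img.max())
hist, _ = np.histogram(img, bins=256, range=(0, 256))

print(f"min={lo}, max={hi}")
print("fraction of pixels below 16: ", hist[:16].sum() / img.size)
print("fraction of pixels above 235:", hist[236:].sum() / img.size)
# A capture that never goes below ~16 or above ~235 suggests limited-range
# content; a hard cap near 220, as noted above, hints at something odd in the
# capture chain or the game's post-processing.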
 
In the article, they seem to make the same fundamental mistake that everyone writing about BF4 is making. Their knowledge that PS4 has higher resolution has somehow convinced them that the PS4 version LOOKS better when that is blatantly not the case. In every comparison that I could find, the XB1 LOOKS substantially better. My only explanation is some sort of confirmation bias. To avoid that mistake, I made my choice before I knew which version was which.

They are both essentially identical other than one having a 50% higher resolution and marginally better frame rate, while the other has a very heavy sharpening filter applied. The remaining differences were caused by problems with DF's capture equipment, will be sorted by launch (missing AO on xb1) or are simply due to the playthroughs being different.

So while I respect your opinion, it's still just that. An unbiased analysis would suggest the PS4 has quite a significant advantage.
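
For reference, a quick back-of-envelope check of that resolution gap, assuming the widely reported BF4 render resolutions (1600x900 on PS4, 1280x720 on XB1):

Code:
# Quick check of the "50% higher resolution" figure, assuming the widely
# reported BF4 render resolutions (PS4 1600x900, XB1 1280x720).
ps4_pixels = 1600 * 900          # 1,440,000 pixels per frame
xb1_pixels = 1280 * 720          #   921,600 pixels per frame
print(ps4_pixels / xb1_pixels)   # ~1.56, i.e. roughly 50-56% more pixels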
 
In the article, they seem to make the same fundamental mistake that everyone writing about BF4 is making. Their knowledge that PS4 has higher resolution has somehow convinced them that the PS4 version LOOKS better when that is blatantly not the case. In every comparison that I could find, the XB1 LOOKS substantially better. My only explanation is some sort of confirmation bias. To avoid that mistake, I made my choice before I knew which version was which.
Umm, DF admitted to messing up the captures. DF, as well as all of the other media who have seen both in person, has said that the PS4 is the superior version. I guess everyone is biased.

Even if you went off the DF captures, it's arguable at best that the X1 looks visually better. If you prefer fake detail, ringing and aliasing from sharpening, and severe black crush with oversaturated colors, that's your opinion.

Claiming bias is not allowed here. I've been suspended for doing the same thing. :???:
 
I think that's a poor choice. The textures and possibly lighting are different.

Well... that's exactly my point: the assets in the PS4 version are quite different in many shots (16 to 18, the ground texture), and a few people are spinning that as "missing clouds". I looked at the ones you pointed out; the X1 still has a huge black bar on the left, but then we already know that to be true.

Would be interesting to see the X1 with SSAO applied.
 
They are both essentially identical other than one having a 50% higher resolution and marginally better frame rate, while the other has a very heavy sharpening filter applied. The remaining differences were caused by problems with DF's capture equipment, will be sorted by launch (missing AO on xb1) or are simply due to the playthroughs being different.

So while I respect your opinion, it's still just that. An unbiased analysis would suggest the PS4 has quite a significant advantage.


Did you miss where I said that I checked multiple sources? The latest I checked, Ars, shows the XB1 with a substantially better appearance. Your last comment is the point of my original post. Sure, there are numerous technical advantages in the PS4 version. The XB1 is also doing something, but that is unquantified. The gestalt is a much better-looking XB1 version. I see two reasons for this difference: 1) the PS4 seems to have a piss-poor scaler, and 2) textures on the XB1 look much better, perhaps because of 1. This whole pre-launch period has been an interesting lesson in the power of memes to degrade all rationality.
 
I see two reasons for this difference: 1) the PS4 seems to have a piss-poor scaler, and 2) textures on the XB1 look much better, perhaps because of 1. This whole pre-launch period has been an interesting lesson in the power of memes to degrade all rationality.

I'm wondering where you got any information about scalers. There is no evidence that scaling has anything to do with it; all the evidence points to DICE's choice of post-processing. Both consoles have hardware scalers, both probably indistinguishable, and probably both AMD solutions.
 
Did you miss where I said that I checked multiple sources? The latest I checked, Ars, shows the XB1 with a substantially better appearance. Your last comment is the point of my original post. Sure, there are numerous technical advantages in the PS4 version. The XB1 is also doing something, but that is unquantified. The gestalt is a much better-looking XB1 version. I see two reasons for this difference: 1) the PS4 seems to have a piss-poor scaler, and 2) textures on the XB1 look much better, perhaps because of 1. This whole pre-launch period has been an interesting lesson in the power of memes to degrade all rationality.

The XB1 version is upscaling with an aggressive sharpening filter; the PS4 version does not appear to be doing that. It's very unlikely that they are using the PS4's hardware scaler - the HUD will almost certainly be native, so the scaling will be done by the game. If you find sharpening more visually appealing, fine - set your TV to over-sharpen everything. However, some (arguably most) people don't like this look being forced upon them. Otherwise that's the key difference, and it's not really that important compared to the significant resolution difference. This is a tech forum - ultimately we aren't all that interested in subjective opinion, and we don't look kindly on it being used to prove 'confirmation bias' in the media.
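
To illustrate what that post chain roughly looks like, here is a generic Python/Pillow sketch with made-up filenames and filter settings, not DICE's actual pipeline:

Code:
# Illustrative sketch of the described post chain: upscale a sub-native frame
# to 1080p, then run an aggressive unsharp mask over it. The filename and the
# filter parameters are placeholders, not anything taken from the game.

from PIL import Image, ImageFilter

frame = Image.open("frame_720p.png")                      # 1280x720 source
upscaled = frame.resize((1920, 1080), Image.BILINEAR)     # scaler step
sharpened = upscaled.filter(
    ImageFilter.UnsharpMask(radius=2, percent=200, threshold=2)
)
sharpened.save("frame_1080p_sharpened.png")
# The sharpening restores apparent edge contrast ("fake detail") but also
# introduces the ringing and aliasing mentioned above.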

[edit] It seems my fellow moderators were less patient.
 
All you have seen are compressed streams, mostly at 30fps, unless you were at the event. The aliasing difference is evident even in those. The "sharpening" on Xbox makes some textures look even "sharper" than in the 1080p PC shots.

But let's just wait for the day-one patch with all the fixes for both versions. DICE has surely seen this article and can look at issues on both platforms.
 
The XB1 version is upscaling with an aggressive sharpening filter; the PS4 version does not appear to be doing that. It's very unlikely that they are using the PS4's hardware scaler - the HUD will almost certainly be native, so the scaling will be done by the game.
The PS4 has display planes like the XB1. It's unclear whether devs have access to the display planes for the HUD, or if the second plane is reserved for the OS. Without that knowledge, there's an argument that the scaler in that component is upscaling the game, compositing it with the UI, and doing a poor job.
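
Conceptually something like this: a Pillow sketch with placeholder filenames, just to illustrate the two-plane idea, not how the hardware actually does it.

Code:
# Hedged sketch of the two-plane idea: the game renders at 900p, one plane
# scales it to 1080p, and a native 1080p HUD with alpha is composited on top.
# Filenames and the use of Pillow stand in for whatever the hardware does.

from PIL import Image

game = Image.open("game_900p.png").convert("RGBA")        # 1600x900 render
hud  = Image.open("hud_1080p.png").convert("RGBA")        # native 1080p, alpha

game_scaled = game.resize((1920, 1080), Image.BILINEAR)   # display-plane scaler
final = Image.alpha_composite(game_scaled, hud)           # HUD stays pin-sharp
final.save("output_1080p.png")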
 
http://www.eurogamer.net/articles/digitalfoundry-resolutiongate-the-fallout



It's worth remembering that Yukon was supposed to have >32MB EDRAM... whilst most targets were "matched" to varying degrees, that one was simply missed.

Perhaps the virtual OS is contributing to the downgrade. Not the same thing, I would guess, but what do you guys think of this?

As with any server virtualization software, there is a certain amount of overhead associated with running the virtualization code required to support guest operating systems running on Hyper-V. The following list summarizes the overhead associated with specific resources when running guest operating systems on Hyper-V virtual machines:
http://msdn.microsoft.com/en-us/library/cc768536
 
Perhaps the virtual OS is helping to downgrade. Not the same I would guess,but what do you guys think of this?

While it's possible, I wouldn't expect any overhead from virtualizing the GPU to manifest itself as an effect on fill rate or shader performance. It's much more likely to add overhead to draw calls, which would manifest itself as a CPU-time issue where reducing resolution wouldn't help.
Even there, given the hardware has multiple ring buffers, there's no real need for virtualization to introduce overhead.
I would surmise (having never used the hardware) that either the issue is the way the engines interact with the ESRAM, resulting in a significant bandwidth issue, or just the limitations of the 16 ROPs.
If it's the former, it's likely that some of the deficit can be reclaimed as engines are targeted at the hardware; if it's the latter, it's likely things won't improve dramatically.
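
Some rough arithmetic on the ROP point, for context. This uses the publicly reported ROP counts and clocks and assumes one pixel write per ROP per clock; treat it as back-of-envelope only.

Code:
# Back-of-envelope fill-rate arithmetic, using publicly reported figures
# (XB1: 16 ROPs at 853 MHz, PS4: 32 ROPs at 800 MHz) and assuming one pixel
# write per ROP per clock. Real frames spend this on blending, multiple
# render targets, shadow maps and post passes, so the ratio is the point,
# not the absolute numbers.

xb1_fill = 16 * 853e6          # ~13.6 Gpixels/s peak
ps4_fill = 32 * 800e6          # ~25.6 Gpixels/s peak

print(f"XB1 peak fill rate: {xb1_fill / 1e9:.1f} Gpix/s")
print(f"PS4 peak fill rate: {ps4_fill / 1e9:.1f} Gpix/s")
print(f"ratio: {ps4_fill / xb1_fill:.2f}x")   # roughly 1.9x in the PS4's favour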
 
Perhaps it is a sign of things to come? Azure collapses. Again! (twice in one year... ouch)

Blue Sky of Death
That made me :LOL:



While it's possible, I wouldn't expect any overhead from virtualizing the GPU to manifest itself as an effect on fill rate or shader performance. It's much more likely to add overhead to draw calls, which would manifest itself as a CPU-time issue where reducing resolution wouldn't help.
Even there, given the hardware has multiple ring buffers, there's no real need for virtualization to introduce overhead.
I would surmise (having never used the hardware) that either the issue is the way the engines interact with the ESRAM, resulting in a significant bandwidth issue, or just the limitations of the 16 ROPs.
If it's the former, it's likely that some of the deficit can be reclaimed as engines are targeted at the hardware; if it's the latter, it's likely things won't improve dramatically.

Thanks for the info :yep2:
 