Digital Foundry Article Technical Discussion Archive [2015]

Dumb question, but how does one tell Shader Aliasing from standard Aliasing?
Shader aliasing is a subset of aliasing we began to see when pixel shaders were introduced. Until then, aliasing was almost always seen at the edges of geometry (which MSAA does a very nice job of removing, and which might be called "standard" aliasing). Shader aliasing happens when the shader produces high-frequency output (contrast varying within a single pixel or between adjacent pixels) that isn't sampled often enough per pixel. MSAA cannot deal with this. The solution is either to take more shading samples per pixel, or to use a shader that does not produce high-frequency output.

Please correct me without hurting my feelings if I'm wrong.
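
To make the "more samples per pixel" point concrete, here is a minimal Python sketch (my own illustration, not from the quoted explanation): a shading function with sub-pixel detail aliases at one shader evaluation per pixel, while averaging extra shading samples per pixel tames it. This extra shading work is exactly what MSAA does not do, since MSAA only adds coverage/depth samples at geometry edges.

```python
# Minimal sketch: a "shader" with detail finer than one pixel aliases at
# 1 shading sample per pixel, and is band-limited by averaging 8 samples.
# The sine term is a stand-in for any high-contrast shader output
# (specular sparkle, procedural detail, etc.); values are hypothetical.
import math

def shade(x):
    # Hypothetical shader output whose frequency rises with x.
    return 0.5 + 0.5 * math.sin(500.0 * x * x)

def render(width, samples_per_pixel):
    pixels = []
    for px in range(width):
        total = 0.0
        for s in range(samples_per_pixel):
            # Stratified sample positions inside the pixel.
            x = (px + (s + 0.5) / samples_per_pixel) / width
            total += shade(x)
        pixels.append(total / samples_per_pixel)
    return pixels

aliased = render(64, 1)    # 1 shading sample per pixel: noisy/shimmering values
resolved = render(64, 8)   # 8 shading samples per pixel: averages toward 0.5
print(aliased[-4:], resolved[-4:])
```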
 
.....On the console side I pay for the lack of flexibility in return for the devs doing all the hard work; I don't think paying for less flexibility and being mired in complexity is an attractive value proposition.

Devs shouldn't be building settings menus; they need to either downgrade or optimise until they achieve at least 30fps at whatever res.

And yet here we are with games in the low 20s/teens (or zero lol) and inconsistent frame rates when in combat. If anything, the devs are doing a terrible job, and it seems more often than not consumers are just paying for less flexibility with no benefit.

The reality is that not everyone has best-of-the-best dev houses, lengthy deadlines and top-shelf talent. Shipping a working game with half the stuff the devs had planned in a working state is hard enough. On top of that, you have analysis of how well the game plays so consumers can make better choices, which creates more pressure between the dev, publisher and platform holder. If devs are "encouraged" to push a system further than it can go to appease the PR people, then they aren't working in the consumers' best interests, so I think having performance options is a sane solution.

Having options isn't a bad thing no matter how anyone tries their hardest to spin it as a negative.
 
...

Faster hard drives boost Xbox One Fallout 4 performance
...

Of more interest to me is that it seems to confirm that the internal SATA port on the PS4 is throttled somehow, as the XB1 loads faster from an SSD over USB 3.0 than the PS4 does (although the minor CPU clock speed advantage may play a part here too, the gulf seems too wide IMO). Goddammit Sony, disable the I/O speed limit!

Not exactly. As you can see, the results are in fact quite balanced: with both machines equipped with an SSD, the PS4 still loads faster in 3 tests and the XB1 loads faster in the 3 other tests:

[Image: table of PS4 and Xbox One loading times with stock HDD and SSD across the six tests]


It's more like, on average, the XB1 gains more from the SSD than the PS4. The XB1 gains from 42% to 61% (average: 50%) reduced loading times versus 28% to 48% (average: 38%) for the PS4. It's worth pointing out that the PS4 OS always reserves some I/O bandwidth from the internal drive (even an SSD).
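
For clarity, the arithmetic behind those percentages is just (HDD time − SSD time) / HDD time, averaged over the tests. A minimal Python sketch using made-up timings rather than DF's actual per-test numbers:

```python
# Sketch of the "reduced loading time" arithmetic, with hypothetical timings
# (the real per-test numbers are in the table above, not reproduced here).
def reduction(hdd_seconds, ssd_seconds):
    # Fractional cut in loading time from moving to an SSD.
    return (hdd_seconds - ssd_seconds) / hdd_seconds

# Hypothetical (test, HDD load, SSD load) timings for one console:
tests = [("Test A", 60.0, 30.0), ("Test B", 45.0, 22.0), ("Test C", 50.0, 28.0)]

cuts = [reduction(hdd, ssd) for _, hdd, ssd in tests]
print([f"{c:.0%}" for c in cuts], f"average: {sum(cuts) / len(cuts):.0%}")
```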
 
Not exactly. As you can see, the results are in fact quite balanced: with both machines equipped with an SSD, the PS4 still loads faster in 3 tests and the XB1 loads faster in the 3 other tests:

A bit misleading the way you look at it: the XB1 loads faster on average across all (those) tests. The PS4 wins its 3 tests by a combined 7.8 seconds, the XBO wins its 3 by a combined 11.1 seconds. You'd probably need a LOT of tests to declare a definitive winner, but the basic results they give are faster for the XBO.
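
The distinction being drawn here is win count versus net total. A tiny Python sketch with made-up per-test deltas (not DF's figures) just to show how a 3-3 split can still leave one box faster overall:

```python
# Hypothetical per-test deltas in seconds: positive = PS4 faster, negative = XB1 faster.
deltas = [2.0, 3.0, 2.8, -4.0, -3.5, -3.6]

ps4_margin = sum(d for d in deltas if d > 0)     # combined seconds PS4 wins by
xb1_margin = -sum(d for d in deltas if d < 0)    # combined seconds XB1 wins by
net = sum(deltas)                                # negative => XB1 faster overall
print(ps4_margin, xb1_margin, net)
```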

I keep forgetting it's not SATA 3 on these boxes for some reason


Yeah, I think that's why the XBO can sometimes benefit from an external drive over USB 3.
 
A bit misleading the way you look at it: the XB1 loads faster on average across all (those) tests. The PS4 wins its 3 tests by a combined 7.8 seconds, the XBO wins its 3 by a combined 11.1 seconds. You'd probably need a LOT of tests to declare a definitive winner, but the basic results they give are faster for the XBO.

This is a parody of fan boys, right? You aren't seriously trying to claim a win or twist these results around, are you?
 
Just the facts ma'am.

Apparently it's OK for Globalisateur to parse the numbers with great specificity, but not me.

Also, I already caveated it. Overall, Xbox is a tad faster by those numbers though.
 

... Why did they ship it like this? I hear it is just as bad on the PC side as well; thankfully I don't really care about COD to begin with.
 

... Why did they ship it like this? I hear it is just as bad on the PC side as well; thankfully I don't really care about COD to begin with.
Probably because of a fixed release date.
So now we are at the point where resolution is lowered across the board in multiplatform games, at least when they try to hit 60fps.
Framerate on XB1 seems to be really bad for a title that dynamically drops res like this.
 
Probably because of a fixed release date.
So now we are at the point where resolution is lowered across the board in multiplatform games, at least when they try to hit 60fps.
Framerate on XB1 seems to be really bad for a title that dynamically drops res like this.
It dynamically drops res, but it runs at the lower end (1280x900) most of the time according to DF (sometimes dropping to 1200x900).
What I find even more odd is that, while the PS4 version has a slightly better framerate than the XB1 version, it's still far from a locked 60fps; and while it also has a dynamic res, it stays at the full 1920x1080 most of the time. Why not drop the res more often if they're not hitting the target 60fps?

It seems like they didn't put nearly as much effort into optimizing the SP as the MP, where the framerate appears to be solid on both consoles.
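
One hedged guess at the "why not drop the res more often" question: dynamic-res controllers are typically driven by measured GPU frame time, so CPU-bound dips never trigger a drop. A generic Python sketch of that kind of controller (not Treyarch's actual logic; names, clamps and thresholds are assumptions):

```python
# Generic dynamic-resolution controller sketch: scale render width by the
# ratio of the frame-time budget to the measured GPU frame time, clamped.
# Note it only watches GPU time: a CPU-bound dip leaves gpu_frame_ms under
# budget, so the controller keeps full width even while overall fps misses 60.
TARGET_MS = 16.7            # 60fps budget
MIN_W, MAX_W = 1280, 1920   # e.g. 1280x900 .. 1920x1080 render widths

def next_render_width(current_width, gpu_frame_ms):
    scale = TARGET_MS / max(gpu_frame_ms, 1e-3)
    proposed = int(current_width * scale)
    return max(MIN_W, min(MAX_W, proposed))

print(next_render_width(1920, 17.5))   # slightly GPU-bound -> small drop
print(next_render_width(1920, 22.0))   # heavily GPU-bound -> bigger drop
print(next_render_width(1920, 15.0))   # GPU under budget -> stays at 1920
```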
 
If it were possible to electronically pixel count every single frame it'd be a great addition to the comparisons to show actual resolution for any given second.

They'd then be able to show pixels/second and the percentage difference between platforms on a like-for-like scene.
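
Assuming an automated per-frame resolution estimator existed, the bookkeeping side is straightforward. A small Python sketch with hypothetical frame lists (the estimator itself is the hard part and isn't implemented here):

```python
# Given per-frame (width, height) estimates from a hypothetical automated
# pixel counter, compute pixels rendered per second and the relative gap.
def pixels_per_second(frame_resolutions, fps=60):
    # frame_resolutions: one (width, height) entry per captured frame
    total = sum(w * h for w, h in frame_resolutions)
    seconds = len(frame_resolutions) / fps
    return total / seconds

ps4_frames = [(1920, 1080)] * 50 + [(1600, 1080)] * 10   # hypothetical clip
xb1_frames = [(1280, 900)] * 55 + [(1200, 900)] * 5      # hypothetical clip

ps4_pps = pixels_per_second(ps4_frames)
xb1_pps = pixels_per_second(xb1_frames)
print(f"PS4 pushes {ps4_pps / xb1_pps:.2f}x the pixels per second of XB1 in this clip")
```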
 
If it were possible to electronically pixel count every single frame it'd be a great addition to the comparisons to show actual resolution for any given second.

They'd then be able to show pixels/second and the percentage difference between platforms on a like-for-like scene.
It can be difficult because aliasing isn't limited to just geometry sampling.
 
If we assume we have two identical output renders (identical in geometry), could you compare the two to see if one had higher fidelity than the other? More a case of spot the difference rather than quantify the difference. Ultimately most only want the numbers for ePeen-waving rights anyhow.
 
Is counting the aliasing steps the only method of determining render resolution?
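
For reference, the basic step-counting idea looks roughly like this: follow a hard, slightly sloped edge down the frame, record which output column it sits in on each scanline, and compare the number of distinct steps to the horizontal distance covered. A Python sketch only, not DF's actual tooling; the helper and the edge positions are made up:

```python
# Sketch of edge-step counting: each distinct recorded column corresponds to
# one column of the native render, so (steps - 1) divided by the horizontal
# span in output pixels approximates the native/output width ratio.
def estimate_native_width(edge_columns, output_width):
    # edge_columns: output column of the edge on each consecutive scanline
    distinct_steps = len(set(edge_columns))
    span = max(edge_columns) - min(edge_columns)
    if span == 0:
        return None  # edge is perfectly vertical, nothing to measure
    return round(output_width * (distinct_steps - 1) / span)

# Hypothetical staircase: the edge advances ~1.5 output columns per step,
# as it would for a 1280-wide render scaled to a 1920-wide output.
example_edge = [int(i * 1.5) for i in range(90)]
print(estimate_native_width(example_edge, 1920))   # roughly 1280
```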
 