Digital Foundry Article Technical Discussion Archive [2014]

Nor do they hold any licensing over PlayStation, yet we still see games that are a match on both systems regardless of the power differential.

I realise it probably won't be a popular view, and there may not be anything in it, but it's very odd indeed.

That's because pubs are in the business of making money, not expressing the maximum potential of every piece of hardware they serve.

You know the biggest reason why pubs and devs aren't usually motivated to work in such a fashion?

Because while gamers will pay more for more hardware performance, they haven't shown the willingness to pay more for software performance.

Why should a $50 PC game provide more than a $60 PS4/XB1 game? Allowing more resolution, AA, and higher-res textures is easily accommodated. But unless the tiny fraction of gamers that sport high-end hardware are willing to fund the investment needed by devs to push that hardware, don't expect wine from water.

It's like buying a $25 steak from a restaurant and expecting its quality to depend on whether you are eating it off a paper plate or fine china.
 
In the digital age, maybe devs/pubs should offer optional performance-dependent extras? For an extra £5 or whatever, unlock...um...some CPU taxing functionality. You could certainly add graphics options as buy-ins, and see what your users are willing to spend. I'd expect just aggro though.
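
(A quick sketch of how such buy-in options might be gated. Everything here, from the option names to the pack IDs and the entitlement check, is hypothetical rather than any real storefront API.)

```python
# Hypothetical sketch of buy-in graphics/CPU options: settings beyond the
# base tier are only selectable if the player owns a matching unlock.
# All names here (PAID_OPTIONS, the pack IDs) are invented, not taken
# from a real storefront API.

PAID_OPTIONS = {
    "supersampling": "gfx_pack",       # the £5 graphics buy-in
    "advanced_cloth_sim": "cpu_pack",  # the CPU-taxing functionality
}

def can_enable(option, owned_packs):
    """Free options are always available; paid ones need the matching pack."""
    required = PAID_OPTIONS.get(option)
    return required is None or required in owned_packs

assert can_enable("vsync", set())                 # free option, always on offer
assert not can_enable("supersampling", set())     # paid option, not purchased
assert can_enable("supersampling", {"gfx_pack"})  # paid option, unlocked
```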
 
Well... I wouldn't have thought that Sony doesn't pressure publishers for parity too; it's just that MS is the one we've heard about doing it "publicly".

But you can only apply pressure if you have leverage. And MS lost a lot of it in recent months.

Either way though... the obvious discrepancy between the 7770 and X1 results is... about what I would expect from DF these days. When "fairly steady 30Hz" is suddenly "significantly slower than" the X1... and the X1 analysis is "often giving us a performance at the lower end of the 20-30 range", then I don't really know how to respond.

We've seen this before, when a minute drop in framerate or disengaging vsync for a couple of frames made a "clear winner" in the past (and I am not pointing to any specific analysis here; it happened in favor of both systems, PS3 as well as 360)... it's not "scientific", really.

Don't get me wrong, I like reading these analyses, but at times their verdicts seem out of place for what the measurements were. As if they need to say X or Y wins, even if the systems under test aren't giving them results that warrant it. Just to start yet another flamewar about it (maybe that is their intention, though^^, but that doesn't bode well for longevity, I guess).
 
Unless DR3 on PC was developed with old engineering samples and unstable versions of Windows, DX, and GPU drivers, I am hard pressed to see it as a fair comparison of the hardware.

The launch titles weren't dev'd under the best of circumstances.
 
In the digital age, maybe devs/pubs should offer optional performance-dependent extras? For an extra £5 or whatever, unlock...um...some CPU taxing functionality. You could certainly add graphics options as buy-ins, and see what your users are willing to spend. I'd expect just aggro though.

SHHHHHHHHH

Don't give EA any ideas!
 
Unless DR3 on PC was developed with old engineering samples and unstable versions of Windows, DX, and GPU drivers, I am hard pressed to see it as a fair comparison of the hardware.

The launch titles weren't dev'd under the best of circumstances.

If you want to talk about fair, you should never compare console ports. Do you really think the same amount of resources is used to optimize performance on the Xbox as on the PC version?

Also, it's not like they didn't release a huge patch for the Xbox version, so they had time and worked a lot on it after release, but it didn't change much!? (I think Digital Foundry tested it.)
 
If you want to talk about fair, you should never compare console ports. Do you really think the same amount of resources is used to optimize performance on the Xbox as on the PC version?

Also, it's not like they didn't release a huge patch for the Xbox version, so they had time and worked a lot on it after release, but it didn't change much!? (I think Digital Foundry tested it.)


Just because it's not exactly equal doesn't mean any attempt at a comparison can be devoid of any sense of fairness.

If you want to do that, you should add the caveat that the comparison has no bearing on the hardware involved.

The patch was released two months later to fix bugs and instability, which highlights how bad the launch code was. I highly doubt that if DR3 had targeted a holiday '14 launch instead of holiday '13, we would have gotten what we got 8-10 months ago.
 
Do you have any actual examples of 360 games getting nerfed with the purpose of looking just like the PS3 version?

Or are you referring to the extra resources needed to get the PS3 versions up to par?

Because it's funny that there are so many examples of the 360 coming out in front of the PS3 in the DF tests. So whoever was supposed to make the games look alike obviously didn't do a great job.

Early on, parity was quite literally impossible between the two. Later on it became possible to a point, if you just let the 360 idle more, but even then it was difficult to achieve because of the myriad limits and bottlenecks on the PS3 (which you would see in the DF tests), vast online feature differences, less memory, and so on. But eventually they were close enough for the masses to consider them "the same". It's not something that any dev wants to willingly do, but eventually we realized it was the smart business decision. I won't give any game examples, but it happened a *lot* last gen. Hell, I made a career of it; that's all I did for my last two years in gaming, working on "parity".

You are seeing the same thing happen now, just the other way around. Resolution is being thrown in as a bone for the moment, but I'm curious if even that difference will go away in a year or two. There's always indirect pressure from the console makers, like last gen where Sony would often ask more about the 360 version than their own, asking where differences were, why they were there, when they would get fixed, and so on. There was never direct pressure to make things equal, but pressure was there. Combine that with modern-day daily gamer outrage, and the game makers are just better off making multi-platform games look and run the same. Last gen that was very difficult because of the vast differences between the two machines; this gen it should be much easier.


In the digital age, maybe devs/pubs should offer optional performance-dependent extras? For an extra £5 or whatever, unlock...um...some CPU taxing functionality. You could certainly add graphics options as buy-ins, and see what your users are willing to spend. I'd expect just aggro though.

Yeah...that would go over real well :)
 
Early on, parity was quite literally impossible between the two. Later on it became possible to a point, if you just let the 360 idle more, but even then it was difficult to achieve because of the myriad limits and bottlenecks on the PS3 (which you would see in the DF tests), vast online feature differences, less memory, and so on. But eventually they were close enough for the masses to consider them "the same". It's not something that any dev wants to willingly do, but eventually we realized it was the smart business decision. I won't give any game examples, but it happened a *lot* last gen. Hell, I made a career of it; that's all I did for my last two years in gaming, working on "parity".

You are seeing the same thing happen now, just the other way around. Resolution is being thrown in as a bone for the moment, but I'm curious if even that difference will go away in a year or two. There's always indirect pressure from the console makers, like last gen where Sony would often ask more about the 360 version than their own, asking where differences were, why they were there, when they would get fixed, and so on. There was never direct pressure to make things equal, but pressure was there. Combine that with modern-day daily gamer outrage, and the game makers are just better off making multi-platform games look and run the same. Last gen that was very difficult because of the vast differences between the two machines; this gen it should be much easier.




Yeah...that would go over real well :)

We've had developers pipe in over the past year, when the power differences between XB1 and PS4 were realized, to say that developers don't deliberately down-tune for parity, which BTW matches much of what we saw last generation. But now you're going to tell us what we saw and heard is not reality... :LOL:
 
We've had developers pipe in over the past year, when the power differences between XB1 and PS4 were realized, to say that developers don't deliberately down-tune for parity, which BTW matches much of what we saw last generation. But now you're going to tell us what we saw and heard is not reality... :LOL:

You don't down-tune, you just don't take advantage of the hardware as much as you could. It's a win all around because it's cheaper and easier dev-side, keeps gamers happier, and keeps console makers happier. Being "equal" puts you on fewer shit lists. Plus, no one in their right mind would tell you what is really going on in a public interview. C'mon now, do the math already. Interviews are PR pieces like anything else; they are meant to increase your public exposure in a good light, not to be used as a platform to rant about whatever complaints and issues you may be having. If you want that, then you go to bars with fellow devs and talk shit over drinks. That's when you hear what people really think... and no, none of that ever makes it public.
 
We've had developers pipe in over the past year, when the power differences between XB1 and PS4 were realized, to say that developers don't deliberately down-tune for parity, which BTW matches much of what we saw last generation. But now you're going to tell us what we saw and heard is not reality... :LOL:

Deliberately down-tune? I think not. But up-tuning, that's pretty much a given.
 
If you want that, then you go to bars with fellow devs and talk shit over drinks. That's when you hear what people really think... and no, none of that ever makes it public.

You think? Valve were pretty open about their feelings on working with the PS3 last gen. Also, I'm pretty sure a lot of developers were pretty open about their views, so much so that it's common knowledge among gamers that the PS3 was a total bitch to work on.

Thinking about it, it's also quite clear this gen that the Xbox One's ESRAM size is particularly difficult to work with.
 
Again, most third-party titles looked and ran better on the 360 for most of the last generation. Not optimizing makes sense, but then again, when you look at 360 exclusives, they weren't visually head and shoulders above the multi-platform versions, with few exceptions.

Similarly, we see 900p versus 1080p quite a bit this generation, which seems odd if developers are truly producing titles to the least common denominator... and I'd add that when titles have shown similar performance and people have questioned parity, the most common response has been that developers don't do stuff like that.
 
Valve is big enough to speak completely openly. Other negative comments tend to be pretty PR-constrained. E.g. the PS3 being a bitch to work with came across in interviews as:
"There's a crazy amount of power in there. The trick is to find ways to tap it. The development environment also does things differently and presents its own set of challenges. Sony are constantly improving things, though. We've got some fabulous speed-ups when we've made use of the SPUs. Shifting our vertex work over to some SPU tasks, we've seen a threefold increase in vertex work we do per frame."
What the Sony fanbot reads:
"Cell is supercomputer! Lazy devs can't use it; theyz stoopid. Something something do your job. Sony are gods. AWESOME SPU POWER. 3x better than 360."
What the dev was saying:
"This things effin' impenetrable! Shit, late nights ahead. F*** me, why the f*** won't the f****** devkit f****** work?!?!! One pointless update a week - how's about dealing with our high priority bug fixes?! Man, we've worked hard on this, and I want some frickin' recognition. RSX is a dog but we've managed to get the whole system to yield results that just shouldn't have been this hard."
Illustrative only, not based on any experience with PS3, nor contact with PS3 devs, and applicable to any machine and console company in various ways. Point being, there's a reason why they do interviews, and it's mostly a business one. You'll find few devs giving an interview to a gaming site, to their buying public, purely out of interest. The words won't be honest if honesty is going to come with negative consequences - that's basically how adults work: manipulating the information they give to maintain the best outcomes for themselves.
 
Wasn't it also quite well known that Xbox 360 was usually the "lead platform"?
 
You think? Valve were pretty open about their feelings on working with the PS3 last gen. Also, I'm pretty sure a lot of developers were pretty open about their views, so much so that it's common knowledge among gamers that the PS3 was a total bitch to work on.

Thinking about it, it's also quite clear this gen that the Xbox One's ESRAM size is particularly difficult to work with.

Gabe Newell is like John Carmack: both are rich enough to do whatever they want and not really care. Most people don't fall into that category, though. Also, there's nothing wrong with publicly saying that one piece of hardware is more challenging to work on than another.


Again, most third-party titles looked and ran better on the 360 for most of the last generation. Not optimizing makes sense, but then again, when you look at 360 exclusives, they weren't visually head and shoulders above the multi-platform versions, with few exceptions.

Similarly, we see 900p versus 1080p quite a bit this generation, which seems odd if developers are truly producing titles to the least common denominator... and I'd add that when titles have shown similar performance and people have questioned parity, the most common response has been that developers don't do stuff like that.

I don't want to dwell on the past. Currently, though, 900p to 1080p is about what you would expect from the first batch of games. Personally, I consider that "parity", because to the masses it's a negligible difference; both versions will for the most part look and run identically. But I never expected that resolution would become such a big bullet point to forum posters and the media alike, so now I wonder if that resolution difference will remain. It would be ironic if gamer outrage actually led to the PS4 GPU idling more, with multi-platform devs deciding to target 1080p for everything.
 
It would be ironic if gamer outrage actually led to the PS4 GPU idling more, with multi-platform devs deciding to target 1080p for everything.
There'll be enough opportunity to crank up a few GPU effects to eat idle cycles. PS4 will get the same game with at least a little more lighting quality and/or AA quality, and GPU particle physics improved a bit. Exclusives will make the most of the machine. The similarity with the PC is going to work well for Sony and devs in adding inconsequential improvements.
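
(As a rough illustration, this is the kind of per-platform override table that would deliver that: one shared settings base plus a few cheap-to-vary extras. All setting names and values below are invented for the sake of the sketch.)

```python
# Illustrative only: one shared settings base, with a small per-platform
# override table for the effects that soak up the PS4's spare GPU cycles.
# Every setting name and value here is invented, not from any real engine.

SHARED = {"resolution": "1080p", "framerate_cap": 30, "texture_quality": "high"}

OVERRIDES = {
    "xb1": {"aa": "post_aa",  "gpu_particles": "low",    "ssao": "half_res"},
    "ps4": {"aa": "smaa_t2x", "gpu_particles": "medium", "ssao": "full_res"},
}

def build_profile(platform):
    """Same game, same content; only the cheap-to-vary extras differ."""
    profile = dict(SHARED)
    profile.update(OVERRIDES[platform])
    return profile

print(build_profile("ps4"))  # base settings plus the slightly nicer extras
```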
 
Wasn't it also quite well known that Xbox 360 was usually the "lead platform"?

Initially, yeah, because its tools were orders of magnitude better. Later on, though, it made more sense to lead on PS3, because that gave you instant parity on the 360 with little effort. Today it would make sense to lead on the XB1; that would give the PS4 version automatic parity. However, people have been alluding to the fact that the PS4's dev tools are better, in which case I'd expect everyone to lead on PS4 until that is rectified. There really is no reason to suffer using crappy dev tools; that just costs you more time and money for nothing.


There'll be enough opportunity to crank up a few GPU effects to eat idle cycles. PS4 will get the same game with at least a little more lighting quality and/or AA quality, and GPU particle physics improved a bit. Exclusives will make the most of the machine. The similarity with the PC is going to work well for Sony and devs in adding inconsequential improvements.

Yeah, most likely. On paper the GPU difference between the XB1 and PS4 is quite large, but in reality it doesn't take much at all to eat up all that difference in compute power. People can see that on PC already, where just a couple of settings differences can easily consume expensive SLI GPU setups. Initially I thought it would be similar graphics on both XB1/PS4, with one being 900p and the other 1080p, but with resolutiongate maybe we will transition to 1080p on both, with some element of the visuals kicked up a slight notch on PS4.
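
(Back-of-the-envelope numbers bear that out. Using the widely cited launch specs, the PS4's raw compute advantage is about 1.40x, while the 900p-to-1080p step alone costs about 1.44x the pixel work:)

```python
# Rough arithmetic on how far the raw compute gap actually stretches,
# using the widely cited launch GPU specs for each console.

ps4_tflops = 1.84            # 18 CUs @ 800 MHz
xb1_tflops = 1.31            # 12 CUs @ 853 MHz
compute_ratio = ps4_tflops / xb1_tflops   # ~1.40x

pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_900p  = 1600 * 900    # 1,440,000 pixels
pixel_ratio = pixels_1080p / pixels_900p  # 1.44x

# Holding per-pixel cost constant, rendering at 1080p instead of 900p is
# ~1.44x the pixel work -- already slightly more than the ~1.40x compute
# advantage, before any extra effects are layered on top.
print(f"compute advantage: {compute_ratio:.2f}x, pixel cost: {pixel_ratio:.2f}x")
```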
 
When Albert came out and said there was no way MS would concede 20% power to Sony and that we wouldn't see a difference in games this year, most gamers assumed that PS4 GPU performance would not be taken advantage of. At that time, many developers and commenters here on B3D said that was ludicrous, that developers would not force parity, and cited last generation as evidence.

Now you're saying the exact opposite, hence my comments. What you're suggesting doesn't seem to line up with what we've been experiencing so far this generation, or with what we saw for the majority of last generation.

I could see a developer spending more time on the XB1 version to get results, but it's also true that almost anything they do to optimize there will lead to better results on the PS4. Other than memory, what optimization can be done that won't benefit the PS4?
 