Digital Foundry Article Technical Discussion Archive [2013]

You can keep waiting for the dust to settle; I'm going to trust the enormous amount of evidence we already have.




People can beat their chests with numbers, but at the end of the day, with the systems being as mediocre as they are, one strength metric will not tell the story...

These machines, as expected, will be very close in how they render games this gen.

Not to mention your evidence is incomplete and very, very early.
 
It's one of those articles that everyone should read, even though it really just states stuff you'd think would be common sense by now. Heck, I've been posting much of that since 2005, mostly to little effect though. That's because, unfortunately, people for whatever reason can't understand the concept of the whole being bigger than the sum of its parts, and instead will fixate on one spec and run with it, no matter how irrelevant it may be to the bigger picture. They can understand it with cars, where they realize that a big-horsepower car from the '60s will get out-lapped by a modern lower-horsepower car because the former can't turn or stop, but when it comes to game consoles they just can't make the leap past individual specs. Alas, it means that articles like the above will have little to no effect on anyone's thoughts or opinions on the matter, even though you'd think by now that most people would have figured it out. It's a noble attempt by the dude writing the article, but most likely a wasted effort.

Too true. GAF is already crucifying the article, calling it another PR fluff piece or another example of EG sniffing MS's butt.

It's too bad, because more articles like this could be very interesting, giving devs a platform to speak freely.
 
Too true. GAF is already crucifying the article, calling it another PR fluff piece or another example of EG sniffing MS's butt.

It's too bad, because more articles like this could be very interesting, giving devs a platform to speak freely.

Yeah, especially since the dev did not do any comparison between the two machines...and if one wants to step into interpretation land, one could easily conclude that he even praised the PS4 architecture, which seems to be one of the most balanced consoles to date...whereas the quite limited size of the ESRAM in the X1 is obviously a major bottleneck, which may be the cause of the launch rumors we hear and the initial problems we may (or may not...there's no hard evidence available yet) see in multiplat games.

Good article, keep them coming.

@Joker: do I remember wrong, or didn't you always favor a better GPU over a better CPU back then? IIRC, the PS4 architecture is exactly what you always wanted...right?
 
It's one of those articles that everyone should read, even though it really just states stuff you'd think would be common sense by now. Heck, I've been posting much of that since 2005, mostly to little effect though. That's because, unfortunately, people for whatever reason can't understand the concept of the whole being bigger than the sum of its parts, and instead will fixate on one spec and run with it, no matter how irrelevant it may be to the bigger picture. They can understand it with cars, where they realize that a big-horsepower car from the '60s will get out-lapped by a modern lower-horsepower car because the former can't turn or stop, but when it comes to game consoles they just can't make the leap past individual specs. Alas, it means that articles like the above will have little to no effect on anyone's thoughts or opinions on the matter, even though you'd think by now that most people would have figured it out. It's a noble attempt by the dude writing the article, but most likely a wasted effort.

The forum warriors are going to miss the point and use this article to attack or defend the brands. It's unfortunate they won't be able to read it for what it is. I mean, we're talking about one of the lowest forms of human beings. They seem to be beyond hope.

Overall, it's well written, but not much different than what several devs have posted here, which went entirely ignored or distorted into flame bait. Good read though.
 
It will likely be a year before we can really determine the real-world difference between the consoles. Remember the whole 30 fps vs 60 fps thing with early Madden on the PS3 and 360? That wasn't indicative of the PS3's lack of power. And it got resolved eventually.

But the problem is that first impressions matter, and if MS had less mature tools than Sony for a console releasing at the same time, MS still messed up big time.
 
It will likely be a year before we can really determine the real-world difference between the consoles. Remember the whole 30 fps vs 60 fps thing with early Madden on the PS3 and 360? That wasn't indicative of the PS3's lack of power. And it got resolved eventually.

But the problem is that first impressions matter, and if MS had less mature tools than Sony for a console releasing at the same time, MS still messed up big time.

Yeah, that is exactly what I meant...if you judge launch games, you won't get the right answer. Madden this gen is quite the example and shows that launch games are not representative...one has to wait until the dust settles and devs and the manufacturers have had time to enhance their engines/tools.
 
Movies aren't interactive.

That's irrelevant. Real life is interactive and has "unlimited framerate", but when you record it onto storage media it's not interactive anymore. That doesn't mean frame rate becomes less important when played back. Watch a real-life video recorded at 60fps vs 30fps...there's a huge difference. Whether a higher framerate is artistically better is up for debate, as The Hobbit has shown.

All else being equal (which in the case of games, of course, it won't be, due to hardware limitations), a higher framerate is always better.

Isn't that kinda like saying if I had wings I'd be able to fly? Of course I don't have wings, so I can't actually fly unless I wore a wingsuit or jumped on a plane, helicopter, etc. The bottom line is that all computing devices have hard limits. That doesn't mean the choice of going 30fps is due to said limit, as it could easily be an artistic choice, just like any arbitrary frame rate. There is never enough power, because you could keep throwing stuff at the hardware until it comes to a crawl. If my goal as a game director was to stick to 15fps and make the graphics, animation, physics, etc. the best possible, then that is a design choice. Same if I decided I wanted to stick to 48fps or 60fps.

When talking about unlimited artistic fidelity, there is no such thing as "unused processing power left over for higher frame rates". If there is processing power left over for a higher frame rate, then your art is the limiting factor, which is not possible if your goal is CGI-quality stuff. Most people can't wrap their heads around this concept.
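To put that in code terms, a frame cap is literally one constant in the main loop. A minimal sketch (purely illustrative; update_and_render() is a hypothetical stand-in for an engine's real per-frame work):

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    const int target_fps = 30;  // the design knob: could just as well be 15, 48, or 60
    const auto budget = std::chrono::nanoseconds(1'000'000'000LL / target_fps);
    std::chrono::time_point<std::chrono::steady_clock, std::chrono::nanoseconds>
        deadline = std::chrono::steady_clock::now();

    for (int frame = 0; frame < 90; ++frame) {    // ~3 seconds at 30fps
        // update_and_render();  // hypothetical: simulate and draw one frame
        std::printf("frame %d\n", frame);
        deadline += budget;                       // fixed cadence, no drift
        std::this_thread::sleep_until(deadline);  // spend the leftover headroom
    }
}
```

Whatever headroom the sleep burns at 30fps is exactly the budget you could instead pour into graphics, animation, or physics.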
 
That's irrelevant. Real life is interactive and has "unlimited framerate", but when you record it onto storage media it's not interactive anymore. That doesn't mean frame rate becomes less important when played back. Watch a real-life video recorded at 60fps vs 30fps...there's a huge difference. Whether a higher framerate is artistically better is up for debate, as The Hobbit has shown.



Isn't that kinda like saying if I had wings I'd be able to fly? Of course I don't have wings, so I can't actually fly unless I wore a wingsuit or jumped on a plane, helicopter, etc. The bottom line is that all computing devices have a hard limit. That doesn't mean the choice of going 30fps is due to said limit, as it could easily be an artistic choice.

This is complete rubbish. No developer chooses to run a game at a lower frame rate for artistic reasons; it's a technical limitation, pure and simple. Anyone even remotely entertaining the notion that a developer would specifically choose 30fps over 60fps for artistic reasons, when given the choice of either for free, is incredibly naive.

In 25 years of PC gaming, have you ever seen even one game that deliberately limited the frame rate to 30fps "because it's better", regardless of hardware capability? Of course not, because it's a ridiculous concept.
 
@Joker: do I remember wrong, or didn't you always favor a better GPU over a better CPU back then? IIRC, the PS4 architecture is exactly what you always wanted...right?

We would have killed for the ps4's setup back in 2006, no doubt about that. It's almost like a dream setup: a bunch of x86 cores, a reasonable gpu, a chunk of gddr5, good tools, what's not to like? Although the bickering back then was really split 50/50 between hardware and software complaints, hardware being the ps3 riddled with various bottlenecks and software being ps3 tools in a horrific state or not available at all. Clearly they learned and fixed all of that, which is great, and hopefully it means less crunch for the developers...although somehow I doubt that.

If given the choice I do always favor more gpu, but we're getting to the point where it will take a mountain more gpu to really see the difference. It's like you see the difference between 16 colors and 256 colors quite easily, a bit less going from 256 to 32k colors, then less from 32k to 16 million, etc. Each successive step takes much more computational power just to break even, and yet the visual difference becomes less and less. I certainly don't subscribe to the theory that the extra gpu grunt on ps4 is wasted or unbalanced; I can't imagine there not being situations where it would be helpful. But I just don't see it being substantial enough to make a huge visual difference, especially given the typical person's inability to see differences in resolution past a certain point. I was thinking back to console generations long gone, and there have always been ways to pick one as better than the other, but this coming generation strikes me as one where you could put both versions of a game in front of 100 people and 99 times they would not be able to pick a favorite.
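A back-of-the-envelope sketch of that diminishing-returns curve, with purely illustrative numbers: each color step people notice is only a few more bits per pixel, so the representable color count grows exponentially while the visible improvement per step shrinks.

```cpp
#include <cstdio>

int main() {
    const int bpp[] = {4, 8, 15, 24};          // ~16, 256, 32k, 16M colors
    const long long pixels = 1920LL * 1080LL;  // one 1080p frame
    for (int b : bpp) {
        long long colors = 1LL << b;           // representable colors at b bits/pixel
        double mib = pixels * b / 8.0 / (1024.0 * 1024.0);
        std::printf("%2d bpp -> %8lld colors, %4.1f MiB per 1080p frame\n",
                    b, colors, mib);
    }
}
```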


Scott_Arm said:
The forum warriors are going to miss the point and use this article to attack or defend the brands. It's unfortunate they won't be able to read it for what it is. I mean, we're talking about one of the lowest forms of human beings. They seem to be beyond hope.

The sad part of that is that so many more devs would post on these forums if it weren't for them. There are actually quite a few really smart people from various industries posting on b3d, but I notice they largely avoid the console forums, which is a shame. I totally understand why, though, as it wouldn't take long for them to be vilified.


This is complete rubbish. No developer chooses to run a game at a lower frame rate for artistic reasons; it's a technical limitation, pure and simple. Anyone even remotely entertaining the notion that a developer would specifically choose 30fps over 60fps for artistic reasons, when given the choice of either for free, is incredibly naive.

In 25 years of PC gaming, have you ever seen even one game that deliberately limited the frame rate to 30fps "because it's better", regardless of hardware capability? Of course not, because it's a ridiculous concept.

Yeah, I mean I suppose I get him wanting to defend their choice in public, but c'mon, no dev would ever choose 30fps over 60fps just for artistic integrity. 60fps is a holy grail of sorts; we all froth at the mouth just at the thought of attaining it.
 
It will likely be a year before we can really determine the real-world difference between the consoles. Remember the whole 30 fps vs 60 fps thing with early Madden on the PS3 and 360? That wasn't indicative of the PS3's lack of power. And it got resolved eventually.

The problem with that comment is that, compared to the PS4 and Xbox One, the PS3 and 360 were drastically different architectures.

RAM differences aside, we should be able to derive how these consoles should perform in comparison to each other in a simpler, more accurate manner.
 
This is complete rubbish. No developer chooses to run a game at a lower frame rate for artistic reasons; it's a technical limitation, pure and simple. Anyone even remotely entertaining the notion that a developer would specifically choose 30fps over 60fps for artistic reasons, when given the choice of either for free, is incredibly naive.

As I said, most people can't wrap their heads around this concept. ;)

Reread what I wrote and think long and hard about it....

If I chose two-color graphics for a game, is that an art direction choice or a hardware limitation? In the case of Ryse, do you think that if they could run the game at 60fps with the same graphical fidelity, they would do so? If yes, then why? How do you know they wouldn't throw more stuff at the console to bring it back down to the 30fps "target" and make the graphics even BETTER than they are currently??? That, my friend, is called a design choice. ;)

In 25 years of PC gaming, have you ever seen even one game that deliberately limited the frame rate to 30fps "because it's better", regardless of hardware capability? Of course not, because it's a ridiculous concept.

Ignoring the time and costs involved, have you ever seen a PC game made with unlimited art, physics, and AI? Of course not, because a game like that wouldn't even be able to run at 1fps... ;)
 
The problem with that comment is that, compared to the PS4 and Xbox One, the PS3 and 360 were drastically different architectures.

RAM differences aside, we should be able to derive how these consoles should perform in comparison to each other in a simpler, more accurate manner.


A common meme that is wrong, imo.

There is a lot more to it than that. You cannot compare them like that, regardless of how much the practice might make some feel superior or more comfortable. The article alone addresses why you cannot do that.
 
Joker, do you also/still have the problem that you do a tech analysis of the onscreen content while gaming...like the dev said?

Just wondering, do you sometimes get the blues and miss all the high-tech stuff going on in cutting-edge game development, or do the 'bad' memories (like crunch time) overshadow everything cool/positive?
 
This is complete rubbish. No developer chooses to run a game at a lower frame rate for artistic reasons; it's a technical limitation, pure and simple. Anyone even remotely entertaining the notion that a developer would specifically choose 30fps over 60fps for artistic reasons, when given the choice of either for free, is incredibly naive.

In 25 years of PC gaming, have you ever seen even one game that deliberately limited the frame rate to 30fps "because it's better", regardless of hardware capability? Of course not, because it's a ridiculous concept.

Hm...not sure I agree with you on this point. 30fps looks and feels quite different, and imo a dev can choose to use this temporal resolution to convey a certain feeling and achieve a certain look (also due to being forced onto a closed-spec platform).

It is similar to a black & white movie released in cinemas today...although not a technical problem, the decision is made by choice, even though one could argue that color is always technically better than b&w.

So I can see that Crytek felt they could deliver a better cinematic experience going 30fps...of course, that also includes way better pixel-output quality, which is certainly needed for a 'cinematic' feeling.

So I honestly don't see what the problem is with Cevat Yerli's comment.
 
do you also/still have the problem that you do a tech analysis of the onscreen content while gaming...like the dev said?

You should have heard the conversations that took place in party chat when running through Halo 4. Plenty of tech analysis and ridiculing of such horrible choices leading to very inconsistent visual experiences. :LOL:
 
As I said, most people can't wrap their heads around this concept. ;)

Reread what I wrote and think long and hard about it....

If I chose two-color graphics for a game, is that an art direction choice or a hardware limitation? In the case of Ryse, do you think that if they could run the game at 60fps with the same graphical fidelity, they would do so? How do you know they wouldn't throw more stuff at the console to bring it back down to the 30fps "target" and make the graphics even BETTER than they are currently??? That, my friend, is called a design choice. ;)

I fully understand your argument; I just think it's wrong. There is no artistic value to be taken from running a game at a low framerate. Not unless you consider high control latency and sluggish visuals to be art. Your comparison to video footage is also flawed. Video naturally blurs one frame into the next because the source (real life) is in motion. Video games do not work that way. Anyone can easily test this for themselves by comparing 24fps game footage to 24fps TV footage. Clearly the TV is much smoother.
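A toy model of that shutter effect, with made-up numbers: a film camera with a 180-degree shutter integrates motion across half of each frame interval, so every 24fps film frame records a smear, while a game renders one instant and the object simply jumps between positions.

```cpp
#include <cstdio>

int main() {
    const double fps = 24.0, speed = 240.0;  // frames per second; object speed in px/s
    const double shutter = 0.5 / fps;        // 180-degree shutter: open half the frame time
    for (int f = 0; f < 4; ++f) {
        double t = f / fps;
        std::printf("frame %d: game samples x = %5.1f px; film smears x = %5.1f..%5.1f px\n",
                    f, speed * t, speed * t, speed * (t + shutter));
    }
}
```

At 240 px/s that smear works out to 5 px per frame, which is the blur that makes 24fps film read smoother than 24fps game captures.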

Ignoring the time and costs involved, have you ever seen a PC game made with unlimited art, physics, and AI? Of course not, because a game like that wouldn't even be able to run at 1fps... ;)

This has absolutely nothing to do with my argument. PC developers and IHVs could easily add a 30fps limit to games or drivers if there were any perceived benefit. There isn't, so they haven't.
 
In such an example the artistic merit wouldn't come from the 30 fps per se, but from doubling the work that could be performed on each pixel.

For a PC example, think of the PC gamers who chose to run Crysis at 25 fps on very high instead of 50 fps on medium.

My preference is normally for 60 fps, even though that necessarily means lower detail or resolution. Things seem to be changing for the better next gen, but prior to CoD fisting every other FPS, most devpubs automatically went for 30 fps. *shrugs*
 
In such an example the artistic merit wouldn't come from the 30 fps per se, but from doubling the work that could be performed on each pixel.

For a PC example, think of the PC gamers who chose to run Crysis at 25 fps on very high instead of 50 fps on medium.

My preference is normally for 60 fps, even though that necessarily means lower detail or resolution. Things seem to be changing for the better next gen, but prior to CoD fisting every other FPS, most devpubs automatically went for 30 fps. *shrugs*

This guy gets it...;)

I fully understand your argument; I just think it's wrong. There is no artistic value to be taken from running a game at a low framerate.

But the fact is there is artistic merit in choosing 30fps over 60fps, because at 30 you can increase the per-pixel fidelity, which is an artistic choice.

Not unless you consider high control latency and sluggish visuals to be art.

30fps is not sluggish for a third-person sword-fighting game...it's good enough for most people, especially when you're spending the extra processing on better per-pixel quality. Many people prefer better graphics at 30fps over mediocre graphics at 60fps.

Your comparison to video footage is also flawed. Video naturally blurs one frame into the next because the source (real life) is in motion. Video games do not work that way. Anyone can easily test this for themselves by comparing 24fps game footage to 24fps TV footage. Clearly the TV is much smoother.

The comparison is not flawed. Video games can use motion interpolation to make it appear as if they're running at a higher frame rate, if the developer chooses that direction.
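For illustration only, the naive version of that idea: fabricate an in-between frame by averaging two rendered ones. Real interpolators warp pixels along motion vectors rather than cross-fading, so treat this as a sketch of the concept, not a production technique.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Average two RGB buffers channel by channel to fake a midpoint frame.
std::vector<unsigned char> midpoint(const std::vector<unsigned char>& a,
                                    const std::vector<unsigned char>& b) {
    std::vector<unsigned char> out(a.size());
    for (std::size_t i = 0; i < a.size(); ++i)
        out[i] = static_cast<unsigned char>((a[i] + b[i]) / 2);
    return out;
}

int main() {
    // One pixel's worth of data from two consecutive rendered frames.
    std::vector<unsigned char> frame0 = {0, 0, 0}, frame1 = {200, 100, 50};
    auto between = midpoint(frame0, frame1);  // the fabricated in-between frame
    std::printf("interpolated pixel: %d %d %d\n", between[0], between[1], between[2]);
}
```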

This has absolutely nothing to do with my argument. PC developers and IHVs could easily add a 30fps limit to games or drivers if there were any perceived benefit. There isn't, so they haven't.

This has everything to do with your example; you're just not recognizing it. Current PC graphics can't render movie-level CGI in real time, which is why all PC games can/do run at higher framerates. If the most powerful PC setup could only render CGI-level games at 30fps, you would see some games capped at 30fps, because going higher would decrease fidelity. If Crytek made Ryse for PC and deliberately capped it at 30fps to allow better per-pixel fidelity, then that is a design choice, no different from the X1.
 
This guy gets it...;)



But the fact is there is artistic merit in choosing 30fps over 60fps, because at 30 you can increase the per-pixel fidelity, which is an artistic choice.


This has everything to do with your example; you're just not recognizing it. Current PC graphics can't render movie-level CGI in real time, which is why all PC games can/do run at higher framerates. If they were powerful enough to render CGI-level games at 30fps, you would see games capped at 30fps, because going higher would decrease fidelity. If Crytek made Ryse for PC and deliberately capped it at 30fps to allow better per-pixel fidelity, then that is a design choice, no different from the X1.

So now you're saying that the choice of 30fps over 60fps is to allow greater pixel fidelity. That's fine, and a perfectly valid developer choice. It's not, however, what you were saying originally, which was that 30fps could be better than 60fps all other things being equal, i.e. a choice based on artistic preference and not technical limitations.
 