Digital Foundry Article Technical Discussion Archive [2014]

As for the GPU you mention, that's a hell of a GPU. A PC doing it justice would be much more expensive than the Xbox One.

It's only a mid-range GPU costing $250. A whole system built around it could cost no more than $600-700, roughly 25-50% more than the XB1, while offering well over 25-50% more performance.
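As a rough sanity check on those numbers (assumed round figures only, not actual quotes):

```python
# Back-of-envelope price comparison (illustrative, assumed round figures).
console_price = 500                                      # assumed console price at the time, USD
pc_builds = {"budget build": 600, "roomier build": 700}  # assumed totals around a $250 GPU

for name, price in pc_builds.items():
    premium = (price - console_price) / console_price
    print(f"{name}: ${price}, {premium:.0%} more than a ${console_price} console")
# The argument is simply that the performance advantage of a $250-class GPU
# is larger than whatever price premium these assumptions produce.
```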

But the actual lifespan of a PC is shorter than a console's, especially now that the new generation of consoles blurs the line between console and PC performance. Games can now run at higher-than-HD resolutions and better framerates.

The lines are less blurred now (in terms of raw performance) than they have ever been at this point in a console life cycle. It's true, though, that the useful life of a PC as a gaming machine is generally shorter than a console's. But that's only because developers (and vendors) stop supporting older hardware after a certain amount of time, because no one uses it any more.

The only console that could handle 1080p in the PS2/GC/Xbox era was the original Xbox. I wonder about the framerate, but that's a fact. Now we have consoles that are several times more powerful than those, and you need a PC that's 200% more powerful to really notice a difference.

Why? The PS4 is much less than twice as powerful as the XB1, but the differences between console versions are quite noticeable. Even 20% more power can show up as, for example, a more stable framerate or the ability to run a higher-quality AA setting.
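For a sense of scale (commonly quoted peak figures, not numbers from the post above), the raw compute gap between the two consoles lines up closely with the 900p-vs-1080p splits seen in early multiplatform titles:

```python
# Compute gap vs pixel-count gap (commonly quoted peak figures, illustrative only).
ps4_tflops, xb1_tflops = 1.84, 1.31   # theoretical peak GPU compute
pixels_1080p = 1920 * 1080
pixels_900p = 1600 * 900

print(f"PS4 compute advantage: {ps4_tflops / xb1_tflops - 1:.0%}")                 # ~40%
print(f"1080p over 900p:       {pixels_1080p / pixels_900p - 1:.0%} more pixels")  # ~44%
# A gap well under 2x is still enough to buy a full resolution tier, or
# alternatively a steadier framerate / higher-quality AA at the same resolution.
```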

And now that the line is blurred, surpassing the capabilities of a console is a tough task.

Not really, it's actually incredibly easy - more so than it has ever been at this point in a console life cycle. Basically, pick up any GPU at the $200 price point or above and you're matching or exceeding console performance. In March 2006 you'd have had to spend at least twice that on the GPU alone to compare as favourably with the Xbox 360.

What can make PCs look as if they have a longer lifespan is that you can upgrade them, am I right? You can even keep the same case. This can give you the illusion that you are basically using the same rig...

But would it be the same rig you bought four years ago? Certainly not.

That kinda goes without saying. Yes, upgrading extends the useful gaming life of your PC, but no one's claiming that it's still the same PC. Partially the same, yes, but not completely. You could argue it's spiritually the same though ;) In that context I'm still using the same PC I was two decades ago!

You just don't realise how much money you have spent on upgrading the rig to get it to play newer games at 30fps.

Decent gaming CPUs are $100+. Then there's the mobo, and RAM (many PC gamers have the habit of filling all their DIMM slots 'cos it's cool :p).

Then there's the graphics card - some GPUs are crazy expensive, like the Titan. A decent sound card, the 5.1 system lying around, SSDs… display… PSU… UPS…

Not to mention the cash spent on modding your rig with round cables, better fans and those cold-cathode lights.

And then the software! Your OS… you do pay for all this stuff, don't you?

So once you try to sum up all the expenses, you realise how much you paid for your four-year-old rig (half the lifespan of the Xbox 360) to run Crysis 3 at 30fps today. I'd say the total comes to a lot more than a $500 console.

Most of the hardware you listed above is unnecessary for a basic console-equivalent gaming PC and is only included to artificially inflate the price. There are already plenty of threads on the forum that discuss this in detail (most of them closed), so I won't go further into it here. Happy to discuss it further, though, if you want to open a thread in the appropriate section.

At the pace at which hardware is improving these days, the PC Digital Foundry is using will become obsolete in no time, whereas the PS4, Wii and Xbox One will improve.

The software (games) will improve, and that same improved software will run on the DF PC as well. The console hardware won't get more powerful and the PC hardware won't get less powerful. The DF PC becoming "obsolete" is only with respect to new PC hardware and the higher-quality graphics settings in games that come along with that new hardware.

The console-equivalent game settings will just sit progressively lower on any given game's scale. So yes, in three years the DF PC may be playing games at "low" settings, but those low settings will still be the console equivalent, whereas today the console-equivalent settings would be categorised as "high".

Sure, eventually (5 or 6 years down the line) the Kepler architecture will fall out of driver support, just like DX10 is now, 8 years after its initial release. At that point the GPU really will become obsolete for PC gaming, but that's a long way off yet and may even be beyond the life of this console generation.

And then there's the fact that the Xbox One is a console meant to be backwards compatible forever, which will only extend the console's life cycle.

Except that in 5 or 6 years, when its replacement is released, it will no longer be able to play the latest games, just like an older PC. So this attribute doesn't change the balance of the equation.

By that I meant that I do enjoy sitting around with friends or siblings playing a game; it's a different type of multiplayer experience from being connected to a server hundreds of kilometres away, alongside many people you've never laid eyes on.

Agreed, this is my main motivation for owning a console.

Additionally, it is fun to sit in a comfortable seat and take turns playing an SP game. That doesn't work the same way with most PC games.

You can take turns playing a single player game on a PC every bit as easily as you can on a console - provided it's connected to a TV of course.
 
Just for the record, people have a hard time discerning differences in audio. Case in point, this blind test:

That's a flawed test.
I could show a test pattern on a Titan and on an old 2D card and people wouldn't see the difference.
It's about features:
does the card process room geometry? Can it wave-trace, calculate reverb and Doppler, pitch-shift etc.? Can it do HRTF in the vertical plane?

Not sure exactly what SHAPE can do, but so far it's only been used to reduce game install sizes.
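For what it's worth, here's a minimal sketch of one of the per-source effects in that feature list, the Doppler pitch factor. This is just the textbook formula, nothing to do with SHAPE or any particular console's audio block:

```python
# Doppler pitch-shift factor (textbook formula; real audio engines evaluate
# something like this per source every update, alongside reverb, occlusion
# and HRTF filtering).
SPEED_OF_SOUND = 343.0  # m/s in air

def doppler_factor(listener_speed: float, source_speed: float) -> float:
    """Speeds are the components along the listener-source axis,
    positive when listener and source are closing on each other."""
    return (SPEED_OF_SOUND + listener_speed) / (SPEED_OF_SOUND - source_speed)

# A source closing on a stationary listener at 30 m/s sounds ~10% higher pitched:
print(doppler_factor(listener_speed=0.0, source_speed=30.0))  # ~1.096
```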
 
Curious. 792p is questionable, but a few posts later Lego at 1200p (to follow the convention of calling 960x1080 "1080p" with no caveats) is awesome because of the free supersampling? Wouldn't the same supersampling praise hold true for 720p/768p displays, along with the inevitable upscale for higher-end displays? Or is there a floor for how far above HD a resolution has to be before the free-supersample perk applies? (Rough pixel counts in the sketch below.)
This sounds to me like the 1080p vs sub-1080p debate.
An example of this is Ryse. I have yet to see a game that looks as good as that on consoles, and maybe even on PC. A visual spectacle the likes of which no one has seen.
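Just to put numbers on the resolution question above (assumed dimensions for each mode; 792p is taken as 1408x792):

```python
# Pixel counts for the resolutions in question, relative to a 720p display.
modes = {
    "720p display (native)": (1280, 720),
    "792p render":           (1408, 792),
    "960x1080 render":       (960, 1080),
    "1080p render":          (1920, 1080),
}
native = 1280 * 720
for name, (w, h) in modes.items():
    pixels = w * h
    print(f"{name:23s} {pixels:>9,} px  ({pixels / native:.2f}x a 720p panel)")
# Anything above the display's native pixel count gets downsampled ("free"
# supersampling on a 720p/768p set); anything below it gets upscaled, which is
# the asymmetry being questioned above.
```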
 
Not when it would have to drop framerate or eye candy, or both.

Yeah, but when we see games that are otherwise the same and one has the better resolution without sacrifices, which one would you pick? Also, if one platform has the resources to run the same game with the same assets and the same framerate at a better resolution, which platform do you think has more resources to output better visuals overall if the devs decide to sacrifice some resolution?
 
Yeah, but when we see games that are otherwise the same and one has the better resolution without sacrifices, which one would you pick?

Talk about theoretical situations all you want, but that's not the situation in play. What is in play is a series of trade-offs in one game for one platform.
 
What is in play is a series of trade-offs in one game for one platform.
One word:
Titanfall.
Sub-20 to 60fps @ 792p, probably averaging about 40fps, with a lot of tearing.
Remember, this is the game where the devs have been going on about a solid 60fps.
So why didn't they choose, say, 640p?

Personally, if it were up to me, I would let the user choose what resolution they run the game at, just as happens on a PC.
If they want high res with a low framerate then so be it, and vice versa.
No AA vs. some amount of AA: let the user choose.

Edit - quick Google:
Some new details on Respawn Entertainment's Titanfall have come to light. The game runs at a solid 60fps as revealed by the developer and it is fully V-Synced. That means you'll never see a torn frame on the screen.
What happened?
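One plausible answer, sketched below under the assumption that Titanfall uses an adaptive v-sync style presentation policy (an illustration, not Respawn's confirmed implementation): the game only tears on frames that miss the 16.7ms refresh window, so "fully v-synced" marketing and "lots of tearing under load" can both be partly true.

```python
# Sketch of an adaptive v-sync presentation policy on a 60Hz display.
# (Illustrative only; not Respawn's confirmed implementation.)
REFRESH_MS = 1000.0 / 60.0  # ~16.7ms per refresh

def present(frame_time_ms: float) -> str:
    """Decide how a finished frame reaches the screen."""
    if frame_time_ms <= REFRESH_MS:
        return "wait for vblank -> clean 60fps frame"
    # Missed the refresh: flip immediately rather than stall another 16.7ms,
    # trading a visible tear for lower latency and a higher average framerate.
    return "flip mid-scan -> torn frame"

for ft in (14.0, 16.0, 22.0, 30.0):
    print(f"{ft:4.1f}ms frame: {present(ft)}")
```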
 
Performance analysis: inFamous: Second Son

http://www.eurogamer.net/articles/digitalfoundry-2014-infamous-second-son-performance-analysis

There's been some discussion about the performance of the game, particularly in terms of a fluctuating frame-rate, and what quickly becomes evident is that Sucker Punch has opted to continue the strategy it used on its PlayStation 3 titles: a solid v-sync working in combination with a completely unlocked frame-rate. The difference here is that while the previous titles in the series would frequently drop beneath the 30fps threshold, it takes a mass of action and GPU-intensive post-processing effects to truly impact inFamous: Second Son's performance. Bearing in mind the high levels of detail, and the overall complexity of the rendering pipeline, that's a stunning achievement.

Certainly, the game is a visual feast. Similar to Guerrilla Games' latest work, inFamous operates with a materials-based deferred renderer, which not only allows for a multitude of dynamic light sources, but also lights the scene according to the physical properties of the objects present - for example, reflectivity and roughness. An energy-conserving model like the one used here treats light as energy, calculating how it spreads across the surface of the material according to its physical properties. The results can be absolutely beautiful to behold - reflections in particular (what looks like an expert blend of pre-baked and full real-time) can look sublime.

Also worthy of note is the implementation of state-of-the-art anti-aliasing, believed to be a variant of SMAA T2X, as found in Crysis 3. This is one of the best post-process anti-aliasing techniques we've seen, combining a new take on MLAA with a temporal element. Edge-smoothing is phenomenal, and while there is some ghosting, it is not any kind of real distraction during gameplay.
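For anyone wondering what "energy-conserving" means in practice, here's a minimal sketch using normalized Blinn-Phong as a stand-in; Sucker Punch's actual BRDF isn't public, so treat the roughness mapping and constants as illustrative:

```python
# Minimal energy-conserving shading sketch (normalized Blinn-Phong stand-in;
# the actual Second Son BRDF is not public).
import math

def shade(n_dot_l, n_dot_h, albedo, specular, roughness):
    """Outgoing radiance for unit incoming light; roughness in (0, 1]."""
    # Diffuse divided by pi so a white surface never reflects more than it receives.
    diffuse = albedo / math.pi
    # Map roughness to a Blinn-Phong exponent, then apply the (n + 8) / (8*pi)
    # normalisation: tighter highlights get proportionally brighter, so the total
    # reflected energy stays roughly constant instead of being invented or lost.
    exponent = 2.0 / (roughness * roughness) - 2.0
    norm = (exponent + 8.0) / (8.0 * math.pi)
    spec = specular * norm * (n_dot_h ** exponent)
    return max(n_dot_l, 0.0) * (diffuse + spec)

# Same light, same inputs, two roughness values: near the highlight peak the
# smooth material is far brighter, away from it far dimmer, but totals balance.
print(shade(0.8, 0.99, albedo=0.5, specular=0.04, roughness=0.8))  # rough surface
print(shade(0.8, 0.99, albedo=0.5, specular=0.04, roughness=0.2))  # smooth surface
```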
 
Talk about theoretical situations all you want, but that's not the situation in play. What is in play is a series of trade-offs in one game for one platform.
I didn't question the trade-offs between resolution and visual detail, and I do recognise that Ryse managed to achieve certain results because of its lower resolution. The point is that these trade-offs would have been smaller if the XB1 had more performance.
But there is nothing theoretical about it when we already have real examples in multiplatform games where both versions are largely the same but output different resolutions. Nor is there anything theoretical about one platform having more resources to output more detail and effects, or a better framerate, than the other at a fixed resolution. Again, we have real examples, unless we want to completely negate any performance advantage. AFAIK no one here claimed that higher resolution is the gospel of better visuals.
 