Digital Foundry Article Technical Discussion Archive [2014]

Okay, still... they know what they get, good CPU and a lot of video memory, that's not a luxury you have with the Xbox One and the eSRAM.

No it's the exact opposite. With a PC you have no idea what you're going to get. You just develop the game to be as scalable as possible and hope whatever hardware configuration it runs on can cope. Titanfall is a perfect example of that with its use of uncompressed audio to accommodate low-end dual-core CPUs. The XB1 is the exact opposite in that the developers know EXACTLY what hardware they need to target and have access to that hardware at a much lower level than afforded by DX11 on the PC.

As for video memory, you couldn't have picked a worse example. That's probably the X1's only big advantage over mid-range gaming PCs. With the X1 you know you have 5GB of memory available to do what you want with. On the PC you have to build the game in such a way that it can run on GPUs with as little as 1GB of dedicated video memory.

If used well the Xbox One would give that PC a very hard time. It is in the article though: the settings don't match those of the console, but the experience is better....

Why would the XB1 give that PC a hard time? Do you have any idea how they actually compare on a technical level or is this just your opinion based on what you wish to be true rather than hard evidence?

The GTX 760 doubles, and in some cases more than doubles, the GPU in the XB1 in almost every way. The only exception is memory bandwidth, but the XB1 only has an advantage there when you assume the eSRAM is used to 100% of its theoretical potential (which would not be the case in the real world). Obviously it also lacks in the size of local video memory available.

I'm not sure what you're saying with the second part of your comment. The settings are the same apart from texture resolution, which is lower on the PC (on account of it being limited to 2GB of video memory). Other than that the PC runs at a much higher resolution and has a higher framerate.

But in the future there'll be newer consoles that can beat the current home pc. It's a cycle. Then the pc outpaces consoles again.

So your argument is just wait another 8 years for the next round of consoles to be more powerful than PCs? And even if someone was willing to do that, they are likely to be disappointed. Just as this generation was already significantly outpaced by gaming PCs of the time when it launched, it's pretty likely that if there is a next generation of consoles the same situation will be true again. Bottom line is that there will likely never again be a console launch where the console is more powerful than gaming PCs.

Give me specialised hardware any day of the week, 'cos yes, that's really nice, a PC CPU could produce great sound but -dunno what bkillian might think- many of the possibilities of SHAPE couldn't be replicated on a PC.

Actually it's the other way around. You can do pretty much anything on a CPU, whereas SHAPE will be limited to what's pre-programmed into the hardware. As far as I understand it, SHAPE doesn't allow the developer to program custom audio routines into engines, unlike both a CPU and TrueAudio, so in that respect, although it's very fast, SHAPE definitely wouldn't be a substitute for those solutions.
 
Although it's easy to dismiss next-step resolutions as marginal increases, they actually represent significant percentage increases. And even then, the visual results are questionable (how much better really is 1080p than 900p in the eyes of most gamers?). Titanfall is already struggling at 792p. What do those 21% extra pixels get you that contributes anything noticeable on screen? If it's a compromise that doesn't benefit the experience at all, it was the wrong one. 720p with a higher framerate or whatever would likely be a better experience.

I quite agree! I was trying to use the original language to demonstrate that once you step outside the 720/1080 mindset you basically have choices about detail vs load.

Ryse looked amazing at 900p, but the frame rate suffered. Lower might have been better.

Of course, if the resolution isn't the bottleneck here, an extra 20% pixels could be a freebie. Respawn may have been targeting 720p and found they could give a little extra. We don't really know.

We don't know, but it's fun to speculate and I'd speculate that while in many cases the resolution does cause a lower frame rate, the most grievous and sudden drops (the <20 fps ones) are CPU related and that they probably figure they might as well get the extra detail.

I don't disagree with Inuhanyou's thinking though. 792p is a marginal increase that'll lead to more blurring on 720p native sets and no significant visual advantage on other displays, so one has to wonder why choose that resolution. I won't go so far as to suggest that it's only to avoid the 'last gen 720p' label, but I wouldn't try to counter-argue that every resolution increase is marginal, because they're not. Especially compared to 720p, which is an option for any game wanting to target smooth, high framerates.

I disagree with Inuhanyou and yourself on this one. 900p is roughly as marginal an increase over 792p as 792p is over 720p.
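
For reference, the raw pixel counts behind those steps; a rough back-of-envelope in Python, taking 1408x792 as Titanfall's reported XB1 resolution and the usual 16:9 figures for the rest:

[code]
# Pixel counts for the resolution steps being discussed.
steps = [("720p", 1280, 720), ("792p", 1408, 792), ("900p", 1600, 900), ("1080p", 1920, 1080)]

for (lo, lw, lh), (hi, hw, hh) in zip(steps, steps[1:]):
    gain = 100.0 * (hw * hh) / (lw * lh) - 100.0
    print("%s -> %s: +%.0f%% pixels" % (lo, hi, gain))

# 720p -> 792p:  +21% pixels
# 792p -> 900p:  +29% pixels
# 900p -> 1080p: +44% pixels
[/code]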

Additionally, most 720p sets (and indeed most TV sets) aren't set to 1:1 pixel mapping, and many '720p' sets aren't actually 1280 x 720. Plasmas - god's own TV choice - are all 768, for instance. And even my glorious 768 Plasma defaults to 1080p input.

I'd speculate that the number of 1280 x 720 panels receiving a 1280 x 720 input, that have been set to 1:1 pixel mapping, and that are going to be used as an Xbox One gaming display, is insignificant compared to the number of sets that will benefit from having a more detailed image.

792p will provide more detail for any set inputting at 1080 and will also provide a degree of supersampling for 720/768 TV sets. It will even 'overpower' the overscan on those 720p panels that aren't set to 1:1 (which will be, like, almost all of them).

DF should do an article investigating this! :D
 
That’s why Full Range sucks so much, despite DF treating it as if it was the Holy Grail, which is not the way to go.
If the source material is full RGB, your device can send full RGB, and your display is capable of displaying full RGB (and calibrated), then you will get a fuller image going full. Using limited RGB will result in colour space remapping, which means losing roughly 12.5% of your available range.

I don't know what Xbox offers, but PlayStation offers limited (forces limited always), full (forces full always) or automatic. Automatic is the best: it'll check the display device (DVI and HDMI only) and see if it's capable of full range. If it is and the source material is full it'll use full; if not, it'll use limited. You can test this by checking your TV's service menu.
 
What can I say. You're a PC snob... That was overkill. PC gamers have always been to me the *rich upper class* of gaming. Literally, pbjliverpool, you always expect your games to be longer, deeper, better-looking, easier to configure and control... mainly just plain better than anything else.

As for the sound, I dare you: run a sound benchmark on any wimpy little 2GHz CPU and compare it to what SHAPE could produce. Then there are the GPUs... ugh... The Xenos 2, so to speak, kicks the pants off many PC GPUs. Regardless, PC games sometimes are amazing, not saying otherwise. And PC hardware is better still. Besides that, you can upgrade. For the Xbox One, yes, you are right the games are ALL built around one hardware profile, but it is an evolving console too; when they free up an extra 8% of GPU then things might get even more interesting. They don't risk incompatibility issues by doing that. You are informed about the PC world and your PCs are built to be better, I am not discussing that, but you have a long history of having the best of the best GPUs, which leads me to the point that DF comparisons aren't fair. You are very happy with those comparisons because you love the PC, okay... I forgive you for that.

Besides, in my opinion there is a big difference between standing 3 meters away from your TV using a handheld gamepad and sitting 1 meter away from a PC monitor playing with a desk-mounted mouse and keyboard combo, so it is not really comparable. Yet DF insists on that, and talks crap about the Xbox One. Again, unfair.

Those are very biased comparisons that don't make people happy, at least not me. You know the terms they mention and talk about because you are geeky like most PC gamers -not saying that in a negative tone- and you understand what AA is, or SMAA, AF, what a bug is -console games weren't usually buggy in my Xbox days-, and so on and so forth.

The point is that with a console, you pop a game in, it's ready to go.

EDIT: forgot to mention that the PC doesn't have to deal with a small pool of RAM (like the eSRAM), unlike the Xbox One.

DSoup and Function, I gotta go soon, but I will just say that something which isn't compatible with every single device out there isn't worth it. Sure, I play all games on my desktop and laptop PCs at full range, but the displays aren't standard displays or TVs; they are Standard RGB compatible, I am sure, although I didn't test it.

Xbox One is a console made for TV, so their recommendation telling people that enabling Standard RGB might be beneficial is understandable. My TV from 2013 doesn't have a service menu, not that I know of; the manual doesn't mention it.
 
By your logic games should not support anything more than basic stereo sound because surround isn't standard for everybody. Similarly it was folly for PS3 and 360, let alone PS4 and One, to support 1080p because not everybody has a 1080p TV.

There are still cruddy displays out there but the Samsung I bought for my PS3 in 2007 supported full RGB as did the Sony I replaced it with in 2010. I do find your attitude somewhat bizarre, coming from somebody who posted and posted and posted about their attempts to calibrate their TV for the best possible image quality.

What's the point of having, IIRC, a 36-bit 69 billion colour panel if you're going to remap a 16.7m colour space into an effective 11.2m colour space? :???:
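
For what it's worth, the arithmetic behind those figures (the 11.2m number corresponds to keeping 87.5% of each 8-bit channel; counting the strict 16-235 video range gives slightly less):

[code]
# Colour counts with 8 bits per channel.
full        = 256 ** 3              # full range, 0-255                      -> 16,777,216
limited_224 = 224 ** 3              # 87.5% of the range (the "12.5% loss")  -> 11,239,424
limited_220 = (235 - 16 + 1) ** 3   # strict 16-235 video levels             -> 10,648,000

print(full, limited_224, limited_220)
[/code]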
 
If you are going to use a TV to play, Limited range is undoubtedly the best choice. If you are going to use a computer monitor it is a matter of preference. Still.. full Range sucks quite a bit.
Full range is The Future. I doubt many TVs sold now, or over the past few years, don't support it*. And every TV going forwards will. So why gimp your colour space to a standard set in the ancestry of TV? Overscan and limited range should be done away with. Modern TVs can work without them and that provides a better experience. Every console should support that as a standard, and optionally extend legacy support for those still running older displays. Like not putting composite output on your console but assuming people will be using HDMI, and then providing a legacy solution for the few who can't handle HDMI.

Microsoft recommend on their Xbox.com site to use Standard Range, ‘cos for a TV it is best, and you will never have problems with that range. Why is it the best advice to NEVER use full range on a TV?
That's an easy support solution. "We recommend an inferior picture because it's easy for the techno-illiterate to use. Most folk can't get their heads around setting up Full RGB on their console input and setting this on the console will lead to black crush. Best tell them not to tamper. They don't even calibrate their TVs anyway, nor even use the correct picture mode, so they're not going to notice Full RGB."

So why choose a limited range TV and what problems may arise if you don't? First, basically movies, videos and all the material you see on DVD or Blu-ray is encoded in YCbCr and limited range...
The video playing software can easily interpolate the limited RGB range to the full RGB range. That's no real reason at all.
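
For what it's worth, the expansion really is trivial; something along these lines, assuming 8-bit values and the standard 16-235 video levels:

[code]
def limited_to_full(x):
    """Expand an 8-bit video-levels value (16-235) to full range (0-255)."""
    y = int(round((x - 16) * 255.0 / (235 - 16)))
    return max(0, min(255, y))  # clamp "blacker than black" / "whiter than white" values

print([limited_to_full(v) for v in (16, 126, 235)])  # [0, 128, 255]
[/code]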

If in doubt always use Limited range and the image will look good to everyone regardless of the TV.
Exactly. For the mainstream who don't really understand what they're doing, use limited. But games capped to limited is stupid legacy thinking. How are we to have progress if we keep targeting outdated standards and lowest common denominators?

I disagree with Inuhanyou and yourself on this one. 900p is roughly as marginal an increase over 792p as 792p is over 720p.
Yes, but I don't know when the choice of 792p was ever on the cards. I'm assuming that devs are still thinking in terms of TV inputs - 720p or 1080p. So those are the resolutions they target (for maximum 1:1 fidelity), and then they detour from those targets for performance reasons. Ryse's 900p was 1080p cut down a bit. Titanfall's 792p is 720p plus a bit. I don't think any dev would look at 900p and then think a drop to 792p is okay as the loss in resolution is marginal, because they'll be comparing it to the 1080p benchmark.

If we lose those exact resolutions, we may start to see more generic resolutions selected by performance analysis. This early on though, I'm confident that the resolution choices are really 1080p and maybe 720p, with deviations.

I'd speculate that the number of 1280 x 720 panels receiving a 1280 x 720 input, that have been set to 1:1 pixel mapping, and that are going to be used as an Xbox One gaming display, is insignificant compared to the number of sets that will benefit from having a more detailed image.
That is true, I'll grant. AFAIK most 720p sets are 768p. Still, there are going to be some native 720p sets out there, whereas there are zero 792p sets. If they made the game 768p, I could understand (does XB1 support arbitrary resolutions over DVI?), but 792p is really odd.

792p will provide more detail for any set inputting at 1080 and will also provide a degree of super sampling for 720/768 tv sets.
I don't think any small fraction of supersampling has any positive benefit to IQ. You need at least a decent amount to notice.
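
Putting rough numbers on that, 1408x792 only oversamples the common panel resolutions by a whisker:

[code]
# Supersampling factor of a 1408x792 render on common panels.
render = 1408 * 792
for name, w, h in [("1366x768 panel", 1366, 768), ("1280x720 panel", 1280, 720)]:
    ratio = float(render) / (w * h)
    print("%s: %.2fx pixels (%.2fx per axis)" % (name, ratio, ratio ** 0.5))

# 1366x768 panel: 1.06x pixels (1.03x per axis)
# 1280x720 panel: 1.21x pixels (1.10x per axis)
[/code]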

I'd certainly like to hear the reasoning behind the res choice. It may be a case of the maximum size that'd fit, and that was the limit of reasoning. Maybe looking at a broader picture, a different choice would be made (better AA at a lower res).

* Edit: I may be very wrong on this. Wouldn't be the first time the industry has dragged its feet pointlessly. Can't find a list of TVs supporting Full RGB
 
What can I say. You're a PC snob
You made poorly reasoned arguments, he addressed them...you then respond with this. Really, do you have any idea what forum you're on?
Then there are the GPUs... ugh... The Xenos 2, so to speak, kicks the pants off many PC GPUs.
Well, yes... and? What is the point of this comment? What's relevant is price/performance, and the GPUs that the Xenos 2 "kicks the pants off" are either ancient or integrated.
but it is an evolving console too; when they free up an extra 8% of GPU then things might get even more interesting.
Indeed.

8% more.
 
Literally, pbjliverpool, you always expect your games to be longer, deeper, better-looking, easier to configure and control... mainly just plain better than anything else.

Better looking and easier to control (insofar as higher framerates and more control peripheral options) I'll give you. Not sure about the rest though.

As for the sound, I dare you: run a sound benchmark on any wimpy little 2GHz CPU and compare it to what SHAPE could produce.

That's a straw man argument. No one mentioned "wimpy little 2GHz CPUs" (however you define one of those). We are talking about an AMD FX 6300: a CPU with a base clock speed of 3.5GHz and a turbo clock of 4.1GHz. With well over double the per-core performance of the Jaguars in the XB1 and the same number of cores available to the game, that CPU should quite easily be able to spare a full core or two for audio.

I'm not saying for sure that is or isn't enough to outperform SHAPE in game-related audio. I don't know enough about SHAPE's relative performance in that regard and neither do you, but there's no doubt that the CPU would offer a lot more flexibility for developers to program custom audio solutions should they so wish.

And besides, why bring up SHAPE (or Kinect) in the context of Titanfall? Is Titanfall demonstrating some more impressive audio solution than that which is available on the PC? If not (and there is no evidence of that) then it's of no relevance to the DF article being discussed.

Then there are the GPUs... ugh... The Xenos 2, so to speak, kicks the pants off many PC GPUs.

This is another straw man. No one's saying that Durango isn't more powerful than some PC GPUs. For example, it's more capable than AMD GPUs from at least the Radeon 7770 downwards and their Nvidia equivalents.

But we're not talking about some random PC GPU, we're (or rather DF is) talking specifically about the GTX 760. And that GPU has 72% more shader performance than Durango on paper and 142% more pixel fill rate, geometry performance and texturing performance. And that's assuming the XB1 has access to 100% of its GPU resources, which your 8% comment points out that it doesn't. So even when it gets that 8% back, the extra performance of the 760 detailed above still stands (and more in fact, since the XB1 will still reserve a small percentage of GPU time).
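
For anyone who wants to sanity-check the shader figure, a rough back-of-envelope using the commonly quoted specs (1152 ALUs at a 980MHz base clock for the 760, 768 ALUs at 853MHz for the XB1; the fill-rate and texturing percentages depend on whether you assume base or boost clocks):

[code]
# Peak shader throughput, assuming 2 FLOPs per ALU per clock (FMA).
def gflops(alus, mhz):
    return alus * 2 * mhz / 1000.0

gtx760 = gflops(1152, 980)  # ~2258 GFLOPS
xb1    = gflops(768, 853)   # ~1310 GFLOPS
print("GTX 760 advantage: +%.0f%%" % (100.0 * (gtx760 / xb1 - 1)))  # ~ +72%
[/code]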

which leads me to the point that DF comparisons aren't fair.

I don't see why they are unfair. They quite clearly state the spec of the PC that they are comparing to, and they've gone into detail about both the spec and cost of that PC in the past. It's a £500 PC, so yes, it costs more and lacks some functionality of the XB1 (and vice versa), but they aren't moving the goal posts or comparing the console to a PC that costs 10x as much. It seems to me like a pretty fair comparison.

Besides, in my opinion there is a big difference between standing 3 meters away from your TV using a handheld gamepad and sitting 1 meter away from a PC monitor playing with a desk-mounted mouse and keyboard combo, so it is not really comparable. Yet DF insists on that, and talks crap about the Xbox One. Again, unfair.

I'm not sure what you mean here? What are DF insisting on and how is it unfair?

EDIT: forgot to mention that the PC doesn't have to deal with a small pool of RAM (like the eSRAM), unlike the Xbox One.

Nevertheless, it's a known quantity of the XB1. Developers know it's there and can target its strengths and weaknesses specifically, unlike the DF target PC, which was never targeted specifically by the game's developers and is thus at a clear disadvantage in terms of optimization.
 
Lego the Movie next-gen Digital Foundry face-off:

http://www.eurogamer.net/articles/digitalfoundry-2014-lego-the-movie-next-gen-face-off



Well, they're wrong. Both use the exact same AA, but PS4 has a higher vertical resolution. I suspect native 1920x1080 for XB1 vs 1920x1200 for PS4.

Good find.

You should seriously consider contacting them about resolution checks. Someone else doing it would be extremely refreshing. Although the attitude could use some work.

On those images it's obvious the vertical resolution is different; within 5 seconds I knew PS4 had a higher vertical resolution. How could they miss the PS4 supersampling?
Half the game is under DOF and post-AA on top of that. It's great that you can see a difference (such eyesight, so amaze, much nitpicking), but given the circumstances, I don't agree that it was "obvious" at all.


It's a curious difference. If they're ok with supersampling, why not go for something higher? The bottlenecks should not be on the CPU-side, and the GPU differences are widely known. I suppose we'll never know all the reasoning (if there is one). They did mention a bit more tearing in certain spots on PS4, which probably could have been solved with native 1080p instead. Given that they've already hit the checkbox 1080p feature, there's no reason to make the output quality suffer with tears or wacky scaling factors.

That it's [strike]a fairly typical PC resolution[/strike]* does raise a question about it being a configuration mistake. The PC version itself has a number of issues in the effects department, and... who knows. Maybe they were rushed to hit a deadline (the actual film release). *shrug*

*Does look like it might be 1280 vertical instead http://forum.beyond3d.com/showpost.php?p=1834184&postcount=3902
 
Too bad you can't disable v-sync in the next gen Lego games like you could on some last gen. This would give a good indication of how much headroom they have.
 
What forum do you think you're on? Because this is a tech forum and the analysis that playability suffers due to a framerate varying from 60 to single digits and gobs of screen tear is most certainly a proper analysis.

I've been on this forum for a very long time. I know exactly where I am. I never said it wasn't a proper analysis. The video from Digital Foundry is mostly selected clips of heavy action scenes, and the framerate fluctuates for the most part between 40-60fps, with occasional drops to 30fps and very, very rare drops below that. It is representative of the game.

All I'm saying is that the few people saying the game shouldn't have been released yet are wrong. It plays great, and it's a lot of fun. No, it isn't a technical marvel, but a framerate between 40-60fps isn't too bad. The drops to 30 are certainly still playable, and the odd drops below are so rare and short that they don't affect the game in the larger view of things. Some people may not think those framerates are good enough, coming from PC, but there aren't many console games that do better.

I'm definitely interested in hearing more about why they made certain choices, or what they could have done better. Titanfall 2 should be very interesting, because I'm expecting they have a lot of room for improvement.
 
I'm definitely interested in hearing more about why they made certain choices, or what they could have done better. Titanfall 2 should be very interesting, because I'm expecting they have a lot of room for improvement.

And likely on additional platforms. I would expect they would use Frostbite 3, a much better engine.
 
Too bad you can't disable v-sync in the next gen Lego games like you could on some last gen. This would give a good indication of how much headroom they have.

It would certainly make analysis more interesting for any game: performance profiling, but without the dev tools, so more like general trends across scenes. XD
 

Many PC owners will be playing this game at lower FPS than the console version. If I have to turn down the settings on an Nvidia 680 to get a near-perfect 60 FPS at 1080p, I would expect plenty to be way lower than me (what does a 680-equivalent card cost today?). Simply because it's more or less the norm to play at the native resolution of your monitor. Of course those with lower-res monitors are better off. My friend that plays at 1440p started out with a pretty low FPS. He turned down pretty much everything (disabled AA among other things) to get to an acceptable FPS.

However, on the XBOX1 there is a hardware scaler and a choice of a lower res to get FPS (they should have gone for 720 and a great upscale imho). But more importantly, everybody that plays the game on the XBOX1 gets the same experience, and that (imho) is worth something when it comes to an MP game. When things get heavy and FPS suffers, everyone suffers. And the cheating part on the PC is almost a deal breaker. If I could play this with mouse+keyboard on a console I think that would be where my time would be spent.

My CPU is at around 44% or so; the game doesn't use all 4 cores. The sound excuse is among the weirdest I have heard in an FPS discussion. If you play this game on a Core 2 Duo, decompressing sound is not the major issue for a low FPS; I would say everything else is. And the sound in the game is very limited as well, maybe because of the uncompressed sound, so they have to stream it? The number of sound channels is way too limited; if I have a voice telling me about my Titan that is about to be ready, it often gets cut off by another voice-over telling me something else.. weak...
 

Slight tangent here: Apparently TF should use as many as up to 6 cores if you have them available (assuming the PC version is in step with X1). Have you checked to see that some of your cores aren't 'parked' ?
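
If you want to check, something like this gives a quick per-core view (a minimal sketch assuming Python with the third-party psutil package installed; Windows' Resource Monitor shows parked cores too):

[code]
import psutil  # pip install psutil

# Sample per-core load over a couple of seconds while the game is running.
for i, load in enumerate(psutil.cpu_percent(interval=2.0, percpu=True)):
    print("core %d: %.0f%%" % (i, load))

# Cores sitting near 0% while others are pegged suggests the work isn't being
# spread across all of them (or that some cores are parked).
[/code]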
 
Many PC owners will be playing this game at lower FPS than the console version. If I have to turn down the settings on an Nvidia 680 to get a near-perfect 60 FPS at 1080p, I would expect plenty to be way lower than me (what does a 680-equivalent card cost today?). Simply because it's more or less the norm to play at the native resolution of your monitor.

You're running at 86% higher resolution and achieving a better frame rate than the XB1 (according to DF the XB1 version is far from near perfect) so I'm not sure how applicable it is to compare your experience to the experience of others who only want to match the XB1's experience.

The fact that you had to turn down a few settings (AA and textures I believe?) doesn't say much about what other PC gamers will have to spend on a GPU to achieve XB1 level frame rates.

Sure, some may choose superior image quality over frame rate, but if frame rate is all you care about then even if we specify at least matching the XB1's graphics settings and resolution as a minimum, anyone with a GTX 660 or above should be just fine. And lesser GPUs would be fine too, at the expense of graphics quality and/or resolution.

This video shows a 760 achieving a 63fps average at max settings, 1080p and 4xMSAA:

http://www.digitalstormonline.com/u...-gtx-780-ti-gtx-780-gtx-770-gtx-760-idnum168/

You really don't need much GPU to match the XB1's performance and image quality in this game.
 
Slight tangent here: Apparently TF should use as many as up to 6 cores if you have them available (assuming the PC version is in step with X1). Have you checked to see that some of your cores aren't 'parked' ?

Hmm, gonna look again, but when I checked I only saw around 44%; maybe it just gets things done.

You're running at 86% higher resolution and achieving a better frame rate than the XB1 (according to DF the XB1 version is far from near perfect) so I'm not sure how applicable it is to compare your experience to the experience of others who only want to match the XB1's experience.


I had no idea I wanted to compare the experience; I just wanted to point out that the XB1 version isn't necessarily worse off than the PC version. Those with 1080p monitors will need a powerful GPU in order to get the required FPS. Those on the PC will live with cheaters; those on the XB1 will get the same experience in MP.

And how nice, a video that shows me how the game runs. I could run it on my own rig... no wait, I did :)
I got plenty of frames, way more than 60; it's the drops that I want to avoid.
 
Those with 1080p monitors will need a powerful GPU in order to get the required FPS.

Why? How is playing at a non-native resolution on the PC any different to playing at a non-native resolution on the XB1 where the standard output device will be a 1080p TV?

Why does the PC have the native resolution requirement applied to it while the XB1 doesn't? If your answer is a better scaler then do you have some evidence of that?

I got plenty of frames, way more than 60; it's the drops that I want to avoid.

Fair enough but by specifying that you need to get your minimum frame rate over 60fps you're going way, way beyond the performance benchmark set by the XB1 so I'm confused why you're using that level of performance as a yardstick for how much GPU power you'll need to match the XB1 experience.
 
I'm confused why you're using that level of performance as a yardstick for how much GPU power you'll need to match the XB1 experience.

I am not; I am saying that many PC owners will see the same framerates or lower than the XB1 owners are seeing. You are mixing up 2 things: my own personal experience with the game and how much I have to turn off to get a smooth 60 fps.

And the XB1 experience vs what many PC users will experience. And you will need a beefy PC to get a good experience on the PC.
 

If you're not using your own performance experience with the 680 as some kind of example of how much power you'll need to match the XB1 experience, the following statement was a pretty strange one to make:

Many PC owners will be playing this game at lower FPS than the console version. If I have to turn down the settings on an Nvidia 680 to get a near-perfect 60 FPS at 1080p, I would expect plenty to be way lower than me (what does a 680-equivalent card cost today?).
 