*spin* "Low Settings" on PC ports vs Consoles

That would have been true if the consoles' architecture were vastly different from PCs, as in the past. Now that PCs and consoles share the same underlying x86 architecture, whatever applies to consoles will also apply to PCs.

I still think it's a very significant advantage that they can optimize the game for the console's performance characteristics, since all the millions of consoles will run exactly the same.

Yeah, the PS3 was delayed, and when it eventually came out PCs had moved on technically. But few people actually had 8800 GTXs; the vast majority of PC players had GeForce 7s and 6s, or Radeon X19xx and X18xx cards.
When the X360 came out, we (average PC players) suffered in three games:
1- Tomb Raider: Legend, which was a real chore to run even on an 8800 GTX, with amazing visuals and state-of-the-art technical prowess using the full armada of DX9 at the "Next Generation" game preset.

2- Splinter Cell: Double Agent, same as Tomb Raider; it has a "Next Generation" preset too. At the time of release it didn't even support the GeForce 8 family of GPUs, and it required a patch to run on them.

3- Call of Duty 2: same as the above two. The DirectX 9 path of the game (effectively a next-gen preset) was extremely taxing on anything but the 8800s.

Oblivion and Rainbow Six Vegas also gave PCs trouble, but to a milder extent. PCs didn't recover until the GeForce 8 family and Intel Core 2 Duos became widespread. Granted, the PC versions of these games had better image quality (texture resolution, AA, AF, draw distance, shadow resolution), but you get the idea.

Only very high end PCs? Now even medium PCs are better than consoles. Just slap a GTX 760 on a Core i5 and you are good to go.

At the time the PS3 was released you had the older high-end GPUs, with lower-cost options like the 7900 GS as a very affordable card (sub-$200) that had decent performance in these games, and also the X1950 Pro and so on.

(CoD 2 DX9: http://www.legitreviews.com/xfx-geforce-7900-gs-480m-video-card-review_384/8
Splinter Cell max: http://www.anandtech.com/show/2130/7)

A few months after the 8800 GTX you had the market flooded with the more affordable and very fast 8800 GTS 320MB, and six months or so later you had 8600 GTs everywhere. While the 8600 GT was way down on specs, it was already giving the "high end" GeForce 7s a hard time in the more complex DX9 games, and Core 2 vs K8 also had a huge effect on CPU pricing during the following year.

Now, against the 360 things were a lot harder for the PC, but still, I think you already had competent options on the PC even if you didn't spend top dollar to go high end.

Many PC gamers still had old Radeon 9600s and such, so running new 2006 games was really painful, but cards like the X1800 GTO already offered nice performance in these games just a few months later, without paying high-end card money.

But I understand your point: right now it's clear the 760 + any i5 is superior and affordable. Going back to 2006 and comparing things, with single-core CPUs still being a factor and the consoles using more exotic hardware, it's not that simple.
 
Now even medium PCs are better than consoles. Just slap a GTX 760 on a Core i5 and you are good to go.
A medium PC has a graphics card costing US$250 (based on Newegg pricing).

$250 just for the video card, mid-range? You serious?

Based on benchmarks it seems to perform about the same as a PS4.
 
I still think it's a very significant advantage that they can optimize the game for the console's performance characteristics, since all the millions of consoles will run exactly the same.
I would expect that to be vastly more significant than the fact that the CPUs happen to use the same ISA.
 
The blanket statement that PCs will have way superior graphics is a pet peeve of mine; my PC sure as hell ain't going to have better graphics. Even Digital Foundry uses it in their verdicts: they say things like the PC version of a game is better because of SSDs and whatnot, but the last time I checked my PC doesn't have an SSD. There is a big difference between a normal PC and an $800 high-end PC.

Another thing: what's going to be the difference between low and ultra? Because if it's mainly image quality, it doesn't matter as much. Yes, when console games were sub-720p and blur fests it made a big difference, but now that console gamers have 1080p with reasonable post-AA it's not such a big deal for them.
 
A medium PC has a graphics card costing US$250 (based on Newegg pricing).

$250 just for the video card, mid-range? You serious?

Based on benchmarks it seems to perform about the same as a PS4.

Medium/mid-range spans a range of performance. The 760 is most certainly in the medium range, as confirmed by its name (the 60).

However, it's also pretty much at the top end of said range and overkill if all you want to do is match console capability.

A better comparison would be the $150 R7 265: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127790

This GPU is marginally faster than the one in the PS4 in most respects, so it should give a very comparable experience.
 
I would expect that to be vastly more significant than the fact that the CPUs happen to use the same ISA.

Take the GPU in my above post and pair it with a six-core AMD CPU, though, and how much room does that leave for console optimisation?

Bearing in mind that said CPU has the same ISA and SIMD instruction set as the consoles and a comparable number of cores, while the GPU has the same size caches, the same ratio of functional units (as the PS4) and an almost identical feature set.

Sure, there's still going to be some room for console-specific optimisations. But clearly those optimisations are going to be far more applicable back to PCs than they ever have been in the past.

Compare that to optimising code for execution on Cell, or to make use of the 360's eDRAM. Those kinds of optimisations were hopelessly incompatible with the PCs of the day, but things are very different now.

And when DX12 hits, which from the sound of it may actually be identical or very nearly identical to the XB1 version, console-specific optimisations are going to be very thin on the ground compared to just a year ago.
 
Another thing: what's going to be the difference between low and ultra? Because if it's mainly image quality, it doesn't matter as much. Yes, when console games were sub-720p and blur fests it made a big difference, but now that console gamers have 1080p with reasonable post-AA it's not such a big deal for them.

I assume by extension that you consider the XB1 and PS4 to be even, since the resolution differences between them (generally in the region of 900p to 1080p) are also no big deal?

Incidentally, I'd agree if that were the only difference. However, once you add in framerate improvements and the additional graphical effects which seem to be present in many of even the first-generation games, the advantages seem more significant. And of course that's before we add in the impact that VR/3D has on the resolution/graphics/framerate triad.
 
I assume by extension that you consider the XB1 and PS4 to be even, since the resolution differences between them (generally in the region of 900p to 1080p) are also no big deal?

Incidentally, I'd agree if that were the only difference. However, once you add in framerate improvements and the additional graphical effects which seem to be present in many of even the first-generation games, the advantages seem more significant. And of course that's before we add in the impact that VR/3D has on the resolution/graphics/framerate triad.

What I was trying to get across with that paragraph was that the mainstream gamer might not care about those differences, but I know full well that a high-end PC always has and always will win out in a hardware race. I also don't think people should say that the PC version will be superior. It might be on yours, but that doesn't mean it will be on mine. They should always qualify it with 'high end' or whatever.
 
I am currently playing Sly Cooper on PS3.

The game can be considered a very low-settings game (or a PS2 game, but in HD and at 60fps), yet I find it more graphically attractive than BioShock Infinite and Remember Me on PS3. Those two games really push the PS3 graphically, like a PS3 on ultra settings.

But Sly Cooper has by far the better image quality (MLAA), whereas the other two games are wrecked by a very strong FXAA. I finished BioShock because the game is great, but I couldn't keep playing Remember Me after one hour. Most of the high-resolution textures shimmer constantly because of the FXAA, and of course all the details are blurred; it's really awful and really distracting.

FXAA blurs everything, and it's worse in motion because of the FXAA-induced texture shimmering and the pixel shimmering of sub-pixel geometry seen when moving. It's such a terrible post-effect solution; whenever I play a strongly FXAA'd game, all I see is the blurring, the lack of clarity, the texture shimmering, the sub-pixel geometry shimmering...
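To be concrete about why it blurs: a post-AA pass like FXAA only sees the final framebuffer, so anywhere the local luma contrast is high it blends the pixel with its neighbours, and it can't tell a real geometric edge from high-frequency texture detail. Here's a heavily simplified sketch of that idea in Python/NumPy; it's nothing like the actual FXAA 3.11 shader (which estimates an edge direction and blends along it), just an illustration of why detailed textures get softened, with a box blur standing in for the directional blend and a made-up threshold:

```python
import numpy as np

def luma(rgb):
    # Rec. 709-style luma weights, as typically used by post-AA passes
    return 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]

def simplified_post_aa(image, contrast_threshold=0.05):
    """Toy FXAA-like pass: flag pixels whose neighbourhood luma contrast
    exceeds a threshold, then blend them with their surroundings.
    Texture detail trips the same contrast test as geometric edges,
    which is exactly why it ends up blurred and shimmering in motion."""
    y = luma(image)
    up    = np.roll(y,  1, axis=0)
    down  = np.roll(y, -1, axis=0)
    left  = np.roll(y,  1, axis=1)
    right = np.roll(y, -1, axis=1)
    contrast = (np.maximum.reduce([y, up, down, left, right])
                - np.minimum.reduce([y, up, down, left, right]))
    edge_mask = contrast > contrast_threshold

    # 3x3 box blur standing in for FXAA's edge-directed blend
    blurred = sum(np.roll(np.roll(image, dy, axis=0), dx, axis=1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0

    out = image.copy()
    out[edge_mask] = blurred[edge_mask]
    return out
```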
 
Medium/mid-range spans a range of performance. The 760 is most certainly in the medium range, as confirmed by its name (the 60).
Perhaps by name it is, but at its price (~NZ$400)? I've always bought medium-range graphics cards and have never bought anything over NZ$250.

This GPU is marginally faster than the one in the PS4 in most respects, so it should give a very comparable experience.
Are there any sites that do benchmarks between PC and PS4/Xbone? Digital Foundry do, but they don't disclose what PC they use for some strange reason. I have no doubt the hardware is the same or better in the PC, but you'll also be seeing more OS/graphics driver overhead.
 
Perhaps by name it is, but at its price (~NZ$400)? I've always bought medium-range graphics cards and have never bought anything over NZ$250.

Nvidia has the x30 and x40 ranges (low end), x50 and x60 (mid range) and x70, x80 and Titan for the high end. Considering the much higher costs of the three higher ranges, I'd say it's fairly reasonable to consider the 760 to be at the top of the mid range.

Are there any sites that do benchmarks between PC and PS4/Xbone? Digital Foundry do, but they don't disclose what PC they use for some strange reason. I have no doubt the hardware is the same or better in the PC, but you'll also be seeing more OS/graphics driver overhead.

Obviously none directly; you have to infer performance. For example, DF did a direct comparison between the 260X and the PS4, with the 260X being a little slower but not by a huge amount. The 265, meanwhile, is a fair amount faster thanks to having double the memory bus width. The best comparison I can find for the 265 is to look at Battlefield 4 benchmarks, where it manages 58fps on high at 1080p. The PS4 achieves a similar rate at roughly high settings and 900p.

http://www.anandtech.com/show/7754/the-amd-radeon-r7-265-r7-260-review-feat-sapphire-asus/8
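For what it's worth, a quick back-of-the-envelope check on those numbers (raw pixel throughput only, ignoring settings differences, CPU limits and driver overhead, so treat it as a sanity check rather than a benchmark; the resolutions are assumed to be 1920x1080 and 1600x900):

```python
# Pixel throughput implied by the figures above: R7 265 at 1080p/58fps
# vs PS4 at ~900p and a similar framerate.
pc_rate  = 1920 * 1080 * 58   # ~120 Mpixels/s
ps4_rate = 1600 * 900 * 58    # ~84 Mpixels/s

print(pc_rate / ps4_rate)     # ~1.44x more pixels per second at the same settings
```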
 
But wouldn't the PS4 hold up better in the long run than a comparable GPU (say an R7 265 or 750 Ti) with console optimizations etc? Obviously you can upgrade your PC whenever you want, but I'm speaking in terms of value.
 
But wouldn't the PS4 hold up better in the long run than a comparable GPU (say an R7 265 or 750 Ti) with console optimizations etc? Obviously you can upgrade your PC whenever you want, but I'm speaking in terms of value.

It was clearly the case for the PS3 vs the GeForce 7.

But the GeForce 7 was abandoned on the PC very quickly; perhaps even before the PS3 was released the focus was already on G80. Meanwhile it looks like GCN is going to be AMD's main architecture for a long time, and they will focus their optimization on GCN...

But who knows: "GCN 1.0" cards (like the 265) are already showing disadvantages, with Mantle, for example, scaling worse and having more bugs on them compared to "GCN 1.1" cards (290, 260).

Talking about value, if you look at the old gen, the cost of more expensive games/accessories and services (especially for the 360), not to mention the bizarre failure rates of the early hardware, probably doesn't leave an average PC with a few upgrades over the years looking too bad.

But I think low-cost PCs were only clearly competitive with the consoles later, in 2008-2009 perhaps.
 
But wouldn't the PS4 hold up better in the long run than a comparable GPU (say an R7 265 or 750 Ti) with console optimizations etc? Obviously you can upgrade your PC whenever you want, but I'm speaking in terms of value.

I'm sure there's some level of optimisation possible, and there are plenty of people on these forums who can give a much better answer than I can, but I'd have to question just how far you can optimise a game for the GPU in the PS4 compared with, say, the 265, which is a virtually identical GPU (right down to cache sizes, ratio of functional units, raw performance characteristics, etc.). Obviously that's less applicable to Nvidia architectures, though.

There's also the option of going outside of the DX11 spec on the consoles to do things in more efficient ways than are possible on the PC. But DX12 should equalise a lot of that - at least for developers who have the time to take advantage of it.
 
Well, I'm not the most educated guy in this area, but from reading developer comments it seems that being a fixed box where the hardware doesn't change is a pretty big deal for them in terms of optimization. The overhead issues on Windows seem to be going in a better direction with Mantle or DX12 etc., but I still feel that the PS4 is a considerably better value than a comparable GPU today.
 
I'm sure there's some level of optimisation possible, and there are plenty of people on these forums who can give a much better answer than I can, but I'd have to question just how far you can optimise a game for the GPU in the PS4 compared with, say, the 265, which is a virtually identical GPU (right down to cache sizes, ratio of functional units, raw performance characteristics, etc.).

Several ACEs up Liverpool's sleeve. ;) :p :runaway: (Who knows what that really amounts to.)
 
I'm sure there's some level of optimisation possible, and there are plenty of people on these forums who can give a much better answer than I can, but I'd have to question just how far you can optimise a game for the GPU in the PS4 compared with, say, the 265, which is a virtually identical GPU (right down to cache sizes, ratio of functional units, raw performance characteristics, etc.).
You could optimise for that GPU, but no PC game is going to, because it's a small part of the whole range of GPUs. Code for the PC has to be fairly generic to run on any old GPU, with some rare added features or refinements. That's the whole issue with PC utilisation: the micromanagement option isn't viable. It's becoming increasingly less viable for consoles too, due to the size and complexity of the code base, but as long as only the middleware vendors are worrying about that, they can really fine-tune their engines for each platform (and then have the individual games hack them up as they tweak them to their own needs...).
 
I was more coming from the angle of the developer optimising the engine for specific cache sizes, or a specific balance of shader vs ROP vs geometry vs memory bandwidth resources. If the engine is fundamentally balanced for particular resource ratios (the PS4's), then those ratios would also apply to the 265, since it's more or less the same GPU. Obviously that balance would be less applicable to other GCN GPUs that have different resource ratios, although optimisations for the idiosyncrasies of the GCN architecture, such as cache size/speed, feature set, etc., should benefit all GCN GPUs. Obviously I may just be greatly oversimplifying the matter, though.
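Just to illustrate the kind of thing I mean by "balanced for particular resource ratios" (all the constants here are made up for the example, not real PS4 or R7 265 figures): on a fixed console you can bake something like a tile size for a tiled lighting pass straight into the engine, because you know exactly what on-chip budget it has to fit in, whereas a PC build has to pick a conservative value or re-derive it per GPU. A minimal sketch, assuming a hypothetical 64KB per-tile budget:

```python
# Hypothetical illustration of tuning for a known hardware budget: pick a
# screen-space tile size for a tiled lighting pass so the per-tile working
# set fits in an assumed on-chip budget. All numbers are assumptions made
# for the example, not real PS4/R7 265 figures.
LOCAL_MEM_BYTES     = 64 * 1024  # assumed on-chip budget per tile
BYTES_PER_PIXEL     = 16         # assumed G-buffer data touched per pixel
BYTES_PER_LIGHT     = 48         # assumed per-light data cached per tile
MAX_LIGHTS_PER_TILE = 64

def largest_fitting_tile():
    # Try power-of-two tile edges from large to small; keep the first one
    # whose working set still fits in the assumed budget.
    for edge in (64, 32, 16, 8):
        working_set = (edge * edge * BYTES_PER_PIXEL
                       + MAX_LIGHTS_PER_TILE * BYTES_PER_LIGHT)
        if working_set <= LOCAL_MEM_BYTES:
            return edge
    return 8

# On a console this would effectively be a compile-time constant, since the
# hardware never changes; a PC engine has to stay conservative or probe the
# GPU at runtime, which is the generic-code problem described above.
print(largest_fitting_tile())  # -> 32 with these assumed numbers
```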
 