The real PC equivalent of the PS4 console?

Skaal

Hi guys,

I know there are other threads which cover some of the issues I want to talk about here, but since I have a feeling that I'm bringing up a point which is generally not considered (enough), I made a new thread for it.

So here is the thing. We all know those DF videos where they test the PS4 against an i3 / GTX 750 Ti system, in which the PS4 loses in performance most of the time.

My standpoint on that is as follows:

I'm a strong advocate of the theory that a console should and must have a performance edge over a PC with similar components. It is a must! The OS, the RAM diverted to other tasks, DirectX: all of that must add up to a performance bottleneck in a gaming PC.
So I think that the use of an i3 as the CPU in such tests is unfair. The CPU in the test PC should be a lot weaker than an i3.

What about the Athlon 5150? I know it only has 4 cores and the PS4 has 8, but otherwise it is the perfect fit for a comparison. Or is it not?!

Especially with older games which don't fully take advantage of more than 4 cores, like BF4, Assassin's Creed Unity and Syndicate, and Rise of the Tomb Raider.


I found three videos on YouTube which show those games running on a PC with an Athlon 5350 and a GTX 750 Ti, and it cannot even match the PS4's image quality or FPS in those games, even at 720p resolution.

BF4

Assassin's Creed Syndicate


So my main question would be: in the case of games that don't perform better with more than 4 cores on PC, is the comparison of a PS4 and a PC system with either the Athlon 5150 or even the 5350 valid?

Greetings
 
The PS4 has two 4-core Jaguar modules, though. So even if you are only using 4 cores, you could use 2 from module 1 and 2 from module 2. That means the cache and schedulers would be less of an issue on PS4, because each module has its own. Also, the PS4's system memory is faster than the system memory of anything a 5150 supports, and the fillrate is higher than a 750 Ti, which is also limited by its 2GB framebuffer.
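To put rough numbers on that memory gap, here is a quick back-of-the-envelope sketch using bandwidth = (bus width / 8) × effective data rate; the bus widths and data rates below are the commonly quoted figures for these parts, so treat the results as approximations:

```python
# Rough peak memory bandwidth: (bus width in bits / 8) * effective data rate in GT/s.
# Bus widths and data rates are the commonly quoted figures for these parts (approximate).
def bandwidth_gb_s(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

systems = {
    "PS4 unified GDDR5 (256-bit @ ~5.5 GT/s)":        (256, 5.5),
    "Athlon 5150 single-channel DDR3-1600 (64-bit)":  (64,  1.6),
    "GTX 750 Ti GDDR5 (128-bit @ ~5.4 GT/s)":         (128, 5.4),
}

for name, (bus, rate) in systems.items():
    print(f"{name}: ~{bandwidth_gb_s(bus, rate):.1f} GB/s")
```

That works out to roughly 176 GB/s for the PS4's shared pool versus about 13 GB/s of system memory on the AM1 platform and about 86 GB/s on the 750 Ti.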

I think the reason the i3 and 750 Ti have been used as comparisons isn't because of performance, but more because you could build a computer for roughly the cost of a console with those parts.
 
There is no CPU equivalent on PC.

The PS4's GPU is technically a 7870 with 2 compute units disabled and downclocked, with the benefit of having as many ACE units as the 7970, unlike the desktop 7870. A 750 Ti definitely isn't as good; a 1050 Ti 4GB is better, though. On paper the 1050 Ti has less bandwidth, but in reality, with Pascal's memory compression, it's probably, worst case scenario, not any worse than the PS4 with its shared pool of memory.

Keep in mind that not all PS4 games are made with low-level access to the hardware for the sake of getting every last drop out of the machine. In fact I suspect there are a large number of games that might not even utilize 4 of the Jaguar cores. Take a look at how RE5 runs on PS4; that is definitely not a low-level coded port, my dudes. So when you say the PS4 HAS to be getting more out of the hardware than its PC equivalent, well... no, it doesn't. But it's a safe bet that something like Uncharted 4 is getting more out of the PS4 than a high-level coded game on a PC's 7870 could.

see colon said: "I think the reason the i3 and 750 Ti have been used as comparisons isn't because of performance, but more because you could build a computer for roughly the cost of a console with those parts."

Indeed.
 
I think the reason the i3 and 750 Ti have been used as comparisons isn't because of performance, but more because you could build a computer for roughly the cost of a console with those parts.
But it ended up matching the PS4 games in quality and FPS most of the time already.
 
While I have always found the PC versus console experiments very interesting, I think they are somewhat misleading.

As has been indicated here already, these systems, the PS4/Xbox One and their more powerful counterparts, all punch above their weight when used for that intent. A general-use PC with that i3/750 Ti, at least in my opinion, would never be able to replicate a game like Spider-Man to the extent the PS4 does: a tailor-made game playing to the strengths of a closed system. At least that is how I see it, but I'm no programmer. I just feel you need to step up the PC hardware a bit when running a full Windows OS to get the same effect as these consoles, so I can understand why the i3 was chosen, with affordability also in the equation.

Skaal, your post did make me think about some of the latest arcade titles that have come out within the last few years. A lot of them are using relatively modest hardware but putting out some impressive visuals, probably because they are running stripped-down versions of Windows or a flavor of Linux, where you can get away with much less beefy hardware. I was really surprised to find out that Cruis'n Blast only uses a 750 Ti; I thought for sure it was going to be using a Maxwell card of some flavor.
 
I'm a firm believer that i3s, Sandy and Ivy Bridge in particular, are reasonably good equivalents to the 8 Jaguar cores in the consoles. Westmere would be a stretch because of its weaker SIMD implementation. Battlefield 1 is the only game that had issues running on the i3-2120 based machine I used to have; it's definitely a quad-core game. However, I know more recent i3s didn't have issues. Everything else, like GTA5 with all the pedestrian and traffic options at max, ran fine, with few dips below 30 FPS.

Base PS4 graphics are broadly equivalent to a Radeon 7850. The 7850 has a couple fewer enabled compute units, but a higher clock, so it roughly matches the PS4's ~1.8 TFLOPS. Console optimization would put it, at best, in 7870 (~2.5 TFLOPS) territory. I used to have an R9 270X, a rebranded 7870 with faster memory, and it always seemed a bit of a step ahead of the PS4 in multiplatform games.

The base Xbox One is broadly equivalent to a Radeon R7 260: same number of CUs, TMUs, and ROPs, using a partially disabled Bonaire GPU die. However, it is clocked faster by default and enjoys ~200 GFLOPS of extra horsepower over the original Xbox One, and only about 100 over the Xbox One S. A 7850 would definitely give you a big enough leg up in theoretical GFLOPS to keep it above the XB1 in game performance. Currently I'm using an RX 550 2GB, and I feel it's fairly similar to Xbox One performance. While my particular card maintains about 1.1 TFLOPS with default boosts, it does use the more efficient Polaris architecture, and it overclocks easily to 1.4 GHz for an extra ~300 GFLOPS if I want.
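For reference, here is a quick sketch of where those TFLOPS figures come from, using the usual peak-FP32 formula (shaders × 2 FLOPs per clock × clock); the shader counts and clocks below are the commonly quoted reference figures, so treat the results as approximations:

```python
# Rough peak FP32 throughput: shaders * 2 FLOPs per clock (multiply-add) * clock in GHz.
# Shader counts and clocks are the commonly quoted reference figures (approximate).
def peak_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

gpus = {
    "PS4 (18 CUs @ ~0.80 GHz)":          (1152, 0.800),
    "Radeon 7850 (16 CUs @ ~0.86 GHz)":  (1024, 0.860),
    "Xbox One (12 CUs @ ~0.85 GHz)":     (768,  0.853),
    "R7 260 (12 CUs @ ~1.00 GHz)":       (768,  1.000),
    "RX 550 (8 CUs @ ~1.10 GHz)":        (512,  1.100),
}

for name, (shaders, clock) in gpus.items():
    print(f"{name}: ~{peak_tflops(shaders, clock):.2f} TFLOPS")
```

That lines up with the numbers above: ~1.84 for the PS4, ~1.76 for the 7850, ~1.31 for the Xbox One, ~1.54 for the R7 260, and ~1.13 for a stock RX 550.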
 
@davew hah, I had no idea Cruis'n survived after the N64 era :p From YouTube it kind of looks like 6th-gen-ish era CGI, not bad.

The 750 Ti may be in the Kepler-era 7xx series, but it's actually Maxwell 1.0 architecture. The 9 series is Maxwell 2.0.
 
I miss when Digital Foundry still tried to match the base PS4 experience. The last time I noticed they had their i3 and 750 Ti in a comparison was Rise of the Tomb Raider, where that PC could only push 13-25 FPS at settings comparable to the Xbox One.
https://www.eurogamer.net/articles/digitalfoundry-2016-rise-of-the-tomb-raider-pc-face-off

Another mention was for Deus Ex: Mankind Divided, where they recommend a GTX 950 for 30 FPS at console-equivalent settings.
https://www.eurogamer.net/articles/digitalfoundry-2016-deus-ex-mankind-divided-face-off
 
Anything DX11 on PC will have higher API overhead than the PS4/XB1, which will particularly affect slow but wide CPUs like the 8-core arrangement in the consoles.

DX12 might fare better than DX11, depending on the implementation.
 
Thanks for your contributions so far! On the i3: I don't have much knowledge of CPU internals. Could someone point out what exactly makes it so fitting for a Jaguar comparison? And might there be some other CPU? To me an i3 still appears too powerful.
 
Will the real PS4 equivalent please stand up, please stand up, please stand up?

Nothing on PC is going to be equivalent to hardware in a closed box environment like PS4, simply because of things like lower OS overhead and being solely focused on games and media, whereas a PC is designed to multitask and be as general purpose as possible.
 
Thanks for your contributions so far! On the i3: I don't have much knowledge of CPU internals. Could someone point out what exactly makes it so fitting for a Jaguar comparison? And might there be some other CPU? To me an i3 still appears too powerful.
It's not really a fitting comparison, but a Sandy Bridge i3, in the worst-case scenario for the i3, might be comparable to the Jaguars, assuming all 6 cores available to games are utilized on the latter. But if the same number of threads are running on console and PC, the i3 pulls ahead. The i3 is just the closest thing in price to a console you could build at the time, as was already stated.

The non-enhanced consoles are probably closest to a moderately clocked Core 2 Quad or Phenom II. The enhanced consoles are somewhere in between that and a first-gen i5. You can always try to compare data on something like http://cpu.userbenchmark.com/Compare/Intel-Core-i3-2130-vs-AMD-Athlon-5350-APU-R3/m424vsm10020

Really, though, the platforms aren't directly comparable. One is a digital-only platform with wider options but also incompatibility issues, while consoles have physical media and ease of use. So I really wouldn't worry about what a "fair" comparison is and just play what you like.
 
But what PC hardware equivalent can run UC4, HZD, Spider-Man, DC.... RDR2! :mrgreen:
I would say something like an 8-core Jaguar clocked somewhere near 1.6 GHz, but a part like that doesn't exist on PC. I think looking at the system requirements of PC ports of PS4 and XB1 games is telling. Most games require a Core 2 Quad, or an FX 8000 series, or an i3, so I think anything in that range is probably about where we're at in real-world performance. Maybe a bit lower, in fact, as I've seen plenty of YouTube videos of people running games that require an 8-core FX on 6-core ones just fine. Also, other than DC (which I'm having trouble figuring out what game you are referencing), all of those games are 30 FPS. Those games look great, but the framerate reflects the computational speed of the system as a whole.

I think it's important to remember that PC gamers can be a bit... dramatic, and I think that's putting it kindly. Look at the few games recently that have shipped with a 30 FPS max framerate and you'll see how they are regarded as having terrible performance and being unplayable. It even happens sometimes with games that max out at 60. So, hypothetically, if a game like Horizon came out on PC with identical or even better visuals but a 30 FPS cap, it wouldn't be held in the same high regard as its console counterpart, regardless of how good a game it is.
 
You missed the point. I was not talking about graphical fidelity, but exclusives; they are the main selling factor for the PS4. If every PS4 game were released on PC, the value would be a lot less interesting for sure!
 
You missed the point. I was not talking about graphical fidelity, but exclusives; they are the main selling factor for the PS4. If every PS4 game were released on PC, the value would be a lot less interesting for sure!

I don't think he missed the point, as a PC that's equivalent in hardware, let's say a downclocked FX 8000 series CPU and a 7850, would be able to match or exceed non-Pro PS4 performance in games like GoW and HZD, if they were available on the platform and were optimized, of course. Those games aren't on PC, and even if they were, they wouldn't be optimized for it the way they are on PS4. Not that that would be a problem, though, as PC gamers want at least 60 FPS and native 1080p/4K anyway. I think if HZD were on PC, at least a 7970 with something like a fast i5 or i7 would be better suited; that's hardware from 2012.
Personally, if HZD were on Windows I would be able to give my i7 6700K / GTX 980 / 16GB PC a workout and play the game at perhaps much higher settings, in 4K if possible. Seeing that the Pro sports a GTX 970 equivalent GPU, I don't see that as impossible.
 
lol, I thought this thread was about that new AMD Athlon processor that just came out, and that it was a PS4 in PC format.
 
I don't think he missed the point, as a PC that's equivalent in hardware, let's say a downclocked FX 8000 series CPU and a 7850, would be able to match or exceed non-Pro PS4 performance in games like GoW and HZD, if they were available on the platform and were optimized, of course. Those games aren't on PC, and even if they were, they wouldn't be optimized for it the way they are on PS4. Not that that would be a problem, though, as PC gamers want at least 60 FPS and native 1080p/4K anyway. I think if HZD were on PC, at least a 7970 with something like a fast i5 or i7 would be better suited; that's hardware from 2012.
Personally, if HZD were on Windows I would be able to give my i7 6700K / GTX 980 / 16GB PC a workout and play the game at perhaps much higher settings, in 4K if possible. Seeing that the Pro sports a GTX 970 equivalent GPU, I don't see that as impossible.

I agree with the FX-8150 sentiment: its modules are made up of dual integer cores sharing a pair of 128-bit FMAC FPU units. Clock it down and you probably have a reasonable facsimile.

As for HZD on the PC, that's probably the generation's most unsuitable game to port to PC, since it uses the GPU for procedural generation and placement of game-world objects. That could be a problem even if you port it to DirectCompute to run on either AMD or Nvidia. However, Guerrilla said in its own presentation that it only used 250 µs of GPU time on average, which is hardly anything at all. On a much narrower set of SIMD engines (like a CPU's) I guess you could spend extra CPU cycles where the GCN GPU could do it in fewer, but doing it efficiently might require dedicating an entire CPU core or two to the job to keep things snappy. I also wonder if GCN's async compute scheme played a role in how fast it could be done. I'm sure this whole scheme saved Guerrilla a good deal of CPU power; otherwise there would've been no point in doing it on the GPU array.
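To put that 250 µs in perspective, here is a quick sketch, assuming the figure Guerrilla quoted is per frame:

```python
# How small 250 µs of GPU time is against a 30 FPS frame budget
# (assuming the quoted figure is per frame).
frame_budget_ms = 1000 / 30      # ~33.3 ms of frame time at 30 FPS
placement_ms    = 250 / 1000     # 250 µs expressed in milliseconds

share = placement_ms / frame_budget_ms
print(f"Frame budget: {frame_budget_ms:.1f} ms")
print(f"Procedural placement: {placement_ms:.2f} ms (~{share:.1%} of the frame)")
```

That is well under 1% of each frame's GPU time, which is why it costs the GPU so little while still offloading the CPU.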
 
let's say a downclocked FX 8000 series CPU
The floating-point performance per clock is lower on an FX 8000 by about half, though. So you would need double the clock speed to match the theoretical FP performance, but then your single-threaded performance would also be about double. AMD has never released a desktop PC part that's exactly like the CPUs in the Xbox One and PS4.
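To illustrate that trade-off, here is a quick sketch; the per-clock figures below simply take the "about half per clock" claim above at face value and are not measured numbers:

```python
# Illustrative only: peak FP throughput = cores * FLOPs per cycle * clock (GHz).
# The FLOPs-per-cycle values just take the "about half per clock" claim at face
# value; they are not measured figures for either CPU.
def peak_gflops(cores, flops_per_cycle, clock_ghz):
    return cores * flops_per_cycle * clock_ghz

jaguar = peak_gflops(cores=8, flops_per_cycle=8, clock_ghz=1.6)  # console-style 8-core Jaguar
fx     = peak_gflops(cores=8, flops_per_cycle=4, clock_ghz=3.2)  # half per clock, double the clock

print(f"8-core Jaguar @ 1.6 GHz : ~{jaguar:.0f} GFLOPS")
print(f"FX-style 8-core @ 3.2 GHz: ~{fx:.0f} GFLOPS (same peak, but twice the single-thread speed)")
```

Same theoretical peak either way, but the FX would be running each thread roughly twice as fast, which is the point about single-threaded performance above.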
 