Current Consoles vs High End PCs

Examples are The Witcher 2 and a Need for Speed title (can't remember which one - the entry in the series made by the Burnout guys). My PC can't run them at a decent frame rate despite having a more powerful spec than the Xbox 360.
I have a Radeon HD 4770 and a 4-core CPU.

This is simply a matter of max PC settings ≠ console settings. PC ports will often ramp up shadow resolution, texture resolution, reflection update frequency, and lots of other things that might not be massively obvious but will sap plenty of power.

If you run at the same settings and resolution as the consoles, then your PC is always going to provide massively more performance, even given the constraints of the PC API and the lack of hardware-specific optimisation.
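
To put rough numbers on how quickly those less obvious settings add up, here's a back-of-the-envelope sketch. The resolutions are illustrative picks, not taken from any particular game: bumping a shadow map from a console-typical 1024x1024 to the 4096x4096 a PC ultra preset might use means 16x the texels to rasterize, filter, and store.

```cpp
#include <cstdio>

int main() {
    // Illustrative shadow-map sizes; real games vary per title and preset.
    const int console_dim     = 1024;  // console-class shadow map (assumed)
    const int ultra_dim       = 4096;  // typical PC "ultra" preset (assumed)
    const int bytes_per_texel = 4;     // e.g. a 32-bit depth format

    const long long console_texels = 1LL * console_dim * console_dim;
    const long long ultra_texels   = 1LL * ultra_dim * ultra_dim;

    // Cost (fill, filtering, memory) scales with the texel count, i.e. dim^2.
    std::printf("texel ratio: %lldx\n", ultra_texels / console_texels);  // 16x
    std::printf("memory per map: %.0f MB vs %.0f MB\n",
                console_texels * bytes_per_texel / (1024.0 * 1024.0),    // 4 MB
                ultra_texels   * bytes_per_texel / (1024.0 * 1024.0));   // 64 MB
    return 0;
}
```

And that's one setting; stack the same kind of scaling across texture resolution, reflection updates, and AA, and it's easy to see where the extra GPU power goes without the result being "massively obvious" in a screenshot.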
 
It'll be interesting to see how PC games compare to next gen console games.

I can't see why they wouldn't handle them perfectly from day 1 just as they did at the start of this generation.

They already have the power (or certainly will by the time the consoles launch anyway). The reason we aren't seeing next-gen-like graphics now is a development constraint, nothing else.

Next gen, as I alluded to above, PCs will actually be in a much better position than they were last generation, as there'll no longer be an expectation to render at a higher resolution, and the DX11 API is far more streamlined than DX9 was.
 
The timing of Solarus's post is good, as it jibes with what I was saying in a different thread. Maybe it's totally true and he doesn't see the difference, even in a game like Crysis 2, and PC and console really do look the same to him. It's what I've been saying: there are more and more people out there that just can't tell. To me, a difference as simple as 2xAF vs 16xAF is utterly massive, a night-and-day difference, and one of the many reasons I game on PC, since PC games look dramatically better than console with all the improvements they provide.

But that's just me, and the reality is that guys like me are becoming more and more the minority with every day that passes. I personally know many people like Solarus that just can't tell. Yeah, it's utterly mind-boggling to me how one can look at a PC game and a console game and view them as remotely comparable, but my opinion is irrelevant; that's where things are headed, more and more. These are the new breed of gamer that will be shaping the hardware that comes. It's the same reason I've been arguing that in the future it will take a ridiculous amount of GPU improvement for the majority of people to be able to see a difference in visual fidelity, and why, in the far future, smaller devices will be able to compete with devices that offer 50x the GPU power, even in the core gamer market.
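
For what it's worth, that 2xAF vs 16xAF difference isn't free either way. Under the textbook model (real GPUs adapt the probe count per pixel, so treat this strictly as an upper bound, not a typical cost), NxAF takes up to N trilinear probes along the axis of anisotropy, each trilinear probe being two bilinear probes of four texel fetches:

```cpp
#include <cstdio>

// Textbook worst case for anisotropic filtering: up to N trilinear probes
// along the line of anisotropy, each trilinear probe = 2 bilinear probes
// (adjacent mip levels), each bilinear probe = 4 texel fetches. Hardware
// chooses the probe count per pixel, so this is an upper bound only.
long long worstCaseFetches(int maxAniso) {
    const int trilinearProbes = 2;
    const int bilinearFetches = 4;
    return 1LL * maxAniso * trilinearProbes * bilinearFetches;
}

int main() {
    std::printf("2xAF  worst case: %lld texel fetches/sample\n", worstCaseFetches(2));  // 16
    std::printf("16xAF worst case: %lld texel fetches/sample\n", worstCaseFetches(16)); // 128
    return 0;
}
```

The flip side of that cost is that 16xAF keeps textures sharp at grazing angles where 2xAF smears them into mud, which is exactly the night-and-day difference described above.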
 
And yet, last fall, I saw many examples on my PC with an E7200 @ 3.2 GHz and an 8800 GTS where this card was destroyed by the Xbox 360. Examples: MW3 at 15-20 fps at a resolution similar to the console's, Skyrim at 30 fps but with huge dips, and a totally unplayable disaster in Battlefield 3 (playable only in the tank and jet levels), even at the lowest settings and 1024x768.

The only way for this to be true is if you are running the PC games at much higher settings than the console. The following benchmarks show that even the integrated HD 4000 can easily handle BF3 at low settings and console+ resolutions, and an 8800 GTS would easily outperform an HD 4000.

http://www.firingsquad.com/hardware/Intel_Core_i7-3770K_Ivy_Bridge_Performance_Review/page14.asp

http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/14

This time will be different: these engines will be optimized from the ground up for consoles. And if the architectures they choose (for example, even heavier MT like IBM's 4 threads per core, or some kind of Fusion-type design with a strong point in low-latency, high-bandwidth connections, interposers, or maybe eDRAM) have strong points exactly where the PC's bottlenecks are, there will be a nasty few years on the PC ports side...

Today's games are already optimised from the ground up for consoles; that's not going to change significantly next generation - engines are still going to be built on PC workstations. What will change is the PC's ability to rely on a much more efficient API (DX11) than it's had this generation (DX9), the lack of any requirement to spend extra power on higher resolutions because consoles will also be targeting 1080p, and, most importantly, the far greater relative limitation on power draw for the consoles this time round. I expect all this adds up to even mid-range PCs faring well in next-gen console games from day 1, and the gap widening rapidly from there.
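
To make the API point concrete: D3D11 exposes multithreaded command recording through deferred contexts, something D3D9 never offered (all D3D9 work was funnelled through a single device interface). A minimal sketch of the submission model, with error handling mostly elided - it records and replays an empty batch, drawing nothing:

```cpp
// Minimal sketch of D3D11 deferred contexts (Windows only; link d3d11.lib).
// D3D9 had no equivalent: all submission went through one device interface,
// effectively serializing draw-call work onto a single thread.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device*        device    = nullptr;
    ID3D11DeviceContext* immediate = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &immediate)))
        return 1;

    // A deferred context records commands without touching the GPU;
    // an engine would typically create one per worker thread.
    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return 1;

    // ... a worker thread records state changes and draw calls on `deferred` ...

    ID3D11CommandList* cmdList = nullptr;
    deferred->FinishCommandList(FALSE, &cmdList);

    // The main thread replays the whole recorded batch in one cheap call.
    immediate->ExecuteCommandList(cmdList, FALSE);

    cmdList->Release();
    deferred->Release();
    immediate->Release();
    device->Release();
    return 0;
}
```

Driver support for genuinely parallel command-list recording was patchy at the time, but even single-threaded D3D11 submission validates far less per call than D3D9 did, thanks to immutable state objects.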
 
Better frame rates
Better IQ
Higher quality assets
Higher resolution
Mods
Cheaper games
Online supports more players
Better physics

The list is endless....

And the OP's comment about current consoles being equal to a high-end PC... I've not laughed that hard in a while... And when he says the only difference he can see in BF3 is resolution, he has a serious eye problem.

A 9600 GT can play all console ports at better settings and frame rates than a console.
 
So, all in all, the 360 and PS3 really are near equal to top-end PCs. Why do people spend so much money just to play console games at a higher resolution? The next-gen consoles are needed to get a leap in graphics. PCs will struggle to run games next generation that consoles will do with ease. If next-gen graphics are at 1080p, then at that point what is the use of getting the PC version of a game and spending a bunch of money to play it at a little higher than 1080p while having to cut off some features?

I guess what I mean is, for all the power these GPUs have, and all the slides Nvidia and AMD show about how powerful they are, it still takes the consoles to push graphics, not the PC. I mean, The Witcher 2 and Crysis were made for PC, but they look just as good on 360 and, like I said earlier, even have better lighting on hardware that came out in 2005. Then you have games like Gears of War 3, Uncharted 3, and God of War 3 that look much better than any PC exclusive. What's going to be the point of a gaming PC next generation? It seems that each generation the consoles close the gap. This generation the gap was nearly nonexistent in terms of graphics differences between the consoles and the PC; next generation it'll cost more for PC gaming to keep up with the PS4 and Xbox 720 than just to buy the two consoles.

Someone asked me why Samaritan wasn't demoed on a 360 or PS3. I honestly think it could be done on these consoles, but the reason it wasn't was because it had features Epic didn't know how to, or didn't have enough time to, figure out how to program on the consoles. The consoles are comparable because of their custom hardware, and that's why there still isn't, after all these years, a major difference between this gen's consoles and the PC.

Not to say I don't like PC gaming; I have a GTX 480 graphics card, it came with my computer. I just don't see how the PC is supposed to be so much more powerful when the games look the same minus the resolution.

I'm sorry Solarus, but you have absolutely no idea what you're talking about. There's nothing magical about console hardware; modern PC hardware really does have vastly more available power. It's just not used very well. If it were used well, then the graphical differences would be vast.

Current top-end PCs are probably already more powerful than the next-generation consoles. But since they're running game engines designed for 7-year-old console hardware, it just doesn't show all that much to the undiscriminating eye. Throw on a game designed for a DX11-based next-generation console and today's top-end PCs will likely handle it just fine.

It certainly isn't going to cost a lot more to match console performance next generation; if anything, it's going to be much easier/cheaper for all the reasons I put in my post above. In fact, if consoles really do go the SoC route next generation, then you may see GPUs integrated into Intel CPUs within a couple of years of the console launch that match their performance! That's not to say the graphical gap will be any bigger than it was this generation, of course. There are too many factors at play to judge which way it will go at this stage. But the potential will certainly be there.
 
You are missing Joker's point.

The average Joe, personified here by Solarus, cannot see any difference. If they can't appreciate any difference, why should they sink more money into a new platform?

It means two things:
1. The new console needs to be cheap; if there is little perceived improvement in the gaming experience, there is little incentive to spend extra cash over a PS360.
2. The improvement in hardware needs to be substantial, so that it becomes evident to the masses that the new gen is better.

Cheers
 
Putting aside the fact that it doesn't, as others have already shown, whether a game can be done on a platform says nothing about the relative power of that platform. You have to take into account other factors, like the level of graphics the game is putting out, the efficiency of the engine, the level of optimisation made for each platform, and API overhead.

...

The simple fact is that today's high-end PC hardware is 10-20x more powerful than what's in the consoles, depending on what you measure. That's not open for debate; it's a known fact. That power isn't used anywhere near as efficiently as it is in the consoles, but it is there, and if someone were to try and use it fully, you'd see a game that current consoles couldn't hope to come close to.
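
For a rough sanity check on that 10-20x figure, here's a quick tally using commonly cited peak specs for the 360's Xenos GPU versus a GTX 680. These are paper numbers, not measured throughput, so treat the exact ratios loosely; the 360's eDRAM bandwidth is deliberately left out of the comparison:

```cpp
#include <cstdio>

int main() {
    // Commonly cited peak figures (paper specs, not measured throughput):
    const double xenos_gflops  = 240.0;   // Xbox 360 "Xenos" GPU (2005)
    const double gtx680_gflops = 3090.0;  // GeForce GTX 680 (2012)
    const double xenos_gbps    = 22.4;    // 360 main-memory bandwidth (eDRAM excluded)
    const double gtx680_gbps   = 192.2;   // GTX 680, 256-bit GDDR5
    const double xenos_gtexel  = 8.0;     // 16 TMUs @ 500 MHz
    const double gtx680_gtexel = 128.8;   // 128 TMUs @ 1006 MHz

    std::printf("ALU:       %.1fx\n", gtx680_gflops / xenos_gflops);  // ~12.9x
    std::printf("Bandwidth: %.1fx\n", gtx680_gbps   / xenos_gbps);    // ~8.6x
    std::printf("Texturing: %.1fx\n", gtx680_gtexel / xenos_gtexel);  // ~16.1x
    return 0;
}
```

Depending on which axis you pick, the ratio lands roughly between 9x and 16x, before even counting the CPU side or the much larger memory pool.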


How many X should be removed from the PC for the API overhead, inefficiency, and lack of optimization you mentioned? With said inefficiencies and overhead, shouldn't (the game you mentioned) Crysis 2 run better on consoles, considering Crysis 2 sought to fix those problems from Crysis 1, so that both the PC and console versions don't have as much of those crisis 1 problems? With the problems you mentioned hindering PC performance, wouldn't that mean the console has the advantage compared to similar PC hardware, since the console doesn't have those problems?

Just curious if you've factored some of that overhead into the performance ruminations. And yes, I did mean to spell it crisis earlier, since the game targeted future PC hardware and caused the perpetual upgrade cycle. That was followed by years of "can it run Crysis?", along with many complaints that the game wouldn't run at max settings on the hardware of the day, and therefore this all points to a true PC game free of consolitis, right? Heaven forbid a game runs well and looks great on a range of configurations.
 
The average Joe, personified here by Solarus, cannot see any difference. If they can't appreciate any difference, why should they sink more money into a new platform?

Give, for example, Naughty Dog 10x the power of the PS3 and (even) he will notice the difference, 100% guaranteed.
 
Compare BF3 on 360 and PC. They look the same; the only difference is resolution.
I could have taken you a bit more seriously if you had claimed that "most" games don't look any different other than resolution (which, by the way, is pretty important to IQ...), but calling out BF3 in particular is just factually false. BF3 on PC doesn't even look like the same game as BF3 on console, even if you intentionally normalize the resolutions. Do you own both, or are you just watching YouTube videos or something? I'm trying to give you the benefit of some reasonable explanation beyond just overt trolling.

As has been pointed out, even CPU-integrated graphics can match console performance these days, so you're effectively claiming that the difference between that and a GPU with 10x the power budget is not visible to you. If that's truly the case, I'm sorry for you, but it'll save you a ton of money in that you'll probably never have to upgrade your hardware ever again.

So yeah, I guess we'll have to leave it at "a lot of us can tell the difference", and not in a small way.
 
I used to have an 8800 GTS 320 too... and to my knowledge it can still play most current games (at least at 720p without AA) quite well. I have upgraded since, though. Even Crysis 2 runs rather well on low to medium settings (which is still higher than what the consoles produce) at 1680x1050. The fps do dip, but they do on consoles, too.

Since last fall it just could not (with the exception of Mass Effect 3 and Batman). And Crysis 2 runs at about 30 fps (with huge dips and freezes) at the lowest settings, and definitely not at 1680x1050. Strange thing: it runs better than MW3 on this setup.


A 9600 GT can play all console ports at better settings and frame rates than a console.

Maybe it could in 2008, but in 2011/2012 it just can't.
 
The only way for this to be true is if you are running the PC games at much higher settings than the console. The following benchmarks show that even the integrated HD 4000 can easily handle BF3 at low settings and console+ resolutions, and an 8800 GTS would easily outperform an HD 4000.

http://www.firingsquad.com/hardware/Intel_Core_i7-3770K_Ivy_Bridge_Performance_Review/page14.asp

http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/14

I said "(playable only in tank and jet levels)", and you are linking benchmarks from "Thunder Run" (a tank level). :D In anything other than this level (maybe also the train one), even with only two shooting AIs, performance is unacceptable (multiplayer is basically unplayable). You can see it in this video (from 3:00):
http://youtu.be/ylYwTWV1aHg

The guy claims it is running at 720p medium. Look at those dips (or, more accurately, freezes), and look at those shadows, which are somehow much worse than the console version's. Moreover, the scenes in this video are nothing intensive; add a few more soldiers running around, and some grenades, and you are watching a slideshow. Ironically, MW3 was even worse.



Today's games are already optimised from the ground up for consoles; that's not going to change significantly next generation - engines are still going to be built on PC workstations. What will change is the PC's ability to rely on a much more efficient API (DX11) than it's had this generation (DX9), the lack of any requirement to spend extra power on higher resolutions because consoles will also be targeting 1080p, and, most importantly, the far greater relative limitation on power draw for the consoles this time round. I expect all this adds up to even mid-range PCs faring well in next-gen console games from day 1, and the gap widening rapidly from there.

I was talking about the beginning of last gen. Remember the Quake and Oblivion ports? Those were PC games/engines quickly ported to the Xbox 360. Remember all the shock and complaining about in-order CPUs, multithreading, the necessity of rebuilding engines, etc.? And by the time they were somewhat optimized, the PC had caught up with brute force... This will not be the case this time. Another thing is the full switch to console-centric development. Look at Skyrim, whose stock version is closer between PC and 360 than Oblivion was six years ago...
This time we have console-centric development, plus new hardware which may have strong points exactly where PCs are weak (interposers with very low-latency, high-bandwidth connections; some crazy threading like IBM's 4 threads per core in POWER7/A2; eDRAM - I bet at least one of these will end up in the consoles), AND a much smaller, maybe nonexistent (from the perspective of the average PC) power advantage... There are truly no developers interested in really pushing the PC anymore. Another thing: there are surely powerful forces, namely the console manufacturers, who depend on the public (and even geeks) seeing these boxes as truly next-gen, and who may persuade devs to make console versions stand out for some time, to skip the PC version, or to base the PC conversion on the last-gen machines.
Remember the PC versions of Just Cause and GRAW, or Force Unleashed, CoD 3, and Bad Company not being released on PC at all "because the PC is too slow"? I feel that the combination of all this will make for a few years of nasty ports on PC (and probably there will be no light at the end of the tunnel for PC gamers like Crysis was last time...). Matching the level of console exclusives, let alone exceeding it, will be beyond small PC devs.

About the API: I don't put much stock in these reassurances about efficiency. DX11 PC-console parity? Yeah, right - just like the parity of DX9 between console and PC at the beginning... Gosh, I remember all those buzzwords - DX10, stream out, SM4.0, geometry shaders - and the fake screenshots from Flight Simulator, and now these "monsters" can't even play BF3 like a console... A few years later, we have the third generation of DX11 cards and not a single game has an efficient, groundbreaking implementation of any of the features. Consoles will gain new efficiencies too, and soon we will see new excuses about a new API being needed, and at the same time excuses about fragmentation of the PC base being the reason for not using it...
 
Guys, calm down. I think Solarus knows there is a huge difference in hardware; Moore's law is still relentless. He probably means that so little of it is translating to actual games.

Just look at this (or the comparison of the PC and console versions of Medal of Honor from the last page): top-of-the-line console and PC graphics at the end of last gen.

TimeSplitters: Future Perfect
http://www.youtube.com/watch?v=-9g9fHF3z_Q

FEAR
http://www.youtube.com/watch?v=-0d85UexLzw


No contest, even on low-quality video. Even the IQ gap was much larger back then... And this was the norm 4-5 years after those console launches; today, after nearly 7 years, it's not even funny... No wonder people are claiming they can't see the difference... Do you really see improvements of this scale in any PC comparison videos? If there were merely one game with this kind of gulf, no one would argue that he can't see the difference.
When I look at these DF comparison videos and articles with paeans about PC versions, and then think about the specifications of current PCs, I can only conclude something is terribly wrong.

PC gamers/graphics enthusiasts should not glorify these (optically) minor, mostly effortless improvements and accustom devs/pubs to minimal effort and unoptimized smoke and mirrors (no pun intended :D), but rather complain loudly. Maybe then someone would see a niche, make an exception, and really push the envelope.
 
:rolleyes:

Crysis 1 has hugely cut-down GFX on consoles vs the PC version!
http://www.youtube.com/watch?feature=player_detailpage&v=AcKLjgWl7tM#t=229s
Second level, overlooking the village - seems a bit different, doesn't it?!


The Witcher 2 also does not look even close to as good, and the lighting system is the same or worse; just changing the lighting does not make it technically better!
http://www.youtube.com/watch?v=cDQbJ6oQznw
http://www.eurogamer.net/articles/2012-04-13-the-witcher-2-360-pc-enhanced-non-enhanced-720p-gallery

The Witcher 2 video looks the same except for the tent area, and they probably changed that for artistic reasons. Other than that, the lighting itself looked just as good, if not better, on 360. I don't think the inclusion of nuclear bloom on PC means it's better. The Digital Foundry article even says the lighting is better and more natural on 360; that PC gamers then got mad that the better lighting wasn't in the EE version of The Witcher 2 is telling.

I don't want to turn this into some GameFAQs-style fight of PC vs console; these are just things that I and others notice. Despite The Witcher 2 being a PC game built for PC, it runs just as well on 360. With the supposed power difference, I expected a lot more. I want to see something that is just impossible for consoles to do.

Also, isn't Crysis on consoles using CryEngine 3? I don't think there can be an argument about which version has better lighting.
 
The Witcher 2 video looks the same except for the tent area, and they probably changed that for artistic reasons. Other than that, the lighting itself looked just as good, if not better, on 360.
Lol, what? Look at picture 38 (among others)! And by the way, that's depth of field you're seeing, not bloom.

With the supposed power difference, I expected a lot more. I want to see something that is just impossible for consoles to do.
Battlefield 3 at high/ultra settings, especially multiplayer. Just disregard every other game and keep it simple. If you don't see the difference in that example, then we're beyond the realm of talking sense, so let's just close the thread.
 
Since last fall it just could not (with the exception of Mass Effect 3 and Batman). And Crysis 2 runs at about 30 fps (with huge dips and freezes) at the lowest settings, and definitely not at 1680x1050. Strange thing: it runs better than MW3 on this setup.




Maybe it could in 2008, but in 2011/2012 it just can't.

My memory can be a bit hazy (it has been a while since C2)... but I do remember playing it first at low settings and 1680x1050, and later at higher settings... though I am not sure if I lowered the resolution to 720p to raise the frame rate to a bearable level...

After that I did upgrade for a mere €250 (i.e. a new CPU and a new GPU), and the system now performs better than ever. That's less than the 8800 GTS alone cost back in early 2007. (I was lucky that Asus made a BIOS update to support the Phenom IIs.)

In any case... yes, the game dips a lot, but it does so on the consoles too... A LOT. And that was on a PC built a week before the PS3 came out in Europe, too. It was quite a high-end system for its time (though not super high-end by any means), but it only cost 50% more than a PS3.

I did play Bulletstorm, too. It ran alright, though that's a game that's best experienced at a stable 60 Hz, if anything.

My point is, most games coming to PC these days (with few exceptions) are console-to-PC ports. The consoles haven't had any upgrades in terms of performance either, and with "optimization" and good code you only get so far. In theory (if the PC ports are well made - not like GTA4, which I can't max out even on my 2011 system), all games made for the PS360 should still run adequately on my old PC. I can't testify to that anymore (as I don't use it)... but my laptop (Radeon 5650) is comparably fast (a bit slower than the 8800 GTS), and it can also run most PC games maxed at 720p. Again, not all of them, and not all of them perform ideally, but most do. And it's ALWAYS a matter of how good a port is, too. Assassin's Creed (the series), for example, has massive CPU requirements for what it does, but barely touches the GPU. On the other hand, Rage runs away at a stable 60 Hz (mostly CPU-bound, too).

BF3 I haven't really tried, as I refuse to buy into Origin. The beta ran well on my new PC, though... and yes, it does look massively better on PC than it does on consoles. Not that it really matters, as the low resolution and lack of filtering on the consoles result in a blurry mess on a bigger TV.
 
And let's not get off-topic with the typical "well, consoles were cheaper/better when they came out" argument that never ends and quite frankly is pretty boring... it's more fun to talk about the claim that they are comparable to current, *high end* PCs, as was made in the original post :p

So enough of these lame consoles vs Core 2/8600 comparisons. Let's talk consoles vs i7/GTX 680. The claim is that the latter produces an equivalent experience :)
 
I understand the hardware itself is more powerful by default, but I don't see that difference in power in games at all. I don't own Battlefield 3 on PC - I have the 360 version - so I can't comment on multiplayer, but I was expecting a huge leap in singleplayer, since it'd be more focused on graphics than multiplayer is; yet it looks the same.

Lol, what? Look at picture 38 (among others)! And by the way, that's depth of field you're seeing, not bloom.

In the video posted, when Geralt walks out of the tent in the PC version, it looks like a nuclear winter due to the bloom. I'm pretty sure the DoF doesn't have anything to do with the game looking like Batman painted it yellow to poke fun at Green Lantern.
 