Next-gen console > Current PC? (at launch)

I should mention that I don't own a 360 or PS3. We certainly get some more pretties playing multi-platform games on a monster PC, but that's not exactly a surprise considering just how much more powerful the hardware we're running is. I don't think that this matters to very many people though. It's still the same gameplay.

Oh, no it's not!
Controls determine just how we interact with the game world, and that's important.
Fighting the controls breaks immersion and gameplay far more than compromises in graphics ever could.
While I guess it's a sword that cuts both ways depending on genre, I could never accept console controls for aiming/looking in games with a simulated 3-D environment.
 
A console can never be more powerful than a PC, because a console is a device designed under several constraints: the size of the box, power consumption, retail price, etc. In other words, it is a cost/size/energy-efficient device, whereas a PC setup is free of all the limiting factors of a console; you can have any type of power supply, case size or cooling solution, where your pocket is the limit. Now, whether game developers actually take advantage of those PC setups is a whole different beast, since they normally target mid-range PCs, and as you said, they rarely develop PC-only games anymore.
 
Back when I had a 17" laptop with a 2GHz Core 2 Duo and a 500MHz G71 (practically the same GPU as RSX?) with 512MB GDDR3, I used to play most of the earlier PS360 multiplatform titles just fine. COD:3, Dead Space, FEAR, Oblivion, Quake 4 and many others ran at over 40FPS with everything maxed out @ 1280*800 or higher (it had a 1920*1200 screen).

I don't know how that system would fare in later titles because I eventually sold the laptop, but I wonder whether it could still play most games released since then, and at what difference in IQ to the current generation of consoles.

Generation upon generation, I get the feeling that the "code-to-metal" advantages are overrated for practical situations. Maybe this only shows in flagship AAA exclusive titles?

Sure, I believe there are advantages to coding to the metal, but are they really meaningful?
I look at that "chunks of geometry" limitation on PC mentioned in the bit-tech article, but while I believe that may be true, does it make a difference? Or are both systems bottlenecked by something else well before getting to that number of "chunks of geometry"?

It doesn't look to me like it's meaningful at all, because I see current mid-range PCs playing multiplatform titles with much higher-resolution textures, higher-polygon models, 4x the resolution, higher MSAA levels and better pixel-shading effects, all while providing a much higher framerate (I'm thinking of Skyrim with mods here).
As far as I can see, having a PC with a graphics card that's 8x faster than Xenos just seems to let me have 8x more stuff going on in the same game, at the same framerate.

I don't know exactly what the "chunks of geometry" thingie means, but having 10x better performance at it on consoles doesn't seem to be helping all that much.
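For what it's worth, "chunks of geometry" in that article most likely refers to draw-call batches, and whether the PC's lower limit matters depends on where the frame is actually bottlenecked. Here's a toy model of that reasoning; every number in it is an illustrative assumption, not a measured figure for any real platform:

```python
# Toy frame-time model: does CPU-side draw-call overhead become the
# bottleneck before the GPU does? All numbers below are made up for
# illustration -- they are NOT measured figures for any real platform.

def frame_bottleneck(draw_calls, cpu_us_per_call, gpu_ms):
    """Return which side limits the frame, given per-draw-call CPU
    overhead (microseconds) and total GPU work per frame (milliseconds)."""
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0
    limit = "CPU (API overhead)" if cpu_ms > gpu_ms else "GPU"
    return limit, cpu_ms

# Console-style renderer: tiny per-call cost ("to the metal"), many calls.
print(frame_bottleneck(10_000, cpu_us_per_call=0.5, gpu_ms=16.0))  # ('GPU', 5.0)
# PC-style renderer: 6x the per-call cost, but batched down to fewer calls;
# the GPU still finishes last, so the draw-call limit never shows up.
print(frame_bottleneck(2_500, cpu_us_per_call=3.0, gpu_ms=16.0))   # ('GPU', 7.5)
```

In this sketch, even a much higher per-call cost is invisible as long as batching keeps the call count low enough that the GPU finishes last, which is exactly the "bottlenecked by something else first" question.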

And honestly, IMO Rage is such an isolated case that I blame its performance problems 99.99% on id software alone. Not on drivers, not on inefficient APIs, not on AMD/ATI, but on id software.
Besides, the game doesn't even look all that good, and from what I played it feels more like an awkward attempt to shove a (totally clichéd) story/setting into Quake 3 than anything else. When I played it, it felt as flawed as any other game that was in development hell for too long.

Furthermore, Carmack also said the PS Vita would be able to output graphics as good as smartphones with twice its theoretical performance, thanks to low-level optimizations alone, but I don't see that either.
I don't even see current or upcoming Vita games having much better graphics than NOVA3, Shadowgun Deadzone, Real Racing 2/3, etc., all of which my smartphone with a supposedly weaker Adreno 225 seems to play flawlessly @ 1280*720, whereas most Vita titles don't even render at the native 960*544 resolution.


That said, here's what I think:
The way I see it, coding to the metal vs. going through the DirectX/OpenGL APIs on the PC doesn't bring "generational leap" advantages, far from it.
If the rumours on next-gen consoles are true, I think anyone with a ~3GHz Core i3 or a ~4GHz 3-module Piledriver CPU, plus a HD 7850 or a GTX 660, should be able to play anything that runs on Durango at the same IQ settings.
For Orbis, I'd say the same applies with the same CPUs and a HD 7870 or a GTX 660 Ti.
I think this holds unless HSA enables some surprises in gameplay physics that cannot be reproduced on our current PCs.
 
I dunno. Great things have been accomplished with a half-bandwidth, half-fillrate GeForce 7900 that sucks at shader model 3 and has wimpy geometry capabilities. Partly because developers can use the system however they see fit.
 
Oh, no it's not!
Controls determine just how we interact with the game world, and that's important.
Fighting the controls breaks immersion and gameplay far more than compromises in graphics ever could.
While I guess it's a sword that cuts both ways depending on genre, I could never accept console controls for aiming/looking in games with a simulated 3-D environment.
First-person games do play best with a mouse, but gamepads are really not that big of a deal. Gamepads work very well with most genres.
 
Have you seen what they've managed to make a half-bandwidth, half-fillrate GeForce 7900 do in PS3?

I saw what they managed to do with a half-bandwidth, half-fillrate G71 + 3.2GHz Cell combo.. and as I said, I couldn't see how that compares to what the PC people can do with a full G71 @ 500MHz and a 2GHz Core 2 Duo.

All I know is that I could play most earlier multiplatform titles with a higher resolution than the console counterparts.
I'm pretty sure that the developers have extracted much better performance from the PS3 during its abnormally long lifetime, but I have my doubts that it couldn't be matched on that same laptop at similar IQ levels.
 
I saw what they managed to do with a half-bandwidth, half-fillrate G71 + 3.2GHz Cell combo.. and as I said, I couldn't see how that compares to what the PC people can do with a full G71 @ 500MHz and a 2GHz Core 2 Duo.

All I know is that I could play most earlier multiplatform titles with a higher resolution than the console counterparts.
I'm pretty sure that the developers have extracted much better performance from the PS3 during its abnormally long lifetime, but I have my doubts that it couldn't be matched on that same laptop at similar IQ levels.

I think the benefit of being able to heavily optimise code for a specific architecture is real, but it doesn't seem to reveal itself until a year or two into a console's life. As you say, the early years of this console generation seemed to translate just fine to the G7x architecture, and it was only later that those GPUs started to struggle to keep up. Of course, that can also be attributed to older architectures falling out of both driver and developer support after a few years, so it would be pointless to look at the performance of G71 in modern games, since that wouldn't be representative of its actual performance had it remained a properly supported architecture on the PC platform.
 
I saw what they managed to do with a half-bandwidth, half-fillrate G71 + 3.2GHz Cell combo.. and as I said, I couldn't see how that compares to what the PC people can do with a full G71 @ 500MHz and a 2GHz Core 2 Duo.


I doubt that a Core 2 could do what Cell does to compensate for G71. I also doubt that DirectX, drivers and PC architecture could allow the CPU and GPU to work that closely together.
 
As always, this thread has devolved into a PC vs. consoles war. To answer the thread starter's question: yes, next-gen consoles will be better than high- to mid-range PCs (not ultra-high-end PCs, though), but not in hardware, and they don't need to be. Because consoles are a closed system, game developers can exploit them more efficiently than the PC, which is a zoo of hardware systems and configurations, so they can do amazing things with lesser hardware. Of course, eventually they will hit a barrier, and PCs will again gain the edge in graphics and power.

Btw, IMO these new consoles will be the last ones, but that's another matter entirely.
 
As always, this thread has devolved into a PC vs. consoles war. To answer the thread starter's question: yes, next-gen consoles will be better than high- to mid-range PCs (not ultra-high-end PCs, though), but not in hardware, and they don't need to be. Because consoles are a closed system, game developers can exploit them more efficiently than the PC, which is a zoo of hardware systems and configurations, so they can do amazing things with lesser hardware. Of course, eventually they will hit a barrier, and PCs will again gain the edge in graphics and power.

You're underestimating the massive power gulf between current PC hardware and the new consoles this time round.

Yes, consoles have an efficiency advantage, but even the most optimistic estimates put it at roughly 2x, and that level of advantage won't be seen until at least a year or two into the consoles' lives. Meanwhile, high-end (not ultra-high-end) PCs of today are already sporting twice the power of Orbis, and there'll be another GPU generation released before the new consoles arrive.

So no, consoles won't offer more performance than current high-end PC GPUs, either on paper or in real terms. A year or two after their launch they may offer an experience similar to what 670- or 7950-level GPUs do in a PC, but by then the 8xx and 9xxx generations will be on the market.
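The arithmetic behind that argument is straightforward. A quick sketch, where the TFLOPS figures are ballpark assumptions taken from contemporary rumors (Orbis ~1.8 TFLOPS, a high-end single PC card ~3.5 TFLOPS), not confirmed specs:

```python
# Back-of-envelope on "efficiency vs. raw power". The TFLOPS numbers are
# rumored/ballpark assumptions, not confirmed specs for any product.

def effective_tflops(raw_tflops, efficiency):
    """Raw throughput scaled by a software-efficiency multiplier."""
    return raw_tflops * efficiency

orbis_raw = 1.8      # rumored Orbis GPU throughput (assumption)
pc_high_end = 3.5    # rough high-end single PC card of the day (assumption)

# Most optimistic 2x console multiplier vs. a plain PC baseline:
print(effective_tflops(orbis_raw, 2.0))    # 3.6
print(effective_tflops(pc_high_end, 1.0))  # 3.5
```

Even granting the console the full, most-optimistic 2x multiplier from day one, a rumored Orbis only just catches a single high-end card of today, and that multiplier reportedly takes a year or two to materialize.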
 
Maybe I'm just caught up in the good old days when we had seriously gimped versions of PC games on consoles. Half-Life 2 and Doom 3 on Xbox were obviously poor compared to the PC. Or Dreamcast's Unreal Tournament with pathetic downsized maps. Or N64/PS1's attempts at Quake 1/2. Now the games are designed for the consoles instead.
 
And that's why, come Nov 2013, a 16-core PC with as much VRAM as the consoles have RAM still won't be able to play the newest console port with better graphics. If you build your engine to a fixed spec, you'll never have enough flexibility to scale to a wider spectrum than if you had built it for a flexible spec where the consoles are just one rung of it. You'll have 1080p movies instead of in-engine cinematics, forcing you to watch blurry cutscenes on your 2560x1600 PC. Same thing all over again, just increment all the variables a tad.

pjbliverpool: You're making too much sense. Remember, if the PC version is an afterthought once the console versions have gone to cert, they can always blame it on API inefficiencies! Hot damn, can't you all see?! If we had a new API we could eradicate hunger!

There are recent games that leave my i7/R6970 idling! I have to fire up Crysis 2 on max (or GTA4 *cough*) to hear my Radeon blare. I'm still on the 12.6 beta drivers because I have no need of the extra performance, and I only updated from 11.10 because of the Rage issues. Back in the day I'd install every single driver version that came out for that extra 1% increase that was gobbled up by the margin of error.

I have games with loading screens I barely see on my SSD. Do developers even know about SSDs? If the loading-screen hints are important, shouldn't they find a way to show them to users who happen to have a piece of equipment that's orders of magnitude faster than its console counterpart?

/rant
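The fix wouldn't even be hard. A minimal sketch, assuming a hypothetical blocking `load_fn` and a made-up minimum display time (no real engine API is being referenced here, and the timings are simulated rather than real disk I/O):

```python
# Sketch: keep a loading-screen hint readable even on fast storage by
# enforcing a minimum display time. `load_fn` and the 2-second default
# are hypothetical; the demo below simulates the load with a sleep.
import time

def show_hint_while_loading(load_fn, min_seconds=2.0):
    """Run the blocking level load, then keep the hint screen up until
    at least `min_seconds` have passed since it appeared."""
    start = time.monotonic()
    result = load_fn()                 # the actual level load
    remaining = min_seconds - (time.monotonic() - start)
    if remaining > 0:                  # SSD finished early: hold the hint
        time.sleep(remaining)          # so the player gets to read it
    return result

# Simulated SSD-fast load (50 ms) with a short 0.5 s floor for the demo:
t0 = time.monotonic()
show_hint_while_loading(lambda: time.sleep(0.05), min_seconds=0.5)
print(time.monotonic() - t0 >= 0.5)  # True
```

On a slow console HDD the load itself exceeds the floor and nothing changes; on an SSD the hint simply stays up long enough to be read.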
 
I have games with loading screens I barely see on my SSD. Do developers even know about SSDs? If the loading-screen hints are important, shouldn't they find a way to show them to users who happen to have a piece of equipment that's orders of magnitude faster than its console counterpart?

/rant

Haha yeah I have that same problem even with a standard 7200rpm HDD. It's quite funny in a way.

BTW if you're having trouble using up all that power you should give 3D gaming a try. My poor 670 OC is run ragged by it at 1080p.
 
BTW if you're having trouble using up all that power you should give 3D gaming a try. My poor 670 OC is run ragged by it at 1080p.

Well, with the Radeons it's done through third-party middleware, and my experience with 3D playing Killzone 3 on a store's PS3 wasn't exactly breathtaking enough to make me want to try it. If I'm forced to upgrade next year because of the damned API inefficiencies, I might go Nvidia and play around with it.
 
As always, this thread has devolved into a PC vs. consoles war..
Thanks for the attempt, anyway.. :)

The consensus I'm getting is "PCs are better", but I also see a lot of back-and-forth and "we don't really know". I gather the answer will be to judge performance and appearance of each game as it's released, and decide from there. Yes, some of them are no-brainers.. the KB/M argument for shooters is one that makes choosing a platform for FPS a very easy decision for me. But I also have a gamepad on my PC for pretty much everything else, especially those games that were designed around it, like third-person action/adventure games.

As I said, right now, the decision is easy.. I can play Arkham City at 1080p, 60fps, in stereoscopic 3D, something the PS3 can only dream of doing with that game. But my guess is that for the first year or so of next gen, the games will be pretty close.
 
Well, with the Radeons it's done through third-party middleware, and my experience with 3D playing Killzone 3 on a store's PS3 wasn't exactly breathtaking enough to make me want to try it. If I'm forced to upgrade next year because of the damned API inefficiencies, I might go Nvidia and play around with it.

I don't think the PS3 handles 3D the same way as the PC. It probably uses some kind of depth-buffer solution, which won't be as good quality as true 3D using two independently rendered frames.

The performance implication of true 3D is basically a halving of framerate, unless you significantly reduce some other rendering factor, so that's probably out of reach for the PS3, at least in high-end games like Killzone.
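That halving falls straight out of the render budget: true 3D draws the scene once per eye. A toy model, assuming GPU cost scales roughly linearly in view count and pixel count (an oversimplification, since geometry and CPU costs don't all double):

```python
# Toy cost model for stereoscopic rendering: "true" 3D renders the scene
# once per eye, so GPU work roughly doubles unless some other cost factor
# (e.g. resolution) is cut. Linear scaling here is a simplifying assumption.

def stereo_fps(mono_fps, eyes=2, resolution_scale=1.0):
    """Approximate framerate when rendering `eyes` views, optionally at a
    scaled resolution (GPU cost assumed ~linear in views and pixels)."""
    return mono_fps / (eyes * resolution_scale)

print(stereo_fps(60))                        # 30.0 -- the straight halving
print(stereo_fps(60, resolution_scale=0.5))  # 60.0 -- halve pixels to keep fps
```

Which is why console 3D modes tend to drop either framerate or resolution, and why depth-buffer reprojection (one real render plus a cheap second view) is tempting despite the quality hit.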

True 3D, when it works well on a big-screen monitor, is breathtaking. I can honestly say that IMO the benefit is as great as an entire console-generation jump, maybe greater if comparing last-gen to first-gen games. One of the greatest benefits on a big-screen monitor is how 3D (somehow) gives the effect of a much bigger screen. On my 27" from around 2 feet, it can genuinely feel like playing on a wall-sized projection sometimes. I've no idea how that works, though.
 
As I said, right now, the decision is easy.. I can play Arkham City at 1080p, 60fps, in stereoscopic 3D,

Only if you're rocking a high-end SLI config or turn down some of the graphics settings.

I've got 680-level performance in my PC, but at max settings and 1080p, AC can't even break 30fps in parts when running in 3D.

I do wonder how much of that is down to PhysX being set to max, though. Maybe I'll try turning it down later.
 
It doesn't run as well as the first game did; it doesn't always hold 60fps, but it stays there more often than not, and never dips below around 45 or so. That's on a 680. I think I do have something turned down, but I forget what it is, motion blur or something like that, unless I'm remembering it completely wrong; I haven't played it in a while. I do remember being surprised that it wasn't a rock-solid 60fps like the first game was. I only used it as an example, but you get my point: my PC can do things way beyond what my PS3 can.

I don't think the PS3 handles 3D the same way as the PC. It probably uses some kind of depth-buffer solution, which won't be as good quality as true 3D using two independently rendered frames.
I'm not sure about Killzone, having not played it, but quite a few games actually do render "true" 3D on PS3, rendering two complete frames. There is a hit, though.. Wipeout HD goes from 1080p60 to 720p30 in 3D, frankly not really worth it.
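Running the quoted Wipeout HD numbers through a quick pixel-rate check is interesting: on these figures, the 3D mode actually pushes fewer pixels per second than the 2D mode, which hints the cut was conservative or that per-pixel fill wasn't the only cost being paid (fill cost assumed roughly linear in pixels, ignoring geometry and CPU work):

```python
# Pixel-rate check on the Wipeout HD mode change quoted above
# (1080p60 in 2D vs. 720p30 per eye in 3D). Fill cost is assumed
# ~linear in pixels pushed per second; geometry/vertex costs are ignored.

def pixels_per_second(width, height, fps, views=1):
    return width * height * fps * views

mode_2d = pixels_per_second(1920, 1080, 60)          # 124,416,000 px/s
mode_3d = pixels_per_second(1280, 720, 30, views=2)  #  55,296,000 px/s

print(round(mode_3d / mode_2d, 2))  # 0.44 -- 3D pushes under half the pixels
```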

Assassin's Creed III does use reprojection, and it shows.. the implementation is extremely poor, and it's a real shame considering how much a game like that can benefit from real 3D rendering.
 
My humble PC can run a GPU limited game like Trine 2 at 1080p 60fps in 3D, which is quite a bit beyond what my PS3 can do.

On the other hand, the PC does not have great looking games like Uncharted 2/3, God of War 3/Ascension or Gran Turismo 5.

Killzone, btw, is proper 3D, but has to halve its resolution to maintain even a half-decent framerate. GT5 is slightly better, but generally, for any multiplatform game bar a few, my PC (550 Ti) can manage 1080p at 60fps, and often in 3D (though many games then won't manage 60fps unless I dial down some details).
 