The Next-gen Situation discussion *spawn

It really depends on who is watching, and what he is watching on.

Well, I look at it this way: I finished Crysis 2 last night and I'm playing the new Batman on ~500 watts of PC hardware, and they both looked damn nice, to my eyes far beyond console gaming. The console folk, in about a year and a half, will have ~200 watts of hardware to play with on their new consoles. If today's ~500 watts of PC hardware meets with a resounding meh and/or little difference from the PS3 games of today, then I can't imagine how ~200 watts of console hardware will impress them 1.5 years from now. Even if you assume consoles are twice as efficient, so that ~500 watts of PC grunt is more like ~250 watts of console-equivalent wattage, it still implies 2013 consoles won't be vastly different from what a PC can do right now. Which in turn leads me to wonder whether the core gamers of today who don't see much difference between PC and PS3 graphics will be disappointed come 2013.
 
I've got a nice little heater for the room here as well, all classic PC, RAID 0 SSDs, etc. I still have just as much fun playing PS3 games, and I still think that some of them can hold their own vs. PC games. Not in the details or the funky high-end shader stuff, but the complete package of some of the AAA titles really does stand its ground.

And I see no problem with 350-watt consoles or more; they just need two things: power-save modes for non-gaming and less taxing games, and silent, effective cooling.

As a side note, my i7 rarely goes beyond 25% when I game, so IMHO the PC has a lot left on the CPU side, but that will most likely just stay that way, since everyone is relying on GPUs. Which is why I am not too worried about weak CPUs in the next round, as long as they reinvest everything saved in the GPU.
 
Well, it's the "visually looks the same" type comments that blow me away; you can read that all over the web. This makes me wonder if maybe the engine makers are wasting their time. All the stuff they are making that runs on PC today is what is intended for the consoles of tomorrow. High resolution, better DOF, better motion blur, better anti-aliasing, better texture filtering, better shadows, 60 fps, etc.: all the stuff the PC does far better today than consoles do is what's intended for the consoles of 2013. Yet if I understand this right, to the console people it's all meh, can't see the difference, PC isn't much better. I have to admit that's totally shocking to me!

I don't get it, but let's say it's true and console folk don't notice any of those improvements; so what do console people notice then? Is it all about art style and texture sizes? That might make sense, since PC games by and large use the same texture set as their console counterparts. If that's the case then the GPU selection on the next consoles should be all about lots of texture samplers and lots of RAM. All the rest that the PC improves today, stuff like better shadows, 60 fps, less aliasing, etc., seems to be falling on deaf eyes, as it were. It's unbelievable to me that a modern PC game is considered in the same graphics league as a 2005 console, but then again it does lend credence to my own age-old argument of diminishing returns. Maybe people really just can't see it and it's all just about texture resolution.

If that's the case, then all the game makers have to do is take the PC games of today, skip the super fancy stuff like 60 fps (remove all that, since console guys don't care about it anyway), bump up the texture resolution 8x, and release that on 2013 consoles. That would make an interesting test to try today: take a game on PC like Crysis 2 as it stands right now, then heavily downgrade it to console level by removing most everything, but bump up the texture resolution 8x. Then show both versions to the average Joe and see which one they think looks better. I suspect the results could be surprising.
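As a back-of-envelope check on what an "8x" texture bump would cost in memory (a quick sketch; the compression format and both readings of "8x" are assumptions, not anything from the post):

```python
# Rough memory cost of bumping texture resolution, assuming BC3/DXT5-style
# block compression at 1 byte per texel; a full mip chain adds ~33% on top.
# Whether "8x" means 8x the texels or 8x per axis changes the answer a lot.

def texture_mb(width, height, bytes_per_texel=1.0, mip_overhead=4/3):
    """Approximate size in MB of one texture with a full mip chain."""
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

base = texture_mb(1024, 1024)             # typical console texture, ~1.3 MB
eight_x_texels = 8 * base                 # 8x the texel count, ~10.7 MB
eight_x_per_axis = texture_mb(8192, 8192) # 8x per axis = 64x texels, ~85 MB

print(f"1024^2: {base:.1f} MB")
print(f"8x the texels: {eight_x_texels:.1f} MB")
print(f"8192^2 (8x per axis): {eight_x_per_axis:.1f} MB")
```

Either way, an across-the-board 8x texture bump multiplies texture memory enough that it would dominate the RAM budget of a 2013 console, which fits the earlier point about "lots of texture samplers and lots of RAM".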
 
"If today's ~500 watts of PC hardware meets with a resounding meh and/or little difference from the PS3 games of today, then I can't imagine how ~200 watts of console hardware will impress them 1.5 years from now."

The thing is, the games running on your 500 watts of PC hardware weren't coded to take full advantage of your 500 watts of PC hardware the way games will be coded to take advantage of the 200 watts of new console hardware.
 
Agreed, but that's why I gave it a guesstimated 2:1 efficiency advantage, so say a 500 W PC is roughly equivalent to 250 W on console. That still puts the next ~200 W console hardware more or less within spitting distance of what a strong 500 W PC can do today.
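Written out, the guesstimate above is just this arithmetic (all three inputs are the thread's guesses, not measurements):

```python
# Back-of-envelope version of the wattage argument. Every number here is
# a guess from the discussion, not a measurement.

pc_watts = 500           # strong gaming PC today
console_watts = 200      # assumed next-gen console power budget
efficiency_factor = 2.0  # assumed console advantage (closed box, thin API)

# Express the PC in "console-equivalent" watts.
pc_console_equivalent = pc_watts / efficiency_factor  # 250 W

ratio = console_watts / pc_console_equivalent
print(f"~{console_watts} W console ~= {ratio:.0%} of a ~{pc_watts} W PC today")
# -> ~200 W console ~= 80% of a ~500 W PC today
```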
 
"It's unbelievable to me that a modern PC game is considered in the same graphics league as a 2005 console ... Maybe people really just can't see it and it's all just about texture resolution."

People care about eye-candy, sure, but they also care about other things, like entertainment on tap, where they want it, when they want it, shared with the people they want. And entertainment these days doesn't mean just games; it means all sorts of things, with applications that enable you and don't restrict you. Casual or hardcore, if something draws your attention and your wallet away from the living room, what use is the console?

People also want convergence; they don't want a specific device stuck in a lounge room that only does a couple of things well. If people are "meh" about new consoles' capabilities, it's because people understand that hardware grunt is not directly proportional to entertainment, or profits.

I couldn't care less about the new consoles, because immediately I think: they are not portable, there's ecosystem software and hardware fragmentation, their games are expensive, their hardware is fixed for the life of the console (there are no yearly refreshes), they tend not to do anything other than gaming well (or at least they are very much pigeonholed into this idea), they are social only to a point and even then only within an ecosystem's constituency, and so on.

People have been introduced to a new paradigm, and you see simple games like Angry Birds making squillions and simple apps like Instagram come from nowhere and be worth a billion. Why? It isn't about hardware; it's about mindshare, it's about a new social space, and these new consoles don't know how to get it or fit in, because they restrict you and don't enable you.

The industry is evolving. One of the big three will definitely fail this generation, and the other two will limp along because they haven't evolved enough, or, as some have put it, their time has come, and it was never about MFLOPS or motion blur.
 
Agreed, but that's why I gave it a guesstimate 2:1 efficiency advatange, so say a 500w pc is roughly equivalent to 250w on console. That still puts next ~200w console hardware more or less in spitting distance to what a strong 500w pc can do today.

It's not an efficiency argument, it's just plain underutilization. The core market is still console-centric. The PC is like that plain but unassuming girl who everyone suddenly discovers is hot once prom rolls around. Next gen = PC prom.

You do know there is a GPU in that APU, right?
I think he would prefer it if there were SPEs on there too.
 
"Yet if I understand this right, to the console people it's all meh, can't see the difference, PC isn't much better ... Maybe people really just can't see it and it's all just about texture resolution."

The graphics techniques of most PC games are still heavily rooted in console-era hardware. I'm not surprised that a lot of people don't see a huge difference between PC and console games when in most cases you only see improvements for the low-hanging fruit like higher resolution, or more taps for SSAO or some post-processing effect. In general you're not seeing radical departures in material models, or non-traditional rendering techniques for difficult surfaces like hair and foliage, or volumetric simulation + rendering for smoke effects, or any of the other really cool things that you could do if you truly exploited the throughput and flexibility of a high-end DX11 GPU. I really think that we're going to have to improve a lot more than just image quality if we want to give people a true next-gen graphical experience.
 
"It's not an efficiency argument, it's just plain underutilization. ... Next gen = PC prom."

Efficiency is part of it; how many draw calls can a typical PC game issue versus one on a console?

I'd prefer to have 20,000 draw calls per frame at 30 Hz as opposed to 5,000 per frame at 60 Hz.

It doesn't matter what consoles will ship with, as pretty much anything will provide a tangible upgrade in IQ over what we have today.
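For scale, here are the submission rates those two framings imply (the per-frame counts are the post's illustrative numbers, not benchmarks):

```python
# Draw-call submission rates implied by the two framings above.
# The per-frame counts are the post's illustrative numbers, not benchmarks.

console_style = 20_000 * 30  # 20,000 calls/frame at 30 Hz -> 600,000 calls/s
pc_style = 5_000 * 60        #  5,000 calls/frame at 60 Hz -> 300,000 calls/s

print(f"20,000 @ 30 Hz: {console_style:,} calls/s")
print(f" 5,000 @ 60 Hz: {pc_style:,} calls/s")
print(f"ratio: {console_style / pc_style:.0f}:1")
```

That 2:1 gap in calls per second happens to line up with the 2x driver/API advantage discussed a few posts down.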
 
"The graphics techniques of most PC games are still heavily rooted in console-era hardware ... I really think that we're going to have to improve a lot more than just image quality if we want to give people a true next-gen graphical experience."

FWIW, I think it's more than this. I think you're asking people to value subtleties in lighting and AA the same way they valued round things being round.

There is certainly a lot more that can be done to reach photorealism, but I think we're at a dangerous inflection point: until you hit that tipping point of things looking real (and I think we're a long way off), it's just "slightly better" looking to people who don't really know what they are looking at.

I look at Epic's Samaritan demo, something designed to impress on the latest and greatest PC hardware, and I don't see the huge leap. I see a lot of things that are better, but nothing that screams next gen.

I do think next-gen consoles will have much better graphics; I'm just not sure that's enough to read as a huge graphical leap. So I think they are going to have to differentiate in other ways.

Of course, I'm likely an outlier: I have a GTX 580 SLI setup on my PC and most of the time I play without AA; FWIW, edge aliasing has just never bothered me.
 
"In general you're not seeing radical departures in material models, or non-traditional rendering techniques for difficult surfaces like hair and foliage, or volumetric simulation + rendering for smoke effects ..."

Some of the stuff you mention, the newer techniques, offer less bang-for-the-buck improvement than some of the improvements you can see now on PC. Take volumetric fog, for example. If your game is Silent Hill or Alan Wake, then OK, it will help somewhat to make it look better. But in most cases it may make a typical game look better 5% of the time, basically whenever fog is present in a scene. Someone who plays a demo may never even get to see it. Hair surfaces as well: for some games it will help, like behind-the-camera RPGs, but even then hair represents just a small percentage of what you see on screen, so you are looking at lower bang for the buck. And in many games you don't even see player hair, so there is no value there.

So I thought about it a bit more, trying to sort out why PC gaming isn't viewed as an upgrade by some, and wondered: what types of improvements hit you in the face in any game, through the entire game, viewed on any type of TV of any size, hence giving more ooooo-and-ahhhh bang for the buck? For example, I didn't count aliasing, because in some games it's not that bad; higher resolution helps mitigate it as well, some games' color palettes make it less of an issue, small TVs make aliasing harder to spot, etc. Likewise, I decided not to include resolution, since it seems like people can't see the difference much, generally speaking, and if they have a small TV they will notice a resolution bump even less. So what improvements hit across entire games regardless of game type, TV size, game color palette, etc.? I came up with these:

1) 60 fps
2) Texture resolution
3) Shadows
4) Art style

The above four, if improved, should be noticeable to most regardless of what game they are playing, how small their TV is, and so on; hence those improvements to a game should be the most noticeable across the masses. At least that's all I could come up with when speaking purely visually. Yeah, there are a million other things you can improve, but my argument is: if people don't even notice the above four, which should stand out in any game on any TV, then would they even really notice stuff like improvements in hair rendering? I would see it, you would see it, but would the masses really see it as "wow, look at that improvement"?

Making that list above, I think, explains to me why the PC isn't seen as much of an upgrade at all by many out there. #2 texture resolution and #4 art style by and large aren't improved on PC at all; most of the time those are simply equivalent to console. #1, 60 fps, should be a big one, but I think when most people say they have a high-end PC they very possibly don't have one, and while they may be running with all details maxed they may only be getting 30 fps. Likewise, people tend to view YouTube videos, etc., to compare to PC, and those are normally encoded at 30 fps, so poof, the 60 fps PC advantage in many cases gets totally negated, or their hardware can't do it anyway. Which leaves #3, shadows. Every game has shadows, so every game shows improvement here on PC in the form of longer draw distances for shadows, better-looking shadows, more shadow-casting sources, or whatever. That's about all I could come up with.

In other words, I think I now realize why PC games are seen as meh in terms of visual improvement by many: simply because many just don't notice the smaller details, even in games like BF3, and the bigger details aren't improved enough in your typical PC game. At least that's my theory :)
 
"Efficiency is part of it; how many draw calls can a typical PC game issue versus one on a console? ... It doesn't matter what consoles will ship with, as pretty much anything will provide a tangible upgrade in IQ over what we have today."

Regardless of draw call limitations brought about by driver/API overheads, there's still a hell of a lot more you can do with modern PC hardware.

I'd rather have 5,000 draw calls per frame at 60 fps than 10,000 at 30 fps if those 60 fps also come with 10x the shader operations, 10x the texture operations, 10x the geometry through tessellation, and 4x the image quality.

Driver/API efficiency certainly is a factor; 2x seems to be a generous advantage in favour of the consoles, and it's probably less under DX11. But as Sedentary and MJP said, it's mostly just poor utilisation of the available power. Today's PCs are capable of vastly, vastly better graphics than the current consoles despite lower driver/API efficiency. But thanks to console drag, the code itself just ends up using the available power inefficiently, ramping up elements of the graphics that are easy (cheap) to implement from a development point of view but have limited visual impact.

If you want to judge true PC efficiency against consoles then you need to look at how games perform on PCs at console settings, where the code efficiency is at its highest.
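A quick sketch of the tradeoff in that 5,000 @ 60 fps vs. 10,000 @ 30 fps comparison (the 10x work figure is the post's illustration, not benchmark data):

```python
# The tradeoff in the comparison above: fewer draw calls per frame, but
# each call assumed to carry ~10x the GPU work. Illustrative figures only.

pc      = {"calls": 5_000,  "fps": 60, "work_per_call": 10.0}
console = {"calls": 10_000, "fps": 30, "work_per_call": 1.0}

for name, s in (("PC", pc), ("console", console)):
    calls_per_sec = s["calls"] * s["fps"]
    work_per_sec = calls_per_sec * s["work_per_call"]
    print(f"{name}: {calls_per_sec:,} calls/s, {work_per_sec:,.0f} work units/s")
# -> both sides submit 300,000 calls/s, but the PC side does 10x the work
```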
 
I doubt we have seen what's really possible on PCs with modern and powerful hardware since the demise of high-end exclusive PC offerings. I doubt you'll find many PC-exclusive projects with $20-50 million development budgets.

I think the performance gap between consoles and PCs has been artificially narrowed due to the differing amounts of resources that typically exist for the development of PC games versus console games.
 
We might not make it there this generation, but I think good-quality hair and clothing will go a long way toward making graphics pop. Every time I talk to a character in a game like Skyrim I find myself wishing for better hair and clothing.
 
Do you really feel that would be enough for the masses to stand up and take notice, saying "Wow, this really looks better"? I ask because of this example: the game Metro 2033. It's an older 2010 game, but it looks pretty good on PC. In an old interview the developers said the PC version improved on the console version in these respects:

•Most of the textures are 2048^2 (consoles use 1024^2).
•The shadow-map resolution is up to 9.43 Mpix.
•The shadow filtering is much, much better.
•The parallax mapping is enabled on all surfaces, some with occlusion-mapping (optional).
•We've utilised a lot of "true" volumetric stuff, which is very important in dusty environments.
•From DX10 upwards we use correct "local motion blur", sometimes called "object blur".
•The light-material response is nearly "physically-correct" on the PC on higher quality presets.
•The ambient occlusion is greatly improved (especially on higher-quality presets).
•Sub-surface scattering makes a lot of difference on human faces, hands, etc.
•The geometric detail is somewhat better, because of different LOD selection, not even counting DX11 tessellation.
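Two of those bullet points translate into concrete numbers; a quick sanity check (the square shadow-map shape is an inference, since the interview only gives a pixel total):

```python
import math

# Sanity-checking two figures from the Metro 2033 list above. The square
# shadow-map shape is an inference; the interview only gives a pixel total.

texel_ratio = (2048 ** 2) / (1024 ** 2)  # PC vs. console texture detail: 4x
shadow_side = math.sqrt(9.43e6)          # ~3071, i.e. roughly a 3072^2 map

print(f"2048^2 vs 1024^2 = {texel_ratio:.0f}x the texels per texture")
print(f"9.43 Mpix ~= {shadow_side:.0f} x {shadow_side:.0f} shadow map")
```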

That seems like a reasonable list of improvements, and it's only a partial one, yet all of it falls under the "meh, PC doesn't look much better than console" comments you can read on various forums. Newer games do all that and more on PC, yet still don't seem to resonate with the masses, who believe they aren't looking much better than console. Some, like Batman and Mafia, use PhysX to animate cloth clothing or throw around tons of debris, and are still considered "not much better than console". That's why I was wondering what exactly the masses are looking for when it comes to visual improvement.
 
My guess is most people who aren't impressed by the quality of a high-end PC game haven't actually played the game at both settings. Plus, the PC's main problem is complexity, which is the reason consoles exist.

When console players who have been on the current gen for most of its life see a game that's only available on next-gen systems and looks better than games on the current systems, they'll upgrade. Then they'll go back to replay an old game and realize the quality sucks, even though it used to be adequate. Then word of mouth will bring other gamers forward.

The exclusive games are the key to getting the upgrade cycle started, and graphics is the marketing gravy.

Edit: also, consumers understand people, so making them look better is more immediately obvious than many other graphical effects.
 
"Some of the stuff you mention, the newer techniques, offer less bang-for-the-buck improvement ... And in many games you don't even see player hair, so there is no value there."

I was just giving examples; my point was simply that there is huge room for improvement beyond what PC games are doing, and thus I don't think it makes sense to use them as a proxy for next-gen console games. Regardless, I don't think it's fair to say good hair is less valuable than something else just because it's not always on-screen. I think you have to look at how comparatively bad the state of hair is compared to more ordinary surfaces, and how much it contributes to that "gamey" look that a lot of us try to avoid. Personally, I'd say plastic-looking hair made out of alpha-tested cards is a pretty big offender, especially in cutscenes. Same goes for using poorly-lit billboarded sprites in lieu of fog and smoke effects (which I'd actually argue are pretty prevalent, especially in shooters).
 
Taking out the fuglies and improving IQ goes a long way, IMO. Then again, consumers can be nostalgic (go back and play N64 games! No, you cannot wipe the Vaseline off the screen). Next gen won't stand on graphics alone, but it can wow. And just looking at what is coming out on the PC, as well as the budgets, I have no reason to think developers are pushing boundaries. I am impressed a game like BF3 plays so well on the 360, but on the reverse side, when a couple of switches here and there make the PC look much prettier, you have to ask: what did DICE give up by targeting the consoles?

Anyway, looking back on the Xbox 1, id fit Doom 3 on it. That doesn't mean the Xbox 360 came too early. Heck, most year-one games failed to impress much, being marginally better than the last-gen stuff. The first year or so is a lot of ports with shiny graphics. So that may mean the Xbox 360 runs around with BF3 at low settings, struggling at around 720p with poor AA and very low IQ at 30 Hz, and, assuming a quality GPU, the Xbox 720 could really crank it up at 720p (!!) or could, with small tweaks, run 1080p at 60 Hz with better AA. That is what IW did with CoD2 to a degree, and they got a great foothold in the market, even though it was a marginal step better. But CoD4, when it released two years later, was a HUGE step over the Xbox 1.

It only takes a handful of launch titles to show off the potential. Yes, it will take 2-3 years for fresh software aimed at DX11 as a baseline and using the console as the *minimum spec*, but using the current PC titles as a reason why there shouldn't be good hardware is like looking at the Xbox 1 run Doom 3 and Conker: Live & Reloaded and concluding the 360 didn't need 8x the memory or a vastly upgraded GPU.

Why? Just look at a game like BF3: there is no way you are getting worlds that size, with those details, with that many players, destruction, and eye candy on an Xbox 1. Seeing as the Xbox 360 barely runs it, *I bet* a half-Xenos with 256 MB would NOT have run BF3 with an experience anywhere near the 360 with a full Xenos and 512 MB of memory. That isn't true of all games; I bet a game like Trials HD could be done on the Xbox 1, but that doesn't mean every game has to push the envelope. But for a premium device, like a console, I do expect some AAA content every year. I see little point in paying $300+ for new hardware that is marginally better, because the experience will be marginally better. No, I think consumers want to see a Gears of War 1 to inspire them about the purchase, and then enjoy a variety of fun games in between the triple-decker parfaits loaded with nuts, berries, whipped cream, and a cherry on top. The first 10M users who pay a premium for these things and spread word of mouth are not going to be happy with "upgrades are the limit" bald heads and boxy hair, with a focus on "hey, more gimmicky Move/Kinect crap."
 
TBH, most games played on PC nowadays are merely console games up-rezzed.

It's unsurprising that the vast majority of users aren't perceiving a massive difference in fidelity between those same games played on consoles and on PC.

If all games next gen are simply made with the texture fidelity of Crysis 1 maxed out, plus better lighting and shadowing, then I'm quite sure that the prevailing majority will change their tune and be very impressed.

As far as I'm concerned, if most games next gen can have the texture, lighting, shadowing, and effects quality of Crysis 1 fully modded and maxed out, together with the destruction engine of Red Faction: Guerrilla, an animation system of similar quality and believability to the NaturalMotion Euphoria stuff, and a few more extra bells and whistles like better hair and clothing, then I'll be 100% content with the upgrade next gen.

Most if not all of the above should be achievable on hardware available today, based on what's been done on 2006-level tech. So I would think it pretty hard for Sony and MS NOT to impress gamers with their boxes next gen.
 