Are PCs holding back the console experience? (Witcher3 spawn)

The same thing will happen for feature levels 12+. Those features will be added as an afterthought because they don't apply for consoles. Once Xbox Two and PlayStation 5 come out I suspect those features will no longer be considered an afterthought! This is not to put consoles in a negative light (everything in life has trade offs), but I think we can all agree regardless of what "team" you're on it's silly to suggest a fixed platform doesn't hold back an ever-changing one.
Well thankfully history is easy to revisit here at Beyond3D. Xbox vs. DirectX 8/9? We got it. Xbox 360 vs DirectX 10. Got it covered. Though it is a shame that we don't go back to Nintendo 64 vs. 3Dfx Glide! You really need Dimension3D for that (hi Azbat!)

I suppose the big picture is none of the latest API hype matters at all. Unless one really enjoys the little tacked on extras. Once the API is actually being fully utilized from the ground up, it's old news and nobody wants to hear about it anymore because something new is being hyped up to sell hardware that is too slow for it anyway. ;)
 
That's been true for other APIs, but thanks to the efficiency and control advancements in DX12 this could be the first API that offers major benefits from near day 1.
 
That's been true for other APIs, but thanks to the efficiency and control advancements in DX12 this could be the first API that offers major benefits from near day 1.

Well probably not (at least not to the broader point I was making)! :D I agree that perhaps DirectX 12 the API might offer major benefits sooner rather than later, but I don't believe that will be true for feature levels 12_1 and higher.
 
You gotta let go of that flag capture.
My harshness towards consoles truly dates back to that fateful day when Captain Al betrayed everything I thought we stood for. To murder me in cold blood and steal my hard earned booty FTW. I saw what the consoles had turned him into.

Or maybe he didn't see me before he jumped out of the Banshee. Or perhaps his 1337 skillz were not as 1337 as he thought.
 
I think discussing editing .ini files/market size/etc. largely misses the point (and is more of a generic pc vs console discussion). But of course consoles hold us back. Just look at the last couple of years in pc gaming. You'll find the vast majority of these games are built with a "D3D 9 philosophy" (aka 360/ps3 philosophy) in mind with various D3D 10/11 effects added as an afterthought (and a lot of them are just brute force techniques). You cannot build an engine (easily at least) that makes the most of D3D 11 while still having to support fixed platforms like 360/ps3. Only now has there been a sudden interest in "D3D 11 philosophy" of rendering (I'm aware this is an oversimplification). :)

Yet at the same time, some PS3/360 games were 'beyond DirectX12' already (particularly some PS3 games I would argue, like God of War 3, Uncharted 2, Killzone 3, Super Stardust HD, Resistance, and eventually Frostbite et al, etc.). DirectX is an abstraction layer, after all, and the PS3 paired an 8-core heterogeneous CPU with a somewhat basic-for-the-time GPU, but the two could still work together in a way that DirectX12 is only now starting to achieve.

The same thing will happen for feature levels 12+. Those features will be added as an afterthought because they don't apply for consoles. Once Xbox Two and PlayStation 5 come out I suspect those features will no longer be considered an afterthought! This is not to put consoles in a negative light (everything in life has trade offs), but I think we can all agree regardless of what "team" you're on it's silly to suggest a fixed platform doesn't hold back an ever-changing one.

I think you're wrong. Much of the 'DirectX9' misery happened because more and more games were developed on PC with out-of-date engine technology developed on and for PC architectures, and it was getting easier to port those to consoles, where dedicated titles could already go beyond DirectX11 before it was even on the table. The only real limitation imposed on PC games by 360/PS3 was art detail, but artists and tools were bottlenecks there too - the cost of creating high-quality art I bet was typically 95% of the game's budget (marketing excluded).

But as said, this discussion has been had a few times already. ;)

As a more topical question - which platform do you think is currently doing more Async Compute ... PC or PS4?
 
From a developer/publisher standpoint, if you have a great game like Witcher 3 that will push PC sales on its own regardless of the fact that it doesn't have significantly superior visuals over the console versions, why add millions to development costs when only a fraction of PC users have cards that can take advantage of those superior settings? And given how easy it is to pirate single-player games with no consequence, why make something so superior that it might deter people from buying a console, which is a much less convenient platform to pirate games on? Console sales are probably predicted to be their bread and butter.

If every developer that made an AAA single-player type game offered significantly superior visuals on PC, I'm sure a lot more people would look at PC to pirate games and save money. Buy a $400 console, pay $50 a year for online, $60 for every game, plus expensive accessories; or buy a $700-800 PC, get way, way superior visuals in every "free" single-player game with no online gaming fee, and you can do way, way more on a PC.

It's a shame; I'd like to see more than higher resolution, AA, hair effects, and slightly better textures as well.
But railing against developers isn't going to solve the problem. Instead, maybe creating a Kickstarter-style system where developers can see that people are pooling their money if they achieve some goal in their game might motivate them.
 
Yet at the same time, some PS3/360 games were 'beyond DirectX12' already (particularly some PS3 games I would argue, like God of War 3, Uncharted 2, Killzone 3, Super Stardust HD, Resistance, and eventually Frostbite et al, etc.). DirectX is an abstraction layer, after all, and the PS3 paired an 8-core heterogeneous CPU with a somewhat basic-for-the-time GPU, but the two could still work together in a way that DirectX12 is only now starting to achieve.

Perhaps you should be asking yourself why developers needed to use Cell for those effects instead of the GPU. :p

I think you're wrong. Much of the 'DirectX9' misery happened because more and more games were developed on PC with out-of-date engine technology developed on and for PC architectures, and it was getting easier to port those to consoles, where dedicated titles could already go beyond DirectX11 before it was even on the table. The only real limitation imposed on PC games by 360/PS3 was art detail, but artists and tools were bottlenecks there too - the cost of creating high-quality art I bet was typically 95% of the game's budget (marketing excluded).

I'm not sure I even understand this, but in general games are developed for consoles first ("D3D 9 philosophy") and then ported to PC afterwards. It's impossible to develop a highly optimized D3D 11 game when your starting point is ~D3D 9.

As a more topical question - which platform do you think is currently doing more Async Compute ... PC or PS4?

I don't think the benefits of async compute are fully understood. While it does help out GCN, its benefits to other architectures are less meaningful (other architectures are narrower than GCN). Regardless, async compute will not be some magic bullet for consoles that'll leave the PC ecosystem in the dust. ;)
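For anyone who hasn't played with it: "async compute" just means handing the GPU graphics and compute work on separate queues so the scheduler can overlap them instead of running everything back to back. A bare-bones D3D12 sketch of the idea (my own illustration, not from any shipping engine; it assumes the device and pre-recorded command lists already exist):

Code:
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void SubmitFrame(ID3D12Device* device,
                 ID3D12CommandList* gfxWork,      // pre-recorded graphics command list
                 ID3D12CommandList* computeWork)  // pre-recorded compute command list
{
    // One direct (graphics) queue and one compute queue; a real engine would
    // create these once at startup rather than per frame.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;

    ComPtr<ID3D12CommandQueue> gfxQueue, computeQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

    // Kick off both; the hardware is free to run them concurrently.
    computeQueue->ExecuteCommandLists(1, &computeWork);
    gfxQueue->ExecuteCommandLists(1, &gfxWork);

    // Only work submitted to the graphics queue *after* this Wait() has to wait
    // for the compute queue, i.e. you pay for the dependency only where graphics
    // actually consumes the compute results.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    computeQueue->Signal(fence.Get(), 1);
    gfxQueue->Wait(fence.Get(), 1);
}

Whether that overlap actually buys you anything is exactly the architecture question above: on a wide GPU with idle CUs there is spare capacity to fill, on a narrower one much less.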
 
From a developer/publisher standpoint, if you have a great game like Witcher 3 that will push PC sales on its own regardless of the fact that it doesn't have significantly superior visuals over the console versions, why add millions to development costs when only a fraction of PC users have cards that can take advantage of those superior settings? And given how easy it is to pirate single-player games with no consequence, why make something so superior that it might deter people from buying a console, which is a much less convenient platform to pirate games on? Console sales are probably predicted to be their bread and butter.

If every developer that made an AAA single-player type game offered significantly superior visuals on PC, I'm sure a lot more people would look at PC to pirate games and save money. Buy a $400 console, pay $50 a year for online, $60 for every game, plus expensive accessories; or buy a $700-800 PC, get way, way superior visuals in every "free" single-player game with no online gaming fee, and you can do way, way more on a PC.

It's a shame; I'd like to see more than higher resolution, AA, hair effects, and slightly better textures as well.
But railing against developers isn't going to solve the problem. Instead, maybe creating a Kickstarter-style system where developers can see that people are pooling their money if they achieve some goal in their game might motivate them.
I'd wager the Witcher series is highly dependent on PC sales. It has always been a PC-first series (until now, apparently).

Also single player PC games continue to sell incredibly well despite the apparent piracy apocalypse we are experiencing. Or not.
 
My harshness towards consoles truly dates back to that fateful day when Captain Al betrayed everything I thought we stood for. To murder me in cold blood and steal my hard earned booty FTW. I saw what the consoles had turned him into.

Or maybe he didn't see me before he jumped out of the Banshee. Or perhaps his 1337 skillz were not as 1337 as he thought.

:D
 
Perhaps you should be asking yourself why developers needed to use Cell for those effects instead of the GPU. :p

Because they could, and didn't have to wait until the GPU included hardware support! Point is, they could effectively combine a multi-core CPU with the GPU in one rendering engine, with access to the same memory at all times, etc., whereas DirectX9 was a very fixed rendering pipeline with little possibility to look back, very limited draw calls, etc.

I'm not sure I even understand this, but in general games are developed for consoles first ("D3D 9 philosophy") and then ported to PC afterwards. It's impossible to develop a highly optimized D3D 11 game when your starting point is ~D3D 9.

But developers found that jobified, multi-threaded engines specifically tailored to make the most of Cell and GPU combined were a much better starting point for D3D 11 development.
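To be concrete about what "jobified" means here: instead of one big update/render thread, the frame is chopped into small independent jobs that any worker - an SPE on PS3, a core on 360, a thread on PC - can pull off a queue. A toy C++ illustration (purely my own sketch, nothing console-specific about it):

Code:
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Minimal job system: worker threads pull std::function jobs off a shared queue.
class JobSystem {
public:
    explicit JobSystem(unsigned workers = std::thread::hardware_concurrency()) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { Run(); });
    }
    ~JobSystem() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            done_ = true;
        }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    void Submit(std::function<void()> job) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            jobs_.push(std::move(job));
        }
        cv_.notify_one();
    }
private:
    void Run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();  // e.g. animation blending, culling, building a command buffer...
        }
    }
    std::vector<std::thread> threads_;
    std::queue<std::function<void()>> jobs_;
    std::mutex mutex_;
    std::condition_variable cv_;
    bool done_ = false;
};

Once a frame is expressed as jobs like that, it matters far less whether the workers are SPEs or PC threads, which is part of why those engines carried forward so well.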

I don't think the benefits of async compute are fully understood. While it does help out gcn, its benefits to other architectures are less meaningful (other architectures are narrower than gcn). Regardless, async compute will not be some magical bullet for consoles that'll leave the pc ecosystem in the dust. ;)

Of course not. But eventually it's likely to become standard everywhere. Point being, don't underestimate the consoles' open architecture and optional freedom from (DirectX) abstraction layers, and what they've done for PC development. You should go back to the Mantle discussion from before DirectX12 was even in the picture; one of the raisons d'être mentioned was that it was ridiculous how inefficiently PC games ran versus console games and how badly they used multi-core CPUs.
 
I'd wager the Witcher series is highly dependent on PC sales. It has always been a PC-first series (until now, apparently).

Also single player PC games continue to sell incredibly well despite the apparent piracy apocalypse we are experiencing. Or not.
That's why I wrote "predicted" in the last sentence of the first paragraph. They are doing the whole IGN, GameSpot, etc. honeymooning and big marketing push this time around to get those big console sales.

It's not an apocalypse, but like I said, from a publisher/dev standpoint, given that Witcher 3 will sell on PC even without those 2013 E3 visuals, why spend millions more of the dev budget to hit those visuals instead of streamlining dev costs and creating a nearly uniform game, visually, across all platforms?

PC piracy is still probably a big concern even with good PC sales on their past games. The economy has only gotten worse, and more people have less recreational spending money. Many people don't have a big moral issue with piracy. With tens of millions still not transitioned off older PCs/360/PS3, it's probably a good strategy to incentivize as many people as possible over to console rather than PC.

If I were an exec at a struggling AAA publisher, uncertain about the company's future, facing downsizing, at risk of losing my job while supporting my family, heck, I'd unfortunately want to form an industry-wide agreement/understanding with other publishers/devs not to offer that kind of huge visual difference (E3 2013 vs. release Witcher 3) on PC, as that could move more players to the platform that is easier to pirate on.
 
Because they could, and didn't have to wait until the GPU included hardware support! Point is, they could effectively combine a multi-core CPU with the GPU in one rendering engine, with access to the same memory at all times, etc., whereas DirectX9 was a very fixed rendering pipeline with little possibility to look back, very limited draw calls, etc.



But developers found that jobified, multi-threaded engines specifically tailored to make the most of Cell and GPU combined were a much better starting point for D3D 11 development.



Of course not. But eventually it's likely to become standard everywhere. Point being, don't underestimate the consoles' open architecture and optional freedom from (DirectX) abstraction layers, and what they've done for PC development. You should go back to the Mantle discussion from before DirectX12 was even in the picture; one of the raisons d'être mentioned was that it was ridiculous how inefficiently PC games ran versus console games and how badly they used multi-core CPUs.

You are missing the point. Forget APIs for a second (API efficiency is not what I am talking about here). What I mean is a developer would optimize differently depending on whether he/she were targeting a GPU from the "DirectX 9 age" vs. a GPU from the "DirectX 11 age". My point about Cell was that developers were only doing this because they wanted to achieve effects that GPUs from the "DirectX 9 age" were not designed to do. This isn't an API problem but a hardware one! However, even in those cases the graphics engine is still generally designed for "DirectX 9 age" GPUs. That does not port well over to D3D 11 (and neither does Cell code!). But since PCs had vastly superior performance it was okay to "brute force" instead of trying to develop elegant solutions. This does not mean elegant solutions do not exist, just that no one tries to find them. :)

To be very clear my arguments would be exactly the same if we had a "DirectX 12" style api the whole time.
 
I am not missing the point, I'm just (almost) completely disagreeing with you.
However, even in those cases the graphics engine is still generally designed for "DirectX 9 age" GPUs.
is just false.
 
Because they could, and didn't have to wait until the GPU included hardware support! Point is, they could effectively combine a multi-core CPU with the GPU in one rendering engine, with access to the same memory at all times, etc., whereas DirectX9 was a very fixed rendering pipeline with little possibility to look back, very limited draw calls, etc.

I'm not sure that saying the PS3 was beyond DX12 because it could do all that stuff in software really counts. That's the whole point of a DX spec: the features have to be implemented in hardware because that's how they get fast enough to use. Sure, maybe you could replicate the functionality of geometry shaders, programmable tessellation and conservative rasterisation on Cell. But you could do that on a Core 2 Quad too. The question in both cases is whether it would be fast enough to be worthwhile on any kind of meaningful scale, and I'm guessing the answer is no. That's why we have hardware-based GPUs in the first place.

And of course, the same question would also have to be applied to the 360's Xenon CPU if this argument is to apply to all consoles and not just the PS3 with its unusual CPU design.
 
Gran Turismo 6 does just that - it tessellates with the SPEs. Uncharted 2 blended animations with them, raytraced audio, etc. More importantly, DirectX 9 was crap with multi-core CPUs - heck, that wasn't really solved well until Mantle and DirectX12 - while teams like Insomniac were already moving to SPE-driven game engines, where one SPE would be driving other SPEs, etc.

Anyway, this is an old discussion. And it did apply to the 360 as well - all the optimisations that made sense for PS3 turned out to greatly benefit the 360 too, just not as much as on the PS3, where they were more necessary because it was so different from the PC. There are lots of talks about this from multi-platform developers, such as Criterion (Burnout) and later DICE as well.
 
I am not missing the point, I'm just (almost) completely disagreeing with you. [...] is just false.

Why do you feel that way? Are you suggesting that in those situations the PS3's GPU wasn't used at all (I find this hard to believe...)? Just because Cell was used for various effects doesn't mean developers could pretend the 7800 didn't exist. Like I said previously, it's difficult to develop an efficient port targeting D3D 11 cards (note I mean the feature level here, not the API) when your starting point is ~7800+Cell.

It seems you're still hung up on APIs. Cell didn't solve the "API problem". Cell was used because the 7800 wasn't able to achieve the desired effects developers wanted. Did you perhaps consider the possibility that developers might have been able to achieve the same effects if Sony had used a smaller CPU and a larger GPU for the PS3? In that hypothetical situation my argument would still be the same. The architecture of this "super 7800" would still hold back GPUs that have made architectural enhancements since then.
 
I wonder, considering the hardware that was available at the time, would a game developer rather have a G94 + normal CPU (e.g. Athlon X2), or RSX + Cell? I suspect the former is vastly preferable to the latter in almost all cases. I know G94 was a bit late to the party, but maybe it wouldn't have been if Sony hadn't put so much faith in Cell until the last minute, when they realized they would still need a conventional GPU in there.

Of course what we got was RSX + Cell so graphics programmers made the best of it (and did a fantastic job IMO considering the challenges Cell presented).

Edit because I meant G96 not G92 which was a different class of GPU.
Edit2 Nooo I meant G94 not G96.
 
Why do you feel that way? Are you suggesting that in those situations the PS3's GPU wasn't used at all (I find this hard to believe...)? Just because Cell was used for various effects doesn't mean developers could pretend the 7800 didn't exist. Like I said previously, it's difficult to develop an efficient port targeting D3D 11 cards (note I mean the feature level here, not the API) when your starting point is ~7800+Cell.

I'm not hung up on APIs, you are. You seem to think that graphics abstraction layers such as DirectX just describe whatever great new invention was done on GPUs. But that's just not true. In fact, I'm pretty sure D.I.C.E. has a paper out there outlining that targeting DirectX11 cards using the setup that was becoming common on Cell+RSX was perhaps the easiest path of all, except that they were still running into some DirectX11 API limitations. And the whole drive for Mantle and eventually DirectX12 came about because consoles were showing how hardware was being limited by DirectX forcing programmers into a specific pipeline that was no longer the most efficient.

Wow, I actually found that presentation again. It was six years ago, and on Frostbite 2. It shows very clearly what they could already do on consoles but had to wait until DirectX11 to be able to do on PC, which already had far more advanced GPUs back then:
http://s09.idav.ucdavis.edu/talks/04-JAndersson-ParallelFrostbite-Siggraph09.pdf

Here they were still very enthusiastic about the progress made in DirectX11, but they would still find themselves running into limits, such as the famous 'hey, why am I CPU-limited to one core/thread when feeding the GPU' issue in DirectX up to 12, the draw call issues, the inability to innovate on the rendering pipeline, etc. (see http://www.gamasutra.com/view/news/123987/AMD_DirectX_Holding_Back_Graphics_Performance_On_PC.php)
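To spell out the "one core feeding the GPU" complaint: in D3D11 all draw submission funnels through the single immediate context, while D3D12 lets every worker thread record its own command list and then submit them together. A hand-waved C++ sketch (my own, and it assumes the per-thread command lists have already been created and reset):

Code:
#include <d3d12.h>
#include <thread>
#include <vector>

void RecordAndSubmitFrame(ID3D12CommandQueue* queue,
                          std::vector<ID3D12GraphicsCommandList*>& lists)  // one list per worker
{
    // Each worker records its own command list in parallel; D3D11's single
    // immediate context had no real equivalent, so draw submission there was
    // effectively bound to one thread.
    std::vector<std::thread> workers;
    for (ID3D12GraphicsCommandList* list : lists) {
        workers.emplace_back([list] {
            // ... record state setup and draws for this worker's slice of the scene ...
            list->Close();
        });
    }
    for (std::thread& t : workers) t.join();

    // One cheap submission of everything that was recorded in parallel.
    std::vector<ID3D12CommandList*> submit(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(submit.size()), submit.data());
}

That, plus cheaper draw calls, is a big part of what Mantle and later DX12 were chasing.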

Cell by the way did a whole host of interesting things, but PC gamers typically prefer to remember it 'only' for 'fixing' the RSX performance. ;)
 
And if the PS3 had a decent DX10 class GPU they could have used it in ways that weren't possible on PC as well. That is just the nature of console development, and not specific to Cell.
 
Let's try this: pretend the PC had had a perfect API the whole time (however you want to define it). My argument would not change. Forget APIs (really forget them this time)! Do you feel that we've made advancements in GPU architectures since the 7800? Do you think these advancements might require new ways to approach problems?

As a side note, you need to let go of Cell. It was a mistake. If Sony could go back and do it all over again they wouldn't pick Cell. The way developers used Cell to solve RSX's deficiencies was interesting, but really RSX shouldn't have been deficient in the first place. I'm aware this will elicit some strong feelings from you, but you continue to use Cell as a crutch in this debate when it really has nothing to do with what I am talking about. Like it or not, RSX was used extensively even in games that utilized Cell for graphics-related tasks. That code does not translate well to modern-day PCs. Beyond that we'll have to agree to disagree.

Finally (another side note), it seems like you are under the impression that PC ports suffer mainly due to the CPU overhead of APIs. More often than not (at least for PS3/360 ports) I'd say that was not the case. It's one thing if developers consistently made the best possible use of DirectX and still came up short, but IMO that hasn't been the case...
 