Are PCs holding back the console experience? (Witcher3 spawn)

Let's try this: pretend the PC had a perfect API the whole time (however you want to define it). My argument would not change. Forget APIs (really forget them this time)! Do you feel that we've made advancements in GPU architectures since the 7800? Do you think these advancements might require new ways to approach problems?
...

That code does not translate well to modern-day PCs.

mm... yeah, so what I think you're trying to get at is how devs make their shaders, or use certain formats, or even look at Low-Level GCN optimizations. There are some great explanations in there on how things have changed over the years.
 
Let's try this: pretend the PC had a perfect API the whole time (however you want to define it). My argument would not change. Forget APIs (really forget them this time)! Do you feel that we've made advancements in GPU architectures since the 7800? Do you think these advancements might require new ways to approach problems?

Yes, of course!

As a side note, you need to let go of Cell. It was a mistake. If Sony could go back and do it all over again, they wouldn't pick Cell. The way developers used Cell to solve RSX's deficiencies was interesting, but really, RSX shouldn't have been deficient in the first place. I'm aware this will elicit some strong feelings from you, but you continue to use Cell as a crutch in this debate when it really has nothing to do with what I am talking about. Like it or not, RSX was used extensively even in games that utilized Cell for graphics-related tasks. That code does not translate well to modern-day PCs. Beyond that, we'll have to agree to disagree.

What gives you the impression that I am hanging on to it? You're completely missing the point. I'm not here to champion Cell at all; it's just an example that was relevant when this point was argued a lot, and it was the first many-core CPU in common use. You could see CU programming as very similar to SPE programming: a general puzzle of how to fit it nicely into new (and old) types of rendering pipelines.

Finally (another side note), it seems like you are under the impression that PC ports suffer mainly due to the CPU overhead of APIs. More often than not (at least for PS3/360 ports), I'd say that was not the case.

You just don't notice they suffer, because PCs are so much more powerful.

It would be one thing if developers consistently made the best possible use of DirectX and still came up short, but imo that hasn't been the case...

Here we couldn't disagree more. I gave you several specific quotes from high-profile developers. If you're genuinely interested, then all I can say is: read more on the subject. I'm done!
 
You could see CU programming as very similar to SPE programming: a general puzzle of how to fit it nicely into new (and old) types of rendering pipelines.

Perhaps at a very high level, but Cell code does not port well to the PC. You're right: if developers spent as much time on PC ports as they did with Cell, they could probably find some very elegant solutions. But developers don't, because the "brute force" option often works on PC. Again, this isn't an API/technical issue; it's a business issue.

You just don't notice they suffer, because PCs are so much more powerful.



Here we couldn't disagree more. I gave you several specific quotes from high-profile developers. If you're genuinely interested, then all I can say is: read more on the subject. I'm done!


If you think PC games are making the best possible use of DirectX, then yeah, we'll definitely have to agree to disagree. ;-)

[edited to be less of a jerk]
 
It's also a business issue to be content with 30fps on consoles, or to go for 30fps to begin with, because everything is a tradeoff in the end. I'm not sure if you're expecting every developer to always write fully optimal DirectX code, but if you're into logic at all, you should understand it only takes one who does, and we've heard from several, even directly here in the forums. I've given you links to developers who wrestled with DirectX, ran into limitations, and even offered solutions, and the same has been discussed here in many threads. If terms like fixed rendering pipeline, tight CPU/GPU integration, threaded rendering and such don't mean anything to you, and you're also not interested in reading about them, then I'm not sure why you'd even want to have this discussion.

If we go purely back to the topic, then I'll conclude by saying that I'm pretty sure consoles aren't holding back the PC experience, for either business or technical reasons.
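
To make "threaded rendering" a bit more concrete, here's a minimal, hypothetical sketch (not from any of the developers quoted) of how D3D11 lets worker threads record draw calls through deferred contexts; note that even then, final submission stays serialized on the immediate context, which is part of what people wrestled with:

```cpp
// Hypothetical sketch of D3D11 "threaded rendering" via deferred contexts.
// Worker threads record draw calls into their own deferred contexts; the
// main thread then executes the baked command lists in order. All names
// and the worker count are illustrative.
#include <d3d11.h>
#include <functional>
#include <thread>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void RecordChunk(ID3D11Device* device, ComPtr<ID3D11CommandList>& outList)
{
    ComPtr<ID3D11DeviceContext> deferred;
    device->CreateDeferredContext(0, &deferred);

    // ... set state and issue Draw*() calls on 'deferred' here ...

    deferred->FinishCommandList(FALSE, &outList); // bake into a command list
}

void RenderFrame(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    const int kWorkers = 4;
    std::vector<ComPtr<ID3D11CommandList>> lists(kWorkers);
    std::vector<std::thread> threads;

    for (int i = 0; i < kWorkers; ++i)
        threads.emplace_back(RecordChunk, device, std::ref(lists[i]));
    for (auto& t : threads)
        t.join();

    // Submission itself remains single-threaded on the immediate context.
    for (auto& list : lists)
        immediate->ExecuteCommandList(list.Get(), FALSE);
}
```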
 
It's also a business issue to be content with 30fps on consoles, or to go for 30fps to begin with, because everything is a tradeoff in the end. I'm not sure if you're expecting every developer to always write fully optimal DirectX code, but if you're into logic at all, you should understand it only takes one who does, and we've heard from several, even directly here in the forums. I've given you links to developers who wrestled with DirectX, ran into limitations, and even offered solutions, and the same has been discussed here in many threads. If terms like fixed rendering pipeline, tight CPU/GPU integration, threaded rendering and such don't mean anything to you, and you're also not interested in reading about them, then I'm not sure why you'd even want to have this discussion.

If we go purely back to the topic, then I'll conclude by saying that I'm pretty sure consoles aren't holding back the PC experience, for either business or technical reasons.

I've read many papers in my life. ;) I've also run many games through profilers (PIX, etc.). I've also heard many horror stories about developers from IHVs. :p I'm very confident that developers don't make (and haven't made) the best use of DirectX. You're right that some get closer than others (and then write papers about it!). But honestly, I think you're talking about a market that you don't fully understand. If you don't believe me, just try profiling a few games (start with GTA 4).

And going back to the topic, I'll ask you one more question: Why do you think a fixed platform couldn't possibly hold back a fluid platform 7 years later? You admitted we've made technological advancements since then (and that those technological advancements might require new approaches to various problems). But those advancements require time and resources to fully exploit. Time and resources exist in finite quantity. Having to support a fixed platform diverts time and resources away from researching those technological advancements (since you need to spend resources working out how to extract the best possible performance from the fixed platform). I don't think I'm making any crazy claims here! :p Do you think that, given more time for PC ports, developers could do a better job? Because by your logic the answer should be no (they've already "maxed out" DirectX). I suspect, though, that the answer might be yes...
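
If anyone wants to try that profiling experiment themselves, here's a minimal sketch (the pass names are invented for illustration) of annotating a D3D11 frame so that a PIX capture groups the work into named events:

```cpp
// Hypothetical sketch: wrap sections of a D3D11 frame in named events so a
// PIX capture shows per-pass timing. Pass names are illustrative only.
#include <d3d11_1.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void DrawAnnotatedFrame(ID3D11DeviceContext* ctx)
{
    ComPtr<ID3DUserDefinedAnnotation> annot;
    if (FAILED(ctx->QueryInterface(IID_PPV_ARGS(&annot))))
        return; // requires the D3D11.1-era runtime

    annot->BeginEvent(L"ShadowPass");
    // ... shadow map draws ...
    annot->EndEvent();

    annot->BeginEvent(L"MainPass");
    // ... opaque geometry draws ...
    annot->EndEvent();
}
```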
 
Or you could profile StarCraft II; not sure how it is now, but at release it was hilarious. (With PIX or whatever tool you prefer.)
 
And going back to the topic, I'll ask you one more question: Why do you think a fixed platform couldn't possibly hold back a fluid platform 7 years later?
Or you could profile StarCraft II; not sure how it is now, but at release it was hilarious. (With PIX or whatever tool you prefer.)
I think this statement demonstrates Arwin's view (one I neither agree nor disagree with). Why was StarCraft II poorly optimised when it wasn't a console game? It wasn't the consoles holding that title back. I think fundamentally it's business choices/economics that are holding things back. Whether consoles are negatively or positively supporting technological progress in software seems to be a matter of opinion. Do they divert attention from PC-specific implementations? Yep. Do they bring money to fund development that a PC-only title wouldn't get? Yep. The answer to the thread question is a mix of positives and negatives, and the net result (do consoles hinder PC games more than they help?) is probably unquantifiable.

Did the consoles result in a weaker Witcher 3 on PC? Yes. Therefore, consoles hold back PC, end of discussion, we only need this one perspective.
Did the consoles result in enough funding for GTAV to be realised and given a PC port? Yes. Therefore, consoles enable games to be made that otherwise wouldn't exist, bringing gaming to PC, end of discussion, we only need this one perspective.

Or, we look at several perspectives and find no clear monochromatic, monosyllabic answer. ;)
 
I think this statement demonstrates Arwin's view (one I neither agree nor disagree with). Why was StarCraft II poorly optimised when it wasn't a console game? It wasn't the consoles holding that title back. I think fundamentally it's business choices/economics that are holding things back. Whether consoles are negatively or positively supporting technological progress in software seems to be a matter of opinion. Do they divert attention from PC-specific implementations? Yep. Do they bring money to fund development that a PC-only title wouldn't get? Yep. The answer to the thread question is a mix of positives and negatives, and the net result (do consoles hinder PC games more than they help?) is probably unquantifiable.

Did the consoles result in a weaker Witcher 3 on PC? Yes. Therefore, consoles hold back PC, end of discussion, we only need this one perspective.
Did the consoles result in enough funding for GTAV to be realised and given a PC port? Yes. Therefore, consoles enable games to be made that otherwise wouldn't exist, bringing gaming to PC, end of discussion, we only need this one perspective.

Or, we look at several perspectives and find no clear monochromatic, monosyllabic answer. ;)

But none of the points you raised are technical. I'm not arguing whether the PC would be better off with or without consoles (that's a whole different discussion). So I don't think your GTA V example applies (in fact, I think it proves my point if anything... look at the jump the PC gets when consoles upgrade!). This debate, to me, is exactly the same "problem" I have when I have to support the iPhone 4. It sucks. Anything new we create still has to be designed to be compatible with the iPhone 4. That takes resources that could otherwise have been invested in adding new (or improving old) features on newer (and more capable) phones. Again, I don't feel like I'm saying anything radical, but fixed platforms (that are still relevant) hold back progress for newer ones.
 
So the question is whether the current "average" gaming PC is faster/more capable than the best consoles available right now?
Likely yes. (Could check the Steam hardware survey to get an idea.)
 
It's still an issue of economics. You could target only the iPhone 6, but then you have a smaller target audience. You could target only PCs with DX11 GPUs, but then you exclude older PCs. Even without consoles, it's a matter of targeting your spec for economic reasons. Perhaps the consoles drag engine tech along a bit longer, but it's not as if they alone are responsible.

So the question is whether the current "average" gaming PC is faster/more capable than the best consoles available right now?
Likely yes. (Could check the Steam hardware survey to get an idea.)
That's one way, I guess. If consoles < 'average' PC, they set a lower standard. Determining a meaningful average is going to be hard, though, as you need an equivalent user base and revenue. So, for example, rather than the median PC spec, you might need the minimum-spec PC that provides a large enough audience to fund a game to the same degree that consoles would. What spec would you have to drop to in order to reach an audience of, say, 40 million PCs?
 
I don't believe I ever said consoles are solely responsible!

Let's try this example (remember, I'm debating on technical grounds): pretend consoles don't exist. When do you think feature level 12_1 will be "seriously" (not as an afterthought) used in many AAA PC games? Now? Of course not: only Maxwell 2 supports it. Next year? Still probably not. Two years? Maybe, but you get the point. Eventually it's going to happen. Now try this exercise again knowing consoles exist. Do you think the "time to market" of feature level 12_1 will increase or decrease? Maybe you answer neither, because you believe only the PC market dictates that pace. I believe consoles in this respect (remember, I'm arguing technical, not economics) hinder the "time to market" of feature level 12_1, because they take resources away from exploring the possibilities of that feature level. Consoles may not be solely responsible, but they are a large part of the problem.
 
I don't believe I ever said consoles are solely responsible!

Let's try this example (remember, I'm debating on technical grounds): pretend consoles don't exist. When do you think feature level 12_1 will be "seriously" (not as an afterthought) used in many AAA PC games? Now? Of course not: only Maxwell 2 supports it. Next year? Still probably not. Two years? Maybe, but you get the point. Eventually it's going to happen. Now try this exercise again knowing consoles exist. Do you think the "time to market" of feature level 12_1 will increase or decrease? Maybe you answer neither, because you believe only the PC market dictates that pace. I believe consoles in this respect (remember, I'm arguing technical, not economics) hinder the "time to market" of feature level 12_1, because they take resources away from exploring the possibilities of that feature level. Consoles may not be solely responsible, but they are a large part of the problem.

I actually had a similar conversation with Max about DX12 feature levels, and I basically asked why 12_1 couldn't have been pushed into 12_0 (since it would have been really massive). And he told me that bindless resources were in fact a large determining factor for 12_0. Andrew actually told me the same thing. Them bindless resources ;)

12_1 (Tier 1 conservative rasterization, and rasterizer ordered views) still has fringe cases that the hardware has issues handling, so we're essentially looking at the Tier 3 variants of CR and ROV to really get the hardware working effectively in all cases.

That being said, wrt consoles... if the generation is shorter, this timing might be right. To be fair to the opposing arguments, we are still waiting for games to take advantage of PRT/Tiled Resources, and not all of them will leverage it. I think the Doom remake is the only game I know of using it as a megatexture system. Tomorrow's Children seems to be using it for its voxel-based GI.

While I agree consoles did hinder progress (e.g. compute shaders have been available for use since '07/'08 but were unavailable on consoles until now), when we look at fringe features like PRT/TR, and possibly ROV and CR, then perhaps they're not hindering.
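
For reference, all of the capabilities mentioned above are individually queryable tiers at runtime rather than one monolithic switch. A minimal sketch, assuming an already-created ID3D12Device:

```cpp
// Hypothetical sketch: query the optional-feature tiers discussed above
// (conservative rasterization, ROVs, tiled resources, resource binding)
// from a D3D12 device. FL 12_1 requires Tier 1 CR plus ROV support.
#include <d3d12.h>
#include <cstdio>

void ReportFeatureTiers(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts))))
        return;

    std::printf("Conservative rasterization tier:  %d\n",
                (int)opts.ConservativeRasterizationTier);
    std::printf("ROVs supported:                   %d\n",
                (int)opts.ROVsSupported);
    std::printf("Tiled resources (PRT/TR) tier:    %d\n",
                (int)opts.TiledResourcesTier);
    std::printf("Resource binding (bindless) tier: %d\n",
                (int)opts.ResourceBindingTier);
}
```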
 
I don't believe I ever said consoles are solely responsible!

Let's try this example (remember, I'm debating on technical grounds): pretend consoles don't exist. When do you think feature level 12_1 will be "seriously" (not as an afterthought) used in many AAA PC games? Now? Of course not: only Maxwell 2 supports it. Next year? Still probably not. Two years? Maybe, but you get the point. Eventually it's going to happen. Now try this exercise again knowing consoles exist. Do you think the "time to market" of feature level 12_1 will increase or decrease? Maybe you answer neither, because you believe only the PC market dictates that pace. I believe consoles in this respect (remember, I'm arguing technical, not economics) hinder the "time to market" of feature level 12_1, because they take resources away from exploring the possibilities of that feature level. Consoles may not be solely responsible, but they are a large part of the problem.
I'm unable to separate the technical from the economic. ;) Let's say 12_1 GPUs are prevalent in PCs in 3 years' time. There's absolutely nothing stopping developers from creating 12_1 engines that only run on those PCs at that point. If they choose to target a lower-end spec for economic reasons, whether for consoles or a wider PC market, that can't be helped. Certainly, if new consoles released in three years' time with modern GPUs, the possibilities of that feature set would be explored more, but those reasons are purely economic. Likewise, the reason Witcher 3 isn't pushing the envelope is economic. The developers decided it wasn't in their financial interest to invest in a higher-level PC engine targeting top-end hardware.

If there were an unlimited amount of money in the gaming industry, devs would likely explore cutting-edge rendering techniques for the fun of it and run multiple highly optimised engines, but they don't have that luxury now as they chase deadlines and pay the bills.
 
[image: GxaISiE.jpg]


hm.....

Pigment Map (1)

Render top-down view of the terrain
  • 1st clipmap level (closest area)
  • Use lowest mipmaps
  • No interpolation
  • No triplanar mapping
  • Account for color map
  • No tessellation - just one huge quad
    • provided that pregenerated normal maps are enabled

[image: kes6aVB.jpg]


[image: CgfJQDd.jpg]
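
Reading between the lines of the slide, the pigment map seems to be indexed by world-space XZ over the area the first clipmap level covers. A rough sketch of that lookup math (all names are guesses for illustration; this is not CDPR's actual code):

```cpp
// Rough sketch of how a vegetation shader might address the pigment map
// described above: a top-down render of the terrain over the first clipmap
// level, so world XZ maps linearly to UV. Names are guesses, not CDPR code.
struct Float2 { float x, z; };

Float2 WorldToPigmentUV(Float2 worldXZ,
                        Float2 clipmapCenter, // follows the camera
                        float  clipmapSize)   // world-space extent covered
{
    // Remap [center - size/2, center + size/2] to [0, 1] on both axes.
    Float2 uv;
    uv.x = (worldXZ.x - clipmapCenter.x) / clipmapSize + 0.5f;
    uv.z = (worldXZ.z - clipmapCenter.z) / clipmapSize + 0.5f;
    return uv; // sample with point filtering, per the "no interpolation" note
}
```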
 
lol, what is this? I didn't recognize this as being a real issue; I guess I'm blind to these problems.
I think they're trying to make the vegetation blend in better with the terrain, but I dunno; it seemingly destroys the colour contrast that makes the details pop out. Maybe? The third slide I uploaded doesn't seem so bad, but a lot of the screenshots just don't quite look like that (right side). *disclaimer: something something compression etc.
 
I've read many papers in my life. ;) I've also run many games through profilers (PIX, etc.). I've also heard many horror stories about developers from IHVs. :p I'm very confident that developers don't make (and haven't made) the best use of DirectX. You're right that some get closer than others (and then write papers about it!). But honestly, I think you're talking about a market that you don't fully understand. If you don't believe me, just try profiling a few games (start with GTA 4).

But whether making the best use of DirectX is the same as being better off designing a rendering pipeline that more closely matches what your game is trying to do, or that allows for completely different types of effects, is a completely different discussion. Why are there so very few games that morph vertices in realtime? Why is animation blending so rare? Why couldn't raycasting be used for more things on PC easily? I'm a noob, admittedly, but how efficiently DirectX is being used is not entirely the point. The point is, you're not showing anything about the papers you've read; you're just telling me that I'm wrong, with little in the way of backing it up. I'm very reasonable and very open to argumentation and evidence, and contrary to popular belief, I don't care about winning an argument. So give me some reasons to change my mind other than on faith. ;)

And going back to the topic, I'll ask you one more question: Why do you think a fixed platform couldn't possibly hold back a fluid platform 7 years later?

I've said the same: by the end of the console generation a lot of PCs were so far ahead of consoles that there was a sizeable number of PCs capable of running higher details, better physics, etc. If a game could target those with far better graphics and make a profit, then it would happen, and towards the end of the console cycle you saw some of that happening (high-res texture packs, etc.).

However, I was arguing that your 'fixed' platform was DirectX9 for much of last gen, before DirectX11 became widely enough available. You think it's the hardware that is fixed, but if it is 'open' enough to programmers, it is more software-limited than hardware-limited. Of course that is not entirely true: at the end of the day, we all know that art creation has been the primary limitation, and eventually a PC can push more by brute-forcing things, but the power-to-performance ratio isn't always great, and has typically been worse.

You admitted we've made technological advancements since then (and that those technological advancements might require new approaches to various problems). But those advancements require time and resources to fully exploit. Time and resources exist in finite quantity. Having to support a fixed platform diverts time and resources away from researching those technological advancements (since you need to spend resources working out how to extract the best possible performance from the fixed platform). I don't think I'm making any crazy claims here! :p Do you think that, given more time for PC ports, developers could do a better job? Because by your logic the answer should be no (they've already "maxed out" DirectX). I suspect, though, that the answer might be yes...

There are at least as many PC-to-console ports as the other way around, and in the current generation that holds more than ever. Not even all exclusives were developed on consoles last gen. On PC, for a long time developers had only DirectX9, and getting at the hardware improvements was a minefield because of the huge variation in GPUs (still an issue). Then there's this whole mentality of "we'll optimise everything away per game in the graphics driver"... Again, I'm pretty sure you know more about PC development than I do, but I still don't understand your position.

So the question is whether the current "average" gaming PC is faster/more capable than the best consoles available right now?
Likely yes. (Could check the Steam hardware survey to get an idea.)

I've seen numbers that suggest the number of Steam PCs more powerful than the PS4 is not even half the number of PS4s out there (remember, already more than 22 million). So perhaps it would be worthwhile looking at that number to be sure. Then I would like to know if you think PC hardware can be used as efficiently as console hardware in principle (we can discuss the in-practice side more later; right now I suspect console optimisation is still in its 'early days').
 
I've seen numbers that suggest the number of Steam PCs more powerful than the PS4 is not even half the number of PS4s out there (remember, already more than 22 million). So perhaps it would be worthwhile looking at that number to be sure. Then I would like to know if you think PC hardware can be used as efficiently as console hardware in principle (we can discuss the in-practice side more later; right now I suspect console optimisation is still in its 'early days').

I'm not sure about that. Based on the workings in this thread, there would seem to be around 30 million PCs by January this year running GPUs based on either GCN, Kepler or Maxwell.

https://forum.beyond3d.com/threads/direct3d-feature-levels-discussion.56575/page-5

Obviously some of those GPUs won't be more powerful than the PS4, but if only half of them are, that's still a very big market. It'd be interesting to know what the sales split is within those families. I'd say anything above a 660, 7850 or 750Ti would be worthy of being considered "more powerful" than a PS4. Obviously we don't need to discuss the CPU split.

With regard to efficiency, at least in GPU terms, I think the DF face-offs have shown by now that the likes of the 270 and 750Ti are performing as well as would be expected relative to their specs in comparison to consoles. Maybe that will change as the generation moves on, but then again DX12 is due in that same time frame.
 
How much graphics memory does a typical 660 have? I think 3.5GB of VRAM would soon be the minimum, and don't those have 2GB max? Half of 30 million is already less than the number of PS4s out there, close to just the Xbox One's install base. True, it is still a very sizeable market and it is well catered for right now, but the question is which market will grow faster. DirectX12 will also help PC-to-console ports become better. But all that said, I do believe that PCs are in better shape this gen than last gen, mind you (part of the reason why I bought a 970 for my i7, and will likely soon get a dedicated DS4 for it), and consoles definitely also have fewer 'hidden reserves'.
 
I'm unable to separate the technical from the economic.

That's not a very compelling argument. Like I stated before, I believe consoles make the "least common denominator" problem worse and make it harder to explore new advancements. I'm aware of why it happens (as you said, economics), but that doesn't change my argument. I'm not suggesting that life in the gaming world would be better overall without consoles, but they certainly don't speed up progress for architectures that have evolved since their release.

But whether making the best use of DirectX is the same as being better off designing a rendering pipeline that more closely matches what your game is trying to do, or that allows for completely different types of effects, is a completely different discussion.

No, it isn't. To make the best use of DirectX, your engine has to be built (from the ground up) to make use of it. I've already given you a way to verify my claims (PIX can't lie!). Roderic provided another example that illustrates what we're talking about (generally, incoming oversimplification: you want to render the same objects together, e.g. tree, tree, tree, building, building, unit, unit, etc., whereas SC2 IIRC is more tree, building, tree, unit, tree, unit, building, etc.). These types of performance problems have nothing to do with the overhead of the API and can easily be verified yourself. I feel like you're the one who needs a little more proof than a couple of papers in which developers pat themselves on the back (I look good in all the presentations I make too!). ;-) We should continue the discussion when you've done the proper research, because honestly we're just spinning in circles here. Like I said before, if you believe developers have been consistently making the best use of DirectX, there's not much we can discuss.
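
To make that oversimplification concrete, here's a minimal sketch of the usual fix: sort the frame's draws by a packed state key so identical objects render back-to-back and redundant state changes get skipped (the key layout and names are invented for illustration):

```cpp
// Minimal sketch of sort-by-state batching: instead of submitting draws in
// scene order (tree, building, tree, unit, ...), sort by a packed key so
// identical state runs back-to-back. Key layout is illustrative only.
#include <algorithm>
#include <cstdint>
#include <vector>

struct DrawCall {
    uint32_t shaderId;
    uint32_t materialId;
    uint32_t meshId;
    uint64_t SortKey() const { // costliest state change in the top bits
        return (uint64_t(shaderId) << 40) |
               (uint64_t(materialId) << 20) |
               uint64_t(meshId);
    }
};

void SubmitFrame(std::vector<DrawCall>& draws)
{
    std::sort(draws.begin(), draws.end(),
              [](const DrawCall& a, const DrawCall& b) {
                  return a.SortKey() < b.SortKey();
              });

    uint64_t lastKey = ~0ull;
    for (const DrawCall& d : draws) {
        if (d.SortKey() != lastKey) {
            lastKey = d.SortKey();
            // rebind shader/material/mesh only when the key changes
        }
        // issue the draw call here
    }
}
```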
 