The semantic complications of 'demanding' games. *spawn

Status
Not open for further replies.
Depending on what you're talking about, not very. Context.

You're not making any kind of meaningful argument here, just making fun of someone's very clearly stated and perfectly coherent semantics.

I'm not making fun of anyone; it just sounds ridiculous. I understand what he's saying: the code doesn't run well on the hardware, therefore the code is more demanding. But I have never heard poorly optimized code called the most demanding on console hardware. By that logic, Zone of the Enders on PS3, a PS2 game, would at release be called a more demanding game than Uncharted 3 because it ran at 20-30 fps, according to sebbi. That's his POV as a coder; I look at it in terms of GPU saturation, and frankly that's how most people look at it.
 
Many ways to view it. An excellent example might be if games were released at native 4K vs. reconstruction on the same system. A normal person might not be able to tell the resolutions apart, but they could notice that one had a graphical advantage over the other as a result of freeing up system resources for higher settings.

Which is more demanding? The native 4K or the reconstruction running higher settings? In your POV the latter should be more demanding but we see that the hardware struggles with a native 4K presentation.

I.e., look at the 4Pro. Let me reposition the argument as a lower native resolution, say 1440p, vs. CBR 4K. In your mind the latter will look better, and it's running at a higher resolution.

Was the code unoptimized? Or is the code now doing more work for less?

It's the same as when graphics moved away from SSAA to other forms of anti-aliasing: developers found ways to get approximately the same results for less.

Or what about the fast inverse square root trick (attributed to Carmack, though likely not his)? Back in the day, and still today, the inverse square root is one of the more taxing operations to perform natively on hardware, but this trick replaces most of the calculation with a simple hexadecimal constant and a refinement step.

You won't notice the difference graphically, but you will notice the performance difference between doing the math the exact way vs. the trick.
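For reference, the trick in question looks roughly like this (the well-known Quake III formulation, rewritten here with memcpy to avoid aliasing issues; a sketch rather than production code):

```c
#include <stdint.h>
#include <string.h>

/* Approximate 1/sqrt(x) with the famous bit-level trick.
   One Newton-Raphson iteration refines the initial guess. */
static float fast_inv_sqrt(float x)
{
    float half = 0.5f * x;
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);      /* reinterpret the float's bits */
    bits = 0x5f3759df - (bits >> 1);     /* the magic-constant guess     */
    float y;
    memcpy(&y, &bits, sizeof y);
    y = y * (1.5f - half * y * y);       /* one Newton step              */
    return y;
}
```

With the single Newton step the result is typically within roughly 0.2% of the true value, which was more than good enough for lighting math at the time.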

So sebbbi imo is right. His POV is extremely valid, and with his experience as a senior rendering programmer for as long as he's been doing it, he's seen tons of this type of stuff come and go. He's one of the few people in the world who actually uses high-level math as part of his regular job, and one of the few who are constantly looking for faster ways to do the same thing.

Because if he didn't, if the industry didn't try to find ways to get the same result for less, graphics wouldn't move forward. Since the inception of graphics hardware, we've done more than just increase raw power; we've changed the way games are coded.

Look at Crysis 1 vs. Crysis 3. Crysis 3 looks better than 1 and runs several times better on the newest hardware. Crysis 1's coders thought graphics would head in one direction, but the industry ended up heading in another. And that's why that meme exists: the game ran like a dog, and no one could run it properly on ultra until recently. That's how much raw horsepower is required to run Crysis.
 
If I ask you to list 10 of the most demanding console games, how many would be graphical tour-de-forces and how many would be badly optimised indies with poor framerates and IQ?
 

The shittier the results, the more demanding the game? You can have a game that looks a generation ahead of another and is doing way more in every technical sense, but if it runs at a lower resolution at the same frame rate, it's somehow more demanding. Zone of the Enders before the patch would have made the list on PS3, not after; it was more demanding at 720p 20-30 fps. Let's not just call it badly optimized.
 
it was more demanding at 720p 20-30 fps. Let's not just call it badly optimized.
The two aren't mutually exclusive. It was demanding because it was (possibly) badly optimised and the GPU had trouble doing the work required. Then, after the patch, it was demanding in a different sense, because the code was optimised and the GPU was kept more optimally busy.

I understand your POV and am hoping I can illustrate it to others, but I don't exclusively side with your argument, and I agree with others that sebbbi's post should have been read in context, as the context defined his use of 'demanding'.
 

Yes, I know. Obviously, if Snake Pass looked like it was pushing the PS4, I could say it's doing things that push the GPU/CPU. When I first saw the game, I expected 1080p/60 fps.

I think sebbi's post is just plain wrong:

"Snake Pass can only muster 864p @ 30 fps on PS4. This makes it more demanding to the GPU than all the biggest AAA 1080p @ 30 fps games, including Horizon Zero Dawn."

Basically, lower resolution = more demanding. It's like developers aren't accountable for anything.
 
He looks at "demanding" as a function of frame time: the longer the frame time, the more demanding the game.

He doesn't care whether the GPU is idling while waiting for work or actually working; it's clearly the same to him. If there is a reduction in resolution to meet the frame-rate target, he knows the game is demanding on the GPU side of things.
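That frame-time framing can be made concrete with some back-of-the-envelope arithmetic (the resolutions are from the quoted post; the per-pixel budget is illustrative, and 864p is assumed to mean 1536x864):

```c
/* Per-pixel GPU time budget at a given resolution and frame rate.
   A lower resolution at the same frame rate means fewer pixels share
   the same frame budget, i.e. whatever is being rendered is costing
   more per pixel. */
static double ns_per_pixel(int width, int height, double fps)
{
    double frame_ns = 1e9 / fps;                 /* frame budget in ns */
    return frame_ns / ((double)width * height);
}

/* 864p30 (Snake Pass): ns_per_pixel(1536, 864, 30.0)  -> ~25.1 ns
   1080p30 (AAA title): ns_per_pixel(1920, 1080, 30.0) -> ~16.1 ns */
```

In this view the 864p30 game is spending more GPU time on each pixel than the 1080p30 game, which is exactly the sense in which it is "more demanding".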
 
I think sebbi's post is just plain wrong. Basically, lower resolution = more demanding. It's like developers aren't accountable for anything.
No, it's context, as Iroboto describes. A game running at lower res and framerate than another is demanding more work from the pixel shaders/bandwidth/etc. than they can serve up at 1080p30.

It's like a secretary who can type 80 words per minute. One boss provides clear hand-written text that's very easy to read and properly punctuated, so she can just copy it at her full 80 wpm. Another boss provides messy scrawls with lots of corrections and errors she needs to fix, but still wants the same quality of output and the same amount of work done in the same time. Of course she can't manage that and her output drops to 40 wpm. Which boss is the more demanding?
 
The shittier the results, the more demanding the game?

Sigh. No. The more work a game needs the GPU to do on each frame (= lower resolution and framerate), the more demanding it is, regardless of how it looks.

You can have a game that looks a generation ahead of another and is doing way more in every technical sense, but if it runs at a lower resolution at the same frame rate, it's somehow more demanding.

No, not somehow. It's been explained to you multiple times and thoroughly, by sebbi among others. As an oversimplified example: a certain graphical effect coded by an indie developer takes 1000 GPU instructions to do the work; the same effect coded by an AAA developer does it in 700 instructions. Clearly the AAA version is less demanding (700 ops vs. 1000 ops) and runs at a higher framerate.

and is doing way more in every technical sense

Incorrect. In the context in which we are speaking (how taxing it is to the hardware), the better-optimized AAA game, as long as it's running at a higher framerate, is technically doing less per frame, even though it achieves more with the work it does give the GPU. That is optimization.

Zone of the Enders before the patch would have made the list on PS3, not after; it was more demanding at 720p 20-30 fps. Let's not just call it badly optimized.

You've been told several times that being badly optimized and being more demanding are not mutually exclusive. And yes, it was clearly more demanding pre-patch, since it required the GPU to work harder on each frame.
 
Optimisation makes your game / code less demanding in terms of required hardware by reducing work done or by increasing utilisation.

It's about doing less or needing less.
 
We programmers use the word "demanding" in a very different context than most people. If someone on my team wants to add a new rendering algorithm and he/she describes it as "demanding", I am worried, because "demanding" isn't exactly a positive word when you need to achieve a 16.6 ms frame time goal.

Everyone agrees that the original Crysis was a very demanding game. It is the prime example of a "demanding" game. We all remember the internet meme: "Does it run Crysis?". Crysis ran at a poor frame rate on most PC GPUs, even the expensive ones. Only future GPUs made it smooth at high resolutions and allowed maximum settings. Crysis 2, on the other hand, used the new CryEngine 3 and was much better optimized. Crytek was able to launch it on consoles because it was much less demanding than the original Crysis. If you compared both games on a last-gen-console-equivalent PC, the newer game looked much better; the original Crysis had to be scaled down to very low graphics settings and resolution to run at an acceptable frame rate. Demanding isn't always a good thing.

This debate is silly, because the word "demanding" has so many meanings. All of the meanings mentioned in this thread are correct in their respective context.
 
No, it's context, as Iroboto describes. A game running at lower res and framerate than another is demanding more work from the pixel shaders/bandwidth/etc. than they can serve up at 1080p30.

It's like a secretary who can type 80 words per minute. One boss provides clear hand-written text that's very easy to read and properly punctuated, so she can just copy it at her full 80 wpm. Another boss provides messy scrawls with lots of corrections and errors she needs to fix, but still wants the same quality of output and the same amount of work done in the same time. Of course she can't manage that and her output drops to 40 wpm. Which boss is the more demanding?

There could be two reasons for this: the game is well optimized and pushing the GPU/CPU near their limits, or it's just poorly optimized and pushing the GPU to its limits, like sebbi was saying. In cases like Zone of the Enders on PS3, Resident Evil 5/MGS Rising on the Tegra X1 Shield, or Snake Pass, we know those GPUs can do so much more. I get the context, I just don't see how it's relevant, as we were talking about AAA games running on Switch. I was saying indie games should run better on Switch, since AAA games are pushing the PS4, which has a huge spec advantage over the Switch. He brought up Snake Pass being more demanding than any AAA game, and claimed that highly optimized code on PS4/XB1, which have a totally different architecture and are much more powerful, would do better on Switch. How does that make sense?
 
I was saying indie games should run better on Switch, since AAA games are pushing the PS4, which has a huge spec advantage over the Switch. He brought up Snake Pass being more demanding than any AAA game, and that highly optimized code on PS4/XB1, which have a totally different architecture and are much more powerful, would do better on Switch. How does that make sense?
Because highly optimised code on Switch will suffer less relative impact from lower max performance than poorly optimised code. PS4 can brute-force its way through solutions in a way Switch can't, so Switch being asked to run the same engine as one designed around 150 GB/s, for example, is going to be crippled. A AAA title written for PS4 and ported as a AAA port to Switch will rewrite how it solves problems to better fit Switch's strengths and weaknesses.
 

We're talking about highly optimized code on PS4/XB1 being ported to Switch. How does that help the Switch? Even if a badly optimized game gets ported to Switch, it depends what the port team does with it; they could turn it into a very well optimized Switch port, since the hardware and specs are so different that they're going to have to do a lot of rewriting anyway.
 
We're talking about highly optimized code on PS4/XB1 being ported to Switch.

No one has talked about that. Obviously an AAA port would be optimized for the new platform, not to mention that it would be using a different API.

EDIT: That's assuming that you are talking about ps4/xb1 specific optimizations here. Optimization is not limited to that, general optimizations would benefit all platforms.
 

That's up to the developers though; it's not obvious at all. Batman: Arkham Knight was one of the most optimized games on consoles, yet its code didn't run well on much more powerful PCs when it launched. And what about AAA games on Wii U? Batman: Arkham City and CoD were well-optimized games on 360/PS3, but their frame rates tanked on Wii U. Being well optimized on one architecture has nothing to do with a port being good on a totally different one.
 
We're talking about highly optimized code on PS4/XB1 being ported to Switch. How does that help the Switch?
Why would it be any worse?

Having read through this thread properly, including sebbbi's replies where he did actually qualify his terms nicely, making much of the debate a bit redundant (my example of a secretary was already covered by his example of a sorting algorithm), it's not even clear what argument is being (dis)proven. I think you're saying that if a badly optimised PS4 game like Snake Pass runs poorly on Switch, a highly optimised PS4 game will run proportionally worse on Switch. That can only be true if the room to optimise on PS4 is proportionally greater than on Switch. Your intention to extrapolate from low-grade ports to AAA ports just doesn't work. The low-optimisation games face the same challenges, if not more, in running on Switch as the high-quality ports do.

Let's say PS4 can do 100 graphical units of drawing pretties on screen, and Switch can do 25.

What you are apparently suggesting is a game like Snake Pass is only using 50 of that PS4's theoretical 100 graphical units, being an undemanding game that's not maxing the hardware. And this 50 is then ported down to Switch's 25. Ergo, a drop of 50% power requirement from PS4 to Switch (50 >> 25). Therefore you'd expect a AAA game using PS4's 100 graphical units fully to port worse down to switch because the chasm is greater. That would likely be incorrect.

If a game spends PS4's 100 units on drawing a rather plain 900p60 game, that same game ported at the same quality will run at 636p30 on Switch (all other assets being proportional).
If a game spends PS4's 100 units on drawing a fabulously beautiful 1080p60 game, that same game ported at the same quality will run at 763p30 on Switch (all other assets being proportional).

That's because the same engine ported will gain the same level of utilisation from whatever hardware, all things being equal. The relative delta between ports will always be 4x.
 

Those examples you listed are games running at 60 fps. What if the game is 30 fps and also using almost all of the PS4's CPU power, which is around 40-50% greater? I imagine simple indie games like Snake Pass, Bomberman, and Project Setsuna are much easier ports than something like Batman: Arkham Knight, Assassin's Creed, or Battlefront. The downgrade gap should obviously be much smaller for those indie games as well.
 
That's up to the developers though; it's not obvious at all. Batman: Arkham Knight was one of the most optimized games on consoles, yet its code didn't run well on much more powerful PCs when it launched. And what about AAA games on Wii U? Batman: Arkham City and CoD were well-optimized games on 360/PS3, but their frame rates tanked on Wii U. Being well optimized on one architecture has nothing to do with a port being good on a totally different one.

But then you are talking about the quality of the ports, not the capabilities of the Switch. Wii U had terrible third-party support; the Switch seems to have started out a lot better, but only time will tell. That's not the discussion in a thread called "Nintendo Switch Technical Discussion"; what was being discussed is whether the Switch hardware and ecosystem itself poses a barrier.
 