Hardware utilization: PC vs console question

It wasn't twisted into this debate. You started it that way. You tried to compare differently powered hardware (the PS4 and demos on a PC using a GTX 680), and to back it up you used a tweet from Carmack comparing similar PC and console hardware.



It's very easy to understand, but your explanations aren't logical. Your explanations try to make the console environment sound like it can exceed its capability. You're even trying to twist ERP's post to justify what you are saying, which really goes against the analogy you just made. I also agree with Sonic's analogy.
I'm not even sure what you are debating; are you saying that it doesn't improve performance? Sonic's analogy was again about a high-end PC vs a console. Not sure how that backs up anything we are talking about...


More like you take a Prius and tune it... then race it against the stock Mustang that no one bothered to tune because it's way faster than the Prius is ever going to get ;)

It's like no one reads the thread and thinks people are talking about high-end PCs vs consoles...

:LOL:

The funny thing about a Prius is that when racing it you get terrible mpg. Like on Top Gear they got around 12 or something silly...
 
I was just kidding around, hence the ;). See the rest of the post for the more serious reply.

Oh :LOL: That wasn't there when I posted. I thought B3D had gone crazy today... :smile:

:runaway:

The whole debate started over this statement: can the PS4 at the leaked specs handle the games running on a single GTX 680 (UE4, Star Wars 1313, and the Square Enix demo, all at E3) and look about the same at console resolution?

edit: to be fair to Carmack, they do not use DX, they use OpenGL. That is why he does not work with it...
 
Upon rereading your original question, I guess I'd answer it like this:

First is it really possible to estimate how much a PC's hardware power is utilized for a game?
Outside of specific use cases, no. And even then, it's not about "how much is utilized"; it's utterly trivial to make the task manager say 100% ;) It's about how much time, and more importantly how much power, it takes to solve a fixed-size problem.

And if so is it really 50%?
In the general case, definitely not.

That said, I'd still say low-level APIs are interesting, and you do lose *something* to the API. Specifically on integrated graphics, as Carmack mentions, the current APIs are not particularly well suited. But that's sort of a separate topic, honestly.
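
To make that "time to solve a fixed-size problem" point concrete, here's a minimal Python sketch (an added illustration, not from any of the posts; the workload size is arbitrary). Both functions peg a core at ~100% in the task manager, but only one makes progress on an actual job:

```python
import time

def busy_wait(seconds):
    # Pegs a core at ~100% "utilization" while doing no useful work.
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        pass

def fixed_size_problem(n=5_000_000):
    # Also shows ~100% utilization, but it actually finishes a job.
    # The meaningful metrics are how long (and how much power) it takes.
    return sum(i * i for i in range(n))

busy_wait(1.0)  # task manager: 100%; useful work done: none
start = time.perf_counter()
fixed_size_problem()
print(f"time to solution: {time.perf_counter() - start:.2f}s")
```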
 
I'm not even sure what you are debating; are you saying that it doesn't improve performance?

If "improve performance" means for example a 7850 in a console environment means that it will perform better than it theoretically can, then yes I disagree with that. It will never perform better then it was designed to in a console. It would just perform closer to how it was intended to perform in a console.

Sonic's analogy was again about a high-end PC vs a console. Not sure how that backs up anything we are talking about...
It's like no one reads the thread and thinks people are talking about high-end PCs vs consoles...

And here is why people think that.

The whole debate started over this statement: can the PS4 at the leaked specs handle the games running on a single GTX 680 (UE4, Star Wars 1313, and the Square Enix demo, all at E3) and look about the same at console resolution?

To me it seems like you're missing your own premise that started this. You compared a high-end PC to a console, and that's why people respond accordingly. I was trying to find out how logical the premise was from people more experienced, because it didn't make sense to me. I'm satisfied with what I've seen from others posting.

Upon rereading your original question, I guess I'd answer it like this:

First is it really possible to estimate how much a PC's hardware power is utilized for a game?
Outside of specific use cases, no. And even then, it's not about "how much is utilized"; it's utterly trivial to make the task manager say 100% ;) It's about how much time, and more importantly how much power, it takes to solve a fixed-size problem.

And if so is it really 50%?
In the general case, definitely not.

That said, I'd still say low-level APIs are interesting, and you do lose *something* to the API. Specifically on integrated graphics, as Carmack mentions, the current APIs are not particularly well suited. But that's sort of a separate topic, honestly.

Actually those were my questions from a previous debate with KB. But the general responses given were in line with what I was thinking.
 
The whole debate started over this statement: can the PS4 at the leaked specs handle the games running on a single GTX 680 (UE4, Star Wars 1313, and the Square Enix demo, all at E3) and look about the same at console resolution?
"Console resolution" = 720p? With or without AA? 30Hz?

Remember, the difference between (720p, No AA, 30Hz) and (1080p, 4x AA, 60Hz) is ~5-10x more work! Indeed, that massive difference in the amount of computational power required to hit the "baseline expected performance" on each platform is what often makes people think PCs are vastly less efficient than they really are.
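
For reference, the back-of-envelope arithmetic behind that ~5-10x figure looks like this (a sketch, assuming pixel cost scales linearly with resolution and frame rate; the 4x MSAA cost factors are rough assumptions, since the real cost depends heavily on the renderer):

```python
# Pixels per second, scaled by an assumed AA cost multiplier.
def workload(width, height, fps, aa_cost=1.0):
    return width * height * fps * aa_cost

console = workload(1280, 720, 30)        # 720p, no AA, 30Hz
pc_low  = workload(1920, 1080, 60, 1.2)  # 1080p, 4x MSAA (cheap case)
pc_high = workload(1920, 1080, 60, 2.0)  # 1080p, 4x MSAA (expensive case)

print(pc_low / console)   # ~5.4x more work
print(pc_high / console)  # ~9.0x more work
```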
 
If "improve performance" means for example a 7850 in a console environment means that it will perform better than it theoretically can, then yes I disagree with that. It will never perform better then it was designed to in a console. It would just perform closer to how it was intended to perform in a console.



To me it seems like you're missing your own premise that started this. You compared a high-end PC to a console, and that's why people respond accordingly. I was trying to find out how logical the premise was from people more experienced, because it didn't make sense to me. I'm satisfied with what I've seen from others posting.



Actually those were my questions from a previous debate with KB. But the general responses given were in line with what I was thinking.

"will not perform better theoretically can, but it will perform closer. " Then wouldnt this statement be true. A console hardware would preform better than pc hardware? :LOL:

I am not comparing any high-end PC; I am comparing a GTX 680 to a console that will release in 1-2 years. That 680 will not be high end by the time these consoles launch.

Everyone has said it will perform better, exactly what I said... :LOL: Now how much better is really up for debate. I don't think we could find an exact answer unless we look game by game.


"Console resolution" = 720p? With or without AA? 30Hz?

Remember, the difference between (720p, No AA, 30Hz) and (1080p, 4x AA, 60Hz) is ~5-10x more work! Indeed, that massive difference in the amount of computational power required to hit the "baseline expected performance" on each platform is what often makes people think PCs are vastly less efficient than they really are.
With the PS360 that could mean below 720p. A lot of big AAA games run under 720p.

But I do not believe that will be the case with the PS4/X720. I believe 720p will be the target for next-gen games. I don't see the need for 1080p given that not all TVs even support that resolution.
 
"will not perform better theoretically can, but it will perform closer. " Then wouldnt this statement be true. A console hardware would preform better than pc hardware? :LOL:

I am not comparing any high-end PC; I am comparing a GTX 680 to a console that will release in 1-2 years. That 680 will not be high end by the time these consoles launch.

Everyone has said it will perform better, exactly what I said... :LOL: Now how much better is really up for debate. I don't think we could find an exact answer unless we look game by game.

I never said the console hardware wouldn't perform better than PC hardware. This is what I'm saying is wrong:

I look at it as the console having 2 TFLOPs of "PC power" and the PC having 1 TFLOP.

A 1 TFLOP GPU is 1 TFLOP no matter what. And in turn, your original premise tried to "push" the performance of the PS4's target GPU up there with a computer using a 680, based on the thought process I just quoted. So no, nobody has said what you are saying.
 
I never said the console hardware wouldn't perform better than PC hardware. This is what I'm saying is wrong:



A 1 TFLOP GPU is 1 TFLOP no matter what. And in turn, your original premise tried to "push" the performance of the PS4's target GPU up there with a computer using a 680, based on the thought process I just quoted. So no, nobody has said what you are saying.
So you are saying the console hardware will perform better. But that was my point. :?: I really don't have a clue what you are going on about. It seems you agree with what I'm saying.

Was it the fact I used TFLOPs as the measurement of this power difference?

But I was just using your example comparing the power in GFLOPs in this thread. The reason I used it on GAF is because they think that GFLOPs equal performance. Really, GFLOPs are all a lot of them care about, which is silly IMO. Like I said on GAF, I was making it easy to understand.

The console's GPU would have a max of, say, 900 GFLOPs used (it still deals with the same things, only much leaner) while the PC GPU only has 450 GFLOPs used.
I was saying the PC doesn't lose the performance (measured in your GFLOPs, for example) like you were saying; the console gains the performance (increases the GFLOPs).
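
To make the two framings concrete, here's a tiny sketch in the "effective GFLOPs" terms both posters are using (the efficiency fractions are illustrative assumptions taken from the 900/450 numbers above, not measurements). The peak is a hardware constant; software only changes the fraction of it that gets extracted:

```python
peak_gflops = 1000.0  # a hypothetical 1 TFLOP GPU; the peak never changes

console_efficiency = 0.90  # assumed: thin API, single fixed hardware target
pc_efficiency      = 0.45  # assumed: thicker API/driver overhead

# Same chip, different fraction of the peak actually extracted.
print(peak_gflops * console_efficiency)  # 900 "useful" GFLOPs
print(peak_gflops * pc_efficiency)       # 450 "useful" GFLOPs
```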

It's worth noting that when Carmack says 2x he's talking about DX9. DX11 will reduce that somewhat.

Also, that level of optimisation will only apply to games at least a couple of years into the console lifecycle because of the time it takes developers to optimise for console hardware. So I wouldn't expect a 1.8 TFLOP GPU in the PS4 to be matching the 7890 on day 1. Two years down the line in newer games it might, but of course by then the 7970 will be mainstream-level performance.

Finally, when we say PCs have half the efficiency of consoles, that would only be at console-level graphics, i.e. it would take double the RSX's performance to achieve PS3-level visuals in a modern game. Once you start scaling the graphics up, I expect PC games to get far less efficient than that due to the lack of optimisation given over to graphics beyond the console level.

That is the main point I have been making. I believe the games we saw at E3 will be running on the PS4/X720. That was the whole point of this debate. I will go even further and say that by the end of the gen, games will look better than these tech demos. This is why you made this thread: because you do not believe the PS4/X720 at the given specs can handle these games. That is the thread you should have posted...
 
Hmmm, what is probably missed in here is that consoles also get to "cheat" at certain workloads/optimizations, as the games are designed for them. E.g. the Xbox 360 has issues with texturing (specifically AF) as it is a performance hog. So a PC may not get much, or any, benefit in performance by matching 1-4x AF, but if 16x AF was applied in both scenarios the equation totally flips.
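
A rough sketch of why the AF point flips the equation (a simplified worst-case tap-count model added for illustration; real hardware short-circuits most of this work, so treat the numbers as upper bounds):

```python
TRILINEAR_TEXELS = 8  # bilinear (4 texels) across two mip levels

def max_texels_per_sample(max_aniso):
    # Nx anisotropic filtering takes up to N trilinear probes
    # along the axis of anisotropy (worst case).
    return max_aniso * TRILINEAR_TEXELS

for af in (1, 4, 16):
    print(f"{af}x AF: up to {max_texels_per_sample(af)} texel reads per sample")
```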

It really depends on what your workloads are and what you are willing to give up. Console developers are willing to give up a ton of IQ to match performance, but may in turn complain because some of the PC overhead makes a 60Hz game difficult on the PC. Closed boxes have a lot of advantages (a specific feature set, well-understood latencies, a larger target audience to justify investment, etc.) but the big one is being targeted as a baseline.

Maybe the best question would be to ask a handful of software engineers at a number of game studios making various games, both high end and multiplatform, if they could get better visual results from a 2 TFLOP GPU with an 8-core CPU and 8GB of memory on a PC, or a 1 TFLOP GPU with a 4-core CPU and 4GB of memory on a console.

Of course, if you let them cheat and cut texture resolution, disable AF, apply full screen blur (ohhh sorry, post-process AA!), cut LOD, let a ton of jaggies run wild, and lock at 30Hz BUT call it the "same", then the closed "cheat" box will always win :p
 
That is the main point I have been making. I believe the games we saw at E3 will be running on the PS4/X720. That was the whole point of this debate. I will go even further and say that by the end of the gen, games will look better than these tech demos.
Of course they'll run in some form. I doubt it'll be at the same resolution and quality levels, but who really cares? They'll make the best use of the hardware that they can. And frankly, I'll bet they could do even better on the PCs they are demoing on now if they spent some time optimizing. There really is an absurd amount of power in modern high-end GPUs... people just throw a lot of it away by jacking up resolutions, shadow maps, etc. without actually implementing more efficient algorithms.
 
So you are saying the console hardware will perform better. But that was my point. :?: I really don't have a clue what you are going on about. It seems you agree with what I'm saying.

Was it the fact I used TFLOPs as the measurement of this power difference?

But I was just using your example comparing the power in GFLOPs in this thread. The reason I used it on GAF is because they think that GFLOPs equal performance. Really, GFLOPs are all a lot of them care about, which is silly IMO. Like I said on GAF, I was making it easy to understand.


I was saying the PC doesn't lose the performance (measured in your GFLOPs, for example) like you were saying; the console gains the performance (increases the GFLOPs).



That is the point I have been making. I believe the games we saw at E3 will be running on the PS4/X720. That was the whole point I was making. I will go even further and say that by the end of the gen, games will look better than these tech demos.

That was only part of it. As I was saying, the problem is the example you were giving, because context-wise it doesn't agree at all.

And secondly, you used Carmack's tweet to back up that point, which, given the proper context according to some of the posts in this thread, doesn't work at all.

And to say end-of-gen PS4/Xbox3 games will look better than those demos, considering the hardware used, is expecting a lot from a fully fleshed-out game even that late. But they seem to want to push console gens longer before a successor is released, so they may figure out some tricks to achieve it. :LOL:

Of course, if you let them cheat and cut texture resolution, disable AF, apply full screen blur (ohhh sorry, post-process AA!), cut LOD, let a ton of jaggies run wild, and lock at 30Hz BUT call it the "same", then the closed "cheat" box will always win :p

Of course they'll run in some form. I doubt it'll be at the same resolution and quality levels, but who really cares?

Haha. This is what I'm getting at. I don't see any way the console version will be 1:1 with the PC version based on what we know so far.
 
Haha. This is what I'm getting at. I don't see any way the console version will be 1:1 with the PC version based on what we know so far.

It's funny how little things change; people said the same thing about the FF7 HD tech demo. "No way they will be able to run that." Now go back and watch how dated that thing looks. Maybe you were not around back then to know...
 
It's funny how little things change; people said the same thing about the FF7 HD tech demo. "No way they will be able to run that." Now go back and watch how dated that thing looks. Maybe you were not around back then to know...

That was done on PS3. Not the same comparison as to what we're talking about now.
 
That was done on PS3. Not the same comparison as to what we're talking about now.

That was 2005; the PS3 launched at the end of 2006. It was running on SLI 6800s, which were the first PS3 "dev kits." ;) Not only was it running on the best PC GPU out, it ran them in SLI.

Now you have tech demos running on a top-of-the-line single GPU.
 