Hardware utilization: PC vs console question

:LOL:

Yes, there are no PS4 dev kits despite the target specs coming out last year. :rolleyes:

And the rumor says 18 CUs, whose power even you've attested to in this very thread. I give you credit for trying though. :smile:

Not sure how rumored target specs prove anything. You have to have it on paper before you make the chips. :oops:

So you discount the confirmed leaked target specs/SDK of the Wii U, but the rumored target specs of the PS4 are set in stone? :LOL: ;)

I still think they will have an APU with a GPU, going back to the first PS4 rumors. It may not be a GPU/CPU APU, but it's going to have something on there with the CPU. The good thing about the PS4/X720 is that they should announce all the specs with the console, unlike the Wii U.
 
It's funny how little things change; people said the same thing about the FF7 HD tech demo. "No way they will be able to run that." Now go back and watch how dated that thing looks. Maybe you were not around back then to know...
No one's saying it's not going to look good, I don't think; just keep reasonable expectations. Obviously technology moves along, but if we're talking consoles in the next year or two at reasonable price points, it's doubtful they are going to have hardware at the level of a GTX 680. And no, being a fixed console design point is not going to make up that delta in raw performance in general.

But hey, like I said, I doubt they are really optimizing the hell out of the PC implementation, so it's quite possible you could achieve what they have with much less hardware.
 
Not sure how rumored target specs prove anything. You have to have it on paper before you make the chips. :oops:

So you discount the confirmed leaked target specs/SDK of the Wii U, but the rumored target specs of the PS4 are set in stone? :LOL: ;)

What? LOL.

I also give you credit for being consistent about twisting posts and being all over the place. I've never discounted the Wii U target specs. :smile: I've said that to you the last time you said that to me. You even quoted me in the Wii U GPU thread saying I still expect the base architecture to resemble an R700. So hopefully this time you'll remember that from here on. And now rumored target specs don't prove anything, but you sure harp on Wii U's as proving something. Don't know what Wii U has to do with this thread though.

Regardless, the PS4 won't have a GTX 680 or AMD equivalent. And expecting 1:1 performance with those demos, and eventually surpassing them at any point next gen, based on what we know is expecting a lot.
 
No one's saying it's not going to look good, I don't think; just keep reasonable expectations. Obviously technology moves along, but if we're talking consoles in the next year or two at reasonable price points, it's doubtful they are going to have hardware at the level of a GTX 680. And no, being a fixed console design point is not going to make up that delta in raw performance in general.

But hey, like I said, I doubt they are really optimizing the hell out of the PC implementation, so it's quite possible you could achieve what they have with much less hardware.

They said there was no hardware optimizing for the Square Enix tech demo. If you notice, at E3 everyone was using a single GTX 680 for their next-gen tech demos. I find it hard to believe everyone just happened to be demoing on the same card. It means something...

RPGSite pushed a little to find out what graphics card was powering Square Enix’s demo and although Hashimoto didn’t reveal its name, he said that ‘what I can say is that what we’re using is about the equivalent as what is being used by any other companies for their tech demos.‘

The equivalent as what is being used by any other companies huh? Well, we do know that Epic Games demonstrated Unreal Engine 4 on a single GTX 680. And we do know that Crytek used a GTX 680 for their CryEngine 3 tech demos. Gearbox has also used Nvidia’s GTX 680 cards to showcase the PC, PhysX accelerated, version of Borderlands 2. It’s also no secret that Nvidia’s GTX 6xx series was heavily used in this year’s E3 and we also know that the freshly released GTX 690 was not used by any company to showcase their tech demos.

Put these things together, and you get the card that powered the Agnis Philosophy Tech Demo. In other words, yes. Agnis Philosophy was running on a single GTX 680. In addition, the build that was demonstrated was not optimized at all, meaning that Square Enix could actually produce these graphics in real-time (when all physics, AI, and animations are added to the mix).

Now guess what Star Wars 1313 was running on? Yes, a GTX 680. Not hard to connect the dots here, guys...

http://www.dsogaming.com/news/the-i...phy-tech-demo-was-running-on-a-single-gtx680/
 
They said there was no hardware optimizing for the Square Enix tech demo. If you notice, at E3 everyone was using a single GTX 680 for their next-gen tech demos. I find it hard to believe everyone just happened to be demoing on the same card. It means something...
I wouldn't read too much into it honestly... they're just picking the fastest single GPU available, nothing magical.
 
Based on the video interviews and stuff I've read about Agni, it was run on a single GTX 680 with fps ranging anywhere from 30 to 60, heavily unoptimized ("polygon-modeled toes inside shoes", etc.), all at 1080p resolution.
Now assuming we run it with the rumored PS4 spec of 4GB GDDR5 ("let's be generous") plus a 1.84TF GPU, properly optimized as in cutting the unnecessary toes and so on, would you say the demo could retain 1080p at an average 30fps with FXAA and the same fidelity for the rest? This is the billion-dollar question, isn't it? ;)
 
Based on the video interviews and stuff I've read about Agni, it was run on a single GTX 680 with fps ranging anywhere from 30 to 60, heavily unoptimized ("polygon-modeled toes inside shoes", etc.), all at 1080p resolution.
Now assuming we run it with the rumored PS4 spec of 4GB GDDR5 ("let's be generous") plus a 1.84TF GPU, properly optimized as in cutting the unnecessary toes and so on, would you say the demo could retain 1080p at an average 30fps with FXAA and the same fidelity for the rest? This is the billion-dollar question, isn't it? ;)
Impossible to say of course without knowing the details of the demos, but I wouldn't be surprised if dropping AA, going to 30Hz, and optimizing could have it running on that sort of spec (or even lower, really). But I'm sort of sceptical of that spec TBH... time will tell.
 
Based on the video interviews and stuff I've read about Agni, it was run on a single GTX 680 with fps ranging anywhere from 30 to 60, heavily unoptimized ("polygon-modeled toes inside shoes", etc.), all at 1080p resolution.

Maybe people need to spend more time asking exactly what was pre-computed and what was computed on the fly in that particular demo, instead of running with the "it was totally unoptimized assets yo" line. Although it does look pretty.
 
Based on the video interviews and stuff I've read about Agni, it was run on a single GTX 680 with fps ranging anywhere from 30 to 60, heavily unoptimized ("polygon-modeled toes inside shoes", etc.), all at 1080p resolution.
Now assuming we run it with the rumored PS4 spec of 4GB GDDR5 ("let's be generous") plus a 1.84TF GPU, properly optimized as in cutting the unnecessary toes and so on, would you say the demo could retain 1080p at an average 30fps with FXAA and the same fidelity for the rest? This is the billion-dollar question, isn't it? ;)
I think if you change that from 1080p to 720p it's a yes... ;) Maybe by the end of the gen you can pull this off at 1080p...
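
One rough way to sanity-check that 720p claim: assume the demo averaged around 45fps at 1080p on the GTX 680 (roughly 3.1 TFLOPS FP32 peak), take the 1.84TF figure rumored in this thread, and assume shading cost scales more or less linearly with pixels times frames per second. The 45fps average and the linear-scaling assumption are mine, and the whole thing ignores bandwidth, geometry and CPU work, so treat it as a sketch, not a verdict.

```python
# Back-of-the-envelope only: linear pixels-x-fps scaling is a big simplification.
GTX_680_TFLOPS = 3.09        # published FP32 peak of the GTX 680
RUMORED_PS4_TFLOPS = 1.84    # rumored spec quoted in this thread

def relative_cost(width, height, fps, base=(1920, 1080, 45)):
    """Shading work relative to the demo's assumed 1080p @ ~45fps average."""
    bw, bh, bfps = base
    return (width * height * fps) / (bw * bh * bfps)

raw_ratio = RUMORED_PS4_TFLOPS / GTX_680_TFLOPS  # ~0.60x the 680's raw FLOPS

for label, (w, h, fps) in {
    "720p @ 30fps":  (1280, 720, 30),
    "1080p @ 30fps": (1920, 1080, 30),
}.items():
    cost = relative_cost(w, h, fps)
    verdict = "inside" if cost <= raw_ratio else "outside"
    print(f"{label}: ~{cost:.2f}x the demo's work, {verdict} the ~{raw_ratio:.2f}x budget")
```

Under those assumptions, 720p30 needs roughly 0.3x the demo's workload against a ~0.6x raw-FLOPS budget, while 1080p30 comes out around 0.67x, which is why the "optimize your way there by the end of the gen" caveat matters.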


I wouldn't read too much into it honestly... they're just picking the fastest single GPU available, nothing magical.
I dunno... You don't have a single one running SLI, or even the brand-new GTX 690. Seems like they targeted this spec for a reason, and every one of these will be next-gen console engines/games. Maybe it's nothing; I doubt we would ever find out if this was true or not.
 
That's cause SLI/AFR is trash :)

Epic is saying everyone is showing Sony and MS what they can do with this power. Maybe someone said "here's the target GPU, make what you can"... and the target GPU was the 680.

"In determining what the next consoles will be, I'm positive that [Sony & Microsoft are] talking to lots and lots of developers and lots of middleware companies to try and shape what it is. We've certainly been talking with them and we've been creating demonstrations to show what we think.


"And obviously the Elemental demo, same thing. We're certainly showing capability if they give s that kind of power, but so is everybody else."
Epic even said if they can't do that today, then delay the consoles another year.


http://www.videogamer.com/xbox360/g...ive_leap_in_next-gen_console_performance.html
 
:LOL:

Yes, there are no PS4 dev kits despite the target specs coming out last year. :rolleyes:

And the rumor says 18 CUs, whose power even you've attested to in this very thread. I give you credit for trying though. :smile:

If there aren't dev kits, don't expect a PS4 until WELL into 2014. You do realize that dev kits are released well before hardware is even taped out, right?
 
Just came across this:


games.on.net/2012/09/why-the-pc-version-of-nfs-most-wanted-will-be-the-best-around-criterion-talks-tech/ said:
http://games.on.net/2012/09/why-the...will-be-the-best-around-criterion-talks-tech/

[...]

games.on.net: Do you know which features of DX11 you’ll be using?

Leanne Loombe: We’re primarily leveraging the increased efficiency of DX11 to give improved performance. The move to DX11 from DX9 has given us around a 300% improvement in rendering performance.

[...]


:D
 
I have my 3x AMD 7950s at a constant 99% load, so they're being used to the fullest ;)

What are they used for, though? Running console ports at a higher resolution? :smile:

If they'd optimise for PC, then you would be playing SimCity the size of New York with the detail level of Crysis :cool:
 
Just came across this interview again, and the following part appears to fit within this thread quite well:


Digital Foundry said:
http://www.eurogamer.net/articles/digitalfoundry-tech-interview-metro-2033?page=4

Digital Foundry: How would you characterise the combination of Xenos and Xenon compared to the traditional x86/GPU combo on PC? Surely on the face of it, Xbox 360 is lacking a lot of power compared to today's entry-level "enthusiast" PC hardware?

Oles Shishkovstov: You can calculate it like this: each 360 CPU core is approximately a quarter of the same-frequency Nehalem (i7) core. Add in approximately 1.5 times better performance because of the second, shared thread for 360 and around 1.3 times for Nehalem, multiply by three cores and you get around 70 to 85 per cent of a single modern CPU core on generic (but multi-threaded) code.

Bear in mind though that the above calculation will not work in the case where the code is properly vectorised. In that case 360 can actually exceed PC on a per-thread per-clock basis. So, is it enough? Nope, there is no CPU in the world that is enough for games!

The 360 GPU is a different beast. Compared to today's high-end hardware it is 5-10 times slower depending on what you do. But performance of hardware is only one side of equation. Because we as programmers can optimise for the specific GPU we can reach nearly 100 per cent utilisation of all the sub-units. That's just not possible on a PC.

In addition to this we can do dirty MSAA tricks, like treating some surfaces as multi-sampled (for example hi-stencil masking the light-influence does that), or rendering multi-sampled shadow maps, and then sampling correct sub-pixel values because we know exactly what pattern and what positions sub-samples have, etc. So, it's not directly comparable.
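
For what it's worth, the 70 to 85 per cent figure falls straight out of the factors Shishkovstov lists. A minimal sketch of that arithmetic follows; treating "approximately a quarter" as a 0.20-0.25 spread is my own assumption, used only to reproduce the range he quotes.

```python
# Plugging the quoted factors into the formula described above; nothing here
# beyond the arithmetic in the interview.
XENON_CORES = 3           # Xbox 360 CPU cores
XENON_SMT_GAIN = 1.5      # his factor for the 360's second hardware thread
NEHALEM_SMT_GAIN = 1.3    # his factor for Nehalem hyper-threading

def xenon_vs_one_nehalem_core(per_core_fraction):
    """Whole Xenon (3 cores, SMT on) vs one SMT-enabled Nehalem core at the
    same clock, on generic multi-threaded code."""
    return per_core_fraction * XENON_SMT_GAIN * XENON_CORES / NEHALEM_SMT_GAIN

# "approximately a quarter" is fuzzy; a 0.20-0.25 spread (my assumption)
# roughly reproduces the 70-85 per cent range he gives.
for fraction in (0.20, 0.25):
    print(f"per-core fraction {fraction:.2f} -> {xenon_vs_one_nehalem_core(fraction):.0%}")
```

That prints roughly 69% and 87%, bracketing the 70 to 85 per cent estimate for generic, non-vectorised code.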
 