My facility isn't a matter of public record
Yikes, where is my tinfoil hat
Yea, I just googled it, you guys are correct. It is triple buffered @ 30fps.
So the remaster is quite a bit faster in latency, though I suppose one could say it's equivalent to triple buffered 60fps.
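For anyone who wants to sanity-check that claim, here's a quick sketch (my own illustration, not from any developer) of how buffer count and frame rate turn into swap-chain latency:

```python
# Minimal sketch: latency contributed by buffering alone, assuming each
# queued frame adds one full frame time of delay before display.
def buffer_latency_ms(fps: float, buffered_frames: int) -> float:
    return buffered_frames * 1000.0 / fps

print(buffer_latency_ms(30, 3))  # PS3 version, triple buffered @ 30fps -> 100.0 ms
print(buffer_latency_ms(60, 3))  # remaster, triple buffered @ 60fps   ->  50.0 ms
```

Same three frames of buffering either way, but each frame is half as long at 60fps, so the wall-clock latency halves.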
As Neversoft itself is responsible for most of the latest Guitar Hero games, where latency is hugely important, it is perhaps not surprising that Mick West took such an interest in this subject, and his conclusions are intriguing.
- The lowest latency a video game can have is 50ms (three frames) - the PS3 XMB runs at this rate, but few games reach it.
- Most 60FPS games have a 66.67ms latency - Ridge Racer 7, for example.
- 30FPS games have a minimum potential lag of 100ms, but many exceed this.
- Game developers should test their own games using the camera technique in order to weed out bugs - West says that Heavenly Sword's response slows down to 300ms just by turning the character, and reckons it's a technical issue that should have been resolved before going gold with the game.
- Citing GTAIV as an example, West suggests that a 166ms response is where gamers notice controller lag, which could also explain the Killzone 2 furore.
- Game reviewers should accurately measure latency for their reviews where controller lag is an issue, in the hope that sloppy game response times come under far more scrutiny.
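To put numbers on West's camera technique: film the pad and the screen together, count the capture frames between the button press and the on-screen response, and convert. A rough sketch, assuming a 60fps capture camera (the article's actual setup was more elaborate than this):

```python
# Convert a count of capture-camera frames into milliseconds of lag.
CAMERA_FPS = 60.0  # assumed capture rate; a faster camera gives finer resolution

def measured_latency_ms(frames_to_response: int, camera_fps: float = CAMERA_FPS) -> float:
    return round(frames_to_response * 1000.0 / camera_fps, 1)

print(measured_latency_ms(4))   # 66.7 ms: the typical 60FPS-game figure above
print(measured_latency_ms(10))  # 166.7 ms: roughly where West says players notice lag
```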
So, I am assuming input lag = controller lag.
http://www.eurogamer.net/articles/digitalfoundry-lag-factor-article

Correct.
Yikes, where is my tinfoil hat

Reading it back it sounds a little ominous, but it's really not.
True!! But I am surprised that you guys have clusters this size! The cluster I am using is one of the three national computing centers in Germany (Jülich Forschungszentrum).
Scan down that list and think about what type of organisations are conspicuous by their absence.
Now you're just teasing. You're obviously working with Pornhub's massive data centre. Fact. Fukt.
Sentient computers seeking world domination?
Arwin, why no PS3 TLoU on your site?
Halleluja. Those of us working in cluster and server farm environments knew this 15 or so years ago. Even then the writing was on the wall; it was more economical to go wider than faster. It was interesting reading developers' issues with PlayStation 3 architecture, particularly parallelising (or 'jobifying') a basic problem into many parallel jobs. As always with parallelism, the wider you go, the more latency becomes a technical barrier. If you think managing L1 and L2 cache is difficult, pan out and try managing L4 and even L5 cache.
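To make the 'jobifying' point concrete, here's a toy sketch (mine, not anyone's actual engine code) of slicing one loop into independent jobs for a worker pool; on real hardware the hard part is the data packaging and latency around this, not the split itself:

```python
from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk):
    # Stand-in for real per-element work; in a game this might be
    # skinning, culling, or physics on a slice of the scene.
    return [x * x for x in chunk]

def run_jobs(data, num_jobs=4):
    # Slice the problem into independent jobs with no shared state.
    # Each job's inputs and outputs must be packaged up explicitly,
    # which is where the latency and bandwidth costs creep in.
    size = (len(data) + num_jobs - 1) // num_jobs
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=num_jobs) as pool:
        results = pool.map(process_chunk, chunks)
    return [y for part in results for y in part]

if __name__ == "__main__":
    print(run_jobs(list(range(10))))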
Managing data access and dependencies has been key in performance-critical computing for a very long time. It has mostly been the simplicity of consumer computer hardware that has shielded the majority of coders from this fact, plus of course the fact that a lot of code just isn't performance critical, at least when viewed in isolation.

I am not sure what you mean by simplicity. If you mean simplicity in terms of what architectures present to the programmer, that was intentional; if you mean simplicity in terms of what they are internally, that is not the case.
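As a small illustration of the data-access point (my example, with the caveat noted in the comments): the two functions below do identical arithmetic but walk memory in different orders, and only the access pattern changes the cost.

```python
import random
import time

N = 1000
matrix = [[random.random() for _ in range(N)] for _ in range(N)]

def sum_row_major(m):
    # Visits elements in the order the rows store them.
    total = 0.0
    for row in m:
        for x in row:
            total += x
    return total

def sum_column_major(m):
    # Identical arithmetic, but strided access across all rows.
    total = 0.0
    for j in range(N):
        for i in range(N):
            total += m[i][j]
    return total

for fn in (sum_row_major, sum_column_major):
    start = time.perf_counter()
    fn(matrix)
    print(fn.__name__, f"{time.perf_counter() - start:.3f}s")
```

In CPython much of the measured gap is indexing overhead rather than cache misses; in C, or on something like an SPU, the same access-pattern difference is dominated by memory behaviour, which is exactly the point about hardware shielding coders from it.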
Simple algorithm + massive computational load seems, for whatever reason, to have higher status than complex algorithm + lower computational load. Maybe it's because of the sex appeal of FLOPS, maybe it's because of the deeper understanding of the underlying problem required for the second option. It doesn't really matter.

There are a number of ways to interpret this, since computational load vs. complexity can be seen in one area as a trade-off between ALU work on one side and memory accesses plus additional levels of flow control on the other.
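A textbook instance of that trade-off (my example, not either poster's): the first version below is the 'simple algorithm + massive computational load' option, pure ALU work over O(n²) comparisons; the second does far less arithmetic but pays for it with an extra data structure and more control flow.

```python
# Simple algorithm, massive computational load: O(n^2) comparisons.
def has_pair_sum_bruteforce(values, target):
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] + values[j] == target:
                return True
    return False

# More complex algorithm, lower computational load: O(n) with a hash set.
# Less ALU work overall, but extra memory traffic and control flow.
def has_pair_sum_hashed(values, target):
    seen = set()
    for v in values:
        if target - v in seen:
            return True
        seen.add(v)
    return False

print(has_pair_sum_bruteforce([3, 8, 5, 1], 9))  # True (8 + 1)
print(has_pair_sum_hashed([3, 8, 5, 1], 9))      # True
```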
But since science fundamentally revolves around increasing understanding, and then applying that to solving relevant problems, I'm a bit saddened to see bright scientific minds being diverted towards fitting simplistic approaches to the limitations of current hardware. Maybe it's just my field, maybe it carries over into the entire industry. I don't know.

Isn't making use of what is available, subject to design constraints and economics, engineering?
No denying it, Naughty Dog loves challenging themselves and finding (or creating) the paths to achieving their goals. Developer Gods they are...
Or perhaps the IRS; both engage in the same activity, but only one involves consenting adults...