I'm guessing CPU PhysX.
Because the game is scary enough to make you crap yourself?
About remote play, do we think it may be possible to support on mobile devices with low latency?
This got me thinking: http://www.youtube.com/watch?v=-n1nr9C6JMk
If they could get PS4 remote play streaming to smartphones, they could just release a cheap universal mount accessory like this and have VR available for everyone. Streaming would allow for far better graphics than the mobile alone could manage, and with 1080p phones like the Xperia Z out this year, the resolution would be high enough as well. The only potential issue I see is latency, but with Vita remote play being integral to the PS4, surely they will have reduced it to a minimum already.
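On the latency question, it helps to put rough numbers on it. Every figure below is a guess for illustration, not a measurement:

```python
# Rough motion-to-photon budget for streaming a PS4 game to a phone
# over the local network. All figures are assumptions for illustration.
BUDGET_MS = {
    "controller_input": 5,   # pad -> console, e.g. over Bluetooth
    "game_frame": 33,        # one frame of a 30 fps game
    "encode": 10,            # hardware video encode on the console
    "wifi_hop": 5,           # one hop on a local network
    "decode": 15,            # hardware decode on the phone
    "display": 16,           # one refresh of a 60 Hz panel
}

def total_latency(budget):
    """Sum every stage of the pipeline, in milliseconds."""
    return sum(budget.values())

if __name__ == "__main__":
    print(f"estimated motion-to-photon latency: {total_latency(BUDGET_MS)} ms")
```

Even with optimistic numbers like these, the total lands well over the roughly 20 ms motion-to-photon figure usually quoted for comfortable head-tracked VR, so a budget like this seems fine for a handheld screen but a stretch for strapping the phone to your face.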
Nvidia isn't going to make PhysX run on a Radeon unless PhysX complies with DX11 or something and it's a simple change. As far as I know, the API is distinct and uses its own paths. It runs horribly on a CPU, but at least Nvidia doesn't have to do any work to get it running.
In the presentation, spectating and directing are lumped under Sharing.
RemotePlay is essentially purely peer-to-peer Gaikai to Vita (at this point). LAN RemotePlay is doable. It's hard to take peer-to-peer WAN Gaikai seriously though.
Originally it was designed for a CPU plus a custom accelerator, the Physics Processing Unit. It was reportedly a good fit for GPU, though (although I do wonder if that's a subset of the original physics?), and nVidia bought PhysX and withheld it for business reasons. It could be ported, but nVidia won't allow it, as it gives their GPUs an advantage. Although not if other devs don't use it. All we need is a third-party physics lib like Havok to get a GPU port and PhysX will be undone.
NVidia aren't powering a next gen console but their physics software, PhysX & APEX, will be present in PlayStation 4 games. The company announced its support for the PS4 with the introduction of new PhysX and APEX SDKs for developers.
http://www.playstationlifestyle.net/2013/03/07/nvidia-announces-physx-and-apex-support-for-ps4/
Is Nvidia angling to win more PC marketshare by hoping console game devs ingrain physx deeply into their games, whilst still holding physx exclusive to their own cards in PC market space?
I wouldn't put it past them to do something like that. Proprietary was only a bad thing when it was 3dfx trying to corner the gaming market...
No reason GPGPU couldn't be used; wasn't PhysX originally designed for running on GPUs from the get-go? I know it was proprietary to Nvidia GPUs, but there's no reason the calculations wouldn't work just as well on AMD hardware if allowed. I doubt they would prevent it, since it would give them an advantage in the PC arena if it gained widespread use and stayed proprietary there.
Anyway, there is Havok, which is properly optimized for CPU... and IIRC some parts of Bullet Physics got ported to GPGPU (don't remember the language they used, though).
Oh how it must gall them to support a system with AMD behind the GPGPU which means...
They can hope all they want but...
Nvidia, who claim that PhysX is reliant on CUDA (well, obviously, since they themselves limit it to that for competitive reasons), aren't about to port it to GCN unless they are forced to.
And I believe Havok has already stated, as well as demoed in the PS4 reveal, that they will be taking advantage of GPGPU on the PS4, something they couldn't quite do on PC due to the PCIe interface with the graphics card.
Not only that but they deliberately castrate the CPU version on PC compared to consoles. A few years back someone showed that the CPU version was limited to one thread and didn't use any modern SIMD extensions (SSE3 for example). I believe at most it used old MMX instructions at the time. It wasn't until this came out that they grudgingly allowed for more CPU cores to be used, but then made it quite difficult to use them.
I've since stopped paying attention to PhysX, so maybe things have changed for the better in the past year, but I'm not holding my breath.
They'll likely optimize it quite well for the CPU on consoles but not allow those same optimizations to be used on the PC in order to keep it artificially slow on PC. But perhaps I'm being too cynical and Nvidia have changed their business philosophy with regards to PhysX.
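For what it's worth, spreading a physics step across cores is not exotic. A minimal sketch of the idea in plain Python (this is just the work-partitioning gist, nothing to do with PhysX's actual internals):

```python
from concurrent.futures import ThreadPoolExecutor

def partition(n_items, n_workers):
    """Split range(n_items) into n_workers contiguous, non-overlapping chunks."""
    base, extra = divmod(n_items, n_workers)
    chunks, start = [], 0
    for i in range(n_workers):
        size = base + (1 if i < extra else 0)
        chunks.append(range(start, start + size))
        start += size
    return chunks

def integrate(positions, velocities, dt, indices):
    # Euler step for one chunk of bodies; chunks never overlap,
    # so no locking is needed on the shared lists.
    for i in indices:
        positions[i] += velocities[i] * dt

def step(positions, velocities, dt, n_threads=4):
    """One physics step, with the body list divided across worker threads."""
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        for chunk in partition(len(positions), n_threads):
            pool.submit(integrate, positions, velocities, dt, chunk)
    # leaving the 'with' block waits for every chunk to finish
```

The point is just that the scheduling side of "use more than one core" is straightforward; if the CPU path stayed single-threaded, that was a choice, not a technical barrier.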
Which means smaller developers who maybe can't afford to license Havok may use it, but I can't see the larger studios choosing to use PhysX over Havok at that point. I'd imagine the Bullet physics library will also take advantage of GPGPU on consoles. It is open source so could be a more attractive solution over PhysX for those that can't afford to license Havok.
Regards,
SB
Why does anyone use it then? There are plenty of other options that'll run better on any PC that's not using an nVidia GPU. I'd have thought PhysX would have naturally died from being too limited. Portability is much more important, and Havok seems to be a strong option.
I imagine support is probably good. Nvidia have a good developer relations team.
PlayStation 4’s Increased Memory is a ‘Joy,’ says ‘Dishonored’ level designer
So much of the conversation about any new hardware launch is about graphical upgrades: polygon count, volumetric lighting, depth of field, shaders, and so on and so forth. (....)
But there is more to game development than character modelling and 3D rendering: the level designers at Arkane Studios, the minds that brought you “Dishonored,” are really just excited about having a little bit of extra RAM in Sony’s upcoming PlayStation 4.
“We need memory, you know?” Christophe Carrier, a level designer at Arkane, told Eurogamer during this week’s British Academy of Film and Television Arts Game Awards, where “Dishonored” won Best Game.
"As a level designer we are struggling against memory every day. We cut things, we remove things, we strip things, we split the levels, we remove NPCs from levels because there's not enough memory,” Carrier elaborates. “So knowing that memory is something that is going to be improved in the next generation of consoles: to us, it's a joy.”
The PlayStation 4 will come with 8 GB of unified GDDR5 memory when it is released this fall. The current PlayStation 3, in contrast, sports a comparatively puny 256 MB of XDR main RAM and another 256 MB of GDDR3 video RAM.
"We were PC gamers at the beginning. We love PC games, and we had to make games on consoles. But the main problem was memory,” Carrier explains. “The processors are good, but the memory, for our games, is the most important. So it's great."
They didn't make fine-grained distinctions based on functionality, which seems to be causing some confusion. I presented three categories based on how they work, not on the marketing message. For example, I think for clarity's sake we should only use the word "Gaikai" to refer to cloud-rendered games. While Dave Perry says the Gaikai technologies and expertise informed the video streaming capabilities of the PS4, Sony is still using Gaikai to refer specifically to their cloud-based services.
For PS4's "Let me take control" feature, it would be poor form if they simply rely on a straight peer-to-peer WAN RemotePlay implementation. Besides network latency and performance issues, they also need to worry about firewall configs.
I see where you are coming from.
At this point, I am just assuming RemotePlay is that Vita-PS4 wireless play feature.
At minimum, they should send the game saves to the helper and let that dude stream his game session back to the guy asking for help (for spectating). This assumes that the helper has a copy of the game.
I don't mind a paid server setup either since the performance should be better. And they can potentially stack more value added services on top.
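The flow described above (ship the save, the helper plays, the helper streams back for spectating) can be sketched as a toy state machine. Every name here is hypothetical; this has nothing to do with Sony's actual implementation:

```python
class HelpSession:
    """Toy model of a 'let me take control' handoff: send the save to a
    helper who owns the game, then have the helper stream the session
    back to the asker for spectating."""

    def __init__(self, asker, helper, helper_owns_game):
        self.asker = asker
        self.helper = helper
        self.helper_owns_game = helper_owns_game
        self.state = "requested"

    def send_save(self, save_blob):
        # Precondition from the post above: the helper must own the game.
        if not self.helper_owns_game:
            self.state = "failed"
            raise RuntimeError("helper does not own a copy of the game")
        self.save = save_blob
        self.state = "helper_playing"

    def stream_back(self):
        # Helper streams their session to the asker for spectating.
        if self.state != "helper_playing":
            raise RuntimeError("no active helper session to spectate")
        self.state = "spectating"
        return f"{self.helper} -> {self.asker}: live video"
```

Note how the "helper owns the game" assumption shows up as a hard precondition; a server-hosted setup could drop it, which is one argument for the paid-server route.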