PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Status
Not open for further replies.
One thing I noticed is silhouette recognition in Playroom. Maybe it's not very accurate, but it does the trick.

What silhouette recognition? You mean like Kung-Fu Live?

They have tons of things to do for PSEye. I am hoping they revamp PS Move's XMB control for one. It was terrible in PS3.

If they want to showcase PSEye, IMHO one sample level from KF Live, the Tank war drawing recognition, a Wonderbook "user manual", a 3D photo + video app, and a way to scan objects + environment would make sense.

Heck, I may even prefer a stereoscopic 3D PS4 XMB with 3D PIP video chat.

Right now, I've only seen snippets from Playroom. They seem superficial.
 
Having had my first taste of 3D gaming and being quite impressed (doesn't change the gameplay at all except for Tumble, but Assassin's Creed was very impressive), I'd like to see progress in 3D AR. It's something Sony can uniquely support. XB1 could do similar using avatars/Champions and a room scan, but only Sony is in a position to have a 3D view of your room with pets interacting in 3D. It's the next step up from EyePet, and a compelling addition to the likes of Skylanders.

However, I probably don't need to reiterate my expectation that Sony won't really bother. They'll leave it three years before making anything, then do a half-arsed effort like Sorcery, although they'll string people along until then with some amazing tech demos that'll never be made into anything substantial. I expect a concept video soon like the HD EyeToy video that preceded PS3.
 
Thanks. Quoting from the article:

"It's important to start with a 64-bit version because obviously the [PS4] hardware is 64-bit so it's nice to get those 32-bit/64-bit issues out of the way before you start worrying about the platform specifics. The initial aim of our work was to get the PS4 version to feature-parity with the Windows version."

Sony has made a big deal about the accessibility of the PS4 hardware, and a key element of that would be the quality of the toolchain - the series of programs used to create compiled code. For the PS4 developers, the use of the established Visual Studio environment proves to be a key benefit, and the extent to which Sony has acknowledged and supported cross-platform game-makers is self-evident. There are even options within Sony's compiler specifically added in order to increase compatibility with the Microsoft counterpart used in compiling DirectX 11 games.

This actually seems to refer more to compiler options than anything else. But then, probably more to the point:

"Most people start with the GNMX API which wraps around GNM and manages the more esoteric GPU details in a way that's a lot more familiar if you're used to platforms like D3D11. We started with the high-level one but eventually we moved to the low-level API because it suits our uses a little better," says O'Connor, explaining that while GNMX is a lot simpler to work with, it removes much of the custom access to the PS4 GPU, and also incurs a significant CPU hit.

And

Another key area of the game is its programmable pixel shaders. Reflections' experience suggests that the PlayStation Shader Language (PSSL) is very similar indeed to the HLSL standard in DirectX 11, with just subtle differences that were eliminated for the most part through pre-process macros and what O'Connor calls a "regex search and replace" for more complicated differences.

Reading the interview, it's not immediately clear that Garlic and Onion mean hUMA is being used. The suggestion that you need to define memory as either Garlic or Onion up front seems to contradict it. And going by ExtremeTech's picture, it doesn't seem to have anything directly to do with making sure that both CPU and GPU can access the same memory without having to copy anything, and there is no direct mention anywhere that you can 'remap' data so that it is accessible to the CPU and then to the GPU. But perhaps there's a system that allows you to simply remap memory?

Then reading up on what hUMA really does:

Unified memory with bidirectional coherency and uniform access — This is the truly transformational feature of hUMA if it works as planned. The GPU and the scalar CPU will now share the same uniform memory space, and both can now allocate and operate on common memory with no need for redundant buffering on the GPU, and no need for explicit transfers.
Shared coherency domain — hUMA implements a common coherency mechanism, which will open up the potential for multiple GPUs operating on the same shared data and combinations of GPU and CPU in new algorithms, since each will automatically have the latest version of memory served either from cache or through a common cache-miss process. This will certainly change the way GPU-based algorithms are implemented, since developers can now be more flexible in where they perform processing with no penalty for moving data blocks.
GPU can take page faults and allocate memory — The GPU can take page faults, allowing it to work out of non-locked memory.

This, in combination with the ExtremeTech diagram, seems to suggest to me that the CPU always accesses memory through the L1 and L2 caches, while the GPU can access it through either Onion or Garlic. But perhaps it's possible to just map the same physical address range through both buses and access it that way? And then the actual remap process is what makes sure everything stays coherent?
 
Having had my first taste of 3D gaming and being quite impressed (doesn't change the gameplay at all except for Tumble, but Assassin's Creed was very impressive), I'd like to see progress in 3D AR. It's something Sony can uniquely support. XB1 could do similar using avatars/Champions and a room scan, but only Sony is in a position to have a 3D view of your room with pets interacting in 3D. It's the next step up from EyePet, and a compelling addition to the likes of Skylanders.

However, I probably don't need to reiterate my expectation that Sony won't really bother. They'll leave it three years before making anything, then do a half-arsed effort like Sorcery, although they'll string people along until then with some amazing tech demos that'll never be made into anything substantial. I expect a concept video soon like the HD EyeToy video that preceded PS3.

A good first indication will probably be whether the Playroom demo that's now included with all PS4s supports 3D TVs. If it does, that's a good start right there. But I haven't seen any indication of that.
 
I imagine PlayRoom will support 3DTVs, but I also imagine that'll be the last venture by Sony in that space (like EoJ was the first and last Sony game for PSEye until they released Move).
 
What silhouette recognition? You mean like Kung-Fu Live?

They have tons of things to do for PSEye. I am hoping they revamp PS Move's XMB control for one. It was terrible in PS3.

If they want to showcase PSEye, IMHO one sample level from KF Live, the Tank war drawing recognition, a Wonderbook "user manual", a 3D photo + video app, and a way to scan objects + environment would make sense.

Heck, I may even prefer a stereoscopic 3D PS4 XMB with 3D PIP video chat.

Right now, I've only seen snippets from Playroom. They seem superficial.
Kind of. If you notice, that flying robot thing does a circle around the player, and you can't see it while it's behind the player. They must do some silhouette recognition in order to achieve that effect.

And yes, some cool stuff was already done with the PSEye on PS3 (some technical videos were very funny, like the face recognition and head-replacing stuff), but I'm hoping this time I can play DanceStar Party without a controller. As I said, I like the controller and I prefer it to controller-free experiences in some aspects, such as table tennis, archery and all that stuff, but in a dancing game it just doesn't feel right to me.
 
Given that the GPU can write to the CPU caches through Onion, why not?

From the description of APU memory behavior given so far, this does not happen. Writes don't push data to other non-shared caches. They can invalidate shared lines, but at some point the CPU will have to read the data back in. AMD's caches, like most, broadcast an invalidate to other caches when writing.

Onion's write traffic goes to main memory, snooping caches and invalidating as necessary.
 
A good first indication will probably be whether the Playroom demo that's now included with all PS4s supports 3D TVs. If it does, that's a good start right there. But I haven't seen any indication of that.

I imagine PlayRoom will support 3DTVs, but I also imagine that'll be the last venture by Sony in that space (like EoJ was the first and last Sony game for PSEye until they released Move).

I remember before I dropped off B3D last Summer, I commented that Sony should drop PSEye from the basic bundle and focus on core games. I still stand by that remark because (i) good [PSEye] things take time to make, and (ii) they should focus on getting Gaikai up asap.

They should only showcase PSEye when they have the goods. We have already seen a truckload of experiments on PS3. IMHO, they should let Dr. Marks partner with a senior executive to integrate PSEye into PS4. That way, they don't have to worry about title specific revenue upfront. That Tank war demo got assimilated into a tiny part of EyeToy for no good reason. D^8

As for 3DTV + PSEye, I hope they do it when HMZ-T3 is launched, or when the Oculus Rift partnership is firmed up. :devilish:

But what I really want is 3D object and environment scanning and editing. I may not even use a PSEye. Another Sony stereoscopic 3D camera mounted on HMZ-T3 is good too.
 
Does this hUMA stuff need specific coding to give performance? Or is it just secret sauce for later in the generation?

The programmer definitely needs to be aware of it and code according to its characteristics. But it should be getting used day-to-day for good performance, even for launch titles.
 
So how does this coherent access work on the PS4?

Let's say I have a shared buffer between the CPU and GPU, the CPU modifies a value in that buffer (so that value is in the CPU L1 now), I assume the simplest method for coherency would be to invalidate corresponding addresses in the GPU caches and force them to refetch from memory? Unless there's another port and bus that allows a fast copy from one cache to the other, otherwise I don't see the performance gain.

I think the biggest gain would be in terms of software development, it certainly simplifies that.

I think there should be performance gain since the programmer can potentially layer more work without disturbing existing ones.
 
Your service request:

Response and Service Request History:

It's not as much that he was misquoted as much as he wasn't quoted at all. You can see the report at http://www.heise.de/newsticker/meldu...t-1939716.html , which everyone has been quoting, but there is no quote from Marc Diana here, it is either in need of confirmation (which is unlikely, since we cannot confirm hardware functionality for other companies), or is just a misinterpretation of the information given to the writer. If there was a quote there we could analyze it further, but they don't have any sort of exact quote, so I am pretty sure that they just tried to draw some conclusions (IE Kaveri supports hUMA, and Kaveri is coming out at the same time as the PS4, so PS4 supports hUMA, which is erroneous as any chip used by Playstation will be custom-made to their specs, and isn't a Kaveri chip).

In order to update this service request, please respond, leaving the service request reference intact.

Best regards,

AMD Global Customer Care

http://www.neogaf.com/forum/showpost.php?p=77493461&postcount=1
 
It reads like they're having to downplay what was stated because of their relationship with both Sony and MS. It doesn't look good to publicly talk up one party and downplay the other; that's not good business. But it doesn't mean what was stated isn't true.
 
We can't tell one way or another. If the comment was legitimate, it was almost certainly breaking protocol, so the chance of getting a quote will be low. It doesn't really matter in this thread though. We know PS4 has hUMA. It's XB1 where it's now an uncertainty.
 
AMD actually mentioned Kaveri in a reply? I'm shocked; they normally seem petrified to speak its name outside of the usual "it's coming".

But it does seem to indicate that we cannot rely on the quote.
 
AMD Global Customer Care gave some hilarious answers about the Wii U GPU too. They have gone rogue.

They also allegedly gave the aforementioned reply about the PS4 hUMA article in under 15 minutes. And we're meant to believe some low-level customer service guy had all this information on hand?

I'm dubious about that support ticket being real.
 
c't, at least, is generally a really great German IT magazine, similar to Byte (? not sure if I remember correctly). Used to read it a bunch in the past. Will buy the issue when it comes out.
 
They also allegedly gave the aforementioned reply about the PS4 hUMA article in under 15 minutes. And we're meant to believe some low-level customer service guy had all this information on hand?

I'm dubious about that support ticket being real.
Now I don't know what to believe. :???:
 