What I have been told is that effectively using GPU compute is hard to do and, like with the PS3's SPUs, teams will struggle with it at first.
Cerny specifically said the PS4 was designed to be easy to learn, hard to master, and that compute was more for 2-3 years down the road, for those teams looking to really dig deep.
So by that timetable we shouldn't be surprised.
I didn't know compute shaders were supposed to be on the SPU level of difficulty; you hear a lot about them on PC and I never got that vibe.
That is what Tim Sweeney once stated: "if the cost (time, money, pain) to develop an efficient single-threaded algorithm for a central processing unit is X, then it will cost two times more to develop a multithreaded version, three times more to develop a Cell/PlayStation 3 version and ten times more to create a current GPGPU version. Meanwhile, considering game budgets, over two times higher expenses are uneconomical for the majority of software companies!" Though it is pretty old, and possibly from before "C-like" languages made it into the GPU bag of tools.
What I have been told is that effectively using GPU compute is hard to do and, like with the PS3's SPUs, teams will struggle with it at first.
Here is what I found: you can easily find IT courses for web development, or for your usual database, C or Java, etc., with overall low academic requirements, but to even get started in the field of GPU programming you pretty much have to be well advanced in a tough (strong math background) engineering/university curriculum.
From there I guess it ain't easy at all / one of the toughest fields in IT.
It depends on the developer and what tools are provided. If Sony provides a general compute library like they did for the SPUs, then the adoption rate of compute will massively increase, although it should be A LOT easier to use than the SPUs; it's a (relative to the SPUs) mature platform and much nicer to work with.
The problems associated with compute don't seem to be trivially solvable with a magic library. For the SPUs Sony just provided a scheduler, but compute already has a scheduler (the GPU).
I haven't done much compute yet, but I wrote a simple copy compute shader (almost the simplest compute shader possible) and noticed while researching that there are a large number of ways to write it, all with wildly varying performance. Scheduling will be a problem as there are many numbers to tweak. Debugging will be a problem. Data layout may be a problem. Ensuring memory is synchronized correctly will be a problem... potentially quite a large one.
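There's no code in the post, but a minimal sketch of the idea (CUDA here, standing in for whatever shading language the poster actually used; the kernel names and tuning numbers are illustrative) shows how even a trivial copy kernel can be written several ways, with block size and vector width being exactly the sort of numbers that need tweaking:

```
// Hypothetical sketch: two ways to write the "simplest compute shader
// possible". Which is faster depends on alignment, block size and the
// hardware -- the wildly varying performance described above.
#include <cuda_runtime.h>

// Variant 1: one float per thread.
__global__ void copy_scalar(const float* in, float* out, size_t n) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i];
}

// Variant 2: four floats per thread via float4. Wider loads often improve
// bandwidth, but only if the buffers are 16-byte aligned and n is a
// multiple of 4 -- an example of how data layout becomes a performance
// question in its own right.
__global__ void copy_vec4(const float4* in, float4* out, size_t n4) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n4) out[i] = in[i];
}

int main() {
    const size_t n = 1 << 24;           // 16M floats
    float *in = nullptr, *out = nullptr;
    cudaMalloc(&in,  n * sizeof(float));
    cudaMalloc(&out, n * sizeof(float));

    const int block = 256;              // one of the many numbers to tweak
    int grid = (int)((n + block - 1) / block);
    copy_scalar<<<grid, block>>>(in, out, n);

    const size_t n4 = n / 4;
    grid = (int)((n4 + block - 1) / block);
    copy_vec4<<<grid, block>>>(reinterpret_cast<const float4*>(in),
                               reinterpret_cast<float4*>(out), n4);

    // Getting synchronisation right is the "potentially quite large"
    // problem: nothing is guaranteed visible until the GPU is done.
    cudaDeviceSynchronize();
    cudaFree(in);
    cudaFree(out);
    return 0;
}
```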
If the trade-offs are good, though, compute will see a lot of use. I'm always annoyed by this idea that developers are idiot children and can't wrap their heads around something unless Sony makes Baby Einstein videos about it or something. Every place shipping complicated titles has smart people behind their tech. They'll continue to make informed decisions about what tech to use and not use and people will continue to make uninformed second-guesses online.
The only dedicated hardware the PS4 lacks compared to the Xbox One is there to solve problems that don't exist on the PS4 (Kinect/ESRAM/Move Engines).
Richard's declaration that the PS4 is unbalanced is complete conjecture and based on assuming everything about the Xbox One is better than it seems, and everything about the PS4 is somehow worse than it seems.
Agree.
Exactly how can this be?
Considering that Cerny has been very vocal about how well balanced the PS4 hardware is, and MS has been very obscure about anything GPU- and CPU-related on the Xbox One, they talk more about the cloud than about the GPU and CPU.
This article is made to fit the Xbox One. The 7850 is 1.76 TF yet it performs better than the 1.79 TF 7790; even the OC one doesn't top the 7850 in the benchmarks I have seen. So how could a 12-CU Bonaire GPU with a 200 MHz lower clock be represented by a 16-CU GPU with 32 ROPs and more than 500 GFLOPS of difference (rough numbers sketched below)?
I would have respected it more if they had used a 7790 downclocked to 700 MHz or 650 MHz to compensate for the 2 extra CUs, rather than what they did, but I guess that would have yielded a much different result. Their representation of the PS4 GPU is just as bad.
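For reference, the GFLOPS gap the post complains about falls out of the standard GCN formula (CUs x 64 lanes x 2 FLOPs per clock x clock speed). The clocks below are the commonly cited ones; the 800 MHz console figure is the pre-launch number the post's "200 MHz lower" comparison implies:

```
// Back-of-envelope GCN throughput; host-only code, no GPU required.
#include <cstdio>

// Peak GFLOPS for a GCN part: CUs * 64 lanes * 2 FLOPs (FMA) * GHz.
static double gcn_gflops(int cus, double ghz) {
    return cus * 64 * 2 * ghz;
}

int main() {
    double hd7850 = gcn_gflops(16, 0.86);  // ~1761 GFLOPS (the 1.76 TF figure)
    double hd7790 = gcn_gflops(14, 1.00);  // ~1792 GFLOPS (the 1.79 TF figure)
    double cu12   = gcn_gflops(12, 0.80);  // ~1229 GFLOPS for a 12-CU part at 800 MHz
    printf("7850: %.0f  7790: %.0f  12 CU: %.0f\n", hd7850, hd7790, cu12);
    printf("7850 vs 12-CU part: %.0f GFLOPS apart\n", hd7850 - cu12);
    // ~532 GFLOPS -- the "more than 500 GFLOPS" difference in the post.
    return 0;
}
```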
With respect, how on earth do you know what developers in hundreds of teams across the globe are doing?
Well, besides the middleware solutions, devs aren't using GPGPU for much in their engines - or at least not yet.
Very true. On current gen, Sony's first-party developers and their Advanced Technology Group guys did a great job of helping third parties leverage the SPUs. You could argue that was more out of necessity than benevolence, but researching and sharing is now their culture, and I hope this will continue with compute next gen. This will also help Xbox One devs.
Having compute resources available on both systems as well as PCs is a fairly major difference between the industry's interest in SPUs then and CUs today. So progress, if it can be made, will occur more rapidly, since all cross-platform games could benefit.
Also, Cerny said the PS4 hardware is intentionally "not 100 per cent round". I'm not aware of him talking about how balanced it is (though obviously, he's not going to speak ill of his own hardware if asked anyway).
I don't understand the ambiguity; it's clear as day to me, but allow me to translate: if you look at the PS4 from the traditional balance of CPU and GPU resources required for games, there is more ALU than you would expect. This is intentional. We think, in a year or two, the use of compute to achieve tasks will be greater, and we don't want devs to have to scale back on graphics to free up ALU resources to make it happen.
Mark Cerny said: "The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU."
Mark Cerny said: "The vision is using the GPU for graphics and compute simultaneously. Our belief is that by the middle of the PlayStation 4 console lifetime, asynchronous compute is a very large and important part of games technology."
Bottom line: it looks unbalanced now, but it's been designed for developers to grow into. If Sony's prediction pans out, Xbox One devs are either going to have to scale back on compute for whatever task (fewer particles or physics interactions, for example) or scale back on graphics some to accommodate the compute workload. A rough sketch of what "graphics and compute simultaneously" means is below.
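There's no console code in the thread for this either, but a loose analogy in CUDA (streams standing in for the PS4's asynchronous compute queues; both kernel names are made up) shows the idea of independent graphics-like and compute-like work overlapping on one GPU:

```
// Rough analogy only: async compute interleaves graphics and compute
// waves on the same GPU; CUDA streams similarly let independent kernels
// overlap. This is not any console API.
#include <cuda_runtime.h>

__global__ void fake_graphics(float* fb, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) fb[i] = fb[i] * 0.5f + 0.5f;    // stand-in for shading work
}

__global__ void fake_physics(float* p, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) p[i] += 0.016f * 9.81f;         // stand-in for a sim step
}

int main() {
    const int n = 1 << 20;
    float *fb = nullptr, *p = nullptr;
    cudaMalloc(&fb, n * sizeof(float));
    cudaMalloc(&p,  n * sizeof(float));

    cudaStream_t gfx, comp;
    cudaStreamCreate(&gfx);
    cudaStreamCreate(&comp);

    // The two kernels have no dependency on each other, so the GPU is free
    // to run them concurrently, filling ALU the graphics work leaves idle
    // -- roughly the "spare ALU" Cerny describes.
    fake_graphics<<<n / 256, 256, 0, gfx>>>(fb, n);
    fake_physics <<<n / 256, 256, 0, comp>>>(p, n);

    cudaDeviceSynchronize();
    cudaStreamDestroy(gfx);
    cudaStreamDestroy(comp);
    cudaFree(fb);
    cudaFree(p);
    return 0;
}
```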
With respect, how on earth do you know what developers in hundreds of teams across the globe are doing?
I don't, but I've heard that devs are finding compute difficult; even among Sony's first-party devs there's only one studio investing heavily in GPU compute.
How do you know this?
The same way I could correct Richard on the details of the PS4 memory reservation...
That's not dissimilar to how AMD, back with the Radeon 1900 -> Radeon 1950, was forward-looking in predicting that shaders would come to dominate graphics workloads more and more. That didn't generally happen, however, until years after the lifetime of the product.
Just because something forward-looking is put in doesn't necessarily mean it'll get the amount of use the hardware designer predicted it would.
The same way I could correct Richard on the details of the PS4 memory reservation...
Right, and I believe you're talking to developers, of whom there will be thousands, probably tens of thousands, who know the details of the SDK and the PS4's memory allocation. But what I don't believe is that the experiences of PS4 developers - all PS4 developers - are known to other PS4 developers.