Could PlayStation 4 breathe new life into Software Based Rendering?

If I take a cue from what happened on PS3, one possibility could be that the CUs would actually drive the engine. On PS3, the PPU initially drove the SPU jobs by distributing and feeding work to the SPUs. But the PPU itself was actually a bottleneck in this, and making the SPUs the main job distributor turned out to be more efficient, leaving the PPU free for other work.

I don't know if this is possible with CUs, but the AMD presentation suggests that it just might be. And in that case, that would indeed free up additional CPU resources, which are currently claimed to be a more likely bottleneck than GPU resources are.
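As a toy illustration of that self-scheduling model, here's a plain C++ sketch with ordinary threads standing in for SPUs or CUs. All the names and counts are made up; the point is only the shape of the idea, i.e. workers pulling their own work instead of a master thread pushing it:

```cpp
// A toy sketch (assumed names throughout) of the self-scheduling model:
// workers pull jobs from a shared counter instead of a master pushing work.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    constexpr int kNumJobs    = 64;
    constexpr int kNumWorkers = 6;   // e.g. the six game-usable SPUs on PS3

    std::atomic<int> next_job{0};    // shared job counter; no master thread
    std::vector<std::thread> workers;

    for (int w = 0; w < kNumWorkers; ++w) {
        workers.emplace_back([&next_job, w] {
            // Each worker fetches its own next job, so no single
            // PPU-style distributor can become the bottleneck.
            for (int job = next_job.fetch_add(1); job < kNumJobs;
                 job = next_job.fetch_add(1)) {
                std::printf("worker %d ran job %d\n", w, job);
            }
        });
    }
    for (auto& t : workers) t.join();
}
```

No single thread plays the PPU's old role of handing out work, so adding jobs or workers never serializes on one distributor.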
 
Let's post this from Cerny again

"There are a broad variety of techniques we've come up with to reduce the vertex bottlenecks, in some cases they are enhancements to the hardware," said Cerny. "The most interesting of those is that you can use compute as a frontend for your graphics."

This technique, he said, is "a mix of hardware, firmware inside of the GPU, and compiler technology. What happens is you take your vertex shader, and you compile it twice, once as a compute shader, once as a vertex shader. The compute shader does a triangle sieve -- it just does the position computations from the original vertex shader and sees if the triangle is backfaced, or the like. And it's generating, on the fly, a reduced set of triangles for the vertex shader to use. This compute shader and the vertex shader are very, very tightly linked inside of the hardware."

So backface culling for one.
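For anyone who wants that concrete, here's a rough CPU-side sketch of the same idea: run only the position math, test the facing, and emit a reduced index list for the real vertex pass. Everything here is illustrative; the actual PS4 path does this in a compute shader tightly coupled to the hardware, as Cerny describes:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Vec2 { float x, y; };

// Signed area of the projected triangle; <= 0 means back-facing here
// (assuming counter-clockwise front faces, the usual GL-style convention).
float SignedArea(Vec2 a, Vec2 b, Vec2 c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

// Takes already-projected screen positions and the original index list,
// returns a reduced index list containing only front-facing triangles.
std::vector<std::uint32_t> SieveTriangles(
        const std::vector<Vec2>& screenPos,
        const std::vector<std::uint32_t>& indices) {
    std::vector<std::uint32_t> kept;
    kept.reserve(indices.size());
    for (std::size_t i = 0; i + 2 < indices.size(); i += 3) {
        Vec2 a = screenPos[indices[i]];
        Vec2 b = screenPos[indices[i + 1]];
        Vec2 c = screenPos[indices[i + 2]];
        if (SignedArea(a, b, c) > 0.0f)   // keep front-facing triangles only
            kept.insert(kept.end(),
                        {indices[i], indices[i + 1], indices[i + 2]});
    }
    return kept;   // the reduced set the vertex shader actually processes
}
```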
 
Where did I say anything about them not rendering polygons anymore?
This IS the software based rendering thread, yes? One which you yourself started by the way. So tell me... Why would you render polygons...in software...when you have hardware rasterizers that could draw the same poly much faster?

Software based rendering, when used today in a realtime 3D context, typically entails non-polygonal rendering (voxels, splines, bezier patches, solid geometry modeling, what have you), since we already have very competent hardware accelerators if polys are what you want to use. Offline software renderers as used by the movie visual effects studios and such also use polygons a lot, of course, but those aren't exactly rendered in realtime, so that's a different discussion.

The GPU is designed so they can use a good mixture of fixed function graphics with compute. I'm just saying that they will start using compute a lot more.
Yes, but like I said, just because you're using GPU compute doesn't mean you're doing software rendering.

You even see the slides when AMD is talking about using less fixed function graphics & writing compute graphic pipelines yet you're still trying to make it seem as if I'm talking crazy.
No, I'm more inclined to think you're seeing buzzwords on a slide and jumping to conclusions that aren't there, or at least not in the form you think they are. (Or using commonly used terminology in novel ways, causing confusion.) GPU compute is just a fancy new name for stuff that has existed in more primitive versions since DirectX 9 came out ages ago now. SSAO, tone mapping, FXAA, bokeh DoF and things like that are all implemented as GPU compute, today, on graphics rendered using rasterized polygons. There are no indications any of that is going away, at all.
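Tone mapping is a good example of what that means in practice: a kernel run over every pixel of an already-rasterized image. A minimal CPU sketch of the Reinhard operator, x / (1 + x), where the loop body is what a single compute-shader thread would do (types and names are mine, purely illustrative):

```cpp
#include <vector>

struct Pixel { float r, g, b; };

// Reinhard tone mapping applied per channel over a rasterized HDR image.
// On a GPU this loop body runs as one compute-shader thread per pixel.
void ToneMapReinhard(std::vector<Pixel>& hdrImage) {
    for (Pixel& p : hdrImage) {
        p.r = p.r / (1.0f + p.r);
        p.g = p.g / (1.0f + p.g);
        p.b = p.b / (1.0f + p.b);
    }
}
```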

Not sure what you're going on about, TBH. In order for the industry to shift to a dramatically different method of rendering there has to be incentive to do so; drawbacks to be overcome and benefits to be gained. You don't just do it for the hell of it, or because of fking powerpoint marketing slides. Where are the drawbacks and the benefits?
 
Where did I say that modern GPU do not have programmable shaders?
You didn't. But you were talking about moving from fixed function pipelines, yet the current GPU is very programmable, so not 'fixed function'.

I asked if he was joking because he responded with "how exactly is it fixed function pipeline when it's programmable?" when I was explaining that more graphical tasks could be moved over to compute.
You didn't describe any graphical workloads that could be moved to compute, so Taisui asked for clarification over what you meant, and your clarification talked of fixed-function hardware...


So would you kindly define what "software rendering" means, in your opinion?

A rendering pipeline that's programmable.
By the way I said Software Based Rendering.

huh? The shaders are already...programmable

Yes, but most of the graphics work is done with the fixed function pipeline & you're adding to it with compute shaders.

With the PS4 they could end up running graphic pipelines that are mostly done in compute.

how exactly is it fixed function pipeline when it's programmable?

Please explain what you mean by 'programmable pipeline'. What are the workloads that can be moved to compute, and how are those workloads not currently programmable, such that "compute == software based rendering" but "shaders != software based rendering"?

There's certainly a discussion there, but you haven't expressed it such that anyone can get involved because everyone's still crashing heads over the definitions.
 
What makes something "software" anymore? It all runs on programmable hardware.


In OpenGL 4, for example, I write the T&L code (or use someone else's)...
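...and that T&L code is nothing exotic. Stripped of the API, it's essentially a matrix multiply per vertex. A plain C++ sketch of it (a GLSL vertex shader expresses the same thing; names here are mine):

```cpp
#include <array>

using Mat4 = std::array<std::array<float, 4>, 4>;
using Vec4 = std::array<float, 4>;

// The core of "transform" in T&L: multiply the vertex position by the
// combined model-view-projection matrix to get a clip-space position,
// which is exactly what a vertex shader writes to gl_Position.
Vec4 TransformVertex(const Mat4& mvp, const Vec4& position) {
    Vec4 out{0.0f, 0.0f, 0.0f, 0.0f};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            out[row] += mvp[row][col] * position[col];
    return out;
}
```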
 
FWIW my definition is software rendering = pure general purpose CPU code (inc. modern SIMD FPU code) with no use of specialist co-processors such as GPUs. Regardless of the programmable nature of a modern GPU it is still fundamentally geared towards large scale matrix ops and if your code is 'branchy' and serial it will be a very poor match even for GPGPU (to my understanding of it). I look forward to being corrected! :D
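Under that definition, a minimal "software renderer" would be something like the naive edge-function triangle fill below: pure CPU code writing into a plain framebuffer, no GPU anywhere. Names and layout are illustrative only:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

struct Pt { float x, y; };

// Edge function: positive when p lies on the inner side of edge a->b
// (for a consistently wound triangle).
float Edge(Pt a, Pt b, Pt p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// Naive bounding-box rasterization of one triangle into a 32-bit
// framebuffer, testing every candidate pixel against all three edges.
void FillTriangle(std::vector<std::uint32_t>& fb, int width, int height,
                  Pt v0, Pt v1, Pt v2, std::uint32_t color) {
    int minX = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
    int maxX = std::min(width - 1, (int)std::max({v0.x, v1.x, v2.x}));
    int minY = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
    int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));
    for (int y = minY; y <= maxY; ++y)
        for (int x = minX; x <= maxX; ++x) {
            Pt p{x + 0.5f, y + 0.5f};   // sample at the pixel center
            if (Edge(v0, v1, p) >= 0 && Edge(v1, v2, p) >= 0 &&
                Edge(v2, v0, p) >= 0)
                fb[y * width + x] = color;   // inside all three edges
        }
}
```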
 
FWIW my definition is software rendering = pure general purpose CPU code (inc. modern SIMD FPU code) with no use of specialist co-processors such as GPUs. Regardless of the programmable nature of a modern GPU it is still fundamentally geared towards large scale matrix ops and if your code is 'branchy' and serial it will be a very poor match even for GPGPU (to my understanding of it). I look forward to being corrected! :D

The PS4 GPGPU was designed for efficient branching:

GPU+RAM+CPU = Beyond Fast!
- Plenty of power for a true Next Gen Game Experience
- 8 CPU cores
- High polygon throughput
- High pixel performance
- Efficient branching in GPU Shaders


http://twvideo01.ubm-us.net/o1/vault/gdceurope2013/Presentations/825424RichardStenson.pdf
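For anyone wondering why "efficient branching" rates a bullet point at all: GCN runs shaders in 64-lane wavefronts that share one program counter, so when lanes within a wavefront disagree on a branch, the hardware has to execute both sides with masking. A toy CPU model of that cost (the 64-lane width is the real GCN figure; everything else is illustrative):

```cpp
#include <cstdio>

constexpr int kWavefront = 64;   // GCN wavefront width

// Returns how many "passes" a branch costs one wavefront:
// 1 if every lane takes the same side, 2 if the lanes diverge
// and both sides must be executed under a mask.
int BranchCost(const bool (&takesThen)[kWavefront]) {
    bool anyThen = false, anyElse = false;
    for (bool t : takesThen) (t ? anyThen : anyElse) = true;
    return (anyThen && anyElse) ? 2 : 1;
}

int main() {
    bool uniform[kWavefront] = {};    // all lanes take the else side
    bool divergent[kWavefront] = {};
    divergent[0] = true;              // a single disagreeing lane...
    std::printf("uniform branch:   %d pass(es)\n", BranchCost(uniform));
    std::printf("divergent branch: %d pass(es)\n", BranchCost(divergent));
}
```

So "efficient branching" is less about making divergence free and more about keeping the penalty small when it happens.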
 
When they say "efficient branching", are they talking about something that was already part of the GCN architecture, or a Sony customization?
 
When they say "efficient branching", are they talking about something that was already part of the GCN architecture, or a Sony customization?

Not sure, but it was also mentioned in the VGleaks documents. I'm guessing it might have something to do with the eight ACEs, which can queue up to 64 compute jobs.

Read more at: http://www.vgleaks.com/orbis-gpu-compute-queues-and-pipelines/
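As a rough mental model of what those ACEs buy you: several independent compute front-ends, each with its own queues, feeding the CUs without going through the graphics command processor. The sketch below uses the 8-ACE figure from the leak; the round-robin dispatch policy is made up purely for illustration:

```cpp
#include <cstdio>
#include <deque>
#include <functional>
#include <vector>

int main() {
    constexpr int kNumAces = 8;   // per the leak: 8 ACEs, 64 queues total
    std::vector<std::deque<std::function<void()>>> aces(kNumAces);

    // Spread some compute jobs across the ACE queues.
    for (int job = 0; job < 16; ++job)
        aces[job % kNumAces].push_back(
            [job] { std::printf("dispatched compute job %d\n", job); });

    // Round-robin dispatch: each ACE drains its own queue independently,
    // so in this model compute work never waits behind draw calls.
    bool pending = true;
    while (pending) {
        pending = false;
        for (auto& q : aces) {
            if (!q.empty()) { q.front()(); q.pop_front(); pending = true; }
        }
    }
}
```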



http://www.vgleaks.com/world-exclusive-ps4-in-deep-first-specs/

GPU

AMD R10x series GPU @ 800 MHz (Tahiti)
aggregate 10x RSX performance, 1.84 TFlops
DX "11.5"
cheap branching
18 compute units, focus on fine grained compute & tessellation
 
That guy's "insider" info is so lame. SDKs improve (like every other generation as time passes), Uncharted will have amazing graphics (no shit), Microsoft will show an exclusive (when have they not?), and both studios will pad their presentations with 3rd party content (as usual). I mean, what a genius.
 
That guy's "insider" info is so lame. SDKs improve (like every other generation as time passes), Uncharted will have amazing graphics (no shit), Microsoft will show an exclusive (when have they not?), and both studios will pad their presentations with 3rd party content (as usual). I mean, what a genius.

Well, he told me the Call of Duty and Battlefield resolutions on PS4 and Xbox One months before anyone else knew about them, and told me not to say anything. I kept it to myself, and it turned out to be true, so I'll take his word for it whenever I see his inside info.
 
Well, he told me the Call of Duty and Battlefield resolutions on PS4 and Xbox One months before anyone else knew about them, and told me not to say anything. I kept it to myself, and it turned out to be true, so I'll take his word for it whenever I see his inside info.

He was wrong about several things, though; one example is The Witcher 3's resolution.
 