with some specialised fixed function hardware for effects not possible on xenos
It's comments like this I don't get; they just seem like wishful thinking.
Exactly what fixed-function hardware would you add to a modern GPU to make it better?
You don't need fixed-function hardware for that.
I haven't got a clue to be honest, but I was making reference to bg assassin and another member who mentioned something like that.
Yes, I am aware of that. My point was that if it is hardwired into the GPU, there will be significant performance gains. So, if that is the case, it may be easier to implement these effects and still have enough juice left over for ones that are not hardwired.
It is you who needs to do the reading, mate. I never said I was guessing; I said quite clearly that I was referencing someone else (i.e. bg assassin and someone else with sketchy English who claims to either have one or know someone who does, quite a few pages back).
So instead of just coming on to post a negative rant about something you have obviously not even bothered to read, why not contribute some special insights of your own? No wait, you can't, can you, because you would have to... er, guess?
If one member comes on this thread who claims to have inside knowledge and says the GPU will be on par in a lot of areas but have some special functions that would take it above Xenos/RSX, then unless everyone has their own dev kits lying around I'm inclined to take that on board.
Why shouldn't I reference that?
And you can't judge the performance solely based on those builds till the end.
french toast's post was specifically talking about enabling features not possible on Xenos. I guess one could argue that a possible effect that takes too much time could benefit from fixed-function, high-speed silicon. The significant problem with fixed-function hardware is that you tie yourself into an art style. The programmable hardware that has ended up driving forward and deferred renderers, various lighting implementations, SSAO and post-process FXAA would have been lost to us if the hardware had been locked into fixed MSAA forward rendering or some such. Maybe a fast blur device would be very useful, as Gaussian blurs have a lot of uses, but I don't know if hardware can be designed to do that faster than a GPU. Per-pixel blur radius is not a trivial problem to solve AFAIK. And if Wuu has got hardware blur, does it mean every game is going to look miniature?!

The aim doesn't have to be to add new functionality. Rather, the purpose could be to perform common code faster, and/or at lower power draw. Those with experience writing current graphics code might be able to say what functionality has gained sufficient traction to warrant hardwiring.
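To pin down the blur example: a Gaussian blur is already the kind of "common code" in question, and it maps very cleanly onto programmable hardware as two separable 1D passes. A minimal CPU-side sketch in C++ (single-channel image, fixed radius, no particular engine or API assumed) also shows why a per-pixel radius is the awkward part: it throws away the single shared kernel that makes the separable version cheap.

[code]
// Sketch: separable Gaussian blur on a single-channel float image.
// Two 1D passes instead of one 2D pass: O(w*h*2r) taps rather than O(w*h*r^2).
// A per-pixel blur radius would need a different kernel per pixel, which is
// why it is much harder to bake into simple fixed-function hardware.
#include <algorithm>
#include <cmath>
#include <vector>

static std::vector<float> gaussianKernel(int radius, float sigma) {
    std::vector<float> k(2 * radius + 1);
    float sum = 0.0f;
    for (int i = -radius; i <= radius; ++i) {
        k[i + radius] = std::exp(-(float)(i * i) / (2.0f * sigma * sigma));
        sum += k[i + radius];
    }
    for (float& w : k) w /= sum;            // normalise so brightness is preserved
    return k;
}

// Blur 'src' (w*h, row-major) into 'dst' with one fixed radius/sigma.
void gaussianBlur(const std::vector<float>& src, std::vector<float>& dst,
                  int w, int h, int radius, float sigma) {
    std::vector<float> k = gaussianKernel(radius, sigma);
    std::vector<float> tmp(src.size());
    // Horizontal pass (clamped at the image edges).
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float acc = 0.0f;
            for (int i = -radius; i <= radius; ++i) {
                int sx = std::min(std::max(x + i, 0), w - 1);
                acc += src[y * w + sx] * k[i + radius];
            }
            tmp[y * w + x] = acc;
        }
    // Vertical pass over the intermediate image.
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float acc = 0.0f;
            for (int i = -radius; i <= radius; ++i) {
                int sy = std::min(std::max(y + i, 0), h - 1);
                acc += tmp[sy * w + x] * k[i + radius];
            }
            dst[y * w + x] = acc;
        }
}
[/code]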
Random thought on the WiiU GPU, and the earlier mention of Nintendo-specific functionality.
It occurred to me that what I would want to do is render the tablet screen at a different frame rate than the main display, possibly with one or the other taking priority.
It's actually somewhat difficult to do efficiently purely in software. The OS could handle it, much like a desktop OS, but there is a large overhead for doing that, since every time you change context you have to flush the pipeline.
It occurred to me that given the fixed requirement, one TV + up to 2 tablets, you could just implement multiple contexts in hardware.

That's always an option, but I think a good sync/fence implementation* and some batching discipline can largely address that (given the user has control of how many virtual contexts the GPU will see). I mean, yes, a context needs to be flushed before you can re-use it, but you don't have to sit there blocked waiting for the flush to occur.
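To illustrate the sync/fence alternative being suggested (not any actual WiiU API, just a toy C++ model with made-up Fence/Submission types): the CPU submits batches for a TV context every frame and a tablet context at half rate, and only waits on the fence when it genuinely has to reuse a context whose last submission is still in flight.

[code]
// Sketch: non-blocking fence between a CPU feeding two display contexts
// (TV + tablet) and a worker thread standing in for the GPU.
#include <atomic>
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>

struct Fence {
    std::atomic<uint64_t> completed{0};
    bool reached(uint64_t value) const { return completed.load() >= value; }
};

struct Submission { int contextId; uint64_t fenceValue; };

int main() {
    Fence fence;
    std::queue<Submission> queue;
    std::mutex qm;
    std::atomic<bool> quit{false};

    // Stand-in for the GPU: drains submissions in order and signals the fence.
    std::thread gpu([&] {
        while (!quit.load()) {
            Submission s{};
            bool have = false;
            {
                std::lock_guard<std::mutex> lock(qm);
                if (!queue.empty()) { s = queue.front(); queue.pop(); have = true; }
            }
            if (!have) { std::this_thread::sleep_for(std::chrono::microseconds(100)); continue; }
            std::this_thread::sleep_for(std::chrono::milliseconds(2)); // pretend to render
            fence.completed.store(s.fenceValue);
        }
    });

    uint64_t next = 0;
    uint64_t lastSubmitted[2] = {0, 0};          // last fence value per context
    for (int frame = 0; frame < 8; ++frame) {
        auto submit = [&](int ctx) {
            // Only wait if this context's previous submission hasn't retired yet.
            while (!fence.reached(lastSubmitted[ctx]))
                std::this_thread::yield();       // real code would batch other work here
            lastSubmitted[ctx] = ++next;
            std::lock_guard<std::mutex> lock(qm);
            queue.push({ctx, lastSubmitted[ctx]});
            std::printf("frame %d: submitted context %d (fence %llu)\n",
                        frame, ctx, (unsigned long long)lastSubmitted[ctx]);
        };
        submit(0);                               // TV context every frame
        if (frame % 2 == 0) submit(1);           // tablet context at half rate
    }
    while (!fence.reached(next)) std::this_thread::yield();
    quit.store(true);
    gpu.join();
    return 0;
}
[/code]

The point of the pattern is that yield loop: in real code that slot is where the next batch gets recorded instead of stalling on every context flush.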
I also noted that the "dedicated" fixed functions are just one of many possibilities. ERP proposed something that's possible. If we are looking at a GPGPU, then those features may tie in with how the GPU handles computing tasks, which in part may deal with how Nintendo will handle lighting in their games.
dedicated Voxel setup and hardware support for a voxel tree
But is it not a waste of transistors? What if only UE4 uses a voxel tree for lighting?
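For context on what such hardware would actually be speeding up: voxel-based lighting of the kind discussed for UE4 spends much of its time walking a sparse voxel octree per ray or cone sample. A toy C++ sketch of that traversal (point lookup only, with a hypothetical pointer-based node layout; real implementations pack nodes into flat GPU buffers):

[code]
// Sketch: point lookup in a sparse voxel octree (SVO).
// This child-selection loop is the inner loop that runs for every ray step
// in voxel-based lighting, i.e. the part dedicated silicon would target.
#include <array>
#include <cstdint>
#include <memory>

struct SvoNode {
    std::array<std::unique_ptr<SvoNode>, 8> child; // null = empty subtree
    uint32_t packedRadiance = 0;                   // leaf payload (e.g. RGBA8)
    bool isLeaf = false;
};

// Descend from 'node' to the node containing voxel (x, y, z) in a
// (1 << depth)^3 grid; returns null if the region is empty space.
const SvoNode* lookup(const SvoNode* node, uint32_t x, uint32_t y, uint32_t z,
                      int depth) {
    for (int level = depth - 1; level >= 0 && node; --level) {
        if (node->isLeaf) return node;             // stored data found early
        // Pick the child octant from one bit of each coordinate.
        uint32_t octant = ((x >> level) & 1u)
                        | (((y >> level) & 1u) << 1)
                        | (((z >> level) & 1u) << 2);
        node = node->child[octant].get();
    }
    return node;
}
[/code]

The appeal of hardwiring would be chasing those dependent node fetches at lower latency and power than generic shader ALUs; the cost is exactly the lock-in argued over above and below.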
If you put in fixed hardware for a feature, it'll get used. But it does mean tying down the look and feel of your games. e.g. imagine if PS1 had had a hardwired lens-flare chip, because lens flare was the fashion. Every game would have thrown in lens flare just to make use of the hardware, despite the artistic conflicts that would have caused in many titles.