What are the use cases to justify the extra bandwidth there?
I wonder if a game dev shouldn't expose these in retail in an options menu. Let the user play around with it like a TV setting (showing in-game result).
I think we're at the point where there isn't much harm in having some "advanced" settings under options. We don't need anything as extreme as settings on PC games, but a few visual options to disable post processing you don't like, or change scaling filters, or whatever would be nice. I'd really love to turn off lens flares.
Why foot the bill and put in the collaboration effort, only to have your competitor take one of your off-the-shelf components and wreck your home turf?
It supports Lanczos scaling... that's pretty good, imho. It's my favourite scaling method on my HTPC. Makes me wonder why a lot of games are so extremely oversharpened if they use the scaler, though.
Also... exposing these methods to the user would be a big win!
It could be that the filter options available at the time were those more prone to ringing (i.e. sinc or the high-tap Lanczos).
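To make the ringing point concrete, here is a minimal sketch of the standard Lanczos windowed-sinc kernel (not any particular console's implementation). The kernel's negative lobes are what produce ringing/overshoot at edges, and a higher tap count `a` retains more of those lobes:

```python
import math

def lanczos_kernel(x, a):
    """Lanczos windowed-sinc kernel with 'a' lobes.

    Equivalent to sinc(x) * sinc(x/a) for |x| < a, else 0.
    The negative lobes of this kernel are the source of ringing;
    larger 'a' (more taps) keeps more of them.
    """
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# The kernel dips below zero between its lobes; e.g. at x = 1.5
# both Lanczos-2 and Lanczos-3 are negative, and Lanczos-3 more so:
print(lanczos_kernel(1.5, 2))
print(lanczos_kernel(1.5, 3))
```

This is why a 2-tap-per-lobe Lanczos looks gentler than a wide one: the wider window admits deeper negative lobes, so edges overshoot more.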
I sort of agree on exposing these as advanced settings, BUT in this particular case I'm torn, because as a developer myself I would rather control this than let the user set it...
If I create a game title with:
1. Display plane 1 = HUD - 1080p
2. Display plane 2 = 3D world - 900p-1080p
and I dynamically scale the resolution of the 3D world based on GPU load (in big fight scenes with dozens of bosses on screen I'll render at a lower resolution), then I may want to programmatically apply a different 'upscale filter' at that time?!
So I'd rather leave this to the developer to control.
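The per-frame decision described above can be sketched roughly as follows. This is purely illustrative pseudologic, not a real console API: the function names, the 900p-1080p range, and the 1.1x filter-switch threshold are all assumptions.

```python
# Hypothetical sketch: the HUD plane stays at 1080p while the 3D plane's
# render height tracks GPU load, and the title picks the upscale filter
# programmatically per frame instead of exposing it to the user.

def pick_render_height(gpu_load, lo=900, hi=1080):
    """Drop the 3D plane's render height linearly as GPU load (0..1) rises."""
    load = min(max(gpu_load, 0.0), 1.0)
    return round(hi - (hi - lo) * load)

def pick_upscale_filter(render_height, native_height=1080):
    """Use a cheap bilinear filter when the scale gap is small; switch to
    a higher-quality (and sharper) Lanczos only past an assumed 1.1x ratio."""
    ratio = native_height / render_height
    return "lanczos" if ratio > 1.1 else "bilinear"

# Per-frame decision for the 3D world plane (plane 2):
for load in (0.0, 0.5, 1.0):
    h = pick_render_height(load)
    print(load, h, pick_upscale_filter(h))
```

The point of keeping this under developer control is exactly the coupling shown here: the right filter depends on the current render resolution, which the user cannot know frame to frame.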
...actually, it is more likely that MS and Sony paid AMD to make custom design changes/additions, which basically meant free R&D for AMD to use later in its own products (e.g. TruAudio looks similar to the PS4's audio block).
From what I have understood, current AMD APUs use the Garlic/Onion buses, whereas the XB1 uses a somewhat different approach - which we might see in future AMD GPUs.
R&D costs money, and most managers don't invest enough in it, trying instead to monetize in the short/mid term for the sake of their bonuses and their division's balance sheet.
XB1 also uses the Garlic and Onion buses, but it seems it uses a different coherent bus between CPU and GPU. Joe Macri (Corporate VP & Product CTO of AMD's Global Business Unit) talked about the Chive bus, which is similar to Onion+ in that both are coherent buses between CPU and GPU, but Chive uses a different technology.
http://pc.watch.impress.co.jp/docs/column/kaigai/20140129_632794.html
Also, the same site speculated that XB1 is using Chive instead of Onion+ in their architecture diagrams: