Machine Learning to enhance game image quality

That seems to be a verbatim copy of the previous article (or vice versa, I'm sure). Which is disappointing, because it presents that image as official.
 
From a consumer point of view, I guess this would be good for many third-world countries with data caps and low speeds.

In first-world countries it doesn't matter for mobile users, as unlimited, uncapped 3G can be as cheap as 5€/month and 50-300 Mbps uncapped 4G is 10-20€/month.

And for home users it is usually pointless too: when a 100/10 cable connection with no data limits costs 9.90-19.90€/month, it is easy to stream 1080p quality.

For ISPs/server farms it could be more useful, if it doesn't eat too much computing power.

And for places like African, Middle Eastern and Asian countries with data caps.

Well, at least this is how connections are priced here in Finland, so I am assuming that most first-world countries have similar prices and no caps. Why wouldn't they, if a place like Finland (where the countryside can have only a few people per km² plus too much wilderness to count, and 3G works in maybe 90% of the country) can do it?
 
Your country has a limited population, fairly localised as I understand it, and a government with a focus on supporting infrastructure. Contrast that with the UK, where it's mostly private enterprise plus government intervention trying to get it to treat people fairly. 100/10 isn't an option. 20 Mbps is "superfast broadband" here (although faster speeds are included) and availability is patchy.

From https://checker.ofcom.org.uk/broadband-coverage
This is about 10 miles across in my locality, in the affluent county of Surrey

Image1.png

The reality is that it's a lot cheaper for many smaller nations to roll out faster broadband when they don't have hundreds of years of legacy crap in the way. Uncapped 4G is £24 a month.

Never make assumptions about other nations based on your own!
 
From a consumer point of view, I guess this would be good for many third-world countries with data caps and low speeds.

Doing more with less is always good. The big part for me is that this will allow a higher-quality image using less processing power/energy. It's the same idea as checkerboard rendering (render at 1440p and create new pixels to fill a 4K screen), but instead of being a one-size-fits-all method that causes artifacts, this is a customized/smart upscaling that will allow each game to have scene-dependent AA. In theory each frame could get the filters that make the resulting image as similar as possible to the real thing. I really want to see examples of games rendered at 1080p and then upscaled to 4K. We are already starting to see a trend toward dedicated hardware to accelerate AI, and I think this will be a big part of the future.
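To make the "one size fits all" contrast concrete, here is a minimal sketch (my own, not from the article) of fixed bilinear 2x upscaling — the kind of single static filter that a learned, scene-dependent upscaler would replace with weights adapted per game or per frame:

```python
import numpy as np

def upscale_2x_bilinear(img):
    """Naive 2x bilinear upscale of a single-channel image.
    This is the fixed, content-blind baseline; a learned upscaler
    would instead predict the missing pixels from training data."""
    h, w = img.shape
    out = np.zeros((2 * h, 2 * w), dtype=float)
    for y in range(2 * h):
        for x in range(2 * w):
            # map the output pixel back into source coordinates
            sy, sx = y / 2.0, x / 2.0
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # weighted average of the four surrounding source pixels
            out[y, x] = (img[y0, x0] * (1 - fy) * (1 - fx)
                         + img[y0, x1] * (1 - fy) * fx
                         + img[y1, x0] * fy * (1 - fx)
                         + img[y1, x1] * fy * fx)
    return out
```

The interpolation weights here are the same for every pixel and every scene, which is exactly why it smears fine detail; the learned approach swaps these fixed weights for ones fitted to real high-resolution images.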
 
That seems to be a verbatim copy of the previous article (or vice versa, I'm sure). Which is disappointing, because it presents that image as official.

Do you really think it looks that bad? The background image has the same AA applied as the zoomed square.

ai-antialiasing.png


"The left inset shows an aliased image that is jaggy and pixelated. NVIDIA’s AI anti-aliasing algorithm produced the larger image and inset on the right by learning the mapping from aliased to anti-aliased images."
 
The right inset is blurred, spreading pixel-level details over multiple pixels. All the small details are blurred, and edges are a couple of pixels wide instead of a single line. Other AA techniques are far superior.
 
Talking about AA techniques, is there a freely available TAA implementation in the style of Frostbite's? Or is SMAA T2x the only one available for use?
"DecimaSiggraph2017-final.pptx" seems to indicate they use simple FXAA plus a custom temporal element.
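For readers unfamiliar with the "temporal element" mentioned above, the core of any TAA-style technique is an exponential blend of the current jittered frame into an accumulated history buffer. This is a hedged, minimal sketch (function name and alpha value are mine); production TAA like Frostbite's additionally reprojects the history with motion vectors and clamps it to the current frame's neighborhood to avoid ghosting:

```python
import numpy as np

def taa_accumulate(history, current, alpha=0.1):
    """One temporal accumulation step: exponentially blend the
    (already reprojected) history buffer with the current frame.
    Reprojection and neighborhood clamping are omitted here."""
    return (1.0 - alpha) * history + alpha * current
```

Run over many frames with sub-pixel camera jitter, this average converges toward a supersampled image, which is what gives temporal AA its quality at low per-frame cost.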
 
I will post examples once they are available.

LG Display, Sogang University Jointly Develops AI-Based VR

"The core of the newly developed technology is an algorithm that can generate ultra-high resolution images from low-resolution ones in real time. Deep learning technology makes this conversion possible without using external memory devices.

The new technology boosts power efficiency and optimizes the algorithm, realizing high resolutions on mobile products. It cuts motion to photon latency and motion blurs to one fifth or less the current level by slashing system loads when operating displays for VR."


http://www.businesskorea.co.kr/news/articleView.html?idxno=22604
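The article gives no implementation details, but real-time super-resolution networks commonly produce the upscaled image with a depth-to-space ("pixel shuffle") step: the network outputs r*r channels per low-res pixel, which are rearranged into an r-times-larger image. A minimal numpy sketch of that rearrangement (my illustration only; whether the LG/Sogang algorithm uses it is not stated):

```python
import numpy as np

def pixel_shuffle(feat, r=2):
    """Depth-to-space: rearrange (c, h, w) features with c = out_c * r * r
    channels into an (out_c, h*r, w*r) image. This is the cheap final
    upscaling step used by many real-time super-resolution nets."""
    c, h, w = feat.shape
    assert c % (r * r) == 0, "channels must be divisible by r*r"
    out_c = c // (r * r)
    # split channels into (out_c, r, r), then interleave into space
    x = feat.reshape(out_c, r, r, h, w)
    x = x.transpose(0, 3, 1, 4, 2)      # (out_c, h, r, w, r)
    return x.reshape(out_c, h * r, w * r)
```

Because the rearrangement is just a memory permutation, it adds almost no compute, which fits the article's emphasis on low latency and power efficiency.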
 