Digital Foundry Article Technical Discussion [2020]

Discussion in 'Console Technology' started by BRiT, Jan 1, 2020.

Thread Status:
Not open for further replies.
  1. Aaron Elfassy

    Newcomer

    Joined:
    Apr 17, 2016
    Messages:
    80
    Likes Received:
    106
    That's not how HDR is supposed to work. The PQ EOTF has fixed nit levels. There's nothing stopping display manufacturers from adding dynamic tonemapping; all of them actually do this now. Dolby Vision has just introduced "Dolby Vision IQ" for this exact reason. Basically, the tonemapping is tied to an ambient light sensor in the TV and the image is adjusted accordingly, while maintaining the creator's intent.

    A good read is Polyphony Digital's paper on HDR. Basically there's a sweet spot, like the diffuse white point; all HDR displays should be able to display this sweet spot (the 100-300 nit range). As long as the sweet spot is hit, you can go as dark in the low end and as bright in the high end as you like. The better the display, the better the range and dimension of the image.
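The "fixed nit levels" come from the PQ transfer function itself (SMPTE ST 2084), which maps code values to absolute luminance regardless of the display. A minimal sketch of the EOTF in Python, with the constants taken straight from the spec:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value in [0, 1]
# to an absolute luminance in cd/m^2 (nits).
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a PQ-encoded value (0..1) to luminance in nits."""
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(0.0))   # 0 nits (black)
print(pq_eotf(1.0))   # 10000 nits, the format's absolute peak
print(pq_eotf(0.5))   # ~92 nits: half the signal range sits near diffuse white
```

Note how half of the code-value range decodes to only ~92 nits, i.e. right around the diffuse-white sweet spot; everything above that is reserved for highlights.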
     
    Cyan and BRiT like this.
  2. Cyan

    Cyan orange
    Legend

    Joined:
    Apr 24, 2007
    Messages:
    9,734
    Likes Received:
    3,460
    Well, when you do... I'll leave it at that and play mysterious. Still, you don't need an awesome monitor to notice how... it... trust me on this one. Think of how much you love raytracing, which I can understand despite only having a graphics card that isn't RT capable (a 1080).

    If YouTube is tonemapping the HDR video, yes, I guess it will never look the same as true HDR. That's one of the issues of working with two colour spaces at the same time (the OS also sometimes has a hard time, and odd things can happen).

    I have HDR enabled on the monitor all the time (using the W10 slider to match both as closely as I can), and it surprised me to see the HDR tag on the video. I think it was a first for DF.

    Keep up the good work, Alex!
     
  3. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,679
    I'm looking at Calman for HDR Dolby Vision, and their workflow for LG has two modes: one for reference viewing (dark) and one for daytime viewing that has the same peak brightness but a higher overall average picture level to compensate for ambient light. I'm not sure how it works, but it's not dynamic tonemapping. Without some good solution for ambient light, HDR is basically garbage. Any standard that forces you to see crushed blacks under your normal viewing conditions would be missing the point.
     
    BRiT likes this.
  4. Cyan

    Cyan orange
    Legend

    Joined:
    Apr 24, 2007
    Messages:
    9,734
    Likes Received:
    3,460
    Depending on the game, it doesn't need to look realistic, but if it does...

    Have you tried Forza Horizon 4 with HDR? You can see the difference in real time. It's night and day.

    Switch between HDR off and HDR on and you are going to notice the difference in shadowing just in the paint of the car while the action is paused. In game, it's simply a different game, looks-wise.

    What looks like an amazing game becomes the superlative of amazing all of a sudden. And once you enable HDR, going back to SDR the game looks dull and lacks colour depth and shadowing; it becomes something of a washed-out image (though still so good).

    Forza Horizon 4 has a very easy calibration tool to set the HDR right. The sweet spot for my monitor is 1500 nits (when the FM logo disappears completely, as indicated), in HDR for Games mode. My monitor only goes up to 500 nits in HDR mode, so it isn't something out of anyone's league, yet the difference is so pronounced!

    Also, have you tried Gears 5?

    This one has an HDR tool to set the nits that you might like a lot, because the clouds look very realistic.

    In this case, I followed what I see in real life, because the calibration image they use shows the sun appearing above the horizon in between the clouds as a reference.

    So I started moving the slider: 500 nits looked dull, 1000 nits looked overly bright and lost detail.

    Then I found a sweet spot where the edges of the clouds were shining as in real life when the sun appears in between, and when I looked at the nits, it turned out that the sweet spot was also 1500 nits.

    The result is that playing Gears 5 is...otherworldly. :mrgreen:

    You can see the blue lights of the Gears' armor reflecting on the characters' faces, and the colors are so intense...

    When you are in the first level and enter the cave, what was so dark becomes very bright when the ceiling of the cave falls, because of the open gap in the ceiling where the outside light filters in.

    You need to squint your eyes because of the luminance intensity.

    Still not as in real life; maybe in a few years, when 10,000-nit monitors are available. But it certainly adds to the atmosphere of games where materials and light behave like the real world (as with raytracing), even if they are artistically more or less realistic.

    cheers @Silent_Buddha
     
  5. Cyan

    Cyan orange
    Legend

    Joined:
    Apr 24, 2007
    Messages:
    9,734
    Likes Received:
    3,460
    could this be an evolved variation of AMD RIS?

    Apparently, an Xbox game studio is experimenting with shipping low res textures to be upscaled in real time by an AI.

    https://wccftech.com/an-xbox-game-s...-res-textures-to-be-ai-upscaled-in-real-time/

     
    Nesh likes this.
  6. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,423
    Likes Received:
    10,317
    That picture while impressive is from something completely different and not something that can currently be done in real-time. So, it could be misleading if someone looks at that and thinks it's representative of what MS can accomplish in real-time.

    Now, it's possible that MS may be able to do something similar in real-time, but we don't know as there are no public demonstrations of it yet.

    Regards,
    SB
     
    Cyan likes this.
  7. Cyan

    Cyan orange
    Legend

    Joined:
    Apr 24, 2007
    Messages:
    9,734
    Likes Received:
    3,460
    Perhaps in the mobile space, to use with the cloud? One can live without HDR (I have many games without it that impress me so much), and even without raytracing (the only tick I am missing as of now), but on a phone or tablet, where games need to look good while saving battery and storage space, a game with this technology could be very interesting.

    I think I saw that same image somewhere, a few years ago, but can't recall exactly.
     
  8. turkey

    Veteran

    Joined:
    Oct 21, 2014
    Messages:
    1,112
    Likes Received:
    883
    Location:
    London
    Would not have it any other way. I know it's just sharpening, but I mentioned it because that seems to be exactly what the AI upscale on the Shield is, and it's in AMD's toolkit for Sony and Microsoft.

    I seem to recall it produced reasonable results.

    Did DF cover AMD RIS and its perceived quality and performance gains when AMD announced it? I seem to remember a comparison video but cannot find it now.

    I had a quick Google, and the top result is similar to what I remember:

    https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/

    Bottom Line
    Radeon Image Sharpening is genuinely impressive. It doesn’t require any developer implementation and it works well by sharpening the image which can be useful in a variety of situations.

    After spending more time with the feature, we feel the best use case is for image downsampling with high resolution displays. A sharpened 1800p image was typically as good as a native 4K image in our testing, which means you can happily use this configuration with Navi GPUs to gain ~30% more performance for a minimal quality loss. Downsampling all the way to 1440p didn’t deliver as good results, so the sweet spot is around that 70 to 80 percent resolution scale.
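The "~30% more performance" figure in that quote is, to a first approximation, just pixel-count arithmetic (assuming 16:9 frames at each resolution):

```python
# Pixel counts for 16:9 frames at each resolution
res_4k = 3840 * 2160        # 8,294,400 pixels
res_1800p = 3200 * 1800     # 5,760,000 pixels
res_1440p = 2560 * 1440     # 3,686,400 pixels

print(res_1800p / res_4k)   # ~0.69: about 30% fewer pixels to shade than native 4K
print(1800 / 2160)          # ~0.83 linear resolution scale
print(1440 / 2160)          # ~0.67 linear scale, below the quoted sweet spot
```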

    The article then concludes in favor of RIS over DLSS, but those are old titles, and the newer DLSS implementations are, I believe, far better. Both will have their place, but Sony and Microsoft are on team red, so this could be something that is adopted. Also, this is a blind driver-level application; I would assume the consoles' SDKs would allow tweaking to help dial in quality and the artists' vision in the application.

    Edit: further thoughts
    Possibly going off-piste here.
    This works with soft images and not angular geometry.
    For the PS4 Pro, Mark Cerny talked about 4K geometry with lower-res shading that would eliminate the dumb upscale and its artifacts, though that may be superseded by temporal techniques now.
    We know Microsoft has VRS, and this would complement it well; if it were applied intelligently with info from the game, it could use the VRS data (and more) to know where to sharpen more or less.
     
    #88 turkey, Feb 6, 2020
    Last edited: Feb 6, 2020
    Silent_Buddha and DavidGraham like this.
  9. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    There is a deliberate, in-engine implementation of it in the form of Contrast Adaptive Sharpening (CAS). It's shipped under the AMD FidelityFX name in several titles, it's implemented in the engine itself and is vendor agnostic, and it seems to selectively apply sharpening where it matters.
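As a rough illustration of the "selectively apply sharpening where it matters" part: CAS-style filters weight the sharpening by local contrast, so already high-contrast edges get little or none. This grayscale NumPy sketch loosely follows the cross-shaped tap pattern and adaptive weight of CAS; it is not AMD's actual shader code:

```python
import numpy as np

def cas_sharpen(img, sharpness=0.5):
    """Contrast-adaptive sharpen for a grayscale image in [0, 1] (sketch)."""
    p = np.pad(img, 1, mode='edge')
    c = img                               # center tap
    n = p[:-2, 1:-1]; s = p[2:, 1:-1]     # north / south taps
    w_ = p[1:-1, :-2]; e = p[1:-1, 2:]    # west / east taps
    mn = np.minimum.reduce([n, s, w_, e, c])
    mx = np.maximum.reduce([n, s, w_, e, c])
    # Adaptive amount: sharpen less where local contrast is already high,
    # so hard edges (mn near 0 or mx near 1) are left mostly alone.
    amt = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-6), 0.0, 1.0))
    # Negative ring weight between -0.125 and -0.2, scaled by the user knob.
    w = amt * (-0.125 - 0.075 * sharpness)
    out = (w * (n + s + w_ + e) + c) / (4.0 * w + 1.0)
    return np.clip(out, 0.0, 1.0)
```

A flat region passes through unchanged, while a soft edge (say between 0.3 and 0.6) has its contrast increased.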

    I believe next-gen consoles will try to lessen the cost of 4K resolution using smart sharpening, AI upscaling, variable rate shading, or a combination of them. This is badly needed if next-gen games are to increase presentation quality significantly or use RT.
     
    turkey likes this.
  10. zed

    zed
    Legend

    Joined:
    Dec 16, 2005
    Messages:
    6,415
    Likes Received:
    2,139
    Why on earth would you not ship the small textures and then, once they're downloaded to the HDD, upscale them only once, before the game starts?
    I.e., it's apparent whoever wrote the article has no actual understanding of what they are doing.
     
    milk likes this.
  11. see colon

    see colon All Ham & No Potatos
    Veteran

    Joined:
    Oct 22, 2003
    Messages:
    2,756
    Likes Received:
    2,206
    I read this article when it was originally published, and I couldn't help but get stuck on the term "downsampling". Isn't it upsampling when you take an 1800p image and scale it up to 2160p, regardless of any sharpening filters? Wouldn't downsampling be the opposite: reducing a higher-resolution image down to a lower resolution?

    Anyway, RIS vs DLSS is no longer a vendor-vs-vendor issue, since RIS's algorithm is open source and has been implemented at the driver level by Nvidia as well.

    Hardware Unboxed did a fairly in depth video comparing RIS vs DLSS when RIS was released. Pretty sure they did follow up videos when nVidia released their sharpening also.
     
    turkey likes this.
  12. turkey

    Veteran

    Joined:
    Oct 21, 2014
    Messages:
    1,112
    Likes Received:
    883
    Location:
    London
    Devil's advocate says: perhaps due to expensive, hard-to-expand, limited physical storage space on the HDD?
     
  13. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,679
    I'm wondering if there's some way they could use AI to upscale a small block of a texture that would fit into cache: basically, trade ALU for VRAM bandwidth and keep the data in cache while it's needed. So, some version of reading a block from the texture into GPU cache and then AI-upscaling it before sampling, maybe even avoiding a write back to VRAM. Or maybe with virtual texturing (tiled resources), the virtual texture stores the upscaled texture; that seems the simplest case. I'm not sure if the tile sizes typically map to what fits into cache. Keep the virtual texture in VRAM, but only selectively upscale the tiles that you need, which would roughly fit into cache.
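The tile-level version of this idea is easy to sketch: keep the low-res tiles resident, upscale a tile only on first use, and keep a bounded cache of the results. Everything below is hypothetical (the class and tile sizes are made up, and a nearest-neighbour repeat stands in for the trained ML model):

```python
import numpy as np
from collections import OrderedDict

TILE = 64          # low-res tile edge in texels (assumed)
SCALE = 4          # upscale factor (assumed)
CACHE_TILES = 256  # budget for resident upscaled tiles (assumed)

class TileUpscaleCache:
    """LRU cache of on-demand upscaled texture tiles (sketch)."""
    def __init__(self, lowres_tiles):
        self.lowres = lowres_tiles   # dict: tile_id -> (TILE, TILE) array
        self.cache = OrderedDict()   # tile_id -> upscaled array, LRU order

    def _upscale(self, tile):
        # Stand-in for the ML model: nearest-neighbour repeat.
        return np.repeat(np.repeat(tile, SCALE, axis=0), SCALE, axis=1)

    def fetch(self, tile_id):
        if tile_id in self.cache:
            self.cache.move_to_end(tile_id)      # mark as recently used
        else:
            self.cache[tile_id] = self._upscale(self.lowres[tile_id])
            if len(self.cache) > CACHE_TILES:
                self.cache.popitem(last=False)   # evict least recently used
        return self.cache[tile_id]
```

Only tiles actually sampled pay the upscale cost; everything else stays at the small resident size, which is the bandwidth/ALU trade described above.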
     
  14. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,423
    Likes Received:
    10,317
    With games like RDR 2 already taking up 150 GB of valuable storage space, it's possible that if nothing is done, game sizes will grow significantly for next gen consoles.

    While deduplication of stored data will help, developers will still want to take advantage of the increased resources available to them.

    Better compression is obviously one way to accomplish this, but it doesn't come free as more aggressive compression algos are significantly more computationally expensive than what is currently used in games.
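The size-versus-CPU trade shows up even with stock zlib: higher compression levels shave bytes but spend more time searching for matches. This is purely illustrative (real game pipelines use codecs like Kraken or zstd plus GPU-friendly texture formats, not zlib):

```python
import zlib

# Roughly 0.9 MB of repetitive sample data standing in for game assets
data = b"the quick brown fox jumps over the lazy dog " * 20000

fast = zlib.compress(data, 1)   # cheap, fast, larger output
best = zlib.compress(data, 9)   # expensive, slower, smaller (or equal) output

print(len(data), len(fast), len(best))
```

Level 9 never produces a larger result here, but the extra CPU spent per asset is exactly the "doesn't come free" cost mentioned above.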

    In a similar vein, real-time AI upscaling of textures is another way to address this, but at unknown cost. And, on top of that the two approaches aren't mutually exclusive. You could combine real-time AI upscaling with better compression for even greater storage savings.

    But, as mentioned all of this comes at a cost. So, it'll certainly be interesting to see what each company does WRT this problem.

    Regards,
    SB
     
  15. milk

    milk Like Verified
    Veteran

    Joined:
    Jun 6, 2012
    Messages:
    3,977
    Likes Received:
    4,101
    AI texture upscaling, technically, IS a form of data compression.
     
    Silent_Buddha likes this.
  16. zed

    zed
    Legend

    Joined:
    Dec 16, 2005
    Messages:
    6,415
    Likes Received:
    2,139
    Mate, do you know how slow this image scaling is? I tested it once and a single image took like a minute.
    True, my machine ain't the best, so I looked up what someone else got with their NVIDIA Tesla P40 GPU:
    one image = 3 seconds.
    Yet they want to do this with hundreds of images per frame at 60fps :lol2: and on weaker hardware, no doubt, as I assume they are not restricting it to $5000 GPUs.
    Sure, with all algorithms there's room for improvement, but here we are talking about orders of magnitude of improvement required.
     
  17. turkey

    Veteran

    Joined:
    Oct 21, 2014
    Messages:
    1,112
    Likes Received:
    883
    Location:
    London
    I probably came over as pedantic or preachy. I was just trying to be concise, out of laziness and cold hands while walking.

    I cannot comment on the practicality, but HDD space is the only reason I can see you might do it. Bandwidth is cheap for most users, so I don't think they care about file sizes for distribution; certainly we have already passed 100 GB for a title, and what's a couple of extra gigs between friends?
     
  18. zed

    zed
    Legend

    Joined:
    Dec 16, 2005
    Messages:
    6,415
    Likes Received:
    2,139
    Nothing wrong with pedantry, and I do agree with what you said. It's just that I fail to see how they speed this up enough to be practical.
    Also, from what I've seen, yes, some images upscaled OK or even well, but with others the result was crap.
     
  19. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,423
    Likes Received:
    10,317
    Something similar to DLSS: you spend X hours/days/weeks training on Y GPUs in a server farm. Once you have an acceptable model, you apply it to a game, and the model is executed in real-time. The longer you train it, the better the results will be.

    Hence,

    From the interview that was posted above... https://wccftech.com/an-xbox-game-s...-res-textures-to-be-ai-upscaled-in-real-time/

    So, similar to DLSS, best results will come from training a ML model for assets in each specific game. Potentially even multiple ML models per game.
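That offline-train / runtime-infer split can be shown with a toy example. Here a tiny linear least-squares model stands in for the real neural network, and the 2x upscale, patch sizes, and training data are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Offline "server farm" phase: learn weights mapping a 2x2 low-res
# neighbourhood to the 2x2 high-res block it was downsampled from. ---
def make_pair():
    hi = rng.random((8, 8))
    lo = hi.reshape(4, 2, 4, 2).mean(axis=(1, 3))  # 2x box downsample
    return lo, hi

X, Y = [], []
for _ in range(500):
    lo, hi = make_pair()
    for i in range(3):
        for j in range(3):
            X.append(lo[i:i + 2, j:j + 2].ravel())              # 4 inputs
            Y.append(hi[2 * i:2 * i + 2, 2 * j:2 * j + 2].ravel())  # 4 outputs
W, *_ = np.linalg.lstsq(np.asarray(X), np.asarray(Y), rcond=None)

# --- Runtime phase: ship W with the game, apply it per block in real time. ---
def upscale(lo):
    out = np.zeros((2 * (lo.shape[0] - 1), 2 * (lo.shape[1] - 1)))
    for i in range(lo.shape[0] - 1):
        for j in range(lo.shape[1] - 1):
            block = lo[i:i + 2, j:j + 2].ravel() @ W
            out[2 * i:2 * i + 2, 2 * j:2 * j + 2] = block.reshape(2, 2)
    return out
```

The expensive part (the `lstsq` fit, i.e. training) happens once offline; the runtime cost is only the small matrix multiply per block, which is the shape of the DLSS-style pipeline described above.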

    Regards,
    SB
     
    #99 Silent_Buddha, Feb 8, 2020
    Last edited: Feb 8, 2020
  20. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,423
    Likes Received:
    10,317
    Yeah, I was trying to explain that in a way that more people might understand, i.e. that it saves space and can be combined with other compression techniques that most people are at least somewhat familiar with. In my head, as I typed that, I was going to mention that it's basically just another form of compression, but by the time I finished the paragraph I'd obviously forgotten. :) Getting old sucks. :p

    Regards,
    SB
     
    milk and Cyan like this.