John Carmack: Doom 4 will support Partially Resident Textures (Tiled Resources)

Discussion in 'Console Technology' started by Cyan, Sep 21, 2013.

  1. gurgi

    Regular

    Joined:
    Jul 7, 2003
    Messages:
    605
    Likes Received:
    1

    lol, are we seriously talking about directx in a doom thread?
     
  2. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,502
    Likes Received:
    5,035
    This isn't John Carmack's id anymore. Zenimax is in charge now and they're a brutal bunch of suits and bean counters.
     
  3. Zeross

    Regular

    Joined:
    Jun 3, 2002
    Messages:
    280
    Likes Received:
    11
    Location:
    France
    Like Rage before, Doom will have a DirectX rendering back-end to address the Xbox platform (and a GNM back-end for PS4). But I guess that on Windows it will still use OpenGL.
     
  4. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,497
    Likes Received:
    599
    Location:
    WI, USA
    Oblivion was PC+360 initially. PS3 wasn't even out yet.

    The main evilness I've read of is the Xbox version of Morrowind getting special, half-assed QA allowances so it could make a timely appearance on the console. And back then there were no patches for console games, even on Xbox. The smart person doesn't buy a Bethesda HQ-developed game that can't receive patches or mods.
     
  5. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,368
    Likes Received:
    2,559
    Here ya go
     
  6. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,163
    Likes Received:
    1,172
    Could the JPEG decoder logic on the XB1 be of benefit here, in terms of offloading the GPU or CPU?

    MegaTexture does involve using JPEG or JPEG-like variable-bitrate compression to make more efficient use of HDD bandwidth, then transcoding to DXT on the fly.
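    For reference, the "transcode to DXT on the fly" step is, as I understand it, just re-encoding each decoded 4x4 block of raw RGB as a DXT1/BC1 block. A naive min/max encoder (rough sketch, not id's actual transcoder) looks something like this:

    Code:
    #include <algorithm>
    #include <cstdint>

    // Naive DXT1/BC1 encode of one 4x4 RGB block (illustration only).
    // Input: 16 pixels, 3 bytes each (R,G,B). Output: one 8-byte BC1 block.
    static uint16_t to565(const uint8_t* p) {
        return uint16_t(((p[0] >> 3) << 11) | ((p[1] >> 2) << 5) | (p[2] >> 3));
    }

    void encode_bc1_block(const uint8_t rgb[16 * 3], uint8_t out[8]) {
        // Pick endpoints as the per-channel min and max of the block.
        uint8_t lo[3] = {255, 255, 255}, hi[3] = {0, 0, 0};
        for (int i = 0; i < 16; ++i)
            for (int c = 0; c < 3; ++c) {
                lo[c] = std::min(lo[c], rgb[i * 3 + c]);
                hi[c] = std::max(hi[c], rgb[i * 3 + c]);
            }
        uint16_t c0 = to565(hi), c1 = to565(lo);
        if (c0 < c1) {                         // c0 > c1 selects 4-colour mode on decode
            std::swap(c0, c1);
            for (int c = 0; c < 3; ++c) std::swap(hi[c], lo[c]);
        }
        // Palette: the two endpoints plus two interpolated colours.
        int pal[4][3];
        for (int c = 0; c < 3; ++c) {
            pal[0][c] = hi[c];
            pal[1][c] = lo[c];
            pal[2][c] = (2 * hi[c] + lo[c]) / 3;
            pal[3][c] = (hi[c] + 2 * lo[c]) / 3;
        }
        // 2-bit index per pixel: nearest palette entry by squared distance.
        uint32_t indices = 0;
        for (int i = 0; i < 16; ++i) {
            int best = 0, bestDist = 1 << 30;
            for (int j = 0; j < 4; ++j) {
                int d = 0;
                for (int c = 0; c < 3; ++c) {
                    int diff = int(rgb[i * 3 + c]) - pal[j][c];
                    d += diff * diff;
                }
                if (d < bestDist) { bestDist = d; best = j; }
            }
            indices |= uint32_t(best) << (2 * i);
        }
        out[0] = uint8_t(c0); out[1] = uint8_t(c0 >> 8);
        out[2] = uint8_t(c1); out[3] = uint8_t(c1 >> 8);
        for (int b = 0; b < 4; ++b) out[4 + b] = uint8_t(indices >> (8 * b));
    }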
     
    #46 dobwal, Jul 21, 2014
    Last edited by a moderator: Jul 21, 2014
  7. SlimJim

    Banned

    Joined:
    Aug 29, 2013
    Messages:
    590
    Likes Received:
    0
    Oblivion launched 8+ months before the PS3 was on the market; 14 months if you look at PAL territories. But the games always ran like total ass, indeed.

    I'll never buy a game from that developer because of all the game-breaking bugs that never get patched out. I can tell you: even with 8 (5.5) GB of RAM, they will still have the retail 1.20 game crashing PS4s and Xbox Ones.
     
  8. taisui

    Regular

    Joined:
    Aug 29, 2013
    Messages:
    674
    Likes Received:
    0
    JPEG is probably just used for the Kinect camera.

    I don't believe the chip can do it between the HDD and the RAM.

    The way clipmaps work, the textures are already organized and stored so that they're efficient to find on the media/drive and load into RAM. Besides, you can still keep compressed textures in RAM (and they always are), since the decompression happens on the GPU anyway. (And it was essentially just trolling, so please just ignore it.)
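    To illustrate what "organized so tiles are easy to find" could mean in practice (a purely hypothetical layout with made-up sizes, not what idTech actually ships): if each mip level stores its tiles row-major at a fixed compressed budget, the streamer can compute any tile's offset directly instead of searching.

    Code:
    #include <cstdint>

    // Hypothetical on-disk layout for a tiled/clipmapped virtual texture.
    // All sizes are illustrative assumptions; real engines keep per-tile tables.
    struct VirtualTextureLayout {
        uint32_t tileSize     = 128;    // texels per tile edge (assumption)
        uint32_t tilesPerSide = 1024;   // level-0 tiles per edge (assumption)
        uint32_t bytesPerTile = 8192;   // fixed compressed budget per tile (assumption)

        // Byte offset of tile (tx, ty) within its mip level's blob.
        // Valid for level 0 .. log2(tilesPerSide).
        uint64_t tileOffset(uint32_t level, uint32_t tx, uint32_t ty) const {
            uint32_t side = tilesPerSide >> level;            // tiles per edge at this mip
            return uint64_t(ty * side + tx) * bytesPerTile;   // row-major
        }

        // Which tile a normalized (u, v) coordinate lands in at a given mip level.
        void tileForUV(float u, float v, uint32_t level,
                       uint32_t& tx, uint32_t& ty) const {
            uint32_t side = tilesPerSide >> level;
            tx = uint32_t(u * side);
            ty = uint32_t(v * side);
            if (tx >= side) tx = side - 1;                    // clamp edge cases
            if (ty >= side) ty = side - 1;
        }
    };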
     
    #48 taisui, Jul 22, 2014
    Last edited by a moderator: Jul 22, 2014
  9. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,163
    Likes Received:
    1,172
    Well, I only asked because the vgleaks documents place the DMAs associated with the compression logic under exclusive control of the running title, while the fourth one is shared between the system and the running title. The docs describe using one DMA during the system's time slice of the GPU and specifically reference the use of JPEG decompression on textures.

    "The same move engine that supports LZ decoding also supports JPEG decoding. Just as with LZ, JPEG decoding operates as an extension on top of the standard DMA modes. For instance, a title may decode from main RAM directly into a sub-rectangle of a tiled texture in ESRAM. The move engines contain no hardware JPEG encoder, only a decoder.

    Read more at: http://www.vgleaks.com/world-exclusive-durangos-move-engines"
     
  10. taisui

    Regular

    Joined:
    Aug 29, 2013
    Messages:
    674
    Likes Received:
    0
    That's what I said: it has nothing to do with the HDD, which was what you asked earlier.
    JPEG and DXT are both compression. It makes sense to store data on the media in the form that's best for consumption, and it makes no sense to transcode from JPEG to DXT.

     
    #50 taisui, Jul 22, 2014
    Last edited by a moderator: Jul 22, 2014
  11. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,163
    Likes Received:
    1,172
    But isn't MegaTexture using JPEG basically as a way to maximize HDD bandwidth? JPEG compresses at higher ratios and better quality than DXT, while being useless as a format a GPU can sample directly.

    Plus, isn't transcoding from JPEG to DXT exactly what's going on? That's how I interpreted it when I read up on MegaTexture (the id/Nvidia version). I may be wrong, but I watched a video of a Google engineer who discouraged JPEG-to-DXT transcoding because of the extra artifacting it introduces. They advocated an enhanced form of compression that simply takes a DXT texture and applies entropy coding and delta coding to chop it down to almost JPEG-like sizes.

    https://code.google.com/p/crunch/
    This is something I ran into while reading up on the subject months ago.

    I'm not asking whether the XB1 can directly decode JPEGs off the HDD; I'm asking whether the decoder can be realized as a benefit because it relieves the GPU or CPU from having to perform that duty.
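    The gist of that DXT re-compression approach, as I understood it (toy sketch only, not crunch's actual scheme): leave the data in DXT1 form, delta-code the block endpoints, and run a general-purpose entropy coder over the result, so there's never a JPEG->DXT transcode to add artifacts.

    Code:
    #include <cstdint>
    #include <vector>
    #include <zlib.h>   // zlib as a stand-in entropy coder (crunch uses its own)

    // Toy "re-compress DXT instead of transcoding JPEG" sketch.
    std::vector<uint8_t> pack_dxt1(const std::vector<uint8_t>& dxtBlocks) {
        std::vector<uint8_t> filtered(dxtBlocks);
        // Each 8-byte BC1 block starts with two 16-bit endpoint colours.
        // Delta them against the previous block; neighbouring blocks tend to be
        // similar, which makes the entropy coder's job easier.
        for (size_t b = filtered.size() / 8; b-- > 1; ) {
            for (int i = 0; i < 4; ++i)
                filtered[b * 8 + i] =
                    uint8_t(filtered[b * 8 + i] - filtered[(b - 1) * 8 + i]);
        }
        // Entropy-code the filtered stream.
        uLongf packedLen = compressBound(uLong(filtered.size()));
        std::vector<uint8_t> packed(packedLen);
        if (compress(packed.data(), &packedLen,
                     filtered.data(), uLong(filtered.size())) != Z_OK)
            packed.clear();
        packed.resize(packedLen);
        return packed;
    }

    Decompression is just the reverse: inflate, re-integrate the endpoint deltas, and the GPU gets DXT1 it can sample directly.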
     
    #51 dobwal, Jul 22, 2014
    Last edited by a moderator: Jul 22, 2014
  12. taisui

    Regular

    Joined:
    Aug 29, 2013
    Messages:
    674
    Likes Received:
    0
    So if we take a step back and make some assumptions, here's what the flow would look like:

    HDD (jpeg) -> DDR3 (jpeg) -> decoder -> ESRAM (raw) -> CPU/GPU (transcode) -> DDR3 (DXT)

    I'm not convinced this is any better.
     
  13. jlippo

    Veteran Regular

    Joined:
    Oct 7, 2004
    Messages:
    1,373
    Likes Received:
    475
    Location:
    Finland
    Yup, the reason for the HDPhoto compression was loading speed and storage limitations.
    http://mrelusive.com/publications/papers/Real-Time-Texture-Streaming-&-Decompression.pdf

    You might want to look into Jon Olick's research on DXT re-compression as well.
    http://www.jonolick.com/
    It seems to be a nice alternative to HDPhoto.
    I'm pretty sure it would work something like this:
    HDD (jpeg) -> DDR3 (jpg) -> decoder -> CPU/GPU (transcode) -> DDR3 (DXT)
    There's no need to use ESRAM for this, especially for raw data.
    JPEG might be good to keep in main memory to provide fast access to a larger area of the texture.

    As for actual usage of PRT in an idTech game, they'd need to use several PRT textures per scene, since PRT doesn't support large enough textures.
    I wonder if an additional layer of indirection would be the right way to go? (Vertex level?)
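    For what it's worth, the usual indirection layer looks roughly like this (sizes and names are placeholders, not any shipping engine's): a coarse page table maps each virtual page to wherever its texels currently live in a physical tile cache, so the virtual texture can be far bigger than any single hardware texture.

    Code:
    #include <algorithm>
    #include <cstdint>

    // Virtual-texture indirection sketch, done CPU-side for clarity.
    struct PageEntry {
        uint16_t physX, physY;   // tile position within the physical cache
    };

    struct Indirection {
        static constexpr uint32_t kVirtPages = 1024;  // virtual pages per edge (assumption)
        static constexpr uint32_t kPhysTiles = 64;    // physical cache tiles per edge (assumption)
        PageEntry table[kVirtPages][kVirtPages];      // the indirection texture (heap-allocate in practice)

        // Translate a virtual UV into a UV inside the physical tile cache --
        // the same arithmetic a pixel shader would do per sample.
        void virtualToPhysical(float u, float v, float& pu, float& pv) const {
            uint32_t px = std::min(uint32_t(u * kVirtPages), kVirtPages - 1);
            uint32_t py = std::min(uint32_t(v * kVirtPages), kVirtPages - 1);
            const PageEntry& e = table[py][px];
            float fx = u * kVirtPages - px;           // fractional position inside the page
            float fy = v * kVirtPages - py;
            pu = (e.physX + fx) / float(kPhysTiles);
            pv = (e.physY + fy) / float(kPhysTiles);
        }
    };

    The nice part is that the page table itself is tiny compared to the texture data, so it can be updated every frame as tiles stream in and out.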
     
    #53 jlippo, Jul 22, 2014
    Last edited by a moderator: Jul 22, 2014
  14. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,173
    Location:
    La-la land
    You would have to go back to RAM after the decoder stage, since it's a DMA engine. You couldn't pipe it straight into the CPU (or GPU); those devices can't store anything per se, they fetch their data as needed from RAM.
     
  15. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    41,727
    Likes Received:
    12,801
    Location:
    Under my bridge
    In addition to Grall,

    XB1
    HDD (jpeg) -> DDR3 (jpg) -> decoder -> DDR3 (RGB data) -> CPU/GPU (transcode) -> DDR3 (DXT)

    PS4
    HDD (jpeg) -> GDDR5 (jpg) -> CPU/GPU (decode + transcode) -> GDDR5 (DXT)

    Whether it's a saving or not depends on how much work the decode is. An obvious HW progression would be to include support for JPEG decoding on the GPU, so JPEG'd tile data can be used directly.
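    Some rough per-tile numbers on that trade-off (all assumed figures for illustration: 128x128 tiles, ~1 bpp JPEG, raw RGBA8 intermediate; not measured on either console). The DMA route saves the GPU/CPU the decode work, but costs extra trips through DDR3 for the raw pixels.

    Code:
    #include <cstdio>

    // Back-of-envelope memory traffic per tile for the two pipelines above.
    int main() {
        const int tile      = 128;
        const int jpegBytes = tile * tile / 8;   // ~1 bit per pixel (assumption)
        const int rawBytes  = tile * tile * 4;   // RGBA8 after decode
        const int dxtBytes  = tile * tile / 2;   // DXT1, 4 bits per texel

        // XB1-style path: the move engine writes the raw decode back to DDR3
        // and the GPU reads it again before writing DXT.
        int dmaPath = jpegBytes + rawBytes /*write*/ + rawBytes /*read*/ + dxtBytes;

        // Compute path: JPEG is read once; the raw pixels can in principle
        // stay on-chip (LDS/cache) during a single-pass decode + transcode.
        int computePath = jpegBytes + dxtBytes;

        std::printf("per-tile bytes: jpeg %d, raw %d, dxt %d\n",
                    jpegBytes, rawBytes, dxtBytes);
        std::printf("DMA-decode path ~%d bytes of traffic, compute path ~%d\n",
                    dmaPath, computePath);
    }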
     
    #55 Shifty Geezer, Jul 22, 2014
    Last edited by a moderator: Jul 22, 2014
  16. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,008
    Likes Received:
    859
    Location:
    Planet Earth.
    PS4 doesn't use DDR3 though Shifty... ;)
     
  17. TheWretched

    Regular

    Joined:
    Oct 7, 2008
    Messages:
    830
    Likes Received:
    23
    Didn't Rage have a CUDA implementation on Nvidia systems for texture de/recompression?

    Because... if the DMAs could be used to transcode the texture data for free, that work could be done by compute on PS4 "for free" (as in: the same game on both consoles, 12 vs. 18 CUs, so 6 left over for free).
     
  18. taisui

    Regular

    Joined:
    Aug 29, 2013
    Messages:
    674
    Likes Received:
    0
    It would make sense for both consoles to just do the transcoding with a combination of the GPU and CPU.

    Does anyone know if the unified memory and the bigger standard media remove the need to store the textures in JPEG form?
     
  19. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,568
    Likes Received:
    4,210
    The megatextures are huge. I think they'd still need to be compressed for a system with 5GB of available RAM.
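    To put numbers on "huge" (assuming id Tech 5's 128k x 128k virtual texture limit and plain DXT1, with no further compression):

    Code:
    #include <cstdio>

    // Rough size of a full id Tech 5-scale megatexture at DXT1 (0.5 bytes/texel).
    int main() {
        const long long side   = 128 * 1024;    // 128k texels per edge
        const long long texels = side * side;
        const double dxt1GB    = texels * 0.5 / (1024.0 * 1024.0 * 1024.0);
        std::printf("base level: %.1f GB, with mip chain: ~%.1f GB\n",
                    dxt1GB, dxt1GB * 4.0 / 3.0);  // mips add roughly a third
    }

    Even the base level alone is well past 5 GB, so the full thing stays compressed on disk and only the resident tiles ever sit in RAM as DXT.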
     
  20. taisui

    Regular

    Joined:
    Aug 29, 2013
    Messages:
    674
    Likes Received:
    0
    Well, I'd assume there's some sort of streaming involved before the data ends up in the actual clipmap. Didn't they manage to make it happen on the 360/PS3 with Rage?
     