So that means we have to upgrade to Windows 8 to play D4?
I guess I'm not playing D4, then. L2NotLoseSales, id.
lol, are we seriously talking about directx in a doom thread?
I want to see the Tiled Resources technology running in real time on a PC or consoles.
How so? It has no advantage over other platforms.
It would if Bethsoft simply refuses to use the available hardware features on PS4.
They're just evil enough that I might believe they would do something like that. And Bethy has long-standing asslickery relationship with MS, so they might have financial incentive to make this version look/run better for whatever reason (this stretching back to the days of Oblivion launching first on 360 IIRC, all expansions for Skyrim launched first on 360 as well.)
Could the jpeg decoder logic on the XB1 be of benefit here? In terms of offloading the GPU or CPU?
I don't believe the chip can do it between the HDD and the RAM. MegaTexture does involve using JPEG or JPEG-like variable-bitrate compression to use the HDD bandwidth more efficiently and then transcoding to DXT on the fly.
Jpeg is probably just used for the Kinect camera.
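To make the transcode-on-the-fly step concrete, here's a minimal host-side sketch of "decode the JPEG tile, recompress it to DXT1" using the stb single-file libraries (stb_image.h for the JPEG decode, stb_dxt.h for the DXT1 encode). The tile handling and everything around it is an assumption for illustration; this is just the inner loop, not id's actual implementation.

```c
/* Minimal sketch of the transcode-on-load step, assuming stb_image.h
 * (JPEG decode) and stb_dxt.h (DXT1 encode). Not id's actual code. */
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"
#define STB_DXT_IMPLEMENTATION
#include "stb_dxt.h"
#include <stddef.h>
#include <string.h>

/* Decode one JPEG-compressed tile (as read from the HDD) and recompress
 * it to DXT1 so the GPU can sample it directly. Assumes tile dimensions
 * are multiples of 4. Returns the number of DXT bytes written. */
size_t transcode_tile(const unsigned char *jpeg, int jpeg_len,
                      unsigned char *dxt_out)
{
    int w, h, comp;
    unsigned char *rgba = stbi_load_from_memory(jpeg, jpeg_len,
                                                &w, &h, &comp, 4);
    if (!rgba)
        return 0;

    size_t written = 0;
    for (int by = 0; by < h; by += 4) {
        for (int bx = 0; bx < w; bx += 4) {
            unsigned char block[4 * 4 * 4];        /* one 4x4 RGBA block */
            for (int row = 0; row < 4; ++row)
                memcpy(block + row * 16,
                       rgba + ((size_t)(by + row) * w + bx) * 4, 16);
            /* alpha=0, i.e. DXT1: 8 output bytes per 4x4 block */
            stb_compress_dxt_block(dxt_out + written, block, 0, STB_DXT_NORMAL);
            written += 8;
        }
    }
    stbi_image_free(rgba);
    return written;
}
```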
I don't believe the chip can do it between the HDD and the RAM.
The way clipmaps work, the textures are already organized and stored so that they are efficient to find on the media/drive and load into RAM. Besides, you can still have compressed textures stored in RAM (and they always are), as the decompression happens on the GPU anyway. (And it was essentially just trolling, so please just ignore it.)
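A sketch of the "efficient to find" point, with a hypothetical on-disk layout (fixed-size compressed tiles, stored mip level by mip level, row-major within a level): a (level, x, y) request becomes a single file offset, so no searching is needed. Every name and size here is invented for illustration.

```c
#include <stdint.h>

/* Hypothetical fixed on-disk size of one compressed tile. */
#define TILE_BYTES (64u * 1024u)

/* tiles_per_side[l] holds the tile count per side at mip level l,
 * precomputed when the file is built. Because every tile has the same
 * stored size and the layout is level-major then row-major, a
 * (level, x, y) request maps directly to a seek offset. */
uint64_t tile_file_offset(const uint32_t *tiles_per_side,
                          int level, uint32_t x, uint32_t y)
{
    uint64_t offset = 0;
    for (int l = 0; l < level; ++l)     /* skip levels stored before this one */
        offset += (uint64_t)tiles_per_side[l] * tiles_per_side[l] * TILE_BYTES;
    /* then index row-major within the requested level */
    return offset + ((uint64_t)y * tiles_per_side[level] + x) * TILE_BYTES;
}
```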
Well, I only asked because the vgleaks documents place the DMAs associated with the compression logic under exclusive control of the running title, while the system shares the 4th one with the running title. The docs describe using that one DMA during the system's time slice of the GPU and specifically reference the use of JPEG decompression on textures.
"The same move engine that supports LZ decoding also supports JPEG decoding. Just as with LZ, JPEG decoding operates as an extension on top of the standard DMA modes. For instance, a title may decode from main RAM directly into a sub-rectangle of a tiled texture in ESRAM. The move engines contain no hardware JPEG encoder, only a decoder."
MegaTexture does involve using JPEG or JPEG-like variable-bitrate compression to use the HDD bandwidth more efficiently and then transcoding to DXT on the fly.
That's what I said; it has nothing to do with the HDD, which was what you asked earlier.
JPEG and DXT are both compression. It makes sense to store data on the media in the form that's best to consume; it makes no sense to transcode from JPEG to DXT.
Yup, the reason for HDPhoto compression was loading speed and storage limitations. But isn't MegaTexture using JPEG basically as a way to maximize HDD bandwidth? JPEG compresses at higher ratios and better quality than DXT while being useless as a format a GPU can use directly.
Plus, isn't transcoding from JPEG to DXT exactly what's going on? That's how I interpreted it when I read up on MegaTextures (the id/NVIDIA version). I may be wrong, though; I watched a video of a Google guy who discouraged JPEG-to-DXT transcoding because of the extra artifacting it introduces. They advocated an enhanced form of compression: simply take a DXT texture and apply entropy coding and delta coding to chop it down to almost JPEG-like sizes.
https://code.google.com/p/crunch/
This is something I ran into while reading up on the subject months ago.
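The crunch idea in miniature: keep the texture as DXT1, make neighboring blocks cheaper to code by delta-coding their 16-bit color endpoints, then run an ordinary entropy coder over the result. zlib stands in here for crunch's custom coder, and real crunch does far more than this; it's just a sketch of the "compress the DXT directly" approach.

```c
#include <stdint.h>
#include <stddef.h>
#include <zlib.h>

/* One DXT1 block: two RGB565 endpoints plus 32 bits of 2-bit selectors. */
typedef struct {
    uint16_t c0, c1;
    uint32_t selectors;
} dxt1_block_t;

/* Neighboring blocks tend to have similar endpoints, so storing endpoint
 * deltas leaves many near-zero values for the entropy coder to exploit. */
static void delta_code_endpoints(dxt1_block_t *blocks, size_t n)
{
    if (n < 2)
        return;
    for (size_t i = n - 1; i > 0; --i) {
        blocks[i].c0 = (uint16_t)(blocks[i].c0 - blocks[i - 1].c0);
        blocks[i].c1 = (uint16_t)(blocks[i].c1 - blocks[i - 1].c1);
    }
}

/* Delta-code in place, then entropy-code with zlib. Decoding is the
 * mirror image: inflate, then prefix-sum the endpoints back. */
size_t pack_dxt1(dxt1_block_t *blocks, size_t n,
                 unsigned char *out, size_t out_cap)
{
    delta_code_endpoints(blocks, n);
    uLongf out_len = (uLongf)out_cap;
    if (compress(out, &out_len, (const Bytef *)blocks,
                 (uLong)(n * sizeof *blocks)) != Z_OK)
        return 0;
    return (size_t)out_len;
}
```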
I'm not asking if the XB1 can directly decode JPEGs off the HDD; I'm asking if the decoder can be realized as a benefit because it relieves the GPU or CPU from having to perform that duty.
I'm pretty sure it would work something like this..
So if we take a step back and make some assumptions, here's what the flow would be like:
HDD (jpeg) -> DDR3 (jpeg) -> decoder -> ESRAM (raw) -> CPU/GPU (transcode) -> DDR3 (DXT)
I'm not convinced that this is more ideal.
You would have to go back to RAM after the decoder stage, as it's a DMA engine. You couldn't pipe it straight into the CPU (or GPU); these devices can't store anything per se, they fetch their data as needed from RAM.
I'm pretty sure it would work something like this..
HDD (jpeg) -> DDR3 (jpg) -> decoder -> CPU/GPU (transcode) -> DDR3 (DXT)
In addition to Grall, I'm pretty sure it would work something like this..
HDD (jpeg) -> DDR3 (jpg) -> decoder -> CPU/GPU (transcode) -> DDR3 (DXT)
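For what it's worth, a back-of-the-envelope traffic count of the two flows, per 1 MB of final DXT1 texture. The 8:1 raw-RGBA-to-DXT1 ratio is fixed by the formats; the 3:1 JPEG-vs-DXT size advantage is an assumption for the sketch. The main difference is just where the raw (largest) intermediate traffic lands: DDR3 in one flow, ESRAM in the other.

```c
#include <stdio.h>

int main(void)
{
    const double dxt  = 1.0;        /* MB of final DXT1 texture            */
    const double raw  = dxt * 8.0;  /* DXT1 is 8:1 vs raw 32-bit RGBA      */
    const double jpeg = dxt / 3.0;  /* assumed: JPEG ~3x smaller than DXT  */

    /* decoder writes raw back to DDR3, CPU/GPU reads it from there:
       HDD -> DDR3 (jpeg) -> decoder -> DDR3 (raw) -> CPU/GPU -> DDR3 (DXT) */
    double ddr3_only = jpeg /* HDD write */ + jpeg /* decoder read  */
                     + raw  /* decoder write */ + raw  /* transcode read */
                     + dxt  /* DXT write */;

    /* decoder writes raw into ESRAM instead:
       HDD -> DDR3 (jpeg) -> decoder -> ESRAM (raw) -> CPU/GPU -> DDR3 (DXT) */
    double ddr3_esram = jpeg + jpeg + dxt;   /* DDR3 traffic                */
    double esram      = raw + raw;           /* raw bounce moves to ESRAM   */

    printf("all-DDR3 flow: %.2f MB of DDR3 traffic\n", ddr3_only);
    printf("ESRAM flow:    %.2f MB DDR3 + %.2f MB ESRAM\n",
           ddr3_esram, esram);
    return 0;
}
```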