If Rage had 4x the texel density

Ah yes, it does indeed. "Use as high a resolution texture as your heart desires" is a lie for reasons outside of megatexture.
 
Rage data on disk used lossy compression and it was not a fixed size like DXT, right? Now, if I have a source texture at 8k×8k pixels and I generate two textures out of it, 2k×2k and 1k×1k, while making sure the compression algorithm parameters stay the same, how will the sizes of the new resized textures compare? Am I right to assume the smaller one is likely bigger than 25% of the larger one?

My reasoning is that in the high-res texture the adjacent pixels are likely more similar than in the lower-resolution one, and thus all kinds of wavelet compression algorithms should be a bit more efficient. How much exactly, I have no clue. It will definitely depend on the specific textures, but some general ballpark number would be nice, or even just confirmation that my theory makes any sense at all :)

Basically, what I'm trying to get at is: if the texture pixel density increases by X, does the install/download size also increase by X, or by less/more?
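As a toy illustration of that adjacent-pixel intuition (using lossless zlib from Python's standard library as a stand-in for Rage's actual lossy codec, so the numbers are only directional, not a measurement of HD Photo): a 2× upscale quadruples the pixel count, but because adjacent pixels become more similar, the compressed size grows by less than 4×. That is the same as saying the lower-resolution version is more than 25% of the higher-resolution one's compressed size.

```python
import random
import zlib

random.seed(42)
N = 256
# "Source art": a noisy base image, stored as raw 8-bit grayscale bytes.
base = [[random.randrange(256) for _ in range(N)] for _ in range(N)]
lo = bytes(p for row in base for p in row)

# Nearest-neighbour 2x upscale: same content, 4x the pixels,
# but neighbouring pixels are now frequently identical.
hi = bytes(base[y // 2][x // 2] for y in range(2 * N) for x in range(2 * N))

c_lo = len(zlib.compress(lo, 9))
c_hi = len(zlib.compress(hi, 9))
print(c_hi / c_lo)  # well under 4: the redundancy compresses away
```

A real wavelet codec exploits this smoothness even more aggressively than LZ matching does, so the sub-linear scaling should be at least as pronounced there; the exact factor is content-dependent, as noted above.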
 
Why in the world would they have to do such a silly thing when they could replace just a chunk of it? You can replace parts of a JPEG, PNG, BMP, etc., so why would you think you cannot replace just a part of the megatexture?

Of course, BF2 used diff patching to get away with smaller patches. What tends to happen, though, is that this complicates cumulative patching later on: you either give up universal patches that work on any given game version, or you end up having to create patches for every possible permutation (different versions, languages, with/without DLC, etc.), which blows up the patch file size anyway.

That's even assuming the new MT would be exactly the same as the old one except for that one changed tile. Once you throw in the lossy compression and the visibility determination they used, every single pixel could easily be off by a bit, and the diff patch would generate a ~100% diff file anyway! :p Realistically, they could probably blindly crop and live with a seam, hopefully not noticeable to the naked eye, around the replaced tile.
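The "tiny source change, huge compressed diff" effect is easy to demonstrate even with a lossless codec. In this sketch (zlib standing in for whatever id actually used; the payload is obviously fake), changing a single byte early in the data perturbs the match chains and Huffman tables, so most of the compressed stream changes and a byte-level diff patch gains almost nothing:

```python
import zlib

# Hypothetical "megatexture" payload: repetitive, so deflate genuinely compresses it.
data = b"megatexture tile " * 4096
patched = data[:100] + b"\x00" + data[101:]  # exactly one byte changed, early on

a = zlib.compress(data, 9)
b = zlib.compress(patched, 9)

# Count how many bytes of the two compressed streams disagree.
differing = sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))
print(len(a), differing)  # a large fraction of the stream changes, not just one byte
```

With a lossy codec that also re-quantises on every encode, the situation is worse still: even the *unchanged* tiles can come out bit-different, which is the near-100% diff scenario described above.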

Rage data on disk used lossy compression and it was not a fixed size like DXT, right?

IIRC, it uses Microsoft's HD Photo, which MS let them have royalty-free, no doubt to mitigate the Xbox 360's DVD storage deficit.

Basically, what I'm trying to get at is: if the texture pixel density increases by X, does the install/download size also increase by X, or by less/more?

It depends on the source art. If the original source art is very noisy, more density means more noise, which means less compression at the same quality. OTOH, if HD Photo behaves like JPEG, its compressed size does not increase linearly with source size. A quick search didn't yield any size comparisons for HD Photo, though.
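The noise point is easy to see with any entropy coder, not just HD Photo. A quick sketch with stdlib zlib (again only a lossless stand-in, so treat the numbers as directional): pure noise barely shrinks at all, while smooth, slowly varying data of the same byte count collapses to a small fraction of its size.

```python
import random
import zlib

random.seed(0)
N = 65536
noisy = bytes(random.randrange(256) for _ in range(N))  # noisy source art
smooth = bytes((i // 256) & 0xFF for i in range(N))     # slowly varying "gradient"

c_noisy = len(zlib.compress(noisy, 9))
c_smooth = len(zlib.compress(smooth, 9))
print(c_noisy, c_smooth)  # noise stays near 64 KiB; the gradient shrinks drastically
```

So for the question above: whether 4× the texels costs 4× the disk space depends heavily on how much of that extra density is genuine new detail (noise-like) versus magnified smooth content.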
 
Does anyone remember fractal compression? What happened to that?
From wiki:
"An inherent feature of fractal compression is that images become resolution independent[7] after being converted to fractal code. This is because the iterated function systems in the compressed file scale indefinitely. This indefinite scaling property of a fractal is known as "fractal scaling"."
 
Of course, BF2 used diff patching to get away with smaller patches. What tends to happen, though, is that this complicates cumulative patching later on: you either give up universal patches that work on any given game version, or you end up having to create patches for every possible permutation (different versions, languages, with/without DLC, etc.), which blows up the patch file size anyway.

That's even assuming the new MT would be exactly the same as the old one except for that one changed tile. Once you throw in the lossy compression and the visibility determination they used, every single pixel could easily be off by a bit, and the diff patch would generate a ~100% diff file anyway! :p Realistically, they could probably blindly crop and live with a seam, hopefully not noticeable to the naked eye, around the replaced tile.

It kinda depends how it is done, too. It is quite possible that they do not compress the entire thing at once but in sections to begin with. I don't really see why it would matter much with megatexture if you have the issues you are discussing, though. Let's say they skip a patch so they don't change one area of the megatexture, big whoopty doo. It still loads up fine, it just looks slightly different. It isn't as if it is calling ME_wall_block3_scuffed_ivy, so if it isn't updated, no problem.
 
There's a very simple reason that the texture resolution in Rage isn't 4x higher, and it has nothing to do with the amount of graphics memory. Here's a picture to illustrate:

[image: a tower of multiple CD duplicator drives]
 
Cute. For another laugh, see my discussion with Wuliheron here which prompted me to create this thread. That guy is a piece of work.
 
Didn't think AnandTech forums were that bad... makes Beyond3D look like a safe haven, even the dreaded console forums...
;p
 
Didn't think AnandTech forums were that bad... makes Beyond3D look like a safe haven, even the dreaded console forums...
;p

It is just an example of typical human nature: refusing to admit you were incorrect. There are too many folks like that out and about in the world.
 
It kinda depends how it is done, too. It is quite possible that they do not compress the entire thing at once but in sections to begin with. I don't really see why it would matter much with megatexture if you have the issues you are discussing, though. Let's say they skip a patch so they don't change one area of the megatexture, big whoopty doo. It still loads up fine, it just looks slightly different. It isn't as if it is calling ME_wall_block3_scuffed_ivy, so if it isn't updated, no problem.

True for Rage, which is primarily a SP game, but in MP games you have to make sure the clients are all running the same texture resources, to prevent passive cheating through contrast changes, outright whiting out walls to make adversaries stand out, etc.

<tangent>

We ran into this problem with ETQW, where a full-quality MT was between 200 and 400 MB, which was significant back in 2007; and still is, if you realise most people will get the new map while they're on a server as the map rotates, so the server has to send the new map to all the players who don't have it.

A common strategy for us was to use an ultra-low-quality MT of 50 MB or so while beta testing our maps, and only ship the full-quality MT when we released the final version, to prevent excessive download sizes and wait queues.

We did petition Splash Damage to allow the MT checksum to accept both values for exactly this reason, but AFAIK it never went through, precisely because it could be used for cheating: the 50 MB MT was so low-resolution that it made players stand out more.

</tangent>

With diff patching you have to make sure you update all (network-sensitive) resources in all subsequent patches, for all the different deployment permutations. The more patches you release, the worse the problem gets.
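The consistency check described above is conceptually just a whitelist of resource checksums the server will accept. A minimal sketch (hashlib standing in; the idea of admitting two approved MT variants is exactly the Splash Damage petition mentioned earlier, and all names here are hypothetical):

```python
import hashlib

def mt_checksum(mt_bytes: bytes) -> str:
    # In practice you'd hash the megatexture file on disk; bytes stand in here.
    return hashlib.sha256(mt_bytes).hexdigest()

server_mt = b"full-quality megatexture data"
beta_mt = b"ultra-low-quality 50MB beta megatexture"

# The petitioned scheme: the server admits either approved variant.
approved = {mt_checksum(server_mt), mt_checksum(beta_mt)}

def admit(client_mt: bytes) -> bool:
    return mt_checksum(client_mt) in approved

print(admit(server_mt))                        # True
print(admit(b"whited-out cheat megatexture"))  # False
```

The catch is exactly the one raised above: any approved low-quality variant is itself a (mild) visibility advantage, so widening the whitelist trades download convenience against fairness.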
 
There's a very simple reason that the texture resolution in Rage isn't 4x higher, and it has nothing to do with the amount of graphics memory. Here's a picture to illustrate
They could still have offered an optional patch as a (torrent) download. People who care are generally big enough enthusiasts to be capable of downloading tens or even hundreds of GBs without problems. I know I could and would if it were possible :)
 
They could still have offered an optional patch as a (torrent) download. People who care are generally big enough enthusiasts to be capable of downloading tens or even hundreds of GBs without problems. I know I could and would if it were possible :)

This is precisely what they should do, but since id likes to open-source their engines, and since the assets are not included when they do, there are probably business people saying "hey, wait a minute, we don't want the assets out in the wild".
 
there are probably business people saying "hey wait a minute here we don't want the assets out in the wild"
And the original game disc you can buy from an (e)shop, with the data on it, doesn't qualify as "assets out in the wild"? Interesting.
 
Well, yes and no. They theoretically have less value than the hypothetical 80 GB pack. Nevertheless you are right, I did not think that through all the way. Though I'm not convinced the business people aren't thinking something similar anyway :) Maybe they just think torrents are the devil.
 
Embed a torrent(-like) client within the game itself. Users install the 20GB base textures, then progressively download the pieces of detail textures as they move about the world and share them back to their peers. Since it's coupled with the game the engine can prioritize which tiles are needed and know which are resident; the user doesn't need to download the entire megatexture before playing; the burden on id's seed servers is greatly lessened the more players there are. In a linear game, the needed tiles could be prioritized in bulk based on region and purged some time after the section is passed, if the user doesn't want to dedicate the entirety of the needed disk space. Streaming detail textures could incentivize always-online DRM; in-game perks could in turn incentivize generous seeding; torrent checksums could help mitigate multiplayer cheating.

Of course, this would require fat pipes and huge drives, and place additional burdens on already-stressed disk IO. Nontrivial performance hit for constant validation/consistency, reliance on an unpredictable peer pool, and probably disk fragmentation issues.
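The prioritisation part of that idea can be sketched independently of the torrent plumbing. A toy model, with every name and policy here invented for illustration: the renderer requests tiles with a priority (say, distance from the player), the downloader fetches the nearest-needed tile first, and an LRU purge bounds disk usage for players who don't want to keep the whole megatexture.

```python
import heapq

class TileStreamer:
    """Toy sketch of need-driven tile streaming; a hypothetical design, not id's."""

    def __init__(self, cache_limit: int):
        self.cache_limit = cache_limit
        self.resident = {}  # tile_id -> last-stored tick, for LRU purging
        self.wanted = []    # min-heap of (priority, tile_id); nearer tiles pop first
        self._tick = 0

    def request(self, tile_id: int, distance: float) -> None:
        # The engine asks for a tile; closer tiles should download sooner.
        if tile_id not in self.resident:
            heapq.heappush(self.wanted, (distance, tile_id))

    def next_download(self):
        # Which tile should the peer-to-peer client fetch next?
        while self.wanted:
            _, tile_id = heapq.heappop(self.wanted)
            if tile_id not in self.resident:
                return tile_id
        return None

    def store(self, tile_id: int) -> None:
        # A tile arrived; purge the least recently stored one if the cache is full.
        if len(self.resident) >= self.cache_limit:
            victim = min(self.resident, key=self.resident.get)
            del self.resident[victim]
        self._tick += 1
        self.resident[tile_id] = self._tick

s = TileStreamer(cache_limit=2)
s.request(7, distance=5.0)
s.request(3, distance=1.0)
print(s.next_download())  # 3: the nearer tile wins
```

A real implementation would re-prioritise continuously as the player moves and would have to handle the IO and validation costs flagged above, but the core scheduling really is this simple.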
 
Does anyone remember fractal compression? What happened to that?
From wiki:
"An inherent feature of fractal compression is that images become resolution independent[7] after being converted to fractal code. This is because the iterated function systems in the compressed file scale indefinitely. This indefinite scaling property of a fractal is known as "fractal scaling"."
Yes, I do remember it.
The questions, with regard to TC, are:
  1. Does it allow easy random access to texels without decompressing whole swathes of data?
  2. Is it cheap to implement? (I still think that's important, but I've seen some scary proposals.)
  3. Is it not too slow to compress? I think this is where fractal compression has trouble.
 