Volumes Of Fun http://www.volumesoffun.com/phpBB3/ |
|
Volume compression http://www.volumesoffun.com/phpBB3/viewtopic.php?f=2&t=82 |
Page 1 of 1 |
Author: | AndiNo [ Wed Nov 10, 2010 5:53 pm ] |
Post subject: | Volume compression |
Hi again! Now that my game is progressing nicely I've come up with another question. Does the current volume class support or use some kind of compression for the voxels? If not, is this planned for the future? What I'm thinking of is this: When you use setVoxel on a volume the volume has to internally unpack the voxels first, then set the voxel and then pack them up again. This would be a bit slower than usual but could save a great amount of memory. Ideally the user could decide if he wants to use this or not. |
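The pack/unpack scheme described in this post could look something like the following sketch, using run-length encoding as the compression step. This is purely illustrative and is not PolyVox's actual implementation; the types and function names (`packRLE`, `unpackRLE`, `setVoxelPacked`) are invented for this example.

```cpp
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

using Voxel = std::uint8_t;
using Runs  = std::vector<std::pair<Voxel, std::uint32_t>>;

// Pack a flat voxel array into (value, run length) pairs.
Runs packRLE(const std::vector<Voxel>& voxels) {
    Runs runs;
    for (Voxel v : voxels) {
        if (!runs.empty() && runs.back().first == v) {
            ++runs.back().second;      // extend the current run
        } else {
            runs.push_back({v, 1});    // start a new run
        }
    }
    return runs;
}

// Unpack the runs back into a flat voxel array.
std::vector<Voxel> unpackRLE(const Runs& runs) {
    std::vector<Voxel> voxels;
    for (const auto& run : runs) {
        voxels.insert(voxels.end(), run.second, run.first);
    }
    return voxels;
}

// A setVoxel on a packed volume would be: unpack, modify, repack.
// This unpack/repack round trip is the slowdown the post describes.
Runs setVoxelPacked(const Runs& runs, std::size_t index, Voxel value) {
    std::vector<Voxel> voxels = unpackRLE(runs);
    voxels[index] = value;
    return packRLE(voxels);
}
```

For largely homogeneous terrain this trades CPU time on every write for a large memory saving, which is why the post suggests making it opt-in.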
Author: | milliams [ Wed Nov 10, 2010 7:44 pm ] |
Post subject: | Re: Volume compression |
I'm not completely sure, but I'm fairly sure that at least homogeneous blocks (blocks in which every voxel has the same value) are compressed in some way. With this method there is almost no penalty to compress/uncompress since it's simply a look-up. There's certainly code in PolyVox for doing this sort of thing, but I'm unsure whether it's currently being utilised. Take a look at library/PolyVoxCore/include/Volume.{h,inl} for the details. |
Author: | David Williams [ Wed Nov 10, 2010 8:51 pm ] |
Post subject: | Re: Volume compression |
Yes, milliams is basically right. The volume class stores its data as a collection of blocks, each of which is 32x32x32 voxels by default. If a block is homogeneous (all the voxels are the same) then it is shared with other blocks holding the same value. So if you have 10 blocks which all contain just the value '0', that block is only actually stored once. The resulting compression is not that high (something like 25%-50%) but it has no effect on read speed. Writing can be slower if a block has to be unshared, but this doesn't happen very often. Blocks are unshared as required, but if a block later becomes homogeneous then PolyVox does not automatically know this. You can occasionally call Volume::tidyUpMemory() to search for homogeneous blocks and share them. There is some potential to improve the system, but actually I haven't found it to be a big issue. For example, given a 1024^3 volume, the rendering is more of a problem than the memory requirements. [Edit:] Oh, and you should also look at the documentation in Volume.h. This is one class that is actually quite well documented. |
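The block-sharing scheme described above can be sketched as copy-on-write over shared block pointers. The following is a toy illustration, not PolyVox's real code: `ToyVolume`, `storedBlocks`, and the flat block/index addressing are all invented here, and only `tidyUpMemory` echoes an actual PolyVox method name.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <map>
#include <memory>
#include <vector>

using Voxel = std::uint8_t;
constexpr std::size_t kBlockVoxels = 32 * 32 * 32;  // 32x32x32 default

using Block    = std::vector<Voxel>;
using BlockPtr = std::shared_ptr<Block>;

struct ToyVolume {
    std::vector<BlockPtr> blocks;

    // All blocks start out pointing at ONE shared zero-filled block.
    explicit ToyVolume(std::size_t blockCount) {
        BlockPtr zero = std::make_shared<Block>(kBlockVoxels, Voxel{0});
        blocks.assign(blockCount, zero);
    }

    // Reads never unshare, so sharing has no effect on read speed.
    Voxel getVoxel(std::size_t block, std::size_t index) const {
        return (*blocks[block])[index];
    }

    // Copy-on-write: a shared block is unshared before being modified.
    void setVoxel(std::size_t block, std::size_t index, Voxel value) {
        if (blocks[block].use_count() > 1) {
            blocks[block] = std::make_shared<Block>(*blocks[block]);
        }
        (*blocks[block])[index] = value;
    }

    // Re-share blocks that have (again) become homogeneous, since the
    // volume does not detect this automatically on write.
    void tidyUpMemory() {
        std::map<Voxel, BlockPtr> canonical;
        for (BlockPtr& b : blocks) {
            Voxel first = (*b)[0];
            bool homogeneous = std::all_of(
                b->begin(), b->end(),
                [first](Voxel v) { return v == first; });
            if (!homogeneous) continue;
            auto it = canonical.find(first);
            if (it == canonical.end()) {
                canonical[first] = b;   // keep this copy as canonical
            } else {
                b = it->second;         // drop ours, share the canonical copy
            }
        }
    }

    // Count distinct block allocations actually held in memory.
    std::size_t storedBlocks() const {
        std::vector<const Block*> seen;
        for (const BlockPtr& b : blocks) {
            if (std::find(seen.begin(), seen.end(), b.get()) == seen.end())
                seen.push_back(b.get());
        }
        return seen.size();
    }
};
```

With this structure, ten all-zero blocks cost one allocation; a write to one of them triggers a single copy, and a later `tidyUpMemory()` pass can re-share it if it has become homogeneous again.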
Author: | AndiNo [ Wed Nov 10, 2010 11:23 pm ] |
Post subject: | Re: Volume compression |
David Williams wrote: "each of which is 32x32x32 voxels by default."
Since all my volumes are only 32^3 voxels in size, I assume the benefit of this will be quite small, even if I make the blocks smaller.
David Williams wrote: "For example, given a 1024^3 volume, the rendering is more of a problem than the memory requirements."
You're probably right there. I wanted to know if any compression exists just to make sure.
David Williams wrote: "[Edit:] Oh, and you should also look at the documentation in Volume.h. This is one class that is actually quite well documented."
Hehe, right! I read it just now. The problem with such documentation is that you never know when it was last updated.
On a side note: I was wrong to choose 32^3 volumes because of the recreation speed when changing them. After I exchanged my voxel-picking algorithm for a really fast one, there is almost no lag/stutter when removing voxels. Obviously the problem was my brute-force picking algorithm. |
Powered by phpBB © 2000, 2002, 2005, 2007 phpBB Group http://www.phpbb.com/ |