Data compression is the reduction of the number of bits needed to store or transmit data. Compressed information takes up less disk space than the original, so more content can fit in the same amount of storage. There are many compression algorithms that work in different ways. Some of them remove only redundant bits, so when the data is decompressed there is no loss of quality; this is known as lossless compression. Others discard excess bits as well, and decompressing the data afterwards yields lower quality than the original; these are lossy algorithms. Compressing and decompressing content requires a significant amount of system resources, in particular CPU processing time, so any hosting platform that applies compression in real time must have enough processing power to support the feature. One example of how information can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the actual bits; this technique is known as run-length encoding.
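
The 111111 → 6x1 idea described above can be sketched as a simple run-length encoder and decoder. This is a minimal illustration in Python; the function names and the "NxB" output format are assumptions chosen for readability, not a reference to any particular library:

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    runs = []
    count = 1
    # Compare each bit with the previous one to find run boundaries.
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(f"{count}x{prev}")
            count = 1
    runs.append(f"{count}x{bits[-1]}")  # flush the final run
    return ",".join(runs)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding, e.g. '6x1' -> '111111'."""
    if not encoded:
        return ""
    # Each comma-separated part is "<count>x<bit>".
    return "".join(int(count) * bit
                   for count, bit in (part.split("x")
                                      for part in encoded.split(",")))
```

Because decoding exactly reverses encoding, no information is lost, which makes this a (very basic) lossless scheme; it only saves space when the input actually contains long runs of repeated bits.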