Comparison of Effects of Entropy Coding Schemes Cascaded with Set Partitioning in Hierarchical Trees

The WT (Wavelet Transform) is considered a landmark for image compression because it represents a signal in terms of functions that are localized in both the frequency and time domains. Wavelet sub-band coding exploits the self-similarity of pixels in images and arranges the resulting coefficients into different sub-bands. SPIHT (Set Partitioning in Hierarchical Trees), a much simpler and fully embedded codec algorithm, is widely used for the compression of wavelet-transformed images. It encodes the transformed coefficients according to their significance relative to a given threshold. Statistical analysis reveals that the output bit-stream of SPIHT contains long runs of zeros that can be compressed further; SPIHT is therefore not advocated as the sole means of compression. In this paper, wavelet-transformed images are first compressed using the SPIHT technique, and to attain more compression the SPIHT output bit-streams are then fed to entropy encoders, Huffman and Arithmetic, for further de-correlation. The two concatenations are compared by evaluating factors such as bit-saving capability, PSNR (Peak Signal-to-Noise Ratio), compression ratio and elapsed time. The experimental results of these cascadings demonstrate that SPIHT combined with Arithmetic coding yields a better compression ratio than SPIHT cascaded with Huffman coding, whereas SPIHT combined with Huffman coding proves comparatively more time-efficient.


INTRODUCTION
In wavelet-based compression, translated and scaled versions of a particular wavelet are employed to decompose an image [2]. The transform is preferred because of its inherent property of being redundant and shift invariant [3]. The EZW (Embedded Zero-tree Wavelet) coder generates an embedded bit-stream as output [5]. In this scheme, the higher-energy coefficients are encoded first, and the encoding process can be terminated at any point once a distortion metric or a target rate is achieved [4].
SPIHT, an improved version of EZW, was proposed by Said and Pearlman [6]. It is a WT-based image compression algorithm that generates an embedded bit-stream and gives better PSNR and CRs for different types of gray-scale images [7]. Although SPIHT achieves better compression by searching out more zero-trees and denoting them by a separate tree root [8], its memory requirements remain large enough to be a concern [9][10].
It has also been observed that most wavelet coefficient values fall below the given threshold; hence the output of SPIHT consists of binary strings of zeros and ones that contain redundancy and provide room for further compression [11]. This additional compression of the SPIHT output stream can be achieved using various types of entropy encoding.
In this paper, the output of the SPIHT algorithm has been cascaded with two types of entropy encoders, Arithmetic and Huffman, to evaluate their performance in terms of compression and efficiency. The paper is organized in six sections: Section 1 introduces image compression using DWT and embedded encoding.
Section 2 explains the SPIHT algorithm in detail. Sections 3 and 4 describe the cascading of SPIHT with Arithmetic and Huffman coding respectively. This is followed by simulation and results in Section 5. Section 6 concludes the paper.

SET PARTITIONING IN HIERARCHICAL TREES
SPIHT yields higher PSNR than EZW because of a special symbol that indicates the significance of the child nodes of a significant parent, and because it separates child nodes from second-generation descendants [12][13][14]. After wavelet transformation of the pictorial data, the decomposed image consists of sub-bands in which the coefficients form a tree-like structure, shown in Fig. 1. Here Sn(Z) denotes the significance of a set of coordinates Z, and ci,j is the coefficient value at coordinate (i, j) [4].
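As a minimal sketch, the significance test Sn(Z) described above can be written as a threshold comparison against 2^j (the function name and the flat list of coefficients are assumptions for illustration, not the paper's notation):

```python
def significance(coeffs, j):
    """S_j(Z): return 1 if any coefficient in the set Z meets the
    threshold 2**j in magnitude, otherwise 0."""
    return 1 if max(abs(c) for c in coeffs) >= 2 ** j else 0

# A set containing a coefficient of magnitude 20 is significant
# against threshold 2**4 = 16, but not against 2**5 = 32.
print(significance([3, -20, 7], 4))  # -> 1
print(significance([3, -20, 7], 5))  # -> 0
```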
Initially, every pixel is considered insignificant with reference to the calculated threshold. The algorithm comprises the following passes:

Sorting Pass
Check each entry of the LIP for significance and transmit the result. Continue updating all three lists (LSP, LIS and LIP) according to the significance of their entries.

Refinement Pass
For every entry present in the LSP, transmit the j-th most significant bit of its coefficient magnitude.

Renewing Quantization Step Pass
Decrease 'j' by 1 during this pass and repeat the sorting pass, refinement pass and quantization-step pass of the algorithm. Keep repeating until 'j' becomes zero. At the decoder end the same procedure is performed in reverse order, with the output of the encoder fed to the decoder. By applying further entropy coding to the output of the SPIHT encoder, more compression is achieved.
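The bit-plane schedule implied by this pass can be sketched as follows, assuming 'j' is initialized from the largest coefficient magnitude; the function names and the flat coefficient list are illustrative assumptions, not the paper's implementation:

```python
import math

def initial_bitplane(coeffs):
    """Starting bit-plane: j = floor(log2(max |c|))."""
    return int(math.floor(math.log2(max(abs(c) for c in coeffs))))

def bitplane_thresholds(coeffs):
    """Thresholds 2**j visited as j is decreased toward zero,
    one sorting/refinement round per threshold."""
    return [2 ** j for j in range(initial_bitplane(coeffs), -1, -1)]

print(bitplane_thresholds([63, -5, 10]))  # -> [32, 16, 8, 4, 2, 1]
```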

Analysis of SPIHT
We illustrate the concept of this algorithm with the example given in Tables 1-2.

CASCADING WITH ARITHMETIC CODING
To understand Arithmetic coding, the following points should be kept in mind: in this algorithm, a sequence of symbols of variable length is encoded into a single variable-length code-word.
A single Arithmetic code-word is allocated jointly to all the symbols in the message.
There is no one-to-one correspondence between symbols and code-words. During symbolization of the SPIHT output bit-stream, one, two or three bits may remain at the end; the information about these remaining bits is added to the Bit Header, as shown in Table 3.
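To make these two ideas concrete, the following sketch groups a bit-stream into 3-bit symbols (recording leftover bits as the Bit Header would) and then assigns the whole symbol sequence a single arithmetic code value. Exact rational arithmetic is used only to keep this toy coder free of rounding issues; all names and the probability model are assumptions for illustration, not the paper's implementation:

```python
from fractions import Fraction

def symbolize(bits, width=3):
    """Group a bit-string into fixed-width symbols; the leftover
    bits would be recorded in the Bit Header."""
    n = len(bits) - len(bits) % width
    return [bits[i:i + width] for i in range(0, n, width)], bits[n:]

def _intervals(probs):
    """Assign each symbol a sub-interval of [0, 1) of size = its probability."""
    cum, c = {}, Fraction(0)
    for s, p in probs.items():
        cum[s] = (c, c + p)
        c += p
    return cum

def arithmetic_encode(symbols, probs):
    """Shrink [0, 1) once per symbol; the midpoint of the final
    interval is a single code value for the whole message."""
    cum = _intervals(probs)
    low, high = Fraction(0), Fraction(1)
    for s in symbols:
        span = high - low
        lo_s, hi_s = cum[s]
        low, high = low + span * lo_s, low + span * hi_s
    return (low + high) / 2

def arithmetic_decode(code, probs, n):
    """Recover n symbols by locating the code in successive sub-intervals."""
    cum = _intervals(probs)
    low, high = Fraction(0), Fraction(1)
    out = []
    for _ in range(n):
        x = (code - low) / (high - low)
        for s, (lo_s, hi_s) in cum.items():
            if lo_s <= x < hi_s:
                out.append(s)
                span = high - low
                low, high = low + span * lo_s, low + span * hi_s
                break
    return out
```

For example, the bit-stream "0000001110" symbolizes to ["000", "000", "111"] with leftover "0", and decoding the single code value recovers the symbol sequence exactly.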

CASCADING WITH HUFFMAN CODING
Huffman coding depends upon the frequency of appearance of pixels in an image. It has the following features: in this algorithm, variable-length code-words are allotted to fixed-length symbols [15].
The technique decodes symbols uniquely, i.e., no code-word is a prefix of another code-word. Fig. 3 represents the flow of the algorithm.
Cascading of SPIHT with Huffman coding is done in the same way as with Arithmetic coding. Symbolization is first performed on the SPIHT output bit-stream using groups of three bits each, and each symbol is then passed to the Huffman coder block. Table 4, drawn for the 512x512 Lena image at 0.5 bpp, shows the symbols with their probabilities and allotted code-words. This cascading results in the same bit header as discussed in Table 3.
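A sketch of building such a code-book from 3-bit symbol frequencies, using the standard merge-two-least-frequent construction; the counts below are made up for illustration and are not the probabilities of Table 4:

```python
import heapq

def huffman_codebook(freqs):
    """Build a prefix-free code: repeatedly merge the two least
    frequent entries, prepending '0'/'1' to their partial codes.
    freqs: dict mapping symbol -> occurrence count."""
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# With these example counts the most frequent symbol, "000",
# receives the shortest (single-bit) code-word.
codes = huffman_codebook({"000": 50, "001": 20, "010": 15, "111": 10, "101": 5})
```

The tuple's middle index only breaks ties in the heap so that dictionaries are never compared; the resulting code is prefix-free by construction, which is what allows the decoder to parse the bit-stream unambiguously.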