One of the basic problems in source coding is to find a prefix-free code for a given source. A code is prefix-free if no codeword is a prefix of any other codeword, so a sequence of prefix-free codewords can be decoded with the minimum possible delay. The prefix-free property is useful in applications such as video compression, where the encoded data must be decoded instantaneously. It is well known that the codeword lengths of any binary prefix-free code satisfy the Kraft inequality; conversely, a prefix-free code can be constructed for any given set of codeword lengths satisfying the Kraft inequality. Another important constraint in encoding a source is minimizing the average codeword length of the code, since a shorter average codeword length reduces the consumption of expensive resources such as hard disk space and transmission bandwidth. The Kraft inequality makes the entropy a lower bound on the average codeword length. Hence the redundancy, defined as the difference between the average codeword length and the entropy, is usually considered as a measure for evaluating the performance of a prefix-free code.

An optimal code, i.e. the instantaneous code which has the minimum average codeword length (or, equivalently, the minimum redundancy) for an information source, can be obtained using the Huffman algorithm. However, a faster but suboptimal design algorithm is sometimes needed, for instance when the number of symbols of a source is very large and the symbol probability vector varies quickly in time, or when only a few output symbols of a source are to be encoded. In such cases one may prefer a suboptimal code such as the fixed-length code, the uniform code, the M code, or the Shannon code.

In this thesis we study the overall performance of these suboptimal codes. The redundancy of each code is considered as a random variable on the set of all sources with a given number of symbols, and the mean and the variance of the redundancy of each code are studied through exact formulation or simulation. Since all the information sources with a given number of symbols carry the same importance, … the redundancy of the uniform code is less than ….

Key Words: Redundancy, the Kraft inequality, the Huffman code, the Shannon code, M code, Minave Criterion
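For concreteness, the two quantities the abstract relies on can be written out explicitly; the notation below ($N$ symbols, probability vector $p=(p_1,\dots,p_N)$, codeword lengths $\ell_1,\dots,\ell_N$) is assumed for illustration and is not taken from the thesis itself. The Kraft inequality for a binary prefix-free code and the resulting redundancy are
\[
\sum_{i=1}^{N} 2^{-\ell_i} \le 1,
\qquad
R \;=\; \underbrace{\sum_{i=1}^{N} p_i \ell_i}_{\text{average codeword length}} \;-\; \underbrace{\Big(-\sum_{i=1}^{N} p_i \log_2 p_i\Big)}_{\text{entropy } H(p)} \;\ge\; 0 .
\]
For instance, the Shannon code assigns $\ell_i = \lceil \log_2 (1/p_i) \rceil$; these lengths satisfy the Kraft inequality, and the resulting redundancy is always less than one bit.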