UNIVERSIDAD DE LAS AMÉRICAS-PUEBLA

COLLEGE OF ENGINEERING

DEPARTMENT OF COMPUTING, ELECTRONICS AND MECHATRONICS

IE 361 Information Theory – Homework 1 (Source: Digital Communications: Fundamentals and Applications, Bernard Sklar)

1. A discrete source generates three independent symbols A, B, and C with probabilities 0.9, 0.08 and 0.02, respectively. Determine the entropy of the source.
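Optional numerical check, a minimal Python sketch (not part of the assignment) that evaluates the standard entropy formula H(X) = -Σ pᵢ log₂ pᵢ over the probabilities stated above:

```python
# Entropy of a memoryless source: H(X) = -sum(p * log2(p)), in bits/symbol.
import math

probs = [0.9, 0.08, 0.02]                      # symbol probabilities from the statement
H = -sum(p * math.log2(p) for p in probs)
print(f"H(X) = {H:.4f} bits/symbol")
```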

2. Design a binary Huffman code for a discrete source of three independent symbols A, B, and C with probabilities 0.9, 0.08 and 0.02, respectively. Determine the average code length for the code.
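Optional numerical check, a minimal sketch of a binary Huffman construction using Python's heapq (the helper function is illustrative, not taken from the textbook); it also reports the average code length Σ pᵢ ℓᵢ:

```python
# Binary Huffman code for three symbols and its average code length.
import heapq
import itertools
import math

def huffman(symbol_probs):
    tie = itertools.count()                    # tie-breaker for equal probabilities
    heap = [(p, next(tie), {s: ""}) for s, p in symbol_probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}     # prefix one subtree with 0
        merged.update({s: "1" + w for s, w in c2.items()})  # and the other with 1
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

probs = {"A": 0.9, "B": 0.08, "C": 0.02}       # probabilities from the statement
code = huffman(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code, f"average length = {avg_len:.2f} bits/symbol")
```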

3. A discrete source generates two dependent symbols A and B with conditional probabilities P(A|A) = 0.8, P(B|A) = 0.2, P(A|B) = 0.6 and P(B|B) = 0.4.

a) Determine the probabilities of symbols A and B.

b) Determine the entropy of the source.

c) Determine the entropy of the source if the symbols were independent with the same probabilities.
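Optional numerical check covering parts a) through c): a minimal sketch that solves the two-state stationary equations, evaluates the Markov entropy rate H = Σᵢ P(i) H(X | i), and compares it with the entropy of a memoryless source having the same symbol probabilities. All transition probabilities are taken from the statement; the variable names are illustrative.

```python
import math

# Transition probabilities P(next | current) from the problem statement.
P = {("A", "A"): 0.8, ("B", "A"): 0.2, ("A", "B"): 0.6, ("B", "B"): 0.4}

# a) Stationary distribution of the two-state chain:
#    pA = pA*P(A|A) + pB*P(A|B) with pA + pB = 1  =>  pA = P(A|B) / (P(A|B) + P(B|A)).
pA = P[("A", "B")] / (P[("A", "B")] + P[("B", "A")])
pB = 1 - pA

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# b) Entropy rate of the dependent (Markov) source: average of the
#    conditional entropies, weighted by the stationary probabilities.
H_markov = pA * H([P[("A", "A")], P[("B", "A")]]) + pB * H([P[("A", "B")], P[("B", "B")]])

# c) Entropy if the symbols were independent with the same probabilities.
H_indep = H([pA, pB])

print(f"P(A) = {pA:.2f}, P(B) = {pB:.2f}")
print(f"H(Markov) = {H_markov:.4f} bits/symbol, H(independent) = {H_indep:.4f} bits/symbol")
```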

4. An input alphabet (a keyboard on a word processor) consists of 100 characters.

a) If the keystrokes are encoded by a fixed-length code, determine the required number of bits for the encoding.

b) We make the simplifying assumption that 10 of the keystrokes are equally likely, each occurring with a probability of 0.05, and that the remaining 90 keystrokes are equally likely. Determine the average number of bits required to encode this alphabet using a variable-length Huffman code.
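Optional numerical check covering parts a) and b): a minimal sketch that computes the fixed-length code size ⌈log₂ 100⌉ and builds a binary Huffman code for the distribution in part b) (10 keystrokes at 0.05 each, 90 keystrokes sharing the remaining 0.5 equally) to obtain its average length. The builder tracks only codeword lengths and is illustrative, not from the textbook.

```python
import heapq
import itertools
import math

# a) Fixed-length code for 100 characters.
print("a) fixed-length code:", math.ceil(math.log2(100)), "bits per keystroke")

# b) 10 keystrokes at probability 0.05 each; the other 90 share the remaining 0.5.
probs = [0.05] * 10 + [0.5 / 90] * 90

# Huffman construction; each heap entry carries the list of
# (leaf probability, current codeword length) pairs for its subtree.
tie = itertools.count()
heap = [(p, next(tie), [(p, 0)]) for p in probs]
heapq.heapify(heap)
while len(heap) > 1:
    p1, _, leaves1 = heapq.heappop(heap)
    p2, _, leaves2 = heapq.heappop(heap)
    merged = [(p, d + 1) for p, d in leaves1 + leaves2]   # one level deeper after merging
    heapq.heappush(heap, (p1 + p2, next(tie), merged))

avg_len = sum(p * d for p, d in heap[0][2])
print(f"b) Huffman average length ≈ {avg_len:.3f} bits per keystroke")
```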

5. A given source alphabet consists of 300 words, of which 15 occur with probability 0.06 each and the remaining 285 words occur with probability 0.00035 each. If 1000 words are transmitted each second, what is the average rate of information transmission?
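Optional numerical check: a minimal sketch that multiplies the per-word entropy by the word rate. The probabilities and the 1000 words/s rate are used exactly as given in the statement.

```python
import math

probs = [0.06] * 15 + [0.00035] * 285              # word probabilities as stated
H_word = -sum(p * math.log2(p) for p in probs)     # bits per word
words_per_s = 1000
print(f"H ≈ {H_word:.3f} bits/word, information rate ≈ {words_per_s * H_word:.0f} bits/s")
```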

6. Find the average capacity in bits per second that would be required to transmit a high-resolution black-and-white TV signal at a rate of 32 frames per second if each picture is made up of 2 × 10⁶ picture elements and 16 different brightness levels. All picture elements are assumed to be independent, and all levels occur with equal likelihood.
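Optional numerical check: with independent, equally likely elements, each picture element carries log₂ 16 = 4 bits, so the required rate is (frames/s) × (elements/frame) × (bits/element). A minimal sketch:

```python
import math

frames_per_s = 32
elements = 2e6                     # picture elements per frame
levels = 16                        # equally likely brightness levels
bits_per_element = math.log2(levels)
capacity = frames_per_s * elements * bits_per_element
print(f"required capacity ≈ {capacity:.3e} bits/s")
```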

7. For color TV, the system of the previous problem additionally provides 64 different shades of color. How much more capacity is required for the color system compared with the black-and-white system? Find the required capacity if 100 of the possible brightness-color combinations occur with a probability of 0.001 each, and 624 of the combinations occur with a probability of 0.00064 each.
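Optional numerical check: a minimal sketch comparing the equally likely color case (16 × 64 brightness-color combinations) with the black-and-white system, and then evaluating the entropy sum over the non-uniform probabilities exactly as they are listed in the statement:

```python
import math

frames_per_s = 32
elements = 2e6                           # picture elements per frame

bw_bits = math.log2(16)                  # black-and-white: 16 brightness levels
color_bits = math.log2(16 * 64)          # 16 brightness levels x 64 color shades
extra = frames_per_s * elements * (color_bits - bw_bits)
print(f"extra capacity (equally likely combinations) ≈ {extra:.3e} bits/s")

# Non-uniform case; probabilities taken verbatim from the problem statement.
probs = [0.001] * 100 + [0.00064] * 624
H = -sum(p * math.log2(p) for p in probs)          # bits per picture element
print(f"H ≈ {H:.3f} bits/element, capacity ≈ {frames_per_s * elements * H:.3e} bits/s")
```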