# Information Theory MCQs for ESE

In this section, you will find all types of Information Theory MCQs for ESE that were asked in previous years’ papers or that may be asked in upcoming exams.

## Information Theory MCQs for ESE

1. A source delivers symbols X1, X2, X3 and X4 with probabilities 1/2, 1/4, 1/8 and 1/8 respectively. The entropy of the system is

• (a) 1.75 bits per second
• (b) 1.75 bits per symbol
• (c) 1.75 symbols per second
• (d) 1.75 symbols per bit

Answer: (b) 1.75 bits per symbol

Explanation:

H(x)=\sum_{i = 1}^{4}p(i)\log_{2}\frac{1}{p(i)},

H(x)=\frac{1}{2}\log_{2}2+\frac{1}{4}\log_{2}4+\frac{1}{8}\log_{2}8+\frac{1}{8}\log_{2}8

H(x) = 1.75 bits per symbol
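As a quick numerical check, the entropy sum above can be evaluated directly (a Python sketch):

```python
from math import log2

# Probabilities of symbols X1..X4 from the question
probs = [1/2, 1/4, 1/8, 1/8]

# H(X) = sum over i of p(i) * log2(1/p(i))
H = sum(p * log2(1/p) for p in probs)
print(H)  # 1.75 bits per symbol
```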

2. In a single error correcting Hamming code, the number of message bits in a block is 26. The number of check bits in the block would be

• (a) 3
• (b) 4
• (c) 5
• (d) 7

Answer: (c) 5

Explanation: In a Hamming code with k check bits and n message bits,

2^k - 1 - k \geq n \Rightarrow 2^k - 1 - k \geq 26 \; \Rightarrow k = 5
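The smallest k satisfying 2^k - 1 - k ≥ 26 can be found with a short search (a Python sketch; the function name is illustrative):

```python
def check_bits(message_bits):
    """Smallest k with 2**k - 1 - k >= message_bits
    (single-error-correcting Hamming code)."""
    k = 1
    while 2**k - 1 - k < message_bits:
        k += 1
    return k

print(check_bits(26))  # 5, since 2**5 - 1 - 5 = 26
```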

3. To permit the selection of 1 out of 16 equiprobable events, the number of bits required is

• (a) 2
• (b) log10 16
• (c) 8
• (d) 4

Answer: (d) 4

Explanation: Number of bits required, n = \log_{2}16 = 4

4. The code which provides for parity check is

• (a) Baudot
• (b) ASCII
• (c) EBCDIC
• (d) Excess-3

5. Which one of the following statements is correct? Shannon’s channel capacity formula indicates that in theory

• (a) by using proper channel codes, we can get error-free transmission on a noisy channel.
• (b) it is not possible to have error-free transmission on a noisy channel, since there will always be some error in the detected signal for finite noise on any channel.
• (c) it is true only for some wired channels and not wireless channels.
• (d) it works only for analog signals and not for digital signals on any channel.

Answer: (a) by using proper channel codes, we can get error-free transmission on a noisy channel.

Explanation: Shannon’s noisy-channel coding theorem states that as long as the transmission rate is below the channel capacity C = BW \log_{2}\left(1 + \frac{S}{N}\right), suitable channel codes can make the error probability arbitrarily small, i.e. error-free transmission is possible in theory.

6. A communication channel has a bandwidth of 100 MHz. The channel is extremely noisy such that signal power is very much below the noise power. What is the capacity of this channel?

• (a) 100 Mbps
• (b) 50 Mbps
• (c) 2400 bps
• (d) Nearly 0 bps

Answer: (d) Nearly 0 bps

Explanation: Channel capacity, C = BW \log_{2}\left(1 + \frac{S}{N}\right)

For an extremely noisy channel, S/N is almost zero, so C ≈ 0 bps.
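As a numerical check, the capacity formula can be evaluated for shrinking S/N (a Python sketch; the S/N values are illustrative):

```python
from math import log2

B = 100e6  # bandwidth in Hz (100 MHz)

def capacity(snr):
    """Shannon capacity C = B * log2(1 + S/N) in bits per second."""
    return B * log2(1 + snr)

# As S/N -> 0, log2(1 + S/N) ~ (S/N)/ln(2), so C -> 0
for snr in (1.0, 1e-3, 1e-6, 1e-9):
    print(f"S/N = {snr:g}: C = {capacity(snr):.3f} bps")
```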

7. A source produces 26 symbols with equal probability. What is the average information produced by this source?

• (a) < 4 bits/symbol
• (b) 6 bits/symbol
• (c) 8 bits/symbol
• (d) Between 4 and 6 bits/symbol

Answer: (d) Between 4 and 6 bits/symbol

Explanation: For equiprobable symbols, the average information H(x) = \log_{2}M, where M is the number of symbols.

H(x) = \log_{2}26 ≈ 4.7 bits/symbol

8. In order to permit the selection of 1 out of 16 equiprobable events, what is the number of bits required?

• (a) 8
• (b) 4
• (c) log10 16
• (d) 2

Answer: (b) 4

Explanation: For equiprobable symbols, the average information H(x) = \log_{2}M, where M is the number of symbols.

H(x) = \log_{2}16 = 4 bits/symbol

9. Which one of the following is the code that is very close to trellis-coded modulation?

• (a) Combines analog and digital modulation
• (b) Combines modulation and encoding
• (c) Encodes following trellis diagram
• (d) Combines amplitude and frequency modulation

Answer: (c) Encodes following trellis diagram

10. A discrete source produces 8 symbols and is memoryless. Its entropy is

• (a) 1 bit/symbol only
• (b) 2 bits/symbol only
• (c) 3 bits/symbol only
• (d) ≤ 3 bits/symbol

Answer: (d) ≤ 3 bits/symbol

Explanation: Entropy is maximum when all symbols are equiprobable: H(x)_{max} = \log_{2}8 = 3 bits/symbol.

Since the symbols are not stated to be equiprobable, the entropy satisfies H(x) ≤ 3 bits/symbol
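This bound can be checked numerically by comparing a uniform 8-symbol source with a skewed one (a Python sketch; the skewed probabilities are illustrative):

```python
from math import log2

def entropy(probs):
    """H(X) = sum p * log2(1/p), skipping zero-probability symbols."""
    return sum(p * log2(1/p) for p in probs if p > 0)

uniform = [1/8] * 8
skewed = [1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, 1/128]

print(entropy(uniform))  # 3.0, the maximum for 8 symbols
print(entropy(skewed))   # strictly less than 3
```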

11. Source S1 produces 4 discrete symbols with equal probability. Source S2 produces 6 discrete symbols with equal probability. If H1 and H2 are the entropies of sources S1 and S2 respectively, then which one of the following is correct?

• (a) H1 is always less than H2
• (b) H1 is always greater than H2
• (c) H1 is always equal to H2
• (d) H2 is 1.5 times H1 only

Answer: (a) H1 is always less than H2

Explanation: H1 = \log_{2}4 = 2 bits/symbol and H2 = \log_{2}6 ≈ 2.585 bits/symbol, so H1 < H2.

12. The entropy of a digital source is 2.7 bits/symbol. It is producing 100 different symbols per second. The source is likely to be which one of the following?

• (a) A binary source
• (b) A quaternary source
• (c) An octal source

Answer: (c) An octal source

Explanation: For an M-symbol source, H \leq \log_{2}M, so 2.7 \leq \log_{2}M \Rightarrow M \geq 2^{2.7} \approx 6.5. A binary source can give at most 1 bit/symbol and a quaternary source at most 2 bits/symbol, so only an octal source (at most \log_{2}8 = 3 bits/symbol) can produce 2.7 bits/symbol. The rate of 100 symbols per second does not affect the entropy per symbol.
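Since H \leq \log_{2}M for an M-symbol source, the minimum alphabet size for 2.7 bits/symbol can be computed directly (a Python sketch):

```python
from math import ceil, log2

H = 2.7  # entropy in bits/symbol

# Smallest integer M with log2(M) >= H, i.e. M >= 2**H
M_min = ceil(2 ** H)
print(M_min)  # 7 -> among the options, only an octal (8-symbol) source suffices
```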

13. A good line code should have which of the following?

1. Favorable PSD
2. Low intersymbol interference
4. Transparency

Select the correct answer using the code given below:

• (a) 1, 3 and 4
• (b) 1, 2 and 4
• (c) 2, 3 and 4
• (d) 1, 2 and 3

14. Consider the following codes:

1. Hamming code
2. Huffman code
3. Shannon-Fano code
4. Convolutional code

Which of these are source codes?

• (a) 1 and 2 only
• (b) 2 and 3 only
• (c) 3 and 4 only
• (d) 1, 2, 3 and 4

Answer: (b) 2 and 3 only

Explanation: Hamming codes and convolutional codes are error-correcting (channel) codes, not source codes.

15. The average information associated with an extremely unlikely message is zero. What is the average information associated with an extremely likely message?

• (a) Zero
• (b) Infinity
• (c) Depends on the total number of messages
• (d) Depends on the speed of transmission of the message

Answer: (a) Zero

Explanation: The average information contributed by a message of probability p is p\log_{2}(1/p). As p \to 1, \log_{2}(1/p) \to 0, so the average information of an extremely likely message tends to zero, just as it does for an extremely unlikely one (p \to 0).

Hello friends, my name is Trupal Bhavsar. I am the writer and founder of this blog. I am an Electronics Engineer (2014 pass-out), currently working as a Junior Telecom Officer (B.S.N.L.). I also do project development, PCB designing, and teaching of electronics subjects.
