In this section, you will find Information Theory MCQs for ESE drawn from previous years' papers, together with questions of the kind that may appear in upcoming exams.
| Sr. No | Topic Wise Communication Systems MCQs for ESE |
| --- | --- |
| 1. | Analog Communication Systems MCQs for ESE |
| 2. | Random Variables MCQs for ESE |
| 3. | Digital Communication MCQs for ESE |
| 4. | Information Theory MCQs for ESE |
Information Theory MCQs for ESE
1. A source delivers symbols X1, X2, X3 and X4 with probabilities 1/2, 1/4, 1/8 and 1/8 respectively. The entropy of the system is
- (a) 1.75 bits per second
- (b) 1.75 bits per symbol
- (c) 1.75 symbols per second
- (d) 1.75 symbols per bit
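The entropy follows directly from H = Σ pᵢ log₂(1/pᵢ): (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/8)(3) = 1.75 bits per symbol (entropy is measured per symbol, not per second). A minimal Python check of this arithmetic:

```python
from math import log2

# Probabilities of X1, X2, X3, X4 from the question
p = [1/2, 1/4, 1/8, 1/8]

# H = sum of p_i * log2(1/p_i), in bits per symbol
H = sum(pi * log2(1/pi) for pi in p)
print(H)  # 1.75
```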
2. In a single error correcting Hamming code, the number of message bits in a block is 26. The number of check bits in the block would be
- (a) 3
- (b) 4
- (c) 5
- (d) 7
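A single-error-correcting Hamming code with m message bits and k check bits must satisfy 2^k ≥ m + k + 1. A short sketch that finds the smallest such k for m = 26:

```python
# Smallest k with 2**k >= m + k + 1 (single-error-correcting Hamming bound)
m = 26
k = 1
while 2**k < m + k + 1:
    k += 1
print(k)  # 5, since 2**5 = 32 >= 26 + 5 + 1 = 32
```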
3. To permit the selection of 1 out of 16 equiprobable events, the number of bits required is
- (a) 2
- (b) log10 16
- (c) 8
- (d) 4
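Choosing 1 of N equiprobable events requires log₂ N bits, so 16 events need log₂ 16 = 4 bits (the same formula settles Question 8 below). A one-line check:

```python
from math import log2
print(log2(16))  # 4.0 bits
```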
4. The code which provides for parity check is
- (a) Baudot
- (b) ASCII
- (c) EBCDIC
- (d) Excess-3
5. Which one of the following statements is correct? Shannon’s channel capacity formula indicates that in theory
- (a) by using proper channel codes, we can get error-free transmission on a noisy channel.
- (b) it is not possible to have error-free transmission on a noisy channel, since there will always be some error in the detected signal for finite noise on any channel.
- (c) it is true only for some wired channels and not for wireless channels.
- (d) it works only for analog signals and not for digital signals on any channel.
6. A communication channel has a bandwidth of 100 MHz. The channel is extremely noisy, with the signal power very much below the noise power. What is the capacity of this channel?
- (a) 100 Mbps
- (b) 50 Mbps
- (c) 2400 bps
- (d) Nearly 0 bps
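By Shannon's formula C = B·log₂(1 + S/N), capacity collapses toward zero as S/N → 0, no matter how large the bandwidth. A sketch with an assumed S/N far below unity (the value 10⁻⁶ is illustrative, not from the question):

```python
from math import log2

B = 100e6    # bandwidth in Hz
snr = 1e-6   # assumed signal-to-noise ratio, far below 1
C = B * log2(1 + snr)
print(C)     # ~144 bps -- negligible next to 100 Mbps, i.e. nearly 0
```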
7. A source produces 26 symbols with equal probability. What is the average information produced by this source?
- (a) < 4 bits/symbol
- (b) 6 bits/symbol
- (c) 8 bits/symbol
- (d) Between 4 and 6 bits/symbol
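For M equiprobable symbols the average information is log₂ M bits per symbol, and log₂ 26 ≈ 4.7 lies between 4 and 6. A quick check:

```python
from math import log2
print(log2(26))  # ~4.70 bits/symbol
```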
8. In order to permit the selection of 1 out of 16 equiprobable events, what is the number of bits required?
- (a) 8
- (b) 4
- (c) log10 16
- (d) 2
9. Which one of the following best describes trellis-coded modulation?
- (a) Combines analog and digital modulation
- (b) Combines modulation and encoding
- (c) Encodes following trellis diagram
- (d) Combines amplitude and frequency modulation
10. A discrete memoryless source produces 8 symbols. Its entropy is
- (a) 1 bit/symbol only
- (b) 2 bits/symbol only
- (c) 3 bits/symbol only
- (d) ≤ 3 bits/symbol
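Entropy peaks when all symbols are equally likely, so an 8-symbol memoryless source obeys H ≤ log₂ 8 = 3 bits per symbol, with equality only in the uniform case. A sketch comparing the uniform distribution with a skewed one (the skewed probabilities are invented for illustration):

```python
from math import log2

def entropy(p):
    return sum(pi * log2(1/pi) for pi in p if pi > 0)

print(entropy([1/8] * 8))  # 3.0 bits/symbol, the maximum
print(entropy([0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03]))  # below 3
```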
11. Source S1 produces 4 discrete symbols with equal probability. Source S2 produces 6 discrete symbols with equal probability. If H1 and H2 are the entropies of sources S1 and S2 respectively, then which one of the following is correct?
- (a) H1 is always less than H2
- (b) H1 is always greater than H2
- (c) H1 is always equal to H2
- (d) H2 is 1.5 times H1 only
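With equiprobable symbols, H1 = log₂ 4 = 2 bits/symbol and H2 = log₂ 6 ≈ 2.585 bits/symbol, so H1 is always less than H2 (and the ratio is about 1.29, not 1.5). A quick check:

```python
from math import log2
print(log2(4), log2(6))  # 2.0 vs ~2.585 -> H1 < H2
```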
12. The entropy of a digital source is 2.7 bits/symbol. It is producing 100 different symbols per second. The source is likely to be which one of the following?
- (a) A binary source
- (b) A quaternary source
- (c) An octal source
- (d) A hexadecimal source
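An M-symbol source cannot exceed log₂ M bits per symbol, so 2.7 bits/symbol rules out a binary source (max 1 bit) and a quaternary source (max 2 bits); an octal source, with a maximum of log₂ 8 = 3 bits, is the smallest alphabet consistent with the figure. A sketch of that elimination:

```python
from math import log2

H = 2.7  # given entropy in bits/symbol
# Find the smallest standard alphabet whose maximum entropy log2(M)
# can accommodate the observed 2.7 bits/symbol.
for name, M in [("binary", 2), ("quaternary", 4), ("octal", 8), ("hexadecimal", 16)]:
    if log2(M) >= H:
        print(name)  # octal
        break
```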
13. A good line code should have which of the following?
1. Favorable PSD
2. Low intersymbol interference
3. Adequate timing content
4. Transparency
Select the correct answer using the code given below:
- (a) 1, 3 and 4
- (b) 1, 2 and 4
- (c) 2, 3 and 4
- (d) 1, 2 and 3
14. Consider the following codes:
1. Hamming code
2. Huffman code
3. Shannon-Fano code
4. Convolutional code
Which of these are source codes?
- (a) 1 and 2 only
- (b) 2 and 3 only
- (c) 3 and 4 only
- (d) 1, 2, 3 and 4
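Huffman and Shannon-Fano codes are source (compression) codes, whereas Hamming and convolutional codes are channel (error-control) codes. As an illustration of source coding, here is a minimal Huffman construction sketch; the symbol probabilities are invented for the example:

```python
import heapq

def huffman(probs):
    """Build Huffman codewords for a {symbol: probability} map."""
    # Heap entries: (probability, tiebreaker, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees, prefixing 0 and 1
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

print(huffman({"X1": 0.5, "X2": 0.25, "X3": 0.125, "X4": 0.125}))
# e.g. {'X1': '0', 'X2': '10', 'X3': '110', 'X4': '111'}
# Average length 1.75 bits/symbol, matching the entropy in Question 1.
```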
15. The average information associated with an extremely unlikely message is zero. What is the average information associated with an extremely likely message?
- (a) Zero
- (b) Infinity
- (c) Depends on the total number of messages
- (d) Depends on the speed of transmission of the message
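The average information contributed by a message of probability p is p·log₂(1/p), which tends to zero both as p → 0 and as p → 1: a message that is almost certain tells us almost nothing. A quick numerical check:

```python
from math import log2

def avg_info(p):
    return p * log2(1/p)

print(avg_info(1e-6))      # ~0 for an extremely unlikely message
print(avg_info(1 - 1e-6))  # ~0 for an extremely likely message
```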