Fundamentals of Information Theory
Practice Questions
Marks 1
1

Consider an additive white Gaussian noise (AWGN) channel with bandwidth $W$ and noise power spectral density $\frac{N_0}{2}$. Let $P_{av}$ denote the average transmit power constraint. Which one of the following plots illustrates the dependence of the channel capacity $C$ on the bandwidth $W$ (keeping $P_{av}$ and $N_0$ fixed)?

GATE ECE 2025
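
Solution sketch: the capacity grows monotonically with $W$ but is concave and saturates at a finite limit, which identifies the correct plot:

$$ C(W) = W\log_2\left(1 + \frac{P_{av}}{N_0 W}\right), \qquad \lim_{W\to\infty} C(W) = \frac{P_{av}}{N_0}\log_2 e $$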
2

The generator matrix of a $(6,3)$ binary linear block code is given by

$$ G=\left[\begin{array}{llllll} 1 & 0 & 0 & 1 & 0 & 1 \\ 0 & 1 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 & 1 & 0 \end{array}\right] $$

The minimum Hamming distance $d_{\min }$ between codewords equals___________ (answer in integer).

GATE ECE 2025
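
For a linear code, $d_{\min}$ equals the minimum Hamming weight over the nonzero codewords, so it can be checked by enumerating all $2^3$ messages. A minimal Python sketch, using only the $G$ given above:

```python
import itertools
import numpy as np

# Generator matrix of the (6,3) code from the question.
G = np.array([[1, 0, 0, 1, 0, 1],
              [0, 1, 0, 0, 1, 1],
              [0, 0, 1, 1, 1, 0]])

# For a linear code, d_min is the minimum Hamming weight
# over all nonzero codewords m*G (mod 2).
weights = [int(((np.array(m) @ G) % 2).sum())
           for m in itertools.product([0, 1], repeat=3) if any(m)]
print(min(weights))  # minimum Hamming distance
```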
3

A source transmits symbols from an alphabet of size 16. The maximum achievable entropy (in bits) is _______ .

GATE ECE 2024
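
Solution sketch: entropy is maximized by a uniform distribution, so for an alphabet of size 16

$$ H_{\max} = \log_2 16 = 4 \text{ bits.} $$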
4
Which one of the following graphs shows the Shannon capacity (channel capacity) in bits of a memoryless binary symmetric channel with crossover probability $P$?
GATE ECE 2017 Set 2
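
Solution sketch: the binary symmetric channel capacity is $C(P) = 1 - H_b(P)$, where $H_b$ is the binary entropy function; it equals 1 at $P = 0$ and $P = 1$ and drops to 0 at $P = 0.5$:

$$ C(P) = 1 + P\log_2 P + (1-P)\log_2(1-P) $$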
5
Let $X_1$ and $X_2$ be independent random variables. $X_1$ has mean 0 and variance 1, while $X_2$ has mean 1 and variance 4. The mutual information $I(X_1; X_2)$ between $X_1$ and $X_2$ in bits is ________________.
GATE ECE 2017 Set 1
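
Solution sketch: independence makes the joint density factorize, so

$$ I(X_1; X_2) = E\left[\log_2 \frac{p(x_1, x_2)}{p(x_1)\,p(x_2)}\right] = 0 \text{ bits.} $$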
6
An analog baseband signal, band-limited to 100 Hz, is sampled at the Nyquist rate. The samples are quantized into four message symbols that occur independently with probabilities $p_1 = p_4 = 0.125$ and $p_2 = p_3$. The information rate (bits/sec) of the message source is ____________________
GATE ECE 2016 Set 3
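
A quick numeric check in Python, with $p_2 = p_3 = 0.375$ inferred from the probabilities summing to 1:

```python
from math import log2

fs = 2 * 100                          # Nyquist rate for a 100 Hz signal, samples/s
p = [0.125, 0.375, 0.375, 0.125]      # p2 = p3 = (1 - 0.25) / 2
H = -sum(pi * log2(pi) for pi in p)   # source entropy, bits/symbol
print(fs * H)                         # information rate, ~362 bits/s
```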
7
A discrete memoryless source has an alphabet $(a_1, a_2, a_3, a_4)$ with corresponding probabilities $\left(\frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \frac{1}{8}\right)$. The minimum required average codeword length in bits to represent this source for error-free reconstruction is __________
GATE ECE 2016 Set 2
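
Solution sketch: the probabilities are dyadic, so a Huffman code meets the entropy bound exactly and the minimum average length equals

$$ H = \tfrac{1}{2}\log_2 2 + \tfrac{1}{4}\log_2 4 + 2 \cdot \tfrac{1}{8}\log_2 8 = 1.75 \text{ bits.} $$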
8
A source alphabet consists of $N$ symbols with the probability of the first two symbols being the same. A source encoder increases the probability of the first symbol by a small amount $\varepsilon$ and decreases that of the second by $\varepsilon$. After encoding, the entropy of the source
GATE ECE 2012
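
Solution sketch: only the two perturbed terms of the entropy change, and their derivative with respect to $\varepsilon$ is negative, so the entropy decreases:

$$ \frac{d}{d\varepsilon}\left[-(p+\varepsilon)\log_2(p+\varepsilon) - (p-\varepsilon)\log_2(p-\varepsilon)\right] = \log_2\frac{p-\varepsilon}{p+\varepsilon} < 0 \quad \text{for } \varepsilon > 0. $$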
9
In a baseband communication link, frequencies up to 3500 Hz are used for signaling. Using a raised cosine pulse with 75% excess bandwidth and no inter-symbol interference, the maximum possible signaling rate in symbols per second is
GATE ECE 2012
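
Solution sketch: a raised cosine pulse with roll-off $\alpha$ occupies bandwidth $(1+\alpha)R_s/2$, so

$$ R_s = \frac{2B}{1+\alpha} = \frac{2 \times 3500}{1.75} = 4000 \text{ symbols/s.} $$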
10
An analog signal is band-limited to 4 kHz, sampled at the Nyquist rate, and the samples are quantized into 4 levels. The quantized levels are assumed to be independent and equally probable. The bit rate is
GATE ECE 2011
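
Solution sketch:

$$ R_b = \underbrace{2 \times 4000}_{\text{samples/s}} \times \underbrace{\log_2 4}_{\text{bits/sample}} = 16 \text{ kbps.} $$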
Marks 2
1

The random variable $X$ takes values in $\{-1,0,1\}$ with probabilities $P(X=-1)=P(X=1)=\alpha$ and $P(X=0)=1-2 \alpha$, where $0<\alpha<\frac{1}{2}$.

Let $g(\alpha)$ denote the entropy of $X$ (in bits), parameterized by $\alpha$. Which of the following statements is/are TRUE?

GATE ECE 2025
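
From the pmf, $g(\alpha) = -2\alpha\log_2\alpha - (1-2\alpha)\log_2(1-2\alpha)$. A small Python sketch for probing the claimed properties numerically:

```python
from math import log2

def g(alpha: float) -> float:
    """Entropy (bits) of X with P(X=-1) = P(X=1) = alpha, P(X=0) = 1 - 2*alpha."""
    return -2 * alpha * log2(alpha) - (1 - 2 * alpha) * log2(1 - 2 * alpha)

# g is concave in alpha and peaks at alpha = 1/3, where X is uniform on {-1, 0, 1}.
print(g(1 / 3))  # log2(3) ~ 1.585 bits
```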
2

$X$ and $Y$ are Bernoulli random variables taking values in $\{0,1\}$. The joint probability mass function of the random variables is given by:

$$ \begin{aligned} & P(X=0, Y=0)=0.06 \\ & P(X=0, Y=1)=0.14 \\ & P(X=1, Y=0)=0.24 \\ & P(X=1, Y=1)=0.56 \end{aligned} $$

The mutual information $I(X ; Y)$ is ___________(rounded off to two decimal places).

GATE ECE 2025
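
A direct computation in Python; note that the marginals here factorize the joint pmf, so the mutual information vanishes:

```python
from math import log2

pxy = {(0, 0): 0.06, (0, 1): 0.14, (1, 0): 0.24, (1, 1): 0.56}
px = {x: sum(p for (a, _), p in pxy.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in pxy.items() if b == y) for y in (0, 1)}

# I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
I = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())
print(round(I, 2))  # ~0.00: X and Y are independent here
```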
3

The frequency of occurrence of 8 symbols (a-h) is shown in the table below. A symbol is chosen, and it is to be identified by asking a series of "yes/no" questions that are assumed to be answered truthfully. The average number of questions needed to determine the chosen symbol, when the questions are asked in the most efficient sequence, is _____________ (rounded off to two decimal places).

| Symbol | a | b | c | d | e | f | g | h |
|--------|---|---|---|---|---|---|---|---|
| Frequency of occurrence | $\frac{1}{2}$ | $\frac{1}{4}$ | $\frac{1}{8}$ | $\frac{1}{16}$ | $\frac{1}{32}$ | $\frac{1}{64}$ | $\frac{1}{128}$ | $\frac{1}{128}$ |

GATE ECE 2023
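
Solution sketch in Python: the frequencies are dyadic, so the optimal question tree is the Huffman tree, which asks exactly $-\log_2 p$ questions per symbol; the average therefore equals the source entropy:

```python
from math import log2

p = [1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, 1/128]
avg_questions = sum(pi * -log2(pi) for pi in p)  # entropy of a dyadic source
print(round(avg_questions, 2))  # 1.98
```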
4

The transition diagram of a discrete memoryless channel with three input symbols and three output symbols is shown in the figure. The transition probabilities are as marked. The parameter $\alpha$ lies in the interval $[0.25, 1]$. The value of $\alpha$ for which the capacity of this channel is maximized is __________ (rounded off to two decimal places).

[Figure: transition diagram of the three-input, three-output channel with marked transition probabilities]

GATE ECE 2022
5

Consider communication over a memoryless binary symmetric channel using a (7, 4) Hamming code. Each transmitted bit is received correctly with probability $(1-\varepsilon)$, and flipped with probability $\varepsilon$. For each codeword transmission, the receiver performs minimum Hamming distance decoding, and correctly decodes the message bits if and only if the channel introduces at most one bit error. For $\varepsilon = 0.1$, the probability that a transmitted codeword is decoded correctly is __________ (rounded off to two decimal places).

GATE ECE 2022
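
Solution sketch: decoding succeeds if and only if the channel flips at most one of the 7 bits,

$$ P_c = (1-\varepsilon)^7 + \binom{7}{1}\varepsilon(1-\varepsilon)^6 = 0.9^7 + 7(0.1)(0.9)^6 \approx 0.85. $$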
6
Consider a binary memoryless channel characterized by the transition probability diagram shown in the figure.

[Figure: transition probability diagram of the channel]

The channel is
GATE ECE 2017 Set 2
7
A voice-grade AWGN (additive white Gaussian noise) telephone channel has a bandwidth of 4.0 kHz and two-sided noise power spectral density $\frac{\eta}{2} = 2.5 \times 10^{-5}$ Watt per Hz. If information at the rate of 52 kbps is to be transmitted over this channel with arbitrarily small bit error rate, then the minimum bit energy $E_b$ (in mJ/bit) necessary is ________________
GATE ECE 2016 Set 3
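
With one-sided PSD $\eta = 5 \times 10^{-5}$ W/Hz and received power $P = E_b R$, setting the Shannon capacity equal to the target rate gives the minimum bit energy. A quick Python check:

```python
W = 4000.0     # bandwidth, Hz
eta = 5e-5     # one-sided noise PSD, W/Hz (two-sided eta/2 = 2.5e-5)
R = 52000.0    # required rate, bits/s

# Capacity constraint R = W * log2(1 + Eb*R / (eta*W)), solved for Eb:
Eb = (2 ** (R / W) - 1) * eta * W / R
print(Eb * 1e3)  # ~31.5 mJ/bit
```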
8
A binary communication system makes use of the symbols “zero” and “one”. There are channel errors. Consider the following events:
$x_0$: a "zero" is transmitted
$x_1$: a "one" is transmitted
$y_0$: a "zero" is received
$y_1$: a "one" is received

The following probabilities are given:
$P(x_0) = \frac{3}{4}$, $P(y_0 \mid x_0) = \frac{1}{2}$, and $P(y_0 \mid x_1) = \frac{1}{2}$.
The information in bits that you obtain when you learn which symbol has been received (while you know that a " zero " has been transmitted) is _____________

GATE ECE 2016 Set 2
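
Solution sketch: given $x_0$, the received symbol is $y_0$ or $y_1$ with probability $\frac{1}{2}$ each, so the information gained is the conditional entropy

$$ H(Y \mid x_0) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit.} $$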
9
Consider a discrete memoryless source with alphabet $S = \{s_0, s_1, s_2, s_3, s_4, \ldots\}$ and respective probabilities of occurrence $P = \left\{\frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \frac{1}{16}, \frac{1}{32}, \ldots\right\}$. The entropy of the source (in bits) is __________.
GATE ECE 2016 Set 1
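
Solution sketch: with $P(s_k) = 2^{-(k+1)}$,

$$ H = \sum_{k=0}^{\infty} \frac{k+1}{2^{k+1}} = \sum_{n=1}^{\infty} \frac{n}{2^n} = 2 \text{ bits.} $$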
10
A fair coin is tossed repeatedly until a 'Head' appears for the first time. Let $L$ be the number of tosses to get this first 'Head'. The entropy $H(L)$ in bits is _______________.
GATE ECE 2014 Set 1
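
Here $P(L = k) = 2^{-k}$, the same dyadic series as in the previous question. A numeric check with a truncated sum (the tail beyond $k = 60$ is negligible):

```python
# H(L) = sum over k of P(L=k) * k, since -log2 P(L=k) = k bits
H = sum((0.5 ** k) * k for k in range(1, 61))
print(H)  # ~2.0 bits
```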
11
Consider the Z-channel given in the figure. The input is 0 or 1 with equal probability.

[Figure: Z-channel transition diagram]

If the output is 0, the probability that the input is also 0 equals ____________________
GATE ECE 2014 Set 4
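
Without the figure the transition probabilities cannot be restated here, but the required quantity follows from Bayes' rule with $P(X=0) = P(X=1) = \frac{1}{2}$ and the conditionals read off the diagram:

$$ P(X=0 \mid Y=0) = \frac{P(Y=0 \mid X=0)\,P(X=0)}{P(Y=0 \mid X=0)\,P(X=0) + P(Y=0 \mid X=1)\,P(X=1)}. $$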
12
The capacity of a band-limited additive white Gaussian noise (AWGN) channel is given by $C = W\log_2\left(1 + \frac{P}{\sigma^2 W}\right)$ bits per second (bps), where $W$ is the channel bandwidth, $P$ is the average power received and $\sigma^2$ is the one-sided power spectral density of the AWGN.
For a fixed $\frac{P}{\sigma^2} = 1000$, the channel capacity (in kbps) with infinite bandwidth $(W \to \infty)$ is approximately
GATE ECE 2014 Set 2
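
Solution sketch:

$$ \lim_{W\to\infty} W\log_2\left(1 + \frac{P}{\sigma^2 W}\right) = \frac{P}{\sigma^2}\log_2 e \approx 1443 \text{ bps} \approx 1.44 \text{ kbps.} $$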
13
A communication channel with AWGN operating at a signal-to-noise ratio $\mathrm{SNR} \gg 1$ and bandwidth $B$ has capacity $C_1$. If the SNR is doubled keeping $B$ constant, the resulting capacity $C_2$ is given by
GATE ECE 2009
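
Solution sketch: for $\mathrm{SNR} \gg 1$, $C \approx B\log_2(\mathrm{SNR})$, so doubling the SNR increases the capacity by $B$:

$$ C_2 \approx B\log_2(2\,\mathrm{SNR}) = B\log_2(\mathrm{SNR}) + B = C_1 + B. $$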
14
A memoryless source emits $n$ symbols, each with probability $p$. The entropy of the source as a function of $n$
GATE ECE 2008
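
Solution sketch: the probabilities must sum to 1, forcing $p = 1/n$, so

$$ H = -\sum_{i=1}^{n} \frac{1}{n}\log_2\frac{1}{n} = \log_2 n, $$

which grows logarithmically with $n$.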
15
A source generates three symbols with probabilities 0.25, 0.25, 0.50 at a rate of 3000 symbols per second. Assuming independent generation of symbols, the most efficient source encoder would have average bit rate as
GATE ECE 2006
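
Solution sketch: the most efficient encoder operates at the source entropy,

$$ H = 2 \times 0.25\log_2 4 + 0.5\log_2 2 = 1.5 \ \tfrac{\text{bits}}{\text{symbol}}, \qquad R = 1.5 \times 3000 = 4500 \ \tfrac{\text{bits}}{\text{s}}. $$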
16
A video transmission system transmits 625 picture frames per second. Each frame consists of a $400 \times 400$ pixel grid with 64 intensity levels per pixel. The data rate of the system is
GATE ECE 2001
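
Solution sketch: each pixel needs $\log_2 64 = 6$ bits, so

$$ R = 625 \times 400 \times 400 \times 6 = 600 \text{ Mbps.} $$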
17
A binary source has symbol probabilities 0.8 and 0.2. If extension coding (blocks of 4 symbols) is used, the lower and upper bounds on the average codeword length are:
(a) lower ___________
(b) upper ___________
GATE ECE 1991
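
By the source coding theorem applied to the 4th extension of the source, $4H \le \bar{L} < 4H + 1$ per block. A quick Python check:

```python
from math import log2

p = [0.8, 0.2]
H = -sum(pi * log2(pi) for pi in p)  # ~0.7219 bits/symbol
n = 4                                # extension (block) length
print(n * H, n * H + 1)              # bounds per 4-symbol block: ~2.89, ~3.89
```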
18
An image uses $512 \times 512$ picture elements. Each of the picture elements can take any of 8 distinguishable intensity levels. The maximum entropy of the above image will be
GATE ECE 1990
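
Solution sketch: maximum entropy assumes all 8 levels are equally likely for every pixel,

$$ H_{\max} = 512 \times 512 \times \log_2 8 = 786432 \text{ bits.} $$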
19
A source produces 4 symbols with probabilities $\frac{1}{2}, \frac{1}{4}, \frac{1}{8}$, and $\frac{1}{8}$. For this source, a practical coding scheme has an average codeword length of 2 bits/symbol. The efficiency of the code is:
GATE ECE 1989
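
Solution sketch: the source entropy is $H = 1.75$ bits/symbol (the same dyadic computation as in the Marks 1 section), so

$$ \eta = \frac{H}{\bar{L}} = \frac{1.75}{2} = 87.5\%. $$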