Communications
Fundamentals of Information Theory
1
GATE ECE 2022
Numerical
+2
-0

Consider communication over a memoryless binary symmetric channel using a (7, 4) Hamming code. Each transmitted bit is received correctly with probability $$(1 - \varepsilon)$$ and flipped with probability $$\varepsilon$$. For each codeword transmission, the receiver performs minimum Hamming distance decoding, and correctly decodes the message bits if and only if the channel introduces at most one bit error. For $$\varepsilon = 0.1$$, the probability that a transmitted codeword is decoded correctly is __________ (rounded off to two decimal places).

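Since decoding succeeds exactly when the channel introduces zero or one bit errors among the 7 transmitted bits, the probability is the sum of two binomial terms. A quick numerical check:

```python
from math import comb

# Minimum-distance decoding of a (7, 4) Hamming code succeeds
# iff the channel flips at most one of the 7 transmitted bits.
eps = 0.1
p_correct = (1 - eps)**7 + comb(7, 1) * eps * (1 - eps)**6
print(round(p_correct, 2))  # 0.85
```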
2
GATE ECE 2017 Set 2
MCQ (Single Correct Answer)
+2
-0.6
Consider a binary memoryless channel characterized by the transition probability diagram shown in the figure. [Figure: channel transition probability diagram] The channel is
A
Lossless
B
Noiseless
C
Useless
D
Deterministic
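A channel is "useless" when $$I(X;Y) = 0$$ for every input distribution, which happens exactly when all rows of the transition matrix are identical (the output tells you nothing about the input). A sketch of that check, using a hypothetical transition matrix with identical rows since the figure's actual probabilities are not reproduced here:

```python
import numpy as np

# Mutual information I(X;Y) in bits for a discrete memoryless channel.
def mutual_information(p_x, P_y_given_x):
    p_xy = p_x[:, None] * P_y_given_x      # joint distribution P(x, y)
    p_y = p_xy.sum(axis=0)                 # output marginal P(y)
    mask = p_xy > 0
    ratio = p_xy[mask] / (p_x[:, None] * p_y)[mask]
    return float((p_xy[mask] * np.log2(ratio)).sum())

# Hypothetical transition matrix with identical rows (illustrative only;
# the actual values are in the figure).
P = np.array([[0.25, 0.75],
              [0.25, 0.75]])
print(mutual_information(np.array([0.5, 0.5]), P))  # 0.0
```

Because the rows are equal, the joint distribution factors as $$P(x)P(y)$$ and the mutual information vanishes for any input distribution, so no information can be conveyed.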
3
GATE ECE 2016 Set 2
Numerical
+2
-0
A binary communication system makes use of the symbols “zero” and “one”. There are channel errors. Consider the following events:
$${x_0}$$ : a " zero " is transmitted
$${x_1}$$ : a " one " is transmitted
$${y_0}$$ : a " zero " is received
$${y_1}$$ : a " one " is received

The following probabilities are given:
$$P({x_0}) = \,{3 \over 4},\;P\left( {\,\left. {{y_0}} \right|{x_0}} \right) = \,{1 \over 2},\ \text{and}\ P\left( {\,\left. {{y_0}} \right|{x_1}} \right) = \,{1 \over 2}$$.
The information in bits that you obtain when you learn which symbol has been received (while you know that a " zero " has been transmitted) is _____________

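The information gained on learning the received symbol, given that a "zero" was transmitted, is the conditional entropy $$H(Y \mid X = x_0)$$. Since $$P(y_0 \mid x_0) = 1/2$$, the two outcomes are equally likely. A numerical check:

```python
from math import log2

# Given x0 was transmitted, Y = y0 with probability 1/2 and y1 with 1/2.
# Information gained on learning Y is the conditional entropy H(Y | X = x0).
p = [1/2, 1/2]  # P(y0 | x0), P(y1 | x0)
h = -sum(q * log2(q) for q in p)
print(h)  # 1.0
```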
4
GATE ECE 2016 Set 1
Numerical
+2
-0
Consider a discrete memoryless source with alphabet $$S = \left\{ {{s_0},\,{s_1},\,{s_2},\,{s_3},\,{s_4},......} \right\}$$ and respective probabilities of occurrence $$P = \left\{ {{1 \over 2},\,{1 \over 4},\,{1 \over 8},\,{1 \over {16}},\,{1 \over {32}},......} \right\}$$. The entropy of the source (in bits) is__________.
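With $$p_n = 1/2^n$$, the entropy is $$H = \sum_{n \ge 1} {n \over 2^n} = 2$$ bits. A truncated numerical check (the tail beyond n = 60 is below double precision):

```python
from math import log2

# Entropy H = sum p_n * log2(1/p_n) with p_n = 1/2**n; each term is n / 2**n.
# The infinite sum converges to 2 bits.
H = sum((1 / 2**n) * log2(2**n) for n in range(1, 61))
print(round(H, 6))  # 2.0
```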