Engineering Probability Unit 20 – Signal Processing & Communication Systems
Signal processing and communication systems form the backbone of modern information technology. These fields analyze, modify, and transmit signals to extract information and enable efficient communication. Probability theory provides the mathematical framework for quantifying uncertainty and analyzing random phenomena in these systems.
Key concepts include random variables, stochastic processes, and noise models. Signal detection and estimation techniques are crucial for extracting information from noisy signals. Information theory establishes fundamental limits on communication efficiency, while performance metrics like bit error rate and channel capacity quantify system effectiveness.
Signal processing involves the analysis, modification, and synthesis of signals to extract information or enhance signal characteristics
Communication systems enable the transmission of information from a source to a destination over a channel
Probability theory provides a mathematical framework for quantifying uncertainty and analyzing random phenomena in signal processing and communication systems
Random variables are mathematical functions that map outcomes of random experiments to numerical values
Stochastic processes are collections of random variables indexed by time or space, used to model time-varying or spatially distributed signals
Noise refers to unwanted random disturbances that corrupt signals, while interference is the presence of unwanted signals that disrupt the desired signal
Signal detection involves deciding whether a signal is present or absent in the presence of noise, while signal estimation aims to determine the values of unknown signal parameters
Information theory quantifies the amount of information in a message and establishes fundamental limits on the efficiency of communication systems
Probability Theory Foundations
Probability is a measure of the likelihood of an event occurring, expressed as a number between 0 and 1
An event with probability 0 is impossible, while an event with probability 1 is certain
The sample space Ω is the set of all possible outcomes of a random experiment
Events are subsets of the sample space, representing specific outcomes or combinations of outcomes
The probability of an event A is denoted as P(A) and satisfies the axioms of probability:
Non-negativity: P(A)≥0 for all events A
Normalization: P(Ω)=1
Countable additivity: For mutually exclusive events $A_1, A_2, \ldots$, $P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)$
Conditional probability $P(A \mid B)$ is the probability of event A occurring given that event B has occurred, calculated as $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$ when $P(B) > 0$; a short simulation sketch after this list illustrates this
Independent events are events where the occurrence of one does not affect the probability of the other, satisfying P(A∩B)=P(A)P(B)
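As a minimal sketch of these definitions (the two-dice experiment and the events A and B below are illustrative choices), the following Python snippet estimates P(A), P(A|B) via P(A ∩ B)/P(B), and checks whether the chosen events are independent:

```python
# Minimal sketch: conditional probability and independence by Monte Carlo simulation.
# The experiment (two fair dice) and the events A and B are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000
die1 = rng.integers(1, 7, n_trials)   # outcomes 1..6
die2 = rng.integers(1, 7, n_trials)

A = (die1 + die2) >= 10               # event A: the sum is at least 10
B = (die1 == 6)                       # event B: the first die shows 6

p_A = A.mean()
p_B = B.mean()
p_A_and_B = (A & B).mean()
p_A_given_B = p_A_and_B / p_B         # P(A|B) = P(A ∩ B) / P(B)

print(f"P(A) = {p_A:.3f}, P(A|B) = {p_A_given_B:.3f}")
# Independence check: P(A ∩ B) equals P(A)P(B) only if A and B are independent
print(f"P(A ∩ B) = {p_A_and_B:.3f} vs P(A)P(B) = {p_A * p_B:.3f}")
```

Here P(A|B) differs noticeably from P(A), so these particular events are not independent.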
Random Variables in Signal Processing
Random variables are functions that assign numerical values to the outcomes of a random experiment
Discrete random variables take on a countable set of values, while continuous random variables can take on any value within a specified range
The probability mass function (PMF) $p_X(x)$ describes the probability distribution of a discrete random variable X, where $p_X(x) = P(X = x)$
The probability density function (PDF) $f_X(x)$ characterizes the probability distribution of a continuous random variable X, where $P(a \le X \le b) = \int_a^b f_X(x)\,dx$
The cumulative distribution function (CDF) $F_X(x)$ gives the probability that a random variable X takes on a value less than or equal to x, defined as $F_X(x) = P(X \le x)$
Important properties of random variables include:
Expected value (mean): $E[X] = \sum_x x\, p_X(x)$ for discrete X, $E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$ for continuous X
Variance: $\operatorname{Var}(X) = E[(X - E[X])^2]$, measures the spread of the distribution around the mean
Common distributions in signal processing include the Gaussian (normal) distribution, uniform distribution, and Poisson distribution
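The following sketch (with illustrative parameters, using NumPy) draws samples from a Gaussian and a Poisson distribution and compares the empirical mean and variance with the theoretical values given by the formulas above:

```python
# Sketch: empirical vs. theoretical mean and variance for two common distributions.
# The parameters (mean 2, standard deviation 3, Poisson rate 4) are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Gaussian (normal): E[X] = 2, Var(X) = 3^2 = 9
x_gauss = rng.normal(loc=2.0, scale=3.0, size=n)
print(f"Gaussian: mean {x_gauss.mean():.3f} (theory 2), var {x_gauss.var():.3f} (theory 9)")

# Poisson: E[X] = Var(X) = 4
x_pois = rng.poisson(lam=4.0, size=n)
print(f"Poisson : mean {x_pois.mean():.3f} (theory 4), var {x_pois.var():.3f} (theory 4)")
```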
Stochastic Processes
A stochastic process is a collection of random variables {X(t),t∈T} indexed by a parameter t, usually representing time or space
Stochastic processes are used to model signals that vary randomly over time or space, such as noise, interference, or time-varying channels
The mean function $\mu_X(t) = E[X(t)]$ describes the average behavior of the process at each time instant t
The autocorrelation function $R_X(t_1, t_2) = E[X(t_1) X(t_2)]$ measures the correlation between the process values at different time instants $t_1$ and $t_2$
A process is wide-sense stationary (WSS) if its mean function is constant and its autocorrelation function depends only on the time difference $\tau = t_2 - t_1$
The power spectral density (PSD) $S_X(f)$ characterizes the distribution of signal power over frequency for a WSS process, obtained as the Fourier transform of the autocorrelation function
Examples of stochastic processes include white Gaussian noise, Brownian motion, and Markov processes
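A rough sketch of these ideas for discrete-time white Gaussian noise, assuming an illustrative sampling rate fs and variance noise_power; it estimates the autocorrelation at a few lags and the PSD with scipy.signal.welch:

```python
# Sketch: autocorrelation and PSD estimates for white Gaussian noise (a WSS process).
# fs and noise_power are illustrative assumptions.
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs = 1000.0                       # sampling rate in Hz (assumed)
noise_power = 0.5                 # variance of the noise samples (assumed)
x = rng.normal(0.0, np.sqrt(noise_power), size=10_000)

# Autocorrelation estimate R_X[k]: for white noise it peaks at lag 0 (≈ noise_power)
# and is close to zero at all other lags.
lags = np.arange(-20, 21)
r = np.array([np.mean(x[:len(x) - abs(k)] * x[abs(k):]) for k in lags])

# One-sided PSD estimate via Welch's method: approximately flat for white noise,
# at a level of about 2 * noise_power / fs.
f, psd = signal.welch(x, fs=fs, nperseg=1024)

print(f"R_X[0] ≈ {r[lags == 0][0]:.3f} (expected ≈ {noise_power})")
print(f"mean PSD level ≈ {psd.mean():.2e} (expected ≈ {2 * noise_power / fs:.2e})")
```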
Noise and Interference Models
Noise is an unwanted random disturbance that corrupts signals in communication systems and signal processing applications
Thermal noise, also known as Johnson-Nyquist noise, is caused by the random motion of electrons in electronic components and is modeled as additive white Gaussian noise (AWGN)
AWGN has a constant power spectral density over all frequencies and a Gaussian probability distribution
Shot noise occurs in electronic devices due to the discrete nature of electric charge and is modeled as a Poisson process
Flicker noise, also called 1/f noise, has a power spectral density that is inversely proportional to frequency and is present in many electronic systems
Interference refers to the presence of unwanted signals that disrupt the desired signal, such as co-channel interference in wireless communications
Interference can be modeled as a deterministic signal (e.g., sinusoidal interference) or as a random process (e.g., multiple access interference in cellular networks)
The signal-to-noise ratio (SNR) and signal-to-interference ratio (SIR) are important metrics for quantifying the impact of noise and interference on signal quality
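As a small illustration of how SNR and SIR might be computed from sample power estimates (the 50 Hz desired tone, the noise level, and the 60 Hz interferer are all assumed values):

```python
# Sketch: estimating SNR and SIR in dB from time-domain power estimates.
# Signal, noise, and interferer parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
t = np.arange(n) / 1000.0                      # 1 kHz sampling rate (assumed)

desired = np.sin(2 * np.pi * 50 * t)           # desired 50 Hz tone
noise = rng.normal(0.0, 0.2, n)                # AWGN with standard deviation 0.2
interferer = 0.3 * np.sin(2 * np.pi * 60 * t)  # co-channel 60 Hz interferer

p_signal = np.mean(desired ** 2)               # average signal power
p_noise = np.mean(noise ** 2)
p_interf = np.mean(interferer ** 2)

snr_db = 10 * np.log10(p_signal / p_noise)     # SNR in decibels
sir_db = 10 * np.log10(p_signal / p_interf)    # SIR in decibels
print(f"SNR ≈ {snr_db:.1f} dB, SIR ≈ {sir_db:.1f} dB")
```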
Signal Detection and Estimation
Signal detection is the task of deciding whether a signal is present or absent in the presence of noise
The binary hypothesis testing problem formulates signal detection as a choice between two hypotheses: $H_0$ (signal absent) and $H_1$ (signal present)
The likelihood ratio test (LRT) is a common approach to signal detection, comparing the likelihood ratio $L(x) = \frac{f_{X \mid H_1}(x)}{f_{X \mid H_0}(x)}$ to a threshold to make the decision (see the detection sketch after this list)
The Neyman-Pearson lemma states that the LRT is the most powerful test for a given false alarm probability
Signal estimation aims to determine the values of unknown signal parameters from noisy observations
Common estimation techniques include maximum likelihood estimation (MLE), which chooses the parameter values that maximize the likelihood function, and least squares estimation (LSE), which minimizes the sum of squared errors
The Cramér-Rao lower bound (CRLB) provides a lower bound on the variance of any unbiased estimator, serving as a benchmark for estimator performance
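For the simple Gaussian case of a known constant level A observed in AWGN, the LRT reduces to comparing the sample mean to a threshold; the sketch below (with assumed values for A, sigma, N, and the threshold) estimates the false-alarm and detection probabilities by Monte Carlo and checks that the sample-mean estimator attains the CRLB of sigma^2/N:

```python
# Sketch: LRT detection of a known constant in AWGN, plus MLE variance vs. the CRLB.
# A, sigma, N, and the decision threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
A, sigma, N = 1.0, 2.0, 25
n_trials = 50_000
threshold = 0.5 * A                       # assumed threshold on the sample mean

# H0: noise only; H1: constant A plus noise
x_h0 = rng.normal(0.0, sigma, (n_trials, N))
x_h1 = A + rng.normal(0.0, sigma, (n_trials, N))

mean_h0 = x_h0.mean(axis=1)
mean_h1 = x_h1.mean(axis=1)

print(f"P(false alarm) ≈ {np.mean(mean_h0 > threshold):.3f}")
print(f"P(detection)   ≈ {np.mean(mean_h1 > threshold):.3f}")

# Under H1 the sample mean is also the MLE of A; its variance sigma^2/N attains the CRLB.
print(f"MLE variance ≈ {mean_h1.var():.3f}, CRLB = {sigma**2 / N:.3f}")
```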
Information Theory Basics
Information theory, developed by Claude Shannon, provides a mathematical framework for quantifying information and analyzing the fundamental limits of communication systems
Entropy $H(X)$ measures the average amount of information contained in a random variable X, defined as $H(X) = -\sum_x p_X(x) \log_2 p_X(x)$ for discrete X
Joint entropy H(X,Y) quantifies the amount of information in the joint distribution of two random variables X and Y
Conditional entropy H(X∣Y) measures the amount of information in X given knowledge of Y
Mutual information I(X;Y) quantifies the amount of information shared between X and Y, defined as I(X;Y)=H(X)−H(X∣Y)
The channel capacity C is the maximum rate at which information can be reliably transmitted over a noisy channel, given by $C = \max_{p_X(x)} I(X;Y)$
The source coding theorem states that a source can be compressed to a rate close to its entropy with negligible loss of information
The channel coding theorem establishes that reliable communication is possible over a noisy channel if the transmission rate is below the channel capacity
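A short sketch that evaluates the entropy of a binary source and the capacity of a binary symmetric channel (BSC), for which the standard result is C = 1 − H(p); the crossover probability p = 0.1 is an illustrative assumption:

```python
# Sketch: binary entropy and the capacity of a binary symmetric channel.
# The crossover probability p is an illustrative assumption.
import numpy as np

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

p = 0.1                               # BSC crossover probability (assumed)
H_source = binary_entropy(0.5)        # a fair binary source carries 1 bit per symbol
C_bsc = 1.0 - binary_entropy(p)       # BSC capacity, achieved by a uniform input distribution

print(f"Source entropy = {H_source:.3f} bits/symbol")
print(f"BSC capacity (p = {p}): C = {C_bsc:.3f} bits per channel use")
```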
Communication System Performance Metrics
Performance metrics quantify the effectiveness and efficiency of communication systems in transmitting information
The bit error rate (BER) is the ratio of the number of bit errors to the total number of bits transmitted, measuring the reliability of the communication system; a BPSK simulation sketch at the end of this section makes it concrete
The symbol error rate (SER) is the ratio of the number of symbol errors to the total number of symbols transmitted, relevant for systems that transmit symbols representing multiple bits
The channel capacity, measured in bits per second (bps), represents the maximum rate at which information can be reliably transmitted over the channel
Bandwidth efficiency, expressed in bits per second per Hertz (bps/Hz), quantifies how efficiently the available bandwidth is utilized for information transmission
Energy efficiency, measured in bits per Joule (bits/J), indicates the amount of information that can be transmitted per unit of energy consumed
Area spectral efficiency, given in bits per second per Hertz per square meter (bps/Hz/m²), measures the spatial utilization of the spectrum in wireless communication systems
Latency, or delay, is the time taken for a message to travel from the source to the destination, critical in real-time applications
Throughput is the actual rate of successful information delivery over the communication channel, taking into account factors such as protocol overhead and retransmissions
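To make the BER metric concrete, the sketch below simulates uncoded BPSK over an AWGN channel at an assumed Eb/N0 of 6 dB and compares the simulated BER with the standard theoretical value Q(sqrt(2·Eb/N0)):

```python
# Sketch: simulated vs. theoretical BER for uncoded BPSK over AWGN.
# The Eb/N0 operating point and the number of bits are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
ebn0_db = 6.0
ebn0 = 10 ** (ebn0_db / 10)                                # Eb/N0 as a linear ratio

n_bits = 1_000_000
bits = rng.integers(0, 2, n_bits)
symbols = 2 * bits - 1                                     # BPSK mapping: 0 -> -1, 1 -> +1
noise = rng.normal(0.0, np.sqrt(1 / (2 * ebn0)), n_bits)   # unit-energy symbols, noise variance N0/2
received = symbols + noise
decisions = (received > 0).astype(int)

ber_sim = np.mean(decisions != bits)
ber_theory = norm.sf(np.sqrt(2 * ebn0))                    # Q(x) = 1 - Phi(x)
print(f"Simulated BER ≈ {ber_sim:.2e}, theoretical BER ≈ {ber_theory:.2e}")
```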