Probabilistic Decision-Making Unit 3 – Discrete & Continuous Distributions
Discrete and continuous distributions form the backbone of probabilistic decision-making. These mathematical models describe the likelihood of different outcomes in random events, from coin flips to stock prices. Understanding their properties and applications is crucial for analyzing uncertainty and making informed choices.
Probability mass functions, density functions, and cumulative distribution functions are key tools for working with distributions. Expected values, variances, and other parameters help summarize and compare distributions. Mastering these concepts enables better risk assessment, forecasting, and decision-making in fields ranging from finance to engineering.
Key Concepts
Probability distributions describe the likelihood of different outcomes in a random experiment or process
Random variables represent the possible outcomes of a random event and can be discrete (countable) or continuous (uncountable)
Probability mass functions (PMFs) define the probability of each possible value for a discrete random variable
Probability density functions (PDFs) specify the relative likelihood of a continuous random variable near a given value; since the probability of any single exact value is zero, probabilities come from integrating the PDF over an interval
Cumulative distribution functions (CDFs) give the probability that a random variable is less than or equal to a particular value
Expected value is the long-run average outcome of a random variable, calculated as $E[X] = \sum_x x\,p(x)$ for a discrete variable or $E[X] = \int x\,f(x)\,dx$ for a continuous one
Variance, $\mathrm{Var}(X) = E[(X - E[X])^2]$, and its square root, the standard deviation, measure the spread or dispersion of a probability distribution around its expected value (a short sketch of these summaries follows this list)
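A minimal sketch of these summaries in Python, using a small hand-made PMF; the outcome values and probabilities below are invented for illustration:

```python
# Sketch: summarizing a discrete distribution from its PMF.
# The PMF (number of defects in a batch) is invented for illustration.
pmf = {0: 0.5, 1: 0.3, 2: 0.15, 3: 0.05}

# Expected value: sum of each value times its probability.
mean = sum(x * p for x, p in pmf.items())

# Variance: expected squared deviation from the mean.
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())
std_dev = variance ** 0.5

# CDF at t: probability the variable is less than or equal to t.
def cdf(t):
    return sum(p for x, p in pmf.items() if x <= t)

print(f"E[X] = {mean}")        # 0.75
print(f"Var(X) = {variance}")  # 0.7875
print(f"SD(X) = {std_dev}")    # ~0.887
print(f"P(X <= 1) = {cdf(1)}") # 0.8
```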
Types of Probability Distributions
Discrete distributions have a countable number of possible outcomes, such as the number of defective items in a batch (Bernoulli, binomial, Poisson)
Continuous distributions have uncountably many possible outcomes within a range, like the time until a machine fails (normal, exponential, uniform)
Joint distributions describe the probabilities of two or more random variables occurring together
Marginal distributions are derived from joint distributions by summing or integrating over the other variables
Conditional distributions give the probabilities of one variable given specific values of another (see the sketch after this list)
Multivariate distributions involve multiple random variables, which can be discrete, continuous, or a mix of both
Some discrete distributions have close continuous analogues: the geometric distribution (trials until the first success) parallels the exponential, and the negative binomial parallels the gamma
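The joint, marginal, and conditional relationships above can be made concrete with a tiny two-variable example; the joint probabilities below are invented for illustration:

```python
# Sketch: joint, marginal, and conditional distributions for two
# discrete variables X and Y. The joint PMF is invented.
joint = {
    (0, 0): 0.30, (0, 1): 0.20,
    (1, 0): 0.10, (1, 1): 0.40,
}

# Marginal of X: sum the joint PMF over all values of Y.
marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

# Conditional distribution of Y given X = x: joint divided by marginal.
def conditional_y_given_x(x):
    return {y: p / marginal_x[x] for (xv, y), p in joint.items() if xv == x}

print(marginal_x)               # {0: 0.5, 1: 0.5}
print(conditional_y_given_x(1)) # {0: 0.2, 1: 0.8}
```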
Discrete Distributions
Bernoulli distribution models a single trial with two possible outcomes (success or failure), with probability of success $p$
Binomial distribution describes the number of successes in a fixed number of independent Bernoulli trials, with parameters $n$ (trials) and $p$ (success probability)
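A minimal sketch of the binomial PMF built from first principles with math.comb; the parameters $n = 10$ and $p = 0.3$ are illustrative, not from the text:

```python
from math import comb

# Sketch: binomial PMF from the counting formula C(n, k) * p^k * (1-p)^(n-k).
def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent Bernoulli(p) trials)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3  # illustrative parameters

# A Bernoulli trial is the n = 1 special case of the binomial.
print(binomial_pmf(1, 1, p))  # P(success) on a single trial: 0.3
print(binomial_pmf(3, n, p))  # P(exactly 3 successes in 10 trials) ~ 0.2668
```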