Markov's Inequality is a fundamental result in probability theory that provides an upper bound on the probability that a non-negative random variable exceeds a given value. It states that for any non-negative random variable X and any positive value a, the probability that X is at least a is bounded by the expected value of X divided by a, formally expressed as $P(X \geq a) \leq \frac{E[X]}{a}$. This inequality is especially useful for establishing bounds when the specific distribution of X is unknown, since it requires only the mean.
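As a quick sanity check, here's a minimal Python sketch (an illustrative assumption, not part of the original definition) that tests the bound empirically on an exponential random variable with mean 1, which is non-negative:

```python
import random

random.seed(0)
n = 100_000
# Draw samples from Exp(1), a non-negative distribution with E[X] = 1.
samples = [random.expovariate(1.0) for _ in range(n)]
mean = sum(samples) / n  # empirical E[X], close to 1

for a in [1.0, 2.0, 5.0]:
    tail = sum(x >= a for x in samples) / n  # empirical P(X >= a)
    bound = mean / a                         # Markov bound E[X] / a
    print(f"a={a}: P(X >= a) ~ {tail:.4f} <= E[X]/a ~ {bound:.4f}")
```

For the exponential distribution the true tail is $e^{-a}$, which is far below $1/a$, so Markov's bound holds here but is quite loose: that looseness is the price of assuming nothing beyond non-negativity and a finite mean.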
Congrats on reading the definition of Markov's Inequality. Now let's actually learn it.