Markov's Inequality is a fundamental result in probability theory that bounds the probability that a non-negative random variable exceeds a given positive value. Specifically, for any non-negative random variable X and any positive value a, the probability that X is at least a is bounded by the expected value of X divided by a, expressed mathematically as P(X \geq a) \leq \frac{E[X]}{a}. This inequality is useful because it bounds a tail probability using only the expected value, without requiring knowledge of the entire distribution.
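The bound is easy to check empirically. Here's a minimal sketch (the exponential distribution, sample size, and threshold a = 3 are arbitrary choices for illustration): we draw samples, estimate the tail probability P(X \geq a), and compare it against E[X]/a.

```python
import random

random.seed(0)

# Draw samples from an exponential distribution with rate 1, so E[X] = 1.
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

a = 3.0
empirical_tail = sum(x >= a for x in samples) / n   # estimate of P(X >= a)
markov_bound = (sum(samples) / n) / a               # E[X] / a, via the sample mean

print(f"P(X >= {a}) is approximately {empirical_tail:.4f}")
print(f"Markov bound E[X]/a is approximately {markov_bound:.4f}")
```

For this distribution the true tail probability is e^{-3}, roughly 0.05, while the Markov bound is about 0.33. The bound holds, but it is far from tight, which is typical: Markov's Inequality trades precision for generality, since it uses nothing about the distribution beyond its mean.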
Congrats on reading the definition of Markov's Inequality. Now let's actually learn it.