The rule for scaling means describes how the expected value of a random variable changes under a linear transformation, that is, scaling and shifting. Specifically, if a random variable X is transformed as Y = aX + b, then the expected value of Y is E[Y] = aE[X] + b. This rule is essential for understanding how linear transformations affect the statistical properties of random variables.
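The formula E[Y] = aE[X] + b can be checked empirically. A minimal sketch, using a hypothetical fair die roll as X (so E[X] = 3.5) with illustrative values a = 2, b = 5: the sample mean of the transformed values should match a times the sample mean of X plus b.

```python
import random

random.seed(0)

# Hypothetical example: X is a fair die roll, so E[X] = 3.5.
# With Y = aX + b for a = 2, b = 5, the rule predicts
# E[Y] = 2 * 3.5 + 5 = 12.
a, b = 2, 5
samples = [random.randint(1, 6) for _ in range(100_000)]

mean_x = sum(samples) / len(samples)
mean_y = sum(a * x + b for x in samples) / len(samples)

print(mean_x)          # sample mean of X, close to 3.5
print(mean_y)          # sample mean of Y, close to 12
print(a * mean_x + b)  # agrees with mean_y up to floating point
```

Note that the agreement between the last two values is exact (up to floating-point rounding), because linearity holds for sample averages as well as for expectations.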