Catastrophic forgetting refers to the phenomenon where a neural network abruptly loses previously learned information upon learning new information. This is especially critical in online learning and continual adaptation, where systems are expected to learn incrementally over time. When a model updates its weights to accommodate new data, it may inadvertently overwrite the knowledge encoded for older data, leading to a decline in performance on earlier tasks.
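As a minimal sketch of this effect (a hypothetical toy example, not any specific paper's setup), consider a one-parameter linear model trained with gradient descent first on task A (y = 2x) and then on task B (y = -3x). Because the single weight must serve both tasks, training on task B pulls it away from the task A solution, and the error on task A climbs sharply:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)

def train(w, x, y, lr=0.1, steps=200):
    """Gradient descent on mean squared error for y_hat = w * x."""
    for _ in range(steps):
        grad = np.mean(2.0 * (w * x - y) * x)  # d/dw of MSE
        w -= lr * grad
    return w

def mse(w, x, y):
    return np.mean((w * x - y) ** 2)

y_a = 2.0 * x    # task A targets
y_b = -3.0 * x   # task B targets

w = 0.0
w = train(w, x, y_a)                 # learn task A
loss_a_before = mse(w, x, y_a)       # near zero: task A is learned

w = train(w, x, y_b)                 # continue training on task B only
loss_a_after = mse(w, x, y_a)        # task A error is now large

print(loss_a_before, loss_a_after)
```

The weight converges to roughly 2 after task A, then drifts to roughly -3 after task B, so predictions for task A degrade badly. Real networks have many parameters, but the same dynamic applies when the tasks' gradients compete for shared weights.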