Catastrophic forgetting is the phenomenon in which a neural network abruptly loses previously learned information when it is trained on a new task. The issue is especially relevant in transfer learning, where a model trained on one task is adapted to another: fine-tuning on new data can sharply degrade performance on the original task, illustrating how difficult it is to retain knowledge across domains.
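The effect is easy to reproduce even in a tiny model. The sketch below (a hypothetical numpy-only example, not from any particular library) trains a logistic-regression classifier on task A, then fine-tunes the same weights on a conflicting task B with no task A data in the mix, and measures task A accuracy before and after. The two synthetic tasks and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(w_true, n=200):
    # Linearly separable binary task: label is the sign of X @ w_true.
    X = rng.normal(size=(n, 2))
    y = (X @ w_true > 0).astype(float)
    return X, y

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def train(w, X, y, lr=0.5, steps=200):
    # Plain gradient descent on the logistic loss.
    for _ in range(steps):
        p = sigmoid(X @ w)
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(((sigmoid(X @ w) > 0.5) == y).mean())

# Task A and task B have conflicting decision boundaries (illustrative choice).
XA, yA = make_task(np.array([1.0, 0.0]))
XB, yB = make_task(np.array([-1.0, 1.0]))

w = np.zeros(2)
w = train(w, XA, yA)
acc_before = accuracy(w, XA, yA)  # near-perfect on task A after training on A

w = train(w, XB, yB)              # fine-tune on task B only, no task A data
acc_after = accuracy(w, XA, yA)   # task A accuracy collapses: forgetting
```

Because nothing in the task B updates anchors the weights to the task A solution, the parameters simply migrate to the new boundary; methods like rehearsal or regularization toward the old weights exist precisely to counteract this drift.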