Mathematical Methods for Optimization
Mini-batch gradient descent is an optimization algorithm used to train machine learning models by updating the model's parameters with the gradient computed on a small subset of the training data (a mini-batch) rather than the entire dataset. This strikes a balance between the cheap but noisy updates of stochastic gradient descent (batch size 1) and the stable but expensive updates of batch gradient descent (the full dataset), making it particularly effective for the large datasets common in machine learning and data science applications.
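Concretely, each step applies the usual update rule $\theta \leftarrow \theta - \eta \, \nabla_\theta L_B(\theta)$, where the loss $L_B$ is averaged over only the current mini-batch $B$. The sketch below illustrates this on a simple least-squares problem; the synthetic dataset, learning rate, and batch size are illustrative assumptions, not values from the text.

```python
# A minimal sketch of mini-batch gradient descent for linear regression
# with a mean-squared-error loss. The data, learning rate, and batch
# size below are assumed for demonstration purposes.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 1 plus noise (assumed for this example).
X = rng.normal(size=1000)
y = 3.0 * X + 1.0 + 0.1 * rng.normal(size=1000)

w, b = 0.0, 0.0     # model parameters
lr = 0.05           # learning rate (step size)
batch_size = 32     # between 1 (stochastic GD) and len(X) (batch GD)

for epoch in range(20):
    # Shuffle once per epoch so each mini-batch is a fresh random subset.
    perm = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx], y[idx]

        # Gradient of the MSE loss computed on this mini-batch only.
        err = (w * xb + b) - yb
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)

        # Parameter update using the mini-batch gradient estimate.
        w -= lr * grad_w
        b -= lr * grad_b

print(f"learned w={w:.3f}, b={b:.3f}")  # should approach w=3, b=1
```

Because each update touches only `batch_size` examples, an epoch costs the same as one full-batch step while performing many parameter updates, which is the source of the efficiency the definition describes.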