Mini-batch Gradient Descent
Mini-batch gradient descent is an optimization algorithm used to train neural networks by updating the model's weights based on a small, random subset (a "mini-batch") of the training data, rather than the entire dataset or a single data point. This approach combines the benefits of both stochastic and batch gradient descent, allowing for faster convergence while maintaining a stable learning process. It strikes a balance between the stable but expensive updates of full-batch gradient descent and the cheap but noisy updates computed from individual examples.
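To make this concrete, here is a minimal sketch of mini-batch gradient descent applied to a simple linear-regression problem in NumPy. The dataset, the batch size of 32, the learning rate, and the epoch count are all illustrative assumptions chosen for the example, not values prescribed by the algorithm itself.

```python
# A minimal sketch of mini-batch gradient descent, assuming a synthetic
# linear-regression dataset (y = 3x + 2 plus noise) for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data (assumed for this example).
X = rng.uniform(-1, 1, size=(1000, 1))
y = 3 * X[:, 0] + 2 + 0.1 * rng.standard_normal(1000)

w, b = 0.0, 0.0        # model parameters to learn
learning_rate = 0.1    # illustrative step size
batch_size = 32        # the "mini-batch": between 1 (stochastic) and N (full batch)
epochs = 20

n = len(X)
for epoch in range(epochs):
    # Shuffle once per epoch so each mini-batch is a random subset.
    perm = rng.permutation(n)
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx, 0], y[idx]

        # Compute the mean-squared-error gradient on this batch only.
        err = (w * xb + b) - yb
        grad_w = 2 * np.mean(err * xb)
        grad_b = 2 * np.mean(err)

        # Update the parameters using the batch gradient,
        # not a gradient over the entire dataset.
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b

print(f"learned w={w:.3f}, b={b:.3f}  (true values: 3, 2)")
```

Note how the update rule is identical to full-batch gradient descent; the only difference is that each gradient is averaged over 32 shuffled examples instead of all 1000, so the model takes many cheap, slightly noisy steps per epoch rather than one expensive, exact one.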