Gradient Descent
A foundational optimization algorithm in machine learning for finding a minimum of a function. It iteratively adjusts parameters by stepping in the direction of steepest descent, the negative gradient of the function's error surface, and repeats until it reaches a minimum or meets a stopping criterion (such as a maximum number of iterations or a sufficiently small change in the loss).
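The update rule above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the objective f(x) = (x - 3)^2, its gradient 2(x - 3), and the learning rate and step count are all illustrative choices.

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly move x in the direction of the negative gradient."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)  # step along the steepest descent
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2(x - 3)
# and whose minimum is at x = 3.
minimum = gradient_descent(grad=lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # converges toward 3.0
```

Each iteration shrinks the distance to the minimum by a constant factor here; too large a learning rate would instead cause the updates to overshoot and diverge.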