This repository stores my implementations of gradient-based optimization algorithms.
Stochastic gradient (SG) can be improved along three directions:
Direction 1: Noise-Reduction Methods
Direction 2: Variance-Reduction Methods (plain SG has residual gradient variance, so a fixed step size cannot be used)
Direction 3: Second-Order Methods (escaping saddle points)
Basic Gradient Descent Variants (a minimal sketch of all three follows this list):
{Batch Gradient Descent} Algorithm: BatchGradientDescent.py
{Stochastic Gradient Descent} Algorithm: StochasticGradientDescent.py
{Mini-Batch Gradient Descent} Algorithm: Mini-Batch_GradientDescent.py
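As a minimal illustration of how the three variants differ, here is a hedged sketch on a least-squares problem. The data, model, step sizes, and iteration counts are assumptions made for this example and are not taken from the repository's scripts; only the update rules themselves (full batch, single sample, mini-batch) are the point.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))          # 200 samples, 3 features (illustrative data)
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true + 0.1 * rng.normal(size=200)

    def grad(w, Xb, yb):
        """Gradient of the mean squared error 0.5*||Xb w - yb||^2 / n over a batch."""
        return Xb.T @ (Xb @ w - yb) / len(yb)

    def batch_gd(w, lr=0.1, steps=100):
        for _ in range(steps):                       # full gradient each step
            w = w - lr * grad(w, X, y)
        return w

    def sgd(w, lr=0.05, steps=2000):
        for _ in range(steps):                       # one random sample per step
            i = rng.integers(len(y))
            w = w - lr * grad(w, X[i:i+1], y[i:i+1])
        return w

    def minibatch_gd(w, lr=0.1, steps=500, b=16):
        for _ in range(steps):                       # b random samples per step
            idx = rng.integers(len(y), size=b)
            w = w - lr * grad(w, X[idx], y[idx])
        return w

    w0 = np.zeros(3)
    print(batch_gd(w0), sgd(w0), minibatch_gd(w0), sep="\n")

All three runs should approach w_true; the stochastic variants trade per-step cost against gradient noise.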
Direction 1: Noise-Reduction Methods:
{ } Algorithm: .py
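The noise-reduction file is still a placeholder, so as an illustrative example only, here is a hedged sketch of one noise-reduction idea discussed in the Bottou, Curtis, and Nocedal paper listed below: dynamic sampling, where the mini-batch size grows geometrically so that gradient noise shrinks over the run and a fixed step size stays usable. The data, growth factor, and constants are assumptions for this sketch.

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(512, 3))
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true + 0.1 * rng.normal(size=512)

    def grad(w, Xb, yb):
        return Xb.T @ (Xb @ w - yb) / len(yb)

    w = np.zeros(3)
    lr, b = 0.1, 2
    for _ in range(60):
        b = min(int(b * 1.1) + 1, len(y))   # grow the sample size geometrically
        idx = rng.integers(len(y), size=b)
        w -= lr * grad(w, X[idx], y[idx])   # fixed step size throughout
    print(w)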
Direction 2: Variance-Reduction Methods:
{ } Algorithm: .py
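This file is likewise a placeholder. As an illustration of the fixed-step-size point above, here is a hedged sketch of SVRG (Johnson and Zhang, 2013), a standard variance-reduction method: the corrected stochastic gradient has vanishing variance near the optimum, which is what permits a constant step size. The data and constants are assumptions; this is one representative method, not necessarily the one planned for this slot.

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=(256, 3))
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true + 0.1 * rng.normal(size=256)
    n = len(y)

    def grad(w, Xb, yb):
        return Xb.T @ (Xb @ w - yb) / len(yb)

    w, lr = np.zeros(3), 0.1
    for epoch in range(20):
        w_snap = w.copy()
        mu = grad(w_snap, X, y)             # full gradient at the snapshot
        for _ in range(n):
            i = rng.integers(n)
            Xi, yi = X[i:i+1], y[i:i+1]
            # variance-reduced gradient: g_i(w) - g_i(w_snap) + mu
            w -= lr * (grad(w, Xi, yi) - grad(w_snap, Xi, yi) + mu)
    print(w)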
Direction 3: Second-Order Methods:
{ } Algorithm: .py
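This file is also a placeholder. As one hedged illustration of a second-order idea for escaping saddle points, the sketch below applies a "saddle-free" Newton-style step in the spirit of Dauphin et al. (2014): the Hessian's eigenvalues are replaced by their absolute values, so negative-curvature directions are descended instead of being attracted to the saddle. The test function, starting point, and iteration count are assumptions.

    import numpy as np

    def f(p):                               # a saddle point at the origin
        x, y = p
        return x**2 - y**2

    def grad_f(p):
        x, y = p
        return np.array([2 * x, -2 * y])

    def hess_f(p):
        return np.array([[2.0, 0.0], [0.0, -2.0]])

    p = np.array([1e-3, 1e-3])              # start very close to the saddle
    for _ in range(10):
        H = hess_f(p)
        vals, vecs = np.linalg.eigh(H)
        H_abs = vecs @ np.diag(np.abs(vals)) @ vecs.T   # |H| in the eigenbasis
        p = p - np.linalg.solve(H_abs, grad_f(p))
        print(p, f(p))

A plain Newton step would converge to the saddle here; with |H|, the iterate moves away along the negative-curvature direction and f keeps decreasing.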
Papers:
L. Bottou, F. E. Curtis, and J. Nocedal. Optimization Methods for Large-Scale Machine Learning. SIAM Review, 60(2), 2018.
S. Ruder. An Overview of Gradient Descent Optimization Algorithms. arXiv:1609.04747, 2016.
GitHub repositories referenced:
summersunshine1/optimize
tsycnh/mlbasic