
Gradient Descent Optimization Algorithms

Introduction:

A collection of my gradient descent optimization algorithm code.

Stochastic gradient (SG) methods can be improved along three directions:

Direction 1 - Noise Reduction Methods
Direction 2 - Variance Reduction Methods (SG iterates retain residual noise, so a fixed step size cannot drive the error to zero)
Direction 3 - Second-Order Methods (using curvature information, e.g. for escaping saddle points)

Algorithm list:

Basic - Gradient Descent Variants:

{Batch Gradient Descent} Algorithm: BatchGradientDescent.py
{Stochastic Gradient Descent} Algorithm: StochasticGradientDescent.py
{Mini-Batch Gradient Descent} Algorithm: Mini-Batch_GradientDescent.py
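The three basic variants differ only in how many samples feed each gradient estimate: the full set (batch), a single sample (stochastic), or a small subset (mini-batch). A minimal sketch on a least-squares objective; function and variable names are my own and need not match the repository's scripts:

```python
import numpy as np

def minibatch_gd(X, y, batch_size, lr=0.1, epochs=100, seed=0):
    """Mini-batch gradient descent on 0.5 * ||Xw - y||^2 / n.

    batch_size = n recovers batch gradient descent;
    batch_size = 1 recovers plain stochastic gradient descent.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)            # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w
```

The same loop covers all three rows of the table above by varying `batch_size`.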

Direction 1 - Noise Reduction Methods:

{ } Algorithm: .py
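The file name above is left blank in the original. As one illustration of a noise reduction technique discussed in the Optimization Methods for Large-Scale Machine Learning survey, dynamic sampling grows the mini-batch size geometrically, so the gradient noise shrinks over time and a fixed step size can still converge. A hedged sketch on least squares; the function name and parameters are my own:

```python
import numpy as np

def dynamic_sampling_sgd(X, y, lr=0.1, growth=1.1, steps=300, seed=0):
    """SG with a geometrically growing sample size (dynamic sampling).

    Averaging more samples per step reduces the variance of the
    gradient estimate, which is what lets the fixed step size work.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    batch = 1.0
    for _ in range(steps):
        b = rng.integers(0, n, size=int(batch))     # sample with replacement
        grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
        w -= lr * grad
        batch = min(batch * growth, n)              # grow batch geometrically
    return w
```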

Direction 2 - Variance Reduction Methods:

{ } Algorithm: .py
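The file name here is also blank. A standard variance reduction method is SVRG: each stochastic gradient is corrected with a control variate built from a periodic full-gradient snapshot, so the variance vanishes at the optimum and a fixed step size gives linear convergence. A sketch on least squares, not the repository's implementation:

```python
import numpy as np

def svrg(X, y, lr=0.05, outer=30, inner=100, seed=0):
    """SVRG on least squares with control-variate-corrected gradients."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(outer):
        w_snap = w.copy()
        full = X.T @ (X @ w_snap - y) / n           # full gradient at snapshot
        for _ in range(inner):
            i = rng.integers(n)
            gi = X[i] * (X[i] @ w - y[i])           # per-sample gradient at w
            gi_snap = X[i] * (X[i] @ w_snap - y[i]) # same sample at snapshot
            w -= lr * (gi - gi_snap + full)         # corrected update
        # simple option: the next snapshot is the last inner iterate
    return w
```

Unlike plain SG, the corrected direction `gi - gi_snap + full` is still an unbiased gradient estimate, but its variance shrinks as the iterates approach the snapshot and the optimum.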

Direction 3 - Second-Order Methods:

{ } Algorithm: .py
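As an illustration of the second-order idea, plain Newton's method preconditions the gradient with the inverse Hessian. Note that plain Newton can actually be attracted to saddle points; the saddle-escaping variants mentioned above modify the Hessian's negative curvature directions (e.g., trust-region or cubic regularization). A minimal sketch with names of my own choosing:

```python
import numpy as np

def newton(grad, hess, w0, steps=10):
    """Newton's method: solve H(w) p = -g(w), then step to w + p."""
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        p = np.linalg.solve(hess(w), -grad(w))  # Newton direction
        w = w + p
    return w
```

On a quadratic objective 0.5 * w^T A w - b^T w the Hessian is constant, and a single Newton step lands exactly on the minimizer A^{-1} b.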

References:

Paper: Optimization Methods for Large-Scale Machine Learning
Paper: An Overview of Gradient Descent Optimization Algorithms
GitHub repository: summersunshine1/optimize
GitHub repository: tsycnh/mlbasic
