Adaptive Learning Rate SGD Algorithm for SVM

Journal Title: Scholars Journal of Physics, Mathematics and Statistics - Year 2017, Vol 4, Issue 4

Abstract

Stochastic gradient descent (SGD) is a simple and effective algorithm for solving the optimization problem of the support vector machine, where each iteration operates on a single training example. Because the run-time of SGD does not depend directly on the size of the training set, the resulting algorithm is especially suited to learning from large datasets. However, a difficulty with stochastic gradient descent is choosing a proper learning rate. A learning rate that is too small leads to slow convergence, while one that is too large can hinder convergence and cause the objective to fluctuate. To improve the efficiency and classification ability of SVM based on stochastic gradient descent, three adaptive learning rate SGD algorithms, Adagrad, Adadelta and Adam, are used to solve the support vector machine. The experimental results show that the algorithms based on Adagrad, Adadelta and Adam for solving the linear support vector machine have faster convergence speed and higher testing precision.
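To make the adaptive-rate idea concrete, the following is a minimal sketch of SGD on the regularized hinge loss of a linear SVM with an Adagrad-style per-coordinate learning rate (one of the three variants named in the abstract). The function name `adagrad_svm`, the synthetic data, and the hyperparameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def adagrad_svm(X, y, lam=0.01, eta=0.1, epochs=10, eps=1e-8):
    """Linear SVM trained by SGD on the regularized hinge loss,
    with Adagrad-style per-coordinate learning rates.
    X: (n, d) feature matrix; y: labels in {-1, +1}. Illustrative sketch."""
    n, d = X.shape
    w = np.zeros(d)
    g_accum = np.zeros(d)  # running sum of squared gradients (Adagrad accumulator)
    for _ in range(epochs):
        for i in np.random.permutation(n):
            margin = y[i] * X[i].dot(w)
            # subgradient of (lam/2)*||w||^2 + max(0, 1 - y_i * w.x_i)
            grad = lam * w
            if margin < 1:
                grad = grad - y[i] * X[i]
            g_accum += grad ** 2
            # per-coordinate step: larger accumulated gradient -> smaller step
            w -= eta * grad / (np.sqrt(g_accum) + eps)
    return w

# toy usage on synthetic, linearly separable data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
w = adagrad_svm(X, y)
print("training accuracy:", np.mean(np.sign(X.dot(w)) == y))
```

Replacing the accumulated sum of squared gradients with a decaying average (and, for Adam, adding a momentum term on the gradient itself) would turn this same loop into an Adadelta- or Adam-style update.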

Authors and Affiliations

Shuxia Lu, Zhao Jin

Keywords

Related Articles

Observations on the Hyperbola Y^2=72X^2+1

The binary quadratic equation y^2=72x^2+1 is considered and a few interesting properties among the solutions are presented. Employing the integral solutions of the equation under consideration, a special Pythagorean trian...

Some Particular Examples for the Natural Lift Curve in Minkowski 3-Space

In this study, we give some particular examples for the natural lift curves of the spherical indicatrices of the tangent, principal normal, and binormal vectors.

Lattice Points on the Homogeneous Cone 8(x^2+y^2 )-15xy=56z^2

The ternary quadratic equation 8(x^2+y^2 )-15xy=56z^2 representing a cone is analysed for its non-zero distinct integer points on it. Employing the integer solutions, a few interesting relations between the solutions and...

Another Proof of the Brezis-Lieb Lemma

The Brezis-Lieb Lemma was first proposed by the famous French mathematician Haim Brezis and the American mathematician Elliott Lieb; it is an improvement of Fatou's Lemma, which has numerous applications mainly in calculus...

Calibrating pipe friction coefficient of the oilfield water injection system based on sensitivity analysis

In order to improve the accuracy of the hydraulic model for the oilfield water injection network, the model needs to be calibrated, so a method of sensitivity analysis was proposed to calibrate the pipe friction coefficient in...

Download PDF file
  • EP ID EP385890
  • DOI -

How To Cite

Shuxia Lu, Zhao Jin (2017). Adaptive Learning Rate SGD Algorithm for SVM. Scholars Journal of Physics, Mathematics and Statistics, 4(4), 178-184. https://europub.co.uk/articles/-A-385890