[Repost] Machine Learning Algorithm

Gradient Descent, Logistic Regression, Naive Bayes, SVM

Posted by Brian on October 1, 2017

1. Gradient Descent Algorithm

where $x_k$ denotes the vector of scores for all the pixels for the class $k \in \{ 1, \cdots, L \}$. The per-class unaries are denoted by $b_k$, and the pairwise terms $\hat{A}$ are shared between each pair of classes. The equations that follow are derived by specializing the general inference (Eq. 2) and gradient equations (Eq. 3, 4) to this particular setting. After simple manipulations, the inference procedure becomes a two-step process in which we first compute the sum of the scores, $\sum\nolimits_{i} x_i$, and then compute $x_k$, the scores for class $k$, as:

Derivatives of the unary terms with respect to the loss are obtained by solving

Finally, the gradients of $\hat{A}$ are computed as
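Since this section's heading is gradient descent in general, a minimal NumPy sketch of the basic update $x \leftarrow x - \eta \nabla f(x)$ may help; the quadratic objective, step size, and iteration count below are illustrative assumptions, not part of the specific setting discussed above.

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, n_iters=100):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - lr * grad(x)  # x <- x - eta * grad f(x)
    return x

# Illustrative example: minimize f(x) = ||x - 3||^2, whose gradient is 2*(x - 3).
x_min = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=np.zeros(2))
print(x_min)  # converges toward [3., 3.]
```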

2. Logistic Regression Algorithm
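As a minimal sketch, binary logistic regression can be fit with the same gradient-descent idea, descending the cross-entropy loss; the toy data, learning rate, and iteration count here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, lr=0.1, n_iters=1000):
    """Fit weights w and bias b by gradient descent on the logistic loss."""
    n_samples, n_features = X.shape
    w, b = np.zeros(n_features), 0.0
    for _ in range(n_iters):
        p = sigmoid(X @ w + b)               # predicted probability of class 1
        grad_w = X.T @ (p - y) / n_samples   # gradient of the average cross-entropy
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Illustrative toy data: class 1 when the feature sum is large.
X = np.array([[0.0, 0.1], [0.2, 0.1], [0.9, 1.0], [1.0, 0.8]])
y = np.array([0, 0, 1, 1])
w, b = train_logistic_regression(X, y)
print(sigmoid(X @ w + b))  # probabilities close to y
```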

3. Naive Bayes Algorithm
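A minimal sketch of Gaussian naive Bayes, which assumes features are conditionally independent and normally distributed given the class; the class design and toy data are illustrative assumptions.

```python
import numpy as np

class GaussianNaiveBayes:
    """Classify by maximizing log P(c) + sum_j log N(x_j; mu_cj, var_cj)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = np.array([np.mean(y == c) for c in self.classes_])
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        self.vars_ = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes_])
        return self

    def predict(self, X):
        log_post = []
        for prior, mu, var in zip(self.priors_, self.means_, self.vars_):
            # Gaussian log-likelihood summed over (assumed independent) features
            log_lik = -0.5 * np.sum(np.log(2 * np.pi * var) + (X - mu) ** 2 / var, axis=1)
            log_post.append(np.log(prior) + log_lik)
        return self.classes_[np.argmax(np.array(log_post), axis=0)]

# Illustrative toy data: two well-separated clusters.
X = np.array([[1.0, 1.1], [0.9, 1.0], [5.0, 5.2], [5.1, 4.9]])
y = np.array([0, 0, 1, 1])
print(GaussianNaiveBayes().fit(X, y).predict(X))  # [0 0 1 1]
```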

4. SVM Algorithm
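A minimal sketch of a linear SVM trained by (sub)gradient descent on the regularized hinge loss; kernels are omitted, and the regularization strength, learning rate, and toy data are illustrative assumptions.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, n_iters=1000):
    """Linear SVM via subgradient descent on lam/2*||w||^2 + mean(hinge loss).
    Labels y must be in {-1, +1}."""
    n_samples, n_features = X.shape
    w, b = np.zeros(n_features), 0.0
    for _ in range(n_iters):
        margins = y * (X @ w + b)
        mask = margins < 1                       # samples violating the margin
        grad_w = lam * w - (X[mask] * y[mask, None]).sum(axis=0) / n_samples
        grad_b = -y[mask].sum() / n_samples
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Illustrative, linearly separable toy data.
X = np.array([[1.0, 1.0], [1.5, 0.5], [-1.0, -1.0], [-0.5, -1.5]])
y = np.array([1, 1, -1, -1])
w, b = train_linear_svm(X, y)
print(np.sign(X @ w + b))  # [ 1.  1. -1. -1.]
```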
