aiacademy: Machine Learning — logistic regression & hands-on: ch04
Tags: aiacademy, knn, machine-learning, precision-recall-f1Score
logistic regression
gradient ascent
A bit different from what the instructor taught LOL…
Wait, it's actually the same! The negative sign is just factored out
- Similarly, we can apply gradient descent, stochastic gradient descent, and mini-batch gradient descent to solve logistic regression
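The gradient-ascent update for logistic regression can be sketched in a few lines of NumPy (a minimal illustration under my own function names, not the course code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg_gradient_ascent(X, y, lr=0.5, n_iters=5000):
    """Maximize the mean log-likelihood of the Bernoulli model
    by stepping along its gradient X.T @ (y - p)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        p = sigmoid(X @ w)
        w += lr * X.T @ (y - p) / len(y)  # ascent step (note the +=)
    return w

# Tiny separable toy data; the first column is a bias of ones.
X = np.array([[1, 0.0], [1, 1.0], [1, 2.0], [1, 3.0]])
y = np.array([0, 0, 1, 1])
w = fit_logreg_gradient_ascent(X, y)
preds = (sigmoid(X @ w) >= 0.5).astype(int)
```

Flipping the sign of the update (and of the objective) gives the equivalent gradient-*descent* view mentioned above; the stochastic and mini-batch variants only change how many rows of `X` each step uses.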
Multi-class logistic regression
This part introduces the softmax function
For that, see this article
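A minimal softmax sketch (my own helper, with the standard max-subtraction trick for numerical stability):

```python
import numpy as np

def softmax(z):
    # Subtracting the max changes nothing mathematically
    # (it cancels in the ratio) but avoids overflow in exp.
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

p = softmax(np.array([2.0, 1.0, 0.1]))
# p is a probability vector: entries sum to 1,
# and the largest logit gets the largest probability.
```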
Q & A
Quiz:
- What is logistic regression?
- Why do we usually maximize the log-likelihood function (instead of the likelihood function)?
- What is the cross-entropy loss (for logistic regression)?
Answer:
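For the cross-entropy question, the binary loss can be checked numerically with a small sketch (the helper name and the toy numbers are mine):

```python
import numpy as np

def cross_entropy(y, p, eps=1e-12):
    """Binary cross-entropy: -mean(y*log(p) + (1-y)*log(1-p)).
    Minimizing it is the same as maximizing the Bernoulli log-likelihood."""
    p = np.clip(p, eps, 1 - eps)  # guard against log(0)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

y = np.array([1, 0, 1])
loss = cross_entropy(y, np.array([0.9, 0.1, 0.8]))
# loss = -(log 0.9 + log 0.9 + log 0.8) / 3 ≈ 0.1446
```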
Evaluation (classification)
See my Coursera notes
- Classification metrics
- Accuracy
- Precision
- Recall
- F1 score
- Precision recall curve
- ROC curve and AUC (Area Under Curve)
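The first four metrics above all fall out of the confusion-matrix counts; a minimal NumPy sketch (function name and toy labels are mine):

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """Accuracy / precision / recall / F1 from confusion-matrix counts."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy  = (tp + tn) / len(y_true)
    precision = tp / (tp + fp)
    recall    = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return accuracy, precision, recall, f1

y_true = np.array([1, 1, 1, 0, 0, 0])
y_pred = np.array([1, 1, 0, 0, 0, 1])
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
# tp=2, tn=2, fp=1, fn=1 → acc=4/6, precision=recall=f1=2/3
```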
Accuracy
Confusion matrix
Precision
Recall
Precision and recall tradeoff
- if we want a very high precision
    - return only the most confident positive instances
- if we want a very high recall
    - return all the instances
- the two metrics are usually a tradeoff
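The tradeoff above can be shown by sweeping the decision threshold over the model's scores (a toy sketch; the scores and helper name are mine):

```python
import numpy as np

def precision_recall_at(scores, y_true, threshold):
    y_pred = (scores >= threshold).astype(int)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    precision = tp / (tp + fp) if tp + fp else 1.0  # no positive calls
    recall = tp / (tp + fn)
    return precision, recall

scores = np.array([0.95, 0.85, 0.7, 0.6, 0.4, 0.2])
y_true = np.array([1,    1,    0,   1,   0,   0])

# High threshold: only the most confident instance → precision 1.0, recall 1/3.
p_hi, r_hi = precision_recall_at(scores, y_true, 0.9)
# Low threshold: return everything → recall 1.0, precision drops to 0.5.
p_lo, r_lo = precision_recall_at(scores, y_true, 0.1)
```

Plotting `(recall, precision)` over many thresholds gives the precision-recall curve from the list above.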
F1 Score
ROC Curve
- Precisions, recalls (TPRs), and FPRs of different thresholds
- ROC Curve
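The area under the ROC curve can also be computed directly via its ranking interpretation — the probability that a random positive scores above a random negative (a sketch under my own names, not scikit-learn's `roc_auc_score`):

```python
import numpy as np

def auc(scores, y_true):
    """AUC = P(score of a random positive > score of a random negative),
    with ties counted as 1/2; equals the area under the ROC curve."""
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    greater = np.sum(pos[:, None] > neg[None, :])
    ties = np.sum(pos[:, None] == neg[None, :])
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

scores = np.array([0.9, 0.8, 0.4, 0.3])
y_true = np.array([1,   1,   0,   1])
a = auc(scores, y_true)
# 2 of the 3 positive/negative pairs are ranked correctly → AUC = 2/3
```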
Logistic Regression
Logistic Regression in Scikit-Learn
The larger C is, the weaker the regularization on the weights (C is the inverse of the regularization strength)
multi_class:
- ovr
- multinomial
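A minimal usage sketch on a built-in dataset. Note that in recent scikit-learn releases the `multi_class` parameter is deprecated (multinomial is the default behavior for most solvers), so this sketch sticks to `C`:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

# C is the INVERSE regularization strength:
# larger C → weaker penalty → less control over the weights.
clf = LogisticRegression(C=1.0, max_iter=1000)
clf.fit(X, y)
acc = clf.score(X, y)  # training accuracy on iris
```

To see the regularization effect, refit with e.g. `C=1e-4` and compare the shrunken `clf.coef_`.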
short summary
F1-Score and AUC ROC
Threshold
F1 Score
Precision vs. Recall / ROC: TPR vs. FPR