

Learning Curves: article

Learning Curves

  • As the training set size grows, the training error increases.
  • The error levels off once the training set is larger than a certain size.
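The behavior above can be sketched numerically. This is a minimal, self-contained example (synthetic data and a plain least-squares linear fit, not code from the course): as the training subset grows, training error climbs from zero and validation error falls, and the two level off.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data (hypothetical, for illustration only).
X_all = rng.uniform(-3, 3, size=200)
y_all = 0.5 * X_all + rng.normal(scale=0.3, size=200)

X_train, y_train = X_all[:100], y_all[:100]
X_val, y_val = X_all[100:], y_all[100:]

def fit_linear(x, y):
    """Least-squares fit of y = w*x + b."""
    A = np.column_stack([x, np.ones_like(x)])
    w, b = np.linalg.lstsq(A, y, rcond=None)[0]
    return w, b

def mse(x, y, w, b):
    return np.mean((w * x + b - y) ** 2)

train_err, val_err = [], []
for m in range(2, 101, 7):
    w, b = fit_linear(X_train[:m], y_train[:m])
    train_err.append(mse(X_train[:m], y_train[:m], w, b))
    val_err.append(mse(X_val, y_val, w, b))

# With m=2 the line passes through both points exactly (train error ~0);
# as m grows, training error rises toward the noise floor while
# validation error settles near the same level.
```

Plotting `train_err` and `val_err` against `m` reproduces the learning-curve shape described above.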

High Bias

  • If the learning algorithm is suffering from high bias, adding more training data will not help much by itself.


High Variance

  • If the learning algorithm is suffering from high variance, adding more training data is likely to help.


Deciding What to Do Next Revisited: article

  • Our decision process can be broken down as follows:
    • Getting more training examples: Fixes high variance
    • Trying smaller sets of features: Fixes high variance
    • Adding features: Fixes high bias
    • Adding polynomial features: Fixes high bias
    • Decreasing λ: Fixes high bias
    • Increasing λ: Fixes high variance
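The decision list above hinges on first telling high bias apart from high variance. A rough rule of thumb: high training error means high bias; a large gap between training and cross-validation error means high variance. Here is a hypothetical helper sketching that logic (the thresholds and `target_err` are my assumptions, not part of the course material):

```python
def diagnose(train_err, cv_err, target_err):
    """Rough bias/variance diagnosis from train vs. cross-validation error.

    Hypothetical illustration: the 2x gap threshold and the notion of a
    target error are assumptions chosen for this sketch.
    """
    if train_err > target_err:
        return "high bias"       # underfitting: even training error is poor
    if cv_err > 2 * train_err:   # large train/CV gap: overfitting
        return "high variance"
    return "ok"

print(diagnose(train_err=0.30, cv_err=0.32, target_err=0.05))  # high bias
print(diagnose(train_err=0.01, cv_err=0.20, target_err=0.05))  # high variance
```

Once diagnosed, pick the matching fixes from the list: more data or more regularization for high variance, more features or less regularization for high bias.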

Diagnosing Neural Networks

  • A neural network with fewer parameters is prone to underfitting. It is also computationally cheaper.
  • A large neural network with more parameters is prone to overfitting. It is also computationally expensive. In this case you can use regularization (increase λ) to address the overfitting.

A single hidden layer is a good default to start with.
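The note above says to increase λ when a big model overfits; the standard way to pick λ is to try several values and keep the one with the lowest cross-validation error. A minimal sketch, using closed-form ridge regression on synthetic data as a stand-in for the regularized model (the dataset, candidate λ values, and split are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny synthetic dataset (assumed example, not from the course).
X = rng.normal(size=(60, 5))
true_w = np.array([1.0, -2.0, 0.0, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.5, size=60)
X_tr, y_tr, X_cv, y_cv = X[:40], y[:40], X[40:], y[40:]

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: w = (X^T X + lam*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Evaluate each candidate lambda on the held-out set; keep the best.
best = min((np.mean((X_cv @ ridge_fit(X_tr, y_tr, lam) - y_cv) ** 2), lam)
           for lam in [0.0, 0.01, 0.1, 1.0, 10.0, 100.0])
print("best lambda:", best[1], "cv mse:", round(best[0], 3))
```

The same loop applies to a neural network: train once per λ, score each on the cross-validation set, and deploy the winner.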

Model Complexity Effects:

  • Lower-order polynomials (low model complexity) have high bias and low variance. In this case, the model consistently fits the data poorly.

  • Higher-order polynomials (high model complexity) fit the training data extremely well and the test data extremely poorly. These have low bias on the training data, but very high variance.

  • In reality, we would want to choose a model somewhere in between, that can generalize well but also fits the data reasonably well.
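These three regimes can be seen directly by fitting polynomials of increasing degree to the same noisy data. A minimal sketch, assuming noisy samples from a sine curve (hypothetical data, not from the course): degree 1 underfits, a moderate degree generalizes well, and a high degree drives training error down while test error suffers.

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy training samples from a sine curve; noiseless test targets.
x_tr = np.sort(rng.uniform(0, 3, 20))
y_tr = np.sin(x_tr) + rng.normal(scale=0.2, size=20)
x_te = np.linspace(0.1, 2.9, 50)
y_te = np.sin(x_te)

results = {}
for deg in (1, 3, 12):
    coeffs = np.polyfit(x_tr, y_tr, deg)   # least-squares polynomial fit
    tr = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)
    te = np.mean((np.polyval(coeffs, x_te) - y_te) ** 2)
    results[deg] = (tr, te)
    print(f"degree {deg:2d}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

Training error only goes down as the degree rises, but test error follows the U-shape described above, which is why we pick a model in between.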