Tuesday, January 31, 2017

L1 vs L2

Objective Function = Loss Function + Regularization Penalty/Term

L1 loss = least absolute deviations = Sum(| y - f(x) |)
L2 loss = least squares error = Sum((y - f(x))^2)
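The two losses above can be computed directly; a minimal sketch with NumPy, using made-up targets y and predictions y_pred for illustration:

```python
import numpy as np

# Hypothetical targets y and model predictions f(x).
y = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# L1 loss: sum of absolute residuals.
l1_loss = np.sum(np.abs(y - y_pred))

# L2 loss: sum of squared residuals.
l2_loss = np.sum((y - y_pred) ** 2)

print(l1_loss)  # 2.0
print(l2_loss)  # 1.5
```

Note how the single large residual (1.0) contributes 1.0 to the L1 loss but dominates the L2 loss after squaring, which is why L2 is more sensitive to outliers.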

L1 vs L2 properties (loss function)

L1 loss: robust to outliers, but unstable (small changes in the data can move the solution, and multiple solutions may exist).
L2 loss: not robust to outliers (squaring amplifies large errors), but stable, with a single solution.

A regularization term is added to the objective to prevent the coefficients from fitting the training data so perfectly that the model overfits.

L1 reg = the sum of the absolute values of the weights.
L2 reg = the sum of the squares of the weights.
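Putting the pieces together, the full objective is the loss plus a weighted penalty. A sketch, assuming a linear model f(x) = X @ w and a single regularization strength lam (the names objective, lam, and penalty are illustrative):

```python
import numpy as np

def objective(w, X, y, lam, penalty="l2"):
    """Least-squares loss plus an L1 or L2 regularization term."""
    residual = y - X @ w
    loss = np.sum(residual ** 2)            # L2 loss
    if penalty == "l1":
        reg = lam * np.sum(np.abs(w))       # L1 reg: sum of |weights|
    else:
        reg = lam * np.sum(w ** 2)          # L2 reg: sum of squared weights
    return loss + reg

# Toy data where w fits perfectly, so the objective equals the penalty alone.
X = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([1.0, 2.0])
w = np.array([1.0, 2.0])
print(objective(w, X, y, lam=0.1, penalty="l1"))  # 0.1 * (1 + 2) ≈ 0.3
print(objective(w, X, y, lam=0.1, penalty="l2"))  # 0.1 * (1 + 4) ≈ 0.5
```

Even with a perfect fit, the penalty is nonzero, which is exactly the pressure that pushes the optimizer toward smaller (L2) or sparser (L1) weights.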

L1 vs L2 properties (regularization)

L1 reg: produces sparse weights, so it performs built-in feature selection.
L2 reg: produces small but non-sparse weights, and is computationally convenient (differentiable everywhere).
