Posts by Tags

L2 regularization

L2 Regularization from Probabilistic Perspective

6 minute read

Published:

Hi everyone! In the previous post we noted that least-squares regression is very prone to overfitting. Due to some of the assumptions used to derive it, the L2 loss function is sensitive to outliers, i.e. outliers can penalize the L2 loss heavily, throwing the model off entirely. You may already be aware that adding a regularization term helps improve the robustness of the model. The regularized L2 loss is expressed as follows:
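The excerpt cuts off before the formula; the regularized L2 loss it refers to is presumably the standard ridge form, a sketch of which is:

```latex
% Regularized L2 loss: squared-error term plus an L2 penalty on the weights,
% with \lambda controlling the strength of the regularization.
\mathcal{L}(\mathbf{w})
  = \sum_{i=1}^{N} \bigl( y_i - \mathbf{w}^{\top}\mathbf{x}_i \bigr)^{2}
  + \lambda \, \lVert \mathbf{w} \rVert_2^{2}
```

Larger values of \(\lambda\) shrink the weights more aggressively, trading a little bias for reduced sensitivity to outliers.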

linear regression

Linear Regression from Maximum Likelihood View

4 minute read

Published:

Hi everyone! In this post, we will discuss a very simple subject, particularly for those who have been studying machine learning: linear regression. However, we will try to cover it from the maximum likelihood angle of view. Hopefully this can give you some intuition of how the linear regression objective function can be derived.
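As a one-line preview of the idea (assuming, as is standard, Gaussian observation noise), maximizing the likelihood reduces to minimizing the familiar squared-error objective:

```latex
% Model: targets are a linear function of inputs plus Gaussian noise.
y_i = \mathbf{w}^{\top}\mathbf{x}_i + \epsilon_i,
\qquad \epsilon_i \sim \mathcal{N}(0, \sigma^{2})
% Maximizing the log-likelihood over w is equivalent to least squares,
% since the Gaussian log-density contributes -(y_i - w^T x_i)^2 / (2\sigma^2).
\hat{\mathbf{w}}_{\mathrm{ML}}
  = \arg\max_{\mathbf{w}} \sum_{i=1}^{N} \log \mathcal{N}\!\bigl(y_i \mid \mathbf{w}^{\top}\mathbf{x}_i, \sigma^{2}\bigr)
  = \arg\min_{\mathbf{w}} \sum_{i=1}^{N} \bigl( y_i - \mathbf{w}^{\top}\mathbf{x}_i \bigr)^{2}
```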

maximum likelihood

Linear Regression from Maximum Likelihood View

4 minute read

Published:

Hi everyone! In this post, we will discuss a very simple subject, particularly for those who have been studying machine learning: linear regression. However, we will try to cover it from the maximum likelihood angle of view. Hopefully this can give you some intuition of how the linear regression objective function can be derived.

variational inference

Variational Inference Step-by-Step (Part 3: Mean Field V.I. cont.)

5 minute read

Published:

Hello everyone! Thanks for following the series so far. This post will very likely be the last part of the series. Due to the depth of the topic, we will not be able to cover it end to end. Nonetheless, by the end of this post, we expect you to at least be familiar with the derivation of mean field variational inference. Enjoy!

Variational Inference Step-by-Step (Part 2: Mean Field V.I.)

6 minute read

Published:

Hi everyone, welcome to my second post of the Variational Inference step-by-step introduction! In the previous post we explained the motivation for using variational inference. We also introduced the importance of KL-divergence and how it is used in the main idea of variational inference. In this post, we will discuss the idea of finding the variational distribution and how mean field variational inference can help us do that.
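In brief, the setup being referred to is presumably the standard one: pick the variational distribution from a tractable family by minimizing its KL divergence to the true posterior, where the mean field family restricts q to a fully factorized form:

```latex
% Choose q from the family Q that is closest (in KL) to the posterior p(z | x).
q^{*}(\mathbf{z})
  = \arg\min_{q \in \mathcal{Q}} \,
    \mathrm{KL}\!\bigl( q(\mathbf{z}) \,\Vert\, p(\mathbf{z} \mid \mathbf{x}) \bigr)
% Mean field assumption: q factorizes over the latent variables.
q(\mathbf{z}) = \prod_{j} q_j(z_j)
```

The factorization is what makes the optimization tractable: each factor \(q_j\) can be updated in turn while holding the others fixed.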

Variational Inference Step-by-Step (Part 1: Motivation)

10 minute read

Published:

Hi everybody! For the past few days I have been struggling to study Variational Inference from many resources. I found that many textbooks jump into rigorous details before I could quite grasp the intuition behind them. Then this series of tutorial videos by Chieh Wu from Northeastern University came to the rescue. His explanation is excellent and very intuitive to me. I wrote this step-by-step series of posts mainly as lecture notes for myself, heavily following his videos. I hope it is useful for those who share the same frustration in learning the subject. Enjoy!