Page Not Found
Page not found. Your pixels are in another canvas.
A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.
About me
This is a page not in the main menu.
Published:
Hi everyone! In the previous post we noted that least-squares regression is very prone to overfitting. Because of the assumptions used to derive it, the L2 loss function is sensitive to outliers, i.e. outliers can penalize the L2 loss heavily, throwing off the model entirely. You may already be aware that adding regularization terms helps improve the robustness of the model. The regularized L2 loss is expressed as follows:
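The excerpt cuts off before the formula; under the usual notation, with weights $w$, inputs $x_i$, targets $y_i$, and regularization strength $\lambda$ (my notation here, a sketch of the standard ridge form rather than the post's exact equation), the regularized L2 loss can be written as:

```latex
L(w) = \sum_{i=1}^{N} \left( y_i - w^\top x_i \right)^2 + \lambda \lVert w \rVert_2^2
```

The $\lambda \lVert w \rVert_2^2$ term penalizes large weights, which is what tempers the influence of outliers on the fitted model.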
Published:
Hi everyone! In this post, we will discuss a very simple subject, particularly for those who have been studying machine learning: linear regression. However, we will approach it from the maximum likelihood point of view. Hopefully this can give you some intuition about how the linear regression objective function can be derived.
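To make the connection concrete: under the Gaussian-noise assumption, maximizing the likelihood is equivalent to minimizing the sum of squared residuals. A minimal sketch (synthetic data and variable names are my own, not from the post):

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus Gaussian noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 2.0 * X[:, 0] + 1.0 + rng.normal(scale=0.1, size=100)

# Maximizing the Gaussian likelihood reduces to least squares,
# solved here in closed form via the normal equations (lstsq).
A = np.column_stack([X, np.ones(len(X))])  # append a bias column
w, intercept = np.linalg.lstsq(A, y, rcond=None)[0]
print(w, intercept)  # estimates close to the true slope 2 and intercept 1
```

This is exactly the objective the maximum-likelihood derivation in the post arrives at: the Gaussian log-likelihood differs from the squared-error loss only by constants and a sign.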
Published:
Hello everyone! Thanks for following the series so far. This post will very likely be the last part of the series. Given the depth of the topic, we will not be able to cover it end to end. Nonetheless, by the end of this post, we expect you to at least be familiar with the derivation of mean field variational inference. Enjoy!
Published:
Hi everyone, welcome to the second post of my step-by-step introduction to Variational Inference! In the previous post we explained the motivation for using variational inference. We also introduced the importance of KL-divergence and how it is used in the main idea of variational inference. In this post, we will discuss how to find the variational distribution and how mean field variational inference can help us do that.
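As a quick refresher of the two ingredients mentioned above (written in the usual notation with latent variables $z$ and observations $x$; a sketch, not necessarily the post's exact symbols): the KL-divergence measures how far the variational distribution $q$ is from the true posterior, and the mean field assumption factorizes $q$ over the latent variables:

```latex
\mathrm{KL}\big(q \,\|\, p\big) = \int q(z) \log \frac{q(z)}{p(z \mid x)} \, dz,
\qquad
q(z) = \prod_{j=1}^{m} q_j(z_j)
```

The factorized form is what makes the optimization tractable: each factor $q_j$ can be updated in turn while the others are held fixed.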
Published:
Hi everybody! For the past few days I have been struggling to study Variational Inference from many resources. I found that many textbooks jump into rigorous detail before I could quite grasp the intuition behind it. Then this series of tutorial videos by Chieh Wu from Northeastern University came to the rescue. His explanations are excellent and very intuitive. I wrote this series of step-by-step posts mainly as lecture notes for myself, and it closely follows his videos. I hope it is useful for those who share the same frustration in learning the subject. Enjoy!
Short description of portfolio item number 1
Short description of portfolio item number 2
Published in Journal 1, 2009
This paper is about the number 1. The number 2 is left for future work.
Recommended citation: Your Name, You. (2009). "Paper Title Number 1." Journal 1. 1(1). http://academicpages.github.io/files/paper1.pdf
Published in Journal 1, 2010
This paper is about the number 2. The number 3 is left for future work.
Recommended citation: Your Name, You. (2010). "Paper Title Number 2." Journal 1. 1(2). http://academicpages.github.io/files/paper2.pdf
Published in Journal 1, 2015
This paper is about the number 3. The number 4 is left for future work.
Recommended citation: Your Name, You. (2015). "Paper Title Number 3." Journal 1. 1(3). http://academicpages.github.io/files/paper3.pdf
Undergraduate course, University 1, Department, 2014
This is a description of a teaching experience. You can use markdown like any other post.
Workshop, University 1, Department, 2015
This is a description of a teaching experience. You can use markdown like any other post.