Kernels make a whole lot more sense in the context of support vector machines, but in this series of videos I'm going to introduce you to kernels from the change-of-basis approach. Perhaps at a later point I'll make a video series about SVMs to solidify things, but I think I'd rather create a video that focuses on the dual and primal forms and where they are used. I guess we'll see how ambitious I become. Anyways, enjoy!

Why do we need Kernels?
The Dual Form
Kernel Properties
Example and Summary
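As a quick taste of the change-of-basis idea, here is a minimal sketch (my own illustration, not taken from the videos): a degree-2 polynomial kernel computes the same inner product as an explicit feature map `phi` into a higher-dimensional basis, without ever constructing that basis.

```python
import numpy as np

# Illustrative example: for 2-D inputs, the kernel k(x, z) = (x . z)^2
# equals the ordinary inner product after the change of basis
# phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2).

def phi(x):
    # explicit lift into the 3-D feature space
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def kernel(x, z):
    # the same quantity, computed without ever forming phi
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])
print(np.dot(phi(x), phi(z)), kernel(x, z))  # both are 16.0
```

The point of the "kernel trick" is that the right-hand computation never pays the cost of the lifted space, which matters when `phi` maps into very high (or infinite) dimensions.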

Linear Regression [Part 3]

Ok, the third part in this series is to help you better understand use cases of these different types of linear regression. This is obviously not exhaustive, but it may "get the wheels turning" regarding your own internalization of the uses of this model. Enjoy!

Linear Regression [Part 2] - Bayesian

In this second part of my 3-part series, I discuss a Bayesian formulation of linear regression. I compare and contrast it with the direct linear algebra approach from the first part. This lecture assumes you are familiar with Bayes' formula and standard probability concepts like the posterior and likelihood. Dive in!
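For reference, here is a hedged sketch of the closed-form posterior in Bayesian linear regression (my own illustration, not from the lecture; the prior precision `alpha` and noise precision `beta` are assumed values): with a Gaussian prior on the weights and Gaussian noise, the posterior over the weights is again Gaussian.

```python
import numpy as np

# Sketch of Bayesian linear regression with prior w ~ N(0, alpha^-1 I)
# and observation noise precision beta. The posterior over w is Gaussian
# with precision S_inv and mean m below.

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))                  # design matrix
true_w = np.array([2.0, -1.0])
y = X @ true_w + rng.normal(scale=0.1, size=50)

alpha, beta = 1.0, 100.0                      # assumed precisions
S_inv = alpha * np.eye(2) + beta * X.T @ X    # posterior precision
S = np.linalg.inv(S_inv)                      # posterior covariance
m = beta * S @ X.T @ y                        # posterior mean

print(m)  # close to true_w; shrunk slightly toward 0 by the prior
```

Contrast with the first part: the posterior mean reduces to the ordinary least-squares solution as `alpha → 0`, which is one concrete way the two formulations connect.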

Linear Regression [Part 1]

I remember being fairly confused when I was in my first machine learning course and we covered the topic of linear regression. Looking back, I've realized that the reason I was so confused was that the different ways of doing linear regression were not laid out in a manner that made the differences explicit. What I've tried to do in this 3-part series on linear regression is break the information into digestible chunks.
In this first part, I cover linear regression from the linear algebra approach. This is probably the approach most students who have experience in statistics will be familiar with. Enjoy!
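The linear algebra approach boils down to solving the normal equations, (XᵀX)w = Xᵀy, for the least-squares weights. A minimal sketch (my own toy data, not from the lecture):

```python
import numpy as np

# Least squares via the normal equations: solve (X^T X) w = X^T y.

X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])            # first column of ones for the intercept
y = np.array([1.0, 3.0, 5.0])         # these points lie exactly on y = 1 + 2x

w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # [1. 2.]
```

In practice `np.linalg.lstsq` is preferred over explicitly forming XᵀX, since it is more numerically stable, but the normal equations make the underlying linear algebra explicit.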