Transforming distributions with Normalizing Flows
What are normalizing flows and why should we care?
My highlights from the AKBC 2020 conference.
Many problems in machine learning deal with the idea of making two probability distributions as close as possible. In the simpler case where we only ha...
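To make the idea concrete, here is a minimal sketch (not taken from the notebook itself) that measures how close two univariate Gaussians are using the KL divergence, the quantity most often minimized when matching distributions; the parameter values below are arbitrary illustrations.

```python
import numpy as np

def kl_gaussians(mu_q, sigma_q, mu_p, sigma_p):
    """KL(q || p) between two univariate Gaussians, in closed form."""
    return (np.log(sigma_p / sigma_q)
            + (sigma_q**2 + (mu_q - mu_p)**2) / (2 * sigma_p**2)
            - 0.5)

# Identical distributions give KL = 0; the further apart they are, the larger the KL.
print(kl_gaussians(0.0, 1.0, 0.0, 1.0))  # 0.0
print(kl_gaussians(0.0, 1.0, 3.0, 1.0))  # 4.5
```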
In this notebook we are interested in the problem of inference in a probabilistic model that contains both observed and latent variables, which can be repres...
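As a toy illustration of such a model (a sketch with made-up parameters, not the notebook's example), consider a two-component Gaussian mixture: the component assignment is the latent variable, the data point is observed, inference means computing the posterior over the latent variable, and the marginal likelihood sums it out.

```python
import numpy as np
from scipy.stats import norm

# Mixture weights and component parameters (the latent variable z picks a component).
weights = np.array([0.3, 0.7])
means = np.array([-2.0, 1.0])
stds = np.array([0.5, 1.0])

def marginal_likelihood(x):
    """p(x) = sum_z p(z) p(x | z), marginalizing the latent component z."""
    return np.sum(weights * norm.pdf(x, loc=means, scale=stds))

def posterior_over_z(x):
    """p(z | x): how responsible each component is for the observation x."""
    joint = weights * norm.pdf(x, loc=means, scale=stds)
    return joint / joint.sum()

print(marginal_likelihood(0.5))
print(posterior_over_z(0.5))
```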
In this problem we are faced with a set of $k$ bandits, each of which gives an uncertain reward. The main purpose is to find the set of actions that lead to ...
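A minimal sketch of one classic strategy for this setting, an ε-greedy agent with sample-average value estimates; the Gaussian rewards and the value of ε below are illustrative assumptions, not necessarily the notebook's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 5
true_means = rng.normal(size=k)      # unknown expected reward of each bandit
estimates = np.zeros(k)              # running estimate of each reward
counts = np.zeros(k)
epsilon = 0.1

for step in range(1000):
    # Explore with probability epsilon, otherwise exploit the current best estimate.
    if rng.random() < epsilon:
        action = rng.integers(k)
    else:
        action = int(np.argmax(estimates))
    reward = rng.normal(true_means[action], 1.0)
    counts[action] += 1
    estimates[action] += (reward - estimates[action]) / counts[action]

print("best arm:", int(np.argmax(true_means)), "most chosen:", int(np.argmax(counts)))
```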
In all the notebooks we’ve seen so far, we have made the assumption that the observations correspond directly to realizations of a random variable. Take the ...
Neural networks are very popular function approximators used in a wide variety of fields nowadays and coming in all kinds of flavors, so there are countless ...
When we perform linear regression using maximum likelihood estimation, we obtain a point estimate of the parameters $\mathbf{w}$ of the linear model. The Bay...
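For reference, a small sketch of the closed-form Gaussian posterior over $\mathbf{w}$ under a Gaussian prior and Gaussian noise; the synthetic data and the prior precision $\alpha$ and noise precision $\beta$ are assumed values for illustration, not taken from the notebook.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 1))
Phi = np.hstack([np.ones((50, 1)), X])           # design matrix with a bias column
t = 0.5 + 2.0 * X[:, 0] + rng.normal(0, 0.3, 50)

alpha, beta = 2.0, 1.0 / 0.3**2                  # prior and noise precision (assumed)

# Posterior covariance and mean:
#   S_N = (alpha * I + beta * Phi^T Phi)^{-1},  m_N = beta * S_N Phi^T t
S_N = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ t

print("posterior mean of w:", m_N)
print("posterior std of w: ", np.sqrt(np.diag(S_N)))
```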
Digital signals are all around us. From the phone in our pockets to the massive infrastructure behind the Internet, they have enabled a wide variety of techn...
In this notebook we will deal with two interesting applications of Fisher’s linear discriminant: dimensionality reduction, and classification. This discrimin...
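As a quick sketch of the projection step (on synthetic two-class data, which is an assumption of this example): Fisher's direction is $\mathbf{w} \propto \mathbf{S}_W^{-1}(\mathbf{m}_2 - \mathbf{m}_1)$, where $\mathbf{S}_W$ is the within-class scatter matrix.

```python
import numpy as np

rng = np.random.default_rng(2)
X1 = rng.normal([0, 0], 1.0, size=(100, 2))   # class 1 (synthetic)
X2 = rng.normal([3, 2], 1.0, size=(100, 2))   # class 2 (synthetic)

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
# Within-class scatter: sum of each class's scatter around its own mean.
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

w = np.linalg.solve(S_W, m2 - m1)             # Fisher's direction (up to scale)
w /= np.linalg.norm(w)

# Projecting both classes onto w should separate them well in one dimension.
print("projected class means:", (X1 @ w).mean(), (X2 @ w).mean())
```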
In this notebook we will examine the method of Principal Component Analysis (PCA), which can be used to project data onto lower-dimensional spaces while maxi...
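A minimal sketch of the core computation, an eigendecomposition of the sample covariance matrix; the random data here is just a placeholder.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))                 # placeholder data: 200 samples, 5 features

Xc = X - X.mean(axis=0)                       # center the data
cov = Xc.T @ Xc / (len(X) - 1)                # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)        # eigh: the covariance is symmetric

# Keep the 2 directions of maximum variance (largest eigenvalues).
order = np.argsort(eigvals)[::-1][:2]
components = eigvecs[:, order]
X_projected = Xc @ components                 # 200 x 2 lower-dimensional representation

print("explained variance ratio:", eigvals[order] / eigvals.sum())
```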
This is the second notebook I've written on linear regression: it's time to apply this model to a real dataset, starting with the Boston housing d...
In this notebook we will examine different ways to train linear models for a regression problem. In this problem we have a dataset of N input variables $\mat...
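One of those training procedures, maximum likelihood under Gaussian noise, reduces to least squares; here is a minimal sketch (with synthetic data as an assumption) that solves the normal equations $\mathbf{w}_{\mathrm{ML}} = (\boldsymbol{\Phi}^\top \boldsymbol{\Phi})^{-1} \boldsymbol{\Phi}^\top \mathbf{t}$.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, size=100)
t = 1.0 - 2.0 * x + rng.normal(0, 0.2, size=100)   # synthetic targets

Phi = np.column_stack([np.ones_like(x), x])        # design matrix [1, x]

# Maximum likelihood / least-squares solution via the normal equations.
w_ml, *_ = np.linalg.lstsq(Phi, t, rcond=None)
print("estimated w:", w_ml)                        # should be close to [1.0, -2.0]
```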