Real World Real Time and Five Papers for Mike Tipping

Season 5, Episode 3, Feb 15, 2019, 01:11 AM

In season five episode three we chat about Five Papers for Mike Tipping, take a listener question on AI and chat with Eoin O'Mahony of Uber.

Here are Neil's five papers. What are yours?

Stochastic variational inference by Hoffman, Wang, Blei and Paisley

http://arxiv.org/abs/1206.7051

A way of doing approximate inference for probabilistic models with potentially billions of data points ... need I say more?
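
To make that concrete, here is a minimal sketch (my own illustration, not code from the paper) of a stochastic variational update on a toy conjugate model: observations drawn from N(mu, 1) with a N(0, 1) prior on mu, so the Gaussian approximation q(mu) can be refreshed from rescaled minibatches via the paper's natural-parameter averaging. The model, step-size schedule and variable names are assumptions made for the example.

```python
# Minimal sketch of stochastic variational inference on a toy conjugate model.
# The "intermediate" parameters treat a rescaled minibatch as if it were the full
# dataset, and a Robbins-Monro step size blends them into the running estimate.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
data = rng.normal(3.0, 1.0, size=N)         # a "large" dataset with true mean 3

# q(mu) = N(m, v), tracked as (precision, precision * mean), initialised at the prior
prec, prec_mean = 1.0, 0.0
batch, tau, kappa = 100, 1.0, 0.7           # illustrative settings, not from the paper

for t in range(1, 501):
    x = data[rng.integers(0, N, size=batch)]
    # intermediate estimate: pretend the rescaled minibatch were the whole dataset
    prec_hat = 1.0 + N                      # prior precision + N unit-precision likelihood terms
    prec_mean_hat = (N / batch) * x.sum()   # prior mean contributes zero
    rho = (t + tau) ** (-kappa)             # decaying Robbins-Monro step size
    prec = (1 - rho) * prec + rho * prec_hat
    prec_mean = (1 - rho) * prec_mean + rho * prec_mean_hat

print("approximate posterior mean:", prec_mean / prec)
print("approximate posterior std: ", (1.0 / prec) ** 0.5)
```

Each update touches only a hundred points, yet the running estimate homes in on the posterior you would get from all one million.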

Austerity in MCMC Land: Cutting the Metropolis-Hastings Budget by Korattikara, Chen and Welling

http://arxiv.org/abs/1304.5299

Oh ... I do need to say more ... because these three are at it as well but from the sampling perspective. Probabilistic models for big data ... an idea so important it needed to be in the list twice. 
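
As another hand-rolled sketch rather than the authors' algorithm, the code below shows the trick that makes this possible: the Metropolis-Hastings accept/reject decision only depends on whether the average log-likelihood ratio over the data clears a threshold, and a t-test on a growing subsample can often answer that confidently without reading the full dataset. The toy Gaussian model, flat prior, batch size and error tolerance are assumptions for illustration.

```python
# Sketch of an approximate Metropolis-Hastings test that reads data in batches and
# stops as soon as a t-test is confident about the accept/reject decision.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
N = 100_000
data = rng.normal(2.0, 1.0, size=N)             # toy model: x_i ~ N(theta, 1)

def approx_mh_accept(theta, theta_prop, eps=0.05, batch=500):
    """Approximate MH accept/reject for a flat prior and symmetric proposal."""
    u = rng.uniform()
    # exact test: accept iff mean_i [log p(x_i|theta') - log p(x_i|theta)] > mu0
    mu0 = np.log(u) / N
    order = rng.permutation(N)
    diffs = np.empty(0)
    seen = 0
    while seen < N:
        x = data[order[seen:seen + batch]]
        # per-point log-likelihood difference for the Gaussian model
        diffs = np.concatenate([diffs, 0.5 * (x - theta) ** 2 - 0.5 * (x - theta_prop) ** 2])
        seen = len(diffs)
        mean, sd = diffs.mean(), diffs.std(ddof=1)
        # standard error with a finite-population correction (sampling without replacement)
        se = (sd / np.sqrt(seen)) * np.sqrt(1.0 - (seen - 1) / (N - 1))
        if se == 0.0:
            break                               # all data seen (or no variation): just decide
        tstat = (mean - mu0) / se
        if stats.t.sf(abs(tstat), df=seen - 1) < eps:
            break                               # confident enough; stop reading data
    return diffs.mean() > mu0, seen

# one step of the approximate sampler, starting from theta = 0
theta = 0.0
theta_prop = theta + rng.normal(0.0, 0.01)
accepted, points_used = approx_mh_accept(theta, theta_prop)
print(f"accepted: {accepted}, data points inspected: {points_used} of {N}")
```

In a run like this the decision is typically made after a single batch of 500 points rather than all 100,000, which is the whole point of the paper.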

Practical Bayesian Optimization of Machine Learning Algorithms by Snoek, Larochelle and Adams

http://arxiv.org/abs/1206.2944

This paper represents the rise of probabilistic numerics; I could also have chosen papers by Osborne, Hennig or others, as there are too many papers out there already. Definitely an exciting area, be it optimisation, integration or differential equations. I chose this paper because it seems to have blown the field open to a wider audience, focussing as it did on deep learning as an application, so it lets me capture both an area of developing interest and an area that hits the national news.
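
For anyone who has not met Bayesian optimisation, here is a minimal sketch of the loop Snoek et al. popularised for hyperparameter tuning: fit a Gaussian process to the evaluations gathered so far, then pick the next point to try by maximising expected improvement. The toy objective, kernel settings and grid of candidates are all assumptions for the example rather than anything from the paper.

```python
# Sketch of Bayesian optimisation: a Gaussian process surrogate plus an
# expected-improvement acquisition function chooses where to evaluate next.
import numpy as np
from scipy.stats import norm

def objective(x):
    """Stand-in for an expensive training run scored on held-out data."""
    return -np.sin(3.0 * x) - x ** 2 + 0.7 * x

def gp_posterior(X, y, Xs, ell=0.4, sf2=1.0, noise=1e-6):
    """GP regression with a squared-exponential kernel; predictive mean/std at Xs."""
    k = lambda a, b: sf2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    L = np.linalg.cholesky(k(X, X) + noise * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = k(X, Xs)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = sf2 - np.sum(v ** 2, axis=0)
    return mean, np.sqrt(np.maximum(var, 1e-12))

candidates = np.linspace(-2.0, 2.0, 200)        # the "hyperparameter" search space
X = np.array([-1.5, 0.0, 1.5])                  # a few initial evaluations
y = objective(X)

for _ in range(10):
    mean, std = gp_posterior(X, y, candidates)
    best = y.max()
    z = (mean - best) / std
    ei = (mean - best) * norm.cdf(z) + std * norm.pdf(z)    # expected improvement
    x_next = candidates[np.argmax(ei)]                       # most promising next run
    X, y = np.append(X, x_next), np.append(y, objective(x_next))

print(f"best setting found: {X[np.argmax(y)]:.3f} with value {y.max():.3f}")
```

The loop only ever calls the (pretend) expensive objective thirteen times, which is why this style of search is so attractive when each evaluation means training a deep network.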


Kernel Bayes' Rule by Fukumizu, Song and Gretton

http://arxiv.org/abs/1009.5736

One of the great things about ML is how we have different (and competing) philosophies operating under the same roof. But because we still talk to each other (and sometimes even listen to each other), these ideas can merge to create new and interesting things. Kernel Bayes' Rule makes the list.


ImageNet Classification with Deep Convolutional Neural Networks by Krizhevsky, Sutskever and Hinton

http://www.cs.toronto.edu/~hinton/absps/imagenet.pdf

An obvious choice, but you don't leave the Beatles off lists of great bands just because they are an obvious choice.