I am a fourth-year PhD student working with Arthur Gretton at the Gatsby Computational Neuroscience Unit, UCL, and I expect to finish my PhD in September 2017. I received an M.Eng. from the Tokyo Institute of Technology, where I worked with Masashi Sugiyama on supervised feature selection using squared-loss mutual information. Before that, I was a research assistant working with Thanaruk Theeramunkong on a Thai news relation discovery project. I received a B.Sc. in Computer Science from SIIT, Thammasat University, Thailand.
My publications are listed on this page. I occasionally update my blog with summaries of what I learn. Some photos I have taken are on Flickr. I also maintain the website of the Gatsby machine learning journal club.
Contact: Wittawat Jitkrittum (วิทวัส จิตกฤตธรรม)
Nov 2016. New paper: Cognitive Bias in Ambiguity Judgements: Using Computational Models to Dissect the Effects of Mild Mood Manipulation in Humans, published in PLOS ONE.
Oct 2016. An Adaptive Test of Independence with Analytic Kernel Embeddings: a fast nonparametric independence test. Python code is available here.
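As a loose illustration of kernel-based nonparametric independence testing only (not the adaptive, linear-time test in the paper), here is a quadratic-time HSIC permutation test; all function names and default parameters below are my own choices:

```python
import numpy as np

def hsic_permutation_test(X, Y, n_perm=200, gamma=1.0, seed=0):
    """Biased-HSIC permutation test of independence between paired samples.

    X, Y: (n, d) arrays of paired observations.
    Returns (statistic, p-value); small p suggests dependence.
    """
    rng = np.random.default_rng(seed)

    def gram(A):
        # Gaussian kernel Gram matrix with bandwidth parameter gamma
        sq = ((A[:, None, :] - A[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    n = X.shape[0]
    K, L = gram(X), gram(Y)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = H @ K @ H
    stat = (Kc * L).sum() / n**2          # biased HSIC estimate

    # Null distribution: permute Y to break any dependence with X
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(n)
        null[i] = (Kc * L[np.ix_(perm, perm)]).sum() / n**2
    pval = (1 + (null >= stat).sum()) / (1 + n_perm)
    return stat, pval
```

Note the quadratic cost in the sample size from the Gram matrices; the appeal of the paper's test is precisely that it avoids this.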
Aug 2016. Interpretable Distribution Features with Maximum Testing Power was accepted to NIPS 2016 for a full oral presentation. See our two-minute introduction video here.
May 2016. Interpretable Distribution Features with Maximum Testing Power: a linear-time nonparametric two-sample test that returns a set of local features indicating why the null hypothesis is rejected. Python code is available on GitHub.
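The statistic compares the two samples through differences of kernel mean embeddings evaluated at a small set of test locations (which the method optimizes for test power; the locations themselves are the interpretable local features). A minimal sketch of such a statistic at fixed, non-optimized locations, with all names and defaults my own assumptions:

```python
import numpy as np
from scipy.stats import chi2

def me_test_statistic(X, Y, V, gamma=1.0):
    """Simplified mean-embedding two-sample statistic at fixed locations V.

    X, Y: (n, d) samples from the two distributions (paired sizes assumed).
    V: (J, d) test locations; the actual method optimizes these for power.
    Under H0 the statistic is asymptotically chi-squared with J dof.
    Returns (statistic, p-value). Runs in O(n*J) time.
    """
    def feat(A):
        # Gaussian kernel evaluations k(a_i, v_j), shape (n, J)
        sq = ((A[:, None, :] - V[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    Z = feat(X) - feat(Y)                 # per-sample witness contributions
    n, J = Z.shape
    zbar = Z.mean(0)
    S = np.cov(Z, rowvar=False) + 1e-8 * np.eye(J)  # regularized covariance
    stat = n * zbar @ np.linalg.solve(S, zbar)
    pval = chi2.sf(stat, df=J)
    return stat, pval
```

Locations where the witness contributions are large are exactly the regions where the two distributions differ, which is what makes the features interpretable.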
Dec 2015. K2-ABC: Approximate Bayesian Computation with Infinite Dimensional Summary Statistics via Kernel Embeddings: summary-statistic-free approximate Bayesian computation using kernel embeddings. Accepted to AISTATS 2016.
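Roughly, the idea is to compare observed and simulated data directly via the MMD between their kernel mean embeddings, weighting each prior draw by exp(−MMD²/ε), rather than relying on hand-crafted summary statistics. A minimal sketch on a toy 1-D Gaussian mean-inference problem (the function names, soft-weighting form, and tolerance ε below are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np

def mmd2_biased(X, Y, gamma=1.0):
    """Biased squared MMD between samples X, Y with a Gaussian kernel."""
    def k(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

def k2abc_sketch(observed, simulate, prior_sample, n_sim=300, eps=0.1, seed=0):
    """Soft-rejection ABC: weight each prior draw by exp(-MMD^2 / eps)
    between its simulated dataset and the observed data."""
    rng = np.random.default_rng(seed)
    thetas = np.array([prior_sample(rng) for _ in range(n_sim)])
    w = np.array([np.exp(-mmd2_biased(observed, simulate(t, rng)) / eps)
                  for t in thetas])
    w /= w.sum()
    return thetas, w  # weighted sample from the approximate posterior

# Toy usage: infer the mean of a unit-variance Gaussian from 100 observations.
rng0 = np.random.default_rng(42)
obs = rng0.normal(2.0, 1.0, (100, 1))
thetas, w = k2abc_sketch(
    obs,
    simulate=lambda t, r: r.normal(t, 1.0, (100, 1)),
    prior_sample=lambda r: r.uniform(-5, 5),
)
post_mean = float((thetas * w).sum())  # should land near the true mean 2.0
```

The weighted draws play the role of a posterior sample, with ε controlling how sharply simulations far from the data are down-weighted.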
Nov 2015. We released the code for the Locally Linear Latent Variable Model (LL-LVM), accepted to NIPS 2015. Matlab code is available on GitHub.
Jun 2015. Bayesian Manifold Learning: Locally Linear Latent Variable Model (LL-LVM): a probabilistic model for non-linear manifold discovery.
Mar 2015. Kernel-Based Just-In-Time Learning for Passing Expectation Propagation Messages: a fast, online algorithm for nonparametric learning of EP message updates. Source code available here.