I am a postdoctoral researcher in the Empirical Inference Department at the Max Planck Institute for Intelligent Systems, working with Bernhard Schölkopf. I work in the field of machine learning (the intersection of computer science and statistics). More specifically, my research interests range from non-parametric statistical tests and approximate Bayesian inference to kernel-based feature representations of data. I completed my PhD at the Gatsby Computational Neuroscience Unit, UCL, where I worked with Arthur Gretton on various topics related to kernel-based non-parametric statistical tests. I received an MEng from the Tokyo Institute of Technology, where I worked with Masashi Sugiyama on supervised feature selection using squared-loss mutual information. Before that, I was a research assistant working with Thanaruk Theeramunkong on a Thai news relations discovery project. I received a BSc in Computer Science from SIIT, Thammasat University, Thailand.
My publications are listed on this page. Software packages I have released can be found here. I occasionally update my blog, summarizing what I learn. Some photos I have taken are on Flickr.
I am always open to research discussions. If there is a chance that I could contribute to your projects, please get in touch.
Contact: Wittawat Jitkrittum (วิทวัส จิตกฤตธรรม) ( )
I will give a talk at the Workshop on Functional Inference and Machine Intelligence (19-21 Feb 2018), Tokyo.
The podcast we recorded with This Week in Machine Learning & AI on our NIPS 2017 Best Paper was published here.
I started working remotely as a postdoc at the Max Planck Institute for Intelligent Systems with Bernhard Schölkopf.
I gave a 50-minute presentation on A Linear-Time Kernel Goodness-of-Fit Test (NIPS 2017 best paper). Download the slides.
I gave an oral presentation at NIPS 2017 on A Linear-Time Kernel Goodness-of-Fit Test. The paper received one of the three best paper awards at NIPS 2017. Python code here. Presentation slides here. Talk video here.
Our paper A Linear-Time Kernel Goodness-of-Fit Test has been accepted for oral presentation at NIPS 2017. Python code.
I gave an oral presentation at ICML 2017 on our new linear-time independence test. The slides are here.
Uploaded slides for the linear-time nonparametric goodness-of-fit test.
We have released Python code for our linear-time nonparametric goodness-of-fit test.
I gave a talk on a new linear-time dependence measure at the UCL workshop on the theory of big data. See the slides here.
A new fast nonparametric goodness-of-fit test. See A Linear-Time Kernel Goodness-of-Fit Test.
Our paper: An Adaptive Test of Independence with Analytic Kernel Embeddings is accepted to ICML 2017.
A new paper: Cognitive Bias in Ambiguity Judgements: Using Computational Models to Dissect the Effects of Mild Mood Manipulation in Humans, published in PLOS ONE.
An Adaptive Test of Independence with Analytic Kernel Embeddings, a fast nonparametric independence test. Python code here.
Interpretable Distribution Features with Maximum Testing Power is accepted to NIPS 2016 as a full oral presentation. See our 2-minute introduction video here.
Interpretable Distribution Features with Maximum Testing Power: a linear-time nonparametric two-sample test which returns a set of local features indicating why the null hypothesis is rejected. Python code available on Github.
K2-ABC: Approximate Bayesian Computation with Infinite Dimensional Summary Statistics via Kernel Embeddings: summary statistic free approximate Bayesian computation with kernel embeddings. Accepted to AISTATS 2016.
We released the code for the Locally Linear Latent Variable Model (LL-LVM), which was accepted to NIPS 2015. Check out our Matlab code on Github.
Bayesian Manifold Learning: Locally Linear Latent Variable Model (LL-LVM): a probabilistic model for non-linear manifold discovery.
Kernel-Based Just-In-Time Learning for Passing Expectation Propagation Messages: a fast, online algorithm for nonparametric learning of EP message updates. Source code available here.
Updated: 12-Feb-18
Based on Hyde Theme.