I am a postdoctoral researcher in the Empirical Inference Department at the Max Planck Institute for Intelligent Systems, working with Bernhard Schölkopf. I work in machine learning, at the intersection of computer science and statistics. More specifically, my research interests range from nonparametric statistical tests and approximate Bayesian inference to kernel-based feature representations of data. I completed my PhD at the Gatsby Computational Neuroscience Unit, UCL, where I worked with Arthur Gretton on topics related to kernel-based nonparametric statistical tests. I received an MEng from the Tokyo Institute of Technology, where I worked with Masashi Sugiyama on supervised feature selection using squared-loss mutual information. Before that, I was a research assistant working with Thanaruk Theeramunkong on a Thai news relation discovery project. I received a BSc in Computer Science from SIIT, Thammasat University, Thailand.
I am always open to research discussions. If there is a chance that I could contribute to your project, please get in touch. If you are a bachelor's, master's, or PhD student working in a field related to machine learning and are looking for an internship (2-6 months) at the Empirical Inference Department, please contact me.
Contact: Wittawat Jitkrittum (วิทวัส จิตกฤตธรรม)
Max Planck Institute for Intelligent Systems
Phone: +49 7071 601
I will speak at the Symposium on Frontier Research in Information Science and Technology at VISTEC, Thailand.
Our paper “Informative Features for Model Comparison” was accepted to NIPS 2018. Given samples from two candidate models and a reference sample (the observed data), the goal is to design a statistical test that determines which of the two model samples is closer to the reference sample. One application is comparing two GAN models.
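To make the setting concrete, here is a toy sketch of the comparison idea (not the test from the paper, and without its statistical calibration): estimate a kernel distance from each candidate sample to the reference sample and pick the smaller one. The function names, the quadratic-time MMD estimate, and the Gaussian toy data are all my own choices for illustration.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * bandwidth**2))

def mmd2(X, Y, bandwidth=1.0):
    # Biased quadratic-time estimate of the squared maximum mean discrepancy.
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

rng = np.random.default_rng(0)
R = rng.normal(0.0, 1.0, size=(500, 2))   # reference sample (observed data)
P = rng.normal(0.1, 1.0, size=(500, 2))   # sample from candidate model P (small shift)
Q = rng.normal(2.0, 1.0, size=(500, 2))   # sample from candidate model Q (large shift)

# Point decision only; the paper's test additionally quantifies significance.
closer = "P" if mmd2(P, R) < mmd2(Q, R) else "Q"
print(closer)  # → P
```

A real test must also account for estimation noise: with samples this size, a raw comparison of two distance estimates can flip by chance when the models are nearly equally good.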
I was invited to speak at the Department of Mathematics and Computer Science, Chulalongkorn University. Topic: A Technical Introduction to Kernel Goodness-of-Fit Testing.
I was invited to give a talk at the Workshop on Functional Inference and Machine Intelligence (19-21 Feb 2018), Tokyo. Slides.
I started working remotely as a postdoc at Max Planck Institute for Intelligent Systems with Bernhard Schölkopf.
I gave an oral presentation at NIPS 2017 on A Linear-Time Kernel Goodness-of-Fit Test. The paper received one of the three best paper awards at NIPS 2017. Python code here. Presentation slides here. Talk video here.
I gave an oral presentation at ICML 2017 on our new linear-time independence test. The slides are here.
A new fast nonparametric goodness-of-fit test. See A Linear-Time Kernel Goodness-of-Fit Test.
Our paper An Adaptive Test of Independence with Analytic Kernel Embeddings was accepted to ICML 2017.
An Adaptive Test of Independence with Analytic Kernel Embeddings, a fast nonparametric independence test. Python code here.
Interpretable Distribution Features with Maximum Testing Power was accepted to NIPS 2016 as a full oral presentation. See our two-minute introduction video here.
Interpretable Distribution Features with Maximum Testing Power: a linear-time nonparametric two-sample test which returns a set of local features indicating why the null hypothesis is rejected. Python code available on Github.
K2-ABC: Approximate Bayesian Computation with Infinite Dimensional Summary Statistics via Kernel Embeddings: summary statistic free approximate Bayesian computation with kernel embeddings. Accepted to AISTATS 2016.
We released the code for the Locally Linear Latent Variable Model (LL-LVM), accepted to NIPS 2015. Our MATLAB code is available here on GitHub.
Bayesian Manifold Learning: Locally Linear Latent Variable Model (LL-LVM): a probabilistic model for non-linear manifold discovery.
Kernel-Based Just-In-Time Learning for Passing Expectation Propagation Messages: a fast, online algorithm for nonparametric learning of EP message updates. Source code available here.
Based on Hyde Theme.