I am a postdoctoral researcher in the Empirical Inference Department at the Max Planck Institute for Intelligent Systems, working with Bernhard Schölkopf. I work in machine learning, at the intersection of computer science and statistics. More specifically, my research interests range from nonparametric statistical tests and approximate Bayesian inference to kernel-based feature representations of data. I completed my PhD at the Gatsby Computational Neuroscience Unit, UCL, where I worked with Arthur Gretton on topics related to kernel-based nonparametric statistical tests. I received an MEng from the Tokyo Institute of Technology, where I worked with Masashi Sugiyama on supervised feature selection using squared-loss mutual information. Before that, I was a research assistant working with Thanaruk Theeramunkong on a Thai news relation discovery project. I received a BSc in Computer Science from SIIT, Thammasat University, Thailand.
I am always open to research discussions. If there is a chance that I could contribute to your project, please get in touch.
Contact: Wittawat Jitkrittum (วิทวัส จิตกฤตธรรม) ( )
I was invited to speak at the Department of Mathematics and Computer Science, Chulalongkorn University. Topic: A Technical Introduction to Kernel Goodness-of-Fit Testing.
I was invited to give a talk at the Workshop on Functional Inference and Machine Intelligence (19-21 Feb 2018), Tokyo. Slides.
I started working remotely as a postdoc at Max Planck Institute for Intelligent Systems with Bernhard Schölkopf.
I gave an oral presentation at NIPS 2017 on A Linear-Time Kernel Goodness-of-Fit Test. The paper received one of the three best paper awards at NIPS 2017. Python code here. Presentation slides here. Talk video here.
I gave an oral presentation at ICML 2017 on our new linear-time independence test. The slides are here.
A new fast nonparametric goodness-of-fit test. See A Linear-Time Kernel Goodness-of-Fit Test.
Our paper, An Adaptive Test of Independence with Analytic Kernel Embeddings, has been accepted to ICML 2017.
An Adaptive Test of Independence with Analytic Kernel Embeddings, a fast nonparametric independence test. Python code here.
Interpretable Distribution Features with Maximum Testing Power has been accepted to NIPS 2016 as a full oral presentation. See our 2-minute introduction video here.
Interpretable Distribution Features with Maximum Testing Power: a linear-time nonparametric two-sample test which returns a set of local features indicating why the null hypothesis is rejected. Python code available on Github.
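For readers unfamiliar with kernel two-sample testing, the following is a minimal, generic sketch of a quadratic-time MMD test with a permutation-based null distribution. It is only an illustration of the general idea, not the linear-time method of the paper; for the actual algorithm, see the Python code linked above. The kernel bandwidth `sigma` and the number of permutations are illustrative choices.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Biased estimate of the squared maximum mean discrepancy."""
    Kxx = rbf_kernel(X, X, sigma)
    Kyy = rbf_kernel(Y, Y, sigma)
    Kxy = rbf_kernel(X, Y, sigma)
    n, m = len(X), len(Y)
    return Kxx.sum() / n**2 + Kyy.sum() / m**2 - 2 * Kxy.sum() / (n * m)

def two_sample_test(X, Y, n_perm=200, sigma=1.0, seed=0):
    """Permutation-based p-value for H0: X and Y come from the same distribution."""
    rng = np.random.default_rng(seed)
    stat = mmd2(X, Y, sigma)
    Z = np.vstack([X, Y])
    n = len(X)
    null_stats = []
    for _ in range(n_perm):
        idx = rng.permutation(len(Z))  # shuffle sample labels under H0
        null_stats.append(mmd2(Z[idx[:n]], Z[idx[n:]], sigma))
    # fraction of permuted statistics at least as large as the observed one
    return (np.sum(np.array(null_stats) >= stat) + 1) / (n_perm + 1)
```

A small p-value indicates the two samples likely come from different distributions. Note that this quadratic-time estimator is O(n²) per evaluation, which is exactly the cost the linear-time tests above are designed to avoid.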
K2-ABC: Approximate Bayesian Computation with Infinite Dimensional Summary Statistics via Kernel Embeddings: summary statistic free approximate Bayesian computation with kernel embeddings. Accepted to AISTATS 2016.
We released the code for the Locally Linear Latent Variable Model (LL-LVM), which was accepted to NIPS 2015. Our Matlab code is available here on Github.
Bayesian Manifold Learning: Locally Linear Latent Variable Model (LL-LVM): a probabilistic model for non-linear manifold discovery.
Kernel-Based Just-In-Time Learning for Passing Expectation Propagation Messages: a fast, online algorithm for nonparametric learning of EP message updates. Source code available here.
Based on Hyde Theme.