I am a postdoctoral researcher in the Empirical Inference Department at the Max Planck Institute for Intelligent Systems, working with Bernhard Schölkopf. I work in machine learning (the intersection of computer science and statistics). More specifically, my research interests range from non-parametric statistical tests and approximate Bayesian inference to kernel-based feature representations of data. I completed my PhD at the Gatsby Computational Neuroscience Unit, UCL, where I worked with Arthur Gretton on various topics related to kernel-based non-parametric statistical tests. I received an MEng from the Tokyo Institute of Technology, where I worked with Masashi Sugiyama on supervised feature selection using squared-loss mutual information. Before that, I was a research assistant working with Thanaruk Theeramunkong on a Thai news relation discovery project. I received a BSc in Computer Science from SIIT, Thammasat University, Thailand.
My publications are listed on this page. Software packages I have released can be found here. I occasionally update my blog, where I summarize what I learn. Some photos I have taken are on Flickr.
I am always open to research discussion. If there is a chance that I could contribute to your project, please get in touch. If you are a bachelor's, master's, or PhD student working in a field related to machine learning and are looking for an internship (2-6 months) at the Empirical Inference Department, please contact me.
Contact: Wittawat Jitkrittum (วิทวัส จิตกฤตธรรม)
Address:
Wittawat Jitkrittum
Max Planck Institute for Intelligent Systems
Max-Planck-Ring 4
72076 Tuebingen
Germany
Phone: +49 7071 601
I will speak at the Symposium on frontier research in information science and technology at VISTEC, Thailand.
Our paper “Informative Features for Model Comparison” was accepted to NIPS 2018. Given samples from two candidate models and a reference sample (the observed data), the goal is to design a statistical test to determine which of the two model samples is closer to the reference. One application is comparing two GAN models.
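The basic comparison can be made concrete with the maximum mean discrepancy (MMD). The sketch below is only an illustration of the idea, not the test statistic from the paper (which constructs a proper hypothesis test with interpretable features); the Gaussian-kernel bandwidth and toy data are made-up choices.

```python
# Illustrative sketch: decide which of two candidate samples X, Y is
# closer to a reference sample Z by comparing their (biased) squared
# MMD estimates to Z under a Gaussian kernel.
import numpy as np

def gauss_kernel(A, B, sigma2=1.0):
    # Pairwise Gaussian kernel matrix between rows of A and rows of B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2.0 * sigma2))

def mmd2_biased(X, Y, sigma2=1.0):
    # Biased estimator of MMD^2 between samples X and Y.
    return (gauss_kernel(X, X, sigma2).mean()
            + gauss_kernel(Y, Y, sigma2).mean()
            - 2.0 * gauss_kernel(X, Y, sigma2).mean())

rng = np.random.default_rng(0)
Z = rng.normal(0, 1, (300, 2))   # reference (observed) sample
X = rng.normal(0, 1, (300, 2))   # model P: same distribution as Z
Y = rng.normal(2, 1, (300, 2))   # model Q: shifted mean

# Model P should win, since its sample matches the reference distribution.
closer = 'P' if mmd2_biased(X, Z) < mmd2_biased(Y, Z) else 'Q'
```

A real test additionally needs a null distribution for the difference of the two discrepancies, which is what the paper provides.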
I was invited to speak at the Department of Mathematics and Computer Science, Chulalongkorn University. Topic: A Technical Introduction to Kernel Goodness-of-Fit Testing.
I was invited to speak at VISTEC, Wangchan Valley, Rayong, Thailand. Topic: Machine Learning Fundamentals. Event details. Slides.
I was invited to give a talk at the BKK Machine Learning Meetup event. Topic: Introduction to kernel methods for comparing distributions. Slides.
I was invited to give a talk at the Workshop on Functional Inference and Machine Intelligence (19-21 Feb 2018), Tokyo. Slides.
The podcast we did with This Week in Machine Learning & AI on our NIPS 2017 best paper was published here.
I started working remotely as a postdoc at the Max Planck Institute for Intelligent Systems with Bernhard Schölkopf.
I gave a 50-minute presentation on A Linear-Time Kernel Goodness-of-Fit Test (NIPS 2017 best paper). Download slides.
I gave an oral presentation at NIPS 2017 on A Linear-Time Kernel Goodness-of-Fit Test. The paper was selected for one of three best paper awards at NIPS 2017. Python code here. Presentation slides here. Talk video here.
Our paper A Linear-Time Kernel Goodness-of-Fit Test has been accepted for oral presentation at NIPS 2017. Python code.
I gave an oral presentation at ICML 2017 on our new linear-time independence test. The slides are here.
Uploaded slides for the linear-time nonparametric goodness-of-fit test.
We have released Python code for our linear-time nonparametric goodness-of-fit test.
I gave a talk on a new linear-time dependence measure at the UCL workshop on the theory of big data. See the slides here.
A new fast nonparametric goodness-of-fit test. See A Linear-Time Kernel Goodness-of-Fit Test.
Our paper: An Adaptive Test of Independence with Analytic Kernel Embeddings is accepted to ICML 2017.
A new paper: Cognitive Bias in Ambiguity Judgements: Using Computational Models to Dissect the Effects of Mild Mood Manipulation in Humans, published in PLOS ONE.
An Adaptive Test of Independence with Analytic Kernel Embeddings, a fast nonparametric independence test. Python code here.
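For background intuition on what such independence tests measure, here is a minimal quadratic-time biased HSIC (Hilbert-Schmidt Independence Criterion) estimate. This is the classical baseline that linear-time adaptive tests like the one in the paper speed up, not the paper's own statistic; the bandwidth and toy data are illustrative.

```python
# Sketch: biased empirical HSIC between paired samples X and Y.
# HSIC is (near) zero when X and Y are independent, and large when
# they are strongly dependent.
import numpy as np

def hsic_biased(X, Y, sigma2=1.0):
    n = X.shape[0]
    def gram(A):
        # Gaussian kernel Gram matrix of the rows of A.
        sq = np.sum(A**2, 1)[:, None] + np.sum(A**2, 1)[None, :] - 2 * A @ A.T
        return np.exp(-sq / (2.0 * sigma2))
    K, L = gram(X), gram(Y)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / n**2

rng = np.random.default_rng(0)
X = rng.normal(0, 1, (300, 1))
Y_dep = X + 0.1 * rng.normal(0, 1, (300, 1))   # strongly dependent on X
Y_ind = rng.normal(0, 1, (300, 1))             # independent of X
```

Computing `hsic_biased(X, Y_dep)` versus `hsic_biased(X, Y_ind)` shows a clear separation; a full test would calibrate a threshold, e.g. by permutation.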
Interpretable Distribution Features with Maximum Testing Power is accepted to NIPS 2016 as a full oral presentation. See our 2-minute introduction video here.
Interpretable Distribution Features with Maximum Testing Power: a linear-time nonparametric two-sample test which returns a set of local features indicating why the null hypothesis is rejected. Python code available on Github.
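The location-based features can be illustrated with the empirical MMD witness function: the test (roughly speaking) seeks test locations at which the two samples' kernel mean embeddings differ most, and those locations indicate where, in input space, the distributions disagree. A hedged sketch with made-up candidate locations and bandwidth, not the paper's optimized, normalized statistic:

```python
# Sketch: evaluate the empirical MMD witness mu_X(v) - mu_Y(v) at a few
# candidate locations v; the location with the largest |witness| is the
# most discriminative "feature" explaining how X and Y differ.
import numpy as np

def witness(X, Y, V, sigma2=1.0):
    # Difference of empirical kernel mean embeddings evaluated at rows of V.
    def k(A, B):
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-sq / (2.0 * sigma2))
    return k(V, X).mean(axis=1) - k(V, Y).mean(axis=1)

rng = np.random.default_rng(0)
X = rng.normal(0, 1, (500, 1))
Y = rng.normal(1, 1, (500, 1))               # mean shifted by 1
V = np.array([[-2.0], [0.0], [0.5], [2.0]])  # candidate test locations
w = witness(X, Y, V)
best = V[np.argmax(np.abs(w))]               # most discriminative location
```

At the midpoint 0.5 the witness is near zero (both densities agree there), while it is largest in magnitude in the tails where one sample dominates; the paper chooses such locations by maximizing a test-power criterion rather than from a fixed grid.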
K2-ABC: Approximate Bayesian Computation with Infinite Dimensional Summary Statistics via Kernel Embeddings: summary statistic free approximate Bayesian computation with kernel embeddings. Accepted to AISTATS 2016.
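The gist of this approach can be sketched on a toy problem: instead of hand-crafted summary statistics, weight prior parameter draws by a soft kernel on the MMD between simulated and observed data. The epsilon, bandwidth, sample sizes, and toy Gaussian model below are illustrative choices of mine, not settings from the paper.

```python
# Sketch of the K2-ABC idea: soft ABC weights exp(-MMD^2 / eps) between
# data simulated at each prior draw and the observed data, then a
# weighted posterior mean. No summary statistic is chosen by hand.
import numpy as np

def mmd2(X, Y, sigma2=1.0):
    # Biased squared-MMD estimate under a Gaussian kernel.
    def k(A, B):
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-sq / (2.0 * sigma2))
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(0)
y_obs = rng.normal(3.0, 1.0, (200, 1))   # observed data; true mean is 3

thetas = rng.uniform(-5, 5, 500)         # draws from a flat prior on the mean
eps = 0.05                               # ABC tolerance (illustrative)
w = np.empty(len(thetas))
for i, th in enumerate(thetas):
    y_sim = rng.normal(th, 1.0, (200, 1))    # simulate from the model
    w[i] = np.exp(-mmd2(y_sim, y_obs) / eps)
w /= w.sum()
post_mean = np.sum(w * thetas)           # should concentrate near 3
```

Draws whose simulated data look like the observations (in MMD) receive nearly all the weight, so the weighted mean recovers the true parameter without ever summarizing the data by hand.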
We released the code for the Locally Linear Latent Variable Model (LL-LVM), which was accepted to NIPS 2015. Our Matlab code is on Github.
Bayesian Manifold Learning: Locally Linear Latent Variable Model (LL-LVM): a probabilistic model for non-linear manifold discovery.
Kernel-Based Just-In-Time Learning for Passing Expectation Propagation Messages: a fast, online algorithm for nonparametric learning of EP message updates. Source code available here.
Updated: 09-Oct-18
Based on Hyde Theme.