Machine Learning, (Deep) Reinforcement Learning and Learning to Learn abstractions, e.g. transfer, multi-task, curriculum and zero-shot learning.

Mainly interested in innovative applications of deep learning, especially vision-related ones, and in expanding unsupervised learning via research into Variational Autoencoders and Generative Adversarial Networks. Also very interested in exploration in reinforcement learning, which can provide general-purpose models that can then be applied to more specific areas.

Representation learning, relational learning for knowledge graphs, multi-task learning.

Program synthesis, especially type-driven program synthesis. Inductive logic programming, especially meta-interpretive learning. In general, I am also interested in automated reasoning, type systems, SMT solvers, and declarative programming.

Machine learning, deep neural networks, representation learning, image and natural language understanding, unsupervised learning.

Measurement Error, Bayesian Parametric and Nonparametric Statistics, Regression, Extreme Value Theory.

Theoretical machine learning and its applications (vision, NLP), neural networks, ML safety and explainability.

Primarily in Bayesian Machine Learning, Time-Series Modelling and, to some extent, Deep Learning. Especially interested in the mathematical underpinnings of the above techniques.

Theoretical foundations of computer science. In particular, automated reasoning and its relation to the ontological management of data. Additionally, the mathematical and theoretical study of software modularity, and other more general notions of modularity and elasticity.

Approximate Bayesian inference, generative models, Bayesian deep learning, and application of the previous topics to NLP.

The exploration of learning both as a tool for solving data-rich problems and as a means of understanding the underlying mechanisms that make learning, in a machine context, possible and plausible. This line of thought currently manifests as model analysis and manipulation for either robust representation learning or transfer learning.

Machine learning, neural networks and natural language processing, particularly for information extraction.

Methods, tools, and applications for machine learning, including deep learning and content-based recommendation.

Machine Learning, Unsupervised Learning, Graphical Models, (Deep) Neural Networks, Large-Scale/Distributed Machine Learning and Optimisation. Applications in Text Mining, Computer Vision, Econometrics and Networks (Energy and Communications).

Representation learning using deep neural networks. Representing sets, with applications to information retrieval, content-based recommendation and generative models. Adversarial learning for fair decision making.

Interpretable machine learning, learning-to-learn paradigms and Bayesian deep learning with an interest in applications to energy and healthcare.

Natural language processing and machine learning applied to educational technology. Modelling student engagement in online courses through automated analysis of discussion forum messages.

Probabilistic Programming Languages. Machine Learning for Source Code. Software Verification and Synthesis.

Machine learning, optimisation and algorithms, especially as applied to energy problems.

Machine learning, Speech synthesis, Prosody, Generative models, Sampling speech from SPSS models, Expressive speech synthesis, Emotion recognition.

Machine learning, Statistics, Natural language processing, Computer vision, Neuroinformatics, Quantum computing.

Neural networks, unsupervised learning, probabilistic modelling and their applications to image data.

Sentiment analysis and opinion mining for social media, multilingual sentiment analysis, text mining, topic modeling and deep learning for social media, building an NLP pipeline for social media.

Likelihood-free Bayesian Inference, Approximate Bayesian Computation, Probabilistic Modelling and Bayesian Deep Learning.

Computational linguistics, compositional distributional semantics, sentiment analysis, semantic role labeling and paraphrase detection.

Interested in probabilistic modelling applied to multi-modal biological data, such as methylation, expression and accessibility. I work on developing novel statistical methods that deal with corrupting mechanisms in the process of gathering such data with single-cell technologies. Past interests include optimisation and machine vision.

Network Representation Learning, semi-supervised learning on networks, community detection and large scale clustering of attributed graphs. Bio-medical applications of such methods. For example, the analysis of Autism Spectrum Disorder.

Computational Linguistics, speech recognition, development of language in children.

Deep learning and neural networks. Particular interest in knowledge distillation, few-shot learning and meta learning.

Probabilistic modelling, approximate inference, deep learning, vision as inverse graphics.

Stochastic modelling and statistical inference in biology, spatiotemporal models in molecular biophysics, stochastic reaction kinetics.

Formal and statistical semantics of natural language. Deep learning and time-series modelling. Representation learning. Linguistic theories, as well as computational models, of sarcasm.

Deep learning with application to harmonic instruments and music. Previous experience with fraud and image data.

Coming from a mathematical and statistical background, I am naturally interested in the theoretical underpinnings of machine learning algorithms. However, I am currently keen to learn more about the theoretical challenges that arise when applying such methods in the real world, where there are limitations and hurdles, such as in medicine.

Computational linguistics. Natural language processing, particularly for low-resource languages. Semantics of natural language.

High performance & distributed computing, data representation & storage, non-volatile memory, computer vision, object recognition & classification, and machine learning.

Probabilistic modelling and deep learning. Current focus: density estimation for energy-based and/or latent-variable models.

Bayesian approaches to deep learning, approximate inference, probabilistic programming languages and the application of these techniques to real world problems requiring the understanding of uncertainty.

Natural language processing, especially noisy user-generated text, language variation and change.

Machine learning, deep neural networks, human-like computing, applications to computer vision, applications to neuroinformatics.

Network representation learning, semi-supervised learning on networks, distributed algorithms for graph representation learning, large scale network analysis and community detection in attributed graphs.

Databases, especially database theory and query languages; theoretical computer science and logic and their applications to databases; problems that arise in conjunction with big data.

Machine learning for natural language processing. Language understanding for interactive models. Domain adaptation and transfer learning for cross-lingual language modelling.

Computational linguistics, natural language processing, cognitive modelling, complex networks. Identifying and analysing socio-linguistic variation in social media text.

Machine learning, Bayesian inference and analysing data from gravitational wave detectors.

Mathematical principles of machine learning, the relationship between architectures and domain invariances, networks as function spaces, neural networks and information theory, meta-learning.

Databases: query languages, relational and graph data, incomplete information. Logic in computer science, automata theory. Game-theoretic aspects of blockchains.

Machine Translation, Natural Language Understanding, Low-resource NLP, Unsupervised Learning and Representation Learning for NLP.

Database theory and systems, big data processing and data mining.

Speech synthesis, prosodic modeling, speech recognition, deep learning, human speech perception.