I am an Assistant Professor at the University of Toronto, working on quantum-enhanced machine learning and applications of high-performance learning algorithms in quantum physics. I am also the Academic Director of the Quantum Machine Learning Program at the Creative Destruction Lab, a Data Scientist in Residence at the TD Management Data and Analytics Lab, a Faculty Affiliate at the Vector Institute for Artificial Intelligence, and an Affiliate at the Perimeter Institute for Theoretical Physics.
Outside my academic work, I cofounded the Quantum Open Source Foundation, and I am the Chair-Elect of the Special Interest Group on Quantum Computing in the American Statistical Association. I serve in an advisory role for various startups, and I am a member of the NUS Overseas Colleges Alumni and an assessor at the Toronto branch of NOC.
Trained as a mathematician and computer scientist, I received my PhD from the National University of Singapore. Previously I worked in the Quantum Information Theory group at ICFO-The Institute of Photonic Sciences and at the University of Borås. I did longer research stints at several institutions, including the School of Information Systems at the Queensland University of Technology, the Quantum Information Group at the University of Tokyo, the Centre for Quantum Technologies at the National University of Singapore, Tsinghua University, the Barcelona Supercomputing Center, and the Indian Institute of Science.
Quantum-enhanced machine learning: Current and near-future quantum technologies have the potential to improve learning algorithms. Of particular interest are algorithms that have a high computational complexity or that require sampling. The latter type includes many probabilistic graphical models, in which not only the training phase but also the inference phase has been infeasible at scale, prompting a need for quantum-enhanced sampling. This in turn will enable deep architectures for probabilistic models, as well as scalable implementations of statistical relational learning, both of which go beyond the black-box model of neural networks and shift the focus towards explainable artificial intelligence. While speedup is the primary consideration, we also investigate the fundamental limits of statistical learning theory in the framework of quantum physics.
Quantum many-body systems, optimization, and machine learning: Identifying the ground state of a many-particle system whose interactions are described by a Hamiltonian is an important problem in quantum physics. During the last decade, different relaxations of this Hamiltonian minimization problem have been proposed. These algorithms include the lower levels of a general hierarchy of semidefinite programming (SDP) relaxations for non-commutative polynomial optimization, which provide a lower bound on the ground-state energy, complementing the upper bounds that are obtainable using variational methods. The latest developments step away from optimization and introduce machine learning as an ansatz for ground-state energy problems and for the study of quantum phase transitions. In fact, strong links between quantum many-body physics (tensor networks in particular) and deep learning are being established. We are developing a set of theoretical and numerical tools to pursue these synergies.
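The variational side of this picture can be illustrated on a toy problem. The NumPy sketch below is an illustration, not taken from the text: the two-spin transverse-field Ising Hamiltonian, the coupling value, and the product-state ansatz are all assumptions chosen for simplicity.

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

# Toy two-spin transverse-field Ising Hamiltonian (an assumed example):
# H = -Z(x)Z - g (X(x)I + I(x)X)
g = 0.5
H = -np.kron(Z, Z) - g * (np.kron(X, I2) + np.kron(I2, X))

# Exact ground-state energy by diagonalization (the benchmark)
e_exact = np.linalg.eigvalsh(H)[0]

# Variational upper bound from a simple product ansatz:
# |psi(theta)> = (cos theta |0> + sin theta |1>) on each spin
def energy(theta):
    single = np.array([np.cos(theta), np.sin(theta)])
    psi = np.kron(single, single)
    return float(psi @ H @ psi)

thetas = np.linspace(0.0, np.pi, 2001)
e_var = min(energy(t) for t in thetas)

# The variational principle guarantees e_var >= e_exact; the gap is
# what richer ansatze (from above) and SDP relaxations (from below)
# help to close.
print(f"exact: {e_exact:.4f}, variational upper bound: {e_var:.4f}")
```

Here exact diagonalization stands in for the lower bound; on systems too large to diagonalize, an SDP relaxation of the non-commutative problem would play that role, sandwiching the true energy between the two bounds.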
The pace of development in quantum computing mirrors the rapid advances made in machine learning and artificial intelligence. It is natural to ask whether quantum technologies could boost learning algorithms: this field of inquiry is called quantum-enhanced machine learning. The goal of this course is to show what benefits current and future quantum technologies can provide to machine learning, focusing on algorithms that are challenging with classical digital computers. We put a strong emphasis on implementing the protocols, using open source frameworks in Python. Prominent researchers in the field give guest lectures to provide extra depth to each major topic. These guest lecturers include Alán Aspuru-Guzik, Seth Lloyd, Roger Melko, and Maria Schuld.
In particular, we will address the following objectives:
1) Understand the basics of quantum states as a generalization of classical probability distributions, their evolution in closed and open systems, and measurements as a form of sampling. Describe elementary classical and quantum many-body systems.
2) Contrast quantum computing paradigms and implementations. Recognize the limitations of current and near-future quantum technologies and the kinds of tasks where they outperform or are expected to outperform classical computers. Explain variational circuits.
3) Describe and implement classical-quantum hybrid learning algorithms. Encode classical information in quantum systems. Perform discrete optimization in ensembles and unsupervised machine learning with different quantum computing paradigms. Sample quantum states for probabilistic models. Experiment with unusual kernel functions on quantum computers.
4) Demonstrate coherent quantum machine learning protocols and estimate their resource requirements. Summarize quantum phase estimation and quantum matrix inversion, and implement these algorithms for Gaussian processes. Self-exponentiate a quantum state for principal component analysis. Prepare a Gram matrix using quantum resources.
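To make the first objective concrete, here is a minimal classical simulation, an illustrative sketch rather than course material, of treating a quantum state as a generalization of a probability distribution, with measurement acting as sampling. The four-outcome distribution is an arbitrary assumption.

```python
import numpy as np

# A classical probability distribution over four outcomes (assumed example)
p = np.array([0.1, 0.2, 0.3, 0.4])

# Amplitude encoding: a two-qubit state whose amplitudes are sqrt(p).
# The squared amplitudes recover the distribution (Born rule).
psi = np.sqrt(p)
assert np.isclose(np.sum(np.abs(psi) ** 2), 1.0)  # normalized state

# Measurement as sampling: each computational-basis outcome occurs
# with probability |amplitude|^2.
rng = np.random.default_rng(42)
samples = rng.choice(len(psi), size=10_000, p=np.abs(psi) ** 2)

# Empirical frequencies approximate the encoded distribution.
freq = np.bincount(samples, minlength=len(p)) / len(samples)
print(freq)
```

On actual hardware the sampling step is a repeated measurement of the prepared state; the simulation above only reproduces its statistics.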
Machine learning has become indispensable in discovering patterns in large data sets, and the theory is at the core of a larger set of tools known as data mining. It is a mature field with an astonishing array of practical applications.
Quantum computing has the potential to take machine learning to the next level. While hardware implementations of quantum computing systems are still in an early phase, recent theoretical developments hint at the benefits of applying quantum methods to learning algorithms.
Computational complexity can be reduced exponentially in some cases, and quadratically in others. Yet improved learning time is just one part of the equation. By working with nonconvex objective functions, quantum machine learning algorithms can be more robust to noise and outliers, giving them better generalization performance than many known classical algorithms. Examples include quantum support vector machines, learning a function by quantum process tomography, quantum neural networks, and adiabatic quantum optimization.
Quantum Machine Learning: What Quantum Computing Means to Data Mining explains the most relevant concepts of machine learning, quantum mechanics, and quantum information theory, and contrasts classical learning algorithms to their quantum counterparts.