The advent of modern machine learning has ushered in rapid advances in the classification and interpretation of large data sets, sparking a revolution in areas such as image and natural language processing. Much of our current understanding of the techniques that underlie this revolution owes a great debt to insights first gleaned from condensed matter and statistical physics. This raises the important question of what further insights remain to be found at the intersection of machine learning and fields such as statistical physics, condensed matter, and quantum information. In response to this question, this workshop aims to bring together experts from a variety of backgrounds who are interested in connections between many-body physics, quantum computing and machine learning. The scope of the conference will include:
- The use of techniques from machine learning, such as neural networks or statistical learning, to tackle quantum many-body problems, such as discriminating phases of matter, analyzing phase transitions, and addressing the inverse Hamiltonian problem.
- Physics-inspired algorithms for machine learning and neural networks, such as extensions of Boltzmann machines (classical statistical mechanical learning) and connections between deep learning, the renormalization group, and tensor networks/MERA.
- Opportunities for machine learning that quantum computing will enable. This includes algorithmic advances for fault-tolerant computers, as well as currently available hardware such as quantum annealers.
Speakers:
- Mohammad Amin, D Wave Systems
- Peter Broecker, University of Cologne
- Kieron Burke, University of California, Irvine
- Matthew Fisher, Kavli Institute for Theoretical Physics
- Christopher Granade, University of Sydney
- Sergei Isakov, Google
- Ashish Kapoor, Microsoft Research
- Rosemary Ke, University of Montreal
- Seth Lloyd, Massachusetts Institute of Technology
- Andrew Millis, Simons Foundation
- Alejandro Perdomo-Ortiz, NASA Ames Research Center
- Barry Sanders, University of Calgary
- Maria Schuld, University of KwaZulu-Natal
- David Schwab, Northwestern University
- Cyril Stark, Massachusetts Institute of Technology
- James Steck, Wichita State University
- Damian Steiger, ETH Zurich & Google
- Miles Stoudenmire, University of California, Irvine
- Giacomo Torlai, University of Waterloo
Participants:
- Mohammad Amin, D Wave Systems
- Louis-Francois Arsenault, Columbia University
- Jacob Barnett, Perimeter Institute
- Matt Beach, University of British Columbia
- Stefanie Beale, Institute for Quantum Computing
- Oleg Boulanov, Université Laval
- Daniel Brod, Perimeter Institute
- Peter Broecker, University of Cologne
- Kieron Burke, University of California, Irvine
- Juan Carrasquilla, Perimeter Institute
- Chen-Fu Chiang, SUNY
- Joshua Combes, Perimeter Institute
- Alexandre Day, Boston University
- Matthew Fisher, Kavli Institute for Theoretical Physics
- Wenbo Fu, Harvard University
- Martin Ganahl, Perimeter Institute
- Sevag Gharibian, Virginia Commonwealth University
- Victor Godet, Google
- Christopher Granade, University of Sydney
- Zhengcheng Gu, Perimeter Institute
- Gian Giacomo Guerreschi, Intel
- Guiyang Han, University of Waterloo
- Lauren Hayward-Sierens, Perimeter Institute
- Yejin Huh, University of Toronto
- Sergei Isakov, Google
- Bryan Jacobs, IARPA
- Ying-Jer Kao, National Taiwan University
- Ashish Kapoor, Microsoft Research
- Hemant Katiyar, Institute for Quantum Computing
- Rosemary Ke, University of Montreal
- Adrian Kent, Cambridge University
- Ehsan Khatami, San Jose State University
- Aaram Kim, Goethe-Universität Frankfurt am Main
- Alexandre Krajenbrink, Cambridge Quantum Computing
- Bohdan Kulchytskyy, University of Waterloo
- Joel Lamy-Poirier, Perimeter Institute
- Jaehoon Lee, University of British Columbia
- Junhyun Lee, Harvard University
- Ipsita Mandal, Perimeter Institute
- Roger Melko, Perimeter Institute & University of Waterloo
- Andrew Millis, Simons Foundation
- Ryan Mishmash, California Institute of Technology
- Robert Myers, Perimeter Institute
- Apurva Narayan, University of Waterloo
- Nam Nguyen, Wichita State University
- Chan Y. Park, Rutgers University
- Alejandro Perdomo-Ortiz, NASA Ames Research Center
- Anthony Polloreno, Rigetti Computing
- Pedro Ponte, Perimeter Institute
- Andrew Reeves, Grand River Regional Cancer Center
- Trevor Rempel, Perimeter Institute
- Julian Rincon, Perimeter Institute
- Nicholas Rubin, Rigetti Computing
- Wojciech Rzadkowski, University of Warsaw
- Subir Sachdev, Harvard University
- Barry Sanders, University of Calgary
- Norbert Schuch, Max-Planck-Institute of Quantum Optics
- Maria Schuld, University of KwaZulu-Natal
- David Schwab, Northwestern University
- Ivan Sergienko, Scotiabank
- Todd Sierens, Perimeter Institute
- Rajiv Singh, University of California, Davis
- Cyril Stark, Massachusetts Institute of Technology
- James Steck, Wichita State University
- Damian Steiger, ETH Zurich & Google
- Miles Stoudenmire, University of California, Irvine
- Yongchao Tang, University of Waterloo
- Giacomo Torlai, University of Waterloo
- Jordan Venderley, Cornell University
- Guillaume Verdon-Akzam, University of Waterloo
- Guifre Vidal, Perimeter Institute
- Yuan Wan, Perimeter Institute
- Chenjie Wang, Perimeter Institute
- Ching-Hao Wang, Boston University
- Shuo Yang, Perimeter Institute
- Chuck-Hou Yee, Rutgers University
Monday, August 8, 2016

| Time | Event | Location |
| --- | --- | --- |
| 9:00 – 9:30am | Registration | Reception |
| 9:30 – 9:35am | Welcome and Opening Remarks | Theatre |
| 9:35 – 10:15am | Ashish Kapoor, Microsoft Research | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | Maria Schuld, University of KwaZulu-Natal | Theatre |
| 11:45 – 12:30pm | Christopher Granade, University of Sydney | Theatre |
| 12:30 – 2:30pm | Lunch | Bistro – 2nd Floor |
| 2:30 – 3:15pm | Barry Sanders, University of Calgary | Theatre |
Tuesday, August 9, 2016

| Time | Event | Location |
| --- | --- | --- |
| 9:30 – 10:15am | Cyril Stark, Massachusetts Institute of Technology | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | David Schwab, Northwestern University | Theatre |
| 11:45 – 12:30pm | Miles Stoudenmire, University of California, Irvine | Theatre |
| 12:30 – 2:30pm | Lunch | Bistro – 2nd Floor |
| 2:30 – 3:00pm | James Steck, Wichita State University | Theatre |
| 3:00 – 3:30pm | Rosemary Ke, MILA, University of Montreal | Theatre |
Wednesday, August 10, 2016

| Time | Event | Location |
| --- | --- | --- |
| 9:30 – 10:15am | Sergei Isakov, Google | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | Mohammad Amin, D Wave Systems | Theatre |
| 11:45 – 12:30pm | Alejandro Perdomo-Ortiz, NASA Ames Research Center | Theatre |
| 12:00 – 2:00pm | Lunch | Bistro – 2nd Floor |
| 2:00 – 3:30pm | Colloquium | Theatre |
Thursday, August 11, 2016

| Time | Event | Location |
| --- | --- | --- |
| 9:30 – 10:15am | Kieron Burke, University of California, Irvine | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | Andrew Millis, Columbia University | Theatre |
| 11:45 – 12:30pm | Juan Carrasquilla, Perimeter Institute | Theatre |
| 12:30 – 2:30pm | Lunch | Bistro – 2nd Floor |
| 2:30 – 2:45pm | Conference Photo | TBA |
| 2:45 – 3:15pm | Giacomo Torlai, University of Waterloo | Theatre |
| 3:15 – 3:45pm | Peter Broecker, University of Cologne | Theatre |
| 5:30pm | Pub Night | Bistro – 2nd Floor |
Friday, August 12, 2016

| Time | Event | Location |
| --- | --- | --- |
| 9:30 – 10:15am | Damian Steiger, ETH Zurich & Google | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | Seth Lloyd, Massachusetts Institute of Technology | Theatre |
| 12:00 – 2:30pm | Lunch | Bistro – 2nd Floor |
| 2:30 – 5:00pm | Collaboration | Theatre |
Mohammad Amin, D Wave Systems
Quantum Boltzmann Machine using a Quantum Annealer
Machine learning is a rapidly growing field in computer science with applications in computer vision, voice recognition, medical diagnosis, spam filtering, and search engines. In this presentation, I will introduce a new machine learning approach based on the quantum Boltzmann distribution of a transverse-field Ising model. Due to the non-commutative nature of quantum mechanics, the training process of the Quantum Boltzmann Machine (QBM) can become nontrivial. I will show how to circumvent this problem by introducing bounds on the quantum probabilities. This allows training the QBM efficiently by sampling. I will then show examples of QBM training with and without the bound, using exact diagonalization, and compare the results with classical Boltzmann training. Finally, after a brief introduction to D-Wave quantum annealing processors, I will discuss the possibility of using such processors for QBM training and application.
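For contrast with the quantum case, the classical Boltzmann training the abstract compares against can be sketched on a toy model. Everything below (model size, data patterns, learning rate) is invented for illustration, and the exact-gradient training stands in for the sampling-based training used in practice:

```python
import itertools
import numpy as np

# Toy classical Boltzmann machine on n visible spins s_i in {-1, +1}:
# p(s) is proportional to exp(0.5 * s.w.s + b.s), trained by exact
# gradient ascent on the log-likelihood of a target data distribution.
n = 4
states = np.array(list(itertools.product([-1, 1], repeat=n)))  # 2^n configs

w = np.zeros((n, n))
b = np.zeros(n)

# Target: all probability mass on two "patterns"
data = np.array([[1, 1, -1, -1], [-1, -1, 1, 1]])
p_data = np.zeros(len(states))
for d in data:
    p_data[np.all(states == d, axis=1)] = 0.5

def model_probs(w, b):
    E = -0.5 * np.einsum('si,ij,sj->s', states, w, states) - states @ b
    p = np.exp(-E)
    return p / p.sum()

lr = 0.1
for step in range(500):
    p = model_probs(w, b)
    # Gradient = difference of pair correlations under data and model
    corr_data = np.einsum('s,si,sj->ij', p_data, states, states)
    corr_model = np.einsum('s,si,sj->ij', p, states, states)
    w += lr * (corr_data - corr_model)
    np.fill_diagonal(w, 0.0)
    b += lr * (p_data @ states - p @ states)

p = model_probs(w, b)
print(p[np.all(states == data[0], axis=1)])  # mass on the first pattern
```

The QBM replaces this classical Gibbs distribution with the thermal state of a transverse-field Ising Hamiltonian, which is where the bound on quantum probabilities mentioned in the abstract comes in.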
Peter Broecker, University of Cologne
Machine learning quantum phases of matter beyond the fermion sign problem
Kieron Burke, University of California, Irvine
Finding density functionals with machine-learning
Density functional theory (DFT) is an extremely popular approach to electronic structure problems in materials science, chemistry, and many other fields. Over the past several years, often in collaboration with Klaus-Robert Müller at TU Berlin, we have explored using machine learning to find the density functionals that must be approximated in DFT calculations. I will summarize our results so far and report on two new works.
Juan Carrasquilla, Perimeter Institute
Machine Learning Phases of Matter
Matthew Fisher, Kavli Institute for Theoretical Physics
Quantum Crystals, Quantum Computing and Quantum Cognition
Quantum mechanics is down to earth - quite literally - since the electrons within the tiny crystals found in a handful of dirt manifest a dizzying world of quantum motion. Each crystal has its own unique choreography, with the electrons entangled in a myriad of quantum dances. Quantum entanglement
David Schwab, Northwestern University
Physical approaches to the extraction of relevant information
In the first part of this talk, I will focus on the physics of deep learning, a popular subfield of machine learning where recent performance on tasks such as visual object recognition rivals human performance. I present work relating greedy training of deep belief networks to a form of variational real-space renormalization. This connection may help explain how deep networks automatically learn relevant features from data and extract independent factors of variation. Next, I turn to the information bottleneck (IB), an information theoretic approach to clustering and compression of relevant information that has been suggested as a framework for deep learning. I present a new variant of IB called the Deterministic Information Bottleneck, arguing that it better captures the notion of compression while retaining relevant information.
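The deterministic variant's objective, minimizing H(T) - beta*I(T;Y) over hard assignments T = f(X), can be illustrated by brute force on a toy joint distribution. All numbers below are made up for the sketch:

```python
import itertools
import numpy as np

# Joint distribution p(x, y): 4 inputs, 2 labels. Inputs 0 and 1 mostly
# carry label 0; inputs 2 and 3 mostly label 1, so two clusters suffice.
p_xy = np.array([
    [0.20, 0.05],
    [0.20, 0.05],
    [0.05, 0.20],
    [0.05, 0.20],
])

def mutual_info(p):
    """I(A;B) in nats for a joint distribution table p(a, b)."""
    pa = p.sum(axis=1, keepdims=True)
    pb = p.sum(axis=0, keepdims=True)
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / (pa @ pb)[mask])).sum())

beta = 5.0
best = None
# Deterministic IB: search all hard encoders f: X -> {0, 1} and minimize
# H(T) - beta * I(T; Y).  Brute force is fine at this size.
for f in itertools.product([0, 1], repeat=4):
    p_ty = np.zeros((2, 2))
    for x, t in enumerate(f):
        p_ty[t] += p_xy[x]
    pt = p_ty.sum(axis=1)
    h_t = -(pt[pt > 0] * np.log(pt[pt > 0])).sum()
    objective = h_t - beta * mutual_info(p_ty)
    if best is None or objective < best[0]:
        best = (objective, f)

print(best[1])  # recovers the label-aligned clustering of the inputs
```

The entropy term H(T) replaces the compression term I(T;X) of the standard bottleneck, which is the sense in which the deterministic variant "better captures" compression.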
Damian Steiger, ETH Zurich & Google
Racing in parallel: Quantum versus Classical
Sergei Isakov, Google
Towards Quantum Supremacy with Near-Term Devices
Can quantum computers outperform classical computers on any computational problem in the near future? We study the problem of sampling from the output distribution of random quantum circuits. Sampling from this distribution requires an exponential amount of classical computational resources. We argue that quantum supremacy can be achieved in the near future with approximately fifty superconducting qubits and without error correction, despite the fact that random quantum circuits are extremely sensitive to errors.
Deep Learning: An Overview
Learning quantum annealing
Miles Stoudenmire, University of California, Irvine
Learning with Quantum-Inspired Tensor Networks
We propose a family of models with an exponential number of parameters that can nevertheless be approximated by a tensor network. Tensor networks are used to represent quantum wavefunctions, and the powerful methods developed for optimizing them extend to machine learning applications as well. We use a matrix product state to classify images and find that a surprisingly small bond dimension yields state-of-the-art results. Tensor networks offer further advantages for machine learning, such as better scaling than some existing approaches and the ability to adapt hyperparameters during training.
Cyril Stark, Massachusetts Institute of Technology
Physics-inspired techniques for association rule mining
Imagine you run a supermarket, and assume that for each customer “u” you record what “u” is buying. For instance, you may observe that u=1 typically buys bread and cheese, and u=2 typically buys bread and salami. Studying your dataset, you suspect that, in general, customers who buy cheese are likely to buy bread as well. Rules of this kind are called association rules, and mining them is of significant practical importance in fields such as market basket analysis and healthcare.
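The abstract does not spell out the physics-inspired method itself, but the standard support/confidence formulation of association rules that it builds on can be sketched on toy baskets (thresholds and data invented for illustration):

```python
from itertools import combinations

# Toy transaction data in the spirit of the supermarket example
baskets = [
    {"bread", "cheese"},
    {"bread", "cheese", "milk"},
    {"bread", "salami"},
    {"cheese"},
    {"bread", "cheese"},
]

def support(itemset):
    """Fraction of baskets containing every item in `itemset`."""
    return sum(itemset <= b for b in baskets) / len(baskets)

def confidence(antecedent, consequent):
    """Estimated P(consequent | antecedent) over baskets."""
    return support(antecedent | consequent) / support(antecedent)

# Mine all single-item rules {a} -> {b} above the chosen thresholds
items = sorted(set().union(*baskets))
rules = []
for a, b in combinations(items, 2):
    for ante, cons in [({a}, {b}), ({b}, {a})]:
        if support(ante | cons) >= 0.4 and confidence(ante, cons) >= 0.7:
            rules.append((tuple(ante), tuple(cons)))

print(rules)
```

Here the rule "cheese implies bread" survives because bread-and-cheese baskets are both frequent (support) and typical among cheese buyers (confidence).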
Barry Sanders, University of Calgary
Learning in Quantum Control: High-Dimensional Global Optimization for Noisy Quantum Dynamics
Quantum control is valuable for various quantum technologies such as high-fidelity gates for universal quantum computing, adaptive quantum-enhanced metrology, and ultra-cold atom manipulation. Although supervised machine learning and reinforcement learning are widely used for optimizing control parameters in classical systems, quantum control for parameter optimization is mainly pursued via gradient-based greedy algorithms.
Christopher Granade, University of Sydney
Rejection and Particle Filtering for Hamiltonian Learning
Many tasks in quantum information rely on accurate knowledge of a system's Hamiltonian, including calibrating control, characterizing devices, and verifying quantum simulators. In this talk, we pose the problem of learning Hamiltonians as an instance of parameter estimation. We then solve this problem with Bayesian inference, and describe how rejection and particle filtering provide efficient numerical algorithms for learning Hamiltonians. Finally, we discuss how filtering can be combined with quantum resources to verify quantum systems beyond the reach of classical simulators.
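A minimal version of this pipeline, a particle filter learning a single precession frequency omega from binary measurements with likelihood P(1 | omega, t) = sin^2(omega t / 2), might look like the following. The prior, experiment schedule, and resampling jitter are illustrative choices, not the talk's:

```python
import numpy as np

rng = np.random.default_rng(7)
true_omega = 0.7

def likelihood(omega, t, outcome):
    p1 = np.sin(omega * t / 2) ** 2
    return p1 if outcome == 1 else 1 - p1

# Particle approximation of the posterior over omega: uniform prior
particles = rng.uniform(0, 1, 2000)
weights = np.full_like(particles, 1 / len(particles))

for k in range(100):
    t = 1.0 * (k + 1)                      # simple experiment schedule
    p1 = np.sin(true_omega * t / 2) ** 2
    outcome = int(rng.random() < p1)       # simulated measurement
    # Bayesian update: reweight particles by the likelihood
    weights = weights * likelihood(particles, t, outcome)
    weights /= weights.sum()
    # Resample with jitter when the effective sample size gets small
    if 1 / np.sum(weights ** 2) < len(particles) / 2:
        idx = rng.choice(len(particles), len(particles), p=weights)
        particles = particles[idx] + rng.normal(0, 1e-3, len(particles))
        weights = np.full_like(particles, 1 / len(particles))

estimate = np.sum(weights * particles)     # posterior mean
print(estimate)  # should be close to true_omega
```

The QInfer library implements this sequential Monte Carlo approach in full generality, including the adaptive experiment design the sketch omits.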
Maria Schuld, University of KwaZulu-Natal
Classification on a quantum computer: Linear regression and ensemble methods
Quantum machine learning algorithms usually translate a machine learning method into an algorithm that can exploit the advantages of quantum information processing. One approach is to tackle methods that rely on matrix inversion with the quantum routine for linear systems of equations. We give such a quantum algorithm based on unregularised linear regression.
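Classically, the matrix inversion being targeted is the unregularised least-squares fit; a miniature version on toy data (not the paper's) shows the step a quantum linear-systems routine would replace:

```python
import numpy as np

# Two Gaussian point clouds, labelled -1 and +1
rng = np.random.default_rng(3)
X0 = rng.normal([-1.0, -1.0], 0.5, size=(50, 2))
X1 = rng.normal([+1.0, +1.0], 0.5, size=(50, 2))
X = np.vstack([X0, X1])
X = np.hstack([X, np.ones((100, 1))])   # bias column
y = np.array([-1.0] * 50 + [1.0] * 50)

# Unregularised linear regression onto the labels: the pseudoinverse
# solve is the matrix inversion the quantum routine would accelerate.
w = np.linalg.pinv(X) @ y

def predict(pts):
    return np.sign(np.hstack([pts, np.ones((len(pts), 1))]) @ w)

print(predict(np.array([[1.2, 0.8], [-0.9, -1.1]])))
```

Classification is then just the sign of the regression output for a new point.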
Ashish Kapoor, Microsoft Research
Comparing Classical and Quantum Methods for Supervised Machine Learning
Supervised machine learning is one of the key problems arising in modern big-data tasks. In this talk, I will first describe several different classical algorithmic paradigms for classification and then contrast them with quantum algorithmic constructs. In particular, we will look at classical methods such as the nearest-neighbor rule, optimization-based algorithms (e.g. SVMs), and Bayesian inference techniques (e.g. the Bayes point machine), and provide a unifying framework that gives a deeper understanding of the quantum versions of these methods.
Scientific Organizers:
- Roger Melko, Perimeter Institute & University of Waterloo
- Miles Stoudenmire, University of California, Irvine
- Guifre Vidal, Perimeter Institute
- Nathan Wiebe, Microsoft Research