Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events, such as talks from top scientists, using video cameras installed in our lecture theatres. Perimeter now has 7 formal presentation spaces for its many scientific conferences, seminars, workshops and educational outreach activities, all with advanced audio-visual technical capabilities.
Recordings of events in these areas are all available on demand from this Video Library and on the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded seminars from relevant bodies in physics. This resource has been partially modelled after Cornell University's arXiv.org.
Accessible to anyone with an internet connection, this free library reflects Perimeter's aim to share the power and wonder of science.
Modern Machine Learning (ML) relies on cost function optimization to train model parameters. The non-convexity of cost function landscapes results in the emergence of local minima in which state-of-the-art gradient descent optimizers get stuck. Similarly, in modern Quantum Control (QC), a key to understanding the difficulty of multiqubit state preparation lies in the control landscape -- the mapping that assigns to every control protocol its cost function value.
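The trapping phenomenon described above can be illustrated with a toy one-dimensional landscape (a tilted double well chosen purely for illustration; the function and parameter values are not from the talk). Gradient descent started on the wrong side of the barrier settles into the higher-cost local minimum:

```python
import numpy as np

# Tilted double-well cost: non-convex, with a global minimum near x = -1.04
# and a shallower local minimum near x = +0.96. (Toy stand-in for a landscape.)
f = lambda x: (x**2 - 1)**2 + 0.3 * x
grad = lambda x: 4 * x * (x**2 - 1) + 0.3

def descend(x, lr=0.05, steps=500):
    """Plain gradient descent from initial point x."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_right = descend(1.2)   # gets stuck in the local minimum near x ~ +0.96
x_left = descend(-1.2)   # finds the global minimum near x ~ -1.04
# f(x_right) > f(x_left): the optimizer's fate depends entirely on where it starts
```

The same mechanism, in very high dimension, is what makes both non-convex ML training and multiqubit control landscapes hard to navigate.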
Belief-propagation (BP) decoders are responsible for the success of many modern coding schemes. While many classical coding schemes have been generalized to the quantum setting, the corresponding BP decoders are flawed by design for quantum codes. Inspired by an exact mapping between BP and deep neural networks, we train neural BP decoders for quantum low-density parity-check codes, with a loss function tailored to the quantum setting. Training substantially improves the performance of the original BP decoders.
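The neural and quantum aspects are beyond a short sketch, but the classical BP core being generalized can be shown in miniature. Below is a minimal min-sum BP decoder (a common BP approximation) on the parity-check matrix of the length-3 repetition code; all names and numbers here are illustrative, not from the talk:

```python
import numpy as np

# Parity-check matrix of the length-3 repetition code (a toy stand-in for LDPC).
H = np.array([[1, 1, 0],
              [0, 1, 1]])

def bp_decode(llr, H, iters=10):
    """Min-sum belief propagation. `llr` holds per-bit log-likelihood
    ratios (positive = bit is probably 0)."""
    m, n = H.shape
    c2v = np.zeros((m, n))                 # check-to-variable messages
    hard = np.zeros(n, dtype=int)
    for _ in range(iters):
        total = llr + c2v.sum(axis=0)
        v2c = (total - c2v) * H            # variable-to-check messages
        for i in range(m):
            idx = np.flatnonzero(H[i])
            for j in idx:                  # min-sum check-node update
                others = [v2c[i, k] for k in idx if k != j]
                sign = np.prod(np.sign(others))
                c2v[i, j] = sign * min(abs(x) for x in others)
        hard = ((llr + c2v.sum(axis=0)) < 0).astype(int)
        if not np.any(H @ hard % 2):       # all parity checks satisfied
            break
    return hard

# One unreliable bit in the all-zeros codeword is corrected by its neighbours:
decoded = bp_decode(np.array([-1.5, 2.0, 2.0]), H)
```

A "neural" BP decoder of the kind described in the abstract keeps this message-passing structure but attaches trainable weights to the messages, so the update rules can be optimized by gradient descent.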
As quantum processors become increasingly refined, benchmarking them in useful ways becomes a critical topic. Traditional approaches to quantum tomography, such as state tomography, suffer from self-consistency problems, requiring either perfectly pre-calibrated operations or measurements. This problem has recently been tackled by explicitly self-consistent protocols such as randomized benchmarking, robust phase estimation, and gate set tomography (GST).
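Randomized benchmarking, mentioned above, sidesteps calibration errors by fitting only the decay of survival probability with sequence length. A minimal numerical sketch of that fit, on synthetic noiseless data (the numbers and the known-asymptote simplification are illustrative assumptions, not from the talk):

```python
import numpy as np

# Toy randomized-benchmarking curve: survival probability P(m) = A p**m + B
# for Clifford sequence length m. (Illustrative synthetic data, no real device.)
A, B, p_true = 0.5, 0.5, 0.99
m = np.arange(1, 201, 10)
P = A * p_true**m + B

# Assuming the single-qubit depolarizing asymptote B = 1/2 is known,
# the decay is linear in log space: log(P - B) = m*log(p) + log(A).
slope, _ = np.polyfit(m, np.log(P - B), 1)
p_est = np.exp(slope)

# Average gate fidelity for one qubit: F = 1 - (1 - p)/2
F = 1 - (1 - p_est) / 2
```

Because only the decay rate p is extracted, state-preparation and measurement errors are absorbed into A and B, which is why the protocol is self-consistent in the sense the abstract describes.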
Site resolution in quantum gas microscopes for ultracold atoms in optical lattices has transformed quantum simulations of many-body Hamiltonians. Statistical analysis of atomic snapshots can produce expectation values for various charge and spin correlation functions and has led to new discoveries for the Hubbard model in two dimensions. Conventional approaches, however, generally fail when the order parameter is not known or when an expected phase has no clear signatures in the density basis.
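The conventional snapshot analysis referred to above amounts to averaging products of site occupations over many images. A minimal sketch with fake random snapshots (the array shapes and the function are illustrative assumptions, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
# Fake binary occupation snapshots on a 10x10 lattice (1 = atom detected).
snapshots = rng.integers(0, 2, size=(500, 10, 10))

def density_correlator(snaps, dx):
    """Connected density-density correlator C(dx) = <n_i n_{i+dx}> - <n_i><n_{i+dx}>,
    averaged over sites and snapshots, for a horizontal displacement dx."""
    n = snaps.astype(float)
    a = n[:, :, :-dx]      # site i
    b = n[:, :, dx:]       # site i + dx (shifted along x)
    return (a * b).mean() - a.mean() * b.mean()

c1 = density_correlator(snapshots, 1)   # ~0 for uncorrelated random snapshots
```

Machine-learning approaches of the kind the abstract describes instead feed the raw snapshots to a classifier, so no correlator has to be guessed in advance.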
Inspired by the "third wave" of artificial intelligence (AI), machine learning has found rapid applications in various topics of physics research. Perhaps one of the most ambitious goals of machine learning in physics is to develop novel approaches that ultimately allow AI to discover new concepts and governing equations of physics from experimental observations. In this talk, I will present our progress in applying machine learning techniques to reveal the quantum wave function of a Bose-Einstein condensate (BEC) and the holographic geometry of conformal field theories.
For the past decade, there has been a new major architectural fad in deep learning every year or two.
One such fad for the past two years has been the transformer model, an implementation of the attention mechanism that has superseded RNNs in most sequence-learning applications. I'll give an overview of the model, with some discussion of non-physics applications, and suggest some possibilities for physics.
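The attention mechanism at the heart of the transformer is compact enough to state directly: each output is a softmax-weighted mixture of value vectors, with weights set by query-key similarity. A minimal single-head sketch (shapes and names are illustrative):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # rows sum to 1
    return weights @ V

# Tiny example: 3 query positions attending over 4 key/value positions, d = 8.
rng = np.random.default_rng(1)
Q, K = rng.normal(size=(3, 8)), rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = attention(Q, K, V)   # shape (3, 8); each row is a convex mix of V's rows
```

Because every position can attend to every other in one step, the model has no recurrence, which is the main reason it displaced RNNs for long sequences.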
Density functional theory is a widely used electronic structure method for simulating and designing nanoscale systems based on first principles. I will outline our recent efforts to improve density functionals using deep learning. Improvement would mean achieving higher accuracy, better scaling (with respect to system size), improved computational parallelizability, and achieving reliable performance transferability across different electronic environments.
In the first part of this presentation, I will present supervised machine-learning studies of the low-lying energy levels of disordered quantum systems. We address single-particle continuous-space models that describe cold-atoms in speckle disorder, and also 1D quantum Ising glasses. Our results show that a sufficiently deep feed-forward neural network (NN) can be trained to accurately predict low-lying energy levels.
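The supervised setup described above maps a description of a disorder realization to its low-lying energies. A from-scratch sketch of such a feed-forward regression network, on a purely illustrative toy target (the data, architecture and hyperparameters are assumptions, not those of the study):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for "disorder parameters -> energy level": learn y = sin(2x).
X = rng.uniform(-1, 1, size=(256, 1))
y = np.sin(2 * X)

# One-hidden-layer tanh network trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, size=(1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, size=(32, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(2000):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2                  # predicted "energy"
    err = pred - y
    loss = (err ** 2).mean()            # mean squared error
    # Backpropagation
    g_pred = 2 * err / len(X)
    g_W2 = h.T @ g_pred; g_b2 = g_pred.sum(axis=0)
    g_h = g_pred @ W2.T * (1 - h ** 2)  # tanh' = 1 - tanh^2
    g_W1 = X.T @ g_h; g_b1 = g_h.sum(axis=0)
    W2 -= lr * g_W2; b2 -= lr * g_b2
    W1 -= lr * g_W1; b1 -= lr * g_b1
```

The abstract's claim is that sufficiently deep versions of exactly this kind of network, fed disorder configurations, predict low-lying spectra accurately.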
Prospective near-term applications of early quantum devices rely on accurate estimates of expectation values to become relevant. Decoherence and gate errors lead to wrong estimates. This problem was, at least in theory, remedied with the advent of quantum error correction. However, the overhead that is needed to implement a fully fault-tolerant gate set with current codes and current devices seems prohibitively large.
High-dimensional quantum systems are vital for quantum technologies and are essential in demonstrating practical quantum advantage in quantum computing, simulation and sensing. Since dimensionality grows exponentially with the number of qubits, the potential power of noisy intermediate-scale quantum (NISQ) devices over classical resources also stems from entangled states in high dimensions. An important family of quantum protocols that can take advantage of high-dimensional Hilbert space are classification tasks.