**Jonathan Barrett**, University of Oxford

*Quantum causal models*

From a brief discussion of how to generalise Reichenbach’s Principle of the Common Cause to the case of quantum systems, I will develop a formalism to describe any set of quantum systems that have specified causal relationships between them. This formalism is the nearest quantum analogue to the classical causal models of Judea Pearl and others. At the heart of the classical formalism lies the idea that facts about causal structure enforce constraints on probability distributions in the form of conditional independences. I will describe a quantum analogue of this idea, which leads to a quantum version of the three rules of Pearl’s do-calculus. If time permits, I will end with some more speculative remarks concerning the significance of the work for the foundations of quantum theory.

**Bob Coecke**, University of Oxford

*From quantum to cognition in pictures.*

For well over a decade, we have developed an entirely pictorial (and formally rigorous!) presentation of quantum theory [*]. At present, experiments are being set up aimed at establishing the age at which children could effectively learn quantum theory in this manner. Meanwhile, the pictorial language has also been successful in the study of natural language, and very recently we have started to apply it to model cognition, where we employ GPT-like models. We present the key ingredients of the pictorial language as well as their interpretation across disciplines.

[*] B. Coecke & A. Kissinger (2017) Picturing Quantum Processes. A first course on quantum theory and diagrammatic reasoning. Cambridge University Press.

**Bianca Dittrich**, Perimeter Institute

*Observables and (no) time in quantum gravity*

I will explain the special requirements that observables have to satisfy in quantum gravity and how this deeply affects the notion of time. I will furthermore explore how the search for observables in classical gravity can inform the construction of a quantum theory of gravity.

**Tobias Fritz**, Max Planck Institute for Mathematics in the Sciences

*Towards synthetic Euclidean quantum field theory*

In this status report on current work in progress, I will sketch a generalization of the <a href="https://arxiv.org/abs/1710.10258">temporal type theory</a> introduced by Schultz and Spivak to a logic of space and spacetime. If one writes down a definition of probability space within this logic, one conjecturally obtains a notion whose semantics is precisely that of a Euclidean quantum field. I will sketch how to use the logic to reason about probabilities of events involving fields, sketch the relation to AQFT, and attempt to formulate the DLR equations within the logic.

Joint work with David Spivak

**Philipp Hoehn**, Institute for Quantum Optics and Quantum Information

*Quantum reference systems: Where foundations meet gravity*

Quantum foundations and (quantum) gravity are usually considered independently. However, I will demonstrate by means of quantum reference systems how tools and perspectives from quantum gravity can help to solve problems in quantum foundations and, conversely, how quantum foundation perspectives can be useful to constrain spacetime structures.

First, I will show how one can derive transformations between quantum reference frames from a gravity-inspired symmetry principle (essentially Mach’s principle). This principle enforces a perspective-neutral theory in which choosing the perspective of a specific frame becomes a choice of gauge and all physical information is relational. This setting enables one to derive and generalize, from first principles, frame transformations that have been proposed earlier in the foundations literature. Moreover, the framework extends to the relational paradigm of dynamics, familiar from quantum gravity, and thereby provides a unifying method for changes of perspective in the quantum theory, including changes of both spatial and temporal quantum reference systems.

Subsequently, I will take a quantum information inspired perspective on frame synchronization and transformations. Without presupposing specific spacetime structure, I will exhibit how the Lorentz group follows from operational conditions on quantum communication, exemplifying how quantum information protocols can constrain the spacetime structures in which they are feasible.

**Adrian Kent**, University of Cambridge

*Models and Tests of Quantum Theory and Gravity*

Models that have some but not all features of standard quantum theory can be valuable in several ways, as Bell, Ghirardi-Rimini-Weber-Pearle, Hardy, Spekkens and many others have shown. One is to illuminate quantum theory and shed light on possible reaxiomatisations or reformulations. Another is to suggest experiments that might confirm some untested aspect of quantum theory or point the way to a new theory. I discuss here some models that combine quantum theory and gravity, together with experimental tests of them.

**Matthew Leifer**, Chapman University

*Measures of Preparation Contextuality*

In a large medical trial, if one obtained a ridiculously small p-value like 10^-12, one would typically move from a plain hypothesis test to trying to estimate the parameters of the effect. For example, one might try to estimate the optimal dosage of a drug or the optimal length of a course of treatment. Tests of Bell and noncontextuality inequalities are hypothesis tests, and typical p-values are much lower than this: e.g., 12-sigma effects are not unheard of, and a 7-sigma violation already corresponds to a p-value of about 10^-12. Why then, in quantum foundations, are we still obsessed with proposing and testing new inequalities rather than trying to estimate the parameters of the effect from the experimental data? Here, we will try to do this for preparation contextuality, but will also make some related comments on recent loophole-free Bell inequality tests.
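The quoted correspondence between sigma levels and p-values is a standard one-sided Gaussian tail computation; a minimal sketch checking it (the helper name `p_value` is mine, not from the talk):

```python
import math

def p_value(sigma):
    """One-sided Gaussian tail probability for a deviation of `sigma` standard deviations."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

# A 7-sigma deviation gives a p-value of roughly 1.3e-12,
# consistent with the "about 10^-12" figure quoted above.
print(f"{p_value(7):.2e}")
```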

We introduce two measures of preparation contextuality: the maximal overlap and the preparation contextuality fraction. The latter is linearly related to the degree of violation of a preparation noncontextuality inequality, so can be estimated from experimental data. Although the measures are different in general, they can be equal for proofs of preparation contextuality that have sufficient symmetry, such as the timelike analogue of the CHSH scenario. We give the value of these measures for this scenario. Using our result, we can consider parity-epsilon multiplexing, in which Alice must try to communicate two bits to Bob so that he can choose to determine either of them with high probability, but where Alice must ensure that Bob cannot guess the parity of the bits with probability greater than 1/2 + epsilon, and determine the range of epsilon for which there is still an advantage in preparation contextual theories. If time permits, I will make some brief comments on how to robustify experimental tests of this result.

Joint work with Eric Freda and David Schmid

**Yeong-Cherng Liang**, National Cheng Kung University

*A device-independent approach to testing physical theories from finite data*

The device-independent approach to physics is one where conclusions are drawn directly and solely from the observed correlations between measurement outcomes. This operational approach to physics arose as a byproduct of Bell's seminal work to distinguish quantum correlations from the set of correlations allowed by locally-causal theories. In practice, since one can only perform a finite number of experimental trials, deciding whether an empirical observation is compatible with some class of physical theories will have to be carried out via the task of hypothesis testing. In this talk, I will review some recent progress on this task based on the prediction-based-ratio method and discuss how it may allow us to falsify, in principle, other classes of physical theories, such as those constrained only by the nonsignaling principle, and those that are constrained to produce the so-called "almost-quantum" set of correlations. As an application, I demonstrate how this method allows us to unveil the apparent violation of the nonsignaling conditions in certain experimental data collected in a Bell test. The lesson learned from this observation will be briefly discussed.

**Nuriya Nurgalieva**, ETH Zurich

*Observers as Primitives*

Let us suppose that we are trying to build a physical theory of the universe. In order to do so, we have to introduce some primitive notions on which the theory will be based. We explore possible candidates for such "primitives": for example, the structure of spacetime, or quantum states. However, examples can be given which show that these notions are not as objective as we would want them to be. The concept of objectivity, on the other hand, is closely linked to that of "an observer", so we can at least assign the observer as a primitive of the theory. Now agents are themselves physical systems, and we should take this into account when we specify the ground rules of what they can do. On the one hand, we take agents and their communication as a primitive of the theory and then see which concepts can be derived from there. On the other hand, we treat agents as quantum systems themselves and investigate what kind of logic applies to their interpersonal reasoning; for that, as a guiding example we use the Frauchiger-Renner thought experiment [1,2].

**Robert Oeckl**, Universidad Nacional Autónoma de México

*Local quantum operations and causality*

I give further details on a unification of the foundations of operational quantum theory with those of quantum field theory, coming out of a program that is also known as the positive formalism. I will discuss the status and challenges of this program, focusing on the central new concept of local quantum operation. Among the conceptual challenges I want to highlight the question of causality. How do we know that future choices of measurement settings do not influence present measurement results? Should we enforce this, as in the standard formulation of quantum theory? Should this "emerge" from a fundamental theory? Does this question even make sense in a context without a fixed notion of time, such as quantum gravity? With a heavy dose of speculation (but also grounded in very concrete evidence) I find that fermionic theories might play an essential role.

**Ognyan Oreshkov**, Université Libre de Bruxelles

*Time-delocalized quantum subsystems and operations: on the existence of processes with indefinite causal structure in quantum mechanics*

It was recently found that it is theoretically possible for there to exist higher-order quantum processes in which the operations performed by separate parties cannot be ascribed a definite causal order. Some of these processes are believed to have a physical realization in standard quantum mechanics via coherent control of the times of the operations. A prominent example is the quantum SWITCH, which was recently demonstrated experimentally. However, up until now, there has been no rigorous justification for the interpretation of such an experiment as a genuine realization of a process with indefinite causal structure as opposed to a simulation of such a process. Where exactly are the local operations of the parties in such an experiment? On what spaces do they act given that their times are indefinite? Can we probe them directly rather than assume what they ought to be based on heuristic considerations? How can we reconcile the claim that these operations really take place, each once as required, with the fact that the structure of the presumed process implies that they cannot be part of any acyclic circuit? Here, I offer a precise answer to these questions: the input and output systems of the operations in such a process are generally nontrivial subsystems of Hilbert spaces that are tensor products of Hilbert spaces associated with different times—a fact that is directly experimentally verifiable. With respect to these time-delocalized subsystems, the structure of the process is one of a circuit with a cycle, which cannot be reduced to a (possibly dynamical) probabilistic mixture of acyclic circuits. This provides, for the first time, a rigorous proof of the existence of processes with indefinite causal structure in quantum mechanics. 
I further show that all bipartite processes that obey a recently proposed unitary extension postulate, together with their unitary extensions, have a physical realization on such time-delocalized subsystems, and provide evidence that even more general processes may be physically admissible. These results unveil a novel structure within quantum mechanics, which may have important implications for physics and information processing.

**Paolo Perinotti**, Università degli Studi di Pavia

*Infinite composite systems and cellular automata in operational probabilistic theories*

Cellular automata are a central notion for the formulation of physical laws in an abstract information-theoretical scenario, and have led in recent years to the reconstruction of free relativistic quantum field theory. In this talk we extend the notion of a Quantum Cellular Automaton to general Operational Probabilistic Theories. For this purpose, we construct infinite composite systems, illustrating the main features of their states, effects and transformations. We discuss the generalization of the concepts of homogeneity and locality, in a framework where space-time is not a primitive object. We show that homogeneity leads to a Cayley graph structure of the memory array, thus proving the universality of the connection between homogeneity and discrete groups. We conclude by illustrating the special case of Fermionic cellular automata, discussing three relevant examples: Weyl and Dirac quantum walks, the Thirring automaton and the simplest families of automata on finite graphs.

**Ana Belen Sainz**, Perimeter Institute

*Almost quantum correlations violate the no-restriction hypothesis*

To identify which principles characterise quantum correlations, it is essential to understand in which sense this set of correlations differs from that of almost quantum correlations. We solve this problem by invoking the so-called no-restriction hypothesis, an explicit and natural axiom in many reconstructions of quantum theory stating that the set of possible measurements is the dual of the set of states. We prove that, contrary to quantum correlations, no generalised probabilistic theory satisfying the no-restriction hypothesis is able to reproduce the set of almost quantum correlations. Therefore, any theory whose correlations are exactly, or very close to, the almost quantum correlations necessarily requires a rule limiting the possible measurements. Our results suggest that the no-restriction hypothesis may play a fundamental role in singling out the set of quantum correlations among other non-signalling ones.

**Lev Vaidman**, Tel Aviv University

*Counterfactual communication protocols*

The possibility of communicating between spatially separated regions without even a single photon passing between the two parties is an amazing quantum phenomenon. The possibility of transmitting one value of a bit in such a way, the interaction-free measurement, has been known for a quarter of a century. Protocols for full communication, including the transmission of unknown quantum states, were proposed only a few years ago, but it was shown that in all these protocols the particle leaves a weak trace in the transmission channel, a trace larger than that left by a single particle passing through the channel. However, a simple modification of these recent protocols eliminates the trace in the transmission channel and makes all these protocols truly counterfactual.

**Dominic Verdon**, University of Oxford

*A compositional approach to quantum functions, and the Morita theory of quantum graph isomorphisms*

Certain nonlocal games exhibiting quantum advantage, such as the quantum graph homomorphism and isomorphism games, have composable quantum strategies which are naturally interpreted as structure-preserving functions between finite sets. We propose a natural compositional framework for noncommutative finite set theory in which these quantum strategies appear naturally, and which connects nonlocal games with recent work on compact quantum groups. We apply Morita-theoretical machinery within this framework to characterise, classify, and construct quantum strategies for the graph isomorphism game. This is joint work with Benjamin Musto and David Reutter, based on the papers 1711.07945 and 1801.09705.

**Alexander Wilce**, Susquehanna University

*Quantum axiomatics à la carte*

The past decade or so has produced a handful of derivations, or reconstructions, of finite-dimensional quantum mechanics from various packages of operational and/or information-theoretic principles. I will present a selection of these principles --- including symmetry postulates, dilational assumptions, and versions of Hardy's subspace axiom --- in a common framework, and indicate several ways, some familiar and some new, in which these can be combined to yield either standard complex QM (with or without SSRs) or broader theories embracing formally real Jordan algebras.