Hello, Bonjour, Sa-wat-dee, Vitayu! Welcome to the seventh week of Statistical Mechanics: Algorithms and Computations from the Physics Department of École normale supérieure. This lecture is the third and last one on quantum statistical mechanics, and the lecture and the entire program of this week are dedicated to a discussion, and even a celebration, of quantum indistinguishability and Bose-Einstein condensation. As several times already during this course, we push our two main approaches, Markov-chain Monte Carlo sampling and direct sampling, very far, here for the ideal Bose gas. A very powerful Markov-chain Monte Carlo algorithm will arise in the lecture, whereas a direct-sampling algorithm and a detailed understanding of permutations will be the subject of this week's tutorial. Of course, whenever there is a direct-sampling algorithm, there is an analytic solution just around the corner, and we will obtain it also. Finally, in this week's homework session, you will take over the path-integral Monte Carlo program that I present here, and analyze its output at high and low temperature.

Before starting, let me explain our goals for this week. Our discussion of Bose-Einstein condensation will result in a short Python program that produces the following output: you see configurations x, y and z of about 1000 ideal bosons in a harmonic trap at temperature T. At a well-defined temperature, the bosons clump together in the center of the trap: this is Bose-Einstein condensation. It was first achieved, in experimental harmonic traps just like ours, in 1995, by Cornell and Wieman, and also by Ketterle, and all three were awarded the 2001 Nobel Prize in Physics for their achievement. You will create Bose-Einstein condensates yourself later this week, and you will also modify and analyze the program. One thing you can do, by changing just two lines in your program, is to turn off the bosonic nature of the particles and make them distinguishable.
You can then run your code for the same number of particles, with the same mass and the same temperature, in the same harmonic potential, and you will see that at the temperature at which the bosons condense, nothing happens for the distinguishable particles. So you see that Bose-Einstein condensation is due to the bosonic nature of the particles, as the name indicates. You can look at your Bose-Einstein condensates just like the atomic physicist who controls the atomic cloud through lasers. But attention: don't forget to wear your glasses to protect yourself against the powerful laser light! To follow what we discuss here, you only need to understand what we discussed during the last two weeks: the density matrix, and the Lévy quantum path in a harmonic potential. So, let's get started with week 7 of Statistical Mechanics: Algorithms and Computations.

Before running a quantum Monte Carlo simulation for bosons, we must understand the bosonic density matrix. Let's go back to a single particle. The partition function of a single particle at temperature T is given by... or in other words, by the sum over all the paths from x to x, integrated over x. The partition function Z is a sum over the diagonal density matrix, as you found out for yourself during the last two homework sessions, and the off-diagonal density matrix intervenes in the construction of the paths. Naturally, x can be a position in three-dimensional space, and the paths are independent paths in x, y and z. Next, let us consider two particles. The statistical weight of a position x = (x0, x1) involves a density matrix as before, and the paths now run from x0 to x0 and from x1 to x1. The partition function is an integral over all the paths from x0 to x0 and from x1 to x1, integrated over x0 and integrated over x1. For non-interacting particles, we are already done: the partition function is the trace, the integral over dx0 dx1... and so on... of the diagonal paths. These paths are independent.
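For the harmonic trap, this trace formula can be checked numerically. Here is a minimal sketch of mine, not the lecture's code, in natural units (hbar = m = omega = 1), using the diagonal harmonic density matrix from the fact sheet; the function name rho_harm_diag is my own:

```python
import math

def rho_harm_diag(x, beta):
    # Diagonal harmonic-oscillator density matrix rho(x, x, beta),
    # in natural units hbar = m = omega = 1 (see last week's fact sheet).
    return math.sqrt(1.0 / (2.0 * math.pi * math.sinh(beta))) * \
           math.exp(-x ** 2 * math.tanh(beta / 2.0))

beta, dx = 1.0, 0.01
# Partition function as the trace: Z = integral dx rho(x, x, beta),
# approximated by a Riemann sum on [-6, 6].
Z_trace = sum(rho_harm_diag(-6.0 + i * dx, beta) for i in range(1201)) * dx
# Known exact result for the harmonic oscillator: Z = 1 / (2 sinh(beta / 2)).
Z_exact = 1.0 / (2.0 * math.sinh(beta / 2.0))
print(Z_trace, Z_exact)
```

The agreement of the two numbers illustrates that summing the diagonal density matrix over all positions indeed gives the partition function.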
Now, interacting systems are described by paths whose weight is modified through the Trotter decomposition: this correlates the paths. We can color these paths, make them blue, yellow, green, and so on... and we see that with this formalism, we in fact describe distinguishable quantum particles. To go from distinguishable to indistinguishable particles is very easy, and we will restrict ourselves to bosons. The partition function again involves positions, here on the bottom, and the same positions on the top. These positions must be the same, because we are concerned with the diagonal density matrix, but now there can be permutations. So you see, the green particle becomes red, and the red particle becomes blue, and so on... There is no need for colors anymore: the particles have become indistinguishable. In fact, the bosonic partition function is given by an average over all permutations of the paths from x0 to the position of the permutation of 0, from x1 to the position of the permutation of 1, and so on... This formula can be rigorously derived using symmetric wavefunctions, but its spirit is very clear. For interacting particles, we can now cut up the density matrix into little slices and bring in the Trotter decomposition. But for non-interacting particles, there is no need for intermediate slices, and no need for the Trotter decomposition, so we arrive at the partition function of non-interacting ideal bosons, as shown here. So here we have a multiple integral over paths, and a sum over permutations. The structure shown here is completely general, and the only simplification of the ideal Bose gas is the fact that the many-body density matrix breaks up into a product of single-particle density matrices, or in other words, the fact that the paths are independent. In this week's tutorial, we will treat this partition function analytically. But here, let us consider the sampling problem.
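Written out explicitly, the bosonic partition function described in words above reads, in standard notation (N particles, permutations P, and, for the ideal gas, single-particle density matrices rho):

```latex
Z = \frac{1}{N!} \sum_{P} \int \mathrm{d}x_0 \cdots \mathrm{d}x_{N-1}\,
    \rho\bigl(x_0, x_{P(0)}, \beta\bigr)\,
    \rho\bigl(x_1, x_{P(1)}, \beta\bigr) \cdots
    \rho\bigl(x_{N-1}, x_{P(N-1)}, \beta\bigr)
```

The 1/N! implements the average over all permutations mentioned in the lecture.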
We face, in fact, two challenges: the sampling of the permutations, and the sampling of the positions. Let us consider the sampling of the permutations first, and let us radically simplify the problem discussed here by replacing everything that depends on space by 1; so instead of sampling bosonic permutations, we consider for a few minutes the permutations of n elements in a list. At each step, we may exchange two random elements. This random-transposition algorithm, a Markov-chain algorithm, is implemented in the program permutation_sample.py. Look here, at the two indices i and j, and here, at the exchange of L[i] and L[j]. Output of this program is shown here. You see the permutations represented in bottom-up fashion, which means, for this permutation, that 0->0, 1->3, 2->1 and 3->2. The random-transposition algorithm is correct: it satisfies detailed balance, it is irreducible and aperiodic, and you see that the frequency of each of our 24 permutations comes out just right. The second element of our simulation program is the sampling of positions. Let us look again at the average over permutations, and pick one of them, the one we just looked at: the permutation 0->0, 1->3, 2->1 and 3->2. We can represent this permutation graphically, and you see there is one cycle of length 1, and one cycle of length 3: the cycle 1->3->2->1. This cycle has a curious action. Have you seen this before? Of course! The integral over x3 is just like in the convolution theorem: it gives the density matrix rho(x1, x2) at 2 beta. And the integral over x2 can again be used in the convolution theorem: it gives the diagonal density matrix rho(x1, x1) at 3 beta. So in this permutation's part of the partition function, we have one particle, the particle 0, at inverse temperature beta, and the particles on the cycle of length three, 1, 3 and 2, are in fact at inverse temperature 3 beta, this means at three times lower temperature.
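The convolution property for the cycle of length three can be verified numerically for the harmonic oscillator. The following is a sketch of mine, not the lecture's code, using the off-diagonal harmonic density matrix from last week's fact sheet (natural units hbar = m = omega = 1):

```python
import math

def rho_harm(x, xp, beta):
    # Off-diagonal harmonic density matrix rho(x, x', beta),
    # in natural units hbar = m = omega = 1 (last week's fact sheet).
    return math.sqrt(1.0 / (2.0 * math.pi * math.sinh(beta))) * \
           math.exp(-(x + xp) ** 2 / 4.0 * math.tanh(beta / 2.0)
                    - (x - xp) ** 2 / 4.0 / math.tanh(beta / 2.0))

beta, x1, dx = 1.0, 0.3, 0.02
grid = [-5.0 + dx * i for i in range(501)]
# Integrate out x3 and x2 on the cycle 1 -> 3 -> 2 -> 1:
#   int dx3 dx2  rho(x1, x3, beta) rho(x3, x2, beta) rho(x2, x1, beta)
cycle_weight = sum(rho_harm(x1, x3, beta) * rho_harm(x3, x2, beta) *
                   rho_harm(x2, x1, beta)
                   for x3 in grid for x2 in grid) * dx ** 2
# By the convolution theorem, this equals the diagonal density
# matrix at inverse temperature 3 beta:
print(cycle_weight, rho_harm(x1, x1, 3.0 * beta))
```

The two printed numbers agree: the three particles on the cycle behave like one particle at three times lower temperature.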
To sample the partition function for this particular permutation, we may use what we learned in last week's homework: we may sample x0 from the diagonal density matrix rho_harmonic of (x0, x0, beta), and similarly in y and in z. Analogously, we may sample the position x1 from the diagonal density matrix rho_harmonic of (x1, x1, 3 beta). The positions x3 and x2 are the intermediate points of a Lévy construction over the inverse temperature 3 beta, with slices at beta and 2 beta. The moves in positions have no rejections. In contrast, we must sample the permutations with the Metropolis acceptance probability. For example, in our permutation 0->0, 1->3, 2->1 and 3->2, let us pick two random elements, for example 1 and 2, and exchange where they point. So the new permutation has 1->1 and 2->3. The old weight of the permutation is proportional to... and the new weight is proportional to... We accept this move with the Metropolis acceptance probability min(1, pi_new/pi_old). For what follows, please take a moment to download, run and modify the program discussed in this section. The program permutation_sample.py is really simple, yet it may familiarize you with the way we write permutations from bottom to top, and with the transpositions that constitute a Markov chain whose stationary probability distribution is the uniform distribution over permutations. For the following discussions, please pull up the fact sheet of last week, where we discussed the harmonic density matrix and harmonic path sampling. We have provided it again today. To simulate ideal bosons in a 3D harmonic trap, we start with the identity permutation and with random positions sampled from the diagonal harmonic density matrix in x, y and z. For each particle move, we sample a random particle, identify its permutation cycle, and sample a new Lévy quantum path for the entire cycle. For each permutation move, we sample two random particles, like this and this, and we attempt an exchange of their permutation partners.
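As a sketch (not the lecture's exact code), a levy_harmonic_path-style function for such a cycle might look as follows, again in natural units (hbar = m = omega = 1); the formulas for the conditional Gaussian mean and width are those of last week's harmonic Lévy construction:

```python
import math
import random

def levy_harmonic_path(xstart, xend, beta, n_slices):
    """Sample a harmonic-oscillator path from xstart (tau = 0) to xend
    (tau = beta) with n_slices imaginary-time steps (hbar = m = omega = 1)."""
    dtau = beta / n_slices
    x = [xstart]
    for k in range(1, n_slices):
        dtau_prime = beta - k * dtau          # imaginary time left to xend
        # Conditional Gaussian of the next slice, given the previous
        # point and the endpoint (harmonic Levy construction).
        ups_1 = 1.0 / math.tanh(dtau) + 1.0 / math.tanh(dtau_prime)
        ups_2 = x[k - 1] / math.sinh(dtau) + xend / math.sinh(dtau_prime)
        x.append(random.gauss(ups_2 / ups_1, 1.0 / math.sqrt(ups_1)))
    x.append(xend)
    return x

# For the cycle 1 -> 3 -> 2 -> 1: a path over 3 beta, with intermediate
# slices at beta and 2 beta.
beta = 0.5
path = levy_harmonic_path(0.0, 0.0, 3.0 * beta, 3)
print(path)
```

This position move needs no rejection step: every path it produces is drawn directly from the correct distribution.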
In Python, this gives the following program, markov_harmonic_bosons.py. This program has two functions. The first function, levy_harmonic_path, is used at multiples of the inverse temperature beta, corresponding to the length of the permutation cycle; we use it to resample the positions of the entire cycle. The second function computes the off-diagonal harmonic density matrix; we use it to organize the exchange of two elements. And here is the second part of this program. After an initialization, exactly as announced, we enter a short iteration loop. We sample a random particle and compute the permutation cycle it is on. Then, we simply resample the entire path of the cycle from the Lévy quantum path. And here, we pick two particles and attempt an exchange. This is all there is to this program. In this very short program, there are no particle indices. The particle positions x, y and z are the "keys" of a "dictionary" called "positions". These are the positions at tau = 0. The "values" of this dictionary are the positions at tau = beta, the positions of the permutation partners. So here, we sample a random key, and the pop operation outputs the position of the permutation partner. Output of markov_harmonic_bosons.py is shown here. At high temperature, particles are quite far from each other, and attempts to perform a transposition are usually rejected. At lower temperature, beta becomes larger and the transpositions are accepted more easily. All of a sudden, particles clump together: the transpositions are accepted, and particles are on long permutation cycles. On long permutation cycles, they seem to be at much lower temperature; in fact, they are in the ground state. This is the essence of Bose-Einstein condensation. We will treat it again in more detail in this week's tutorial. So here is the program we discussed during this lecture, markov_harmonic_bosons.py, as well as its movie version, which produced the nice graphics output you saw during this lecture.
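The dictionary bookkeeping can be illustrated with a toy sketch of mine; find_cycle is a hypothetical helper, and plain numbers stand in for the particle positions that the real program uses as keys:

```python
# Toy version of the dictionary in markov_harmonic_bosons.py: keys are
# particle positions at tau = 0, values the positions of their permutation
# partners at tau = beta (each value is itself a key of the dictionary).
positions = {0.0: 1.0, 1.0: 2.0, 2.0: 0.0,   # a cycle of length 3
             3.0: 3.0}                       # a cycle of length 1

def find_cycle(positions, start):
    """Follow the permutation partners until we come back to the start."""
    cycle = [start]
    while positions[cycle[-1]] != start:
        cycle.append(positions[cycle[-1]])
    return cycle

print(find_cycle(positions, 0.0))  # the three particles on the long cycle
print(find_cycle(positions, 3.0))  # the single particle on its own cycle
```

No particle indices appear anywhere: the permutation structure is carried entirely by the dictionary itself.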
In this week's homework session, you will take over the steering wheel of this beautiful program, and run it at high and at low temperature. Notice that this program is short enough for you to gain a complete understanding of how it works. So, in conclusion, we have studied in this lecture Bose-Einstein condensation, and set up a really compact path-integral Monte Carlo simulation for hundreds and thousands of bosons. We could watch Bose-Einstein condensation ourselves, so now it is time for me to let you play with this algorithm and learn how it works. More details will be provided in this week's tutorial and homework session. So now, finally, let me thank you for your attention, and see you again later this week and in further sessions of Statistical Mechanics: Algorithms and Computations.