Hello, bonjour, cama-i, ainngai, welcome to the sixth week of Statistical Mechanics: Algorithms and Computations, from the Physics Department of École Normale Supérieure. We are right now in the middle of our three-week module on quantum physics and quantum statistical mechanics, and we have just started the second half of this course. So let me start with a brief exposition of where we have come from and where we stand right now, in the light of what we have already learnt about the connection between sampling and integration. Five weeks ago, we started by considering pebbles on the Monte Carlo beach and on the heliport. The partition function was the integral in x from -1 to 1 and the integral in y from -1 to 1 of pi(x,y), which was a constant equal to one. We then moved to a discussion of hard disks. The partition function now was this integral, with the statistical weight pi(x) which was equal to 1 if the configuration was legal, and 0 if it was illegal. Last week, we moved to a discussion of quantum physics. The partition function now was the path integral over dx_0, ..., dx_(N-1) of the statistical weight, which was the density matrix between x_0 and x_1 at beta/N, times the density matrix between x_1 and x_2 at beta/N, ..., times the density matrix between x_(N-1) and x_0 at beta/N. This is the path integral. We sampled it using a naive path-integral Monte Carlo algorithm that resembled what we had already done for hard disks. But rather than picking one disk at random, we picked one bead at random, moved it a little bit to the left or to the right, and accepted the move with the Metropolis acceptance probability. This week, we will remain within this picture of quantum physics. We will present a providential sampling algorithm that goes back to Paul Lévy in 1940. It is a rejection-free direct-sampling algorithm, something we never achieved for hard disks. Let me briefly explain what the Lévy algorithm achieves.
You pick a starting position x at tau=0 and a final position x' at tau=beta, and then you fill in the path between x and x' in one go. You can do this with a number of slices that is as large as you want: 100, 1000, 1,000,000... Here we have a free particle, but an analogous Lévy construction exists also for a particle in a harmonic trap. Lévy's algorithm is really famous, and not just in quantum statistical mechanics. It exists under various names in the physics of interfaces, in numerical analysis, in statistics, in financial mathematics, and in electrical engineering. The path you see here illustrates the Heisenberg uncertainty principle: the founding principle of quantum mechanics, and one of the pillars of quantum statistical physics. At low temperature (large beta), the path fluctuates wildly, and this means that the particle position x is uncertain. Next week, the Lévy construction will be a crucial element of many-body simulations, and then we will fathom the second pillar of quantum statistical mechanics: the indistinguishability of particles, their bosonic or fermionic nature, and the curious Bose-Einstein condensation. To discuss Bose-Einstein condensation, we will again mostly use the path-integral language, but to understand why this is a condensation we need to consider wavefunctions and energy levels. This is what we will do in this week's tutorial, where we will get to the heart of bosonic statistics and solve a model of a finite (small) number of particles in a three-dimensional harmonic trap. In this week's homework session, we consolidate what we learn in the lecture through practical calculations. For simplicity, we remain within the framework of the harmonic potential, but the fireworks of sampling strategies and re-weighting schemes apply to a much wider range of problems than just four little quantum particles in a harmonic trap.
Again, you must understand the great range of applications of our concepts, inside and outside physics. Quantum paths are closely related to interfaces, as you see here for an interface between two liquids. Quantum paths are also closely related to what is studied by financial analysts. So, let's get started with week 6 of Statistical Mechanics: Algorithms and Computations. We recall once more the final subject of last week's lecture. We arrived at a representation of the partition function of a quantum particle in a harmonic potential as a sum, or rather as a multiple integral, over paths, with a statistical weight giving the weight of each path. You see, the partition function really is a sum over all paths. We sampled this partition function using a naive algorithm, naive_harmonic_path.py. Its key element was that we sampled a random bead k, proposed a random move from x_k to x_k + delta_x, and accepted or rejected the move with the Metropolis acceptance probability. naive_harmonic_path satisfies detailed balance and is aperiodic and irreducible, but it is painfully slow. If we start at a configuration like the one shown here, the next configuration will differ very little from the one we already had, because we only move one bead, and we move it only by a little bit. If we made larger moves, we would get into trouble with the 1/2 rule: we would reject all the moves, and would not move either. So you see that a simulation using naive_harmonic_path would be very, very slow in exploring all the configurations of our system. To make progress, let us move back by two steps. First step: we take away the harmonic potential and consider a free particle. Second step: instead of sampling the value of k at each step, we keep k fixed. We move the particle x_k for fixed values of x_(k-1) and x_(k+1), as programmed in naive_path_slice.py.
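The single-bead Metropolis move just described can be sketched in a few lines of Python. This is not the course's naive_harmonic_path.py itself, only a minimal illustration of the same idea, assuming hbar = m = omega = 1, a path periodic in tau, and illustrative parameter values:

```python
import math
import random

beta = 4.0           # inverse temperature
N = 16               # number of time slices (beads)
dtau = beta / N
delta = 1.0          # maximum bead displacement
x = [0.0] * N        # initial path: all beads at the origin

def rho_free(a, b, dtau):
    """Free density matrix between a and b, up to a constant (hbar = m = 1)."""
    return math.exp(-(a - b) ** 2 / (2.0 * dtau))

for step in range(10000):
    k = random.randint(0, N - 1)                # pick one bead at random
    knext, kprev = (k + 1) % N, (k - 1) % N     # neighbors, periodic in tau
    x_new = x[k] + random.uniform(-delta, delta)
    # weight of the bead: two free density matrices plus the harmonic potential
    old_w = (rho_free(x[kprev], x[k], dtau) * rho_free(x[k], x[knext], dtau)
             * math.exp(-0.5 * dtau * x[k] ** 2))
    new_w = (rho_free(x[kprev], x_new, dtau) * rho_free(x_new, x[knext], dtau)
             * math.exp(-0.5 * dtau * x_new ** 2))
    if random.uniform(0.0, 1.0) < new_w / old_w:  # Metropolis acceptance
        x[k] = x_new
```

Only the two density matrices touching bead k and its own potential term enter the acceptance ratio; everything else cancels.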
So we move x_k with the Metropolis algorithm, and here is the histogram of the positions of x_k. It looks like a Gaussian, and in fact IT IS a Gaussian, because the probability pi(x_k) is put together from the density matrices rho_free(x_(k-1), x_k) and rho_free(x_k, x_(k+1)). The two are Gaussians, and their product is also a Gaussian in x_k. So the distribution pi(x_k) is a Gaussian with mean value (x_(k-1) + x_(k+1)) / 2 and variance sigma^2 = delta_tau / 2. Now, there is no reason in the world to use the Metropolis algorithm, as we know that the distribution of x_k is a Gaussian. We should sample x_k directly from this Gaussian distribution with the correct mean value and variance. You might be interested in implementing this idea in your version of naive_harmonic_path, as we did here for the free particle. But rather than doing that, let us slightly generalize this problem and consider a position x_k sandwiched between positions x' and x'', with intervals in tau of delta_tau' and delta_tau''. The probability distribution of x_k given x' and x'' is the product of two free density matrices: rho_free(x', x_k, delta_tau') and rho_free(x_k, x'', delta_tau''). Both these density matrices are Gaussians, and their product is also a Gaussian. Putting all this together, and noticing that in this distribution x_k is the variable while x' and x'' are fixed, we find that the probability distribution pi(x_k) given x' and x'' is a Gaussian with mean value and variance as shown here. Before continuing, please take a moment to download, run and modify the programs that we discussed in this section. On the Coursera website, you'll find the program naive_path_slice.py that treats the case of one variable x_k sandwiched in between x' and x''. This is an innocent little program, but it should convince you that the formulas we used above are correct.
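Completing the square in the product of the two free density matrices gives the mean (delta_tau'' x' + delta_tau' x'') / (delta_tau' + delta_tau'') and the variance 1 / (1/delta_tau' + 1/delta_tau''). A minimal sketch of this direct sampling, assuming hbar = m = 1 (the function name is illustrative):

```python
import math
import random

def sample_slice(xp, xpp, dtau_p, dtau_pp):
    """Direct-sample x_k sandwiched between x' (interval dtau')
    and x'' (interval dtau''), from the exact Gaussian pi(x_k | x', x'')."""
    mean = (dtau_pp * xp + dtau_p * xpp) / (dtau_p + dtau_pp)
    sigma = 1.0 / math.sqrt(1.0 / dtau_p + 1.0 / dtau_pp)
    return random.gauss(mean, sigma)
```

In the symmetric case delta_tau' = delta_tau'' = delta_tau, this reduces to mean (x' + x'') / 2 and variance delta_tau / 2, as stated above.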
In any case, running this program, you'll understand that using the Metropolis algorithm in this situation is useless: you might just as well sample x_k directly from the appropriate Gaussian. Then, again, you'll find naive_harmonic_path.py. In this algorithm, you could replace the Metropolis algorithm for the x_k by Gaussian direct sampling, but better wait for greater things to come. In the naive path-sampling algorithm, we propose x_k with a flat distribution and accept it with a Gaussian. The mismatch between proposed and accepted moves of course generates the rejections of the Metropolis algorithm. We could modify the naive path-sampling algorithm by proposing x_k with the appropriate Gaussian distribution, and there would be no rejections any more. But we can put the conditional probability pi(x_k) to much better use. In fact, pi(x_k | x', x'') gives the statistical weight of all paths that start at x', pass through x_k, and end up at x''. Let's apply this idea for x_k = x_1, x' = x_0, and x'' = x_N. We can sample the position x_1, given x_0 and x_N. Between the freshly sampled value of x_1 and x_N, we can then sample the value of x_2, then x_3, then x_4, and eventually the entire path. In Python, this gives the following little algorithm: levy_free_path.py. Look here at a configuration for a path with N=50000 produced by levy_free_path.py. It has no correlation with previous paths, and its construction has caused no rejections. In the limit of N going to infinity, x(tau) is a continuous function of tau. As a curiosity, let us discuss for a moment the return condition that makes the Lévy construction non-trivial. The path has to return to the position x' at tau=beta. Without this return condition, you would simply have a random walk: you would sample the position x_1 from a given position x_0 as a Gaussian, then sample the position x_2 from the position x_1, then x_3 from x_2, x_4 from x_3, and so on, until x_N from x_(N-1).
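A minimal sketch of the free Lévy construction along these lines, assuming hbar = m = 1 (the reference implementation is levy_free_path.py on the website):

```python
import math
import random

def levy_free_path(xstart, xend, beta, N):
    """Levy construction of a free path from xstart (tau=0) to xend (tau=beta)."""
    dtau = beta / N
    x = [xstart]
    for k in range(1, N):
        dtau_prime = (N - k) * dtau     # interval remaining to the endpoint
        # Gaussian for x_k sandwiched between x_(k-1) and xend
        mean = (dtau_prime * x[k - 1] + dtau * xend) / (dtau + dtau_prime)
        sigma = 1.0 / math.sqrt(1.0 / dtau + 1.0 / dtau_prime)
        x.append(random.gauss(mean, sigma))
    x.append(xend)
    return x
```

Each slice is sampled once, directly from its exact conditional distribution, so the path is built in a single sweep with no rejections.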
The partition function of this problem would be written as Z_random_walk = integral dx_1 ... dx_N of the density matrices rho_free(x_0, x_1, beta/N) through rho_free(x_(N-1), x_N, beta/N). Notice that we would sample over the position x_N, which means we integrate over the variable x_N. This is implemented in the algorithm continuous_random_walk.py, but of course the probability to hit the position x' at tau=beta, the probability that x_N = x', is equal to zero. Now, let me tell you of a simple algorithm (trivial_free_path.py) that rectifies the situation. What you can do is sample the Gaussian random walk that goes to x_N rather than x', then pull back the final position x_N by the value (x_N - x'), and pull back all the intermediate positions by an amount proportional to (x_N - x') * tau/beta. This algorithm produces output which is indistinguishable from the output of the Lévy construction, but I leave it to you as an exercise to prove this. Before moving on to the next section, please take a moment to download, run and modify the algorithms we discussed just now. There is the algorithm levy_free_path.py. In this algorithm you really can modify many aspects: there is no reason to first sample x_1, then x_2, x_3 and so on; you can rewrite your algorithm so that it first samples x_(N/2) and then fills in the lower and upper parts of the path separately. Then there is the algorithm continuous_random_walk.py, and the mysterious pull-back algorithm, trivial_free_path.py. Try to understand, at least empirically, that this algorithm gives exactly the same output as the Lévy construction. We have sampled so far free quantum paths: continuous, non-differentiable objects that correspond to the free Hamiltonian. This was possible because the free density matrix is a Gaussian, and products and integrals of Gaussians are again Gaussians. For the harmonic density matrix, in the Trotter approximation, the same is true.
It is given by a product of three Gaussians: two for the Trotter formula and the potential, and one for the free density matrix. So at high temperature (small beta), the harmonic density matrix is given by an exponential of two terms: one in (x-x')^2 and one in (x+x')^2. Under matrix squaring, the harmonic density matrix keeps this form, and an exact calculation shows that the explicit formula for the harmonic density matrix at all temperatures is given by this expression. The diagonal term rho_harm(x, x, beta) is proportional to exp(-x^2 * tanh(beta/2)). The fact that it is a Gaussian allows us to derive an algorithm for the Lévy construction of the particle in a harmonic trap. This is levy_harmonic_path.py, and it is as simple as the free Lévy construction. Again, we can choose a configuration x at tau=0 and x' at tau=beta, and the algorithm fills in a configuration, a path, between 0 and beta, without rejections and without using Markov-chain methods. Of course, the path is now constrained by the harmonic potential, and its extension is set by the oscillator strength omega and the temperature. Before continuing, please take a moment to download, run and modify the program we just discussed: levy_harmonic_path.py. Besides this program and its graphics version, you'll also find a short fact sheet that gives the relevant formulas and derives the harmonic density matrix through matrix squaring. You will also find on the website a multidimensional version of this program. We have no problem understanding that a three-dimensional quantum path can be sampled by independent Lévy constructions in x, y and z. In this lecture, we studied the path-integral picture of quantum statistical mechanics and the providential path-sampling algorithm that goes back to Lévy in 1940. In free space, and in any dimension, this algorithm allows us to sample a path from a position x at tau=0 to x' at tau=beta.
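Because the harmonic density matrix is a Gaussian, the distribution of a sandwiched slice remains Gaussian, now with hyperbolic-function coefficients in place of the free 1/delta_tau terms. A minimal sketch of the harmonic Lévy construction, assuming hbar = m = omega = 1 (the reference implementation is levy_harmonic_path.py):

```python
import math
import random

def levy_harmonic_path(xstart, xend, beta, N):
    """Levy construction for a particle in a harmonic trap
    (hbar = m = omega = 1), from xstart (tau=0) to xend (tau=beta)."""
    dtau = beta / N
    x = [xstart]
    for k in range(1, N):
        dtau_prime = (N - k) * dtau
        # Gaussian parameters from the product of two harmonic density matrices
        ups1 = 1.0 / math.tanh(dtau) + 1.0 / math.tanh(dtau_prime)
        ups2 = x[k - 1] / math.sinh(dtau) + xend / math.sinh(dtau_prime)
        x.append(random.gauss(ups2 / ups1, 1.0 / math.sqrt(ups1)))
    x.append(xend)
    return x
```

For small dtau, 1/tanh(dtau) and 1/sinh(dtau) both approach 1/dtau, and the free Lévy construction is recovered.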
Notice that in this picture we are discussing a quantum system in one dimension, x, but it looks like a classical line-shaped object in two dimensions. It is for this reason that one says that a quantum system in one dimension corresponds to a classical system in two dimensions, or more generally, that a quantum system in dimension d corresponds to a classical system in dimension d+1. As the temperature gets lower, beta gets larger, and the extension of this space in all directions becomes infinite. As we discussed, the Lévy construction also exists for particles in a harmonic potential, and you will rely heavily on it to create your own Bose-Einstein condensate, one that closely resembles the experimental condensates that were created in this glass cell. So now it is time for you to download and run the programs that we discussed, and to create a quantum path like the one shown here. Create your own quantum path, with hundreds and thousands of points, print it out on glossy paper, make yourself a frame, and hang it up in your room. As mentioned, there are many exciting subjects that we will discuss in this week's tutorial and discover in this week's homework. So thanks for your attention, and see you again next week in Statistical Mechanics: Algorithms and Computations, for a week devoted to Bose-Einstein condensation.