So we have proven the formula for the density of the first interarrival time ψ1: its density is p_ψ1(t) = λ(t)·exp(−Λ(t)), and I use the argument letter t to emphasise that we are speaking about time. Now I am going to prove the following formula, which gives the conditional density of the random variable ψ2 given ψ1. I claim that this conditional density is equal to p_{ψ2|ψ1}(t | s) = λ(t+s)·exp(−Λ(t+s) + Λ(s)). Why is this formula true? Let me box it. To show it, we should first find the joint density of ψ1 and ψ2, and then divide that joint density by the density of ψ1.

So let me start with the joint density of ψ1 and ψ2, and first find a closed form for the joint distribution function. F_{ψ1,ψ2}(s, t) is the probability that ψ1 ≤ s and ψ2 ≤ t. I will continue this line of reasoning using a kind of total probability law: more precisely, I will write this probability as the integral from 0 to s of P(ψ1 ≤ s, ψ2 ≤ t | ψ1 = y), multiplied by the density of ψ1 at the point y, dy. Of course, we can simply cross out the first event in the conditional probability, because y is smaller than s, so ψ1 ≤ s holds automatically. Now we can express this conditional probability as the probability that the increment of the process N from y to t + y is larger than or equal to one; formally, I should write P(N(t+y) − N(y) ≥ 1 | ψ1 = y), and multiply this by the density of ψ1.

Now we have a fully conditional probability. Here we have an event that depends on the increment of N after time y, while what is given relates to the behaviour of N before time y. Therefore these two events are independent, and the conditional probability is equal to the unconditional one. Finally, we can substitute the exact form of this probability, which is 1 − exp(−Λ(t+y) + Λ(y)), and we can also plug in the formula that is already proven for the density of ψ1. So we get F_{ψ1,ψ2}(s, t) = ∫₀ˢ (1 − exp(−Λ(t+y) + Λ(y))) · λ(y)·exp(−Λ(y)) dy.

From this formula we can find the joint density of ψ1 and ψ2. To do this, we should take two derivatives: one with respect to t and another with respect to s. It is simpler to take the derivative with respect to s first, because s appears as the upper bound of this integral, and according to a well-known fact this derivative is just the integrand with s substituted for y. The first derivative is therefore (1 − exp(−Λ(t+s) + Λ(s))) · λ(s)·exp(−Λ(s)). Now we should take the derivative with respect to t of this expression. You see that we actually have a difference of two terms here, and the first term does not depend on t at all, so its derivative is equal to zero. Taking the derivative of the second term with respect to t, we get the joint density p_{ψ1,ψ2}(s, t) = λ(t+s)·exp(−Λ(t+s) + Λ(s)) · λ(s)·exp(−Λ(s)) = λ(t+s)·λ(s)·exp(−Λ(t+s)).
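As a sanity check (this is my own sketch, not part of the lecture), a computer algebra system can redo the two differentiations for a concrete intensity. The sketch below assumes λ(u) = 2u, so that Λ(u) = u², and uses sympy to confirm that differentiating the joint distribution function first in s and then in t gives λ(t+s)·λ(s)·exp(−Λ(t+s)), and that dividing by the density of ψ1 recovers the claimed conditional density.

```python
# Symbolic check of the differentiation step, assuming lambda(u) = 2u,
# Lambda(u) = u**2.  Names and the concrete intensity are my own choices.
import sympy as sp

s, t, y = sp.symbols('s t y', positive=True)

lam = lambda u: 2 * u        # small lambda: intensity
Lam = lambda u: u ** 2       # capital Lambda: integrated intensity

# Joint distribution function F(s, t) = P(psi1 <= s, psi2 <= t)
F = sp.Integral((1 - sp.exp(-(Lam(t + y) - Lam(y)))) * lam(y) * sp.exp(-Lam(y)),
                (y, 0, s))

# Differentiate first in s (Leibniz rule: integrand at y = s), then in t
joint = sp.simplify(sp.diff(sp.diff(F, s).doit(), t))

# The joint density derived in the lecture: lambda(t+s)*lambda(s)*exp(-Lambda(t+s))
claimed_joint = lam(t + s) * lam(s) * sp.exp(-Lam(t + s))
print(sp.simplify(joint - claimed_joint))   # expected output: 0

# Dividing by the density of psi1 at s gives the claimed conditional density
cond = sp.simplify(joint / (lam(s) * sp.exp(-Lam(s))))
claimed_cond = lam(t + s) * sp.exp(-(Lam(t + s) - Lam(s)))
print(sp.simplify(cond - claimed_cond))     # expected output: 0
```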
Now, if we divide this expression by the density of the random variable ψ1 at the point s, which is λ(s)·exp(−Λ(s)), these factors cancel and we are left with exactly the product λ(t+s)·exp(−Λ(t+s) + Λ(s)). Therefore the boxed formula is proven. This observation completes the proof.
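The result can also be checked by simulation. The sketch below is again my own (with an assumed intensity λ(u) = 2u on a finite horizon): it generates the inhomogeneous Poisson process by Lewis–Shedler thinning, keeps the runs where ψ1 falls near a chosen value s, and compares the empirical mean of ψ2 with the mean computed from the claimed conditional density.

```python
# Monte Carlo check of the conditional density of psi2 given psi1 ~ s_target.
# The intensity, horizon and tolerance are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

lam = lambda u: 2.0 * u          # intensity; bounded by lam_max on [0, T]
Lam = lambda u: u ** 2           # integrated intensity
T, lam_max = 3.0, 6.0            # horizon and an upper bound for lam on [0, T]

def first_two_arrivals():
    """Thinning: return the first two arrival times, or None if fewer occur."""
    t, arrivals = 0.0, []
    while t < T and len(arrivals) < 2:
        t += rng.exponential(1.0 / lam_max)          # candidate arrival
        if t < T and rng.uniform() < lam(t) / lam_max:
            arrivals.append(t)                        # accepted arrival
    return arrivals if len(arrivals) == 2 else None

s_target, tol, psi2_samples = 1.0, 0.05, []
for _ in range(100_000):
    a = first_two_arrivals()
    if a and abs(a[0] - s_target) < tol:     # condition on psi1 near s_target
        psi2_samples.append(a[1] - a[0])     # psi2 = second interarrival time

# Mean of psi2 under the claimed conditional density, by a simple Riemann sum
ts = np.linspace(0.0, T, 2000)
dens = lam(ts + s_target) * np.exp(-(Lam(ts + s_target) - Lam(s_target)))
dt = ts[1] - ts[0]
print("empirical mean of psi2 :", np.mean(psi2_samples))
print("mean under the formula :", np.sum(ts * dens) * dt)
```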