Welcome back to the course on quantitative model checking. This second module deals with Discrete Time Markov Chains, and the first lecture is meant as a gentle introduction to DTMCs.

So let's talk a bit about DTMCs. I would like to start from the point of view of state-transition systems. Recall that a state-transition system consists of nodes that represent states. A state can be the current color of a traffic light, the current value of all program variables, or the current values of the registers and input bits. Then we have edges. Edges model transitions. A transition can be the switch from one color to another, the execution of a program statement, or the change of registers and output bits for a new input.

Let's take a quick look at the definition of a transition system. A transition system is a 3-tuple that consists of a set of states (the state space), a transition relation, and a set of initial states. Now, this simple model is meant to model a vending machine: I can take my coin, put it into the vending machine, and then choose whether I want to have, in this example, beer or soda. And then hopefully the vending machine sells me the beverage of my choice.

Now, if we want to move to Markov chains, we have to add probabilities to transitions. These probabilities indicate the chance of taking exactly this transition. We also add time by specifying how long the system stays in a state. Time can either be discrete, modeling an abstract time duration, or continuous, modeling a real time duration. Let me illustrate this for you. This is my timeline, and the discrete time steps on it are equally spaced. If I want to model a discrete signal, I would put the signal values at exactly these discrete time steps. If instead I want a continuous-time signal, it would look something like this.

Please meet Mr. Andrey Markov.
A Russian mathematician who is well known for his work on stochastic processes, which later became known as Markov processes or Markov chains. What all these Markov chains have in common is the so-called Markov property, which states that the future evolution of the system is independent of its past states. Another way of putting this is to say that the current state of the model contains all the information that can influence the future evolution. This concept is also known as memorylessness.

Now, let us take a look at Discrete Time Markov Chains. In discrete time, we observe the system at a set of discrete time points, as I have just shown you. Now, I have to tell you something which you probably won't like: in order to deal with DTMCs, you have to be able to work with matrices and vectors, and especially you have to be able to do matrix-vector multiplications. Furthermore, you need to be able to solve a system of linear equations correctly.

Now, let's take a look at the DTMC definition. A DTMC also has a state space, given as a countable set of states, and a transition relation, but here the transition relation is given by a probability matrix P, which is a stochastic matrix. What does it mean that this is a stochastic matrix? Well, first of all, the outgoing transition probabilities of each state s have to sum to one, and all the entries of the stochastic matrix have to come from the interval [0, 1]. So here is a quick reminder for you of what a stochastic matrix is.

Now, let's put this into practice. I have come up with a very simple example: a model of your favorite web server, for example Netflix. It can be in two states: it can either be working or broken. Let me illustrate this for you. This is the state where the web server is up and running, and this is the state where the web server is down. Now, we have some information available.
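As a quick aside before we fill in the numbers: the two conditions that make a matrix stochastic can be checked mechanically. This is a minimal Python sketch of such a check; the example matrices are illustrative and not part of the lecture.

```python
def is_stochastic(P, tol=1e-9):
    """Check that every entry lies in [0, 1] and every row sums to one."""
    return all(
        all(0.0 <= p <= 1.0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

# An illustrative 2x2 matrix: both rows sum to one.
print(is_stochastic([[0.5, 0.5],
                     [0.2, 0.8]]))   # True

# Second row sums to 1.1, so this is not a stochastic matrix.
print(is_stochastic([[0.5, 0.5],
                     [0.3, 0.8]]))   # False
```

The small tolerance is there because floating-point row sums are rarely exactly one.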
We know that if the web server is working today, it will be working tomorrow with a probability of 90%, so I can add a transition with probability 0.9. Furthermore, we know that if the web server is broken today, it will be broken tomorrow with a probability of 60%; here too, I can add a transition, with probability 0.6. Of course, the web server can also move from broken to working, and back from working to broken, so let's add transitions for these as well.

Now, what is left are the remaining probabilities. Because this is a DTMC, we have just learned that the outgoing transition probabilities of each state need to sum to one. So I can add 0.1 here, because 0.1 and 0.9 add up to one, and I can also add 0.4 here, since 0.4 and 0.6 also add up to one. Let me clean this up for you so that you can see it better. And this is how it looks then. If you want to know what you can do with such a DTMC model, I recommend watching the next lecture.
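To make the finished model concrete, here is a minimal Python sketch of the web-server DTMC together with the matrix-vector multiplication mentioned earlier. The probabilities come from the lecture; the ordering "working = state 0, broken = state 1" is my own convention.

```python
# Web-server DTMC: state 0 = working, state 1 = broken.
# Row i holds the outgoing probabilities of state i; each row sums to one.
P = [[0.9, 0.1],
     [0.4, 0.6]]

def step(dist, P):
    """One discrete time step: multiply the (row) distribution vector by P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

d0 = [1.0, 0.0]        # today the server is working for sure
d1 = step(d0, P)       # distribution tomorrow: [0.9, 0.1]
d2 = step(d1, P)       # distribution the day after: approximately [0.85, 0.15]
print(d1, d2)
```

Each call to `step` advances the state distribution by one day, so repeated multiplication gives the probability of the server being up or down after any number of days.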