[MUSIC] We are now at the third section of the second lesson, and we will be talking about sensitivity auditing. Which, as I mentioned, is something we want to do because we are aware that the use of mathematical modeling can be fraught with vulnerability, let's say, or open to abuse, if you prefer. For this reason, we will be presenting you with a set of simple rules which can be used to check the quality of a model-based assessment, if you are reading one; or, if you are producing one, to make sure that your assessment will comply with those rules. One of these rules, as we will discover, is: find uncertainties before uncertainty finds you. Sensitivity auditing is something which was developed as an extension of sensitivity analysis. The idea was that when a mathematical model is used to support policy, a simple technical sensitivity analysis will not be sufficient; we will need something more. But what is sensitivity analysis, then? Well, sensitivity analysis is something you always have to do when you apply mathematical modeling. It is an example of due diligence, if you wish, or of good practice. In the context of the post-normal science diagram, sensitivity analysis is also done when you are in the applied science section of the diagram, simply to make sure the quality of your analysis is sufficient. When you go beyond that, when you are in post-normal science, you certainly need sensitivity auditing. A sensitivity analysis can be visualized in this way. Imagine you are using a mathematical model and you are using input data. These data are used to calibrate some parameters; then you also have uncertainty in how you set up your model, the gridding, the resolution level, and you have different alternative model structures.
And when you put all of these together and manage to run them through some kind of Monte Carlo simulation, repeating the calculation over and over while changing all of these inputs simultaneously, what you get is a distribution for the output. It is the grey curve: the horizontal axis is the output, and the vertical axis is the frequency of occurrence. This is an empirical distribution function of your output, or inference, or prediction. And what you see here, sensitivity analysis, is when you go one step further and try to see where the uncertainty in this prediction is coming from. So in this example, you see that the largest part of the uncertainty comes (green arrows) from the input data. Okay, but as I said, we want to move beyond this assessment of the relation between uncertain input and uncertain output, because in this kind of assessment, in a sense, you assume that everything else in your mathematical model is true, is fixed. In sensitivity auditing you abandon this pretension, and everything is up for discussion and analysis. The first rule is in fact about making sure that mathematical models are not being used rhetorically, as Latin once was. You remember the example of Kenneth from our previous lecture. The model must not be used as Latin was, to scare people, to impress people. Here is a figure of a priest speaking Latin to a layperson, so that this person would not understand what is going on. The second rule is about assumption hunting: try to understand on which assumptions, included or not included, the model rests. The example I give comes from an article by John Kay, an economist and columnist at the Financial Times, who has been studying for many years the use of impact assessment in the context of transport policy in the UK. The model they use most frequently is called WebTAG, and this model takes as an input for its simulations how many people will sit in a car, on average, many years from today; here I have 2036.
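The Monte Carlo propagation just described, sampling all uncertain factors simultaneously and then asking which of them drives the output spread, can be sketched in a few lines of Python. The model and all its inputs here are invented purely for illustration; the squared correlation is only one crude sensitivity measure, reasonable for roughly linear models.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical model: output depends on two uncertain parameters
# and one uncertain input datum.
def model(a, b, x):
    return a * x + b * x**2

# Sample every uncertain factor simultaneously (not one at a time).
a = rng.normal(1.0, 0.1, n)    # calibrated parameter, small uncertainty
b = rng.normal(0.5, 0.3, n)    # parameter with larger uncertainty
x = rng.uniform(0.8, 1.2, n)   # uncertain input data

y = model(a, b, x)             # empirical distribution of the output

# Crude sensitivity measure: the squared correlation of each factor
# with the output, i.e. the share of output variance it explains.
for name, v in [("a", a), ("b", b), ("x", x)]:
    r2 = np.corrcoef(v, y)[0, 1] ** 2
    print(f"{name}: {r2:.2f}")
```

Plotting a histogram of `y` would give the grey curve of the slide; the loop at the end plays the role of the green arrows, pointing at the factor responsible for most of the uncertainty.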
But I have seen simulations where this is extended even further into the future. Now, we simply do not know how many people will sit, on average, in a car so many years from now; if we need this number as an input for the analysis, clearly something is wrong. Here we have another example of assumption hunting. In the literature, you can find assessments of the damage associated with a major nuclear accident in the middle of Europe. These authors set about to ask: are you really sure that you can compute the consequences, in terms of net effect in dollars or euros, of such an occurrence, of such an accident? And they found that there were so many uncertain variables in the calculation of the consequences of a major nuclear accident in Europe that they simply could not compute a number for the outcome. There was too much uncertainty. So this analysis could not be done; it could only be done if you simply ignored many of the uncertainties. And this is linked to the next rule, which concerns the case where you achieve a certain level of precision by ignoring uncertainty. This is what is meant by Garbage In, Garbage Out. The first formulation is due to Funtowicz and Ravetz, in the book we mentioned in the previous session. They say garbage in, garbage out is when you compress the uncertainty in the input because otherwise your output would be indeterminate, which means all over the place. Rule four is about when this analysis should be done, when a sensitivity analysis should be done. Clearly, it should be done beforehand, because if you don't do it, someone else may do it for you, and this is what will happen to you in that case. In fact, in applied econometrics they have a similar rule. They say: before you go out and present the results of your econometric inference, make sure that you have done the sensitivity analysis, the robustness analysis. Also for Peter Kennedy, the objective here is to anticipate criticism.
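The Garbage In, Garbage Out mechanism can be made concrete with a toy calculation; every number and the damage model below are invented for illustration only. Squeezing the spread of an uncertain input makes the output look precise, but the precision is spurious.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

def consequence(d):
    # Hypothetical damage model: consequences scale with the driver d.
    return 100.0 * d

# Honest input uncertainty: the driving factor spans an order of magnitude.
honest = consequence(rng.lognormal(mean=0.0, sigma=1.0, size=n))

# "Compressed" uncertainty: the same factor, but with its spread squeezed
# so that the output no longer looks indeterminate.
compressed = consequence(rng.lognormal(mean=0.0, sigma=0.1, size=n))

print("honest 5-95% range:    ", np.percentile(honest, [5, 95]))
print("compressed 5-95% range:", np.percentile(compressed, [5, 95]))
# The second range is far narrower, not because we know more, but because
# uncertainty was discarded at the input: garbage in, garbage out.
```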
And this is one of the Ten Commandments of applied econometrics. The next rule has to do with transparency, because very often mathematical models are accused of being opaque, no? You cannot look into them; it's a black box. The rule says that, if you are in a public policy setting and you are using a mathematical model, everyone must be able to use the same mathematical model, to check whether they would get different inferences with plausible changes in the inputs or the parameters. I have taken this quote from an American report from the Office of Management and Budget, but this discussion is also going on in Europe, in the impact assessment context. The next rule has to do with doing the right sum, and this relates to the issue of framing. We will discuss framing extensively in one of the next lessons, but here what I want to say is that one has to be sure that the aspects of the problem captured by the mathematical model are really those which are relevant to the issue being considered, and not some aspects which we simply selected because we had the model to do it. Also in econometrics we have a similar rule, because you normally apply econometrics to social issues, trying to find out how you can improve a certain social situation, and hence it is important to ask the right questions. This, too, is one of the famous Ten Commandments of applied econometrics. We now come to the last rule, and the last rule prescribes how sensitivity analysis should be done. As you see, the ritual is to do it by varying one factor at a time, and this is hopelessly wrong. If you change one factor at a time, you change factor one, then factor two, and every time all the other factors are kept fixed; doing this, you merely scratch the surface of the uncertainty, but you do not assess the uncertainty of your system in any depth. I will try to illustrate what happens here.
Imagine that you have to use a scaffold made of two ladders and a plank. Of course the behavior of the scaffold is different from the behavior of the two ladders taken in isolation and then superimposed. This is a way of saying that parameters have effects which only emerge when you allow them to vary together, because they exhibit modes of behavior which you would not otherwise capture. And this concludes this section, the last section of lesson number two.
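The ladders-and-plank point, that effects can appear only when factors move together, can be seen in a toy model built around a pure interaction term; the model is of course an invented example. Varying one factor at a time never activates the interaction, so the model appears inert.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x1, x2):
    return x1 * x2            # pure interaction: each factor alone does nothing

base = np.array([0.0, 0.0])   # nominal point

# One-factor-at-a-time (OAT): move each factor away from the baseline alone.
oat_effects = []
for i in range(2):
    x = base.copy()
    x[i] = 1.0
    oat_effects.append(model(*x) - model(*base))
print("OAT effects:", oat_effects)   # both zero: the model looks insensitive

# Varying both factors together reveals the interaction.
x1 = rng.uniform(-1, 1, 10_000)
x2 = rng.uniform(-1, 1, 10_000)
print("output variance, all-at-once:", np.var(model(x1, x2)))
```

The OAT sweep reports no effect at all, while the simultaneous exploration shows a clearly non-zero output variance: the behavior of the "scaffold" is invisible when the "ladders" are tested one by one.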