Hi, and welcome back to the penultimate training video. We're still in continual improvement, and we're looking at being able to demonstrate compliance. Yes, we've talked about demonstrating compliance several times, but now we're taking the results of the check phase and really trying to work out what's changed and how to evaluate where we are: what to do with those monitors and metrics to make sure we can move forward into the future. We've already talked about whose expectations we're trying to meet and what we're trying to deliver, so let's think about that audience for a minute. Who are your stakeholders, and what do they want to know? Who are we reporting to? A regulator, perhaps a privacy regulator or a health regulator? Are we reporting to management, the head of the privacy team, the head of the security team? Are you reporting to the heads of other business units, to data subjects, to shareholders? What do they want to know? Ask them. Ask management what they want to know. In fact, I'm quite often amazed by the number of privacy teams that don't actually go to management and ask: what do you expect of us, what do you want us to do, what are your concerns? Is it just that you don't end up in court? Go back to those questions we talked about at the start of the course: who are you, what are you trying to achieve, what are your priorities? Is it healthcare for the individuals, is it being best in breed, is it just not ending up in court? What do you want to know, and what do we need to tell you? Management are spending money on you as a privacy division, and they're going to want some sort of return on investment. So how do you demonstrate that? How do you demonstrate return on investment, and how do you demonstrate your program maturity?
And I think program maturity is a really good thing to talk about. How are you going to prove that you manage risks? Remember, a lot of people define compliance as binary: are we complying with the law, yes or no? That's not really what a privacy program is about. It could be about multiple laws across multiple jurisdictions across the globe. Or it could be that we're going to ignore the law and take that risk; we just want a privacy program that safeguards individuals no matter what the law says. So what are we trying to achieve, and how do we demonstrate whether our program is mature or not? Are we going to evaluate the privacy program itself, or the artifacts of that program? By artifacts I mean the policies, the procedures, the training, the requests, the actual things that you're delivering. Ultimately, the organization wants its risks managed, so remember what those risks are, what you're trying to prevent, what you're trying to do for the organization, with safeguarding the individual at the heart of what you do. Produce the metrics and measurements that support that, and once you have the metrics and measurements, we can analyze them. We can look at how they change over time, and I think that's quite an important thing: are there any spikes or changes? There are trends going up or down, and there are events or cycles going on. So look out for the term trend analysis; you might see it in the exam. It's the process of analyzing your measurements over time. Think about the different sorts of patterns that trend analysis might reveal. Let's have a look at a couple of graphs. Here's a graph of the frequency of some measurement or metric over time; let's say it's the number of subject access requests, for example. So what are we seeing here? Here are our measurements, and what we're seeing is growth.
Yes, we're seeing that the number of subject access requests is going up, which makes sense: we received one on day one, and you can see there's a blip here. Let's imagine the axis along the bottom is months, so midway through the year, or just before, there's a peak of access requests. Now, if we did this every year, we could ask: is that peak always in the same place every year? Is there an event or a cycle behind it? Could that event be a one-off, like a new law? Say the GDPR came into force in May 2018, and we see a huge peak in subject access requests in June and July 2018. The next year there's no such peak, so we can put it down to that. Or does this happen every year? We send out our annual reports in April, so in May we get a huge spike in the number of subject access requests we receive. I deal with a few different companies with similar patterns. Education providers and exam providers, when they release exam results to students, then get a peak in subject access requests, which makes sense. Or when a company sends out its annual employee tax summaries, it might get a peak in subject access requests; again, that makes sense. So what are the measurements and metrics telling you? Does that help you plan for the next year? Can you plan your resources accordingly? Really useful stuff. Here's another classic pattern: recurring peaks, perhaps at the end of every month, or the end of every semester, or every quarter. Can we correlate them to an event or a cycle again? This is more of a cyclic pattern, where at the end of every quarter we get some sort of peak. Or there might be complete breaks where you get nothing for a while and then sudden bursts. Or the graph might give us a fairly constant picture: things aren't changing, things aren't going up or down. So don't forget trend analysis; it's quite an important thing to do.
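To make the idea concrete, the sort of spike-spotting described above can be sketched in a few lines of Python. The monthly counts, window size, and threshold here are invented illustrations, not figures from the video: the point is just that once you record a metric over time, flagging a month that jumps well above its recent baseline is straightforward.

```python
# Minimal trend-analysis sketch: monthly subject access request (SAR)
# counts, compared against a short moving-average baseline.
# All numbers are hypothetical example data.
from statistics import mean

# Twelve months of SAR counts; note the spike in month 4 (May),
# the month after the hypothetical annual reports go out.
monthly_sars = [4, 5, 6, 5, 18, 7, 6, 8, 7, 9, 8, 10]

def spikes(counts, window=3, factor=2.0):
    """Return indices of months whose count exceeds `factor` times
    the average of the preceding `window` months."""
    flagged = []
    for i in range(window, len(counts)):
        baseline = mean(counts[i - window:i])
        if counts[i] > factor * baseline:
            flagged.append(i)
    return flagged

print(spikes(monthly_sars))  # -> [4]: only May stands out
```

Run the same check over several years of data and a peak that recurs in the same month each year points to a cycle (annual reports, exam results), while a one-off peak points to an event such as a new law coming into force.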
When it comes to your program, I have to say it's not about compliance but maturity. People get really hung up on this compliance idea: do I comply, can I put green in all the boxes? Well, we've already said that data protection doesn't work like that; you can't put green in all the boxes. You can't ever claim that you're not holding any data longer than you're supposed to. So think in terms of program maturity instead. What I tend to do is come up with about 50 different boxes for data protection topics. You might have a box for the privacy office and how it's planned, a box for risk assessment, a box for DPIAs, a box for access requests, a box for erasure requests, a box for documentation, a box for privacy notices. Those boxes might change depending on what framework, law, or area you're dealing with. And then for each of those 50 boxes, don't just ask "do I comply, yes or no". I think that's really dangerous, because you can say that everything is green and then walk away whistling, and that's not a continually improving privacy program. What I do instead is look at program maturity, and I use a nought-to-five scale. Here's an example. Nought: the topic does not apply, so you don't have to worry about it. One: there's no organized way of dealing with it; there might be individuals who do something, but generally speaking it's inconsistent, or nobody does anything. Two: we're trying to get something done; we might have some documentation, we might have tried to bring in a policy or procedure, but there's evidence it's not dealt with consistently. It might be dealt with inconsistently, but we have got something in place. Three: this is where we say it's in place and consistent; staff are trained and doing it, and it's likely dealt with properly everywhere across the organization. Three is a good place to aim for.
Four: we're getting better, and we're doing the check and act work now. We're monitoring it, measuring it, reporting on it to management, and looking at the results of that evaluation, then taking improvement actions. And up at five, we're starting to optimize and improve: we might have automated it in some way, or have some sort of world-class solution. Now clearly, you don't want everything to be a five. If you never receive a subject access request, why would you want that area to be a five? You might be happy with it being a one or a two. So what I tend to do with these program maturity models is go in and speak to the management and say: look, here are your 50 buckets; who are you, and where do you want to be? "I want to be a three for that, a four for that, and I'm happy with a two for that." You let them establish their own appetite, and then you work off the delta. You say: okay, you wanted to be a four and you are a two, so there's work to do. It could work the other way as well: you wanted to be a two and you are a four, so you might actually want to reduce the effort you're putting into that area. The idea, of course, is that you're not just looking at your data protection compliance in terms of the law; you're evaluating the program as a whole, the program maturity as a whole. So we've got our review. We've taken our check phase, which we looked at in the previous video; we've looked at all the monitoring and measuring we're doing in the check phase, all those monitors and measures, and we're reviewing what we could change for the act phase: any corrective actions we need to take, and any preventive actions we need to take to eliminate root causes. It's not enough to look back, however. I think it's also important in that management review to look forwards, to do some horizon scanning.
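The appetite-and-delta exercise described a moment ago can be sketched as a small comparison of target maturity against actual maturity per topic. The area names and scores below are hypothetical examples; the 0-to-5 scale follows the levels given in the video, and a positive delta means more investment is needed while a negative one suggests reducing effort.

```python
# Sketch of the target-vs-actual maturity comparison.
# Areas and scores are invented examples on the video's 0-5 scale.
target = {"privacy office": 4, "DPIAs": 3,
          "access requests": 2, "privacy notices": 3}
actual = {"privacy office": 2, "DPIAs": 3,
          "access requests": 4, "privacy notices": 1}

def maturity_deltas(target, actual):
    """Return (area, target, actual, delta) for each area, where
    delta = target - actual. Missing actual scores default to 0."""
    return [(area, want, actual.get(area, 0), want - actual.get(area, 0))
            for area, want in target.items()]

for area, want, have, delta in maturity_deltas(target, actual):
    if delta > 0:
        status = "invest"
    elif delta < 0:
        status = "reduce effort"
    else:
        status = "on target"
    print(f"{area}: target {want}, actual {have} -> {status}")
```

The management conversation sets the `target` values (their appetite); the assessment fills in `actual`; the deltas then drive where the program's effort goes next.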
And this has become really important as well. Once we know what's in the past, that's great, but what does the next year, the next period, hold? How do we need to change our objectives and plans for what's coming up? What does the future hold, and what can you plan for? Is the business going to change, with new products and new technology you'll have to deal with? Are laws and regulations going to change? Are there any recent court cases, current events, things that people are worried about? What's going to happen over the next year that will change your privacy program? So really, that transition from the check phase to the act phase is about looking at the past, looking at the future, and then deciding what actions we need to take to improve. The final session we're going to look at is audits and assessments of where you are, and once we've done that, that will be the end of the course. Thank you.