One of the biggest issues you're going to have to deal with if you start doing predictive analytics is the organizational issues. These can be politics, these can be technology, but ultimately remember what we're doing. We're taking data, big data if you have it, and we're estimating some models. That requires technology. Then, based on those models, I'm going to start changing my organization, and guess what? Some people are going to win and some people are going to lose. You may find out that an intuition or hypothesis you had is completely wrong, and the person who has basically made their career on that intuition is not going to be too happy about it. So here are some of the issues you need to think about if you're going to start doing predictive analytics.

First, some of the common reasons why companies have trouble doing this right. One is insufficient model development. You can do data mining, and there are a lot of benefits to data mining, but ultimately what you want to know is: what leads to what, and what actions can I actually take as a company? So even if you do the data mining, you still need to start peeling back the onion, which means thinking about what the linkages are. For my strategy, what do I think the causal business model is? How am I going to implement it? Do I think that A is going to lead to B, which is going to lead to C? You need to lay that out.

The other problem I've seen in a lot of companies is that they ask, what are best practices? Or, I saw this benchmarking model, or this generic measurement framework like the balanced scorecard, so I'm going to use that to pick my performance measures. Well, there are problems with that. Strategic advantage means you're doing something different from your competitors, so the last thing I want to do is benchmark myself and do exactly what they're doing. The other problem with generic measurement frameworks is that they're generic: they say everybody should be doing this. What you want to do instead is tailor both the measures you track and the analysis you do to the strategy of your company. If I think I'm going to compete on a different dimension, I do not want to be doing what the other guy did. So there's going to be a lot of up-front work, starting with your strategy, to decide which analytic models you want to estimate. It's not purely statistics. And based on that, what are the actions that we as a company can take?

Another problem is with the measures themselves. Measures can be good, and measures can be really bad. A couple of reasons you might have bad measures come down to what are called psychometric properties. One: does the measure really pick up what you claim it's picking up? You're trying to estimate some construct, and you have to find some way of measuring that intangible thing. Is the measure you have really picking up the intangible you care about? Two: is the measure influenced by so many other things that it bounces up and down all over the place, so that I have no idea whether it means you're doing well or not?

This really becomes a problem when you start using survey questions. One issue is that you may have too few questions. For example, "How satisfied are you, a little or a lot?" and that's the only question I ask you. That doesn't help me much afterwards,
because even if I find a relationship, I don't know which dimensions of satisfaction you were answering about, and I don't know what to do as a manager. Another issue with scales is having too few scale points. If you're using one to three (not satisfied, very satisfied, or in the middle), that doesn't tell you a lot. Or you use what's called a top-box measure: say your satisfaction scale goes from one to five, what percentage of people are at five? That may be fine, as we saw in some of the examples; in other examples it may not be. You need to do the analysis of whether moving everybody to the top box even makes sense.

What you would really like are measures with a good signal-to-noise ratio. Signal means the measure is responsive to managerial actions: when managers take an action I like, it goes up; when they take an action that's not good, it goes down. Noise means it's not affected by all kinds of other stuff outside your company's control, just bouncing up and down. If you can pick, or analyze, measures that have high signal (they respond to the actions we take) and low noise (they're not driven by things outside our control), that's going to help you when you do the statistics, because you're more likely to find the relationships that are really there. There's a small sketch of this idea below.

Some other reasons people have trouble doing these analyses: you're measuring the wrong attributes. Take customer satisfaction again. You can ask me whether I like the facility. I could answer that, but that question may have nothing to do with whether I'm going to come back or not. So when you do the analysis, make sure the things you're measuring actually impact behaviors. Again, that's peeling back the onion.

Some other problems are really organizational. The issues above are about whether your measures are good or bad. Part of the problem is that it's not that we don't do analytics in companies. We do analytics, some of it more rigorous than the rest, but we do it in little pockets, or what we call islands of analysis, or strategy silos. Everybody's doing their own little analysis, the marketing people, the strategy people, the operations people, and nobody ever talks to each other. You're all making action plans based on your own analysis, without looking at how they're interdependent, or even at how the analysis done in one part of the department impacts the analysis done by another part. Strategy is trying to tie all this together, so you need to think about how you can do that.

The other thing is, you need to figure out what your intuition is in the first place. You may not think about it this way, but you do have hypotheses about what's going to make your company work; that's called strategy. You do have intuition that if this happens, something else is going to happen, but in a lot of cases we never actually write that down. Well, to do the analytics and to figure out which model to estimate, you should write down what your intuition is. What do we think is driving success? What are our hypotheses? Spend the time up front figuring out which relationships you want to test before you go off and start doing the statistics.

Another problem, as companies put it, is lots of data, no information.
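To make the top-box summary and the signal-versus-noise idea concrete, here is a minimal sketch in Python. The survey data, the "action" flag, and all of the numbers are simulated and purely illustrative, not from the lecture; the point is simply to show two summaries of a 1-to-5 scale and a rough split of the measure's variation into signal (the part that moves with a managerial action) and noise (everything else).

```python
import numpy as np
import pandas as pd

# Hypothetical survey data: one row per respondent.
# 'score' is a 1-5 satisfaction rating; 'action' flags whether the
# respondent's location had rolled out a new service initiative
# (illustrative names and simulated numbers).
rng = np.random.default_rng(0)
n = 1000
action = rng.integers(0, 2, size=n)
raw = 3.2 + 0.6 * action + rng.normal(0, 0.9, size=n)
df = pd.DataFrame({"action": action,
                   "score": np.clip(np.round(raw), 1, 5).astype(int)})

# Two common summaries of the same measure.
print(f"mean score:    {df['score'].mean():.2f}")
print(f"top-box share: {(df['score'] == 5).mean():.1%}")  # % giving the top rating

# Rough signal-to-noise check: how much of the variation in the measure
# moves with the managerial action (signal) versus everything else (noise)?
r2 = df["score"].corr(df["action"]) ** 2
print(f"signal share: {r2:.1%}, noise share: {1 - r2:.1%}")
```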
Data is really not the problem. Think about most companies: we've got more data than we know what to do with, and it's getting even worse now because storage costs are so low, so we save everything. What you need are resources with the appropriate skill sets to actually analyze it, to turn data into information. Skill sets doesn't just mean great statisticians. Yes, you need those, but you also need the business people who can tell the statisticians, here's what we need to estimate, here's what we need to know. Ultimately the predictive analytics need to be turned into action plans, and that requires some mix of business skills and statistical skills.

The other thing is, you need to dedicate resources to this. It's not like most of us have spare time in our day to do analytics. Where are the resources going to come from? Ideally, what you'd like to do up front is what I call a proof of concept. Pick off a small analytics project, do it, and show that it works. Pick an area where you're pretty sure you can learn a lot. Once you do that, you're going to start getting resources committed to it. Instead of saying, I'm going to analyze the whole company in one fell swoop, start out small, learn how to do it, and show proof of concept.

Then there's the issue of technology. You could have the greatest database in the world: we went out and bought Oracle, we bought SAP, we have a relational database, and in theory all the data is supposed to be in it. In theory. In practice, how you can actually access the data depends on how you coded it. Here's the problem I've had in companies. You get in there and they say, we want to link employees to customers to whether the client decided to keep our contract. Fine: you've got data on the client, whether they renewed or not. You've got data on customer satisfaction. You've got data on employees. Well, it turns out the employee data was never coded at the client level, so I can't match it up. The customer satisfaction survey was for the whole client, but you have 14 different projects with that client and they're only doing one at a time. You need to think in advance about how you're going to code the data before it goes into the database. Think forward: four or five years from now, I may want to do analytics. Make the data as granular as possible when you put it into the database. You can always roll things up; you can never roll them back down. There's a small sketch of this below.

The other thing you should check is, when somebody reports something like defect rates, is everyone defining it the same way? We've gone into companies, and here's an example from an automaker: two different auto plants in the same company each defined defects differently. Trying to match those up, when they didn't even define defects the same way, is going to be hard to do. So that's another problem you can run into.

Another big problem: remember, ultimately we're trying to predict financial outcomes. The trouble is, most of your financial data is going to come out of your accounting system, and some of the outcomes you want to predict are not things that naturally come out of an accounting system. A good example is customer profitability. Not how profitable the product is, not how profitable the division is, but for this customer, am I making money on that customer or not?
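Here is a minimal sketch in Python of both points: coding data at a granular level, and rolling it up to something like customer (client) profitability later. The client, project, and employee identifiers and the dollar amounts are made up for illustration; the takeaway is that granular records can always be aggregated upward, while company-wide totals can never be broken back down.

```python
import pandas as pd

# Hypothetical transaction-level records, coded at the most granular level
# available: client, project, and employee ids are all illustrative.
rows = pd.DataFrame({
    "client_id":   ["C1", "C1", "C1", "C2", "C2"],
    "project_id":  ["P1", "P1", "P2", "P3", "P3"],
    "employee_id": ["E7", "E9", "E7", "E3", "E4"],
    "revenue":     [120_000, 0, 80_000, 95_000, 0],
    "cost":        [70_000, 15_000, 60_000, 40_000, 22_000],
})

# Because every row carries the client id, rolling up to client-level
# profitability is a one-line aggregation ...
by_client = rows.groupby("client_id").agg(
    revenue=("revenue", "sum"),
    cost=("cost", "sum"),
    employees=("employee_id", "nunique"),
)
by_client["profit"] = by_client["revenue"] - by_client["cost"]
print(by_client)

# ... but if the data had only been stored as company-wide totals,
# there would be no way to "roll it back down" to the client level.
```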
The catch is that this is not naturally the way we gather data in an accounting system. So until you can start actually gathering financial data that way, it's going to be hard to estimate the model you want. Maybe you decide you want to do this in the future and start tracking it that way, but until then, that is a limitation.

Finally, politics is everywhere. You need to worry about things like data fiefdoms. Data is strength, and different parts of companies don't like giving it up. How can you get different functions to start sharing? To actually do this analytics, linking non-financials to financials, I need the finance people, the marketing people, and the operations people to start sharing data. That is not always easy. We've had companies where we had to go to the very highest levels of the C-suite to get different functions to share the data.

The other issue is, what do you do if your intuition doesn't appear to be true? Here's somebody who's made their career on their intuition, and now you tell them, hey, maybe that intuition isn't right. A lot of people don't want to know the answer: I've staked my career on this relationship, it had better be there. All you can do is either show me I'm right, which I already know, or tell me I'm wrong, which I don't want to hear. So you've got to worry a bit about the politics up front, and about the organizational power issues. Is power going to shift when you start saying the money should be invested here versus there? What's going to happen? Given those issues, this can be a little tricky, but ultimately it doesn't change where things are going. We are going to start doing analytics, like it or not. You need to at least think about the politics up front, but ultimately we're going to do it.

So, to conclude, here are the key questions you want to ask. First, what is the firm's business model in the first place? Forget the analytics and start with your strategic plan. Ask yourself how, specifically, this company or this business unit is expected to create value for the organization. Lay out that causal business model explicitly: how is A expected to lead to B, which is expected to lead to C?

Once you've done that, you can ask: if that's the model I want to test, what data do we currently have available to test these value propositions? As I said, for most companies data is not the problem; information is. My recommendation is, don't reinvent the wheel and don't start tracking new data. Try to find data you already have in the organization that's close enough to what you need, and start testing these relationships. There's probably something there already.

Then ask yourself, what is the desired economic outcome you care about? It doesn't have to be profits. It could be revenues, or revenue growth. If you look at contracts, it could be whether we won the contract or not. It could be retention rates. Whatever the economic outcomes are, based on them you can start gathering the data and doing the analysis. A small sketch of what testing such a chain might look like follows.
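To make this concrete, here is a minimal sketch in Python of testing one such chain. It assumes hypothetical client-level data with simulated numbers: an employee engagement score (A), a customer satisfaction score (B), and contract renewal as the economic outcome (C). None of these variable names or values come from the lecture; it simply illustrates estimating the two links with a linear regression and a logistic model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical client-level data for the chain
#   employee engagement (A) -> customer satisfaction (B) -> contract renewal (C).
# Names and numbers are illustrative, not from the lecture.
rng = np.random.default_rng(1)
n = 300
engagement = rng.normal(3.5, 0.5, n)                              # A: survey score per client team
satisfaction = 1.0 + 0.7 * engagement + rng.normal(0, 0.4, n)     # B
renew_prob = 1 / (1 + np.exp(-(-4.0 + 1.1 * satisfaction)))       # C
renewed = rng.binomial(1, renew_prob)

df = pd.DataFrame({"engagement": engagement,
                   "satisfaction": satisfaction,
                   "renewed": renewed})

# Link 1: does engagement move satisfaction?
m1 = sm.OLS(df["satisfaction"], sm.add_constant(df["engagement"])).fit()

# Link 2: does satisfaction move the financial outcome (renewal)?
m2 = sm.Logit(df["renewed"], sm.add_constant(df["satisfaction"])).fit(disp=0)

print(m1.params)   # estimated effect of engagement on satisfaction
print(m2.params)   # estimated effect of satisfaction on the odds of renewal
```

If both links hold up in your own data, you have evidence for the A-to-B-to-C story; if one of them breaks down, that tells you where to start peeling back the onion.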
A really big question to ask yourself is, what's the appropriate unit of analysis? It may not be the whole firm; in fact, it may be very hard to do this at the firm level. Is it an office, a plant, a region? As you've seen in other videos, it could be customers, where we do customer analytics. It could be a product or service, a program or an initiative. Figure out, for the analysis you're trying to do, what the appropriate unit of analysis is. It doesn't have to be down at the level of individual customers, and it doesn't have to be the whole company in total. Depending on the question you want the analytics to answer, it could change.

And finally, you would like to make this an ongoing process, not something where we did it once and here's the answer we got. Strategies change, competitors change, the business world changes. You need to keep doing the analytics to ask, does this relationship still exist? How can you set up an organizational mechanism to ensure you have this ongoing analysis? Let me give you an example of what some companies are doing. They set up quarterly meetings with their high-level executives. In each of those meetings, they lay out some hypotheses they would like tested by the next meeting. The analytics group goes off and tests those hypotheses, and presents the results the next quarter. I guarantee you that when you present the results, a whole bunch of new questions come up, which lead to the next set of hypotheses to test. By doing that, you've got an ongoing mechanism for updating the results, and for peeling back the onion until you get to the point where you can answer these questions and ensure that predictive analytics becomes embedded in your company, as opposed to being a one-off exercise you do maybe this year, maybe next year. No. This is an ongoing process, and as the world changes faster and faster, you need an ongoing mechanism.

And with that, I'd like to thank you for listening, and I hope you can take some of the things I've taught here and use them in your companies. Predictive analytics can be an incredibly powerful tool, both for figuring out how your strategy is working and, more importantly, for figuring out where the biggest financial payback is when you start linking these non-financial measures to the financial measures.