Last time, we talked about how secure software design can be simply good design. At the end of the lesson we enumerated a number of design activities that are considered good and which should also help with security. In this lesson, we'll go over each of these items in detail. At the architecture level, we'll talk about identifying and mitigating risks, drawing pictures, and building backup and recovery into the system.

At the high level, or architectural level, we talked earlier about identifying risks in the project, and then mitigating those risks. This means looking across the whole project and thinking about what might be hard to do. What's hard and easy to do, of course, is going to vary by person. But realize that, as the designer, you have to understand how to do the whole project, and recognize which parts are hard for you to understand. This is not a time for prowess. This is a time for honesty. I'm sure you can understand how this is a good design idea, but where's the security in it? Quite simply, if you don't understand the whole problem and a difficult situation ensues because you hadn't identified the risk early on, you're increasing the likelihood of a budget-crunch, time-crunch situation where you're forced to move faster than you really can to get things done. And that's the security risk.

We talked about drawing pictures. Why do we have to draw pictures? Let's get the cause and effect straight here. Drawing pictures won't make the project successful. However, in successful projects, people will have drawn pictures. Pictures represent vision, pictures represent understanding. Pictures are often the easiest way to communicate to someone else what the project is all about. So, if you don't have pictures or diagrams in your project, don't make a pronouncement to everyone and demand that they draw pictures. Rather, figure out why the pictures aren't being produced and fix that. Is it the lack of a design tool?
Is it a lack of common vision that pictures would make embarrassingly clear? Chances are, if you fix the underlying problem, pictures will appear. What's the security angle for pictures? Well, pictures mean a common understanding of the problem and of its solution. That's a good thing in and of itself. It also means that more minds are thinking about the situation. And the more minds that are working on the problem, the more likely it is that security holes will be found.

Next, building backup and recovery into the system. How can it go down? How can we get it back up fast? These are questions you ought to ask. Depending on the architecture and criticality, you may need to plan for database backup, database replication, and database failover. Think about how data can be lost in failure and recovery scenarios, and make design changes to prevent unacceptable losses. Even if you're not using databases but simple data files, the same principle applies: how can I lose data in a failure, and how can I prevent that? I wrote an application once to manage and track runners during a day-long ultramarathon. I couldn't exactly envision a failure mode that would lose data, but I compensated for one anyway, just because I always do. On race day, the application failed, because some other application locked up the operating system on the laptop it was running on. It had to be shut off with the seven-second power-off, and then restarted. The data came back just fine, because all user-entered data was flushed to data files with each transaction. The security implications of this are obvious.

On the detailed design level, let's consider all the places in the software where free-form input can be encountered. Some of these can be subtle. Packets from the internet, downloaded documents, and drag-and-drop inputs can all be considered free-form. Basically, if the input wasn't built by your software, it's free-form.
When you identify such inputs, you'll need to design a way to validate that the inputs are in proper format and won't result in anomalous processing, which could damage your system. Now, I can see arguments that this is an implementor's problem, not the designer's. However, if you leave the solution to implementors, you can get different solutions to the same problem when it appears in different parts of the software. Best to have consistency, leverage economies of scale, and benefit from thorough testing by designing one solution to each problem.

Failing to identify sensitive data that should be protected is a failure that I've seen time and time again. There was a time when using social security numbers as primary keys in a table of users, or students, or, well, people, was okay. That time is long past. People should be identified in databases by numbers that have nothing to do with them otherwise. Addresses, phone numbers, health information, next-of-kin information, and the like should be encrypted. Database encryption is something that is frequently not done, despite everyone's knowing better. The reason? People don't know how to do it. So, this is one of those hard-to-do things that you need to prototype and figure out how to do.

Next, we'll look at design practices that involve working in groups. Adherence to coding standards is important for a number of reasons, security being only one of them. It makes the code more easily understandable when reviewed by others. You could also specify that certain security-aware language constructs be used, such as C language calls that specify buffer length, so that there are no buffer overflows. You could also specify, if you're using C or C++, for example, that no local string declarations are permitted in subroutines. String space can be allocated on the heap as an alternative. Enforcing coding standards is another issue.
I have been particularly fond of source code parsers designed to look for violations of coding standards. These aren't so much formatting standards as they are code structure standards. Following each if predicate with a pair of braces is one example. The security implications here are easy: it makes it more likely that reviewers will find mistakes in the code, and it takes the burden of detailed code checking off the people and puts it onto the machine.

Recording design decisions is an activity which, like the production of drawings, doesn't necessarily mean the project will succeed. You can document lots of bad decisions. However, good projects share this attribute. So, if you're not documenting design decisions, the question is, why not? If the project is a small one, it's perhaps the case that there aren't many decisions to be made. However, I have been on well-compensated projects where I am the only software person: designer, coder, and tester. And I document decisions. The reason I do is that I hope to move on from the project, and it will then become somebody else's responsibility. Documenting the design decisions is one of several ways to effectively communicate the essential history of the project to somebody else. Another reason to document design decisions is that writing them down generally involves justifying or explaining the decision. Doing this tends to lead to better decisions. And occasionally, even on small projects, there is a liability aspect, which means that if you made a design decision, you have to be able to recall when and why you made it. Writing it down is a good way to keep that record. Of course, on large projects where design decisions may be made by several people, a centralized design decision log will help keep everyone informed as to why the project is being designed a certain way.
It also helps prevent investigating the same problems over and over, because there is a record of the justification for the decision.

Ensuring good team communications might seem obvious. I think the relevant question to ask here is, "What constitutes good?" Alistair Cockburn, writing about agile software development, emphasizes several aspects of communication on a team. The most important is the speed at which questions can be answered. If it takes minutes to get a question answered (this is the argument for an onsite customer representative), then the design is more complete and less likely to contain substantial errors. If it takes a day or more to get a question answered, the project can sometimes be fatally flawed. This is because designers sometimes never ask questions, knowing it will take a long time to get them answered. And if it takes a day or more, the interaction likely won't be face-to-face, and a lot of information can be dropped just because of the communication medium.

The other dimension of good which Cockburn talks about is what he calls "osmotic communication." This is communication among engineers that occurs not deliberately, but incidentally. When the designers, the requirements people, and perhaps the developers are co-located, people overhear conversations, and this is incidentally, osmotically acquired information. It can be critical to good design. In my career, I've overheard hundreds of conversations which led me to believe that something wasn't right: either I didn't understand the situation, or somebody else didn't. Investigating those situations further (and you have to be unafraid to do that) inevitably either sets things right or creates a reason to rework some part of the design. Getting quick answers, and having everyone hear everything, is often a challenge.
One remedy that works well is the caves-and-commons concept, where workers have areas they can retreat to so that they can think and work without distraction, and where there are large open spaces for group work, discussions, and people who prefer to work with the buzz of activity around them. To this end, roll-around desks, whiteboards, chairs, and partitions can be employed in large open spaces to offer nearly limitless reconfigurability to meet communication needs as a project progresses. The security angle here is that if you get the design right the first time, or almost the first time, you'll be in better shape at the end of the project when testing arrives. Also, the more minds that are involved in the project, generally, the better.

The last group of considerations is synergy: making the design a product of the designers' experiences rather than just their sum. This is often a matter which is in place before the work begins. It has to do with the personalities of the people on the team. In this regard, respect is a huge attribute, such that if it is lacking, the person should not be on the project. When I was in charge of hiring new developers, there was the usual process of interviewing candidates with a number of people. I started picking very junior people, who weren't very knowledgeable or didn't appear to be, as interviewers for one of the final interviews. The candidate would inevitably discover that the interviewer didn't know much. And if the candidate factored that into answers, such that the interviewer would learn something, I was pretty sure to hire that person, assuming the other interviews had gone well. There were certainly cases where the last interview cost the person the job. Attempts to show that the candidate knew so much more than the interviewer weren't a good thing. So, respect has to be part of the chemistry of a good team. To be sure, it takes all types. This is the diversity issue we discussed earlier.
For about 16 years - yes, this is another story - I did undergraduate admissions interviews for Harvard University. Harvard has a lot of applicants: about 35,000 for the 6,350 beds in the rooms in Harvard Yard where the freshman class lives. There are more applicants with perfect S.A.T. scores than there are admissions slots. But the school doesn't want a freshman class where everyone knows how to take the S.A.T. test. Other factors are necessary. We sometimes referred to some people as bricks: people who were excessively brilliant, who got their perfect score on the physics S.A.T. as freshmen in high school. And some as mortar: the ones who would pay attention to their roommates' moods, resolve conflicts, and offer study assistance to students who weren't getting it. Very often, the words "chemistry of the class" came up. That had to do with making sure that there was a good mix of personality and experience.

The same chemistry, or brick-and-mortar analogy, holds for project teams. To be sure, you need way-smart people, but you need more than that. Sometimes, you need people who care about the way-smart people, because the way-smart people sometimes can't care for themselves. It's an interesting world. But this is what needs to happen to have a good project team and to make a good design team.

Here's a security aspect we haven't talked about before: insider threats. People on a project can sometimes be co-opted into doing bad things to the project. We've certainly seen evidence of that in a big way with the Bradley (Chelsea) Manning case and with the Edward Snowden case. These were very dissatisfied employees. Building a team where there is cohesion and satisfaction goes a long way to combat this kind of project security weakness.

Okay. We'll break here, because this has been a long lesson. When we return, we'll continue our enumeration of good design practices that result in more secure software. Thanks.