[MUSIC] Welcome back. In this video we will discuss different approaches to defining privacy. In 2012, Target sought to answer the question: if we wanted to figure out whether a customer is pregnant, even if she didn't want us to know, could we do that? It turned out the answer was yes. Target was able to predict whether someone was pregnant based on her past purchases. Armed with this new knowledge, Target then sent advertising to her house assuming the customer was pregnant. This also revealed to the other people in the house that the targeted individual was pregnant. But what is privacy? What does it mean to say that an act violates my privacy? While Target legitimately stored consumer purchase data, this particular use was a privacy violation. It turns out that how scholars and regulators define privacy varies, and as we will see, how you define privacy impacts how you protect it and who counts as having a right to privacy. Traditionally, two approaches to defining privacy dominate our privacy discourse. Both have limitations. The restricted access view of privacy suggests that privacy is that which is hidden, that is, where access is restricted, and that all that is not hidden is therefore not private. The restricted access view captures our instinct to want that which is hidden or closely held to be protected. When individuals go outside, use apps, shop online, or file an insurance claim, this version of privacy would suggest that they no longer have any privacy expectations, since access to their information is no longer restricted. However, this version misses that we regularly have expectations of privacy around information shared with friends, doctors, teachers, coworkers, Target, and online. The control view of privacy defines privacy as the degree of control someone has over their person and information. The more control you have, the more privacy you have.
We see this in surveys, which sometimes ask about the degree of control someone thinks they have over their information. In the United States, the FTC's Fair Information Practice Principles are how we put privacy as control into practice. The FTC requires companies to provide adequate notice and consent for users of their services online. Consumers exercise control by reading the notice and deciding whether to engage with the website or app. Unfortunately, this provides an incentive for companies to write convoluted and ambiguous notices that do not explain how consumer information is gathered, stored, sold, shared, and used. In fact, notices are not read or understood, and people tend to project their own privacy expectations onto notices, assuming their privacy is being respected whenever a notice is present. How we define privacy is important to the ethics of technology. For data analytics, the privacy expectations of the individuals in the data are critical to understanding whether a particular data set is ethical to use. Both versions of privacy, the restricted access view and the control view, place an enormous focus on the handoff of information to others. In other words, when information is turned over to a person or company, access to that information is no longer restricted and the individual no longer has control over it. Both views find privacy to be diminished or nonexistent when information or people are in public or interact with a third party such as a website, app, or company. The idea of privacy in public is not novel. The original story of Peeping Tom was actually a story about privacy in public. The folk tale around Lady Godiva is that after months of begging her husband, the local earl, to lift the onerous taxes on his people, he finally dared her to ride through the town naked on her horse in exchange for lifting the taxes.
All the people in the town were to turn their backs, shut their windows, and avert their eyes while she rode through town. Tom, a tailor in town, was the only person to turn and look at her as she rode by. We have since known him as Peeping Tom: this individual violated the privacy expectations of Lady Godiva while she was in public. In fact, the oft-cited privacy paradox is a misnomer. The privacy paradox is the perceived mismatch between individuals' stated privacy expectations, captured in surveys, and their behavior in practice, normally captured as whether or not they go online. In other words, why do people go online and shop in grocery stores if they care about privacy? However, when asked in a more robust survey, individuals differentiate between disclosing information to a website and allowing their information to be tracked, collected, shared, sold, and used by third-party data brokers and ad networks. In fact, a recent study showed that, contrary to common depictions of online sharing behavior as careless, people care about privacy and take steps to protect their information, but face an uphill battle against all the trackers, data aggregators, and ad networks online. Helen Nissenbaum directly addresses the utility of transparency and choice, also known as notice and choice, which is widely used in the United States to govern privacy online. Nissenbaum sees transparency and choice as appealing to those who view control over information as the mechanism for respecting privacy. Further, transparency and consumer choice fit within our existing free market paradigm, where consumer choice dominates regulation. However, she outlines why choice in this instance, around privacy notices, is not the authentic choice we might think it is. Further, the type of transparency required to explain everything that happens to our data would be confusing to most. She calls this the transparency paradox.
The more transparent companies are about how data is collected, shared, and used, the less understandable their privacy notice becomes. Nissenbaum offers her theory, privacy as contextual integrity, as a solution, where privacy is respect for the norms around how data is transmitted and shared, and by whom, within a particular community. In other words, within the medical community we have one set of norms; within the educational community we would have another. In analyzing a data analytics program according to Nissenbaum, one would examine whether a data set or action respected the norms of contextual integrity: that is, whether the data attributes being gathered and used, the actors, and the transmission principles were appropriate for a given context. In privacy as a social contract, I argue that privacy norms can be viewed as mutually beneficial agreements within a community about how information is shared and used. Individuals within a given community discriminately share information with a particular set of obligations in mind as to who has access to the information and how it will then be used. In other words, rather than giving away privacy, individuals share information within norms governing the use of their information. To apply privacy as a social contract to a data analytics program, one would ask who has access to what information, why or for what purpose, and whether that combination meets the privacy expectations of that particular community. Understanding the factors that drive mutually beneficial and sustainable privacy norms is important to companies seeking to meet the privacy expectations of consumers, users, and employees. For example, according to both myself and Helen Nissenbaum, customers would find it inappropriate for an insurance company to purchase behavioral data from their online activities.
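To make the contextual integrity analysis concrete, here is a minimal sketch, not from the lecture, of how a data analytics team might encode context norms and check a proposed data flow against them. The context names, norms, and flows below are illustrative assumptions, not real policy or an official framework implementation.

```python
# Hypothetical sketch of a contextual-integrity-style check:
# a flow is appropriate only if the norms of its originating context
# permit that attribute, recipient, and transmission principle.
from dataclasses import dataclass

@dataclass(frozen=True)
class Norm:
    """An information norm within a context (all values illustrative)."""
    context: str                 # e.g. "retail", "medical"
    attribute: str               # e.g. "browsing_history"
    recipient: str               # e.g. "retailer", "insurer"
    transmission_principle: str  # e.g. "as_needed_for_sale"

@dataclass(frozen=True)
class Flow:
    """A proposed transfer of information about a data subject."""
    context: str
    attribute: str
    recipient: str
    transmission_principle: str

def respects_contextual_integrity(flow: Flow, norms: list[Norm]) -> bool:
    """True if some norm of the flow's context matches it exactly."""
    return any(
        n.context == flow.context
        and n.attribute == flow.attribute
        and n.recipient == flow.recipient
        and n.transmission_principle == flow.transmission_principle
        for n in norms
    )

# Illustrative norm: browsing data stays within the retail context.
norms = [Norm("retail", "browsing_history", "retailer", "as_needed_for_sale")]

ok = Flow("retail", "browsing_history", "retailer", "as_needed_for_sale")
bad = Flow("retail", "browsing_history", "insurer", "purchased_from_broker")

print(respects_contextual_integrity(ok, norms))   # True
print(respects_contextual_integrity(bad, norms))  # False
```

The second flow fails because the recipient and transmission principle, an insurer buying the data from a broker, appear in no norm of the retail context, mirroring the insurance example above.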
For Nissenbaum, online browsing and shopping data is not within the insurance context but within the retail context, and sharing that data would breach contextual integrity. For privacy as a social contract, individuals share browsing activities with specific websites online and do not expect that information to be shared with a new firm and used for a new purpose, such as deciding insurance policies. See you soon.