We have with us today two experts to help us unpack the implications of data privacy and corporate reputation. Dr. Ana Martinovici is Assistant Professor at the Department of Marketing Management at the Rotterdam School of Management, and Dr. Daniel Trottier is Associate Professor at the Department of Media and Communication, also at Erasmus University Rotterdam. Welcome to both of you. Can you start by defining privacy in very simple terms? What is privacy?

I would say privacy is the right to choose what is known about you by others, and also the right to choose not to have anything known about yourself. The right, in a way, to be alone. But that's a rather general definition; it can be more tailored depending on the context.

I would agree with your definition, and I think it of course matters whether we're talking about privacy as a legal concept, as a policy matter, or as something in our culture. But building on this idea of having a certain degree of autonomy and decision-making in terms of how you're known in different contexts, and over time as well: that's how I would understand privacy.

This is an important point about control and autonomy and having a choice. That's also a core contention in this particular case, and we will discuss more about that as we go along. Indeed, we can attest to the fact that new and varied threats to privacy have emerged, particularly in the digital universe. I can share some recent industry data. Privacy issues such as cybersecurity incidents or data breaches can lead to clear reputation risk, impacting financial performance, consumer trust, and brand value. In particular, a study by Ipsos situates technology companies, especially in the US, as facing the highest reputational challenges or threats. There's also some interesting data from IDC, the International Data Corporation, showing that, operational disruptions and legal ramifications aside, victims of a data breach, for instance, are not only very likely to experience a drop in trust, at some 65 percent, but 85 percent of them are likely to go and talk to others about their experiences. We're really talking about negative word of mouth, which then spirals into reputation risk. But that said, I mean privacy, right? Correct me here. Privacy is much more of an umbrella concept, bigger than data breaches or security lapses. What's the big fuss about privacy, and why now?

Well, I would definitely not call it a fuss, because it is important. I also think that maybe we're not as outraged as we should be. There's growing interest in privacy, and rightly so, because we are sharing more and more about ourselves, many times unknowingly. All the things that we do when we use an app or a website, when we just browse the Internet, all of those actions are being recorded, and we are sharing a lot more than just the posts we actively decide to put online. I think it's, yeah, it's growing and it's going to continue to grow in terms of concerns.

Absolutely. There are so many stories, anecdotal and otherwise, that we hear in the news or, as was said, from friends, about privacy breaches or the consequences of posting something online. I think this fills up our minds, but at the same time there's so much that we're not aware of, or not aware of enough at the moment. Things like texts or personal data being saved as raw text files, and all these very avoidable potential breaches. Yeah, it's quite a pressing issue.
In your research in particular, I know you talk about the role of attention in decision-making, looking at various cues such as eye movements, and particularly you talk about the digital footprint: the trail of information that people leave behind when they post or share information about themselves, which reflects our interests, preferences, and attitudes. One would think that maybe it is less of a surprise now, that people should know there is nothing private about what is in the public domain. But I see you laughing here, and smiling. Talk to us about the role of the individual, because really the access to this kind of data, the digital footprint that you talk about, gives organizations and policymakers a very powerful position to influence choices and behaviors.

Yes, it does, I can only agree with that. My research is indeed about attention, how attention reflects preferences and interests, and how it can also be used to predict what we're going to choose. Attention can be reflected by many things, because at its core attention is not observed. It's what happens in your brain, what happens in your mind, what captures your attention when we are online or when we're using apps or our mobile devices. What is important to us, what preoccupies our mind, is reflected in the websites we check, the articles we read, the pictures we like, the pictures we share. That's how companies and policymakers can get information about what occupies our mind. Then they can use that information to potentially influence our future choices in many domains. It goes beyond just influencing choices for, say, food or clothes; it also reaches maybe into the medical domain.

What would you say to the point that people paradoxically also want these personalized marketing and communication efforts, or more targeted marketing, so that the information is more user-friendly, or even applicable and relevant to them?

Yes, Clair, and there is also quite some research on that. As you also know, it's called the privacy paradox. Consumers do want all of these personalized services, products, and recommendations. We do like it when we go to Amazon and we get a recommendation for what we might be interested in buying. But at the same time, many consumers are growing more and more worried about how their data is being used. They try to balance these two aspects.

Daniel, you are a sociologist by training and you've also published extensively in the realm of digital media and its uses for the purposes of scrutiny, denunciation, and shaming. In particular, I want to ask you to elaborate a bit on your research on identity problems in the Facebook era. Can you speak to what kinds of identity problems you are referring to here? Whose identity?

Well, looking at the way that platforms like Facebook and Twitter and Instagram have grown, not just in terms of the number of people using them, but also the various contexts in which we use them, and using them over decades in certain cases. The way that people might consent to having certain information up there and then find out, you know, a few years down the road in a different context, be it a job interview, a border crossing, or anything else, that this information might compromise them if it's either linked up with other data or presented in, let's say, a different context.
There are many cases, anecdotal and otherwise, and many people have experienced this as well, of potentially stigmatizing information that they either knowingly or unknowingly uploaded, or of other people taking a video recording of someone at their lowest point. These things can be taken out of context, and sometimes the context itself is quite damning. We don't need to get into the specifics of any particular context, but when it comes to public shaming and the court of public opinion, I think we're seeing a vast expansion of these moments.

Right, and particularly in the sociopolitical realm, as you see with the Facebook-Cambridge Analytica case, how would you then comment on the ramifications of privacy and the role of organizations, in particular the responsibility they perhaps have in ensuring that these user data are protected and not misused?

Well, companies that do have access to all of these data also have a lot of responsibility. With great power comes great responsibility, and it's not just about security of data. It's not just that Facebook, for example, since you mentioned it, would need to make sure that data is kept in a secure way; there is also privacy, which is a different and separate concept, meaning the company would also think about how the data is being used. Those are the two aspects that they could and should keep in mind whenever they collect data.

Especially when you're talking about these companies or platforms where the purpose is either not so clearly stated or seems to change every year, be it Facebook or Amazon or any other, it becomes quite a concern in terms of these potential applications, and Cambridge Analytica and the potential political manipulation coming out of that is a clear example. If you're talking about a company that just has records for the purposes of short-term marketing or customer files, maybe it could just stay like that, and we can think about the idea of data minimization as opposed to this perpetual expansion in terms of what we do with this data.

It seems almost difficult to wrap your head around that, given the large amount of data floating around, shared with and available to these large organizations, the data they're able to harvest from users, and perhaps sometimes we ourselves as users willingly offer some of these data. There's been a lot of talk, particularly in the tech industry, about regulation being the way forward for this industry. I wonder if you could comment on what the ways are to effectively bring about some change here. Is regulation the answer we're looking for to privacy problems?

In many cases the regulatory mechanisms are already in place, perhaps in theory if not in practice. Updating those and enforcing those seems like a given; it has to be the case, especially when we're talking about these larger companies, which are able to skirt across different taxation regimes and, similarly, regulatory agencies. We need to coordinate a bit on that and have a conversation about how comfortable we are with these companies that exist on such a scale and that permeate so many different aspects of our lives and our institutions. That can go hand in hand with updated and continued media education campaigns, both for the youth and for us and our parents, and maybe with promoting more of a culture of not just logging off, temporarily or for longer, but also deleting apps: data minimization for companies but also for individuals.