[MUSIC] Thank you so much, Jane, for coming to talk to me about synthetic biology and governance. So first, if you could just introduce yourself to the class, that would be fantastic. >> Right, well, my name's Jane Calvert. I'm a social scientist at the University of Edinburgh, and I work in the field of Science and Technology Studies, or STS. So I'm very interested in studying science and technology, but from the perspective of a social scientist. And I've been interested in synthetic biology since about 2008, so for quite a few years now. >> And could you talk a little bit about, from your perspective, how science and technology studies is different from bioethics? >> Yeah, that's a difficult one. [LAUGH] I think we don't talk about ethics so much, I suppose, we don't have an explicitly ethical agenda. Some science and technology studies, or STS as it's called, is much more descriptive, just kind of looking at how scientists make knowledge in the laboratory. But I think even in its more descriptive form, it actually can bring to the surface issues that would be of interest to bioethics, so I think there's not a very strict distinction between science and technology studies and bioethics. So some of the values that people bring to their work might be shown by a descriptive study of people working in the lab, which may not be kind of very hard values, but maybe sort of the values about what's useful or what research should be invested in, things like that. >> And you're talking there about the values that scientists bring to their work. >> Yeah. >> Great, and I know that you've been quite involved in a number of synthetic biology endeavors, both on the research side and on the sort of oversight or governance side. Could you talk a bit about your role on the synbio roadmap? >> Yeah, so in about 2011, the UK government, in particular a minister of the UK government, decided that the UK should have a strategy for synthetic biology. So he set up a working group to write a roadmap on synthetic biology. And that involved people from industry, from academia, from policy. And after the roadmap process had been running for a couple of months, they decided that they had some expertise missing from it. So I was contacted along with a couple of other social science colleagues, and we were asked to join the roadmap and contribute to their discussions. >> And how was that conversation proceeding? Where were they at the point that you joined it? >> Well, once we joined it, it was actually written very quickly, on quite a short timescale, because they wanted it to feed into the next spending review, where the money is given out. And so the whole roadmap was written within, I think, about eight months, and we had meetings every couple of weeks. And once we had started on the committee, we participated in all the meetings and contributed to the discussion of all the different chapters. It wasn't strictly a roadmap in the sense of some other technology roadmaps. The most famous is the roadmap for semiconductors, I can't remember the exact name for it. >> [LAUGH] >> And that has various kind of defined technical requirements which have to be met at a certain time. This roadmap was more of a kind of policy document, outlining a way forward for the field. So depending on how you define roadmap, maybe it wasn't strictly a roadmap. >> And how much of it was focused on the science itself?
And how much was about the sort of scientific questions that needed to be addressed and the technology that needed to be developed, and how much, rather, was more about governance? >> Well, so there were chapters on the market for synthetic biology, on the underlying training and infrastructure. And then the group as a whole had kind of decided that the public acceptance of the technology was an issue that needed to be addressed. So they tasked myself and my social science colleagues with writing a chapter on public acceptability of synthetic biology, but we actually didn't want to do this. Because we knew that decades of research in the social sciences have shown that the public isn't a kind of homogeneous mass that just accepts or rejects technologies, but actually is a very differentiated group. And also that teaching people about the science doesn't necessarily make them like it more. So instead we suggested writing a chapter on responsible research and innovation, and this was a term that was kind of rising in popularity at the time, this was in 2011. The European Commission was kind of deciding to adopt it. There was some work coming out of the research councils in the UK. So when we suggested that, it was taken up with enthusiasm by the group as being a kind of more useful title for a chapter. >> And can you talk a little bit about what the difference is between public acceptability and responsible innovation? >> Well, I suppose public acceptability assumes that you have the technology, you put it out there, and you want people to accept it. Whereas responsible research and innovation is more about the process, all the way from the lab bench to the innovation stage. And it's about feeding in ideas, other perspectives, all the way through that process. Which could result in a product which would hopefully be more kind of socially robust and more valuable, both scientifically and in terms of responsibility. >> And what do you mean by socially robust? >> Yes, [LAUGH] that's a term that is used in the social science literature, and I suppose it means not just more likely to be accepted, but more governed by people's needs, wants, desires, and concerns. So people will be more likely to accept it, but that's not the motivation behind it, the motivation behind it is actually to develop better work. So people use the phrase a better technology and a better society. So that both society and technology can benefit from these developments. >> By getting input from the public early in the process. >> Yeah, not necessarily just from the public, it could be from different experts. It could be, for example, if you were developing an agricultural technology, you might want to involve farmers, agronomists, economists, ecologists. All these people who may not be part of a synthetic biology project team but actually may have relevant and useful expertise. >> So I think in my lectures, I talk about stakeholders. >> Yeah. >> So- >> Yeah, but it would also include other academics. I think sometimes, for synthetic biology, the easiest person to talk to initially is another academic in a field that's just slightly different from their own. So I think, for example, there can be lots of benefits from molecular biologists talking to ecologists, particularly if they're thinking of developing something which might be released into the field. >> And why was it that the committee sensed there was a problem with public acceptability? >> I think this is just the default thing that people think.
They tend to think that the problem is with the public, but also in- >> By they, you mean the scientists? >> Yeah, but also the policymakers in this context, and the people from industry as well. But also, in the UK, there's obviously been a lot of contention around GM crops, so the concern was that synthetic biology would be another GM, and that it was necessary for it not to be like that. So that's kind of made people a bit more aware of and sensitized to the fact that you can't just develop a technology and assume it will be taken up, that you maybe have to think about how people might actually want to use it. And there was a public dialogue on synthetic biology in 2010 in the UK. And some of the questions that came out of that were: what is the purpose of this technology? Who's going to benefit from it? So those types of questions are actually the questions that people want to ask; they're not so concerned with questions about whether it's safe or not. They're more concerned about why it's being done in the first place and who will benefit. So those questions actually got into our chapter of the roadmap; we took them from the public dialogue report, and we reproduced them in a box in the roadmap chapter. >> So that tells me two things. One, there's a public dialogue that I want to talk about. But also I'm sort of excited about the fact that something from the public dialogue actually made it into sort of the next phase in thinking about governance of the technology. >> Yeah, yeah, and I'm glad that it's in there because I think those questions are quite important questions. There are five or so questions. >> Right. >> I can't remember the others in so much detail. Also, another change that we made concerned the original visions for synthetic biology, there were something like three vision statements. I think one of them was that it should be widely publicly accepted. And we suggested changing that to of clear public benefit, thereby shifting the emphasis from the public accepting the technology to the scientists and engineers developing technologies for beneficial purposes. >> I'll be talking in another section of the course about the deficit model. Can you talk a little bit about the deficit model and how that is or isn't changing in synthetic biology? >> Well, the deficit model is a model that there is a deficit amongst the public, so you can think of them as kind of an empty vessel which needs to be filled with knowledge. And once that knowledge is there, then the public will not only have the knowledge, but will also be favorably inclined towards the technology. And although it is, of course, the case that if you give people knowledge, they will have the knowledge, there's no evidence that that will make them favorably inclined towards something. So for example, I don't know if you know about cricket. It's an English game. It's very boring. [LAUGH] If I told you loads and loads and loads about cricket, you wouldn't necessarily become more favorably inclined towards it. So the problem with the deficit model is not that people are ignorant, because, of course, people are ignorant. Everyone is ignorant. We are all ignorant; scientists and engineers are ignorant about some things, and the public, whoever they are, and they're not one thing but many, members of the public are ignorant about things that they don't know about. But just having more knowledge doesn't make you more favorably inclined towards something.
In fact, it can actually heighten preexisting attitudes, so people who are opposed to something may become more opposed to it if they have more knowledge. So this is an assumption about the public which is kind of widely held, but has been shown to be wrong, yet is very, very persistent in scientific and engineering communities. It's quite a mystery why it's so persistent. It may be that it's quite a convenient way of thinking. Because if the problem is with someone else, then you don't have to change your own way of thinking. >> So the public dialogue around synthetic biology that took place in the UK, was that approached with the same sort of deficit model thinking? Or was it framed differently? >> Well, I mean, I wasn't involved in organizing it. I was kind of on the boundaries of it, looking in. It definitely wasn't intended to follow the deficit model. It was intended to be a dialogue, which is why it was called a dialogue. I mean, it was interesting in some ways. But in other ways, it started off with people who had no knowledge or interest in the topic. So it kind of filtered out people who had backgrounds in science, or backgrounds in law, or backgrounds in journalism, before even starting the dialogue. So you started with this kind of blank slate, people who then had to be given knowledge and were then asked what they thought of it. So there's no perfect way of doing a dialogue. And personally, I think that perhaps dialogues are better oriented around specific problems or challenges. So maybe having a dialogue around sustainability, or new types of fuel and how we can replace fossil fuels, and how synthetic biology perhaps could feed into that, alongside other technologies as well as social innovations. That might be a more useful type of thing to have a dialogue about than a particular technology and what you think of it. Also, the dialogue was held in 2010, and at that time there were very few applications of synthetic biology, so, I mean, there was some artemisinin being produced from [INAUDIBLE] lab. So I mean, it's easy to be critical, but actually there were some good things about the dialogue as well. And I think one of the best things that came out was this series of questions which really challenge the assumption that people are only concerned about their own safety. They're actually quite concerned about the motivations and purposes behind doing synthetic biology. But also they were very interested in how it was funded, why it was funded. So that in some ways led to an emphasis on responsible research and innovation. Because the results of the public dialogue were taken up by one of the research councils that funded it, which then commissioned some social scientists to write a report, which then turned into a paper on responsible research and innovation. So there is a connection there. >> So do you think, now six years on, it would be useful, or would we get a different result from a public dialogue? >> Well, like I said, I think that dialogues around particular technologies are kind of strange and maybe not the right way to go. And also, this dialogue didn't have any particular result. It wasn't that people were pro or con, which is obviously the case because it depends on what you're going to use something for. I mean, I might be very pro chocolate bars if I'm hungry, but I'd be against them if they were used as missiles.
>> [LAUGH] >> That's not a very good example, [LAUGH] I mean, but you can see that it often depends on the usage of the technology. >> Yep, fair enough.