[MUSIC] [APPLAUSE] >> My name is Anthonie Meijers and I have a double background. I have a background in mechanical engineering, and I have a background in philosophy. Why is this so? Well, I first studied mechanical engineering, but then I got fascinated by engineering as a science, and I was also very much impressed by the major impacts that technology has. So that's why I decided to study philosophy, and in my current job, as a Professor of Philosophy and Ethics of Technology, I combine both backgrounds.

I would like to convince you, in my talk, of the thesis that engineering is, in the end, also social engineering, and I'm going to do that by giving you examples. I could have given examples from the past: the invention of the steam engine, for example, really mattered for the Industrial Revolution. It ignited the Industrial Revolution and had major impacts. Now we are on the threshold of new technologies that will have similar impacts. I will give three examples to show that the thesis that engineering is social engineering is indeed true.

The first example I would like to give you is from the field of robotics, and I would like you to look at this picture for ten seconds and really try to find out what you believe is at stake in it. Now, what I see in this picture is a person, a woman, who has developed an emotional bond with an artifact, a robot. She's very tender towards the robot; maybe she loves him or her, we don't know. But it shows that the relation between technological artifacts and human beings is changing. Now you may think this is just science fiction. Well, psychological studies have shown that children who have robots in their environment from early on treat robots in the same way as they treat their human playmates. They don't make a distinction anymore. Now, if that is the case, it would mean that our relationships may change in the future. These children don't make a distinction anymore; maybe the robot becomes their role model, and they want to behave like that role model. Now, what are the questions raised by that? In addition, the European Parliament just passed a resolution proposing a legal status for electronic persons, like robots and software systems that can make decisions, and so on and so forth. Legal status means that they can be attributed responsibility in case things go wrong. Again, it changes our relationship to these technological systems. And this is not just the case for the example we see here: think about healthcare practices, work practices; it will massively change our current society. In the end, as I said, or as I hope I've shown, it not only changes the way that we value artificial systems around us, but it also means that the way that we value ourselves will change. We think of ourselves as unique persons, having responsibility, having ethics, and so on and so forth; that may change in the future, as robots acquire similar, or maybe not that similar, capacities. They move in our direction, so to say.

That's the first example. The second example is about a very different technology: neurotechnologies. There are basically two sets of neurotechnologies. One set is meant to repair things that have gone wrong. The other set is meant to improve ourselves as healthy human beings and try to make ourselves even better than we are now.
So in the first category there are, for example, deep brain stimulation technologies for people who have Parkinson's disease or who suffer from severe tremors. These systems can be switched on and off, and they raise all kinds of new questions that didn't exist before. For example: in case somebody performs an action, or fails to act, who is responsible? Is the person responsible when the deep brain stimulation system is switched on, or is he fully responsible only when the switch is off? That's a new type of question that arises because of the introduction of these new technologies.

We can also use these technologies to improve ourselves. This week I got an email from a secondary school student, who asked me whether it would be possible to implant a chip in everybody's brain. The chip would contain Google Maps, which would be very, very handy to have: you never have to ask where to go, you just follow the instructions in your head. The student asked me about the ethical consequences of this, whether I'd thought about that, whether it would be technically feasible, and so on and so forth. I think it's a beautiful example, but it also raises new questions, like: are we now hackable? Because these Google Maps need to be updated quite often, there has to be some entry point to the system, and if there's an entry point, there's also the possibility of misuse. So that's a new question that arises. And there is also the question, in case something goes wrong, who is responsible: is it Google Maps? Is it me? Or is it the hybrid of me and Google Maps? So this, too, introduces new questions into society.

The third example I would like to show you is about fake news. You have probably heard about fake news because of the American elections, and it has led major people in the ICT industry to take action. Here is an Associated Press release. It says that Apple chief executive Tim Cook says fake news is "killing minds" and that governments and tech firms must act to stop it, because it has such a great influence on the way that democracy functions. And then he continues and says that tech firms have a duty to create tools that help diminish the volume of fake news, and then the important clause: without curbing free speech. This is a beautiful example, I think, of what future engineering is about. It's about trade-offs between values. On the one hand we have social media, which enable us to communicate very easily with everybody in the world in no time. But they also have unwanted effects, such as the spread of fake news. And if you want to do something about the spread of fake news, that may violate your rights as a free speaker. Engineering has to solve this; it's really a design problem for engineering, I think. But again, the point I'm making here is that the introduction of social media really changed our society. And that, again, is support for the thesis that engineering is always social engineering.

The trade-off of values also points to what I call a triangle: a triangle of individual human beings, society, and values. This is, obviously, a very simplified picture of the complexities of reality, but it points to certain relationships that are mediated by technology. So robots change how we value ourselves. Neurotechnologies change the way that we look at responsibilities; that's also important. Our navigation practices change, and they give rise to a new value, the value of not being hackable.
Social media have also changed the way that we look at privacy, or maybe the way that we look at truth. So this is a key triangle, I think, for engineering, and engineering needs to find ways to solve conflicts between values by smartly designing new artifacts and systems. And that is actually what this course is about. The course on engineering ethics gives you tools and insights to help you design new products and technological systems, taking into account this triangle of humans, society, and values.

Another point about ethics: people usually believe that ethics comes in after everything has been developed, and then we try to reflect on whether it's good or bad, what the consequences are, how we treat people with these technologies, and so on and so forth. I think engineering ethics has a very different approach. It is not end-of-pipe, so to say, but an approach where we try to interact with engineers and technology developers during the process of technology development. So it's very much interactive, it's involved, and it tries to incorporate these important values into the systems along the way.

Now, I hope that I've convinced you of this idea that engineering is always social engineering, and I hope that what I've said has motivated you to follow this course on engineering ethics, because I think an important way to improve technology is by taking ethics into account. Thank you. >> [APPLAUSE]