So let us come to some conclusions, and some prospects, in view of the fact that this is the last lecture of our series on toxicity testing for the 21st century. We have seen that Tox-21c is the continuation of a process which started some 60 years ago, when Russell and Burch called for alternative methods, the three Rs of reduce, replace, and refine, to replace animal testing. This was before in vitro testing actually became reasonably available. Only in the 80s and 90s of the last century did standardization and the commercial availability of tools reach a level that allowed the broad implementation of cell culture and its use in testing. This then prompted the need for validation. I just spoke about the validation bodies around the world, created starting in the 90s in Europe, which have helped to bring some of these methods into use; more than 50 alternative methods have been formally validated internationally so far. But now, Tox-21c promises to do this on a much faster and broader scale.

Our Center for Alternatives to Animal Testing here at Johns Hopkins was created in 1981, preceding most of the practical development of cell culture methodologies. The European Centre for the Validation of Alternative Methods followed, as I said, in 1991. Altogether, this is the era of the three Rs: reduce, replace, and refine. This picture by my [inaudible] from our team nicely shows the process as fishing, and it really was a fishing exercise. At the beginning, the fish were very small: we fished with a rod, single in vitro tests which gave meaningful results. As the fish got bigger, assays became validated or, really big, internationally accepted by the OECD and others. But with Tox-21c we are now introducing fishing with a net.
We are making a much more strategic effort to get new methodologies into regulatory use, and this is bringing the whole development of new approaches to a new level. The evolution of toxicology began with early alternative methods: simple cell culture with one cell type, where cytotoxicity or very few other parameters were measured. On the computational side, we also started simply, with structure-activity relationships, pure correlations between what is found in a molecule and its activity. Today, these things have developed further. We have seen that organotypic cell culture is coming more and more into the laboratory; this includes co-cultures, organ functionality, and perfusion of cell systems. You have heard about these technologies. In the future, this promises to develop even further, to the co-culture of such organ cultures and human-on-chip approaches: multi-organ models combined with microfluidics, in order to have a type of perfusion and exchange of media between these organs. We have also seen that automated cell culture and high-throughput screening, as used in Tox [inaudible] and Tox-21, have entered the field, so we can now run our cell cultures without the laborious manual work of a technician, a student, or a researcher. New technologies also allow us to learn more from our cell cultures. We have discussed in this lecture series the omics and image-analysis methodologies, the high-content approaches for getting more information from our cells; you might recall the lecture on the Human Toxome Project. Together, this now allows us to move toward a mechanism-based toxicology. Adverse outcome pathways and the Human Toxome Project are pertinent examples of what is happening in this field. The computational approaches have also moved forward: modeling of receptor binding, but also virtual organs and the kinetics of a substance in the body.
You have heard lectures on physiologically based kinetic modeling. So there are many opportunities now, with the computational power of our time, to support toxicology. And we are combining both in integrated testing strategies; you have heard the lecture on how to systematically combine the various types of testing and non-testing approaches to get the best information possible. In the future, this promises to develop even further: systems toxicology comes into reach, with concepts such as virtual patients, modeling the entirety of an organism and how it is perturbed by a substance that interacts with it and exerts toxic effects. Systems toxicology at this moment is still a little bit pie in the sky. It is, however, a goal we are targeting and a direction we are moving in.

This lecture series was about how to better generate data in toxicology. It was about how to embrace the new technologies of the 21st century to obtain high-quality data that are relevant to human disease and human health effects. The next lecture series, which is the logical continuation, will be on evidence-based toxicology. You will hear about systematic reviews, and about the Evidence-based Toxicology Collaboration, which we created and for which we set up secretariats on both sides of the Atlantic. You already see here some of the websites to access. So the next lecture series will address how to better handle data; it is about bringing the principles of evidence-based medicine into toxicology. I would like to close with a quote by Georg Christoph Lichtenberg, who said something very wise some 250 years ago: "I cannot say whether things really get better if we change; what I can say is they must change if they are to get better." Thank you very much for your interest in our lecture series.