Let's dive into a little bit about what Hugging Face is. Now, sometimes people don't see Hugging Face as a platform, but in fact it is. You can see here, right off the main website, that we have models, datasets, Spaces, and docs. We'll cover a few of those in detail later. But it is enough to say that we're going to be able to interact with and host models, create our own versions of models, and work with datasets, both existing ones and our own. So the way I see it, it's a platform where we can collaborate on everything machine learning.

Now, I've gone ahead and created my own profile, my own account, and you can see here that there are models and datasets. I don't have anything yet; I'll have to fix that pretty soon. But this is, again, the way I see it, kind of like GitHub: just like GitHub is for software engineering, I see Hugging Face being the same thing but for machine learning collaboration and interaction.

The other thing that I want to show you is that, once I created my account, I have programmatic access using the Hub's Python client library. We'll get into a little bit more detail as to what the Hub is. But you can start seeing here that it's not only the website; you have libraries that you can use to create repositories with specific names and with a specific purpose, like, say, a model or a dataset, or even a Space. Now, Spaces are their own special thing and we'll cover those later as well. But I just wanted to show you that you have the ability to interact with Hugging Face in many different ways, and I'll sketch what that looks like in code in just a moment.

And then the last thing I want to show you is the Python Package Index. This is where pip goes and finds its packages. There are a couple of things I want to show you. The first one is transformers, and if you search for transformers and go to this one right here, version 4.22.2 in this case, you'll see that the familiar Hugging Face logo is right there, and this is in fact coming from Hugging Face. So this is Hugging Face itself providing open source tooling, open source libraries, to interact with our models. And here, transformers is specifically for text, vision, and audio. It is definitely a joy to use, and we'll see a couple of ways we can use it. And then you have all these kinds of demos available.

Now, the one thing I want to show you in relation to this: I'm going to go back here. You see T5 and GPT2 mentioned, so let's go back to the website, to the main website, and I'm going to search for GPT2. I hit return and then you'll see several different things. So this means that you have support for PyTorch, TensorFlow, TFLite, and several other things here. And you have this model card; the model card allows you to demo the model, explains what it's about, and gives you a little bit of information. You can look at it, you can deploy it, and Hugging Face has its own ways of deploying models: you have a hosted API where you can do live inference, you have the endpoints, and even a direct SageMaker connection. So that's pretty interesting. And I'm going to scroll back here because I want to show you, well, here's the hosted inference API. So let's try to compute that and see what happens, and then we'll see this being completed by the model. So, pretty good, pretty useful.
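And if you'd rather hit that hosted inference API from code instead of clicking the compute button in the browser, the request looks roughly like this. This is just a sketch: the prompt is an example, and you'd supply your own access token from your account settings.

```python
import requests

# The hosted inference API exposes each model at a URL like this one.
API_URL = "https://api-inference.huggingface.co/models/gpt2"
headers = {"Authorization": "Bearer hf_xxx"}  # replace with your own access token

# Send a prompt and let the hosted GPT2 model complete it,
# just like the widget on the model page does.
response = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "Hello, I'm a language model,"},
)
print(response.json())
```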
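And going back to the Hub client library I mentioned a moment ago, here's a minimal sketch of what creating repositories programmatically can look like. The repository names are made up for illustration, and this assumes you've already authenticated, for example with huggingface-cli login or by passing your own token.

```python
from huggingface_hub import create_repo

# Create an empty model repository under your account
# ("my-demo-model" is a made-up name for illustration).
create_repo("my-demo-model")

# The same call can create a dataset or a Space by passing repo_type;
# a Space also needs an SDK choice, such as Gradio.
create_repo("my-demo-dataset", repo_type="dataset")
create_repo("my-demo-space", repo_type="space", space_sdk="gradio")
```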
And what I was telling you about before, when I mentioned Spaces a little bit, was that you have the ability to use this GPT2 model, do something else with it, repurpose it, and fine-tune it to your own needs. And all of these are lots of different accounts that have taken it and done something with it. And here you have examples, you have a little bit of documentation. And then we saw transformers, but you can see here pipeline and set_seed. So transformers gives you an abstraction library in Python to do, in this case, text generation. This example is going to be using text generation, as you can see here, with GPT2. And again, we'll see all of these in detail later. But you get the output, so essentially four lines of Python if you're counting the import statement, and then you get something really powerful out. So you get libraries, dataset hosting, model hosting, and interaction, all from Hugging Face, in a web UI that allows cross-collaboration for anything machine learning.
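Before we move on, just to make those four lines of Python concrete, here's roughly what that snippet from the GPT2 model card looks like. The prompt and the generation parameters here are illustrative, and the first run will download the model weights.

```python
from transformers import pipeline, set_seed

# Build a text-generation pipeline backed by GPT2.
generator = pipeline("text-generation", model="gpt2")

# Fix the seed so the sampled completions are reproducible.
set_seed(42)

# Generate a few short completions for an example prompt.
print(generator("Hello, I'm a language model,", max_length=30, num_return_sequences=5))
```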