Hi, I'm Evan Jones, a Technical Curriculum Developer for Google Cloud. I'm going to be walking you through three labs that cover the basics of how you can understand and analyze your costs on Google Cloud Platform. With these labs, you'll be more prepared to work with the appropriate teams on setting up advanced cost controls based on your business needs. First, I'll go through a lab that walks you through setting a quota to control BigQuery usage and costs. The next lab focuses on Stackdriver, an important tool that provides monitoring and management for services, containers, applications, and infrastructure. It's also a great resource for collecting information related to your GCP costs. In the third lab, we'll dive into Cloud Functions, a lightweight compute solution for developers to create single-purpose, standalone functions. Functions can respond to Cloud events, such as programmatic budget notifications, without the need to manage your own server.

Now, for the first lab, I'll query a public dataset and explore the associated BigQuery costs. Then I'll modify the per-day quota allowed for my lab user, and try to rerun that same query with the new, reduced quota. Let's see what happens. Here's the first Qwiklab that you'll be walking through. It's all about setting up cost control with a quota inside of BigQuery. The lab prompt is going to look like this. One of the things that I like to do inside of Qwiklabs when I first start is to take a brief look at the learning objectives. Here's what we're going to do: we're going to query a BigQuery public dataset, a large one, processing terabytes of data, and explore the costs of processing that much data. Then we'll exercise a really cool option: as a billing administrator, you can modify the quota, which is the cap on how much data folks are allowed to process inside of BigQuery.
Then we're going to make sure that quota actually works by re-running one of those really large queries after we've set the quota. The lab provides you information on BigQuery pricing, and a few tips on setting up your lab inside of the Qwiklabs console. Remember, inside of Qwiklabs you can start and stop these labs, but as soon as you click End Lab, that project is deleted. You can restart and create new projects if you want to do the lab multiple times. So the main tool that you'll be using for this lab is BigQuery; that's what you'll use for processing large amounts of data. The first thing we want to do is query that public dataset. I'm going to copy over the query here. Inside of Google Cloud Platform, as you'll see when you walk through the lab, you'll be taken to BigQuery; in the navigation, it's under Big Data. Scroll all the way down and you'll see Big Data. My tip and trick of the day is, if you hover over an item and click the pin icon, you can actually pin it to the top. You can see that it gets removed from its section and added to your favorites; it's pinning that particular product. Now I'm going to paste that query in. You can see that it's going to process 1.4 terabytes of data to run, which, depending upon your budget, may be a little more data than you wanted to process, based on how much BigQuery is going to charge you for the bytes that are processed. As a quick review, BigQuery will process one terabyte of data per month, per project, for free. After that, I think it charges upwards of five dollars a terabyte or so, so half a terabyte is a couple of bucks. One of the questions in the lab is going to ask you the total cost of processing this entire query. So if I run this query, which is what you're going to do as part of the lab, the 1.4 terabytes actually processes super quickly, and that's because I've actually run this one before. This lab actually takes advantage of cache.
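To make that cost question concrete, here's a minimal sketch of the on-demand pricing arithmetic described above. It assumes the roughly five-dollars-per-terabyte on-demand rate and the one-terabyte-per-month free tier mentioned in the lab; check the current BigQuery pricing page, since actual rates change.

```python
# Sketch of BigQuery on-demand cost arithmetic. The $5/TB rate and the
# 1 TB/month free tier are the approximate figures mentioned in the lab,
# not authoritative current pricing.

PRICE_PER_TB = 5.00      # assumed on-demand price per terabyte scanned
FREE_TB_PER_MONTH = 1.0  # assumed monthly free tier

def query_cost_usd(tb_processed, free_tb_remaining=0.0):
    """Estimate the cost of a query that scans `tb_processed` terabytes."""
    billable_tb = max(0.0, tb_processed - free_tb_remaining)
    return round(billable_tb * PRICE_PER_TB, 2)

# The lab's query scans about 1.4 TB:
print(query_cost_usd(1.4))                     # no free tier left: about $7
print(query_cost_usd(1.4, FREE_TB_PER_MONTH))  # full free tier left: about $2
```

So whether the answer is seven dollars or two depends on whether the monthly free terabyte has already been used up.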
So if you're working on a project with a bunch of other folks, and everyone's running the same query at the same time, BigQuery is going to go, "Hey, I just gave that same answer for the exact same query to person A, and now person B is asking me the same thing." Cache is actually a really great feature that's built into BigQuery to prevent you from being charged twice for the exact same query. But say that, for the purposes of experimenting with the quota, I want to really process the data: I'm going to go into More settings and disable the automatic caching of the query, so I really will process 1.4 terabytes of data. Let's see. As I start to run this, I get a pop-up that says my custom quota has been exceeded: the "query usage per day" quota policy, which was set by me about 10 minutes before you saw this video, is breached by this particular query. Now, you won't get that when you're running this lab for the first time, because you haven't set up that quota yet. So the last part of the lab is going to get you inside the Navigation menu, under IAM & Admin, where there's a great spot called Quotas. In there, the lab will walk you through how to set up a custom quota for BigQuery, if you want to set a ceiling on how many bytes of data people can process. You can see that I've set a really, arbitrarily low quota, I think it's only a couple hundred gigabytes, for the BigQuery API. The lab will walk you through how you can set up that quota; you can edit the quota that exists, and you can see where you actually set the value in here. We set 0.25 terabytes instead of unlimited, and that's why the query triggered the error. This is just one of the quotas that you can set across many of the products in there, but it's a great way to prevent people from using more than they should. Most of the other labs inside of this course will provide you monitoring on the usage.
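The quota behavior the lab demonstrates boils down to a simple check: track bytes scanned per day, and reject any query that would push usage past the cap. BigQuery enforces this server-side; the toy version below just illustrates the logic of a hard usage cap like the 0.25 TB/day quota set in the lab.

```python
# Toy illustration of a per-day "query usage" quota like the lab's
# 0.25 TB/day cap. BigQuery enforces the real quota server-side; this
# sketch only shows the idea of a hard usage ceiling.

TB = 1024 ** 4  # bytes in a terabyte (binary definition)

class QuotaExceeded(Exception):
    pass

class DailyQueryQuota:
    def __init__(self, max_bytes_per_day):
        self.max_bytes = max_bytes_per_day
        self.used = 0

    def charge(self, bytes_scanned):
        """Record a query's scan, or raise if it would breach the cap."""
        if self.used + bytes_scanned > self.max_bytes:
            raise QuotaExceeded(
                f"query would scan {bytes_scanned} bytes, but only "
                f"{self.max_bytes - self.used} bytes remain today")
        self.used += bytes_scanned

quota = DailyQueryQuota(int(0.25 * TB))  # the lab's 0.25 TB/day cap
quota.charge(int(0.1 * TB))              # a small query fits under the cap
try:
    quota.charge(int(1.4 * TB))          # the lab's big query does not
except QuotaExceeded as e:
    print("Custom quota exceeded:", e)
```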
Quotas are one of those hard limits where you can basically say: you can use this much and no more. In this next lab, I'll specifically cover how to monitor a Google Compute Engine virtual machine, or VM instance, with Stackdriver. You'll also learn how to install monitoring and logging agents on your VMs, which collect information from your instance, including useful metrics and logs from third-party applications. In subsequent labs, we'll be using Stackdriver a lot to monitor events that can help you identify cost optimization opportunities in GCP. Let's take a look. As we mentioned in the introduction, Stackdriver is an extremely useful tool for monitoring the usage of your Google Cloud Platform project. This lab is going to get you familiar with the basics of Stackdriver; here we're actually just going to monitor a very simple virtual machine that we've converted into an Apache web server. As part of the lab, I'm not going to walk you through all of it, but I'm going to hit the highlights. On the right-hand side, you can see the lab involves quite a few steps, but it's all step-by-step and it'll walk you through. So even if you've never worked with Stackdriver or GCP before, it's pretty much just hitting one item at a time, moving through, and collecting these activity-tracking points, which is great. Another nice feature about Qwiklabs that I like is that as you complete a task, you'll automatically earn these points, so you know that you did the step correctly. I'm a huge fan of the activity tracking. So the first thing the lab is going to ask you to do is create something that's going to consume resources, so you can monitor it later. Stackdriver can be used to monitor lots of resources within GCP, but for this use case, we're just going to create a simple web server off of a Compute Engine VM. I'll show you; I've already done that here.
So in GCP, under Compute, you can get to the Virtual Machine instances page; you're going to do all this inside of your lab. The cool thing is, I have other virtual machines here, and GCP is basically saying, "Hey, that work you're doing with TensorFlow, you're not using it at all. You should switch it to a different custom machine type." That's super cool; that's just intelligence that's built into the Google Cloud Platform. But our lab is going to ask you to create this lamp-1-vm. A LAMP stack, Linux, Apache, MySQL, PHP, is just a particular stack of software technologies, and this one is just going to be a web server. So what we're actually going to do is, let's see if we can't find the external IP address. Boom. So this is just the blank default page you get if you install Apache on a bare-bones virtual machine. What you're going to be doing is looking at metrics, like usage metrics, network traffic, and computational (CPU) power, by monitoring the VM with Stackdriver. So inside of the lab, you'll first need to create a Stackdriver account. Once you've created the account, it involves installing little agent scripts on the actual virtual machine itself, which you'll be doing via SSH, and then creating uptime checks and alerting policies. I'll show you what the finished product of that looks like. To get to Stackdriver, by the way, as you'll see in the lab, you go to the Navigation menu, all the way down to Monitoring; that'll kick you over into Stackdriver, and you'll be building a dashboard based on your usage of that virtual machine. So let's see. I'm in my particular workspace, and as part of the lab I've already installed these agents, created these uptime checks, and created an alerting policy that actually kicks out an e-mail if my web server is getting way too much traffic. So let's see, I want to check in on my uptime check. Hopefully, my server is up and running; you can monitor the latency. Let's take a look at the dashboard that we built.
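The alerting policy mentioned above comes down to threshold logic: watch a metric over time and fire when it crosses a limit. Stackdriver evaluates this server-side against real VM metrics; the sketch below only illustrates the idea, and the 500 bytes-per-second figure is the example threshold used in this walkthrough, not a recommended value.

```python
# Toy version of a Stackdriver-style alerting policy: fire when received
# network traffic crosses a threshold. The 500 B/s figure is just the
# example threshold from this walkthrough.

THRESHOLD_BYTES_PER_S = 500

def evaluate_alert(samples, threshold=THRESHOLD_BYTES_PER_S):
    """Return the first (time, bytes/s) sample breaching the threshold,
    or None if the policy would not fire."""
    for t, bytes_per_s in samples:
        if bytes_per_s > threshold:
            return (t, bytes_per_s)  # would trigger the e-mail notification
    return None

# Simulated "packets received" time series for the web server:
traffic = [(0, 120), (60, 340), (120, 760), (180, 95)]
print(evaluate_alert(traffic))  # the sample at t=120 breaches 500 B/s
```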
So you can see the load on my CPU. The way I test this, by the way, is back on that web page: I frantically refresh it, or you can arbitrarily send lots of network traffic to your virtual machine, because you're essentially mocking what a production user would do, and then you can see the packets that are received over time. I think the one that I'm working on here is the blue one; I have multiple virtual machines in this Qwiklabs account. For the blue one, you can see the packets received per second, and then the load on that particular CPU as well. It's great, because dashboards are pretty, but I need to be able to take action on them. That's where you can set up an alert. So I can set up a policy to basically say, "Hey, I've breached 500 bytes per second, or some other threshold of traffic." Maybe it's a holiday sale, and I want to make sure that this one standalone web server has enough serving capacity, because they didn't build it with containers or something like that, and I want it to kick out an e-mail. So this will actually send an email out, and I just added in some little custom text here that says, "Hey, this was actually triggered." So this lab gets your feet wet with a lot of the things that you can do inside of Stackdriver as far as monitoring goes; beyond just visualizing things on a dashboard, you can trigger alerts via e-mail or other media as well. So that'll get you started with Stackdriver.

In this next lab, you'll create a simple Cloud Function, then you'll deploy and test that function and view the output logs. We'll be deploying Cloud Functions via the command line, since that's how we'll deploy functions in all of our subsequent labs. If you're interested in exploring the Cloud Functions UI, feel free to check out the Cloud Functions Qwik Start console version of the lab inside of the Qwiklabs catalog. Let's take a look. This third lab is all about Cloud Functions.
If you want some single-use, generally event-triggered code to run inside the Cloud, you want to use Cloud Functions. You don't have to worry about maintaining a server. You can just write Node.js (JavaScript) code, Python code, or Go code as well. If you want to just execute something in response to, say, an HTTP request or a GCS bucket alteration happening, that's a perfect use case for Cloud Functions. So this lab just gets you loosely familiar with the basics of creating, invoking, and then monitoring one of those functions. It's the simplest of all functions, literally just a hello-world function, and we'll monitor its output. We're going to be using the command line here, which means you'll be using the terminal instead of clicking around inside of the UI. If you haven't used a command line before, it's actually not too bad. Inside of GCP, I'm going to open up Cloud Shell, which is a free terminal that you get in there; it's a micro virtual machine that automatically spins up. One of the things you'll do from that terminal is deploy the hello-world function. I'll show you what that function actually looks like. So this is the function that you're going to create. You make a directory, call it hello-world, navigate to that directory, and then write the function itself in JavaScript. All it does is log a message that says, "Hey, hello world, it's my Cloud Function, and it was actually invoked." It's literally just recording that you called the function and what the output was; it's not doing anything special. But in subsequent labs after this, you can do a ton of things inside of your Cloud Function code. This is just getting you familiar with actually creating, testing, deploying, and monitoring the output of a Cloud Function.
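The lab writes its hello-world function in JavaScript; to show how small the idea is, here's the same thing sketched in Python, which Cloud Functions also supports. The function name and message are illustrative, and in a deployed HTTP function the platform would pass a Flask request object as the argument.

```python
# Python equivalent of the lab's JavaScript hello-world Cloud Function.
# (The lab uses Node.js; the idea is identical: log a message and return
# a response.)

def hello_world(request):
    """HTTP-triggered Cloud Function: log that we were invoked."""
    message = "Hello world, my Cloud Function was invoked!"
    print(message)  # print() output shows up in the function's logs
    return message

# Locally you can just call it. In Cloud Functions the platform passes a
# Flask request object, which this trivial function never inspects, so
# None is fine for a local smoke test.
print(hello_world(None))
```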
So let's see if we can get that Cloud Function to run; I created it a little bit earlier. I'm going to call the Cloud Function that was created earlier, and I should hopefully get a result that says it was executed. I got an execution ID, which is great; I can then find the logs. Let's view the logs. I can then see all the times I executed this hello-world function, just by copying that over. Again, I've done it more than once at this point, so hopefully more than one invocation will show up. So, reading from the logs: there's the function that was invoked, which I gave the custom name hello-world, and the results it returned back. You can see it was invoked all these different times throughout the day today, how long each invocation took, and the status of the function. A really cool thing about Cloud Functions, as you're going to see in subsequent labs, is that you can set up automated cron jobs via Cloud Scheduler to invoke functions that clean up persistent disks, IP addresses, and other resources that aren't being used. Another cool thing about Cloud Functions is that if, for whatever reason, a function fails, and it's not a programming error that's going to fail every single time but a tiny issue that will resolve itself, you can actually specify that the Cloud Function be automatically retried. So it's designed to be what they call serverless, or hands-off: you don't have to monitor the hardware. The platform makes sure that whatever code you want to run as part of that function can run, and Google handles the rest of the maintenance.
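That retry-on-transient-failure behavior can be sketched locally. Cloud Functions handles retries for you on the platform side; this toy version, with a deliberately flaky cleanup function as a stand-in, just illustrates the idea of retrying an error that resolves itself.

```python
import time

# Toy sketch of retrying a transient failure, like the automatic retries
# Cloud Functions can perform for you. The flaky_cleanup function below
# is a made-up stand-in that fails twice, then succeeds.

def with_retries(fn, attempts=3, delay_s=0.0):
    """Call fn(); on an exception, retry up to `attempts` times total."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise          # a persistent error still surfaces
            time.sleep(delay_s)  # back off before the next try

calls = {"n": 0}
def flaky_cleanup():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "cleaned up"

print(with_retries(flaky_cleanup))  # succeeds on the third attempt
```

A programming bug, by contrast, fails on every attempt, which is why retries are only the right tool for transient issues.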