So let's take a look at this example. We now want to use the idea of Newton's method to minimize this particular function: f(x) is KD divided by x plus h times x divided by two, that is, f(x) = KD/x + hx/2. If you take a look at this function, maybe you immediately recognize it: this is the EOQ problem we introduced in the past. You may interpret this objective function in the following way: x is your order quantity, h is the inventory cost per unit per year, K is the fixed ordering cost for placing one order, and D is the annual demand. I'm not going to go through the details again; if you want to refresh your memory about the EOQ problem, you may go back and look at the previous videos.

We may also specify numbers for the parameters, and then we will be able to draw a curve like this. This curve is our objective function. Okay, so now let's try to apply Newton's method. Let's say we already know the optimal solution; with the given parameters it is 144.34. You may either use a formula to get this, though that formula has not been taught yet, or you may do some numeric search to get this value. So just forget about that formula for now; you may look it up on the internet by yourself or wait for a future lecture. For now, let's see how, with this numeric problem, we may get to an optimal solution.

At any x_k, we need to be able to find the quadratic approximation. This is our general formula for the quadratic approximation: f(x_k) is the first term, f'(x_k) is here, and lastly f''(x_k) is here. You may do the derivatives by yourself; this is nothing but a basic calculus exercise. Once we have that, then at any point, for example x_0, if we take x_0 as 80, the second-order function that we need to minimize would be this one. You simply plug all the numbers and the 80 into this particular formula.
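To make this concrete, here is a small sketch in Python. The lecture does not state the parameter values, so K = 250, D = 500, h = 12 below are an assumption, chosen only so that sqrt(2KD/h) matches the 144.34 optimum mentioned above:

```python
import math

# Assumed EOQ parameters (the video's exact numbers are not stated here);
# chosen so that sqrt(2*K*D/h) = 144.34, matching the lecture's optimum.
K, D, h = 250.0, 500.0, 12.0

def f(x):                      # EOQ total cost: f(x) = K*D/x + h*x/2
    return K * D / x + h * x / 2

def f1(x):                     # first derivative f'(x)
    return -K * D / x**2 + h / 2

def f2(x):                     # second derivative f''(x), positive for x > 0
    return 2 * K * D / x**3

def quadratic_approx(x, xk):
    """Second-order Taylor approximation of f around the point xk."""
    return f(xk) + f1(xk) * (x - xk) + 0.5 * f2(xk) * (x - xk)**2

x0 = 80.0
print(f(x0))                       # true cost at x0
print(quadratic_approx(x0, x0))    # the approximation agrees with f at xk
print(math.sqrt(2 * K * D / h))    # optimal order quantity, about 144.34
```

The red curve in the lecture is exactly `quadratic_approx(x, 80)` plotted over a range of x.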
We're going to get a second-order problem that we want to minimize, and we get this red curve. This red curve is exactly your second-order approximation at your initial point, which is x_0 = 80. You try to minimize that, and once you do, you're going to move to this point, which is around 100-something. If you solve this problem, its global minimum satisfies the condition that the first-order derivative of the quadratic approximation is zero. For example, if we start at 80, then after you plug in all the numbers, you're going to get 107.71. That's how, in the first iteration, you move from 80 to 107.71.

Now we are able to do the iterations again and again, and don't forget that we actually have a formula: to get x_{k+1}, we always start with x_k and subtract a ratio whose numerator is f'(x_k) and whose denominator is f''(x_k). For the EOQ problem, you simply plug in the derivatives, and you don't really need to do the minimization explicitly each time, because you have a formula now. So from x_1 you move to x_2, from x_2 you move to x_3; you just repeatedly plug in numbers, and you will converge to 144.34.

So this is pretty much an illustration of how we may use Newton's method, at least for solving one-dimensional problems. All you need to do is have your functional form for f; then you derive f', you derive f'', you get this formula, and you do the iterations. This is a success story, but I also need to remind you that if your function looks weird, this approach may fail. For example, if you are currently at a solution that is here, then when you do your second-order approximation, it may be something like this.
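The iteration described above can be sketched as follows. Again, K = 250, D = 500, h = 12 are assumed parameters (chosen to be consistent with the 144.34 optimum), not necessarily the ones used in the video:

```python
# Assumed EOQ parameters, consistent with the optimum sqrt(2KD/h) = 144.34.
K, D, h = 250.0, 500.0, 12.0

def f1(x):                    # f'(x) for f(x) = K*D/x + h*x/2
    return -K * D / x**2 + h / 2

def f2(x):                    # f''(x)
    return 2 * K * D / x**3

x = 80.0                      # initial point x0
for k in range(8):
    x = x - f1(x) / f2(x)     # Newton step: x_{k+1} = x_k - f'(x_k)/f''(x_k)
    print(k + 1, round(x, 2))
# The first step lands around 107.71, and the iterates approach 144.34.
```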
Then if you try to find the minimum point on it, you don't know how to do it, because it has downward curvature, so it actually does not have a minimum point. If you stick to your formula anyway, what you will do is move to the stationary point of that second-order approximation, where its first-order derivative is zero. So you will move to this point. And now you may see that from here to here, after you do one iteration, you actually get higher. So when your function is really weird, Newton's method does not guarantee an improvement in each iteration. That's the bad part of Newton's method.

That may be bad, but Newton's method also has its good parts. For example, if your objective function is a quadratic function, then Newton's method is guaranteed to reach an optimal solution in one iteration. So for nicely behaved functions, Newton's method may be faster than gradient descent, but gradient descent is a much more robust algorithm: gradient descent always does a meaningful search, unlike Newton's method.
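Both behaviors are easy to check numerically. Below is a small sketch with made-up example functions: on a convex quadratic, a single Newton step lands exactly on the minimizer, while at a point with downward curvature (here f(x) = x⁴ − x², whose second derivative is negative near 0), the Newton step moves toward a stationary point and the objective actually gets higher:

```python
# Good case: for a quadratic g(x) = a*(x - c)**2 + b, one Newton step from
# any starting point lands exactly on the minimizer c.
a, b, c = 3.0, 5.0, 7.0
def g1(x): return 2 * a * (x - c)      # g'(x)
def g2(x): return 2 * a                # g''(x), a positive constant
x = 100.0
x = x - g1(x) / g2(x)                  # one Newton step
print(x)                               # exactly 7.0, the minimizer

# Bad case: f(x) = x**4 - x**2 has f''(x) = 12*x**2 - 2 < 0 near x = 0, so
# the quadratic approximation there opens downward. The Newton step moves
# toward the local MAXIMUM at x = 0, and the objective value increases.
def f(x):   return x**4 - x**2
def df(x):  return 4 * x**3 - 2 * x
def d2f(x): return 12 * x**2 - 2
x0 = 0.2
x1 = x0 - df(x0) / d2f(x0)             # Newton step with negative curvature
print(f(x0), f(x1))                    # f(x1) > f(x0): the iterate got worse
```

Gradient descent with a small enough step would instead move in the direction of −f'(x0) and decrease f, which is what "always does a meaningful search" means.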