We continue our discussion of constrained optimization with inequality constraints. It is interesting that, unlike the case of equality constraints, there is no required relationship between the number of variables (here we have x and y) and the number of inequalities. I have written just one, but I can add many more. For instance, quite often we look for the solution of such a problem in the first quadrant, where x and y take non-negative values.

But there is a rule: when you set up an optimization problem with inequality constraints, every constraint must use the same sign, which for a maximization problem is less than or equal to, as here. The non-negativity constraints, however, come with a different sign, greater than or equal to. It is quite easy, by multiplying by negative one, to convert these inequalities into the allowed form. We can also set up a minimization problem. There is an agreement among mathematicians that whenever you are looking for the minimum values of the function f, the inequality constraints should be written using the opposite sign, greater than or equal to. For instance, if we have a constraint like this one, this time it is perfectly allowable to keep these signs.

Now, let us talk about how we form the Lagrangian. There is also an agreed rule for such a problem. We consider maximization first. We introduce the Lagrangian L, starting, as always, with the objective function f. After that, we introduce Lagrange multipliers, and the number of these multipliers is the same as the number of inequality constraints. We start with the first constraint. We always put a plus sign; after that I introduce a multiplier, using the Greek letter lambda, and within square brackets I write down the constraint. The specific rule is as follows: in a maximization problem, to fill in the empty place between the brackets, you subtract the smaller term of the inequality, which is g(x, y), from the greater term, which is b. So here we write b − g(x, y).
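As a compact sketch of the construction just described, assuming the single constraint is written as g(x, y) ≤ b for the maximization problem:

```latex
\mathcal{L}(x, y, \lambda) \;=\; f(x, y) \;+\; \lambda\bigl[\,b - g(x, y)\,\bigr]
```

The bracketed term is non-negative whenever the constraint holds, which is what the sign convention is designed to guarantee.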
Now, along with this constraint, we have the non-negativity constraints. I have rewritten them with the sign changed. I now need to introduce another two multipliers; this time I use the Greek letter mu, mu one. Here I follow the same rule: I subtract the smaller term from the bigger term, which is on the right, and this difference is simply x. In the same manner we write the final term, mu two times y.

Now, what about the minimization problem? For minimization we follow a similar rule, with one difference. That was for maximization; now, for minimization, I will separate with a line. Here I write the Lagrangian, again starting with the objective function, always followed by a plus sign and a bracket. This time I subtract the bigger term from the smaller term; that is what makes the difference, and I subtract g(x, y) from b. This term is followed by another two terms, based on the non-negativity constraints. Following the same rule, I subtract x from zero, which provides the resulting minus sign, minus mu one times x, and the same for the second term, minus mu two times y. That is the Lagrangian for minimization.

Now we approach the general problem. The basic optimization problem will be a maximization problem; this time we deal with a constraint set consisting of m constraints. I label this maximization problem with (1), and here I write f of many variables, x one through x n. This function is maximized subject to the m constraints. It is the right time to state a theorem which provides necessary conditions for the solution of this problem.
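To make the maximization sign convention concrete, here is a small numerical sketch in Python. The example (maximize f(x, y) = xy subject to x + y ≤ 4 and x, y ≥ 0) is my own illustration, not one from the lecture; its optimum is x = y = 2, where the multipliers turn out to be lambda = 2 and mu1 = mu2 = 0.

```python
# Illustrative example (not from the lecture): maximize f(x, y) = x*y
# subject to x + y <= 4 (so b = 4, g(x, y) = x + y) and x, y >= 0.
# Maximization convention from the lecture:
#   L = f(x, y) + lam*[b - g(x, y)] + mu1*x + mu2*y

B = 4.0  # right-hand side b of the constraint

def lagrangian(x, y, lam, mu1, mu2):
    return x * y + lam * (B - (x + y)) + mu1 * x + mu2 * y

def stationarity(x, y, lam, mu1, mu2):
    """Partial derivatives of L with respect to x and y."""
    dL_dx = y - lam + mu1
    dL_dy = x - lam + mu2
    return dL_dx, dL_dy

# At the optimum x = y = 2 the constraint binds, lam = 2, mu1 = mu2 = 0.
x, y, lam, mu1, mu2 = 2.0, 2.0, 2.0, 0.0, 0.0

print(stationarity(x, y, lam, mu1, mu2))   # (0.0, 0.0): gradient of L vanishes
print(lam * (B - (x + y)))                 # 0.0: complementary slackness holds
print(mu1 * x, mu2 * y)                    # 0.0 0.0: inactive multipliers are zero
```

Checking that the gradient of the Lagrangian vanishes at a known optimum is a quick way to confirm you wrote the bracketed terms with the correct signs.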