  1. EC400 Part II, Math for Micro: Lecture 5. Leonardo Felli (LSE, NAB.SZT), 15 September 2010.

  2. One Inequality Constraint: Let f and g be continuously differentiable functions defined on U ⊆ R^n. Assume that x∗ is the solution of the problem: max_x f(x) s.t. g(x) ≤ b. Assume also that x∗ is not a critical point of g(x) if g(x∗) = b. Then, given the Lagrangian function L(x, λ) = f(x) − λ (g(x) − b),

  3. There exists a real number λ∗ such that:
     ∇_x L(x∗, λ∗) = ∇f(x∗) − λ∗ ∇g(x∗) = 0,
     λ∗ (g(x∗) − b) = 0,
     λ∗ ≥ 0,
     g(x∗) ≤ b.

  4. Example: ABC is a perfectly competitive and profit maximizing firm. It produces output y from input x according to the production function f(x) = x^(1/2). The price of output y is 2, and the price of input x is 1. Negative levels of x are impossible. Also, the firm cannot buy more than a > 0 units of input x.

  5. The firm's maximization problem is therefore: max_x 2 x^(1/2) − x s.t. g(x) = x ≤ a and x ≥ 0. We ignore for now the last constraint, x ≥ 0. The Lagrangian is: L(x, λ) = 2 x^(1/2) − x − λ [x − a].

  6. The first order condition with respect to x is: (x∗)^(−1/2) − 1 − λ∗ = 0. Therefore:
     (x∗)^(−1/2) − 1 − λ∗ = 0,
     λ∗ (x∗ − a) = 0,
     λ∗ ≥ 0,
     x∗ ≤ a.

  7. We can now solve the system of equations. It is easiest to consider the two separate cases λ∗ > 0 and λ∗ = 0. Assume λ∗ > 0: the constraint is binding, so x∗ = a. The full solution is then (the constraint x∗ ≥ 0 is satisfied): x∗ = a, λ∗ = 1/√a − 1. When is this solution viable?

  8. For consistency, if we assume that λ∗ > 0, it needs to be the case that: 1/√a − 1 > 0 ⇔ a < 1. What if λ∗ = 0? This means that the constraint is not binding. From the first order condition: (x∗)^(−1/2) − 1 = 0 ⇔ x∗ = 1. The solution is therefore (the constraint x∗ ≥ 0 is satisfied): x∗ = 1, λ∗ = 0, and this solution holds for all a ≥ 1.
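
  As a numerical cross-check of this case analysis, here is a minimal sketch assuming SciPy is available (the bounded scalar solver is my choice of tool, not part of the lecture): it maximizes 2 x^(1/2) − x over 0 ≤ x ≤ a by minimizing the negative.

      # Numerical check of the firm example: max 2*sqrt(x) - x subject to 0 <= x <= a.
      from scipy.optimize import minimize_scalar

      def solve_firm(a):
          res = minimize_scalar(lambda x: -(2 * x**0.5 - x), bounds=(0.0, a), method="bounded")
          return res.x

      print(solve_firm(0.25))  # constraint binds: x* approximately a = 0.25 (lambda* = 1/sqrt(a) - 1 > 0)
      print(solve_firm(4.0))   # constraint slack: x* approximately 1 (lambda* = 0)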

  9. Several Inequality Constraints: The generalization is easy. Key difference: some constraints may be binding while others may not. An example: consider the problem max_{x,y,z} x y z s.t. x + y + z ≤ 1, x ≥ 0, y ≥ 0, z ≥ 0.

  10. The Lagrangian is L(x, y, z, λ1, λ2, λ3, λ4) = x y z − λ1 (x + y + z − 1) + λ2 x + λ3 y + λ4 z. Solving the Lagrange problem yields a set of critical points; the optimal solution will be among them. However, we can already restrict this set of critical points because clearly λ2 = λ3 = λ4 = 0. Indeed, if, say, λ2 > 0, then by complementary slackness x = 0. But then the value of x y z is 0, and obviously we can do better than that (for example, x = y = z = 3/10).

  11. Thus, the non-negativity constraints cannot bind. This leaves us with a problem with one constraint, x + y + z − 1 ≤ 0. We have to decide whether λ1 > 0 or λ1 = 0. Clearly, the constraint must bind: if x + y + z < 1 we can increase one of the variables, still satisfy the constraint, and increase the value of the function.

  12. From the first order conditions: yz − λ1 = 0, xz − λ1 = 0, xy − λ1 = 0. We then find that xy = yz = zx, and hence it follows, from the binding constraint, that the optimal solution is x = y = z = 1/3.
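
  The same solution can be confirmed numerically; a minimal sketch assuming SciPy (the constrained solver that scipy.optimize.minimize picks here is my choice, not part of the slides):

      # Maximize xyz subject to x + y + z <= 1 and x, y, z >= 0 by minimizing -xyz.
      from scipy.optimize import minimize

      res = minimize(
          lambda v: -v[0] * v[1] * v[2],        # objective: minus xyz
          x0=[0.2, 0.3, 0.4],                   # feasible interior starting point
          bounds=[(0.0, None)] * 3,             # x, y, z >= 0
          constraints=[{"type": "ineq", "fun": lambda v: 1.0 - v[0] - v[1] - v[2]}],  # x + y + z <= 1
      )
      print(res.x)  # approximately [1/3, 1/3, 1/3]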

  13. We have looked at: max_{x,y} f(x, y) s.t. g(x, y) ≤ b. We have characterized necessary conditions for a maximum: if x∗ is a solution to a constrained optimization problem (it maximizes f subject to some constraints), then it is also a critical point of the Lagrangian. We then find the critical points of the Lagrangian.

  14. Can we then say that these are the solutions to the constrained optimization problem? In other words, can we say that if these are maximizers of the Lagrangian, they are also maximizers of f (subject to the constraint)? To determine the answer, let (x′, y′, λ) satisfy all the necessary conditions for a maximum. We can now show that if (x′, y′) is a maximizer of the Lagrangian, it also maximizes f. To see this, notice that λ [g(x′, y′) − b] = 0.

  15. Thus, f(x′, y′) = f(x′, y′) − λ [g(x′, y′) − b]. Since λ ≥ 0 and g(x, y) ≤ b for every other feasible (x, y), we have f(x, y) − λ [g(x, y) − b] ≥ f(x, y). Since (x′, y′) maximizes the Lagrangian, for all other (x, y): f(x′, y′) − λ [g(x′, y′) − b] ≥ f(x, y) − λ [g(x, y) − b], which implies that f(x′, y′) ≥ f(x, y). So if (x′, y′) maximizes the Lagrangian, it also maximizes f(x, y) subject to g(x, y) ≤ b.
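
  Putting the three steps together, the chain of inequalities (for any feasible (x, y)) reads:

      f(x′, y′) = f(x′, y′) − λ [g(x′, y′) − b]     (complementary slackness)
                ≥ f(x, y) − λ [g(x, y) − b]         ((x′, y′) maximizes the Lagrangian)
                ≥ f(x, y)                           (λ ≥ 0 and g(x, y) ≤ b).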

  16. Recall the main result from unconstrained optimization: if f is a concave function defined on a convex subset U ⊂ R^n and x0 is a point in the interior of U such that Df(x0) = 0, then x0 maximizes f(x) in U: f(x) ≤ f(x0) for all x ∈ U. You have seen in class that, in the constrained optimization problem, if f is concave and g is convex, then the Lagrangian function is also concave in x (for λ ≥ 0). This means that the first order conditions are necessary and sufficient.

  17. Main Theorem: Theorem (The Kuhn-Tucker Theorem). Consider the problem of maximizing f(x) subject to the constraint g(x) ≤ b. Assume that f and g are differentiable, f is concave, g is convex, and the constraint qualification holds. Then x∗ solves this problem if and only if there exists a scalar λ∗ such that:
      ∂L(x∗, λ∗)/∂x_i = ∂f(x∗)/∂x_i − λ∗ ∂g(x∗)/∂x_i = 0 for all i,
      λ∗ ≥ 0,
      g(x∗) ≤ b,
      λ∗ [b − g(x∗)] = 0.
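
  To illustrate how the theorem is applied, here is a minimal symbolic sketch assuming SymPy, on a small instance that is not from the slides: f(x, y) = log x + log y (concave) and g(x, y) = x + y (convex), with the constraint x + y ≤ b. Since f is strictly increasing, the constraint binds (λ > 0), so we solve stationarity together with x + y = b.

      # Kuhn-Tucker conditions for: max log(x) + log(y)  s.t.  x + y <= b  (binding case).
      import sympy as sp

      x, y, lam, b = sp.symbols("x y lambda b", positive=True)
      L = sp.log(x) + sp.log(y) - lam * (x + y - b)

      eqs = [sp.diff(L, x), sp.diff(L, y), x + y - b]   # dL/dx = 0, dL/dy = 0, g(x, y) = b
      print(sp.solve(eqs, (x, y, lam), dict=True))      # x = y = b/2, lambda = 2/b >= 0

  Since f is concave and g is convex, the theorem says these first order conditions are not only necessary but also sufficient, so x = y = b/2 is the constrained maximizer.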

  18. Several Inequality Constraints: Consider the problem max_x f(x) s.t. g_1(x) ≤ b_1, ..., g_m(x) ≤ b_m. The Lagrangian is then L(x, λ) = f(x) − λ (g(x) − b), where λ = (λ_1, ..., λ_m) and g(x) − b is the column vector with entries g_1(x) − b_1, ..., g_m(x) − b_m.

  19. Assume that there exists λ∗ such that:
      ∂L(x∗, λ∗)/∂x_i = 0 for all i = 1, ..., n,
      λ∗_j ≥ 0 for all j = 1, ..., m,
      λ∗_j (g_j(x∗) − b_j) = 0 for all j = 1, ..., m.
      Assume that g_1 to g_e are binding and that g_{e+1} to g_m are not: λ∗_{e+1} = 0, ..., λ∗_m = 0. Denote by g_E = (g_1, ..., g_e) the vector of binding constraints and by λ_E = (λ_1, ..., λ_e) the corresponding multipliers.

  20. Suppose that the Hessian of L with respect to x at (x∗, λ∗) is negative definite on the linear constraint set {v : Dg_E(x∗) v = 0}, that is: Dg_E(x∗) v = 0 and v ≠ 0 imply v^T (D^2_x L(x∗, λ∗)) v < 0. Then x∗ is a strict local constrained maximizer of f on the constraint set. The question is then how to establish whether D^2_x L(x∗, λ∗) is negative definite on a constraint set.

  21. To establish this we make use of the bordered Hessian:
      Q = [ 0              Dg_E(x∗)
            Dg_E(x∗)^T     D^2_x L(x∗, λ∗) ]
      If the last (n − e) leading principal minors of Q alternate in sign, with the sign of the determinant of the largest matrix (Q itself) the same as the sign of (−1)^n, then the sufficient second order conditions hold for a candidate point x∗ to be a solution to a constrained maximization problem.
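
  A minimal numerical sketch of this test, assuming NumPy, at the candidate x = y = z = 1/3 from the earlier example (one binding constraint, so n = 3, e = 1 and we inspect the last n − e = 2 leading principal minors):

      # Bordered Hessian test for max xyz s.t. x + y + z <= 1 at x = y = z = 1/3.
      import numpy as np

      x = y = z = 1.0 / 3.0
      Dg = np.array([[1.0, 1.0, 1.0]])     # gradient of the binding constraint x + y + z
      H = np.array([[0.0, z, y],           # Hessian in x of L = xyz - lambda*(x + y + z - 1)
                    [z, 0.0, x],
                    [y, x, 0.0]])
      Q = np.block([[np.zeros((1, 1)), Dg],
                    [Dg.T, H]])

      n, e = 3, 1
      minors = [np.linalg.det(Q[:k, :k]) for k in range(1, Q.shape[0] + 1)]
      print(minors[-(n - e):])  # approximately [2/3, -1/3]

  The last two leading principal minors alternate in sign and the largest one has the sign of (−1)^3, so the sufficient second order conditions hold at this candidate point.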
