
3.3 Optimizing Functions of Several Variables & 3.4 Lagrange Multipliers - PowerPoint PPT Presentation



  1. 3.3 Optimizing Functions of Several Variables / 3.4 Lagrange Multipliers. Prof. Tesler, Math 20C, Fall 2018.

  2. Optimizing y = f(x). In Math 20A, we found the minimum and maximum of y = f(x) by using derivatives. First derivative: solve for points where f′(x) = 0; each such point is called a critical point. Second derivative: for each critical point x = a, check the sign of f″(a): if f″(a) > 0, the value y = f(a) is a local minimum; if f″(a) < 0, the value y = f(a) is a local maximum; if f″(a) = 0, the test is inconclusive. We may also need to check points where f(x) is defined but the derivatives aren't, as well as boundary points. We will generalize this to functions z = f(x, y).
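A minimal SymPy sketch of this one-variable test; the example function f(x) = x³ − 3x is an assumption of mine, chosen only to demonstrate the procedure, and is not from the slides:

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - 3*x                       # example function (assumed, not from the slides)

fp = sp.diff(f, x)                   # f'(x)
fpp = sp.diff(f, x, 2)               # f''(x)

for a in sp.solve(sp.Eq(fp, 0), x):  # critical points: solutions of f'(x) = 0
    curvature = fpp.subs(x, a)
    if curvature > 0:
        kind = "local minimum"
    elif curvature < 0:
        kind = "local maximum"
    else:
        kind = "second derivative test is inconclusive"
    print(f"x = {a}: f(a) = {f.subs(x, a)}, f''(a) = {curvature} -> {kind}")
```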

  3. Local extrema (= maxima or minima). Consider a function z = f(x, y). The point (x, y) = (a, b) is a local maximum when f(x, y) ≤ f(a, b) for all (x, y) in a small disk (filled-in circle) around (a, b); a global maximum (a.k.a. absolute maximum) when f(x, y) ≤ f(a, b) for all (x, y). Local minimum and global minimum are defined similarly, with f(x, y) ≥ f(a, b). In the slide's figure: A, C, E are local maxima (plural of maximum), and E is the global maximum; D, G are local minima, and G is the global minimum. B is a maximum in the red cross-section but a minimum in the purple cross-section! It's called a saddle point.

  4. Critical points on a contour map. [Figure: contour map with marked points P, Q, R, S; two contours of value 1 cross at R.] Classify each point P, Q, R, S as a local maximum or minimum, a saddle point, or none of these. Isolated maxima/minima usually have small closed contour curves around them. Values decrease towards P, so P is a local minimum. Values increase towards Q, so Q is a local maximum.

  5. Critical points on a contour map (continued). The contours crossing at R have the same value, 1. (If crossing contours had different values, the function would be undefined at that point.) Here, the crossing contours give four regions around R. The function has a local minimum at R along lines with positive slope (values go from > 1 down to 1 and back to > 1) and a local maximum at R along lines with negative slope (values go from < 1 up to 1 and back to < 1). Thus, R is a saddle point.

  6. Critical points on a contour map (continued). S is a regular point: its level curve (value ≈ 8) is implied but not shown, and the values are bigger on one side of it and smaller on the other. Summary: P: local minimum; Q: local maximum; R: saddle point; S: none.

  7. Contour map of z = y/x: crossing lines. [Figure: contour plot of z = y/x on −10 ≤ x, y ≤ 10, with level values labeled between −10 and 10.] Contours of z = y/x are diagonal lines: z = c along the line y = cx. Contours cross at (0, 0) and have different values there. The function z = y/x is undefined at (0, 0).
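A plot like the one described on this slide can be reproduced in a few lines of matplotlib; this is a sketch under my own choices of grid resolution and contour levels, not the code behind the original figure:

```python
import numpy as np
import matplotlib.pyplot as plt

# Grid over [-10, 10] x [-10, 10]; z = y/x is undefined along x = 0.
x = np.linspace(-10, 10, 401)
y = np.linspace(-10, 10, 401)
X, Y = np.meshgrid(x, y)
with np.errstate(divide='ignore', invalid='ignore'):
    Z = Y / X

# Each level z = c is drawn along the line y = c*x.
levels = [-10, -7, -3, -2, -1, -1/2, -1/3, -1/7, 0, 1/7, 1/3, 1/2, 1, 2, 3, 7, 10]
cs = plt.contour(X, Y, Z, levels=levels)
plt.clabel(cs, inline=True, fontsize=8)
plt.title("Contours of z = y/x: lines through the origin")
plt.show()
```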

  8. Contour map of z = sin(y): minima and maxima form curves, not just isolated points. [Figure: contour plot of z = sin(y) on roughly −4 ≤ x, y ≤ 4; the contours are horizontal lines labeled with values from −1.00 to 1.00.] Contours of z = f(x, y) = sin(y) are horizontal lines y = arcsin(z). The maximum occurs along y = (2k + 1/2)π and the minimum along y = (2k − 1/2)π, for all integers k. These are curves, not isolated points enclosed in contours.

  9. Finding the minimum/maximum values of z = f(x, y). The tangent plane is horizontal at a local minimum or maximum: f(a, b) + f_x(a, b)(x − a) + f_y(a, b)(y − b) − z = 0. The normal vector ⟨f_x(a, b), f_y(a, b), −1⟩ is parallel to the z-axis when f_x(a, b) = f_y(a, b) = 0, i.e., when ∇f(a, b) = 0 (the zero vector). At points where ∇f ≠ 0, we can make f(x, y) larger by moving in the direction of ∇f, and smaller by moving in the direction of −∇f. The point (a, b) is a critical point if ∇f(a, b) is 0 or is undefined; these are the candidates for maxima or minima. Critical points are found in the same way for f(x, y, z, ...).
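A minimal numerical sketch of this idea, walking uphill in the direction of ∇f until the gradient is numerically zero; the example function, starting point, and step size are my own assumptions, not from the slides:

```python
import numpy as np

def f(p):
    x, y = p
    return 10 - (x - 1)**2 - (y - 2)**2          # hypothetical example; maximum at (1, 2)

def grad_f(p):
    x, y = p
    return np.array([-2*(x - 1), -2*(y - 2)])    # gradient of f, computed by hand

p = np.array([4.0, -1.0])                        # arbitrary starting point
step = 0.1
while np.linalg.norm(grad_f(p)) > 1e-8:          # stop near a critical point (grad f = 0)
    p = p + step * grad_f(p)                     # moving along grad f increases f
print(p, f(p))                                   # converges to (1, 2), where f = 10
```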

  10. Completing the square: review. (x + m)² = x² + 2mx + m². For a quadratic x² + bx + c, take half the coefficient of x: b/2. Form the square: (x + b/2)² = x² + bx + (b/2)². Adjust the constant term: x² + bx + c = (x + b/2)² + d, where d = c − (b/2)². Example: x² + 10x + 13. Take half the coefficient of x: 10/2 = 5. Expand (x + 5)² = x² + 10x + 25. Add/subtract the necessary constant to make up the difference: x² + 10x + 13 = (x + 5)² − 12.

  11. Completing the square: review (continued). For ax² + bx + c, complete the square for a(x² + (b/a)x) and then adjust the constant. Example: 10y² − 60y + 8. First, 10y² − 60y + 8 = 10(y² − 6y) + 8. Since y² − 6y = (y − 3)² − 9, we have 10y² − 60y + 8 = 10(y − 3)² + ?. Now 10(y − 3)² = 10(y² − 6y + 9) = 10y² − 60y + 90, so 10y² − 60y + 8 = 10(y − 3)² − 82.
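Both completed squares can be checked symbolically; a small SymPy sketch that only verifies the two examples above:

```python
import sympy as sp

x, y = sp.symbols('x y')

# First example: x^2 + 10x + 13 = (x + 5)^2 - 12
print(sp.expand((x + 5)**2 - 12))                                # x**2 + 10*x + 13

# Second example: 10y^2 - 60y + 8 = 10(y - 3)^2 - 82
print(sp.expand(10*(y - 3)**2 - 82))                             # 10*y**2 - 60*y + 8
print(sp.simplify((10*y**2 - 60*y + 8) - (10*(y - 3)**2 - 82)))  # 0
```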

  12. Critical points. Let f(x, y) = x² − 2x + y² − 4y + 15. Then ∇f = ⟨2x − 2, 2y − 4⟩, and ∇f = 0 at x = 1, y = 2, so (1, 2) is a critical point. Using (x − 1)² = x² − 2x + 1 and (y − 2)² = y² − 4y + 4, we get f(x, y) = (x − 1)² + (y − 2)² + 10. (We "completed the squares": x² − ax = (x − a/2)² − (a/2)².) So f(x, y) ≥ 10 everywhere, with global minimum 10 at (x, y) = (1, 2).
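The same critical point can be found mechanically by solving ∇f = 0; a SymPy sketch of this slide's computation (not the instructor's code):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 - 2*x + y**2 - 4*y + 15                 # the function from this slide

grad = [sp.diff(f, v) for v in (x, y)]           # grad f = <2x - 2, 2y - 4>
crit = sp.solve(grad, (x, y), dict=True)         # [{x: 1, y: 2}]
print(crit)
print(f.subs(crit[0]))                           # 10, the global minimum value at (1, 2)
```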

  13. Second derivative test for functions of two variables. How to classify critical points (∇f(a, b) = 0) as local minima/maxima or saddle points: compute all points where ∇f(a, b) = 0, and classify each as follows. Compute the discriminant at the point (a, b): D = f_xx(a, b)·f_yy(a, b) − (f_xy(a, b))², which is the determinant of the "Hessian matrix" of second partial derivatives at (x, y) = (a, b). If D > 0 and f_xx > 0, then z = f(a, b) is a local minimum. If D > 0 and f_xx < 0, then z = f(a, b) is a local maximum. If D < 0, then f has a saddle point at (a, b). If D = 0, the test is inconclusive: a min, max, saddle, or none of these are all possible.
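The whole test can be packaged as a short helper; a sketch assuming SymPy, where the name classify is my own and the demo input x² − y² is the example worked on the next slide:

```python
import sympy as sp

x, y = sp.symbols('x y')

def classify(f):
    """Second derivative test at every critical point of f(x, y)."""
    fx, fy = sp.diff(f, x), sp.diff(f, y)
    fxx, fyy, fxy = sp.diff(f, x, 2), sp.diff(f, y, 2), sp.diff(fx, y)
    for pt in sp.solve([fx, fy], (x, y), dict=True):   # points where grad f = 0
        D = (fxx * fyy - fxy**2).subs(pt)              # Hessian determinant at pt
        if D > 0:
            kind = "local minimum" if fxx.subs(pt) > 0 else "local maximum"
        elif D < 0:
            kind = "saddle point"
        else:
            kind = "inconclusive (D = 0)"
        print(pt, "->", kind)

classify(x**2 - y**2)                                  # {x: 0, y: 0} -> saddle point
```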

  14. f(x, y) = x² − y². Example: find the critical points of f(x, y) = x² − y² and classify them using the second derivative test. ∇f = ⟨2x, −2y⟩ = 0 at (x, y) = (0, 0). The x = 0 cross-section is f(0, y) = −y² ≤ 0, while the y = 0 cross-section is f(x, 0) = x² ≥ 0, so (0, 0) is neither a minimum nor a maximum. We have f_xx(x, y) = 2, so f_xx(0, 0) = 2; f_yy(x, y) = −2, so f_yy(0, 0) = −2; f_xy(x, y) = 0, so f_xy(0, 0) = 0. Then D = f_xx(0, 0)·f_yy(0, 0) − (f_xy(0, 0))² = (2)(−2) − 0² = −4 < 0, so (0, 0) is a saddle point.

  15. f(x, y) = x² − y² (continued). Example: ∇f = ⟨2x, −2y⟩ points in the direction of greatest increase of f(x, y). The function increases as we move towards the x-axis and away from the y-axis. At the origin, it increases or decreases depending on the direction of approach. [Figures: detailed direction of the gradient, general direction of the gradient, and a contour plot, showing the sign regions determined by f_x = 2x = 0 and f_y = −2y = 0 around (0, 0).]

  16. f(x, y) = 8y³ + 12x² − 24xy. Example: find the critical points of f(x, y) and classify them using the second derivative test. Set the first derivatives equal to 0: f_x = 24x − 24y = 0 gives x = y, and f_y = 24y² − 24x = 0 then becomes 24y² − 24y = 24y(y − 1) = 0, so y = 0 or y = 1; since x = y, the critical points are (0, 0) and (1, 1). Second derivative test (D = f_xx·f_yy − (f_xy)², with f_xx = 24, f_yy = 48y, f_xy = −24): at (0, 0), f = 0 and D = (24)(0) − (−24)² = −576 < 0, so (0, 0) is a saddle point; at (1, 1), f = −4 and D = (24)(48) − (−24)² = 576 > 0 with f_xx = 24 > 0, so (1, 1) is a local minimum. There is no absolute min or max: f(0, y) = 8y³ ranges over (−∞, ∞).
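The numbers above are easy to double-check; a standalone SymPy sketch mirroring this slide's computation:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = 8*y**3 + 12*x**2 - 24*x*y

fx, fy = sp.diff(f, x), sp.diff(f, y)                  # 24x - 24y and 24y^2 - 24x
fxx, fyy, fxy = sp.diff(f, x, 2), sp.diff(f, y, 2), sp.diff(fx, y)

for pt in sp.solve([fx, fy], (x, y), dict=True):       # critical points (0, 0) and (1, 1)
    D = (fxx*fyy - fxy**2).subs(pt)
    print(pt, "f =", f.subs(pt), "D =", D, "fxx =", fxx.subs(pt))
# {x: 0, y: 0}: f = 0,  D = -576            -> saddle point
# {x: 1, y: 1}: f = -4, D = 576, fxx = 24   -> local minimum
```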
