How to Find Extrema of Multivariable Functions

Finding the extrema of multivariable functions is a crucial aspect of multivariable calculus. These extrema can be either maximum or minimum values, and they provide essential insights into the behavior of functions involving several variables.

To determine these extrema, we first identify the critical points of the function. This involves calculating the first partial derivatives of the function with respect to each variable and setting them equal to zero, which locates the points where the function’s rate of change is zero and where extrema may occur. In this article, we discuss extrema of multivariable functions, including methods to calculate them.

What is a Multivariable Function?

A multivariable function is a mathematical function that takes two or more variables as input and produces an output. Such functions are used extensively in fields such as calculus, physics, engineering, and economics to model situations where the outcome depends on several factors.

A multivariable function can be represented as [Tex]f(x_1, x_2, \ldots, x_n)[/Tex], where x1, x2, . . . ,xn are the input variables. For example, a function f(x, y) might represent the temperature at a point on a surface, with x and y being the coordinates of the point.
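For instance, a two-variable function is easy to write down in code. The short Python sketch below defines a purely hypothetical temperature model T(x, y), only to illustrate a function whose output depends on two inputs:

    # A hypothetical temperature model T(x, y): the output depends on both
    # coordinates of the point, which is what makes it a multivariable function.
    def temperature(x: float, y: float) -> float:
        return 25.0 + 0.5 * x - 0.2 * y

    print(temperature(2.0, 3.0))  # 25.4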

Extrema in Multivariable Functions

In multivariable calculus, finding the extrema (maxima and minima) of a function involves determining the points at which the function reaches its highest or lowest values within a certain region.

  • Local Maximum: A function f(x1, x2, . . . ,xn) has a local maximum at a point a = (a1, a2, . . ., an) if there exists a neighbourhood around a such that f(a) ≥ f(x) for all points x in that neighbourhood.
  • Local Minimum: Similarly, f(x1, x2, . . . ,xn) has a local minimum at a point a = (a1, a2, . . ., an) if there exists a neighbourhood around a such that f(a) ≤ f(x) for all points x in that neighbourhood.
  • Global Maximum/Minimum: These are the highest and lowest values of the function over its entire domain.

Methods to Find Extrema of Multivariable Functions

Finding extrema (maxima and minima) of multivariable functions involves several methods, each suited to different types of functions and conditions. Some of these methods are:

Using Partial Derivatives

To find the extrema of a multivariable function, we first locate its critical points (where all first partial derivatives are zero) and then classify them using the Hessian matrix and its determinant.

Hessian Matrix: [Tex]H = \begin{bmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{bmatrix}[/Tex]

[Tex]D = \det(H) = f_{xx}f_{yy} - (f_{xy})^2[/Tex]

At each critical point, apply the following classification rules (a short code sketch that applies them follows the list):

  • If det(H) > 0 and fxx > 0, the function has a local minimum at the critical point.
  • If det(H) > 0 and fxx < 0, the function has a local maximum at the critical point.
  • If det(H) < 0, the function has a saddle point at the critical point.
  • If det(H) = 0, the test is inconclusive.
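These rules translate directly into a small routine. The sketch below uses the sympy library to find the critical points of a function of two variables and classify each one with the Hessian test; it is a minimal illustration, and the sample function passed in at the end is a hypothetical one chosen only for demonstration.

    import sympy as sp

    x, y = sp.symbols('x y')

    def classify_critical_points(f):
        # Step 1: critical points, where both first partial derivatives vanish
        f_x, f_y = sp.diff(f, x), sp.diff(f, y)
        # Step 2: Hessian matrix [[f_xx, f_xy], [f_yx, f_yy]]
        H = sp.hessian(f, (x, y))
        for point in sp.solve([f_x, f_y], [x, y], dict=True):
            D = H.det().subs(point)        # D = f_xx*f_yy - (f_xy)^2 at the point
            f_xx = H[0, 0].subs(point)
            if D > 0 and f_xx > 0:
                kind = "local minimum"
            elif D > 0 and f_xx < 0:
                kind = "local maximum"
            elif D < 0:
                kind = "saddle point"
            else:
                kind = "test inconclusive"
            print(point, "->", kind)

    # Hypothetical sample function, used only to demonstrate the routine
    classify_critical_points(x**2 + 2*y**2 - 4*x + 4*y)
    # {x: 2, y: -1} -> local minimum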

Let’s consider some solved examples for better understanding:

Example 1: Find the extrema of the function [Tex]f(x, y) = x^2 + y^2 - 4x - 6y + 9[/Tex].

Solution:

Step 1: Compute the first partial derivatives and find the critical points.

  • [Tex]f_x = \frac{\partial f}{\partial x} = 2x - 4[/Tex]
  • [Tex]f_y = \frac{\partial f}{\partial y} = 2y - 6[/Tex]

Set fx = 0 and fy = 0:

  • 2x – 4 = 0 ⇒ x = 2
  • 2y – 6 = 0 ⇒ y = 3

So, the critical point is (2, 3).

Step 2: Compute the second partial derivatives to form the Hessian matrix.

  • [Tex]f_{xx} = \frac{\partial^2 f}{\partial x^2} = 2 [/Tex]
  • [Tex]f_{yy} = \frac{\partial^2 f}{\partial y^2} = 2[/Tex]
  • [Tex]f_{xy} = \frac{\partial^2 f}{\partial x \partial y} = 0[/Tex]

The Hessian matrix H is:

[Tex]H = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}[/Tex]

Step 3: Compute the determinant of the Hessian matrix.

[Tex]D = \det(H) = f_{xx}f_{yy} - (f_{xy})^2 = 2 \cdot 2 - 0^2 = 4[/Tex]

Step 4: Analyze the determinant and fxx.

[Tex]D > 0 \quad \text{and} \quad f_{xx} = 2 > 0[/Tex]

Since D > 0 and fxx > 0, the function has a local minimum at the critical point (2, 3).

Conclusion:

The function [Tex]f(x, y) = x^2 + y^2 - 4x - 6y + 9[/Tex] has a local minimum at (2, 3).
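For readers who want to check this result by machine, the short sympy snippet below reproduces it; this is only a verification sketch of the example above.

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x**2 + y**2 - 4*x - 6*y + 9

    # Critical point: solve f_x = 0 and f_y = 0
    point = sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y])
    H = sp.hessian(f, (x, y))

    print(point)             # {x: 2, y: 3}
    print(H)                 # Matrix([[2, 0], [0, 2]])
    print(H.det(), H[0, 0])  # 4 2  -> D > 0 and f_xx > 0: local minimum at (2, 3)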

Example 2: Find the extrema of the function [Tex]f(x, y) = x^3 - 3xy^2[/Tex].

Solution:

Step 1: Compute the first partial derivatives and find the critical points.

  • [Tex]f_x = \frac{\partial f}{\partial x} = 3x^2 - 3y^2[/Tex]
  • [Tex]f_y = \frac{\partial f}{\partial y} = -6xy[/Tex]

Set fx = 0 and fy = 0:

  • [Tex]3x^2 - 3y^2 = 0 \implies x^2 = y^2[/Tex]
  • [Tex]-6xy = 0 \implies x = 0 \text{ or } y = 0[/Tex]

Combining these: if x = 0, the first equation forces y = 0, and if y = 0, it forces x = 0. So the only critical point is (0, 0).

Step 2: Compute the second partial derivatives to form the Hessian matrix.

  • [Tex]f_{xx} = \frac{\partial^2 f}{\partial x^2} = 6x [/Tex]
  • [Tex]f_{yy} = \frac{\partial^2 f}{\partial y^2} = -6x[/Tex]
  • [Tex]f_{xy} = \frac{\partial^2 f}{\partial x \partial y} = -6y[/Tex]

Step 3: Evaluate the Hessian matrix at the critical point (0, 0).

[Tex]H = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}[/Tex]

[Tex]\Rightarrow D = \det(H) = 0 \cdot 0 - 0^2 = 0[/Tex]

Since D = 0, the second-derivative test is inconclusive at (0, 0).

Step 4: Examine the function directly near (0, 0).

Along the x-axis (y = 0) we have f(x, 0) = x³, which is positive for x > 0 and negative for x < 0. So f takes values both above and below f(0, 0) = 0 arbitrarily close to the origin, and (0, 0) is neither a local maximum nor a local minimum.

Conclusion:

The function [Tex]f(x, y) = x^3 - 3xy^2[/Tex] has a single critical point at (0, 0). The second-derivative test is inconclusive there, but direct inspection shows that the origin is a saddle-type point (the surface is the well-known "monkey saddle").
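A quick sympy check confirms this: the origin is the only critical point, the Hessian determinant there is zero, and the function still changes sign arbitrarily close to (0, 0). This is only a verification sketch of the example above.

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x**3 - 3*x*y**2

    # The only critical point is the origin
    print(sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y], dict=True))  # [{x: 0, y: 0}]

    # The second-derivative test is inconclusive there: D = 0
    H = sp.hessian(f, (x, y))
    print(H.subs({x: 0, y: 0}).det())   # 0

    # f changes sign along the x-axis near the origin, so (0, 0) is not an extremum
    print(f.subs({x: 0.1, y: 0}), f.subs({x: -0.1, y: 0}))  # about 0.001 and -0.001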

Lagrange Multipliers Method

Method of Lagrange multipliers is a strategy used to find the local maxima and minima of a function subject to equality constraints. This method transforms a constrained optimization problem into a system of equations that can be solved to find the optimal points.

To find the extrema of a multivariable function subject to a constraint, we can use the following steps:

Step 1: Define the Objective Function and Constraint

  • Let f(x, y, . . .) be the objective function to be maximized or minimized.
  • Let g(x, y, . . .) = 0 be the constraint function.

Step 2: Form the Lagrange Function

Construct the Lagrange function L by combining the objective function and the constraint with a Lagrange multiplier λ: L(x, y, λ) = f(x, y) − λ g(x, y)

Here, λ is the Lagrange multiplier. If the constraint is instead given in the form g(x, y) = c for some constant c, use L(x, y, λ) = f(x, y) − λ(g(x, y) − c).

Step 3: Compute the Partial Derivatives

Find the partial derivatives of L with respect to each variable and the Lagrange multiplier λ: ∂L/∂x = 0, ∂L/∂y = 0, ∂L/∂λ = 0
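In vector form, Steps 2 and 3 say that at the optimum the gradient of the objective is parallel to the gradient of the constraint:

[Tex]\nabla f(x, y) = \lambda \, \nabla g(x, y), \qquad g(x, y) = 0[/Tex]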

Step 4: Solve the System of Equations

Solve the resulting system of equations to find the critical points (x, y, λ)

Step 5: Verify and Classify the Critical Points

Substitute the critical points back into the original objective and constraint functions to verify that they satisfy the constraint and determine whether they correspond to a maximum or minimum.

Example: Objective Function: f(x, y) = x² + y²

Constraint: g(x, y) = x + y − 1 = 0

  1. Define the Objective Function and Constraint
    • Objective function: f(x, y) = x² + y²
    • Constraint: g(x, y) = x + y − 1 = 0
  2. Form the Lagrange Function
    • Construct the Lagrange function: L(x, y, λ) = x² + y² − λ(x + y − 1)
  3. Compute the Partial Derivatives
    • Compute the partial derivatives: [Tex]\frac{\partial \mathcal{L}}{\partial x} = 2x - \lambda = 0 \quad \Rightarrow \quad 2x = \lambda \quad \Rightarrow \quad \lambda = 2x[/Tex]
    • [Tex]\frac{\partial \mathcal{L}}{\partial y} = 2y - \lambda = 0 \quad \Rightarrow \quad 2y = \lambda \quad \Rightarrow \quad \lambda = 2y[/Tex]
    • [Tex]\frac{\partial \mathcal{L}}{\partial \lambda} = x + y - 1 = 0[/Tex]
  4. Solve the System of Equations
    • From λ = 2x and λ = 2y, we get 2x = 2y ⇒ x = y.
    • Substitute x = y into the constraint: x + y = 1 ⇒ 2x = 1 ⇒ x = 1/2, so y = 1/2.
    • The critical point is (1/2, 1/2).
  5. Verify and Classify the Critical Points
    • Substitute x = 1/2 and y = 1/2 into the objective function: [Tex] f\left( \frac{1}{2}, \frac{1}{2} \right) = \left( \frac{1}{2} \right)^2 + \left( \frac{1}{2} \right)^2 = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}[/Tex]
    • The point (1/2, 1/2) is a minimum of the function f(x, y) = x² + y² subject to the constraint x + y = 1, as the sympy check after this list also confirms.
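The same constrained problem can be handed to sympy by solving the three equations ∂L/∂x = 0, ∂L/∂y = 0, ∂L/∂λ = 0 directly; the snippet below is a verification sketch of the worked example.

    import sympy as sp

    x, y, lam = sp.symbols('x y lambda')

    f = x**2 + y**2          # objective function
    g = x + y - 1            # constraint g(x, y) = 0
    L = f - lam * g          # Lagrange function

    # Stationarity conditions: dL/dx = 0, dL/dy = 0, dL/dlambda = 0
    eqs = [sp.diff(L, v) for v in (x, y, lam)]
    sol = sp.solve(eqs, [x, y, lam], dict=True)

    print(sol)               # e.g. [{x: 1/2, y: 1/2, lambda: 1}]
    print(f.subs(sol[0]))    # 1/2  -> the constrained minimum value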

Difference Between Extrema in Single and Multivariable Functions

Common differences between extrema in single and multivariable functions are listed in the following table:

Aspect | Single-Variable Functions | Multivariable Functions
Function Form | f(x) | f(x, y) or f(x, y, z, …)
Critical Points | Points where f′(x) = 0 or f′(x) is undefined | Points where ∇f = 0 (the gradient vector is zero)
First Derivative | f′(x) | Partial derivatives fx = ∂f/∂x, fy = ∂f/∂y, etc.
Second Derivative | f′′(x) | Second partial derivatives fxx = ∂²f/∂x², fyy = ∂²f/∂y², fxy = ∂²f/∂x∂y, etc.
Test for Extrema | Second derivative test: check the sign of f′′(x) | Second partial derivative test: use the Hessian matrix H
Hessian Matrix | Not applicable | [Tex]H = \begin{bmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{bmatrix}[/Tex]
Determinant of Hessian (D) | Not applicable | [Tex]D = \det(H) = f_{xx}f_{yy} - (f_{xy})^2[/Tex]
Classification of Critical Points | If f′′(x) > 0, local minimum; if f′′(x) < 0, local maximum; if f′′(x) = 0, test is inconclusive | If D > 0 and fxx > 0, local minimum; if D > 0 and fxx < 0, local maximum; if D < 0, saddle point; if D = 0, test is inconclusive
Graphical Interpretation | Points where the slope of the tangent is zero | Points where the gradient vector is zero (stationary points)
Constraints | Generally no constraints | Can include constraints, often handled using Lagrange multipliers

Conclusion

In conclusion, the concept of extrema in multivariable functions is vital because it allows us to find the maximum and minimum values of a function over a given region. By locating critical points with partial derivatives, classifying them with the Hessian matrix, and evaluating the function along the boundary of the region, we can determine where a function reaches its highest or lowest values, or where it has a saddle point instead.


Practice Problems on Extrema of Multivariable Functions

Problem 1: Find the critical points and classify them for the function:

[Tex]f(x, y) = x^2 + y^2 - 4x - 6y + 13[/Tex]

Problem 2: Determine the local extrema of the function:

[Tex]f(x, y) = x^3 - 3x + y^2[/Tex]

Problem 3: Identify and classify the critical points for the function:

[Tex]f(x, y) = e^{x^2 + y^2}[/Tex]

Problem 4: Find the extrema of the function subject to the constraint x + y = 1:

[Tex]f(x, y) = x^2 + y^2[/Tex]

Problem 5: Determine the critical points and their nature for the function:

[Tex]f(x, y) = x^4 + y^4 - 4xy[/Tex]

Problem 6: Locate and classify the critical points of the function:

f(x, y) = sin(x) cos(y)

FAQs on Extrema of Multivariable Functions

What are extrema in multivariable functions?

Extrema in multivariable functions refer to the points where the function reaches either a maximum or a minimum value. These can be classified into local (or relative) extrema and global (or absolute) extrema.

How do you find critical points of a multivariable function?

To find critical points of a multivariable function f(x, y), you need to compute the first partial derivatives fx​ and fy​ and set them equal to zero:

  • fx(x, y) = 0
  • fy(x, y) = 0

What is the second derivative test for multivariable functions?

The second derivative test for multivariable functions involves the Hessian matrix, which is the matrix of second partial derivatives:

[Tex]H = \begin{pmatrix}f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{pmatrix}[/Tex]

At a critical point (x0, y0), evaluate the Hessian matrix H. Compute the determinant of the Hessian, [Tex]D = f_{xx}f_{yy} - (f_{xy})^2[/Tex]

  • If D > 0 and fxx > 0, then (x0, y0) is a local minimum.
  • If D > 0 and fxx < 0, then (x0, y0) is a local maximum.
  • If D < 0, then (x0, y0) is a saddle point.
  • If D = 0, the test is inconclusive.

Can we always rely on the second derivative test?

The second derivative test is useful but not always conclusive. If the determinant of the Hessian matrix D = 0, the test is inconclusive, and other methods, such as evaluating the function at critical points and comparing values, might be necessary to determine the nature of the critical points.



