Find Roots with SciPy Optimize

I was working on a numerical analysis project where I needed to find the roots of complicated nonlinear equations. The challenge was that these equations had no analytical solutions. That’s when I turned to SciPy’s optimize module, specifically its root-finding capabilities.

In this article, I’ll walk you through everything you need to know about using SciPy’s optimize.root functions to find solutions to your equations. I’ll cover multiple methods with practical examples that you can apply to your projects.

So let’s dive in!

What is Root Finding in SciPy?

Root finding is the process of determining where a function equals zero. In mathematics, if f(x) = 0, then x is a root (or zero) of the function.

SciPy’s optimize module offers powerful tools to find these roots numerically, especially when analytical solutions aren’t possible. The main function we’ll focus on is scipy.optimize.root().

Methods for Root Finding Using SciPy

Let’s walk through the main approaches to root finding with SciPy:

1 – Basic Root Finding with Default Settings

Let’s start with a simple example to understand the basics:

import numpy as np
from scipy import optimize

# Define our function
def f(x):
    return x**3 - 2*x - 5

# Find the root
result = optimize.root(f, x0=2)

print("Root found:", result.x)
print("Function value at root:", result.fun)
print("Success:", result.success)

Output:

Root found: [2.09455148]
Function value at root: [-8.8817842e-16]
Success: True

In this example, I’m trying to find where x³ – 2x – 5 = 0. The x0=2 parameter provides an initial guess, which helps the algorithm converge to the right solution.

When you run this code, you’ll see that it successfully finds the root around x ≈ 2.0946.
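When you only need the root of a single-variable equation and can bracket it (find two points where the function changes sign), a bracketing routine such as `optimize.brentq` is often a simpler and very robust alternative. Here’s a quick sketch for the same cubic:

```python
import numpy as np
from scipy import optimize

def f(x):
    return x**3 - 2*x - 5

# brentq needs a bracket [a, b] where f(a) and f(b) have opposite signs.
# f(2) = -1 and f(3) = 16, so the root lies between 2 and 3.
root = optimize.brentq(f, 2, 3)
print(f"Root via brentq: {root:.6f}")
```

Bracketing methods like `brentq` are guaranteed to converge once a valid bracket is supplied, which makes them a good first choice for scalar problems.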

2 – Use Different Algorithms

SciPy’s root function offers several methods to find roots. Each algorithm has its strengths and weaknesses depending on your specific problem.

Here’s how to specify different methods:

import numpy as np
from scipy import optimize

def f(x):
    return x**3 - 2*x - 5

# Try different methods
methods = ['hybr', 'lm', 'broyden1', 'broyden2', 'anderson', 'linearmixing', 'diagbroyden', 'krylov']

for method in methods:
    try:
        result = optimize.root(f, x0=2, method=method)
        print(f"Method: {method}, Root: {result.x[0]:.6f}, Success: {result.success}")
    except Exception as e:
        print(f"Method: {method} failed with error: {e}")

Output:

Method: hybr, Root: 2.094551, Success: True
Method: lm, Root: 2.094551, Success: True
Method: broyden1 failed with error: too many indices for array: array is 0-dimensional, but 1 were indexed
Method: broyden2 failed with error: too many indices for array: array is 0-dimensional, but 1 were indexed
Method: anderson failed with error: too many indices for array: array is 0-dimensional, but 1 were indexed
Method: linearmixing failed with error: too many indices for array: array is 0-dimensional, but 1 were indexed
Method: diagbroyden failed with error: too many indices for array: array is 0-dimensional, but 1 were indexed
Method: krylov failed with error: too many indices for array: array is 0-dimensional, but 1 were indexed

The hybr method (a modified Powell hybrid method) is the default and works well for many problems, and lm (Levenberg-Marquardt) also succeeds here by solving the problem in a least-squares sense. The remaining methods are large-scale solvers that expect the function to accept and return arrays; because our f takes and returns a plain scalar, they fail with the indexing error shown in the output.
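If you do want to use one of the large-scale methods on this problem, a sketch of one way to adapt it (based on the error above) is to wrap both the initial guess and the return value as 1-element arrays:

```python
import numpy as np
from scipy import optimize

# The large-scale solvers index into the function's input and output,
# so both must be arrays rather than plain scalars.
def f_arr(x):
    x = np.atleast_1d(x)
    return np.array([x[0]**3 - 2*x[0] - 5])

result = optimize.root(f_arr, x0=[2.0], method='broyden1')
print(f"broyden1 root: {result.x[0]:.6f}, success: {result.success}")
```

With the array-valued formulation, the quasi-Newton solvers can run on the same cubic that failed above.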

3 – Solve Systems of Equations

One of the most powerful features of SciPy’s root function is its ability to solve systems of equations. Let’s see this in action with a simple example:

import numpy as np
from scipy import optimize

# Define a system of equations
def system(x):
    return [
        x[0] + x[1] - 3,      # Equation 1: x + y = 3
        x[0]**2 + x[1]**2 - 13  # Equation 2: x² + y² = 13
    ]

# Initial guess
initial_guess = [1, 1]

# Find the roots
result = optimize.root(system, initial_guess)

print("Solution:")
print(f"x = {result.x[0]}")
print(f"y = {result.x[1]}")
print("Success:", result.success)

Output:

Solution:
x = -0.5615528128088303
y = 3.5615528128088303
Success: True

This example solves the system of equations:

  • x + y = 3
  • x² + y² = 13

The solution gives us the points where these two equations intersect.
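Geometrically, the line x + y = 3 crosses the circle x² + y² = 13 at two points, so a different initial guess can recover the other intersection. A quick sketch:

```python
import numpy as np
from scipy import optimize

def system(x):
    return [x[0] + x[1] - 3, x[0]**2 + x[1]**2 - 13]

# Starting points near each intersection converge to different solutions.
sol_a = optimize.root(system, [0, 4])
sol_b = optimize.root(system, [4, 0])

print("First intersection: ", sol_a.x)
print("Second intersection:", sol_b.x)
```

By symmetry, the two solutions simply swap x and y, which is a handy sanity check on the results.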

4 – Customize the Solver with Options

For more complex problems, you might need to customize the solver’s behavior:

import numpy as np
from scipy import optimize

def f(x):
    return np.cos(x) - x**3

# Set custom options for more control
options = {
    'xtol': 1e-10,  # Relative error tolerance in the solution
    'maxfev': 1000  # Maximum number of function evaluations
}

result = optimize.root(f, x0=0.5, method='hybr', options=options)

print("Root found:", result.x[0])
print("Function value at root:", result.fun[0])

This approach gives you more control over the convergence criteria and solver behavior, which is particularly useful for difficult equations that require more iterations or higher precision. Note that the set of valid options depends on the chosen method; scipy.optimize.show_options(solver='root', method='hybr') lists what each solver accepts.
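The solver returns a scipy.optimize.OptimizeResult object, which carries more diagnostics than just the root itself. A short sketch of inspecting it for this problem:

```python
import numpy as np
from scipy import optimize

def f(x):
    return np.cos(x) - x**3

result = optimize.root(f, x0=0.5, method='hybr', options={'xtol': 1e-10})

# The OptimizeResult object exposes useful diagnostics.
print("Root:", result.x[0])
print("Residual:", result.fun[0])
print("Function evaluations:", result.nfev)
print("Status message:", result.message)
```

Checking result.nfev and result.message is a quick way to tell whether your tolerance settings actually changed the solver’s behavior.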

5 – Use Jacobian for Faster Convergence

Providing the Jacobian matrix (the matrix of all first-order partial derivatives) can significantly improve performance:

import numpy as np
from scipy import optimize

def f(x):
    return [x[0]**2 + x[1]**2 - 4, x[0] * x[1] - 1]

# Define the Jacobian matrix
def jacobian(x):
    return [
        [2*x[0], 2*x[1]],
        [x[1], x[0]]
    ]

result = optimize.root(f, [1, 1], jac=jacobian, method='hybr')

print("Roots:", result.x)
print("Success:", result.success)

In this example, we’re finding the intersection points of a circle (x² + y² = 4) and a hyperbola (xy = 1). By providing the Jacobian, we help the algorithm converge more efficiently.
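One way to see the benefit is to compare function-evaluation counts with and without the analytic Jacobian; without it, hybr approximates the Jacobian by finite differences, which costs extra evaluations. A sketch (exact counts will vary by problem):

```python
import numpy as np
from scipy import optimize

def f(x):
    return [x[0]**2 + x[1]**2 - 4, x[0] * x[1] - 1]

def jacobian(x):
    return [[2*x[0], 2*x[1]], [x[1], x[0]]]

without_jac = optimize.root(f, [1, 1], method='hybr')
with_jac = optimize.root(f, [1, 1], jac=jacobian, method='hybr')

# Without jac, each Jacobian approximation costs extra f-evaluations.
print("Evaluations without Jacobian:", without_jac.nfev)
print("Evaluations with Jacobian:   ", with_jac.nfev)
```

For small systems the difference is modest, but for expensive functions or many variables, the savings add up quickly.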

Real-World Example: Find Loan Payment Amount

Let’s solve a real-world problem: determining the monthly payment amount for a loan with compound interest.

import numpy as np
from scipy import optimize

# Parameters
principal = 250000  # $250,000 home loan
annual_rate = 0.05  # 5% annual interest
years = 30          # 30-year mortgage
total_payments = years * 12  # Total number of payments

def payment_equation(payment):
    # This equation equals zero when the payment is correct
    monthly_rate = annual_rate / 12
    return principal - payment * ((1 - (1 + monthly_rate)**-total_payments) / monthly_rate)

# Find the monthly payment
result = optimize.root(payment_equation, x0=1000)  # Initial guess of $1000

monthly_payment = result.x[0]
print(f"Monthly payment for a ${principal:,} loan at {annual_rate:.0%} for {years} years: ${monthly_payment:.2f}")

This calculates that a $250,000 mortgage at 5% interest over 30 years requires a monthly payment of approximately $1,342.05.
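As a sanity check, this particular equation also has a closed-form solution: the standard amortization formula M = P·r / (1 − (1 + r)⁻ⁿ). Evaluating it directly should match the numeric root:

```python
# The standard amortization formula gives the payment in closed form,
# which serves as a sanity check on the numerical result.
principal = 250000
monthly_rate = 0.05 / 12
n = 30 * 12

closed_form = principal * monthly_rate / (1 - (1 + monthly_rate)**-n)
print(f"Closed-form monthly payment: ${closed_form:.2f}")
```

Root finding earns its keep on equations that lack such a closed form, but when one exists it’s a cheap way to validate your setup.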

Practical Tips for Using scipy.optimize.root

Based on my experience, here are some practical tips for getting the most out of SciPy’s root-finding capabilities:

  1. Choose a good initial guess: The closer your initial guess is to the actual root, the faster and more likely the algorithm will converge.
  2. Try different methods: If one method fails, try another. Some methods work better for certain types of problems.
  3. Use the Jacobian: When possible, provide the Jacobian matrix for faster and more reliable convergence.
  4. Check the success flag: Always verify that result.success is True before using the results.
  5. Handle multiple roots: Many functions have multiple roots. Try different initial guesses to find different roots.
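Tip 5 can be sketched with np.sin, whose roots sit at every integer multiple of π; the initial guess determines which root the solver converges to:

```python
import numpy as np
from scipy import optimize

# sin(x) = 0 at every integer multiple of pi; each initial guess
# converges to the root nearest to it.
roots = []
for guess in [1.0, 3.0, 6.0]:
    result = optimize.root(np.sin, x0=guess)
    roots.append(result.x[0])
    print(f"Guess {guess} -> root {result.x[0]:.6f}")
```

Sweeping a grid of initial guesses like this is a simple, practical way to map out all the roots in a region of interest.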

SciPy’s optimize.root function is an incredibly powerful tool for solving equations numerically. Whether you’re working with simple single-variable equations or complex systems, these methods can help you find solutions efficiently.

I hope you found this article helpful. If you have any questions or suggestions, feel free to leave them in the comments below.
