I was once working on an optimization problem where I needed to find the global minimum of a complex function of several variables. Traditional optimization methods kept getting stuck in local minima, and I needed something more robust. That’s when I turned to SciPy’s differential evolution algorithm.
In this article, I’ll walk you through how to use SciPy’s differential evolution for optimization problems, with practical examples that you can apply to your projects.
So let’s get into it!
What is Differential Evolution?
Differential Evolution (DE) is a powerful evolutionary algorithm designed to optimize real-parameter, multi-modal functions. Unlike gradient-based methods, DE doesn’t require derivative information, making it ideal for complex optimization problems.
The algorithm works by maintaining a population of candidate solutions and creating new candidate solutions by combining existing ones. It then keeps the best solutions and removes the others, gradually improving the quality of solutions over generations.
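To make the mutate-recombine-select loop concrete, here is a minimal NumPy sketch of a single DE generation on a toy sphere function. This is a simplified illustration of the idea (bound handling and stopping criteria omitted), not SciPy's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    return np.sum(x**2)

# Population of 10 candidate 3-dimensional solutions in [-5, 5]
pop = rng.uniform(-5, 5, size=(10, 3))
fitness = np.array([sphere(ind) for ind in pop])

F, CR = 0.8, 0.7  # mutation and recombination constants
new_pop = pop.copy()
for i in range(len(pop)):
    # Mutation: combine three distinct other members as a + F * (b - c)
    a, b, c = pop[rng.choice([j for j in range(len(pop)) if j != i], 3, replace=False)]
    mutant = a + F * (b - c)
    # Crossover: mix mutant and current member, parameter by parameter
    cross = rng.random(3) < CR
    trial = np.where(cross, mutant, pop[i])
    # Selection: keep the trial only if it is at least as good
    if sphere(trial) <= fitness[i]:
        new_pop[i] = trial
```

Because selection only ever replaces a member with an equal or better trial, the best fitness in the population can never get worse from one generation to the next.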
Install SciPy
Before we begin, make sure you have SciPy installed:
pip install scipy
Basic Usage of Differential Evolution in SciPy
Let’s start with a simple example to understand how differential evolution works in SciPy:
from scipy.optimize import differential_evolution
import numpy as np
# Define the objective function to minimize
def objective_function(x):
return x[0]**2 + x[1]**2 + x[2]**2
# Define the bounds for each variable
bounds = [(-5, 5), (-5, 5), (-5, 5)]
# Run differential evolution
result = differential_evolution(objective_function, bounds)
# Print the results
print("Best solution:", result.x)
print("Function value at best solution:", result.fun)
print("Number of function evaluations:", result.nfev)
print("Convergence status:", result.success)
print("Convergence message:", result.message)
Output:
Best solution: [0. 0. 0.]
Function value at best solution: 0.0
Number of function evaluations: 6394
Convergence status: True
Convergence message: Optimization terminated successfully.

In this example, we’re minimizing a simple quadratic function with three variables. The global minimum is at (0, 0, 0) with a function value of 0.
Set Bounds for Variables
One of the key requirements when using differential evolution is defining the bounds for each variable. This tells the algorithm the range within which to search for solutions.
There are two ways to define bounds:
- As a list of (min, max) tuples, one pair per variable:
bounds = [(-5, 5), (-10, 10), (0, 15)]
- As a scipy.optimize.Bounds instance, holding arrays of lower and upper limits:
bounds = Bounds([-5, -10, 0], [5, 10, 15])
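As a quick check, bounds can also be passed as a scipy.optimize.Bounds instance; a minimal sketch with asymmetric per-variable limits:

```python
import numpy as np
from scipy.optimize import differential_evolution, Bounds

def objective_function(x):
    return x[0]**2 + x[1]**2 + x[2]**2

# Lower limits for all three variables, then upper limits
bounds = Bounds([-5, -10, 0], [5, 10, 15])

result = differential_evolution(objective_function, bounds, seed=42)
print(result.x)  # close to [0, 0, 0]
```

Note that the third variable's minimum lies on its lower bound (0), which the algorithm handles without any special treatment.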
Advanced Configuration Options
SciPy’s differential evolution offers several parameters to customize the algorithm’s behavior:
result = differential_evolution(
objective_function,
bounds,
strategy='best1bin', # Mutation strategy
maxiter=1000, # Maximum number of generations
popsize=15, # Population size multiplier
tol=0.01, # Relative tolerance for convergence
mutation=(0.5, 1), # Mutation constant
recombination=0.7, # Recombination constant
seed=42, # Random seed for reproducibility
workers=-1 # Number of processes (-1 means all CPUs)
)
Let me explain some of these parameters:
- strategy: Controls how the mutation is performed. Options include ‘best1bin’, ‘best1exp’, ‘rand1exp’, ‘randtobest1exp’, ‘best2exp’, ‘rand2exp’, ‘randtobest1bin’, ‘best2bin’, ‘rand2bin’, ‘rand1bin’.
- maxiter: Maximum number of generations.
- popsize: Population size multiplier. The actual population size is popsize * len(x).
- mutation: The mutation constant, or a tuple (min, max) for adaptive mutation (dithering).
- recombination: The recombination constant, which must be between 0 and 1.
- workers: Number of processes to use for parallel evaluation. Use -1 to use all available CPU cores.
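One practical point worth demonstrating: because DE is a stochastic algorithm, fixing seed is what makes runs reproducible, which matters when you are comparing parameter settings against each other. A small sketch:

```python
import numpy as np
from scipy.optimize import differential_evolution

def objective_function(x):
    return x[0]**2 + x[1]**2

bounds = [(-5, 5), (-5, 5)]

# Two runs with identical settings and the same seed
r1 = differential_evolution(objective_function, bounds, seed=42, strategy='best1bin')
r2 = differential_evolution(objective_function, bounds, seed=42, strategy='best1bin')

# Same seed, same trajectory: identical solutions and evaluation counts
print(np.allclose(r1.x, r2.x))
```

Without a fixed seed, two runs can return slightly different solutions and evaluation counts, which makes A/B comparisons of strategy or mutation settings unreliable.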
Use Constraints
Sometimes, you need to add constraints to your optimization problem. SciPy’s differential evolution supports linear and nonlinear constraints:
import numpy as np
from scipy.optimize import differential_evolution, Bounds, LinearConstraint, NonlinearConstraint
# Define the objective function
def objective_function(x):
return x[0]**2 + x[1]**2
# Define variable bounds
bounds = Bounds([0, 0], [10, 10])
# Define a linear constraint: 2*x[0] + x[1] >= 1
linear_constraint = LinearConstraint([[2, 1]], [1], [np.inf])
# Define a nonlinear constraint: x[0]**2 + x[1] <= 5
def nonlinear_constraint_func(x):
return x[0]**2 + x[1]
nonlinear_constraint = NonlinearConstraint(nonlinear_constraint_func, -np.inf, 5)
# Run differential evolution with constraints
result = differential_evolution(
objective_function,
bounds=bounds,
constraints=(linear_constraint, nonlinear_constraint)
)
print("Best solution:", result.x)
print("Function value at best solution:", result.fun)
Output:
Best solution: [0.40000183 0.20001235]
Function value at best solution: 0.20000640108583126

Real-World Example: Portfolio Optimization
Let’s consider a more practical example: portfolio optimization. Imagine we have five stocks and want to determine the optimal allocation to maximize the Sharpe ratio (return per unit of risk):
import numpy as np
from scipy.optimize import differential_evolution
# Historical annual returns for 5 stocks (e.g., AAPL, MSFT, AMZN, GOOGL, META)
mean_returns = np.array([0.15, 0.12, 0.25, 0.10, 0.18])
# Covariance matrix of returns
cov_matrix = np.array([
[0.0225, 0.0075, 0.0090, 0.0075, 0.0090],
[0.0075, 0.0144, 0.0075, 0.0060, 0.0075],
[0.0090, 0.0075, 0.0625, 0.0090, 0.0120],
[0.0075, 0.0060, 0.0090, 0.0100, 0.0075],
[0.0090, 0.0075, 0.0120, 0.0075, 0.0324]
])
# Risk-free rate (e.g., 10-year Treasury yield)
risk_free_rate = 0.02
# Objective function to maximize Sharpe ratio (we minimize the negative)
def negative_sharpe_ratio(weights):
weights = np.array(weights)
portfolio_return = np.sum(mean_returns * weights)
portfolio_stddev = np.sqrt(weights.T @ cov_matrix @ weights)
sharpe_ratio = (portfolio_return - risk_free_rate) / portfolio_stddev
return -sharpe_ratio # Negative because we want to maximize
# Constraint: weights must sum to 1. Note that differential_evolution expects
# LinearConstraint/NonlinearConstraint objects, not the dict-style constraints
# used by scipy.optimize.minimize.
from scipy.optimize import LinearConstraint
constraints = LinearConstraint(np.ones((1, 5)), 1, 1)
# Bounds for each weight: between 0 and 1
bounds = [(0, 1) for _ in range(5)]
# Run differential evolution
result = differential_evolution(
negative_sharpe_ratio,
bounds=bounds,
constraints=constraints,
strategy='best1bin',
maxiter=1000,
popsize=15,
tol=0.0001,
mutation=(0.5, 1),
recombination=0.7,
seed=42
)
# Get the optimal weights
optimal_weights = result.x
optimal_sharpe = -result.fun
print("Optimal portfolio weights:")
print(f"AAPL: {optimal_weights[0]:.4f}")
print(f"MSFT: {optimal_weights[1]:.4f}")
print(f"AMZN: {optimal_weights[2]:.4f}")
print(f"GOOGL: {optimal_weights[3]:.4f}")
print(f"META: {optimal_weights[4]:.4f}")
print(f"Optimal Sharpe Ratio: {optimal_sharpe:.4f}")
This example demonstrates how differential evolution can be used to solve a financial optimization problem that might be difficult for traditional optimization methods.
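If constrained DE gives you trouble, an alternative is to fold the weight-sum requirement into the objective as a penalty term and renormalize afterwards. A hedged sketch using the same data as above; the penalty weight of 100 and the small 0.01 lower bound (which keeps portfolio volatility strictly positive) are my own illustrative choices:

```python
import numpy as np
from scipy.optimize import differential_evolution

mean_returns = np.array([0.15, 0.12, 0.25, 0.10, 0.18])
cov_matrix = np.array([
    [0.0225, 0.0075, 0.0090, 0.0075, 0.0090],
    [0.0075, 0.0144, 0.0075, 0.0060, 0.0075],
    [0.0090, 0.0075, 0.0625, 0.0090, 0.0120],
    [0.0075, 0.0060, 0.0090, 0.0100, 0.0075],
    [0.0090, 0.0075, 0.0120, 0.0075, 0.0324],
])
risk_free_rate = 0.02

def penalized_negative_sharpe(weights):
    weights = np.asarray(weights)
    ret = np.sum(mean_returns * weights)
    std = np.sqrt(weights @ cov_matrix @ weights)
    sharpe = (ret - risk_free_rate) / std
    # Penalize any deviation from sum(weights) == 1
    penalty = 100 * (np.sum(weights) - 1) ** 2
    return -sharpe + penalty

bounds = [(0.01, 1)] * 5  # small lower bound keeps std > 0
result = differential_evolution(penalized_negative_sharpe, bounds, seed=42)

# Renormalize so the weights sum exactly to 1
weights = result.x / np.sum(result.x)
print(weights.round(4))
```

The penalty approach is simpler to set up but only enforces the constraint approximately, which is why the final renormalization step matters.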
Performance Tips for Differential Evolution
Here are some tips to improve the performance of differential evolution:
- Choose appropriate bounds: Narrower bounds can help the algorithm converge faster if you have prior knowledge about where the solution might lie.
- Adjust population size: A larger population size (popsize) can help explore the search space more thoroughly, but will take longer to converge.
- Tune mutation and recombination parameters: The default values work well for many problems, but tuning these parameters can improve performance for specific cases.
- Use parallel processing: Set workers=-1 to use all available CPU cores for function evaluations, which can significantly speed up the process.
- Try different strategies: If you’re not getting good results with the default strategy, try different mutation strategies.
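As a quick illustration of the population-size trade-off, the sketch below runs the same simple problem with a small and a large popsize and compares the number of function evaluations; the exact counts will vary by problem, settings, and seed:

```python
import numpy as np
from scipy.optimize import differential_evolution

def objective_function(x):
    return x[0]**2 + x[1]**2 + x[2]**2

bounds = [(-5, 5)] * 3

# Same problem, two population sizes
small = differential_evolution(objective_function, bounds, popsize=10, seed=1)
large = differential_evolution(objective_function, bounds, popsize=40, seed=1)

print("popsize=10:", small.nfev, "evaluations")
print("popsize=40:", large.nfev, "evaluations")
```

Both runs find the minimum here, but the larger population costs more evaluations per generation; that extra exploration only pays off on harder, multi-modal landscapes.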
Handle Multiple Objectives
While SciPy’s differential evolution is designed for single-objective optimization, you can handle multiple objectives by using techniques like weighted sums or constraints.
Here’s an example using a weighted sum approach:
def multi_objective(x, w1=0.6, w2=0.4):
obj1 = x[0]**2 + x[1]**2 # First objective
obj2 = (x[0]-2)**2 + (x[1]-2)**2 # Second objective
# Weighted sum
return w1 * obj1 + w2 * obj2
bounds = [(-5, 5), (-5, 5)]
result = differential_evolution(multi_objective, bounds)
print("Best solution:", result.x)
Output:
Best solution: [0.8 0.8]

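Sweeping the weights traces out a set of trade-off solutions between the two objectives, which is a simple way to approximate a Pareto front with a single-objective solver. A minimal sketch:

```python
import numpy as np
from scipy.optimize import differential_evolution

def multi_objective(x, w1=0.6, w2=0.4):
    obj1 = x[0]**2 + x[1]**2              # first objective, minimum at (0, 0)
    obj2 = (x[0] - 2)**2 + (x[1] - 2)**2  # second objective, minimum at (2, 2)
    return w1 * obj1 + w2 * obj2

bounds = [(-5, 5), (-5, 5)]

# Each weight combination yields one point on the trade-off curve
for w1 in [1.0, 0.75, 0.5, 0.25, 0.0]:
    res = differential_evolution(multi_objective, bounds, args=(w1, 1 - w1), seed=42)
    print(f"w1={w1:.2f}  ->  x={res.x.round(3)}")
```

As w1 moves from 1 to 0, the solutions slide from (0, 0) toward (2, 2), each one a different compromise between the two objectives.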
Differential Evolution vs Other Optimization Methods
In my experience, differential evolution excels in situations where:
- The objective function is non-differentiable or noisy
- The problem has multiple local minima
- The function is complex, and gradient information is unavailable or unreliable
However, for problems where the objective function is smooth and differentiable, traditional methods like BFGS or L-BFGS-B might converge faster.
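To see the difference in practice, the sketch below compares L-BFGS-B started from a deliberately poor initial guess with differential evolution on the Rastrigin function, a classic multi-modal benchmark whose global minimum is 0 at the origin; the starting point [4.0, 4.0] is my own choice for illustration:

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

def rastrigin(x):
    x = np.asarray(x)
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 2

# Local, gradient-based method from a poor starting point
local = minimize(rastrigin, x0=[4.0, 4.0], method='L-BFGS-B', bounds=bounds)

# Global search with differential evolution
global_ = differential_evolution(rastrigin, bounds, seed=42)

print("L-BFGS-B value:", local.fun)   # usually stuck in a nearby local minimum
print("DE value:      ", global_.fun) # close to the global minimum of 0
```

The local method descends into whichever basin contains its starting point, while DE's population samples the whole search region and converges on the global basin.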
I’ve found that differential evolution is particularly useful for real-world problems like parameter fitting, engineering design optimization, and financial portfolio optimization, where the objective function is complex and has multiple local optima.
Python’s SciPy implementation of differential evolution provides a powerful yet easy-to-use tool for solving these complex optimization problems. With the right parameter settings and problem formulation, it can find global optima for problems that would stump traditional optimization methods.
I am Bijay Kumar, a Microsoft MVP in SharePoint. Apart from SharePoint, I have been working with Python, machine learning, and artificial intelligence for the last five years. During this time I have gained expertise in various Python libraries, such as Tkinter, Pandas, NumPy, Turtle, Django, Matplotlib, TensorFlow, SciPy, Scikit-Learn, etc., working with clients in the United States, Canada, the United Kingdom, Australia, New Zealand, and more. Check out my profile.