Recently, I was working on a data analysis project that required calculating the rate of change in temperature measurements across different U.S. cities. The challenge was finding an efficient way to compute derivatives for large arrays of time-series data.
SciPy came to the rescue with its powerful numerical differentiation capabilities. If you’re dealing with scientific computing in Python, understanding how to calculate derivatives of arrays is essential.
In this article, I’ll show you several practical methods to compute derivatives of arrays using SciPy, with real-world examples and efficient techniques I’ve refined over my decade of Python development.
Work with Derivatives in SciPy
Before getting into the code, let’s clarify what we mean by “derivative of an array.” In numerical computing, we’re typically looking at how quickly values change with respect to their indices or another variable.
SciPy offers multiple approaches to calculating derivatives:
- Using scipy.misc.derivative for function derivatives
- Using scipy.interpolate with splines for array data
- Using finite difference methods with NumPy
- Leveraging scipy.signal for derivative approximations
Let’s explore each method with practical examples.
Method 1: Use scipy.misc.derivative
The scipy.misc.derivative() function was long the go-to way to find the derivative of a Python function at a specific point. Note, however, that scipy.misc.derivative is deprecated and was removed in SciPy 1.12, so the example below implements the same central-difference formula manually; it works on any SciPy version.
import numpy as np
import matplotlib.pyplot as plt
# A function representing daily temperature in New York (simplified model)
def temp_function(x):
# x represents days (0-365)
# Returns temperature in Fahrenheit
return 50 + 25 * np.sin((x - 30) * 2 * np.pi / 365)
# Calculate the derivative at day 180 using central difference manually
def numerical_derivative(f, x, dx=1e-6):
return (f(x + dx) - f(x - dx)) / (2 * dx)
day = 180
derivative = numerical_derivative(temp_function, day)
print(f"Temperature change rate at day {day}: {derivative:.2f}°F per day")
# Visualize the function and its derivative
days = np.linspace(0, 365, 1000)
temps = temp_function(days)
# Use np.gradient for vectorized derivative
derivatives = np.gradient(temps, days)
plt.figure(figsize=(12, 6))
plt.subplot(2, 1, 1)
plt.plot(days, temps)
plt.title("Daily Temperature in New York")
plt.ylabel("Temperature (°F)")
plt.subplot(2, 1, 2)
plt.plot(days, derivatives)
plt.title("Rate of Temperature Change")
plt.ylabel("°F per day")
plt.xlabel("Day of Year")
plt.tight_layout()
plt.show()
I executed the above example code and added the screenshot below.

The dx parameter controls the step size for numerical differentiation. A smaller value typically gives more accurate results, but can lead to numerical instability if too small.
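You can see this trade-off directly by comparing step sizes against a function whose derivative is known analytically. A minimal sketch (the helper name is illustrative):

```python
import numpy as np

def central_diff_at(f, x, dx):
    # Two-point central difference: truncation error shrinks like dx**2,
    # but round-off error grows as dx approaches machine precision
    return (f(x + dx) - f(x - dx)) / (2 * dx)

# Known answer: d/dx sin(x) = cos(x)
true_value = np.cos(1.0)
errors = {dx: abs(central_diff_at(np.sin, 1.0, dx) - true_value)
          for dx in (1e-1, 1e-4, 1e-12)}
for dx, err in errors.items():
    print(f"dx={dx:>6.0e}  error={err:.2e}")
```

The moderate step (1e-4) beats both the coarse step (truncation error dominates) and the tiny step (floating-point round-off dominates).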
You can also calculate higher-order derivatives by setting the n parameter:
# Calculate second derivative (acceleration of temperature change)
from scipy import misc  # deprecated; removed in SciPy 1.12

second_derivative = misc.derivative(temp_function, day, dx=1e-6, n=2)
print(f"Second derivative at day {day}: {second_derivative:.4f}°F per day²")
Method 2: Derivatives of Array Data using Interpolation
When working with actual measurement data (not a function), we can use SciPy’s interpolation capabilities to compute derivatives.
import numpy as np
from scipy import interpolate
import matplotlib.pyplot as plt
# Weather data: days and corresponding temperatures for Chicago
days = np.array([0, 30, 60, 90, 120, 150, 180, 210, 240, 270, 300, 330, 360])
temps = np.array([32, 35, 45, 58, 70, 80, 85, 83, 75, 62, 48, 38, 33])
# Create a spline interpolation
tck = interpolate.splrep(days, temps, s=0)
# New x-axis for smooth curve
days_fine = np.linspace(0, 360, 1000)
# Get interpolated temperatures
temps_smooth = interpolate.splev(days_fine, tck, der=0)
# Get first derivative
temps_derivative = interpolate.splev(days_fine, tck, der=1)
# Plotting
plt.figure(figsize=(12, 6))
plt.subplot(2, 1, 1)
plt.plot(days, temps, 'o', label='Data points')
plt.plot(days_fine, temps_smooth, label='Interpolated')
plt.title('Chicago Temperature Throughout the Year')
plt.ylabel('Temperature (°F)')
plt.legend()
plt.subplot(2, 1, 2)
plt.plot(days_fine, temps_derivative)
plt.title('Rate of Temperature Change')
plt.xlabel('Day of Year')
plt.ylabel('°F per day')
plt.tight_layout()
plt.show()
I executed the above example code and added the screenshot below.

The splev function’s der parameter lets you specify which derivative to compute (0 for the function itself, 1 for the first derivative, etc.).
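An alternative to the splrep/splev pair is the object-oriented scipy.interpolate.CubicSpline, whose derivative() method returns a new spline you can evaluate at any point. A small sketch with hypothetical seasonal temperature values:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical temperatures sampled at a handful of days
days = np.array([0, 90, 180, 270, 360])
temps = np.array([32.0, 58.0, 85.0, 62.0, 33.0])

cs = CubicSpline(days, temps)
dcs = cs.derivative()       # a new spline representing dT/d(day)
d2cs = cs.derivative(2)     # second derivative works the same way

# Evaluate the temperature and its rate of change at any day
print(cs(45.0), dcs(45.0))
```

Because the derivative is itself a spline object, you can evaluate it on a fine grid, find its roots with dcs.roots(), or differentiate again.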
Method 3: Finite Difference Methods
For direct array differentiation, finite difference methods provide a straightforward approach:
import numpy as np
import matplotlib.pyplot as plt
# S&P 500 daily closing prices (simplified example)
days = np.arange(30) # 30 trading days
sp500_values = np.array([
4200, 4210, 4205, 4220, 4235, 4245, 4240, 4250, 4265, 4280,
4275, 4290, 4310, 4305, 4315, 4325, 4330, 4345, 4360, 4350,
4370, 4390, 4385, 4400, 4410, 4405, 4415, 4430, 4450, 4460
])
# Central difference method for first derivative
def central_diff(y, dx=1.0):
return np.gradient(y, dx)
# Forward difference method
def forward_diff(y, dx=1.0):
return np.diff(y) / dx
# Calculate derivatives
central_derivatives = central_diff(sp500_values)
forward_derivatives = forward_diff(sp500_values)
# Plotting
plt.figure(figsize=(12, 8))
plt.subplot(3, 1, 1)
plt.plot(days, sp500_values)
plt.title('S&P 500 Daily Closing Values')
plt.ylabel('Index Value')
plt.subplot(3, 1, 2)
plt.plot(days, central_derivatives)
plt.title('Rate of Change (Central Difference)')
plt.ylabel('Points per day')
plt.subplot(3, 1, 3)
plt.plot(days[:-1], forward_derivatives)
plt.title('Rate of Change (Forward Difference)')
plt.xlabel('Trading Day')
plt.ylabel('Points per day')
plt.tight_layout()
plt.show()
I executed the above example code and added the screenshot below.

NumPy’s gradient function implements the central difference method, which is generally more accurate than forward or backward differences.
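np.gradient also accepts a coordinate array, which makes it handle unevenly spaced samples correctly — useful when trading days have weekend or holiday gaps. A minimal check on a linear signal:

```python
import numpy as np

# Irregular sample times, e.g. trading days with gaps
t = np.array([0.0, 1.0, 2.0, 4.0, 7.0, 8.0])
y = 3.0 * t + 2.0            # linear signal: true derivative is 3 everywhere

# Passing the coordinate array makes np.gradient use the actual spacing
dy_dt = np.gradient(y, t)
print(dy_dt)                 # ≈ [3. 3. 3. 3. 3. 3.]
```

If you passed a scalar spacing of 1.0 here instead, the gaps at t = 2→4 and 4→7 would silently distort the estimated rate of change.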
Method 4: Use SciPy Signal Processing
For noisy data, SciPy’s signal processing tools can be valuable:
import numpy as np
from scipy import signal
import matplotlib.pyplot as plt
# Generate noisy sensor data (e.g., pollution levels in Los Angeles)
days = np.linspace(0, 30, 300) # 30 days with 10 measurements per day
true_trend = 50 + 20 * np.sin(days * 2 * np.pi / 30) # Underlying trend
noise = np.random.normal(0, 5, size=len(days)) # Measurement noise
pollution_levels = true_trend + noise
# Smooth the data with a Savitzky-Golay filter
window_length = 51 # Must be odd
polynomial_order = 3
smoothed = signal.savgol_filter(pollution_levels, window_length, polynomial_order)
# Calculate the derivative (delta converts the result from per-sample
# to per-day units; without it, savgol_filter differentiates with
# respect to the sample index)
derivative = signal.savgol_filter(pollution_levels, window_length, polynomial_order,
                                  deriv=1, delta=days[1] - days[0])
# Plotting
plt.figure(figsize=(12, 8))
plt.subplot(3, 1, 1)
plt.plot(days, pollution_levels, 'gray', alpha=0.5, label='Noisy data')
plt.plot(days, smoothed, 'r', label='Smoothed')
plt.title('LA Pollution Levels (PM2.5)')
plt.ylabel('µg/m³')
plt.legend()
plt.subplot(3, 1, 2)
plt.plot(days, true_trend, 'g', label='True trend')
plt.plot(days, smoothed, 'r', label='Recovered trend')
plt.title('True vs. Recovered Trend')
plt.ylabel('µg/m³')
plt.legend()
plt.subplot(3, 1, 3)
plt.plot(days, derivative)
plt.title('Rate of Change in Pollution Levels')
plt.xlabel('Day')
plt.ylabel('µg/m³ per day')
plt.tight_layout()
plt.show()
The Savitzky-Golay filter is especially powerful as it can simultaneously smooth data and calculate derivatives by setting the deriv parameter.
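One detail worth knowing from savgol_filter’s documented behavior: with deriv=1 the result is a per-sample derivative unless you also pass delta, the spacing between samples. A quick check against a signal with a known derivative:

```python
import numpy as np
from scipy.signal import savgol_filter

# Known signal: d/dx sin(x) = cos(x)
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x)
dx = x[1] - x[0]

# delta=dx converts the per-sample derivative into per-x-unit values
dy = savgol_filter(y, window_length=21, polyorder=3, deriv=1, delta=dx)

# Away from the edges this closely tracks cos(x)
err = np.max(np.abs(dy[30:-30] - np.cos(x[30:-30])))
print(f"max interior error: {err:.2e}")
```

Forgetting delta is a common source of derivatives that are off by a constant factor equal to the sample spacing.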
Method 5: Compute Derivatives of 2D Arrays
For spatial data like temperature maps, we need to calculate derivatives in multiple dimensions:
import numpy as np
import matplotlib.pyplot as plt
from scipy import ndimage
# Create a 2D temperature map (e.g., US temperature grid)
x = np.linspace(-5, 5, 100)
y = np.linspace(-5, 5, 100)
X, Y = np.meshgrid(x, y)
Z = 70 + 20 * np.exp(-0.1 * (X**2 + Y**2)) # Temperature in Fahrenheit
# Approximate gradients with Sobel filters in the x and y directions
# (note: the Sobel operator both smooths and differentiates, and scales
# the result, so it highlights where values change fastest rather than
# giving calibrated °F-per-unit derivatives)
dx = ndimage.sobel(Z, axis=1)
dy = ndimage.sobel(Z, axis=0)
# Magnitude of the gradient
gradient_magnitude = np.sqrt(dx**2 + dy**2)
# Plotting
plt.figure(figsize=(15, 5))
plt.subplot(1, 3, 1)
plt.imshow(Z, cmap='inferno')
plt.title('Temperature Map (°F)')
plt.colorbar(label='Temperature (°F)')
plt.subplot(1, 3, 2)
plt.imshow(dx, cmap='coolwarm')
plt.title('East-West Temperature Gradient')
plt.colorbar(label='Rate of change (°F/unit)')
plt.subplot(1, 3, 3)
plt.imshow(gradient_magnitude, cmap='viridis')
plt.title('Gradient Magnitude')
plt.colorbar(label='°F/unit')
plt.tight_layout()
plt.show()
This approach using scipy.ndimage is particularly useful for image processing, geographical data, or any 2D array data where you need to calculate spatial derivatives.
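When you need gradients in calibrated physical units rather than Sobel’s scaled response, np.gradient works on 2D arrays too, returning one derivative array per axis. A sketch on a plane with known slopes:

```python
import numpy as np

# A plane with known slopes: T = 70 + 2x + 3y (°F per grid unit)
x = np.linspace(0.0, 9.0, 10)
y = np.linspace(0.0, 9.0, 10)
X, Y = np.meshgrid(x, y)
Z = 70 + 2 * X + 3 * Y

# np.gradient returns one array per axis, in physical units:
# axis 0 varies with y (rows), axis 1 with x (columns)
dT_dy, dT_dx = np.gradient(Z, y, x)
print(dT_dx[0, 0], dT_dy[0, 0])   # ≈ 2.0 3.0
```

Recovering exactly 2 °F/unit east-west and 3 °F/unit north-south confirms the units, something the raw Sobel output does not give you.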
Compute Second Derivatives Using SciPy
Sometimes we need to calculate the second derivative (rate of change of the rate of change). This is useful for analyzing acceleration, curvature, or inflection points:
from scipy import misc  # note: scipy.misc.derivative is deprecated and removed in SciPy 1.12
import numpy as np
import matplotlib.pyplot as plt
# Function representing monthly sales of a seasonal product
def sales_function(x):
# x is month (0-11)
# Base sales + seasonal component
return 1000 + 500 * np.sin((x - 3) * 2 * np.pi / 12) + 50 * x
# Calculate first and second derivatives
months = np.linspace(0, 11, 100)
sales = [sales_function(m) for m in months]
first_deriv = [misc.derivative(sales_function, m, dx=1e-6, n=1) for m in months]
second_deriv = [misc.derivative(sales_function, m, dx=1e-6, n=2) for m in months]
# Plotting
plt.figure(figsize=(12, 8))
plt.subplot(3, 1, 1)
plt.plot(months, sales)
plt.title('Monthly Product Sales')
plt.ylabel('Sales ($)')
plt.subplot(3, 1, 2)
plt.plot(months, first_deriv)
plt.title('Rate of Change in Sales')
plt.ylabel('$/month')
plt.subplot(3, 1, 3)
plt.plot(months, second_deriv)
plt.title('Acceleration of Sales (Second Derivative)')
plt.xlabel('Month')
plt.ylabel('$/month²')
plt.tight_layout()
plt.show()
The second derivative helps identify where sales growth is accelerating or decelerating, which is valuable for inventory planning and marketing strategy.
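One practical use of the second derivative’s sign is locating inflection points. A minimal sketch on a sampled curve with a known inflection at x = 0:

```python
import numpy as np

# y = x**3 has a single inflection point at x = 0
x = np.linspace(-2, 2, 400)   # grid chosen so x = 0 falls between samples
y = x ** 3

# Apply np.gradient twice to get a numerical second derivative
second = np.gradient(np.gradient(y, x), x)

# An inflection point lies between samples where the sign flips
flips = np.where(np.sign(second[:-1]) * np.sign(second[1:]) < 0)[0]
print(x[flips])               # one value very close to 0
```

The same sign-flip scan applied to the sales curve above would bracket the month where growth switches from accelerating to decelerating.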
Performance Considerations
When working with large arrays, performance becomes critical. Here’s a comparison of different methods:
import numpy as np
import time
from scipy import interpolate, signal
# Generate a large array
x = np.linspace(0, 10, 10000)
y = np.sin(x) * np.exp(-0.1*x)
# Method 1: NumPy gradient
start = time.time()
d1 = np.gradient(y, x)
time1 = time.time() - start
print(f"NumPy gradient time: {time1:.5f} seconds")
# Method 2: SciPy interpolate
start = time.time()
tck = interpolate.splrep(x, y, s=0)
d2 = interpolate.splev(x, tck, der=1)
time2 = time.time() - start
print(f"SciPy interpolate time: {time2:.5f} seconds")
# Method 3: SciPy signal Savitzky-Golay
start = time.time()
d3 = signal.savgol_filter(y, 51, 3, deriv=1, delta=x[1]-x[0])
time3 = time.time() - start
print(f"SciPy signal Savitzky-Golay time: {time3:.5f} seconds")
# Method 4: Finite difference
start = time.time()
d4 = np.diff(y) / np.diff(x)
# Add padding to match original size
d4 = np.concatenate(([d4[0]], d4))
time4 = time.time() - start
print(f"Finite difference time: {time4:.5f} seconds")
In my experience, NumPy’s gradient and finite differences are generally fastest, while interpolation methods are more accurate but slower. The Savitzky-Golay filter strikes a good balance for noisy data.
Real-World Example: Analyze Stock Market Data
Let’s apply derivative calculations to analyze real-world financial data:
import numpy as np
import matplotlib.pyplot as plt
from scipy import signal
# Simulated daily S&P 500 closing prices over 2 years (simplified)
trading_days = np.arange(500) # ~2 years of trading days
base_trend = 3000 + trading_days * 2 # Upward trend
seasonal = 150 * np.sin(trading_days * 2 * np.pi / 252) # Annual cycle
volatility = np.cumsum(np.random.normal(0, 15, size=len(trading_days))) # Random walk component
sp500_prices = base_trend + seasonal + volatility
# Calculate first derivative (daily price change)
daily_change = np.gradient(sp500_prices)
# Calculate second derivative (acceleration of price changes)
acceleration = np.gradient(daily_change)
# Calculate 20-day moving average of price
ma20 = np.convolve(sp500_prices, np.ones(20)/20, mode='valid')
# Pad to match original length
ma20_padded = np.pad(ma20, (19, 0), 'edge')
# Plot the results
plt.figure(figsize=(14, 10))
plt.subplot(3, 1, 1)
plt.plot(trading_days, sp500_prices, label='S&P 500')
plt.plot(trading_days, ma20_padded, 'r', label='20-day MA')
plt.title('S&P 500 Index (Simulated)')
plt.ylabel('Price ($)')
plt.legend()
plt.subplot(3, 1, 2)
plt.plot(trading_days, daily_change)
plt.axhline(y=0, color='r', linestyle='-', alpha=0.3)
plt.title('Daily Price Change (First Derivative)')
plt.ylabel('Change ($/day)')
plt.subplot(3, 1, 3)
plt.plot(trading_days, acceleration)
plt.axhline(y=0, color='r', linestyle='-', alpha=0.3)
plt.title('Price Change Acceleration (Second Derivative)')
plt.xlabel('Trading Day')
plt.ylabel('Acceleration ($/day²)')
plt.tight_layout()
plt.show()
This analysis helps identify momentum shifts in the market. When the second derivative (acceleration) changes sign, it often indicates potential trend reversals that might not be immediately obvious from the price chart alone.
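That sign-change idea can be made concrete with a few lines of NumPy. The helper name and toy price path below are illustrative, not real market data:

```python
import numpy as np

def sign_change_indices(series):
    """Indices i where series[i] and series[i+1] have opposite signs."""
    s = np.sign(series)
    return np.where(s[:-1] * s[1:] < 0)[0]

# Toy price path: growth speeds up, slows down, then speeds up again
prices = np.array([100., 102., 105., 109., 112., 114., 115., 117., 120., 124.])
acceleration = np.gradient(np.gradient(prices))
turning = sign_change_indices(acceleration)
print(turning)    # indices of candidate momentum shifts
```

On real, noisy prices you would smooth first (e.g., with the Savitzky-Golay filter from Method 4) before scanning for sign changes, or every wiggle will register as a shift.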
Handle Edge Effects
One challenge with derivatives is edge effects, especially at the boundaries of your data:
import numpy as np
import matplotlib.pyplot as plt
# Generate sample data
x = np.linspace(0, 10, 100)
y = np.sin(x)
# Different derivative methods
gradient = np.gradient(y, x)
diff = np.diff(y) / np.diff(x)
diff_padded = np.concatenate(([diff[0]], diff)) # pad to match original size
# True derivative (analytical)
true_deriv = np.cos(x)
# Plot to compare
plt.figure(figsize=(10, 6))
plt.plot(x, true_deriv, 'k-', label='True derivative (cos(x))')
plt.plot(x, gradient, 'r--', label='np.gradient')
plt.plot(x, diff_padded, 'g:', label='np.diff (padded)')
plt.title('Comparison of Derivative Methods')
plt.xlabel('x')
plt.ylabel('Derivative')
plt.legend()
# Zoom in on edge
plt.figure(figsize=(10, 6))
plt.plot(x[:10], true_deriv[:10], 'k-', label='True derivative')
plt.plot(x[:10], gradient[:10], 'r--', label='np.gradient')
plt.plot(x[:10], diff_padded[:10], 'g:', label='np.diff (padded)')
plt.title('Edge Effects in Derivative Calculation')
plt.xlabel('x')
plt.ylabel('Derivative')
plt.legend()
plt.grid(True)
plt.show()
You can see that np.gradient typically handles edges better than simple differences, as it uses a central difference method where possible and switches to forward/backward differences at the edges.
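NumPy’s gradient also exposes an edge_order parameter controlling the one-sided formulas used at the boundaries; edge_order=2 uses a second-order scheme that can sharply reduce the boundary error. A sketch on a quadratic, where the second-order formula is exact:

```python
import numpy as np

# A quadratic: its true derivative at x = 0 is exactly 0
x = np.linspace(0, 10, 100)
y = x ** 2

g1 = np.gradient(y, x, edge_order=1)  # first-order one-sided at the ends (default)
g2 = np.gradient(y, x, edge_order=2)  # second-order one-sided at the ends

# The boundary error drops dramatically with edge_order=2
print(abs(g1[0]), abs(g2[0]))
```

For small datasets, where the boundary points are a large fraction of the data, this one keyword argument can be the difference between usable and misleading edge derivatives.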
Working with derivatives of arrays is a powerful technique in numerical analysis, whether you’re analyzing financial data, scientific measurements, or signal processing. SciPy provides multiple approaches, each with its strengths:
- Use np.gradient for simple, efficient derivatives of well-behaved data
- Use interpolation techniques when you need higher accuracy
- Use Savitzky-Golay filters when dealing with noisy data
- Consider edge effects, especially when analyzing small datasets
Over my years of Python development, I’ve found that choosing the right differentiation method depends heavily on your specific data characteristics and application requirements.
Remember that numerical differentiation amplifies noise, so preprocessing your data with appropriate smoothing techniques is often essential for meaningful results.
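That noise amplification is easy to quantify. The sketch below (noise level, seed, and filter settings are illustrative) compares the raw gradient of a noisy sine with a Savitzky-Golay derivative:

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 500)
dx = x[1] - x[0]
noisy = np.sin(x) + rng.normal(0, 0.05, x.size)   # true derivative is cos(x)

raw_deriv = np.gradient(noisy, x)                  # differentiates the noise too
smooth_deriv = savgol_filter(noisy, 51, 3, deriv=1, delta=dx)

raw_err = np.std(raw_deriv - np.cos(x))
smooth_err = np.std(smooth_deriv - np.cos(x))
print(raw_err, smooth_err)   # the raw derivative's error is far larger
```

Dividing a small noisy difference by a small spacing multiplies the noise, which is why smoothing and differentiating in one step (as Savitzky-Golay does) usually wins on real measurements.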
I am Bijay Kumar, a Microsoft MVP in SharePoint. Apart from SharePoint, I have been working on Python, machine learning, and artificial intelligence for the last 5 years. During this time I gained expertise in various Python libraries such as Tkinter, Pandas, NumPy, Turtle, Django, Matplotlib, TensorFlow, SciPy, Scikit-Learn, etc., for various clients in the United States, Canada, the United Kingdom, Australia, New Zealand, and other countries. Check out my profile.