I worked on a data analysis project that involved processing time-series data from various sensors. The raw data contained a significant amount of noise, so I needed to filter it, identify peaks, and conduct spectral analysis. To accomplish this, I used the signal module from SciPy, which is a powerful toolkit that makes signal processing in Python both easy and efficient.
In this article, I’ll share practical ways to use SciPy signal for various signal processing tasks. Whether you’re analyzing stock market data, processing audio signals, or working with scientific measurements, these techniques will help you extract meaningful insights from your data.
Let’s get started!
Get Started with SciPy Signal
Before we can use SciPy signal, we need to install it (pip install scipy matplotlib) and import it:
# Import the necessary modules
import numpy as np
from scipy import signal
import matplotlib.pyplot as plt
Method 1 – Filter Signals with SciPy
One of the most common tasks in signal processing is filtering. Let’s say you’re analyzing temperature readings from sensors in different US cities, but the data contains noise:
# Create a sample signal (temperature readings with noise)
time = np.linspace(0, 10, 1000) # 10 hours of data
# Simulating daily temperature variations (in Fahrenheit) with noise
clean_signal = 75 + 10 * np.sin(2 * np.pi * 0.1 * time) # Base temperature around 75°F
noise = 3 * np.random.randn(len(time))
noisy_signal = clean_signal + noise
# Apply a low-pass Butterworth filter
# (order 4, cutoff = 0.1 x the Nyquist frequency)
b, a = signal.butter(4, 0.1, 'low')
filtered_signal = signal.filtfilt(b, a, noisy_signal)
# Plot the results
plt.figure(figsize=(10, 6))
plt.plot(time, noisy_signal, 'b-', alpha=0.5, label='Noisy')
plt.plot(time, filtered_signal, 'r-', linewidth=2, label='Filtered')
plt.plot(time, clean_signal, 'g--', linewidth=1.5, label='Original')
plt.legend(loc='best')
plt.title('Temperature Readings - Raw vs Filtered')
plt.xlabel('Time (hours)')
plt.ylabel('Temperature (°F)')
plt.grid(True)
plt.show()
You can see the output in the screenshot below.

The Butterworth filter is excellent for smoothing data without distorting the underlying signal too much. I find it particularly useful for weather data, stock prices, or any time series that contains unwanted high-frequency noise.
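The example above specifies the cutoff as a fraction of the Nyquist frequency, which is easy to get wrong. If you know your data's real sampling rate, signal.butter also accepts an fs argument so you can give the cutoff in physical units. A minimal sketch, assuming a hypothetical 100 Hz sensor and a 5 Hz cutoff:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)

fs = 100.0    # sampling rate in Hz (assumed)
cutoff = 5.0  # cutoff frequency in Hz (assumed)

# With fs given, the cutoff is in Hz rather than a
# fraction of the Nyquist frequency
b, a = signal.butter(4, cutoff, btype='low', fs=fs)

# Filter a noisy 1 Hz sine sampled at fs
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 1.0 * t) + 0.5 * rng.standard_normal(len(t))
y = signal.filtfilt(b, a, x)

print(len(y))  # same length as the input
```

Writing the cutoff in Hz makes the code self-documenting and avoids manual division by the Nyquist frequency.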
Method 2 – Find Peaks in Signals
Another common task is finding peaks in your signal. For example, if you’re analyzing foot traffic in a New York shopping mall throughout the day:
# Create a sample signal with peaks (foot traffic during store hours)
hours = np.linspace(8, 20, 1000) # Store hours from 8 AM to 8 PM
# Morning rush, lunch peak, and evening rush
traffic = 50 + 100 * np.exp(-0.5 * ((hours - 9) / 0.5) ** 2) + \
150 * np.exp(-0.5 * ((hours - 12) / 0.8) ** 2) + \
200 * np.exp(-0.5 * ((hours - 17.5) / 1) ** 2) + \
10 * np.random.randn(len(hours))
# Find peaks
peaks, properties = signal.find_peaks(traffic, height=150, distance=100)
# Plot the results
plt.figure(figsize=(10, 6))
plt.plot(hours, traffic)
plt.plot(hours[peaks], traffic[peaks], 'ro', markersize=8)
plt.title('Mall Foot Traffic with Peak Detection')
plt.xlabel('Hour of Day')
plt.ylabel('Number of Shoppers')
plt.grid(True)
plt.show()
print(f"Peak shopping times: {hours[peaks].round(1)} hours")
You can see the output in the screenshot below.

The find_peaks function is versatile: you can set minimum heights, required distances between peaks, and even the prominence (how much a peak stands out from the surrounding values).
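To illustrate the prominence and width options on deterministic data (an illustrative sketch, not tied to the mall example), here are two Gaussian bumps where both constraints are satisfied:

```python
import numpy as np
from scipy import signal

# Two clear peaks on a flat baseline (illustrative data)
x = np.linspace(0, 10, 500)
y = (np.exp(-0.5 * ((x - 3) / 0.3) ** 2)
     + 0.6 * np.exp(-0.5 * ((x - 7) / 0.3) ** 2))

# prominence rejects bumps that don't stand out from their
# surroundings; width (in samples) rejects narrow spikes
peaks, props = signal.find_peaks(y, prominence=0.3, width=5)

print(x[peaks])               # approximate peak locations
print(props['prominences'])   # how much each peak stands out
```

The returned properties dictionary also contains the measured widths, which is handy for characterizing peak shape, not just location.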
Method 3 – Signal Resampling
Sometimes your data isn’t sampled at the rate you need. SciPy signal makes resampling easy:
# Original signal: US GDP quarterly data (simplified example)
quarters = np.arange(0, 5, 0.25) # 5 years of quarterly data
gdp_quarterly = 21000 + 500 * quarters + 100 * np.sin(2 * np.pi * quarters) + 50 * np.random.randn(len(quarters))
# Resample to monthly data
months = np.arange(0, 5, 1/12)
gdp_monthly = signal.resample(gdp_quarterly, len(months))
# Plot the results
plt.figure(figsize=(10, 6))
plt.plot(quarters, gdp_quarterly, 'bo-', label='Quarterly Data')
plt.plot(months, gdp_monthly, 'r.-', alpha=0.7, label='Monthly Resampled')
plt.title('US GDP Data Resampling')
plt.xlabel('Years')
plt.ylabel('GDP (billions $)')
plt.legend()
plt.grid(True)
plt.show()
You can see the output in the screenshot below.

Resampling is invaluable when you need to align data from different sources or when converting between different time intervals.
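One caveat: signal.resample is FFT-based and assumes a periodic signal, so trending data like GDP can show artifacts at the ends. For integer or rational rate changes, signal.resample_poly (polyphase filtering) often behaves better at the boundaries. A small sketch with made-up quarterly numbers:

```python
import numpy as np
from scipy import signal

# Hypothetical quarterly series; quarterly -> monthly is a
# 3x upsample, so up=3, down=1
quarterly = np.array([100., 102., 101., 105., 107., 110., 108., 112.])

# Polyphase resampling tends to produce fewer edge artifacts
# than FFT-based resample on non-periodic, trending data
monthly = signal.resample_poly(quarterly, up=3, down=1)

print(len(monthly))  # 3 points per quarter
```

For non-rational rate changes, interpolation (e.g. scipy.interpolate) may be a better fit than either resampling function.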
Method 4 – Spectral Analysis with FFT
Fast Fourier Transform (FFT) helps us analyze the frequency components of a signal. This is useful for finding cyclical patterns:
# Create a signal with multiple frequency components (e.g., electricity consumption)
t = np.linspace(0, 1, 1000) # normalized time axis
# Three sinusoids at different frequencies stand in for daily,
# weekly, and seasonal cycles, plus random noise
daily_cycle = 10 * np.sin(2 * np.pi * 24 * t)
weekly_cycle = 5 * np.sin(2 * np.pi * (24/7) * t)
yearly_seasonal = 3 * np.sin(2 * np.pi * (24/365) * t)
noise = 2 * np.random.randn(len(t))
power_consumption = daily_cycle + weekly_cycle + yearly_seasonal + noise
# Compute FFT
fft = np.fft.fft(power_consumption)
freqs = np.fft.fftfreq(len(power_consumption), t[1] - t[0])
# Plot the power spectrum (only positive frequencies)
plt.figure(figsize=(10, 6))
plt.plot(freqs[:len(freqs)//2], np.abs(fft)[:len(fft)//2])
plt.title('Power Consumption Frequency Analysis')
plt.xlabel('Frequency (Hz)')
plt.ylabel('Amplitude')
plt.grid(True)
plt.show()
FFT analysis can reveal hidden patterns in your data, such as daily, weekly, or seasonal cycles that might not be immediately obvious in the time domain.
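A raw FFT magnitude plot can be quite noisy. SciPy's signal module also provides Welch's method, which averages FFTs over overlapping segments to produce a much cleaner power spectral density estimate. A minimal sketch with an assumed 100 Hz sampling rate and a 5 Hz tone buried in noise:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(42)

# A 5 Hz sine sampled at 100 Hz, buried in strong noise (assumed setup)
fs = 100.0
t = np.arange(0, 20, 1 / fs)
x = np.sin(2 * np.pi * 5.0 * t) + rng.standard_normal(len(t))

# Welch's method trades frequency resolution for a much
# less noisy spectrum estimate
freqs, psd = signal.welch(x, fs=fs, nperseg=512)

peak_freq = freqs[np.argmax(psd)]
print(peak_freq)  # close to 5 Hz
```

The nperseg parameter controls the trade-off: longer segments give finer frequency resolution, shorter segments give more averaging and a smoother estimate.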
Method 5 – Savitzky-Golay Filtering
For preserving features like peaks while removing noise, Savitzky-Golay filters are excellent:
# Create sample data (e.g., heart rate during exercise)
time = np.linspace(0, 30, 300) # 30 minutes of exercise
# Resting heart rate + warm-up + intense exercise + cool down
heart_rate = 70 + 30 * (1 - np.exp(-0.5 * time)) + 40 * np.exp(-0.2 * ((time - 15) ** 2)) + 5 * np.random.randn(len(time))
# Apply Savitzky-Golay filter
window_length = 21
poly_order = 3
heart_rate_filtered = signal.savgol_filter(heart_rate, window_length, poly_order)
# Plot the results
plt.figure(figsize=(10, 6))
plt.plot(time, heart_rate, 'b-', alpha=0.5, label='Raw Data')
plt.plot(time, heart_rate_filtered, 'r-', linewidth=2, label='Filtered')
plt.title('Heart Rate During Exercise')
plt.xlabel('Time (minutes)')
plt.ylabel('Heart Rate (BPM)')
plt.legend()
plt.grid(True)
plt.show()
Savitzky-Golay filters are particularly useful for biomedical signals, spectroscopic data, or any signal where preserving the shape of peaks is important.
Method 6 – Wavelet Analysis
For signals with varying frequency content over time, wavelet transforms can be more useful than FFT:
# Generate a chirp signal (a signal with increasing frequency)
t = np.linspace(0, 10, 1000)
chirp_signal = signal.chirp(t, f0=1, f1=20, t1=10, method='linear')
chirp_signal = chirp_signal + 0.2 * np.random.randn(len(t))
# Compute the continuous wavelet transform with a Ricker wavelet
# (signal.morlet does not match cwt's expected wavelet signature;
# also note cwt was deprecated in SciPy 1.12 and removed in 1.15 --
# on newer versions, use the PyWavelets package instead)
widths = np.arange(1, 31)
cwtmatr = signal.cwt(chirp_signal, signal.ricker, widths)
# Plot the signal and its CWT
plt.figure(figsize=(10, 8))
plt.subplot(211)
plt.plot(t, chirp_signal)
plt.title('Chirp Signal (Increasing Frequency)')
plt.xlabel('Time (s)')
plt.subplot(212)
# extent is [left, right, bottom, top]; with imshow's default
# origin='upper', the smallest width belongs at the top
plt.imshow(np.abs(cwtmatr), aspect='auto', extent=[0, 10, 31, 1])
plt.colorbar(label='Magnitude')
plt.title('Continuous Wavelet Transform')
plt.ylabel('Scale')
plt.xlabel('Time (s)')
plt.tight_layout()
plt.show()
Wavelet analysis is especially useful for non-stationary signals like music, speech, or financial data, where the frequency content changes over time.
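If you only need a quick time-frequency view, a spectrogram (short-time FFT) is a simpler alternative to the CWT and is not deprecated. A sketch with the same kind of chirp, assuming a 100 Hz sampling rate:

```python
import numpy as np
from scipy import signal

# A chirp sweeping from 1 Hz to 20 Hz over 10 seconds
fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = signal.chirp(t, f0=1, f1=20, t1=10, method='linear')

# The spectrogram slices the signal into short windows and
# FFTs each one, giving frequency content as a function of time
f, seg_t, Sxx = signal.spectrogram(x, fs=fs, nperseg=128)

# The dominant frequency in each time slice should rise over time
dominant = f[np.argmax(Sxx, axis=0)]
print(dominant[0], dominant[-1])  # low early, high late
```

The CWT gives better resolution for signals spanning many scales, but for a first look at how frequency content evolves, the spectrogram is usually enough.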
SciPy’s signal module provides a comprehensive toolbox for signal processing tasks in Python. From basic filtering to advanced spectral analysis, it offers practical solutions for handling real-world data.
The methods I’ve covered here are just the beginning. SciPy signal offers many more functions for specialized tasks like deconvolution, filter design, and signal generation.
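On the signal-generation side, for example, square and sawtooth waves make convenient, fully deterministic test inputs when you are developing or validating a filtering pipeline (a small illustrative sketch):

```python
import numpy as np
from scipy import signal

t = np.linspace(0, 1, 500, endpoint=False)

# A 5 Hz square wave and a 5 Hz sawtooth: known, noise-free
# inputs whose filtered output is easy to reason about
square = signal.square(2 * np.pi * 5 * t)
sawtooth = signal.sawtooth(2 * np.pi * 5 * t)

print(square.min(), square.max())  # -1.0 1.0
```

Feeding a known waveform like this through your filter chain is a quick sanity check before trusting results on real, noisy data.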
I hope you found this guide helpful for your signal processing needs. Remember that choosing the right technique depends on your specific data and goals. With these tools in your arsenal, you’ll be well-equipped to extract meaningful insights from your signals, whether you’re working with financial data, scientific measurements, or audio processing.
I am Bijay Kumar, a Microsoft MVP in SharePoint. Apart from SharePoint, I have been working with Python, machine learning, and artificial intelligence for the last five years. During this time I have gained expertise in various Python libraries, including Tkinter, Pandas, NumPy, Turtle, Django, Matplotlib, TensorFlow, SciPy, and Scikit-Learn, working with clients in the United States, Canada, the United Kingdom, Australia, New Zealand, and elsewhere. Check out my profile.