Recently, I was working with a neural network model for analyzing US housing market data when I encountered a frustrating error: AttributeError: module 'keras.optimizers' has no attribute 'rmsprop'. This error often appears when working with TensorFlow and Keras, especially after updating to newer versions.
The problem arises from recent changes in the TensorFlow and Keras APIs, which have altered the method of accessing optimizers. In this article, I will present various solutions to resolve this error and ensure your deep learning models function smoothly once more.
Let's get started!
What Causes This Error?
The main cause of this error is the restructuring of the Keras API in newer versions of TensorFlow. What worked perfectly in older versions now throws an error because the location and way to access optimizers have changed.
This is a common issue many data scientists and machine learning engineers face when upgrading their TensorFlow or Keras versions, or when running code that was written for older versions.
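To see the cause in isolation, here is a tiny self-contained sketch using a stand-in module (not real Keras) that mirrors what the restructuring did: the RMSprop class is still there, but the old lowercase rmsprop alias is gone, so attribute access fails with exactly this kind of AttributeError.

```python
import types

# A stand-in mirroring the modern keras.optimizers layout (illustrative only)
fake_optimizers = types.ModuleType("keras.optimizers")

class RMSprop:
    """Placeholder for the real optimizer class."""

fake_optimizers.RMSprop = RMSprop

print(hasattr(fake_optimizers, "RMSprop"))  # True - the class exists
try:
    fake_optimizers.rmsprop  # the old lowercase alias does not
except AttributeError as exc:
    print(exc)  # module 'keras.optimizers' has no attribute 'rmsprop'
```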
Method 1: Use the Updated Import Syntax
In newer versions of Keras and TensorFlow, optimizers are now accessed differently. Here’s how to fix your code:
# Old way (causes the error - the lowercase alias no longer exists)
from keras.optimizers import rmsprop

# New way (works with recent versions)
from keras.optimizers import RMSprop

# Or, more explicitly, through the TensorFlow namespace
from tensorflow.keras.optimizers import RMSprop
The key difference is capitalization: older Keras releases shipped lowercase aliases such as rmsprop, while newer versions expose the optimizer only as the RMSprop class, so the old attribute no longer exists.
Method 2: Use the Legacy Optimizers Module
If you're working with a recent TensorFlow 2.x release (TF 2.11 and later), you can still access the old-style optimizers through the legacy optimizers module:
# This works in TensorFlow 2.x
from tensorflow.keras.optimizers.legacy import RMSprop
# Now you can use it as before
optimizer = RMSprop(learning_rate=0.001)
This approach is particularly useful when you have older code that you don’t want to completely refactor.
Method 3: Direct Class Instantiation
Another approach is to directly instantiate the optimizer class:
import tensorflow as tf
import tensorflow.keras.optimizers as optimizers

# Define a simple variable
x = tf.Variable([5.0])

# Define a loss function (we want to minimize x^2)
def loss_fn():
    return x ** 2

# Option 1: Access the optimizer class directly from the module
optimizer1 = optimizers.RMSprop(learning_rate=0.01)

# Option 2: Access it through a submodule (only if your version provides one)
# optimizer2 = optimizers.rmsprop.RMSprop(learning_rate=0.01)

# Apply one step of optimization
with tf.GradientTape() as tape:
    loss = loss_fn()
grads = tape.gradient(loss, [x])
optimizer1.apply_gradients(zip(grads, [x]))

# Output the result
print(f"Updated value of x: {x.numpy()[0]}")
print(f"Loss after optimization: {loss.numpy()[0]}")

Output:

Updated value of x: 4.968377113342285
Loss after optimization: 25.0

This method is more flexible and allows you to access different optimizers with the same import statement.
Method 4: Downgrade TensorFlow/Keras
If you’re working on a project that requires compatibility with older code and you don’t want to update all your scripts, you can downgrade to a specific version of TensorFlow and Keras where your code works:
pip uninstall keras tensorflow
pip install keras==2.3.1
pip install tensorflow==2.3.0

I had to use this approach when working on a government project in Washington, D.C., analyzing census data, where we needed to maintain compatibility with their existing systems.
Method 5: Check TensorFlow and Keras Versions
Sometimes the issue might be related to mismatched versions of TensorFlow and Keras. You can check your installed versions with:
import tensorflow as tf
import keras

print(f"TensorFlow version: {tf.__version__}")
print(f"Keras version: {keras.__version__}")

This will help you identify if a version mismatch is causing the issue.
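If you want your code to branch on the installed version rather than just print it, a small helper like the one below can decide whether you are on the reworked optimizer API. This is a minimal sketch that only needs the version string; the 2.11 threshold is my assumption based on when the new optimizer API became the default, so verify it against the release notes for your setup.

```python
def version_tuple(version):
    # "2.11.0-rc1" -> (2, 11, 0); pre-release suffixes after "-" are ignored
    return tuple(int(part) for part in version.split("-")[0].split(".")[:3])

def uses_new_optimizer_api(tf_version):
    # Assumption: the reworked Keras optimizer API became the default in TF 2.11
    return version_tuple(tf_version) >= (2, 11, 0)

print(uses_new_optimizer_api("2.3.0"))   # False
print(uses_new_optimizer_api("2.15.0"))  # True
```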
Method 6: Use String-Based Optimizer Specification
If you’re working with a model where you don’t need to configure the optimizer parameters, you can simply use a string to specify the optimizer:
model.compile(optimizer='rmsprop', loss='mse', metrics=['mae'])

This approach is simpler but gives you less control over the optimizer's parameters.
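Under the hood, Keras resolves such strings through a name-to-class lookup (keras.optimizers.get). Here is a simplified, self-contained sketch of that idea; the RMSprop class below is a placeholder I defined for illustration, not the real optimizer.

```python
class RMSprop:
    # Placeholder standing in for the real optimizer class
    def __init__(self, learning_rate=0.001):
        self.learning_rate = learning_rate

# A name -> class registry, similar in spirit to what keras.optimizers.get uses
_OPTIMIZERS = {"rmsprop": RMSprop}

def get_optimizer(identifier):
    """Resolve a string to an optimizer instance with default parameters."""
    if isinstance(identifier, str):
        return _OPTIMIZERS[identifier.lower()]()
    return identifier  # already an optimizer instance

opt = get_optimizer("rmsprop")
print(type(opt).__name__, opt.learning_rate)  # RMSprop 0.001
```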
Real-World Example: Train a Housing Price Predictor
Let me show you a complete example of fixing this error in a real model. This example uses US housing data to predict prices:
import numpy as np
import pandas as pd
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
# Fix the import
from tensorflow.keras.optimizers.legacy import RMSprop
# Sample housing data features (simplified)
# In a real scenario, you'd load this from a CSV file
data = pd.DataFrame({
    'sq_footage': np.random.randint(1000, 4000, 1000),
    'bedrooms': np.random.randint(1, 6, 1000),
    'location_score': np.random.uniform(1, 10, 1000),
    'price': np.random.uniform(100000, 1000000, 1000)
})
# Prepare features and target
X = data[['sq_footage', 'bedrooms', 'location_score']]
y = data['price']
# Normalize data
X = (X - X.mean()) / X.std()
# Create model
model = Sequential([
    Dense(64, activation='relu', input_shape=(3,)),
    Dropout(0.2),
    Dense(32, activation='relu'),
    Dense(1)
])
# Using the fixed optimizer
optimizer = RMSprop(learning_rate=0.001)
model.compile(optimizer=optimizer, loss='mse', metrics=['mae'])
# Train model
model.fit(X, y, epochs=50, batch_size=32, validation_split=0.2, verbose=1)

This example demonstrates how to properly use the RMSprop optimizer with the newer TensorFlow/Keras versions while avoiding the AttributeError.
Understand API Changes in TensorFlow and Keras
This error is part of a broader pattern of API changes in the TensorFlow ecosystem. The Keras API has undergone significant reorganization as it’s been more tightly integrated with TensorFlow.
As a general rule, I always recommend checking the official TensorFlow documentation when you encounter such errors, as the API continues to evolve.
If you are developing machine learning applications that require maintenance over time, consider implementing version checks in your code or utilizing dependency management tools to ensure compatibility.
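As a concrete example of such a version check, here is a small helper built on the standard library's importlib.metadata (Python 3.8+); the package name and minimum version in the usage line are illustrative, so adjust them to your project's actual requirements.

```python
from importlib import metadata

def is_at_least(installed, minimum):
    # Compare dotted version strings numerically, e.g. "2.11.0" vs "2.4.0"
    parse = lambda v: tuple(int(p) for p in v.split("-")[0].split(".")[:3])
    return parse(installed) >= parse(minimum)

def require(package, minimum):
    """Fail fast with a clear message if a dependency is missing or too old."""
    try:
        installed = metadata.version(package)
    except metadata.PackageNotFoundError:
        raise RuntimeError(f"{package} is not installed") from None
    if not is_at_least(installed, minimum):
        raise RuntimeError(f"{package} {installed} found, but >= {minimum} is required")

# Illustrative usage: require("tensorflow", "2.4.0")
```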
I hope you found this article helpful in resolving the AttributeError: Module 'keras.optimizers' has no attribute 'rmsprop' error. Remember that staying up-to-date with API changes is an essential part of working with fast-evolving libraries like TensorFlow and Keras.
You may like reading:
- AttributeError: module ‘tensorflow’ has no attribute ‘reduce_sum’
- Solve AttributeError: module ‘tensorflow’ has no attribute ‘py_function’
- ModuleNotFoundError: No module named ‘tensorflow.keras.layers’

I am Bijay Kumar, a Microsoft MVP in SharePoint. Apart from SharePoint, I have been working on Python, machine learning, and artificial intelligence for the last five years. During this time I have gained expertise in various Python libraries such as Tkinter, Pandas, NumPy, Turtle, Django, Matplotlib, TensorFlow, SciPy, and Scikit-Learn for various clients in the United States, Canada, the United Kingdom, Australia, New Zealand, and other countries. Check out my profile.