PyTorch Softmax [Complete tutorial]

PyTorch Softmax is a function applied to an n-dimensional input tensor that rescales its elements so that the elements of the n-dimensional output tensor lie in the range [0, 1]. In this tutorial, we will discuss Softmax in PyTorch with Python in detail.

Additionally, we will cover different examples related to PyTorch softmax. These are the topics we will cover:

  • What is PyTorch softmax
  • PyTorch softmax example
  • How to use PyTorch softmax activation function
  • PyTorch functional softmax
  • PyTorch softmax dimension
  • What is PyTorch softmax2d
  • PyTorch softmax cross entropy
  • What is PyTorch Logsoftmax

What is PyTorch softmax

In this section, we will learn about the PyTorch softmax in Python.

The PyTorch softmax is applied to an n-dimensional input tensor and rescales its elements so that every element of the n-dimensional output tensor lies in the range [0, 1].

Syntax:

The syntax of the PyTorch softmax is:

torch.nn.Softmax(dim=None)

Parameter:

The following is the parameter of the PyTorch softmax:

dim: the dimension along which softmax will be computed, so that every slice along dim sums to one.

Softmax returns a tensor of the same dimension and shape as the input, with values in the range [0, 1].
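
The softmax operation computes Softmax(x_i) = exp(x_i) / sum_j exp(x_j) along the chosen dimension. As a minimal sketch (the tensor values below are only for illustration), we can verify both properties: every output value lies in [0, 1] and every slice along dim sums to one.

# A minimal sketch: softmax outputs lie in [0, 1] and each row (dim=1) sums to 1
import torch
import torch.nn as nn

softmax = nn.Softmax(dim=1)
x = torch.tensor([[1.0, 2.0, 3.0],
                  [0.5, 0.5, 0.5]])
y = softmax(x)
print(y)             # all values are between 0 and 1
print(y.sum(dim=1))  # tensor([1., 1.])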

So, with this, we have understood the PyTorch softmax function.

PyTorch softmax example

In this section, we will learn how to implement PyTorch softmax with the help of an example.

The softmax() function is applied to an n-dimensional input tensor and rescales its elements. Softmax can be applied by using the nn.Softmax() module.

Code:

In the following code, we will import all necessary libraries such as import torch and import torch.nn as nn.

  • a = nn.Softmax(dim=1): Here we are using the softmax() function.
  • input = torch.randn(4, 5): Here we are declaring the input variable by using the torch.randn() function.
  • output = a(input): Here we are applying the softmax function to the input.
  • print(output) is used to print the output by using the print() function.
# Importing Libraries
import torch
import torch.nn as nn
# Using Softmax() function
a = nn.Softmax(dim=1)
# Describing a variable
input = torch.randn(4, 5)
# Declaring the output
output = a(input)
# Print the output
print(output)

Output:

After running the above code, we get the following output in which we can see that the PyTorch softmax value is printed on the screen.

PyTorch softmax example

This is how we can implement the PyTorch softmax function with the help of an example.

How we can use PyTorch softmax activation function

In this section, we will learn about the PyTorch softmax activation function in Python.

Before moving forward, we should know what an activation function is.

The activation function is a function that performs computations to give an output that acts as an input for the next neuron.

Code:

In the following code, we will first import all the necessary libraries, such as import torch and import torch.nn as nn.

  • b = nn.Softmax(dim=0) is used as the softmax function with dim=0, as dimensions start from 0.
  • input = torch.Tensor([3,-4,5,-7]) is used to declare the tensor.
  • output = b(input) is used for applying a function to the tensor.
  • print(output) is used to print the output with the help of the print() function.
# Importing libraries
import torch
import torch.nn as nn
 
# Using the Softmax function with dim = 0 as dimension starts from 0
b = nn.Softmax(dim=0) 
 
# Declaring the tensor
input = torch.Tensor([3,-4,5,-7])
 
# Applying function to the tensor
output = b(input)
# Print the output
print(output)

Output:

In the below output, we can see that the PyTorch softmax activation function value is printed on the screen.

PyTorch softmax activation function

So, with this, we have understood the PyTorch softmax activation function in Python.
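
To show where softmax typically fits as an activation, here is a rough sketch of a small classifier (the layer sizes are assumptions chosen only for illustration) that uses nn.Softmax as its final layer so the outputs can be read as probabilities.

# A minimal sketch: softmax as the final activation of a small network
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 8),   # hidden layer
    nn.ReLU(),          # activation feeding the next layer
    nn.Linear(8, 3),    # output layer producing 3 scores per sample
    nn.Softmax(dim=1),  # rescale the scores into probabilities
)

x = torch.randn(2, 10)  # a batch of 2 samples with 10 features each
probs = model(x)
print(probs)            # each row sums to 1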

PyTorch functional softmax

In this section, we will learn about the PyTorch functional softmax in Python.

The PyTorch functional softmax is applied to all slices along dim and rescales them so that the elements lie in the range [0, 1].

Syntax:

Syntax of the PyTorch functional softmax:

torch.nn.functional.softmax(input, dim=None, dtype=None)

Parameters:

The following are the parameters of the PyTorch functional softmax:

  • input: The input parameter is defined as an input tensor.
  • dim: The dim parameter is the dimension along which softmax will be computed.
  • dtype: The desired data type of the returned tensor, which is useful for preventing data type overflows. The default value of dtype is None.
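
Here is a minimal sketch of the functional form, using an assumed random input tensor; it produces the same kind of result as the nn.Softmax module used above.

# A minimal sketch of torch.nn.functional.softmax
import torch
import torch.nn.functional as F

input = torch.randn(4, 5)
output = F.softmax(input, dim=1, dtype=torch.float32)  # softmax along dim=1, returned as float32
print(output)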

This is how we can understand the PyTorch functional softmax by using torch.nn.functional.softmax().

PyTorch softmax dimension

In this section, we will learn about the PyTorch softmax dimension in Python.

Before moving forward, we should know what a dimension is.

A dimension is a measurable extent of a particular kind, such as length, width, height, or depth.

In softmax, dim is the parameter that specifies the dimension along which softmax will be computed.

Code:

In the following code, we will import all the necessary libraries, such as import torch and from torch import nn.

  • m = nn.Softmax(dim=1): Here we are using the softmax() function.
  • input = torch.tensor([[5.0, 6.0, 3.0]],dtype=torch.float) is used to declare the input variable by using the torch.tensor() function.
  • output = m(input) is used for declaring the output variable.
  • print(output) is used to print the output with the help of print() function.
# Importing libraries
import torch
from torch import nn
# Using the Softmax() function 
m = nn.Softmax(dim=1)
# Declaring the input variable
input = torch.tensor([[5.0, 6.0, 3.0]],dtype=torch.float)
# Declaring the output variable
output = m(input)
# Print output
print(output)

Output:

In the below output, you can see that the PyTorch softmax dimension values are printed on the screen.

PyTorch softmax dimension

So, with this, we have understood the PyTorch softmax dimension by using the nn.Softmax() function.
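
To see more directly how the dim argument changes the result, here is a small additional sketch (with assumed values) comparing dim=0, where each column sums to one, with dim=1, where each row sums to one.

# A minimal sketch: comparing softmax along different dimensions
import torch
from torch import nn

x = torch.tensor([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])

col_softmax = nn.Softmax(dim=0)(x)  # normalizes down each column
row_softmax = nn.Softmax(dim=1)(x)  # normalizes across each row

print(col_softmax.sum(dim=0))  # tensor([1., 1., 1.])
print(row_softmax.sum(dim=1))  # tensor([1., 1.])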

What is PyTorch softmax2d

In this section, we will learn what PyTorch Softmax2d is in Python.

PyTorch Softmax2d is a class that applies softmax over the features at each spatial location.

Syntax:

torch.nn.Softmax2d()

It returns a tensor of the same dimension and shape as the input, with values in the range [0, 1].

Example:

In the following code, we will import the required libraries, such as import torch and import torch.nn as nn.

  • m = nn.Softmax2d(): Here we are using the softmax2d().
  • input = torch.randn(4, 5, 14, 15): Here we are declaring the input tensor; softmax is applied over the second (channel) dimension.
  • output = m(input): Here we are describing the output variable.
  • print(output) is used to print the output by using the print() function.
# Import libraries
import torch
import torch.nn as nn
# Using Softmax2d
m = nn.Softmax2d()
# softmax over the 2nd dimension
input = torch.randn(4, 5, 14, 15)
#Declaring the output variable
output = m(input)
#Print output
print(output)

Output:

In the below output you can see that the PyTorch Softmax2d values are printed on the screen.

PyTorch Softmax2d

This is how we understand PyTorch Softmax2d with the help of the Softmax2d() class.
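
As a quick sanity check of what Softmax2d does, the following sketch verifies that the values across the channel dimension sum to one at every spatial location.

# A minimal sketch: Softmax2d normalizes across channels at each (height, width) position
import torch
import torch.nn as nn

m = nn.Softmax2d()
input = torch.randn(4, 5, 14, 15)  # (batch, channels, height, width)
output = m(input)
print(output.sum(dim=1))           # a (4, 14, 15) tensor of ones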

PyTorch softmax cross entropy

In this section, we will learn about the PyTorch softmax cross entropy in Python.

The torch.nn.CrossEntropyLoss() class computes the cross-entropy loss between the input and the target, and the softmax() function is used here to build a target of class probabilities.

Code:

In the following code, we will first import the required libraries, such as import torch and import torch.nn as nn.

  • loss = nn.CrossEntropyLoss(): Here we are using the CrossEntropyLoss() function.
  • input = torch.randn(5, 7, requires_grad=True) is used to declare the input variable.
  • target = torch.randn(5, 7).softmax(dim=1): Here we are declaring the target variable.
  • output = loss(input, target): Here we are describing the output variable.
  • print(output) is used to print the output with the help of the print() function.
# Import libraries
import torch
import torch.nn as nn
# Using the CrossEntropyLoss() function
loss = nn.CrossEntropyLoss()
# Declaring the input variable
input = torch.randn(5, 7, requires_grad=True)
# Declaring the target variable
target = torch.randn(5, 7).softmax(dim=1)
# Describing the output variable
output = loss(input, target)
output.backward()
# Print output
print(output)

Output:

After running the above code, we get the following output in which we can see that the PyTorch softmax cross entropy values are printed on the screen.

PyTorch softmax cross entropy

So, with this, we have understood PyTorch softmax cross entropy in Python.
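
Note that nn.CrossEntropyLoss expects raw logits and applies log-softmax to the input internally, so you should not place a softmax layer before it; the softmax in the example above is only used to build probability-style targets, which requires a recent PyTorch version (1.10 or later). As a small sketch of the internal behaviour, using the more common class-index targets, CrossEntropyLoss matches LogSoftmax followed by NLLLoss.

# A minimal sketch: CrossEntropyLoss equals LogSoftmax followed by NLLLoss
import torch
import torch.nn as nn

input = torch.randn(5, 7, requires_grad=True)  # raw logits
target = torch.randint(0, 7, (5,))             # class indices as targets

ce = nn.CrossEntropyLoss()(input, target)
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(input), target)
print(ce, nll)  # the two values match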

What is PyTorch Logsoftmax

In this section, we will learn about the PyTorch LogSoftmax in Python.

PyTorch LogSoftmax applies log(Softmax(x)) to an n-dimensional input tensor.

Code:

In the following code, we will import all the necessary libraries such as import torch, import torch.nn as nn.

  • input = torch.rand(4,5): Here we are declaring the input variable by using torch.rand() function.
  • a = nn.Softmax(dim=1): Here we are using the softmax() function.
  • a = nn.LogSoftmax(dim=1): Here we are declaring the LogSoftmax() function, which is then applied to the input.
# Importing libraries
import torch
import torch.nn as nn

# Declaring the variable
input = torch.rand(4,5)

# Using the softmax
a = nn.Softmax(dim=1)
soft = a(input)
print(soft)

soft = torch.softmax(input, dim=1) 
print(soft)

# Declaring the Logsoftmax() function
input = torch.randn(4,5,6)
a = nn.LogSoftmax(dim=1)
output = a(input)
# print the output
print(output)

Output:

After running the above code we get the following output in which we can see that the PyTorch Logsoftmax values are printed on the screen.

PyTorch Logsoftmax

This is how we understand PyTorch LogSoftmax with the help of the LogSoftmax() function in Python.
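
As an additional sanity check, LogSoftmax gives the same result as taking the logarithm of Softmax, but computes it in a more numerically stable way; here is a minimal sketch.

# A minimal sketch: LogSoftmax equals log(Softmax)
import torch
import torch.nn as nn

input = torch.rand(4, 5)
log_soft = nn.LogSoftmax(dim=1)(input)
log_of_soft = torch.log(nn.Softmax(dim=1)(input))
print(torch.allclose(log_soft, log_of_soft))  # True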

So, in this tutorial, we discussed PyTorch Softmax and also covered different examples related to its implementation. Here is the list of topics that we have covered.

  • What is PyTorch softmax
  • PyTorch softmax example
  • How we can use PyTorch softmax activation function
  • PyTorch functional softmax
  • PyTorch softmax dimension
  • What is PyTorch softmax2d
  • PyTorch softmax cross entropy
  • What is PyTorch Logsoftmax
