Jacobian matrix in PyTorch

Last Updated : 15 Jul, 2021

Introduction:

The Jacobian is a very powerful operator used to calculate the partial derivatives of a given function with respect to its input variables. As a refresher, the Jacobian of a function f : \R^{n} \rightarrow \R^{m} with respect to a vector \mathbf{x} = \{x_1, ..., x_n\} \in \R^{n} is defined as

\mathbf{J}_f(\mathbf{x}) = \begin{bmatrix} \frac{\partial f}{\partial x_1} & \frac{\partial f}{\partial x_2} & \ldots & \frac{\partial f}{\partial x_n} \end{bmatrix} = \begin{bmatrix} \frac{\partial f_1}{\partial x_1} & \ldots & \frac{\partial f_1}{\partial x_n} \\[1ex] \vdots & \ddots & \vdots \\ \frac{\partial f_m}{\partial x_1} & \ldots & \frac{\partial f_m}{\partial x_n} \end{bmatrix}

Example:

Suppose we have a vector \mathbf{x} = \begin{bmatrix} x_1 \\[1ex] x_2 \\[1ex] x_3 \end{bmatrix} and a function f(\mathbf{x}) = f(x_1, x_2, x_3) = \begin{bmatrix} f_1 \\[1ex] f_2 \\[1ex] f_3 \end{bmatrix} = \begin{bmatrix} x_1 + x_2 \\[1ex] x_1 \times x_3 \\[1ex] x_2^{3} \end{bmatrix}. To calculate the Jacobian of f with respect to \mathbf{x}, we can use the above-mentioned formula to get

\mathbf{J}_f(\mathbf{x}) = \begin{bmatrix} \frac{\partial f}{\partial x_1} & \frac{\partial f}{\partial x_2} & \frac{\partial f}{\partial x_3} \end{bmatrix} = \begin{bmatrix} \frac{\partial f_1}{\partial x_1} & \frac{\partial f_1}{\partial x_2} & \frac{\partial f_1}{\partial x_3} \\[1ex] \frac{\partial f_2}{\partial x_1} & \frac{\partial f_2}{\partial x_2} & \frac{\partial f_2}{\partial x_3} \\[1ex] \frac{\partial f_3}{\partial x_1} & \frac{\partial f_3}{\partial x_2} & \frac{\partial f_3}{\partial x_3} \end{bmatrix} = \begin{bmatrix} \frac{\partial (x_1 + x_2)}{\partial x_1} & \frac{\partial (x_1 + x_2)}{\partial x_2} & \frac{\partial (x_1 + x_2)}{\partial x_3} \\[1ex] \frac{\partial (x_1 \times x_3)}{\partial x_1} & \frac{\partial (x_1 \times x_3)}{\partial x_2} & \frac{\partial (x_1 \times x_3)}{\partial x_3} \\[1ex] \frac{\partial x_2^3}{\partial x_1} & \frac{\partial x_2^3}{\partial x_2} & \frac{\partial x_2^3}{\partial x_3} \end{bmatrix} = \begin{bmatrix} 1 & 1 & 0 \\[1ex] x_3 & 0 & x_1 \\[1ex] 0 & 3 \times x_2^2 & 0 \end{bmatrix}
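
Before turning to PyTorch, we can numerically sanity-check this symbolic Jacobian with plain-Python finite differences; a minimal sketch (the evaluation point (3, 4, 5) and the step size h are arbitrary illustrative choices):

Python
# Forward-difference approximation of the Jacobian of f at x = (3, 4, 5)
def f(x1, x2, x3):
    return (x1 + x2, x1 * x3, x2 ** 3)

x = [3.0, 4.0, 5.0]
h = 1e-6  # small step for the forward difference
for i in range(3):      # rows: output components f_i
    row = []
    for j in range(3):  # columns: input variables x_j
        x_step = list(x)
        x_step[j] += h
        row.append(round((f(*x_step)[i] - f(*x)[i]) / h, 3))
    print(row)
# Prints approximately [1.0, 1.0, 0.0], [5.0, 0.0, 3.0], [0.0, 48.0, 0.0]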

To achieve the same functionality as above, we can use the jacobian() function from PyTorch's torch.autograd.functional module to compute the Jacobian matrix of a given function at given inputs.

Syntax: torch.autograd.functional.jacobian(func, inputs, create_graph=False, strict=False, vectorize=False)

Parameters:

  • func: A Python function that takes tensors as input and returns a PyTorch tensor (or a tuple of tensors).
  • inputs: The inputs passed as parameters to the 'func' method. The input can be a single PyTorch tensor or a tuple of tensors.
  • create_graph: If True, the autograd engine builds a differentiable graph of the Jacobian itself, so that further operations can be performed on the gradients (see the sketch after this list). Defaults to False.
  • strict: If True, an error is raised when the engine detects an input such that all the outputs are independent of it. If False, zero gradients are returned for such inputs. Defaults to False.
  • vectorize: Experimental. If True, the function uses the vmap prototype feature to compute the gradients with only one call to the autograd engine instead of one call per row of the matrix. Defaults to False.
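
For example, create_graph=True makes the returned Jacobian itself differentiable, which is how second-order derivatives are obtained. A minimal sketch (the function g and the sample point are illustrative assumptions, not from the original example):

Python
import torch
from torch.autograd.functional import jacobian

# Illustrative function: elementwise cube, so the Jacobian is diag(3 * x_i ** 2)
def g(x):
    return x ** 3

x = torch.tensor([1.0, 2.0])
J = jacobian(g, x, create_graph=True)
# Because create_graph=True, entries of J can be differentiated again
(dJ,) = torch.autograd.grad(J[0, 0], x)
print(dJ)  # tensor([6., 0.]): d/dx_0 (3 * x_0 ** 2) = 6 * x_0 = 6 at x_0 = 1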

Installation:

For this article, you only need the torch package, which can be installed through the pip package manager using:

pip install torch
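
A quick optional check that the installation succeeded (torch.autograd.functional ships with the standard torch package in recent versions):

Python
import torch
print(torch.__version__)  # any recent version includes torch.autograd.functional.jacobian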

Example usage of the function:

We'll be using the same function and vector as discussed in the above example, for ease of flow. Since tensors are the basic building blocks of the PyTorch package, we'll be using them to represent both the input vector and the given function. Note that jacobian() accepts the inputs either as a tuple of tensors or as a single tensor, as sketched below. This article assumes a basic familiarity with PyTorch tensors, which can be quickly reviewed by going through PyTorch tensor articles.
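
A minimal sketch of the single-tensor form of the same function (the torch.stack call is just one way to assemble the three outputs, an illustrative choice on our part):

Python
import torch
from torch.autograd.functional import jacobian

# Same function, written to accept a single 3-element tensor
def f_single(x):
    return torch.stack((x[0] + x[1], x[0] * x[2], x[1] ** 3))

x = torch.tensor([3.0, 4.0, 5.0])
print(jacobian(f_single, x))  # a single 3x3 Jacobian tensor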

Theoretical verification:

Suppose we have a vector \mathbf{x} = \begin{bmatrix} x_1 \\[1ex] x_2 \\[1ex] x_3 \end{bmatrix} = \begin{bmatrix} 3 \\[1ex] 4 \\[1ex] 5 \end{bmatrix} as the given input. Plugging the values of \mathbf{x} into the equation derived above, we get \mathbf{J}_f(\mathbf{x}) = \begin{bmatrix} 1 & 1 & 0 \\[1ex] x_3 & 0 & x_1 \\[1ex] 0 & 3 \times x_2^2 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 1 & 0 \\[1ex] 5 & 0 & 3 \\[1ex] 0 & 3 \times 4^2 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 1 & 0 \\[1ex] 5 & 0 & 3 \\[1ex] 0 & 48 & 0 \end{bmatrix}

Code: Python implementation to show the working of the Jacobian matrix using PyTorch

Python
from torch.autograd.functional import jacobian
from torch import tensor

# Defining the main function
def f(x1, x2, x3):
    return (x1 + x2, x3 * x1, x2 ** 3)

# Defining the input tensors
x1 = tensor(3.0)
x2 = tensor(4.0)
x3 = tensor(5.0)

# Printing the Jacobian
print(jacobian(f, (x1, x2, x3)))

Output:

((tensor(1.), tensor(1.), tensor(0.)),
 (tensor(5.), tensor(0.), tensor(3.)),
 (tensor(0.), tensor(48.), tensor(0.)))

The output exactly matches our theoretical verification! Using a similar approach, we can calculate the Jacobian matrix of any given function using the PyTorch API. Since both the inputs and the outputs were passed as tuples of scalar tensors, the result is a nested tuple; it can be assembled into a single matrix as sketched below.
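
A minimal sketch of stacking the nested-tuple output into one 3x3 tensor (the use of torch.stack here is our own post-processing choice, not part of the original example):

Python
import torch
from torch import tensor
from torch.autograd.functional import jacobian

def f(x1, x2, x3):
    return (x1 + x2, x3 * x1, x2 ** 3)

J = jacobian(f, (tensor(3.0), tensor(4.0), tensor(5.0)))
# Each row is a tuple of 0-dim tensors: stack each row, then stack the rows
J_mat = torch.stack([torch.stack(row) for row in J])
print(J_mat)  # tensor([[ 1.,  1.,  0.], [ 5.,  0.,  3.], [ 0., 48.,  0.]])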

References:

  1. https://pytorch.org/docs/stable/autograd.html#torch.autograd.functional.jacobian
  2. https://pytorch.org/docs/stable/tensors.html
