• This was a fun project I built to better understand backpropagation and how it is implemented in libraries like PyTorch and TensorFlow.
  • It builds a dynamic computation graph similar to PyTorch's. It correctly handles optimization and gradient flow via the backpropagation algorithm for the operations defined in the graph (e.g., convolutional layers, fully connected layers, sigmoid, softmax).
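To illustrate the idea behind a dynamic computation graph with gradient flow, here is a minimal scalar-autograd sketch in the spirit of the project (the class and method names are illustrative, not the project's actual API). Each operation records its inputs and a local backward rule; calling `backward()` walks the graph in reverse topological order and applies the chain rule:

```python
import math

class Value:
    """A scalar node in a dynamically built computation graph (illustrative)."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # local chain-rule step, set by each op

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def sigmoid(self):
        s = 1.0 / (1.0 + math.exp(-self.data))
        out = Value(s, (self,))
        def _backward():
            # d(sigmoid(x))/dx = s * (1 - s)
            self.grad += s * (1.0 - s) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply local rules in reverse order.
        topo, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                topo.append(v)
        visit(self)
        self.grad = 1.0  # dL/dL = 1 seeds the backward pass
        for v in reversed(topo):
            v._backward()

# Usage: y = sigmoid(w*x + b); backward() fills in dy/dw, dy/dx, dy/db.
w, x, b = Value(2.0), Value(3.0), Value(-5.0)
y = (w * x + b).sigmoid()
y.backward()
print(w.grad, x.grad, b.grad)
```

The same pattern scales to tensor-valued operations like convolutions and fully connected layers: each op only needs to know its local derivative, and the graph traversal composes them.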
Aniket Didolkar
Research Intern

My research interests include deep learning, natural language processing, and reinforcement learning.