A minimal PyTorch-like deep learning framework built from scratch using NumPy, featuring autograd, neural network modules and optimizers.

miniTorch

miniTorch is a minimal, educational deep learning framework inspired by PyTorch, designed for learning, research, and rapid prototyping. Built on top of NumPy, miniTorch provides a clear and concise implementation of core deep learning concepts, including tensors, autograd, neural network modules, optimizers, and data utilities.

✨ Key Features

  • NumPy-based Tensor Engine: Custom Tensor class with full support for broadcasting, dtype management, and automatic differentiation (autograd)
  • Neural Network Building Blocks:
    • PyTorch-like Module base class for custom layers and models
    • Prebuilt layers: Linear, Sequential, and activation modules (ReLU, LeakyReLU, Sigmoid, Tanh)
    • Parameter management and easy extensibility
  • Functional API: Core tensor operations (sum, exp, log, pow, transpose, relu, sigmoid, tanh, leaky_relu, softmax) for building custom computations
  • Loss Functions: Ready-to-use MSELoss and BCELoss for regression and classification tasks
  • Optimizers: Implementations of SGD and Adam for training neural networks
  • Data Utilities: Simple Dataset and DataLoader classes for batching, shuffling, and iterating over data
  • Computation Graph Visualization: Visualize your model's computation graph with Graphviz for easier debugging and understanding
  • Educational Codebase: Clean, well-documented code ideal for students, educators, and researchers
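
To give a feel for what reverse-mode autograd does under the hood, here is a minimal standalone sketch over NumPy arrays. It is illustrative only: the names (`Value`, `_parents`, `_backward`) are hypothetical and are not miniTorch's actual `Tensor` API.

```python
import numpy as np

# Minimal reverse-mode autograd sketch in the spirit of miniTorch's
# Tensor class (names are illustrative, not miniTorch's real API).
class Value:
    def __init__(self, data, _parents=()):
        self.data = np.asarray(data, dtype=np.float64)
        self.grad = np.zeros_like(self.data)
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = np.ones_like(self.data)
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = Value(4.0)
z = x * y + x       # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

Each operation records its inputs and a local backward rule; calling `backward()` on the output replays those rules in reverse topological order, which is the same idea that powers the framework's computation-graph visualization.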

🚀 Installation

Clone the repository:

git clone https://github.com/Umang-Shikarvar/miniTorch
cd miniTorch
pip install -r requirements.txt

To use minitorch as a library in your own projects, install it in editable mode:

pip install -e ./minitorch

Now you can import minitorch from anywhere in your environment, and your changes to the source will be picked up automatically.

Example:

import minitorch
from minitorch.nn.modules import Linear, Sequential, ReLU
from minitorch.optim import SGD

# Define a simple model
model = Sequential(
    Linear(4, 8),
    ReLU(),
    Linear(8, 1)
)

# Create optimizer
optimizer = SGD(model.parameters(), lr=0.01)
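
In a full training loop, an SGD optimizer like the one above repeatedly applies the plain update rule `p -= lr * grad` to each parameter. As a self-contained illustration of that update (pure NumPy, not miniTorch internals), here is a tiny linear fit:

```python
import numpy as np

# Fit y = 2x with one Linear-style layer using manual gradients and
# the p -= lr * grad update that SGD applies to every parameter.
rng = np.random.default_rng(0)
W = rng.normal(size=(1, 1))
b = np.zeros(1)
lr = 0.1

x = np.array([[1.0], [2.0], [3.0]])
y = 2.0 * x

for _ in range(200):
    pred = x @ W + b                      # forward pass
    grad_pred = 2 * (pred - y) / len(x)   # dMSE/dpred
    grad_W = x.T @ grad_pred              # backprop through the linear layer
    grad_b = grad_pred.sum(axis=0)
    W -= lr * grad_W                      # SGD parameter updates
    b -= lr * grad_b

print(W[0, 0], b[0])  # learned W close to 2.0, b close to 0.0
```

miniTorch's autograd computes the gradients automatically, so the two manual `grad_*` lines are replaced by a single `loss.backward()` followed by `optimizer.step()`.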

🤝 Contributing

We welcome contributions from everyone!

For questions, suggestions, or discussions, open an issue or start a discussion on GitHub.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Happy learning and building with miniTorch!
