miniTorch is a minimal, educational deep learning framework inspired by PyTorch, designed for learning, research, and rapid prototyping. Built on top of NumPy, miniTorch provides a clear and concise implementation of core deep learning concepts, including tensors, autograd, neural network modules, optimizers, and data utilities.
- NumPy-based Tensor Engine: Custom `Tensor` class with full support for broadcasting, dtype management, and automatic differentiation (autograd).
- Neural Network Building Blocks:
  - PyTorch-like `Module` base class for custom layers and models
  - Prebuilt layers: `Linear`, `Sequential`, and activation modules (`ReLU`, `LeakyReLU`, `Sigmoid`, `Tanh`)
  - Parameter management and easy extensibility
- Functional API: Core tensor operations (`sum`, `exp`, `log`, `pow`, `transpose`, `relu`, `sigmoid`, `tanh`, `leaky_relu`, `softmax`) for building custom computations
- Loss Functions: Ready-to-use `MSELoss` and `BCELoss` for regression and classification tasks
- Optimizers: Implementations of `SGD` and `Adam` for training neural networks
- Data Utilities: Simple `Dataset` and `DataLoader` classes for batching, shuffling, and iterating over data
- Computation Graph Visualization: Visualize your model's computation graph using Graphviz for better debugging and understanding
- Educational Codebase: Clean, well-documented code ideal for students, educators, and researchers
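As a quick taste of the autograd engine, the sketch below builds a tiny expression and backpropagates through it. This is a minimal sketch that assumes a PyTorch-like `Tensor` API (a `minitorch.Tensor` constructor accepting NumPy arrays, a `requires_grad` flag, operator overloading, a `.sum()` method, `.backward()`, and a `.grad` attribute); check the source in `minitorch/` for the exact signatures.

```python
import numpy as np
import minitorch

# Assumed PyTorch-like API: Tensor(data, requires_grad=...), .backward(), .grad
x = minitorch.Tensor(np.array([1.0, 2.0, 3.0]), requires_grad=True)
w = minitorch.Tensor(np.array([0.5, -0.2, 0.1]), requires_grad=True)

# Forward pass: a small scalar expression built from core ops
# (assumes elementwise * and a .sum() method on Tensor)
y = (x * w).sum()

# Backward pass: populates .grad on the leaf tensors
y.backward()

print(w.grad)  # expected to equal x's data, i.e. [1.0, 2.0, 3.0]
```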
Clone the repository:

```bash
git clone https://github.com/Umang-Shikarvar/miniTorch
cd miniTorch
pip install -r requirements.txt
```

To use minitorch as a library in your own projects, install it in editable mode:

```bash
pip install -e ./minitorch
```

Now you can import minitorch from anywhere in your environment, and your changes will be picked up automatically.
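To confirm the editable install resolves to your local checkout, you can check where Python imports the package from (standard Python only, no miniTorch-specific API assumed):

```python
import minitorch

# Should print a path inside your cloned miniTorch repository
print(minitorch.__file__)
```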
Example:

```python
import minitorch
from minitorch.nn.modules import Linear, Sequential
from minitorch.optim import SGD

# Define a simple model
model = Sequential(
    Linear(4, 8),
    minitorch.nn.modules.ReLU(),
    Linear(8, 1)
)

# Create optimizer
optimizer = SGD(model.parameters(), lr=0.01)
```
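To go from the model and optimizer above to actual training steps, a loop along the following lines should work. This is a minimal sketch continuing the example above; it assumes a PyTorch-style interface (an `MSELoss` importable from `minitorch.nn.modules`, a callable model, `optimizer.zero_grad()`/`optimizer.step()`, `loss.backward()`, and a `minitorch.Tensor` constructor taking NumPy arrays). Adapt the names to the actual signatures in `minitorch/`.

```python
import numpy as np
import minitorch
from minitorch.nn.modules import MSELoss  # assumed import location

# Toy regression data: 4 input features -> 1 target
X = minitorch.Tensor(np.random.randn(32, 4))
y = minitorch.Tensor(np.random.randn(32, 1))

criterion = MSELoss()

for epoch in range(10):
    optimizer.zero_grad()      # clear accumulated gradients (assumed API)
    pred = model(X)            # forward pass through the Sequential model
    loss = criterion(pred, y)  # mean squared error
    loss.backward()            # backprop through the computation graph
    optimizer.step()           # SGD parameter update
    print(f"epoch {epoch}: loss = {loss}")
```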
- For tips and useful commands, see `docs/tips_and_commands.md`.
- Explore the `minitorch/` directory for source code and examples.
We welcome contributions from everyone!
- Maintainers: see `docs/CONTRIBUTING (maintainers).md`
- Non-maintainers: see `docs/CONTRIBUTING.md`
For questions, suggestions, or discussions, open an issue or start a discussion on GitHub.
This project is licensed under the MIT License. See the LICENSE file for details.
Happy learning and building with miniTorch! ⭐