I'm Adam. If you don't know who I am, my work is still at least an order of magnitude away from echoing my name, which it soon will :)
Pinned
- Lora-Without-Regret (Public)
  Reproducing the "LoRA Without Regret" blog post from scratch, and researching the ratio between LoRA and full fine-tuning (FullFT) learning rates. (A minimal LoRA sketch follows this list.)
Python 3
- Vit-on-small-data (Public)
  The lightest from-scratch Vision Transformer (ViT) out there, reaching 93.37 ± 0.07% top-1 accuracy on CIFAR-10 within just 50 epochs.
Python 7
- training_models_from_scratch (Public)
  Training tiny models from scratch: NumPy in code, linear algebra on a piece of paper.
Python 12
- optimizers-from-scratch (Public)
  Training models with different optimizers using NumPy only, featuring SGD, Adam, Adagrad, NAG, RMSProp, and Momentum. Also includes a benchmark against PyTorch's built-in optimizers. (A NumPy SGD/Adam sketch follows this list.)
- COCO-CONVERTER (Public)
  A CLI that converts CSV files and image folders into a JSON file with COCO annotations, building a custom dataset ready for object detection training. (A minimal COCO-annotation sketch follows this list.)
Python 6
- Transformer-from-scratch (Public)
  An elaborate Transformer implementation with a detailed explanation.
Python 4
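
Since Lora-Without-Regret revolves around low-rank adaptation and how its learning rate relates to full fine-tuning, here is a minimal NumPy sketch of the general LoRA idea. It is only an illustration under assumed names and sizes (`lora_forward`, `alpha`, `r` are mine), not code from the repo.

```python
# Minimal illustration of a LoRA-style low-rank update (not code from the repo).
# The frozen weight W is adapted as W + (alpha / r) * B @ A, where A and B are
# small trainable matrices of rank r; only A and B receive gradient updates.
import numpy as np

d_in, d_out, r, alpha = 64, 64, 8, 16          # hypothetical sizes
rng = np.random.default_rng(0)

W = rng.normal(size=(d_out, d_in))             # frozen pretrained weight
A = rng.normal(scale=0.01, size=(r, d_in))     # trainable "down" projection
B = np.zeros((d_out, r))                       # trainable "up" projection, zero-init

def lora_forward(x):
    # Base output plus the scaled low-rank correction.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))
y = lora_forward(x)                            # same shape as a full-FT forward pass
print(y.shape)                                 # (64,)
```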
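The optimizers-from-scratch card lists SGD, Adam, and friends implemented in NumPy only; the sketch below shows the standard textbook update rules for SGD and Adam as a generic reference, with hyperparameters I picked for illustration rather than the repo's actual code.

```python
# Textbook update rules for SGD and Adam, NumPy only (generic sketch, not repo code).
import numpy as np

def sgd_step(w, grad, lr=0.1):
    # Plain stochastic gradient descent: step against the gradient.
    return w - lr * grad

def adam_step(w, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam keeps exponential moving averages of the gradient (m) and its square (v),
    # with bias correction for the early steps.
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad**2
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), state

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.ones(3)
state = {"t": 0, "m": np.zeros_like(w), "v": np.zeros_like(w)}
for _ in range(100):
    w, state = adam_step(w, 2 * w, state, lr=0.1)
print(w)  # close to zero
```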
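For COCO-CONVERTER, the sketch below shows the minimal shape of a COCO-style detection annotation file that such a converter would emit; the file name, IDs, and values are made up for illustration and are not the tool's actual output.

```python
# Minimal shape of a COCO-style detection annotation file (generic sketch).
# bbox is [x, y, width, height] in pixels; area and iscrowd are per COCO convention.
import json

coco = {
    "images": [
        {"id": 1, "file_name": "cat_001.jpg", "width": 640, "height": 480},
    ],
    "annotations": [
        {"id": 1, "image_id": 1, "category_id": 1,
         "bbox": [120.0, 80.0, 200.0, 150.0], "area": 30000.0, "iscrowd": 0},
    ],
    "categories": [
        {"id": 1, "name": "cat"},
    ],
}

with open("annotations.json", "w") as f:
    json.dump(coco, f, indent=2)
```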