# Transfer Learning with TensorFlow/Keras — CIFAR-10 Classification Using Pretrained CNNs
This project demonstrates transfer learning in TensorFlow/Keras by adapting a pretrained convolutional neural network (CNN) for CIFAR-10 image classification. The goals are to:
- Leverage ImageNet-trained models (e.g., VGG16, ResNet50) as feature extractors.
- Implement both frozen-base feature extraction and fine-tuning strategies.
- Showcase best practices for efficiently adapting large models to smaller datasets.
## Dataset Loading & Preprocessing
- CIFAR-10 loaded from `tf.keras.datasets`.
- Pixel values normalized to `[0, 1]`.
- Labels one-hot encoded with `tf.keras.utils.to_categorical`.
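A minimal sketch of this step, using the standard Keras CIFAR-10 loader (variable names are illustrative, not taken from the notebook):

```python
import tensorflow as tf

# Load CIFAR-10 from the built-in Keras datasets API.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()

# Scale pixel values from [0, 255] to [0, 1].
x_train = x_train.astype("float32") / 255.0
x_test = x_test.astype("float32") / 255.0

# One-hot encode the integer class labels (10 classes).
y_train = tf.keras.utils.to_categorical(y_train, num_classes=10)
y_test = tf.keras.utils.to_categorical(y_test, num_classes=10)
```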
## Base Model Selection
- Pretrained model loaded from `tf.keras.applications` with `imagenet` weights.
- `include_top=False` to remove the original classifier.
- Input resized to match the base model's requirements (see the sketches below).
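For example, with VGG16 as the base (the specific model and input size here are illustrative assumptions; any `tf.keras.applications` model can be used):

```python
from tensorflow.keras.applications import VGG16

# VGG16 pretrained on ImageNet, without its original fully connected classifier.
base_model = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))

# Freeze the convolutional base for the feature-extraction phase.
base_model.trainable = False
```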
## Custom Classifier Head
- `GlobalAveragePooling2D` → `Dense(256, relu)` → `Dense(10, softmax)`.
- Designed to adapt ImageNet features to CIFAR-10 classes.
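A sketch of the assembled model. The `Resizing` layer used to upsample the 32×32 CIFAR-10 images to the base model's input size is an assumption about how the resizing is handled, and requires a recent TensorFlow version:

```python
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.applications import VGG16

# Frozen ImageNet-pretrained base, as in the previous sketch.
base_model = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base_model.trainable = False

# Upsample the 32x32 CIFAR-10 inputs, extract features, then classify.
inputs = tf.keras.Input(shape=(32, 32, 3))
x = layers.Resizing(224, 224)(inputs)
x = base_model(x, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dense(256, activation="relu")(x)
outputs = layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
```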
## Training Phases
- Feature Extraction: Base layers frozen, only new classifier layers trained.
- Fine-Tuning (optional): Top layers of the base model unfrozen and retrained with a lower learning rate.
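A sketch of both phases, continuing from the data and model sketches above. The optimizer, learning rates, epoch counts, and number of unfrozen layers are illustrative assumptions, not values from the notebook:

```python
import tensorflow as tf

# Phase 1: feature extraction. Only the new classifier head is trainable.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
history = model.fit(x_train, y_train, epochs=5, batch_size=64,
                    validation_data=(x_test, y_test))

# Phase 2 (optional): fine-tuning. Unfreeze only the top of the base model and
# retrain with a much lower learning rate to avoid destroying pretrained weights.
base_model.trainable = True
for layer in base_model.layers[:-4]:      # keep all but the last few layers frozen
    layer.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
history_fine = model.fit(x_train, y_train, epochs=5, batch_size=64,
                         validation_data=(x_test, y_test))
```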
## Evaluation & Visualization
- Model summary printed to verify architecture and trainable parameters.
- Optional plots for training history.
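For instance (assuming the `model` and `history` objects from the sketches above):

```python
import matplotlib.pyplot as plt

# Confirm the architecture and the trainable vs. non-trainable parameter split.
model.summary()

# Plot accuracy curves from the History object returned by model.fit().
plt.plot(history.history["accuracy"], label="train accuracy")
plt.plot(history.history["val_accuracy"], label="validation accuracy")
plt.xlabel("epoch")
plt.ylabel("accuracy")
plt.legend()
plt.show()
```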
## Requirements
- Python 3.x
- TensorFlow / Keras
- NumPy
- Matplotlib (for visualizations)
## Dataset
CIFAR-10:
- 60,000 images (32×32 RGB), 10 classes.
- Train: 50,000 images | Test: 10,000 images.
- Classes: airplane, automobile, bird, cat, deer, dog, frog, horse, ship, truck.
- Preprocessing: normalization + one-hot encoding.
## Prerequisites
- Python 3.x
- TensorFlow (GPU optional)
## Install
```bash
pip install tensorflow numpy matplotlib
```

## Run
1. Open `Transfer Learning_me.ipynb` in Jupyter/Colab/VS Code.
2. Execute the cells sequentially.
3. The notebook will:
   - Load and preprocess CIFAR-10.
   - Load the pretrained model and build a custom head.
   - Train in feature extraction and/or fine-tuning modes.
## Results
- Not provided: the notebook focuses on demonstrating the transfer learning process, not maximizing accuracy.
## Example: Model Summary Output
Model: "sequential"
_________________________________________________________________
vgg16 (Functional) (None, 7, 7, 512) 14714688
global_average_pooling2d (None, 512) 0
dense (None, 10) 5130
=================================================================
Total params: 14,719,818
Trainable params: 5,130
Non-trainable params: 14,714,688
_________________________________________________________________
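Note that the parameter split follows directly from freezing: the frozen VGG16 base accounts for all 14,714,688 non-trainable parameters, while in this particular summary the head is a single Dense layer on the 512 pooled features, giving 512 × 10 weights + 10 biases = 5,130 trainable parameters.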
## Key Takeaways
- Freezing pretrained layers preserves valuable learned features.
- Selectively fine-tuning the top layers can further improve domain adaptation.
- Transfer learning enables strong performance on small datasets with reduced compute cost.
- This workflow is adaptable to any image classification task.
- Author: Mehran Asgari
- Email: imehranasgari@gmail.com
- GitHub: https://github.com/imehranasgari
This project is licensed under the Apache 2.0 License – see the LICENSE file for details.
💡 Some interactive outputs (e.g., plots, widgets) may not display correctly on GitHub. If so, please view this notebook via nbviewer.org for full rendering.