
Predicting Antimicrobial Resistance (AMR) using DNABERT

This project implements a machine learning model to predict Antimicrobial Resistance (AMR) from genomic sequences.
It utilizes the powerful DNABERT pre-trained language model from Hugging Face as a feature extractor, followed by a custom, dense classifier head for binary prediction.


1. Project Goal

The primary objective is to develop a reliable and reusable Python ML model capable of classifying Cefoxitin resistance in the Staphylococcus aureus pbp4 gene based solely on its raw genomic sequence data.


2. Model Architecture

The core of the solution is a custom PyTorch class that wraps the feature extraction and classification steps.

  • Input: Tokenized DNA sequences (6-mers).
  • Feature Extraction: The tokenized sequences are passed through the pre-trained DNABERT model (via Hugging Face Transformers).
  • Embedding: The embedding of the [CLS] token is extracted, representing the aggregated features of the entire sequence.
  • Classification Head: This embedding is fed into a custom, 4-layer fully connected (linear) network featuring non-linear activations (e.g., ReLU) and a final binary output layer (e.g., using a sigmoid activation for probability output). Dropout is used for regularization.

3. Setup and Installation

Prerequisites

  • Python (3.8+)
  • PyTorch (recommended backend)
  • Git
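With those prerequisites in place, setup might look like the following. This assumes a standard pip-based workflow; the repository's exact dependency list is not shown here, so the package names are assumptions:

```shell
# Clone the repository
git clone https://github.com/MaryamAsgarinezhad/Predicting-Antimicrobial-Resistance-AMR-using-DNABERT.git
cd Predicting-Antimicrobial-Resistance-AMR-using-DNABERT

# Install the core dependencies (PyTorch and Hugging Face Transformers)
pip install torch transformers
```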
