The Fruit Fly Auxodrome: a computer vision setup for longitudinal studies of Drosophila development


Auxo-: growth, increase; a Greek goddess representing growth
-drome: a place for running or racing

“Auxodrome” is also a real word, referring to “a plotted curve indicating the relative development of a child at any given age.”


Installation & Substitution

Option 1

Please follow the installation instructions provided on the pytorch-3dunet GitHub page.

After completing those steps:

  1. Create a conda environment as instructed on the pytorch-3dunet page and activate it.

  2. Install the required packages listed on the pytorch-3dunet page, and additionally:

    conda install -c pytorch torchvision pytorch -c conda-forge numpy av
    
  3. In the cloned pytorch-3dunet repository, replace the pytorch3dunet folder with the version provided in the substitution folder of this repository.

Option 2

  1. Create a conda environment and activate it.

    conda create --name 3dunet-env python=3.11
    
  2. Install the following packages:

    conda install -c conda-forge numpy=1.26.4 av=12.3.0 tensorboard tqdm setuptools h5py scipy scikit-image pyyaml pytest
    conda install pytorch==2.2.2 torchvision==0.17.2 torchaudio==2.2.2 pytorch-cuda=12.1 -c pytorch -c nvidia
    
  3. Clone the pytorch-3dunet repository from its GitHub page. In the cloned repository, replace the pytorch3dunet folder with the version provided in the substitution folder of this repository.


Training & Validation

  1. Install QuPath.

  2. Annotate larvae as foreground and all other components (e.g., food, eggs, pupae, etc.) as background.

  3. Export annotations using the provided export script. Label indices should be: foreground = 1, background = 0, unlabeled = 2 (ignored during training).
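The label convention above can be sketched as a small remapping step. This is a minimal illustration (the class names `"background"`, `"larva"`, and `"unlabeled"` are hypothetical placeholders, not the names used by the provided export script):

```python
import numpy as np

# Hypothetical mapping from QuPath annotation classes to the label
# indices expected here: background = 0, foreground (larvae) = 1,
# unlabeled = 2 (ignored during training).
LABELS = {"background": 0, "larva": 1, "unlabeled": 2}

def remap_annotations(class_names):
    """Convert an array of annotation class names into integer label indices."""
    class_names = np.asarray(class_names)
    # Default everything to "unlabeled" so unannotated pixels are ignored.
    labels = np.full(class_names.shape, LABELS["unlabeled"], dtype=np.uint8)
    for name, idx in LABELS.items():
        labels[class_names == name] = idx
    return labels
```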

  4. Convert the training and validation datasets into HDF5 format with /raw and /label datasets, as specified by pytorch-3dunet. A minimum of 16 frames along the time axis is required for both training and validation sets.

  5. An example training YAML file is provided as train_config.yml in the example folder.


YAML File Generator

  1. We use a YAML file generator to create YAML files for analyzing experimental videos. The generator identifies the center of each well and creates a separate testing YAML file for each well. Use the model trained for each larval stage (eggs & L1, L2, L3 & pupae) and for adults to run tests on that stage separately.

  2. There are two types of testing YAML files, test_config-VideoDataset.yml and test_config-ProbField.yml. Example YAML files for one well are provided in the example folder.

    • The VideoDataset YAML file uses the trained 3D U-Net model to run predictions on the testing frames you specify, generating batches of raw frames and predicted frames for the specified well. The raw frames are the original AVI video of that well; the predicted frames form a probability field giving the probability that each pixel belongs to the foreground.

    • Use CombineVideo.ipynb to combine all batches of predicted frames into a single video for each well. Then use the ProbField YAML file to threshold the probability field, convert the thresholded predicted frames into batches of AVI videos, and compute the areas and centroids of the predicted larvae, saving these two metrics into batches of CSV files for later analysis.
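The thresholding and per-well metrics can be sketched with plain NumPy. This is a simplified illustration of the idea, not the actual ProbField implementation (the function name and default threshold are assumptions):

```python
import numpy as np

def larva_metrics(prob_field, threshold=0.5):
    """Threshold a per-pixel foreground probability field and compute
    the area (pixel count) and centroid (row, col) of the predicted
    larva. Returns (0, None) if no pixel exceeds the threshold."""
    mask = np.asarray(prob_field) > threshold
    area = int(mask.sum())
    if area == 0:
        return 0, None
    rows, cols = np.nonzero(mask)
    centroid = (float(rows.mean()), float(cols.mean()))
    return area, centroid
```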


Hatch, Pupation, and Eclosion Analysis

We use a PlotGenerator to find the timings of hatching, pupation, and eclosion for all wells. This notebook reads the CSV files generated by the ProbField tests, applies noise filters to the computed metrics, and reports the timings of hatching, pupation, and eclosion.
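The event-timing idea can be sketched as a median filter followed by a threshold crossing on the area trace. This is a simplified stand-in for the notebook's noise filtering (the window size and threshold here are illustrative, not the values used in the analysis):

```python
import numpy as np

def event_time(areas, threshold, window=5):
    """Return the first frame index at which a median-filtered area
    trace reaches the threshold, or None if it never does."""
    areas = np.asarray(areas, dtype=float)
    pad = window // 2
    # Edge-pad so the sliding median is defined at both ends of the trace.
    padded = np.pad(areas, pad, mode="edge")
    smoothed = np.array([np.median(padded[i:i + window])
                         for i in range(len(areas))])
    above = np.nonzero(smoothed >= threshold)[0]
    return int(above[0]) if above.size else None
```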


License

This code is released under the MIT License. See the LICENSE.md file for details.
