
Code for Research Paper "To Softmax, or not to Softmax: that is the question when applying Active Learning for Transformer Models"


Running the experiments

First, make sure to install the needed dependencies as defined in the Pipfile, preferably using Pipenv.
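For example, with Pipenv installed:

pipenv install
pipenv shell

pipenv install resolves the Pipfile into a virtual environment; pipenv shell then activates that environment for the commands below.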

We have defined the parameter grid of our experiments at the beginning of the file run_experiment.py; a hypothetical sketch of such a grid is shown after the command below. To run a single workload, use:

python run_experiment.py --workload baselines --n_array_jobs 1 --array_job_id 0
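The valid values for --workload come from that parameter grid. Purely as an illustration of the shape of such a grid (the keys, strategy names, and values below are hypothetical placeholders, not the actual grid from run_experiment.py):

# Hypothetical sketch only -- the real grid at the top of run_experiment.py differs.
param_grid = {
    "baselines": {
        "query_strategy": ["random", "least_confidence"],  # placeholder strategy names
        "dataset": ["trec", "ag_news"],                    # placeholder dataset names
        "random_seed": [0, 1, 2],
    },
}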

By appending the CLI parameter --dry_run, you can first examine what the individual experiment runs are.
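For example:

python run_experiment.py --workload baselines --n_array_jobs 1 --array_job_id 0 --dry_run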

If you have access to a SLURM-based HPC cluster, you can also use our SLURM files.
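A hypothetical submission could look as follows (the SLURM file name and array size are placeholders; use the SLURM files shipped in this repository):

sbatch --array=0-4 run_experiment.slurm

Inside such a SLURM file, each array task would presumably invoke one shard of the workload, mapping SLURM's array task ID to the --array_job_id parameter, e.g.:

python run_experiment.py --workload baselines --n_array_jobs 5 --array_job_id $SLURM_ARRAY_TASK_ID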

For evaluation, the file evaluate_experiments.py can be used to generate the plots from the paper (and more): simply uncomment the desired plots at the bottom of the file (note that some of them use a lot of memory and might take some time).
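For example (the plotting function name below is a hypothetical placeholder; use the calls that actually appear at the bottom of evaluate_experiments.py):

# at the bottom of evaluate_experiments.py, uncomment the plot calls you need, e.g.:
# plot_accuracy_over_iterations()  # hypothetical name -- use the calls present in the file

Then run:

python evaluate_experiments.py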

Upon request, we can also provide you with access to the raw experiment results (~20 GB of data) to save substantial compute resources on your end. Also, if you know of a place where we can host these 20 GB of experiment results for free, please feel free to reach out to us.

Acknowledgments

The underlying small-text framework is software created by Christopher Schröder (@chschroeder) at Leipzig University's NLP group, which is part of the Webis research network.
