
t5-small

Here are 34 public repositories matching this topic...

Developed from medical literature: injects domain expertise into T5 via precision vocabulary-guided masking.

  • Updated Feb 6, 2026
  • Python
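The entry above describes vocabulary-guided masking. A minimal sketch of the idea, under the assumption that "guided" means masking runs of in-vocabulary domain terms instead of random spans (the function name and example terms below are illustrative, not taken from the repository):

```python
# Hypothetical sketch: vocabulary-guided masking for T5-style span corruption.
# Rather than corrupting random spans, runs of words found in a domain term
# list (here, medical terms) are replaced with T5 sentinel tokens.

def vocab_guided_mask(words, domain_vocab):
    """Replace each run of domain-vocabulary words with a T5 sentinel.

    Returns (corrupted_input, target) as token lists, mirroring T5's
    span-corruption format: each masked span becomes <extra_id_N> in the
    input, and the target lists each sentinel followed by its span.
    """
    corrupted, target = [], []
    sentinel = 0
    i = 0
    while i < len(words):
        if words[i].lower() in domain_vocab:
            # Extend the span over consecutive domain terms.
            j = i
            while j < len(words) and words[j].lower() in domain_vocab:
                j += 1
            tok = f"<extra_id_{sentinel}>"
            corrupted.append(tok)
            target.append(tok)
            target.extend(words[i:j])
            sentinel += 1
            i = j
        else:
            corrupted.append(words[i])
            i += 1
    return corrupted, target

words = "the patient showed acute myocardial infarction after surgery".split()
vocab = {"acute", "myocardial", "infarction"}
inp, tgt = vocab_guided_mask(words, vocab)
print(" ".join(inp))  # the patient showed <extra_id_0> after surgery
print(" ".join(tgt))  # <extra_id_0> acute myocardial infarction
```

A real implementation would operate on subword ids from the T5 tokenizer rather than whitespace-split words, but the input/target pairing is the same.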

This project provides code for fine-tuning T5/mT5 models on data preprocessed by T5-Refiner-DomainFocus, enabling improved performance on domain-specific text.

  • Updated Feb 6, 2026
  • Python
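One detail any T5/mT5 fine-tuning pipeline like the one above must handle is batch collation: inputs are padded with T5's pad id, while padded label positions are set to -100 so the cross-entropy loss ignores them. A self-contained sketch of that step (the function name is illustrative; this is not the repository's actual collator):

```python
# Hypothetical sketch of label preparation when fine-tuning T5/mT5:
# inputs are padded with the pad token id, while padded label positions
# are set to -100, the index ignored by PyTorch's cross-entropy loss.

PAD_ID = 0  # T5's pad token id

def collate(batch, pad_id=PAD_ID, ignore_index=-100):
    """Pad a batch of (input_ids, label_ids) pairs to equal length."""
    in_len = max(len(x) for x, _ in batch)
    lb_len = max(len(y) for _, y in batch)
    input_ids, labels = [], []
    for x, y in batch:
        input_ids.append(x + [pad_id] * (in_len - len(x)))
        labels.append(y + [ignore_index] * (lb_len - len(y)))
    return {"input_ids": input_ids, "labels": labels}

batch = [([5, 6, 7], [8, 9]), ([5, 6], [8, 9, 10])]
out = collate(batch)
print(out["input_ids"])  # [[5, 6, 7], [5, 6, 0]]
print(out["labels"])     # [[8, 9, -100], [8, 9, 10]]
```

In practice this is what `DataCollatorForSeq2Seq` from Hugging Face Transformers does, returning tensors instead of lists.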

Thesis scope: train and develop a Table-to-Text Transformer-based model for contextual summarization of tabular data. To achieve this, T5-small, T5-base, Bart-base, and Llama2 7B chat were fine-tuned on ToTTo and QTSumm. On ToTTo, the models outperformed the benchmark.

  • Updated Apr 24, 2024
  • Jupyter Notebook
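Feeding a table to a seq2seq model such as t5-small requires linearizing it into a flat string first. A sketch of one common tagged-string scheme, in the spirit of ToTTo-style preprocessing (the exact tag names below are illustrative, not the thesis's preprocessing):

```python
# Hypothetical sketch of table linearization for table-to-text models:
# a table (title, header, rows) is flattened into a tagged string that a
# seq2seq model like t5-small can consume as ordinary text.

def linearize_table(title, header, rows):
    """Flatten a table into one tagged string, one <row> per table row."""
    parts = [f"<page_title> {title} </page_title>"]
    for row in rows:
        cells = " ".join(
            f"<cell> {val} <col_header> {col} </col_header> </cell>"
            for col, val in zip(header, row)
        )
        parts.append(f"<row> {cells} </row>")
    return " ".join(parts)

text = linearize_table(
    "2012 Olympics",
    ["Athlete", "Medal"],
    [["Usain Bolt", "Gold"]],
)
print(text)
```

The resulting string is then tokenized and passed to the model like any other source sequence, with the reference summary as the target.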
