Model Description

This is a T5-small model fine-tuned for abstractive summarization on the XSUM and CNN/DailyMail datasets. Due to the limited resources available for fine-tuning (done on the Google Colab free plan), only 20% of the train and test splits of each dataset were used. The datasets were combined by concatenating the train splits of the two datasets, and likewise their test splits. The model achieved 0.25 ROUGE-1, 0.08 ROUGE-2, and 0.20 for both ROUGE-L and ROUGE-LSum. These low scores are likely due to the limited amount of data used for training and testing, the constraints placed on the hyperparameters, and the way the data was preprocessed. Fine-tuning in a better-resourced environment might improve the model's performance.

  • Developed by: Daviadi Auzan Fadhlillah
  • Model type: T5-small
  • Language(s) (NLP): English
  • Finetuned from model: T5-small

Uses

The aim of fine-tuning this model is to generate abstractive summaries of online news articles; however, it can also be used for other tasks where summarization is needed.

How to Get Started with the Model

Use the notebook below to get started with the model or to improve its fine-tuning.

https://colab.research.google.com/drive/1pFUqp51uPdTIIeFUdrpKAYlWD-0VRnAC?usp=sharing
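For quick inference, the model can be loaded directly from the Hugging Face Hub with the `transformers` summarization pipeline. A minimal sketch (the sample article text is made up for illustration):

```python
from transformers import pipeline

# Load the fine-tuned model from the Hub (downloads weights on first run).
summarizer = pipeline("summarization", model="DaviadiAF/T5-Small_AbsSumm_XSumCNN")

article = (
    "The local council approved a plan on Tuesday to expand cycling lanes "
    "across the city centre, citing rising congestion and air-quality concerns. "
    "Construction is expected to begin next spring and finish within two years."
)

# Generation limits are illustrative; tune max_length/min_length for your inputs.
summary = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

The pipeline returns a list with one dict per input, each holding the generated text under `summary_text`.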

Model Card Contact

I am open to any suggestions, so feel free to contact me via email: mas.davi1706@gmail.com

Your suggestions are highly appreciated.

Thank you for using this model.
