---
license: apache-2.0
base_model: facebook/bart-base
tags:
- generated_from_trainer
- summarization
- finance-news
model-index:
- name: bart-base-finance-news-summarization
results: []
---
# bart-base-finance-news-summarization
This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) specifically for summarizing finance-related news articles.
## Model description
BART-Base Finance News Summarization is designed to quickly condense finance news articles into shorter summaries, capturing key financial data and market trends. This model helps financial analysts, investors, and journalists rapidly gather insights from extensive news coverage.
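As a sketch of typical usage, the model can be loaded with the 🤗 Transformers `pipeline` API. The Hub repo id below is an assumption based on the model name in this card; substitute the actual path, and note the example article and generation lengths are illustrative only.

```python
from transformers import pipeline

# Assumed repo id — replace with the actual namespace/name on the Hub.
summarizer = pipeline(
    "summarization",
    model="bart-base-finance-news-summarization",
)

# Illustrative finance-news snippet (not from the training data).
article = (
    "Shares of Acme Corp rose 4% on Tuesday after the company reported "
    "quarterly revenue well above analyst expectations, driven by strong "
    "demand in its cloud division. Management raised full-year guidance "
    "and announced a $2 billion share buyback program."
)

# max_length/min_length are example values; tune them for your use case.
summary = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

With `do_sample=False` the pipeline uses deterministic decoding, which is usually preferable when summaries feed into downstream analysis.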
## Intended uses & limitations
This model is tailored for summarizing financial news content. It is not intended for non-finance news or for use in generating original news content. Users should be aware of potential biases in the training data that might affect the neutrality of the summaries.
## Training and evaluation data
The model was trained on a dataset composed of thousands of finance-related news articles, each paired with professionally written summaries to ensure high-quality training.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
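The hyperparameters above can be expressed as a `Seq2SeqTrainingArguments` configuration sketch. The `output_dir` is an assumption, and the Adam betas/epsilon listed above match the `Trainer` optimizer defaults, so they need no explicit setting here.

```python
from transformers import Seq2SeqTrainingArguments

# Configuration sketch mirroring the hyperparameters listed above.
# output_dir is an assumption; everything else comes from this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="bart-base-finance-news-summarization",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default.
)
```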
### Framework versions
- Transformers 4.38.2
- PyTorch 2.2.1+cu121
- Tokenizers 0.15.2