HausaSentiLex

This model is a fine-tuned version of bert-base-cased (the dataset is described under "Training and evaluation data" below). It achieves the following results at the end of training:

  • Train Loss: 0.0557
  • Train Accuracy: 0.9799
  • Epoch: 4

The sentiment fine-tuning was performed on the Hausa language. Model repository: https://github.com/idimohammed/HausaSentiLex

Model description

HausaSentiLex is a pretrained lexicon-based language model for a low-resource language. The model was trained on the Hausa language. (Hausa is a Chadic language spoken by the Hausa people in the northern half of Nigeria, Ghana, Cameroon, Benin and Togo, and the southern half of Niger, Chad and Sudan, with significant minorities in Ivory Coast. It is the most widely spoken language in West Africa, and one of the most widely spoken languages in Africa as a whole.) The model has been shown to obtain competitive downstream performance on text classification in the trained language.

Intended uses & limitations

You can use this model with Transformers for sentiment analysis tasks in the Hausa language, as sketched below.
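A minimal inference sketch using the Transformers pipeline API. The model ID comes from this page's Hub listing; the example sentence and the exact label names returned are assumptions, since the label mapping is not documented here:

```python
from transformers import pipeline

# Load the fine-tuned model from the Hugging Face Hub.
# The published weights are TensorFlow (see "Framework versions" below),
# so the TF backend is requested explicitly.
classifier = pipeline(
    "sentiment-analysis",
    model="mangaphd/HausaSentiLex",
    framework="tf",
)

# Hypothetical Hausa input (roughly: "I am very happy today").
print(classifier("Ina murna sosai yau"))
# -> [{'label': ..., 'score': ...}]; label names depend on the model config
```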

Training and evaluation data

The training and evaluation dataset can be accessed via https://data.mendeley.com/datasets/9cbw2b7h57/1
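A minimal loading sketch, assuming the Mendeley download contains a CSV file; the file name and column layout below are hypothetical and should be checked against the actual archive:

```python
import pandas as pd

# Hypothetical file name -- replace with the actual file from the
# Mendeley archive.
df = pd.read_csv("HausaSentiLex_dataset.csv")

# Inspect the column layout before wiring it into a training script.
print(df.columns.tolist())
print(df.head())
```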

Training procedure

The article describing the training procedure is currently under review.

Training hyperparameters

The following hyperparameters were used during training (see the reconstruction sketch after this list):

  • optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 5e-06, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
  • training_precision: float32
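For reference, a sketch of the optimizer rebuilt in TensorFlow/Keras from the values listed above; this mirrors the configuration dump and is not an excerpt from the original training code:

```python
import tensorflow as tf

# Adam with the hyperparameters from the configuration dump above.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=5e-6,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
    jit_compile=True,
)
```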

Training results

Train Loss | Train Accuracy | Epoch
0.1586     | 0.9363         | 0
0.1130     | 0.9577         | 1
0.0969     | 0.9642         | 2
0.0740     | 0.9731         | 3
0.0557     | 0.9799         | 4

Framework versions

  • Transformers 4.34.1
  • TensorFlow 2.13.0
  • Datasets 2.14.5
  • Tokenizers 0.14.1