Model Overview
This model was built with NVIDIA NeMo's BERTQAModel and fine-tuned on the SQuAD v2.0 dataset translated into Indonesian.
NVIDIA NeMo: Training
To train, fine-tune, or experiment with the model you will need to install NVIDIA NeMo. We recommend installing it after you have installed the latest PyTorch version.
pip install nemo_toolkit['all']
How to Use this Model
The model is available for use in the NeMo toolkit and can be used as a pre-trained checkpoint for inference or for fine-tuning on another dataset.
Automatically instantiate the model
import nemo.collections.nlp as nemo_nlp

model = nemo_nlp.models.question_answering.qa_bert_model.BERTQAModel.from_pretrained("raihanpf22/nlp_id_qa_bert_base_uncased")
Running inference with Python
Simply do:
import os

import pytorch_lightning as pl
from nemo.utils.exp_manager import exp_manager

# Run evaluation on a single device of the configured accelerator.
eval_device = [config.trainer.devices[0]] if isinstance(config.trainer.devices, list) else 1
model.trainer = pl.Trainer(
    devices=eval_device,
    accelerator=config.trainer.accelerator,
    precision=16,
    logger=False,
)

# Checkpointing is not needed for an inference-only run.
config.exp_manager.create_checkpoint_callback = False
exp_dir = exp_manager(model.trainer, config.exp_manager)

output_nbest_file = os.path.join(exp_dir, "output_nbest_file.json")
output_prediction_file = os.path.join(exp_dir, "output_prediction_file.json")

all_preds, all_nbest = model.inference(
    "questions.json",
    output_prediction_file=output_prediction_file,
    output_nbest_file=output_nbest_file,
    num_samples=-1,  # -1 uses all samples for inference
)

for question_id in all_preds:
    print(all_preds[question_id])
Input
This model accepts a JSON file in SQuAD v2.0 format as input.
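As an illustration, the following sketch writes a minimal, hypothetical `questions.json` in the SQuAD v2.0 schema (the context and question here are made-up Indonesian examples, not taken from the training data):

```python
import json

# Minimal, hypothetical SQuAD v2.0-format input file ("questions.json").
# The field names (version/data/paragraphs/qas/is_impossible/answers)
# follow the SQuAD v2.0 schema.
squad_example = {
    "version": "v2.0",
    "data": [
        {
            "title": "Contoh",
            "paragraphs": [
                {
                    "context": "Jakarta adalah ibu kota Indonesia.",
                    "qas": [
                        {
                            "id": "q1",
                            "question": "Apa ibu kota Indonesia?",
                            "is_impossible": False,
                            "answers": [{"text": "Jakarta", "answer_start": 0}],
                        }
                    ],
                }
            ],
        }
    ],
}

with open("questions.json", "w", encoding="utf-8") as f:
    json.dump(squad_example, f, ensure_ascii=False, indent=2)
```

The resulting file can be passed directly as the first argument to `model.inference(...)`.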
Output
The model outputs the predicted answer for each question, extracted from the accompanying context.
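For illustration, the prediction file written by `model.inference(...)` can be inspected as a plain JSON mapping from question id to predicted answer string. The sketch below fakes a small file of that shape (the contents are assumptions for demonstration; under SQuAD v2.0, an empty string conventionally marks an unanswerable question):

```python
import json

# Fake a small prediction file of the kind model.inference(...) writes,
# then read it back and print each predicted answer.
sample = {"q1": "Jakarta", "q2": ""}  # "" = predicted unanswerable
with open("output_prediction_file.json", "w", encoding="utf-8") as f:
    json.dump(sample, f, ensure_ascii=False)

with open("output_prediction_file.json", encoding="utf-8") as f:
    predictions = json.load(f)

for question_id, answer in predictions.items():
    print(f"{question_id}: {answer!r}")
```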
Model Architecture
The model uses the BERT base uncased architecture.
Training
The model was trained for 50 epochs with a batch size of 8 per GPU (num_layer = 1).
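These hyperparameters could be expressed as a config fragment. The sketch below uses a plain dictionary whose key names echo NeMo's Hydra-style configs (`trainer.max_epochs`, `model.train_ds.batch_size`); the names are assumptions, not the exact config used for this checkpoint:

```python
# Hypothetical sketch of the training hyperparameters above; key names
# mirror common NeMo config conventions and are assumptions.
train_config = {
    "trainer": {"max_epochs": 50},             # 50 epochs
    "model": {"train_ds": {"batch_size": 8}},  # 8 examples per GPU
}

print(train_config["trainer"]["max_epochs"])
```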
Datasets
The model was trained on the SQuAD v2.0 dataset (Indonesian translation).
Performance
test_HasAns_exact = 98.0
test_HasAns_f1 = 98.0465087890625
test_HasAns_total = 100.0
test_NoAns_exact = 0.0
test_NoAns_f1 = 0.0
test_NoAns_total = 0.0
test_exact = 98.0
test_f1 = 98.0465087890625
test_loss = 0.00019806227646768093
test_total = 100.0