Sinhala BERT
This model is pretrained on Sinhala data sources.
The model uses a compact BERT architecture with the following configuration (see the sketch below):

hidden_size = 384
num_hidden_layers = 6
num_attention_heads = 6
intermediate_size = 1024
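As a hedged illustration, the configuration above can be expressed as a Hugging Face `BertConfig`. Values not listed on this card (such as `vocab_size`) are left at the library defaults, and `BertForMaskedLM` is an assumption about the pretraining head, since the card does not name the model class.

```python
# Minimal sketch: build a BERT model with the configuration listed above.
# Assumptions: vocab_size and all other unlisted values keep BertConfig
# defaults, and BertForMaskedLM is assumed as the pretraining head.
from transformers import BertConfig, BertForMaskedLM

config = BertConfig(
    hidden_size=384,         # from the card
    num_hidden_layers=6,     # from the card
    num_attention_heads=6,   # from the card
    intermediate_size=1024,  # from the card
)

model = BertForMaskedLM(config)
print(f"parameters: {model.num_parameters():,}")  # compact relative to BERT-base
```

Note that 384 hidden dimensions split across 6 heads gives 64 dimensions per head, the same per-head width as BERT-base (768 / 12), which keeps attention behavior comparable while shrinking the overall model.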
The following hyperparameters were used during training: