Model Card for roberta-classic-sanskrit-iso-base

Model Details

Model Description

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

  • Developed by: Yuzuki Tsukagoshi
  • Model type: RoBERTa (masked language model)
  • Language(s) (NLP): Classical Sanskrit
  • License: CC-BY-4.0

Model Sources

  • Repository: GitHub (to be made public)
  • Paper [optional]: In preparation

Uses

This model is for Classical Sanskrit. Input texts should be transliterated into Latin characters according to ISO 15919.
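As a rough illustration of the expected input format, the sketch below loads the checkpoint with the 🤗 transformers fill-mask pipeline. The repository ID is taken from this model's Hub page; the example sentence and any predicted completions are purely illustrative.

```python
from transformers import pipeline

# Minimal sketch: query the pretrained masked-language model directly.
# The repository ID is this model's Hub ID; the example sentence
# ("Rama goes to the forest", ISO 15919 romanization) is illustrative only.
fill_mask = pipeline("fill-mask", model="yzk/roberta-classic-sanskrit-iso-base")

# Mask one word of an ISO 15919 transliterated sentence and ask for completions.
masked = f"rāmaḥ vanaṁ {fill_mask.tokenizer.mask_token}"
for prediction in fill_mask(masked, top_k=5):
    print(prediction["token_str"], round(prediction["score"], 3))
```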

Direct Use

This is a pretrained masked language model and is not intended for direct use as-is; it should be fine-tuned for a downstream task.
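For downstream use, a fine-tuning sketch along the following lines may serve as a starting point. The classification task, labels, toy dataset, and hyperparameters are placeholders for illustration and are not part of this model card.

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_id = "yzk/roberta-classic-sanskrit-iso-base"  # this model's Hub repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
# A new classification head is initialized on top of the pretrained encoder.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Tiny placeholder corpus: made-up ISO 15919 sentences with arbitrary labels.
train_data = Dataset.from_dict({
    "text": ["rāmaḥ vanaṁ gacchati", "sītā gṛhaṁ gacchati"],
    "label": [0, 1],
})

def tokenize(batch):
    # Inputs are expected in ISO 15919 romanization, as noted under Uses.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)

train_data = train_data.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=train_data,
)
trainer.train()
```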

Recommendations

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

Training Details

Training Data

[More Information Needed]

Training Procedure

Preprocessing [optional]

[More Information Needed]

Training Hyperparameters

  • Training regime: [More Information Needed]

Speeds, Sizes, Times [optional]

[More Information Needed]

Evaluation

Testing Data, Factors & Metrics

Testing Data

[More Information Needed]

Factors

[More Information Needed]

Metrics

[More Information Needed]

Results

[More Information Needed]

Summary

Citation [optional]

BibTeX:

[More Information Needed]

APA:

[More Information Needed]

Model Size

  • 66.6M parameters (safetensors, F32)