Instructions for using deprem-ml/intent_128k_v13 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use deprem-ml/intent_128k_v13 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="deprem-ml/intent_128k_v13")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("deprem-ml/intent_128k_v13")
model = AutoModelForSequenceClassification.from_pretrained("deprem-ml/intent_128k_v13")
```
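As a minimal usage sketch, the pipeline can be called directly on raw text. The sample sentence and the printed output below are illustrative assumptions; the actual label set and scores come from the model's own configuration, which is not listed here.

```python
from transformers import pipeline

pipe = pipeline("text-classification", model="deprem-ml/intent_128k_v13")

# Illustrative Turkish input ("People are trapped under rubble, an urgent rescue team is needed.").
# Labels and scores depend on the model's own config.
result = pipe("Enkaz altında kalanlar var, acil kurtarma ekibi gerekiyor.")
print(result)  # e.g. [{'label': '...', 'score': 0.97}]
```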
- Notebooks
- Google Colab
- Kaggle
Special tokens map for the tokenizer:

```json
{
  "cls_token": "[CLS]",
  "mask_token": "[MASK]",
  "pad_token": "[PAD]",
  "sep_token": "[SEP]",
  "unk_token": "[UNK]"
}
```
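As a quick sanity check (a sketch, assuming the tokenizer loads as shown above), the same values are exposed as attributes on the loaded tokenizer and are inserted automatically when encoding text:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deprem-ml/intent_128k_v13")

# These attributes mirror the special tokens map above.
print(tokenizer.cls_token, tokenizer.sep_token, tokenizer.pad_token,
      tokenizer.unk_token, tokenizer.mask_token)

# Encoding a sentence wraps the tokens in [CLS] ... [SEP] automatically.
ids = tokenizer("örnek cümle")["input_ids"]  # "example sentence"
print(tokenizer.convert_ids_to_tokens(ids))
```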