---
language: en
license: apache-2.0
tags:
- metaphor-detection
- bert
- text-classification
- nlp
- transformer
model-index:
- name: Fine-Tuned Metaphor Detection Model
  results:
  - task:
      name: text-classification
      type: text-classification
    metrics:
    - name: Accuracy
      value: 72
      type: accuracy
metrics:
- accuracy
base_model:
- Sasidhar1826/common_metaphors_detection
pipeline_tag: text-classification
datasets:
- Sasidhar1826/manual_data_on_metaphors
---
# Fine-Tuned Metaphor Detection Model

This model is an extension of my previously trained model, [Sasidhar1826/common_metaphors_detection](https://huggingface.co/Sasidhar1826/common_metaphors_detection).

It is a fine-tuned version of a BERT-based model for metaphor detection in text. The model was trained on a custom dataset ([Sasidhar1826/manual_data_on_metaphors](https://huggingface.co/datasets/Sasidhar1826/manual_data_on_metaphors)) of sentences labeled as either metaphorical or literal.

## Model Details

- **Model architecture**: BERT-based model
- **Number of labels**: 2 (Metaphor, Literal)
- **Training epochs**: 1
- **Batch size**: 8
- **Learning rate**: 1e-5
- **Evaluation metric**: Accuracy
- **Accuracy**: 72%
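For reference, the hyperparameters listed above roughly correspond to the fine-tuning setup sketched below. This is only a minimal sketch using the Hugging Face `Trainer` API, not the exact training script; the `text`/`label` column names, the `train` split name, the `max_length` of 128, and the output directory are assumptions, so adjust them to the actual dataset layout.

```python
# Minimal fine-tuning sketch (not the exact script used to train this model).
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Base checkpoint from the `base_model` field of this card
base = "Sasidhar1826/common_metaphors_detection"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=2)

# Dataset from the `datasets` field; "text"/"label" columns are an assumption
dataset = load_dataset("Sasidhar1826/manual_data_on_metaphors")

def tokenize(batch):
    # max_length=128 is an arbitrary choice for the sketch
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

tokenized = dataset.map(tokenize, batched=True)

# Hyperparameters from the "Model Details" section above
args = TrainingArguments(
    output_dir="fine-tuned-metaphor-detection",  # hypothetical output path
    num_train_epochs=1,
    per_device_train_batch_size=8,
    learning_rate=1e-5,
)

trainer = Trainer(model=model, args=args, train_dataset=tokenized["train"])
trainer.train()
```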
## How to use

You can use this model to predict whether a sentence contains a metaphor. Below is an example of how to load the model and run inference:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Load model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("your-username/fine-tuned-metaphor-detection")
model = AutoModelForSequenceClassification.from_pretrained("your-username/fine-tuned-metaphor-detection")

# Example text
text = "Time is a thief."

# Tokenize input and get predictions
inputs = tokenizer(text, return_tensors="pt", padding=True, truncation=True)
with torch.no_grad():
    outputs = model(**inputs)

logits = outputs.logits
prediction = torch.argmax(logits, dim=-1)

# Label 1 is treated as "Metaphor", label 0 as "Literal"
print("Prediction:", "Metaphor" if prediction.item() == 1 else "Literal")
```