
Model Card for Sentiment Classifier for Depression

This model is a fine-tuned version of BERT (bert-base-uncased) for classifying text as either Depression or Non-depression. It was trained on a custom dataset of mental-health-related social media posts and is evaluated on a held-out split of that dataset (see Results below).
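
As a quick usage reference, the snippet below loads the model through the transformers text-classification pipeline. This is a minimal sketch: the repository id is the one shown on this model page, and the label names returned at inference depend on how the labels were configured during fine-tuning, so the mapping noted in the comment is an assumption.

```python
from transformers import pipeline

# Load the fine-tuned classifier from the Hub.
classifier = pipeline(
    "text-classification",
    model="poudel/Depression_and_Non-Depression_Classifier",
)

result = classifier("I haven't felt like myself in weeks and nothing seems to help.")
print(result)
# e.g. [{'label': 'LABEL_1', 'score': 0.99}] -- the label-to-class mapping
# (Depression vs. Non-depression) is an assumption; check the model config.
```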

Training Data

The model was trained on a custom dataset of tweets labeled as either depression-related or not. Data pre-processing included tokenization and removal of special characters.
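
The exact cleaning rules are not published with this card; the sketch below illustrates the described pre-processing under the assumption of a simple regex-based special-character filter, followed by tokenization with the bert-base-uncased tokenizer (the 128-token maximum length is also an assumption).

```python
import re
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def clean_text(text: str) -> str:
    # Remove special characters, keeping letters, digits, and whitespace.
    return re.sub(r"[^a-zA-Z0-9\s]", " ", text).strip()

example = "Feeling really down today... #mentalhealth"
encoded = tokenizer(
    clean_text(example),
    truncation=True,
    padding="max_length",
    max_length=128,        # assumed sequence length, not stated in the card
    return_tensors="pt",
)
print(encoded["input_ids"].shape)  # torch.Size([1, 128])
```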

Training Procedure

The model was trained using Hugging Face's transformers library. The training was conducted on a T4 GPU over 3 epochs, with a batch size of 16 and a learning rate of 5e-5.
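
A minimal training sketch with the reported hyperparameters (3 epochs, batch size 16, learning rate 5e-5). The two-example in-memory dataset is only a stand-in for the custom tweet dataset, which is not published here, and the output directory name is arbitrary.

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Toy stand-in for the custom tweet dataset (text plus 0/1 labels).
raw = Dataset.from_dict({
    "text": ["I feel hopeless and empty", "Had a great day with friends"],
    "label": [1, 0],
})

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = raw.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # Depression vs. Non-depression
)

# Hyperparameters as reported above.
args = TrainingArguments(
    output_dir="./results",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=5e-5,
)

Trainer(model=model, args=args, train_dataset=dataset).train()
```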

Evaluation and Testing Data

The model was evaluated on a 20% holdout set from the custom dataset.
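
The evaluation code is not published with the card; one common way to produce the metrics listed below is a scikit-learn based compute_metrics hook passed to the Trainer, sketched here under that assumption.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # The Trainer passes (logits, labels) for the evaluation split.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```

Passing this function to the Trainer as compute_metrics=compute_metrics and supplying the 20% holdout as eval_dataset yields the four metrics reported under Results.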

Results

  • Accuracy: 99.87%
  • Precision: 99.91%
  • Recall: 99.81%
  • F1 Score: 99.86%

Environmental Impact

The carbon emissions from training this model can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

  • Hardware Type: T4 GPU
  • Hours used: 1
  • Cloud Provider: Google Cloud (Colab)
  • Carbon Emitted: Estimated at 0.45 kg CO2eq
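
The figures above are calculator estimates. Emissions can also be measured directly during a training run with the codecarbon package; this is an alternative sketch, not necessarily what was used for this card, and the project name below is a placeholder.

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="depression-classifier")  # placeholder name
tracker.start()

# ... run trainer.train() here ...

emissions_kg = tracker.stop()  # estimated kg CO2eq for the tracked block
print(f"Measured emissions: {emissions_kg:.3f} kg CO2eq")
```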

Technical Specifications

  • Architecture: BERT (bert-base-uncased)
  • Parameters: ~109M (float32, safetensors)
  • Training Hardware: T4 GPU in Colab
  • Training Library: Hugging Face transformers

Citation

BibTeX:

@misc{poudel2024sentimentclassifier,
  author = {Poudel, Ashish},
  title = {Sentiment Classifier for Depression},
  year = {2024},
  url = {https://huggingface.co/poudel/sentiment-classifier},
}