---
language:
- en
base_model:
- distilbert/distilroberta-base
pipeline_tag: text-classification
library_name: transformers
---
# DistilRoBERTa Fine-Tuned for Emotion Recognition

- 🗨️ Base Model: distilbert/distilroberta-base
- 🎯 Accuracy: 0.9006
- ✔️ F1 Score: 0.8991
- 📉 Loss: 0.3183
## Training Hyperparameters

- train_epochs: 20
- train_batch_size: 32
- warmup_steps: 50
- weight_decay: 0.02
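As a rough sketch, these hyperparameters map onto `transformers.TrainingArguments` as below; `output_dir` and any settings not listed above are assumptions, not the actual training configuration.

```python
from transformers import TrainingArguments

# Sketch only: reproduces the listed hyperparameters.
# output_dir is a hypothetical placeholder.
training_args = TrainingArguments(
    output_dir="distilroberta-emotion",  # hypothetical path
    num_train_epochs=20,                 # train_epochs: 20
    per_device_train_batch_size=32,      # train_batch_size: 32
    warmup_steps=50,
    weight_decay=0.02,
)
```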
## Datasets

1️⃣ Emotion Dataset
2️⃣ Emotion Dataset
3️⃣ Emotion Dataset
## Emotions

(0) anger, (1) disgust, (2) fear, (3) joy, (4) love, (5) neutral, (6) sadness, (7) surprise
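The index-to-label mapping above can be expressed in code; the snippet below is a minimal sketch assuming this ordering matches the model's `id2label` config, which is not confirmed by the card itself.

```python
# Assumed id2label mapping, taken from the label list above.
ID2LABEL = {
    0: "anger", 1: "disgust", 2: "fear", 3: "joy",
    4: "love", 5: "neutral", 6: "sadness", 7: "surprise",
}

def label_from_logits(logits):
    """Return the emotion label for the highest-scoring class index."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return ID2LABEL[best]

# Dummy scores for illustration; index 3 is largest.
print(label_from_logits([0.1, 0.2, 0.0, 2.5, 0.3, 0.1, 0.0, 0.4]))  # joy
```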
## Classification Report

|              | precision | recall | f1-score | support |
|--------------|-----------|--------|----------|---------|
| anger        | 0.8970    | 0.8714 | 0.8840   | 3679    |
| disgust      | 0.9777    | 1.0000 | 0.9887   | 3680    |
| fear         | 0.9035    | 0.8647 | 0.8836   | 3680    |
| joy          | 0.8348    | 0.7399 | 0.7845   | 3680    |
| love         | 0.9756    | 1.0000 | 0.9877   | 3680    |
| neutral      | 0.9351    | 0.9984 | 0.9657   | 3680    |
| sadness      | 0.8649    | 0.7916 | 0.8266   | 3680    |
| surprise     | 0.8133    | 0.9389 | 0.8716   | 3680    |
| accuracy     |           |        | 0.9006   | 29439   |
| macro avg    | 0.9002    | 0.9006 | 0.8991   | 29439   |
| weighted avg | 0.9002    | 0.9006 | 0.8991   | 29439   |
Sneak Peek: to be used as part of a larger multimodal emotion recognition framework (late fusion, early fusion, and an RL-based approach 😱).