DistilRoBERTa fine-tuned for the emotion recognition task.
Base Model: distilbert/distilroberta-base
Accuracy: 0.9006
F1 Score: 0.8991
Loss: 0.3183
Training Hyperparameters
train_epochs: 20
train_batch_size: 32
warmup_steps: 50
weight_decay: 0.02
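These values map directly onto the standard transformers TrainingArguments fields. Below is a minimal fine-tuning sketch under that assumption; the output directory and everything not listed above are illustrative, not the actual training setup:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Eight emotion classes, per the label list below.
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert/distilroberta-base", num_labels=8
)
tokenizer = AutoTokenizer.from_pretrained("distilbert/distilroberta-base")

# Hyperparameters from the list above; output_dir is illustrative.
training_args = TrainingArguments(
    output_dir="text-emotion-classifier-distilroberta",
    num_train_epochs=20,
    per_device_train_batch_size=32,
    warmup_steps=50,
    weight_decay=0.02,
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=..., eval_dataset=...)
# trainer.train()
```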
Datasets (a loading sketch follows the list):
1. Emotion Dataset
2. Emotion Dataset
3. Emotion Dataset
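Assuming the three sources are merged into a single training split, the combination could look like the sketch below; the dataset IDs are placeholders, not the actual Hub names:

```python
from datasets import load_dataset, concatenate_datasets

# Placeholder IDs; substitute the three actual emotion datasets.
SOURCES = ["emotion_dataset_1", "emotion_dataset_2", "emotion_dataset_3"]

train_dataset = concatenate_datasets(
    [load_dataset(name, split="train") for name in SOURCES]
)
```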
Emotions
(0) anger (1) disgust (2) fear (3) joy (4) love (5) neutral (6) sadness (7) surprise
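A minimal inference sketch using the transformers pipeline API; the example sentence and printed score are illustrative, and the exact label strings depend on the checkpoint's id2label config:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
classifier = pipeline(
    "text-classification",
    model="argish/text-emotion-classifier-distilroberta",
)

print(classifier("I can't believe this actually worked!"))
# e.g. [{'label': 'surprise', 'score': 0.97}] -- exact strings depend
# on the model's id2label mapping.
```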
Classification Report
```text
              precision    recall  f1-score   support

       anger     0.8970    0.8714    0.8840      3679
     disgust     0.9777    1.0000    0.9887      3680
        fear     0.9035    0.8647    0.8836      3680
         joy     0.8348    0.7399    0.7845      3680
        love     0.9756    1.0000    0.9877      3680
     neutral     0.9351    0.9984    0.9657      3680
     sadness     0.8649    0.7916    0.8266      3680
    surprise     0.8133    0.9389    0.8716      3680

    accuracy                         0.9006     29439
   macro avg     0.9002    0.9006    0.8991     29439
weighted avg     0.9002    0.9006    0.8991     29439
```
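A report in this format can be regenerated with scikit-learn; the y_true and y_pred arrays below are stand-ins for real predictions over the held-out test split:

```python
from sklearn.metrics import classification_report

# Class names follow the 0-7 label mapping above.
target_names = [
    "anger", "disgust", "fear", "joy",
    "love", "neutral", "sadness", "surprise",
]

# Stand-in values; in practice these come from running the classifier
# over the test set (29,439 examples here).
y_true = [0, 1, 2, 3, 4, 5, 6, 7]
y_pred = [0, 1, 2, 3, 4, 5, 6, 7]

print(classification_report(y_true, y_pred,
                            target_names=target_names, digits=4))
```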
Sneak Peek: to be used as part of a larger multimodal emotion recognition framework (Late Fusion, Early Fusion, and an RL-based approach).
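As an illustration of the late-fusion idea only (the second modality, the fusion weights, and the probability vectors are all assumptions, not the framework's actual design):

```python
import numpy as np

# Per-class probabilities over the 8 emotions above; both vectors are
# made-up examples, e.g. from this text model and an audio model.
text_probs = np.array([0.05, 0.01, 0.02, 0.70, 0.10, 0.05, 0.02, 0.05])
audio_probs = np.array([0.10, 0.02, 0.03, 0.60, 0.05, 0.10, 0.05, 0.05])

# Late fusion: combine modality outputs at the decision level.
fused = 0.6 * text_probs + 0.4 * audio_probs
pred_id = int(fused.argmax())  # index into the label mapping above
```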
Model: argish/text-emotion-classifier-distilroberta
Base model: distilbert/distilroberta-base