---
language:
- en
base_model:
- distilbert/distilroberta-base
pipeline_tag: text-classification
library_name: transformers
---

DistilRoBERTa fine-tuned for the emotion recognition task.

🗨️ Base Model: [distilbert/distilroberta-base](https://huggingface.co/distilbert/distilroberta-base) 

🎯 Accuracy: 0.9006 \
✔️ F1 Score: 0.8991 \
📉 Loss: 0.3183
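
A minimal inference sketch using the 🤗 Transformers `pipeline` API; the model id below is a placeholder for this repository's id on the Hub:

```python
from transformers import pipeline

# Placeholder model id; replace with this repository's id on the Hub.
classifier = pipeline("text-classification", model="<username>/<model-id>")

print(classifier("I can't believe how well this turned out!"))
# Expected output shape: [{'label': 'joy', 'score': ...}]
```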


### Training Hyperparameters
train_epochs: 20 \
train_batch_size: 32 \
warmup_steps: 50 \
weight_decay: 0.02
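
A hedged sketch of how these hyperparameters map onto `TrainingArguments`; `output_dir` and anything not listed above are assumptions, not taken from this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilroberta-emotion",  # assumption, not from the card
    num_train_epochs=20,
    per_device_train_batch_size=32,
    warmup_steps=50,
    weight_decay=0.02,
)
```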


### Datasets
1️⃣ [Emotion Dataset (abdallahwagih)](https://www.kaggle.com/datasets/abdallahwagih/emotion-dataset) \
2️⃣ [Emotion Dataset (parulpandey)](https://www.kaggle.com/datasets/parulpandey/emotion-dataset) \
3️⃣ [Emotion Dataset (chanakyar)](https://www.kaggle.com/datasets/chanakyar/emotion-dataset-link)



### Emotions
(0) anger \
(1) disgust \
(2) fear \
(3) joy \
(4) love \
(5) neutral \
(6) sadness \
(7) surprise
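
For reference, the index-to-emotion mapping implied by the list above, assuming the model config stores the labels in this order:

```python
# Assumed id2label mapping, matching the list above.
id2label = {
    0: "anger", 1: "disgust", 2: "fear", 3: "joy",
    4: "love", 5: "neutral", 6: "sadness", 7: "surprise",
}
label2id = {name: idx for idx, name in id2label.items()}
```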


### Classification Report
```
              precision    recall  f1-score   support

       anger     0.8970    0.8714    0.8840      3679
     disgust     0.9777    1.0000    0.9887      3680
        fear     0.9035    0.8647    0.8836      3680
         joy     0.8348    0.7399    0.7845      3680
        love     0.9756    1.0000    0.9877      3680
     neutral     0.9351    0.9984    0.9657      3680
     sadness     0.8649    0.7916    0.8266      3680
    surprise     0.8133    0.9389    0.8716      3680

    accuracy                         0.9006     29439
   macro avg     0.9002    0.9006    0.8991     29439
weighted avg     0.9002    0.9006    0.8991     29439
```
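
A report like the one above can be generated with scikit-learn; this is an illustrative sketch with dummy labels, not the actual evaluation script:

```python
from sklearn.metrics import classification_report

emotions = ["anger", "disgust", "fear", "joy",
            "love", "neutral", "sadness", "surprise"]

# Dummy stand-ins for the test-set label ids and model predictions.
y_true = [0, 3, 7, 5, 2]
y_pred = [0, 3, 6, 5, 2]

print(classification_report(y_true, y_pred,
                            labels=list(range(len(emotions))),
                            target_names=emotions, digits=4))
```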

*Sneak Peek*: To be used as part of a larger multimodal emotion recognition framework (Late Fusion, Early Fusion, and an RL-based approach 😱).