# Model Card for train20_encoder_freezed_1epoch_10e-5
A transformer-based multihead parser for CoBaLD annotation.
The model takes pre-tokenized CoNLL-U text as input and jointly labels each token with three tiers of tags:
- Grammatical tags (lemma, UPOS, XPOS, morphological features),
- Syntactic tags (basic and enhanced Universal Dependencies),
- Semantic tags (deep slot and semantic class).
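To make the three tiers concrete, below is an illustrative CoNLL-U-style row for a single token. The column layout and the tag values shown (in particular the trailing deep slot and semantic class columns) are schematic examples for orientation only, not output of this model; consult the repository for the exact CoBaLD column format.

```
# ID  FORM  LEMMA  UPOS  XPOS  FEATS        HEAD  DEPREL  DEPS     DEEPSLOT  SEMCLASS
1     cats  cat    NOUN  NN    Number=Plur  2     nsubj   2:nsubj  Agent     ANIMAL
```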
## Model Sources
- Repository: https://github.com/CobaldAnnotation/CobaldParser
- Paper: https://dialogue-conf.org/wp-content/uploads/2025/04/BaiukIBaiukAPetrovaM.009.pdf
- Demo: [coming soon]
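A minimal loading sketch: the snippet below only downloads the checkpoint with `huggingface_hub`; the actual parsing entry point is defined in the CobaldParser repository, so the commented-out class and method names are hypothetical placeholders, not a confirmed API.

```python
from huggingface_hub import snapshot_download

# Fetch the model checkpoint from the Hub into a local directory.
local_dir = snapshot_download(
    repo_id="E-katrin/train20_encoder_freezed_1epoch_10e-5"
)

# Hypothetical usage -- the real class/function names live in the
# CobaldAnnotation/CobaldParser repository and may differ:
# from cobald_parser import CobaldParser            # hypothetical import
# parser = CobaldParser.from_pretrained(local_dir)  # hypothetical loader
# annotated = parser.parse(conllu_text)             # pre-tokenized CoNLL-U input
print(f"Checkpoint downloaded to {local_dir}")
```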
## Citation
```bibtex
@inproceedings{baiuk2025cobald,
  title={CoBaLD Parser: Joint Morphosyntactic and Semantic Annotation},
  author={Baiuk, Ilia and Baiuk, Alexandra and Petrova, Maria},
  booktitle={Proceedings of the International Conference "Dialogue"},
  volume={I},
  year={2025}
}
```
## Base Model
- FacebookAI/xlm-roberta-base
## Evaluation Results
All metrics are self-reported on the train20 validation set.

| Metric        | Value |
|---------------|-------|
| Null F1       | 0.748 |
| Lemma F1      | 0.014 |
| Morphology F1 | 0.048 |
| UD Jaccard    | 0.577 |
| EUD Jaccard   | 0.403 |
| Miscs F1      | 0.746 |
| Deepslot F1   | 0.464 |
| Semclass F1   | 0.356 |