Model Card for train100_10e-5_30ep
A transformer-based multi-head parser for CoBaLD annotation.
The model takes pre-tokenized CoNLL-U text as input and jointly labels each token with three tiers of tags:
- Grammatical tags (lemma, UPOS, XPOS, morphological features),
- Syntactic tags (basic and enhanced Universal Dependencies),
- Semantic tags (deep slot and semantic class).
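To make the three tiers concrete, here is a minimal sketch of reading one token line of CoBaLD-style CoNLL-U in Python. It assumes the standard ten tab-separated CoNLL-U columns, with the deep slot and semantic class appended as two trailing columns; the exact column layout used by this model is an assumption, not taken from the repository, and the sample values are illustrative only.

```python
# Minimal sketch: reading one token line of CoBaLD-style CoNLL-U.
# Standard CoNLL-U has 10 tab-separated columns; the two trailing
# semantic columns (deep slot, semantic class) are an ASSUMPTION
# about how CoBaLD extends the format.

FIELDS = [
    "id", "form", "lemma", "upos", "xpos", "feats",
    "head", "deprel", "deps", "misc",
    "deepslot", "semclass",  # hypothetical CoBaLD extension columns
]

def parse_token_line(line: str) -> dict:
    """Split a tab-separated token line into named fields."""
    values = line.rstrip("\n").split("\t")
    return dict(zip(FIELDS, values))

# Example token line (values are illustrative only)
sample = "1\tcats\tcat\tNOUN\tNN\tNumber=Plur\t2\tnsubj\t2:nsubj\t_\tAgent\tANIMAL"
token = parse_token_line(sample)
print(token["upos"], token["deepslot"])  # → NOUN Agent
```

Grammatical tags live in the LEMMA/UPOS/XPOS/FEATS columns, syntactic tags in HEAD/DEPREL (basic UD) and DEPS (enhanced UD), and the semantic tier in the assumed trailing columns.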
Model Sources
- Repository: https://github.com/CobaldAnnotation/CobaldParser
- Paper: https://dialogue-conf.org/wp-content/uploads/2025/04/BaiukIBaiukAPetrovaM.009.pdf
- Demo: [coming soon]
Citation
```bibtex
@inproceedings{baiuk2025cobald,
  title={CoBaLD Parser: Joint Morphosyntactic and Semantic Annotation},
  author={Baiuk, Ilia and Baiuk, Alexandra and Petrova, Maria},
  booktitle={Proceedings of the International Conference "Dialogue"},
  volume={I},
  year={2025}
}
```
Model tree for E-katrin/train100_10e-5_30ep
- Base model: FacebookAI/xlm-roberta-base
Evaluation results
All scores are self-reported on the train100 validation set:
- Null F1: 0.807
- Lemma F1: 0.032
- Morphology F1: 0.052
- UD Jaccard: 0.634
- EUD Jaccard: 0.488
- Misc F1: 0.747
- Deep slot F1: 0.527
- Semantic class F1: 0.438