---
base_model:
- mistralai/Mistral-7B-Instruct-v0.3
datasets:
- noystl/Recombination-Extraction
language:
- en
license: cc-by-4.0
metrics:
- accuracy
pipeline_tag: feature-extraction
library_name: transformers
---

This repository hosts a fine-tuned Mistral model that classifies scientific abstracts by whether they describe idea recombination, as introduced in the paper [CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature](https://arxiv.org/abs/2505.20779). The model is a LoRA adapter trained on top of mistralai/Mistral-7B-Instruct-v0.3.

For detailed usage instructions and steps to reproduce the results, see the GitHub repository linked under Quick Links below.
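
Below is a minimal, unofficial sketch of attaching the LoRA adapter to the base model with 🤗 Transformers and PEFT. The repository id placeholder, the prompt template, and the yes/no parsing are illustrative assumptions, not the authors' recipe; the exact inference setup is defined in the GitHub repository.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE = "mistralai/Mistral-7B-Instruct-v0.3"
ADAPTER = "<this-repo-id>"  # placeholder: replace with the HF id of this adapter repo

tokenizer = AutoTokenizer.from_pretrained(BASE)
base = AutoModelForCausalLM.from_pretrained(
    BASE, torch_dtype=torch.bfloat16, device_map="auto"
)
# Attach the LoRA adapter on top of the frozen base model
model = PeftModel.from_pretrained(base, ADAPTER)
model.eval()

abstract = "We combine retrieval-augmented generation with symbolic planners to ..."
# Hypothetical prompt: the template actually used in training is in the GitHub repo.
prompt = (
    "[INST] Does the following abstract describe an idea recombination? "
    f"Answer yes or no.\n\n{abstract} [/INST]"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=5)
# Decode only the newly generated tokens (the answer)
print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```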

**BibTeX**
```bibtex
@misc{sternlicht2025chimeraknowledgebaseidea,
      title={CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature}, 
      author={Noy Sternlicht and Tom Hope},
      year={2025},
      eprint={2505.20779},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.20779}, 
}
```

**Quick Links**
- ๐ŸŒ [Project](https://noy-sternlicht.github.io/CHIMERA-Web)
- ๐Ÿ“ƒ [Paper](https://arxiv.org/abs/2505.20779)
- ๐Ÿ› ๏ธ [Code](https://github.com/noy-sternlicht/CHIMERA-KB)