---
base_model:
  - mistralai/Mistral-7B-Instruct-v0.3
datasets:
  - noystl/Recombination-Extraction
language:
  - en
license: cc-by-4.0
metrics:
  - accuracy
pipeline_tag: feature-extraction
library_name: transformers
---

This repository hosts a fine-tuned Mistral model that classifies scientific abstracts by whether they describe idea recombination, as introduced in the paper *CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature*. The model consists of a LoRA adapter applied on top of the Mistral-7B-Instruct-v0.3 base model.

For detailed usage instructions and to reproduce the results, please refer to the linked GitHub repository.
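As a minimal sketch of how such a LoRA classifier is typically loaded (assuming `transformers` and `peft` are installed; the prompt template and the adapter repository id below are illustrative placeholders — the actual ones are defined in the linked GitHub repository):

```python
def build_prompt(abstract: str) -> str:
    # Hypothetical prompt template for binary classification; the real
    # template used during fine-tuning is in the linked GitHub repository.
    return (
        "Does the following scientific abstract describe an idea "
        f"recombination? Answer yes or no.\n\nAbstract: {abstract}"
    )


def load_classifier(adapter_repo: str):
    # Imports are kept local so the pure helper above can be used without
    # the heavyweight model dependencies installed.
    from transformers import AutoModelForCausalLM
    from peft import PeftModel

    # Load the Mistral base model, then attach the LoRA adapter on top.
    base = AutoModelForCausalLM.from_pretrained(
        "mistralai/Mistral-7B-Instruct-v0.3"
    )
    return PeftModel.from_pretrained(base, adapter_repo)


if __name__ == "__main__":
    from transformers import AutoTokenizer

    tok = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")
    model = load_classifier("noystl/recombination-extraction")  # illustrative id
    inputs = tok(build_prompt("We combine graph neural networks with ..."),
                 return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=5)
    print(tok.decode(out[0], skip_special_tokens=True))
```

This is a sketch, not the authors' pipeline; consult the GitHub repository for the exact prompting and inference code used to reproduce the paper's results.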

## BibTeX

```bibtex
@misc{sternlicht2025chimeraknowledgebaseidea,
      title={CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature},
      author={Noy Sternlicht and Tom Hope},
      year={2025},
      eprint={2505.20779},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.20779},
}
```

## Quick Links