Commit 4921276 (verified) · Parent(s): f27c132
Committed by noystl and nielsr (HF Staff)

Add pipeline tag, library name, metrics and correct license (#1)

- Add pipeline tag, library name, metrics and correct license (180f57a72dcd98341242292d35836c1719fd22d8)


Co-authored-by: Niels Rogge <nielsr@users.noreply.huggingface.co>

Files changed (1): README.md (+8, -4)
README.md CHANGED

@@ -1,14 +1,18 @@
 ---
-license: cc
+base_model:
+- mistralai/Mistral-7B-Instruct-v0.3
 datasets:
 - noystl/Recombination-Extraction
 language:
 - en
-base_model:
-- mistralai/Mistral-7B-Instruct-v0.3
+license: cc-by-4.0
+metrics:
+- accuracy
+pipeline_tag: feature-extraction
+library_name: transformers
 ---
 
-This Hugging Face repository hosts a fine-tuned Mistral model designed to classify scientific abstracts based on whether they involve idea recombination, as introduced in the paper [CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature](https://huggingface.co/papers/2505.20779). The model employs a LoRA adapter on top of a Mistral base model.
+This Hugging Face repository hosts a fine-tuned Mistral model designed to classify scientific abstracts based on whether they involve idea recombination, as introduced in the paper [CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature](https://arxiv.org/abs/2505.20779). The model employs a LoRA adapter on top of a Mistral base model.
 
 For detailed usage instructions and to reproduce the results, please refer to the linked GitHub repository.
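
For context, the README describes a LoRA adapter applied on top of mistralai/Mistral-7B-Instruct-v0.3. The card itself ships no loading code, so below is a minimal sketch of how such an adapter is typically loaded with transformers and peft; the adapter repo id and the prompt wording are hypothetical placeholders, not taken from this commit.

```python
# Minimal sketch, not this repo's official usage code; assumptions flagged below.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.3"  # base_model from the YAML front matter
adapter_id = "noystl/recombination-classifier"  # hypothetical id: substitute the actual adapter repo

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attaches the LoRA weights to the base model

# Prompt the instruction-tuned model to label one abstract (prompt format is an assumption).
abstract = "We adapt contrastive learning from computer vision to protein structure prediction."
prompt = (
    "Does the following abstract describe idea recombination? Answer yes or no.\n\n"
    + abstract
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=5)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Note that while the new metadata tags the model as feature-extraction, the card describes a yes/no classification use; on the Hub, pipeline_tag mainly controls discoverability and the inference widget.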