Update README.md
README.md CHANGED

@@ -26,14 +26,13 @@ JPharmatron-7B-base is a 7B large language model designed for pharmaceutical app
 
 <!-- Provide a longer summary of what this model is. -->
 
-The JPharmatron-7B-base is continually pre-trained using
+The JPharmatron-7B-base is continually pre-trained using 2B tokens from Japanese datasets, based on Qwen2.5-7B.
 
 - **Developed by:** EQUES Inc.
 - **Funded by [optional]:** [GENIAC Project](https://www.meti.go.jp/policy/mono_info_service/geniac/index.html)
 - **Model type:** Causal decoder-only
 - **Language(s) (NLP):** Japanese, English
 - **License:** CC-BY-SA-4.0
-- **Finetuned from model [optional]:** Qwen2.5-7B
 
 ### Model Sources [optional]
 