Update README.md
README.md CHANGED
@@ -14,7 +14,7 @@ Evo 2 40B and 7B checkpoints, trained up to 1 million sequence length, are avail
 | [evo2_40b](https://huggingface.co/arcinstitute/evo2_40b) | 50 | 40B |
 | [evo2_7b](https://huggingface.co/arcinstitute/evo2_7b) | 32 | 7B |

-
+We also share 40B, 7B, and 1B base checkpoints trained on 8192 context length:
 | Checkpoint name | Num layers | Num parameters |
 |------------------------------|----|----------|
 | [evo2_40b_base](https://huggingface.co/arcinstitute/evo2_40b_base) | 50 | 40B |
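Not part of the diff itself, but as a pointer for readers of this commit: a minimal sketch of how one of the listed checkpoints could be fetched locally, assuming the standard huggingface_hub `snapshot_download` API. The repo ID comes from the table above; anything else here is illustrative only and is not the evo2 package's own loading interface.

```python
# Sketch: download the files of one Evo 2 checkpoint repository from
# HuggingFace. Assumes `pip install huggingface_hub`; repo ID taken from
# the table in the README diff above.
from huggingface_hub import snapshot_download

# Downloads the full evo2_7b repository to the local HuggingFace cache
# and returns the path to the downloaded snapshot directory.
checkpoint_dir = snapshot_download(repo_id="arcinstitute/evo2_7b")
print(f"evo2_7b checkpoint files downloaded to: {checkpoint_dir}")
```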