Column schema (name, dtype, and observed min/max or class count):

| column | dtype | min | max |
| --- | --- | --- | --- |
| model_type | stringclasses | 5 values | |
| model | stringlengths | 12 | 62 |
| AVG | float64 | 0.03 | 0.7 |
| CG | float64 | 0 | 0.68 |
| EL | float64 | 0 | 0.77 |
| FA | float64 | 0 | 0.62 |
| HE | float64 | 0 | 0.83 |
| MC | float64 | 0 | 0.95 |
| MR | float64 | 0 | 0.95 |
| MT | float64 | 0.19 | 0.86 |
| NLI | float64 | 0 | 0.97 |
| QA | float64 | 0 | 0.77 |
| RC | float64 | 0 | 0.94 |
| SUM | float64 | 0 | 0.29 |
| aio_char_f1 | float64 | 0 | 0.9 |
| alt-e-to-j_bert_score_ja_f1 | float64 | 0 | 0.88 |
| alt-e-to-j_bleu_ja | float64 | 0 | 16 |
| alt-e-to-j_comet_wmt22 | float64 | 0.2 | 0.92 |
| alt-j-to-e_bert_score_en_f1 | float64 | 0 | 0.96 |
| alt-j-to-e_bleu_en | float64 | 0 | 20.1 |
| alt-j-to-e_comet_wmt22 | float64 | 0.17 | 0.89 |
| chabsa_set_f1 | float64 | 0 | 0.77 |
| commonsensemoralja_exact_match | float64 | 0 | 0.94 |
| jamp_exact_match | float64 | 0 | 1 |
| janli_exact_match | float64 | 0 | 1 |
| jcommonsenseqa_exact_match | float64 | 0 | 0.98 |
| jemhopqa_char_f1 | float64 | 0 | 0.71 |
| jmmlu_exact_match | float64 | 0 | 0.81 |
| jnli_exact_match | float64 | 0 | 0.94 |
| jsem_exact_match | float64 | 0 | 0.96 |
| jsick_exact_match | float64 | 0 | 0.93 |
| jsquad_char_f1 | float64 | 0 | 0.94 |
| jsts_pearson | float64 | -0.35 | 0.94 |
| jsts_spearman | float64 | -0.6 | 0.91 |
| kuci_exact_match | float64 | 0 | 0.93 |
| mawps_exact_match | float64 | 0 | 0.95 |
| mbpp_code_exec | float64 | 0 | 0.68 |
| mbpp_pylint_check | float64 | 0 | 0.99 |
| mmlu_en_exact_match | float64 | 0 | 0.86 |
| niilc_char_f1 | float64 | 0 | 0.7 |
| wiki_coreference_set_f1 | float64 | 0 | 0.4 |
| wiki_dependency_set_f1 | float64 | 0 | 0.88 |
| wiki_ner_set_f1 | float64 | 0 | 0.33 |
| wiki_pas_set_f1 | float64 | 0 | 0.57 |
| wiki_reading_char_f1 | float64 | 0 | 0.94 |
| wikicorpus-e-to-j_bert_score_ja_f1 | float64 | 0 | 0.88 |
| wikicorpus-e-to-j_bleu_ja | float64 | 0 | 24 |
| wikicorpus-e-to-j_comet_wmt22 | float64 | 0.18 | 0.87 |
| wikicorpus-j-to-e_bert_score_en_f1 | float64 | 0 | 0.93 |
| wikicorpus-j-to-e_bleu_en | float64 | 0 | 15.9 |
| wikicorpus-j-to-e_comet_wmt22 | float64 | 0.17 | 0.79 |
| xlsum_ja_bert_score_ja_f1 | float64 | 0 | 0.79 |
| xlsum_ja_bleu_ja | float64 | 0 | 10.2 |
| xlsum_ja_rouge1 | float64 | 0 | 52.8 |
| xlsum_ja_rouge2 | float64 | 0 | 29.2 |
| xlsum_ja_rouge2_scaling | float64 | 0 | 0.29 |
| xlsum_ja_rougeLsum | float64 | 0 | 44.9 |
| architecture | stringclasses | 12 values | |
| precision | stringclasses | 3 values | |
| license | stringclasses | 14 values | |
| params | float64 | 0 | 70.6 |
| likes | int64 | 0 | 6.19k |
| revision | stringclasses | 1 value | |
| num_few_shot | int64 | 0 | 4 |
| add_special_tokens | stringclasses | 2 values | |
| llm_jp_eval_version | stringclasses | 1 value | |
| vllm_version | stringclasses | 1 value | |
🟢 : pretrained
Qwen/Qwen1.5-MoE-A2.7B
0.4386
0.0863
0.3744
0.0911
0.5078
0.575
0.592
0.7597
0.6169
0.2917
0.8385
0.0907
0.2586
0.8229
8.9796
0.8436
0.9378
13.4295
0.853
0.3744
0.5564
0.4741
0.5389
0.7346
0.3285
0.4504
0.7588
0.6932
0.6196
0.8385
0.2298
0.2547
0.4341
0.592
0.0863
0.2349
0.5652
0.2878
0.0051
0.0974
0.0088
0.0318
0.3124
0.7337
7.3131
0.6786
0.875
8.8123
0.6636
0.6875
1.5071
27.1068
9.0872
0.0907
22.8268
Qwen2MoeForCausalLM
bfloat16
other
14.316
195
main
4
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
Qwen/Qwen1.5-MoE-A2.7B
0.2984
0.0863
0.032
0.0402
0.2393
0.4067
0.158
0.7376
0.592
0.1866
0.7135
0.0907
0.1371
0.7859
6.6823
0.8049
0.9338
11.5663
0.8417
0.032
0.5772
0.4511
0.5
0.3512
0.2471
0.1186
0.5744
0.6717
0.7629
0.7135
0.4976
0.5216
0.2917
0.158
0.0863
0.2349
0.3601
0.1756
0
0
0
0
0.2008
0.7078
4.7317
0.6649
0.8626
6.9768
0.6389
0.6875
1.5071
27.1068
9.0872
0.0907
22.8268
Qwen2MoeForCausalLM
bfloat16
other
14.316
195
main
0
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
Qwen/Qwen1.5-MoE-A2.7B-Chat
0.132
0
0.0147
0.0315
0.0119
0.2834
0.042
0.5432
0.1053
0.0875
0.2662
0.0659
0.0514
0.6426
1.9922
0.5859
0.8299
8.2738
0.5428
0.0147
0.3898
0.0948
0.1819
0.2815
0.1208
0.0011
0.0645
0.0265
0.1589
0.2662
0.5081
0.5001
0.179
0.042
0
0
0.0226
0.0904
0.0007
0
0
0
0.1566
0.6003
1.8859
0.5314
0.8059
5.1442
0.5125
0.6627
1.4848
17.9876
6.5929
0.0659
15.7201
Qwen2MoeForCausalLM
bfloat16
other
14.316
113
main
0
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
Qwen/Qwen1.5-MoE-A2.7B-Chat
0.4026
0
0.2937
0.0752
0.4997
0.6444
0.586
0.7237
0.475
0.2323
0.8325
0.0659
0.2399
0.7804
8.1237
0.7863
0.9019
11.8844
0.7543
0.2937
0.7112
0.4339
0.5236
0.7435
0.2003
0.4456
0.574
0.2961
0.5474
0.8325
0.4445
0.4623
0.4785
0.586
0
0
0.5538
0.2567
0.0039
0.0949
0.0088
0.0176
0.2505
0.7104
6.5279
0.6786
0.8702
7.566
0.6756
0.6627
1.4848
17.9876
6.5929
0.0659
15.7201
Qwen2MoeForCausalLM
bfloat16
other
14.316
113
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Base-70B
0.5235
0.0743
0.3467
0.1181
0.6628
0.8151
0.614
0.8419
0.7034
0.5459
0.8984
0.138
0.6817
0.8598
12.8622
0.9015
0.9525
15.6291
0.88
0.3467
0.8284
0.6006
0.7097
0.9124
0.4437
0.6145
0.6163
0.7885
0.8019
0.8984
0.8866
0.8702
0.7044
0.614
0.0743
0.2811
0.711
0.5123
0
0
0.0442
0.0034
0.5431
0.833
10.9344
0.8411
0.9024
10.8355
0.7448
0.7206
3.1555
37.5003
13.7963
0.138
31.2
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Base-70B
0.6075
0.0743
0.5569
0.2863
0.7172
0.8627
0.91
0.8445
0.7436
0.6263
0.9223
0.138
0.7223
0.8667
13.7175
0.9059
0.9536
16.6449
0.8795
0.5569
0.8923
0.6293
0.8208
0.9231
0.5743
0.671
0.7342
0.7967
0.7372
0.9223
0.8957
0.8636
0.7728
0.91
0.0743
0.2811
0.7634
0.5824
0.0631
0.3562
0.1327
0.0701
0.8095
0.8481
14.7284
0.8473
0.9079
12.4961
0.7453
0.7206
3.1555
37.5003
13.7963
0.138
31.2
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V2-70B
0.6869
0.5763
0.6121
0.3492
0.7434
0.9071
0.94
0.8589
0.7842
0.7304
0.9231
0.1316
0.8779
0.8747
14.6956
0.9129
0.9612
19.6814
0.8934
0.6121
0.9259
0.6638
0.8583
0.9508
0.6398
0.7193
0.7905
0.7841
0.8242
0.9231
0.9058
0.879
0.8445
0.94
0.5763
0.988
0.7676
0.6734
0.1292
0.5374
0.1062
0.0654
0.9075
0.8614
16.5599
0.8598
0.9173
12.964
0.7693
0.7141
4.1127
31.754
13.1557
0.1316
27.6066
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V2-70B
0.6192
0.5763
0.3044
0.1649
0.7056
0.8782
0.912
0.8519
0.7724
0.6035
0.9103
0.1316
0.8118
0.866
14.3229
0.9085
0.9602
18.6973
0.8918
0.3044
0.9148
0.6983
0.7639
0.9419
0.4801
0.6829
0.7933
0.7999
0.8064
0.9103
0.8994
0.8674
0.7778
0.912
0.5763
0.988
0.7283
0.5187
0
0.0138
0.0177
0.0064
0.7868
0.8421
13.1854
0.8486
0.9091
12.2812
0.7587
0.7141
4.1127
31.754
13.1557
0.1316
27.6066
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
nitky/Llama-3.3-FakeSwallow-70B-Instruct-v0.1
0.2962
0.012
0.1568
0.0886
0.1681
0.8662
0
0.7481
0.5263
0.3231
0.287
0.0823
0.3846
0.8299
13.8209
0.8579
0.9298
16.9974
0.825
0.1568
0.9061
0.5489
0.7847
0.9151
0.3202
0.0291
0.2872
0.6402
0.3704
0.287
0.8796
0.8452
0.7774
0
0.012
0.0181
0.3071
0.2644
0.0045
0.0098
0.0068
0.0027
0.4192
0.748
11.3192
0.7259
0.8347
10.5215
0.5835
0.6797
3.3725
18.3905
8.2221
0.0823
16.59
LlamaForCausalLM
bfloat16
llama3.3
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
nitky/Llama-3.3-FakeSwallow-70B-Instruct-v0.1
0.6209
0.012
0.5909
0.2866
0.7656
0.9084
0.93
0.8517
0.7951
0.7123
0.8945
0.0823
0.8604
0.8719
14.6538
0.9113
0.9591
18.1405
0.8901
0.5909
0.9336
0.6753
0.8833
0.9473
0.6168
0.73
0.7843
0.8049
0.8277
0.8945
0.875
0.8404
0.8443
0.93
0.012
0.0181
0.8012
0.6596
0.0305
0.4083
0.0619
0.057
0.875
0.8525
15.5976
0.8536
0.9094
12.1566
0.7517
0.6797
3.3725
18.3905
8.2221
0.0823
16.59
LlamaForCausalLM
bfloat16
llama3.3
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V3-70B
0.6516
0.5904
0.5507
0.2941
0.7434
0.892
0.95
0.733
0.7793
0.6947
0.9152
0.0249
0.8473
0.8252
12.035
0.7138
0.9597
18.7671
0.8914
0.5507
0.8973
0.6667
0.8889
0.9535
0.5882
0.7173
0.749
0.7765
0.8155
0.9152
0.8996
0.8717
0.8253
0.95
0.5904
0.994
0.7695
0.6485
0.1123
0.3552
0.0973
0.0377
0.8679
0.7829
13.3156
0.5649
0.9144
12.7946
0.7619
0.6578
2.543
7.591
2.4919
0.0249
6.1488
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V3-70B
0.5824
0.5904
0.3697
0.1477
0.7065
0.8691
0.902
0.7108
0.7649
0.4313
0.8888
0.0249
0.5583
0.8086
11.5099
0.6872
0.9592
17.8806
0.8908
0.3697
0.893
0.6753
0.8
0.9428
0.3588
0.6755
0.7555
0.7917
0.8019
0.8888
0.8923
0.8615
0.7714
0.902
0.5904
0.994
0.7375
0.3768
0
0.0212
0.0442
0.0041
0.669
0.7544
10.6504
0.509
0.9091
12.1008
0.7562
0.6578
2.543
7.591
2.4919
0.0249
6.1488
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
lightblue/qwen2.5-7B-instruct-simpo
0.5442
0
0.4828
0.2259
0.6593
0.8366
0.798
0.8257
0.7261
0.4391
0.8824
0.1104
0.4022
0.8482
11.0474
0.8982
0.9468
15.022
0.8726
0.4828
0.8602
0.5805
0.7306
0.9169
0.5004
0.6224
0.8435
0.6888
0.7873
0.8824
0.8501
0.8349
0.7327
0.798
0
0
0.6962
0.4147
0.0129
0.304
0.0531
0.0705
0.6888
0.7998
8.5364
0.8166
0.8841
9.3814
0.7155
0.7129
1.8351
36.7964
11.0654
0.1104
29.2243
Qwen2ForCausalLM
bfloat16
other
7.616
0
main
4
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
lightblue/qwen2.5-7B-instruct-simpo
0.429
0
0.3362
0.1016
0.6107
0.7649
0.47
0.68
0.6801
0.2546
0.7102
0.1104
0.2279
0.7469
7.5092
0.7948
0.8338
11.6715
0.5879
0.3362
0.858
0.523
0.6458
0.79
0.2642
0.5657
0.7099
0.7481
0.7737
0.7102
0.8802
0.8445
0.6468
0.47
0
0
0.6557
0.2716
0.0042
0.0084
0.0088
0
0.4866
0.7108
6.656
0.7437
0.8239
7.623
0.5935
0.7129
1.8351
36.7964
11.0654
0.1104
29.2243
Qwen2ForCausalLM
bfloat16
other
7.616
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V1-70B
0.6704
0.5743
0.5662
0.3163
0.7432
0.8934
0.938
0.8467
0.7865
0.7061
0.9253
0.0778
0.8595
0.867
13.7766
0.907
0.9587
19.1048
0.8895
0.5662
0.898
0.6523
0.8764
0.9517
0.6001
0.7153
0.7568
0.7961
0.8508
0.9253
0.9004
0.8729
0.8304
0.938
0.5743
0.9779
0.7711
0.6588
0.0811
0.4547
0.0885
0.0705
0.8866
0.843
16.086
0.8226
0.9163
12.9703
0.7677
0.6585
3.4394
20.4636
7.7787
0.0778
17.6038
LlamaForCausalLM
bfloat16
apache-2.0
70.554
1
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V1-70B
0.6124
0.5743
0.3459
0.1635
0.7128
0.8726
0.882
0.8411
0.7755
0.5806
0.9107
0.0778
0.7407
0.8567
13.0545
0.8976
0.9583
17.9176
0.8888
0.3459
0.9126
0.7126
0.7667
0.9401
0.4846
0.6862
0.7613
0.8049
0.8319
0.9107
0.8922
0.8622
0.765
0.882
0.5743
0.9779
0.7394
0.5166
0.001
0.0002
0.0354
0.0063
0.7743
0.8298
12.671
0.8212
0.9071
11.5502
0.7567
0.6585
3.4394
20.4636
7.7787
0.0778
17.6038
LlamaForCausalLM
bfloat16
apache-2.0
70.554
1
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V2-70B
0.6414
0.5803
0.5427
0.2765
0.7441
0.8848
0.938
0.6936
0.7772
0.6819
0.9156
0.0211
0.8522
0.8025
11.1648
0.6256
0.9578
18.1879
0.8878
0.5427
0.8755
0.6351
0.8806
0.9517
0.5334
0.717
0.7445
0.7879
0.838
0.9156
0.8942
0.873
0.8271
0.938
0.5803
0.9859
0.7712
0.6601
0.0851
0.3243
0.0708
0.0441
0.858
0.7589
13.8172
0.4988
0.9142
12.6592
0.7622
0.6476
2.3307
6.4034
2.1156
0.0211
5.2067
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V2-70B
0.5714
0.5803
0.3355
0.1534
0.7073
0.8662
0.818
0.6776
0.7781
0.4557
0.8925
0.0211
0.5777
0.7823
10.2364
0.5925
0.9575
17.5235
0.8876
0.3355
0.9038
0.7069
0.7667
0.9366
0.4053
0.6798
0.7732
0.7967
0.8468
0.8925
0.8866
0.8616
0.7582
0.818
0.5803
0.9859
0.7349
0.384
0.001
0.0008
0.0442
0.0018
0.7189
0.7474
10.622
0.4748
0.9071
11.5086
0.7554
0.6476
2.3307
6.4034
2.1156
0.0211
5.2067
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V3-70B
0.5556
0.5843
0.34
0.1394
0.7036
0.863
0.792
0.604
0.7782
0.4093
0.8832
0.0148
0.5108
0.7502
8.9918
0.4073
0.9572
17.5227
0.887
0.34
0.9026
0.7155
0.7625
0.9321
0.3948
0.6766
0.772
0.7904
0.8506
0.8832
0.886
0.8622
0.7543
0.792
0.5843
0.988
0.7306
0.3222
0.001
0
0.0354
0.0015
0.6593
0.7315
9.8413
0.3672
0.9068
11.3705
0.7544
0.6578
2.1434
4.5715
1.4801
0.0148
3.833
LlamaForCausalLM
bfloat16
apache-2.0
70.554
3
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V3-70B
0.6282
0.5843
0.522
0.2499
0.7434
0.8797
0.938
0.6199
0.7762
0.6732
0.9093
0.0148
0.8473
0.7688
9.8735
0.4435
0.9572
17.8521
0.8872
0.522
0.8622
0.6437
0.8764
0.9526
0.5214
0.7167
0.742
0.7847
0.834
0.9093
0.8916
0.8708
0.8242
0.938
0.5843
0.988
0.7701
0.6508
0.079
0.2757
0.0531
0.0245
0.8174
0.7436
12.8255
0.3898
0.9131
12.5965
0.7591
0.6578
2.1434
4.5715
1.4801
0.0148
3.833
LlamaForCausalLM
bfloat16
apache-2.0
70.554
3
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
qingy2024/NaturalLM
0.3895
0.4699
0.0443
0.0835
0.3313
0.5721
0.226
0.7929
0.5986
0.2886
0.8022
0.0756
0.3814
0.8298
9.6491
0.8822
0.9417
12.3844
0.8629
0.0443
0.8014
0.4569
0.6694
0.6095
0.1728
0.2666
0.4634
0.6806
0.7225
0.8022
0.5583
0.6099
0.3054
0.226
0.4699
0.9056
0.3961
0.3115
0
0.0004
0
0
0.4171
0.753
7.0077
0.7506
0.8765
7.5573
0.6757
0.6706
2.3734
18.8037
7.5589
0.0756
16.5739
MistralForCausalLM
bfloat16
apache-2.0
12.248
6
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
qingy2024/NaturalLM
0.5668
0.4699
0.4763
0.2408
0.5683
0.7954
0.728
0.8293
0.645
0.5034
0.9024
0.0756
0.536
0.8553
11.9026
0.8988
0.9487
15.3826
0.8742
0.4763
0.8319
0.5287
0.7014
0.9071
0.5153
0.512
0.574
0.7241
0.697
0.9024
0.7892
0.7571
0.6472
0.728
0.4699
0.9056
0.6247
0.4589
0.0017
0.3344
0.0442
0.0824
0.7413
0.8132
10.4862
0.8137
0.8965
9.6616
0.7305
0.6706
2.3734
18.8037
7.5589
0.0756
16.5739
MistralForCausalLM
bfloat16
apache-2.0
12.248
6
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Llama-3.3-Japanese-70B-sft-dpo-base
0.6105
0.1928
0.5473
0.2297
0.7535
0.8647
0.936
0.8227
0.766
0.6151
0.9178
0.0704
0.7056
0.8706
13.6638
0.9084
0.9564
17.55
0.8863
0.5473
0.9013
0.6379
0.8694
0.9312
0.5675
0.7035
0.696
0.7992
0.8273
0.9178
0.884
0.8514
0.7617
0.936
0.1928
0.253
0.8036
0.5721
0.0184
0.2945
0.0088
0.06
0.7666
0.789
15.1796
0.7435
0.9079
12.1657
0.7524
0.6569
3.0087
18.266
7.0351
0.0704
16.1309
LlamaForCausalLM
bfloat16
apache-2.0
70.554
1
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Llama-3.3-Japanese-70B-sft-dpo-base
0.3428
0.1928
0.2753
0.0815
0.008
0.8174
0.008
0.7984
0.6252
0.2789
0.6154
0.0704
0.5025
0.8285
12.7224
0.8518
0.9436
15.629
0.8658
0.2753
0.8795
0.5718
0.3528
0.8919
0.1784
0
0.5916
0.7759
0.8338
0.6154
0.8228
0.785
0.6808
0.008
0.1928
0.253
0.0161
0.1557
0
0.003
0
0
0.4045
0.7856
12.3351
0.7666
0.8896
10.7575
0.7093
0.6569
3.0087
18.266
7.0351
0.0704
16.1309
LlamaForCausalLM
bfloat16
apache-2.0
70.554
1
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/Saba1.5-Pro
0.3009
0.1888
0
0.0497
0.3469
0.4485
0.212
0.6397
0.5359
0.1169
0.7149
0.0565
0.0803
0.75
3.4061
0.7493
0.8667
5.5063
0.6911
0
0.5321
0.342
0.4944
0.4835
0.1243
0.3457
0.5542
0.6635
0.6251
0.7149
0.225
0.2087
0.3299
0.212
0.1888
0.5261
0.3481
0.1461
0
0
0
0
0.2485
0.6815
2.2445
0.6197
0.79
2.5214
0.4987
0.6469
1.3798
15.7074
5.6613
0.0565
13.7093
Qwen2ForCausalLM
float16
apache-2.0
1.544
2
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/Saba1.5-Pro
0.4511
0.1888
0.4156
0.1016
0.4943
0.6727
0.604
0.7816
0.5541
0.2405
0.8523
0.0565
0.2114
0.815
7.9897
0.8505
0.935
12.2059
0.8478
0.4156
0.6849
0.4224
0.5847
0.8043
0.2726
0.44
0.6997
0.678
0.3856
0.8523
0.5131
0.4669
0.5289
0.604
0.1888
0.5261
0.5486
0.2375
0.0055
0.1094
0.0531
0.0182
0.3217
0.7528
6.1322
0.7342
0.8794
7.5862
0.694
0.6469
1.3798
15.7074
5.6613
0.0565
13.7093
Qwen2ForCausalLM
float16
apache-2.0
1.544
2
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/Saba2-Preview
0.4511
0.1888
0.4156
0.1016
0.4943
0.6727
0.604
0.7816
0.5541
0.2405
0.8523
0.0565
0.2114
0.815
7.9897
0.8505
0.935
12.2059
0.8478
0.4156
0.6849
0.4224
0.5847
0.8043
0.2726
0.44
0.6997
0.678
0.3856
0.8523
0.5131
0.4669
0.5289
0.604
0.1888
0.5261
0.5486
0.2375
0.0055
0.1094
0.0531
0.0182
0.3217
0.7528
6.1322
0.7342
0.8794
7.5862
0.694
0.6469
1.3798
15.7074
5.6613
0.0565
13.7093
Qwen2ForCausalLM
float16
apache-2.0
1.544
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/Saba2-Preview
0.3009
0.1888
0
0.0497
0.3469
0.4485
0.212
0.6397
0.5359
0.1169
0.7149
0.0565
0.0803
0.75
3.4061
0.7493
0.8667
5.5063
0.6911
0
0.5321
0.342
0.4944
0.4835
0.1243
0.3457
0.5542
0.6635
0.6251
0.7149
0.225
0.2087
0.3299
0.212
0.1888
0.5261
0.3481
0.1461
0
0
0
0
0.2485
0.6815
2.2445
0.6197
0.79
2.5214
0.4987
0.6469
1.3798
15.7074
5.6613
0.0565
13.7093
Qwen2ForCausalLM
float16
apache-2.0
1.544
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V4-70B
0.3889
0.1145
0.3293
0.1768
0.3924
0.6739
0
0.8205
0.2965
0.5496
0.826
0.098
0.6553
0.8373
12.3782
0.8336
0.9589
17.9522
0.8906
0.3293
0.2943
0.5948
0.0056
0.9339
0.456
0.4911
0.3426
0
0.5395
0.826
0.8992
0.871
0.7935
0
0.1145
0.1566
0.2938
0.5374
0.009
0.0219
0.0265
0.0059
0.8205
0.8158
11.8559
0.8044
0.9027
11.2744
0.7532
0.6962
3.6547
23.5417
9.8118
0.098
20.8872
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V4-70B
0.6382
0.1145
0.5868
0.3139
0.7662
0.9165
0.94
0.8559
0.8109
0.7069
0.9108
0.098
0.8108
0.8657
12.7158
0.9121
0.9602
18.8573
0.8924
0.5868
0.9346
0.6983
0.8764
0.9607
0.6826
0.7368
0.864
0.8074
0.8084
0.9108
0.9004
0.8637
0.8542
0.94
0.1145
0.1566
0.7957
0.6274
0.0649
0.4175
0.115
0.0978
0.8744
0.8496
16.9735
0.8486
0.914
12.4186
0.7706
0.6962
3.6547
23.5417
9.8118
0.098
20.8872
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V4-70B
0.6255
0.5904
0.3026
0.1715
0.6957
0.8763
0.908
0.8533
0.7667
0.6404
0.9137
0.1617
0.8335
0.8675
14.1741
0.9096
0.9601
19.0434
0.8918
0.3026
0.9208
0.704
0.7514
0.933
0.5262
0.6758
0.7831
0.7936
0.8015
0.9137
0.8988
0.864
0.7749
0.908
0.5904
0.9779
0.7156
0.5616
0
0.0105
0.0354
0.0066
0.8048
0.8478
13.5613
0.8539
0.9079
12.0353
0.7578
0.729
5.0011
35.4482
16.157
0.1617
30.7872
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V4-70B
0.6908
0.5904
0.61
0.3478
0.7374
0.911
0.938
0.8604
0.7792
0.739
0.9239
0.1617
0.8819
0.8763
14.8132
0.9141
0.961
19.8139
0.8932
0.61
0.9316
0.6523
0.8444
0.9508
0.6531
0.7125
0.7909
0.7803
0.8281
0.9239
0.903
0.8768
0.8506
0.938
0.5904
0.9779
0.7623
0.6819
0.1255
0.5356
0.0885
0.0785
0.9111
0.8659
17.5317
0.8634
0.9175
12.748
0.7708
0.729
5.0011
35.4482
16.157
0.1617
30.7872
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Llama-3.3-Japanese-70B-Advanced
0.5928
0.004
0.5551
0.2375
0.743
0.8574
0.936
0.8257
0.7619
0.6155
0.9149
0.0694
0.699
0.8684
13.3542
0.9077
0.9555
17.8453
0.8829
0.5551
0.9011
0.6437
0.8333
0.9169
0.5982
0.6871
0.7477
0.7948
0.7897
0.9149
0.8688
0.8479
0.7543
0.936
0.004
0.0141
0.7988
0.5492
0.0029
0.3176
0
0.0559
0.811
0.8015
14.7057
0.7616
0.9071
11.8052
0.7506
0.6484
3.3629
18.3056
6.9474
0.0694
16.1232
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Llama-3.3-Japanese-70B-Advanced
0.358
0.004
0.3694
0.0919
0.0126
0.8131
0
0.8231
0.6369
0.2973
0.8205
0.0694
0.4553
0.8615
13.1068
0.9006
0.9285
15
0.8394
0.3694
0.883
0.5718
0.6889
0.8776
0.1769
0.0006
0.4618
0.7109
0.7512
0.8205
0.8265
0.7836
0.6788
0
0.004
0.0141
0.0246
0.2595
0.0043
0.0033
0
0.0009
0.4507
0.8202
11.7234
0.823
0.8958
10.6518
0.7293
0.6484
3.3629
18.3056
6.9474
0.0694
16.1232
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-70B
0.6216
0
0.5821
0.2899
0.7615
0.9114
0.944
0.851
0.8057
0.6996
0.9143
0.0786
0.8126
0.8631
13.0823
0.9081
0.9603
19.0549
0.892
0.5821
0.9306
0.6782
0.8833
0.9553
0.6533
0.7295
0.848
0.8081
0.8108
0.9143
0.9003
0.8721
0.8484
0.944
0
0
0.7935
0.6328
0.0345
0.3947
0.0708
0.0888
0.8608
0.8414
16.9415
0.8355
0.9143
12.8465
0.7685
0.6793
3.2192
19.3891
7.8488
0.0786
17.2373
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-70B
0.3076
0
0.3373
0.1576
0.0006
0.5749
0
0.8344
0.3524
0.2811
0.7672
0.0786
0.1787
0.8479
12.9434
0.8701
0.9578
17.6351
0.8885
0.3373
0
0.6034
0
0.9366
0.2839
0.0008
0.4224
0
0.7361
0.7672
0.8858
0.8535
0.7883
0
0
0
0.0003
0.3808
0.0017
0.0159
0.0177
0.0029
0.75
0.8262
13.43
0.8289
0.9029
11.5838
0.7501
0.6793
3.2192
19.3891
7.8488
0.0786
17.2373
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tiiuae/Falcon3-1B-Instruct
0.2436
0
0.1678
0.0151
0.3607
0.3396
0.126
0.4862
0.5379
0.1837
0.452
0.0104
0.1029
0.6511
2.6558
0.3968
0.8711
7.209
0.6394
0.1678
0.519
0.3764
0.5028
0.244
0.3146
0.2951
0.5596
0.6604
0.5902
0.452
-0.234
-0.1919
0.2559
0.126
0
0
0.4264
0.1335
0
0.0407
0.0088
0.0023
0.0234
0.6008
2.5046
0.3711
0.8294
4.8814
0.5374
0.5943
0.6303
6.7308
1.0453
0.0104
5.7768
LlamaForCausalLM
bfloat16
other
1.669
25
main
4
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tiiuae/Falcon3-1B-Instruct
0.0663
0
0
0.0215
0.0994
0.0003
0
0.3518
0
0.0929
0.1529
0.0104
0.0636
0.5968
0.8644
0.3793
0.765
0.0187
0.3472
0
0
0
0
0
0.1409
0.0003
0
0
0
0.1529
0
0
0.0008
0
0
0
0.1985
0.0741
0
0
0
0
0.1076
0.5629
0.624
0.3491
0.7587
0.0309
0.3317
0.5943
0.6303
6.7308
1.0453
0.0104
5.7768
LlamaForCausalLM
bfloat16
other
1.669
25
main
0
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tiiuae/Falcon3-3B-Instruct
0.279
0
0.311
0.0817
0.3459
0.2355
0.4
0.6009
0.2676
0.1263
0.6266
0.0729
0.1142
0.7312
4.3517
0.5532
0.8845
7.9429
0.7021
0.311
0
0.2816
0.45
0.4236
0.1582
0.1536
0.3028
0.0088
0.2947
0.6266
0.4556
0.4756
0.2829
0.4
0
0
0.5382
0.1067
0.0064
0.1853
0.0044
0.0261
0.1862
0.675
3.7493
0.5328
0.8518
6.637
0.6156
0.6691
1.62
24.2322
7.2984
0.0729
20.0927
LlamaForCausalLM
bfloat16
other
3.228
18
main
4
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tiiuae/Falcon3-3B-Instruct
0.161
0
0.0064
0.0298
0.0551
0.0234
0.04
0.5293
0.3717
0.1089
0.534
0.0729
0.0965
0.6884
2.5278
0.5059
0.8359
5.0357
0.6277
0.0064
0.0694
0.3477
0.4764
0
0.1462
0
0.3513
0.233
0.4502
0.534
0.527
0.5196
0.0008
0.04
0
0
0.1102
0.084
0
0
0
0
0.1488
0.6401
1.7615
0.483
0.7943
3.1157
0.5004
0.6691
1.62
24.2322
7.2984
0.0729
20.0927
LlamaForCausalLM
bfloat16
other
3.228
18
main
0
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tiiuae/Falcon3-7B-Instruct
0.2813
0
0.0031
0.0451
0.453
0.1703
0.304
0.6606
0.5102
0.1568
0.6929
0.0986
0.1123
0.7306
3.424
0.639
0.8981
7.09
0.7601
0.0031
0.511
0.4943
0.6736
0
0.229
0.2926
0.5267
0.3624
0.4942
0.6929
0.7822
0.7519
0
0.304
0
0
0.6135
0.129
0.0009
0
0
0
0.2248
0.6818
2.9359
0.5962
0.8539
5.557
0.6471
0.695
2.0286
29.2184
9.8677
0.0986
24.8378
LlamaForCausalLM
bfloat16
other
7.456
30
main
0
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tiiuae/Falcon3-7B-Instruct
0.3504
0
0.3605
0.1373
0.5042
0.4326
0.284
0.6797
0.2896
0.2519
0.8161
0.0986
0.1656
0.7649
6.1232
0.6874
0.8995
10.6309
0.7405
0.3605
0.1548
0.3707
0
0.6819
0.437
0.366
0.3073
0.4628
0.3073
0.8161
0.7787
0.7464
0.4612
0.284
0
0
0.6424
0.1532
0.0061
0.2727
0.0398
0.0447
0.323
0.7016
5.9369
0.6371
0.8668
7.7597
0.6541
0.695
2.0286
29.2184
9.8677
0.0986
24.8378
LlamaForCausalLM
bfloat16
other
7.456
30
main
4
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tiiuae/Falcon3-10B-Instruct
0.4411
0
0.3773
0.1554
0.5414
0.6329
0.676
0.7087
0.6246
0.1702
0.8529
0.1128
0.16
0.789
7.041
0.7432
0.9204
11.375
0.8098
0.3773
0.7042
0.569
0.7389
0.6899
0.2236
0.3875
0.7621
0.3775
0.6753
0.8529
0.8215
0.7657
0.5046
0.676
0
0
0.6954
0.127
0.0194
0.3143
0.028
0.065
0.3503
0.6996
6.7684
0.6089
0.8706
8.2487
0.6727
0.7034
1.3674
35.0273
11.2997
0.1128
25.4568
LlamaForCausalLM
bfloat16
other
10.306
62
main
4
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tiiuae/Falcon3-10B-Instruct
0.1544
0
0.0409
0.0656
0.0259
0.1072
0
0.6637
0.1249
0.1268
0.431
0.1128
0.0915
0.7602
2.3907
0.6563
0.901
5.3712
0.7377
0.0409
0.0624
0.0086
0.4931
0.017
0.1743
0
0.0201
0
0.1025
0.431
0.0963
0.0841
0.2423
0
0
0
0.0518
0.1145
0.01
0
0
0
0.3181
0.7082
2.1624
0.6254
0.8613
4.6148
0.6353
0.7034
1.3674
35.0273
11.2997
0.1128
25.4568
LlamaForCausalLM
bfloat16
other
10.306
62
main
0
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
tiiuae/Falcon3-10B-base
0.2913
0
0.0168
0.0507
0.2913
0.3883
0.204
0.6781
0.6081
0.121
0.7591
0.0867
0.0792
0.7326
4.1018
0.6599
0.9181
10.1283
0.8009
0.0168
0.4689
0.4943
0.6667
0.3905
0.1622
0.2694
0.4934
0.6351
0.751
0.7591
0.7847
0.7403
0.3055
0.204
0
0
0.3133
0.1215
0
0.0004
0
0
0.253
0.6808
3.6089
0.5964
0.8621
6.5577
0.6551
0.6744
1.6912
24.1675
8.6748
0.0867
20.71
LlamaForCausalLM
bfloat16
other
10.306
29
main
0
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
tiiuae/Falcon3-10B-base
0.4683
0
0.4135
0.1845
0.5641
0.6146
0.746
0.7417
0.6852
0.2614
0.854
0.0867
0.1452
0.7878
6.8573
0.7419
0.9245
12.0529
0.8199
0.4135
0.7109
0.5316
0.7361
0.6828
0.4489
0.4293
0.6898
0.7045
0.7642
0.854
0.807
0.7619
0.4501
0.746
0
0
0.6989
0.1901
0.0149
0.3668
0.0796
0.0811
0.3799
0.7596
7.8865
0.7076
0.8802
8.5149
0.6972
0.6744
1.6912
24.1675
8.6748
0.0867
20.71
LlamaForCausalLM
bfloat16
other
10.306
29
main
4
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
tiiuae/Falcon3-7B-base
0.2599
0
0
0.0431
0.3801
0.3546
0.014
0.6374
0.5953
0.1275
0.6241
0.0825
0.0697
0.72
3.6097
0.6035
0.9003
8.9668
0.7785
0
0.4679
0.4971
0.6292
0.336
0.2226
0.2316
0.5045
0.6427
0.7029
0.6241
0.3904
0.5568
0.2597
0.014
0
0
0.5286
0.0901
0
0
0
0
0.2155
0.6605
2.7174
0.5444
0.845
5.433
0.6232
0.6704
1.525
24.3722
8.2702
0.0825
20.5076
LlamaForCausalLM
bfloat16
other
7.456
18
main
0
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
tiiuae/Falcon3-7B-base
0.4165
0
0.351
0.1367
0.5138
0.5458
0.612
0.6823
0.6324
0.2365
0.7879
0.0825
0.131
0.7543
5.1126
0.6531
0.9169
10.2302
0.7986
0.351
0.5794
0.4483
0.6917
0.6345
0.4053
0.3812
0.6656
0.6881
0.6686
0.7879
0.7281
0.7004
0.4235
0.612
0
0
0.6463
0.1732
0.0028
0.3225
0.0265
0.0392
0.2923
0.7178
5.5347
0.6162
0.8657
6.6555
0.6615
0.6704
1.525
24.3722
8.2702
0.0825
20.5076
LlamaForCausalLM
bfloat16
other
7.456
18
main
4
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
tiiuae/Falcon3-3B-base
0.1113
0
0
0.0354
0.0117
0.1774
0
0.4673
0.1269
0.1259
0.2228
0.0566
0.0537
0.6613
0.459
0.4886
0.765
0.7313
0.5297
0
0.259
0.0029
0.0125
0.0197
0.2563
0
0.0058
0.5979
0.0156
0.2228
-0.0173
-0.0159
0.2534
0
0
0
0.0234
0.0676
0
0
0
0
0.177
0.6235
0.3452
0.4306
0.7594
0.6908
0.4201
0.6321
0.8408
21.9059
5.665
0.0566
17.5055
LlamaForCausalLM
bfloat16
other
3.228
11
main
0
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
tiiuae/Falcon3-3B-base
0.2984
0
0.2792
0.079
0.4221
0.3554
0.452
0.5045
0.4376
0.1787
0.5174
0.0566
0.0936
0.6806
4.1885
0.4468
0.8847
8.0439
0.6776
0.2792
0.5256
0.3621
0.5319
0.2824
0.3278
0.3132
0.3311
0.6591
0.3036
0.5174
-0.0368
-0.014
0.2582
0.452
0
0
0.531
0.1147
0.0078
0.2004
0
0.0537
0.1335
0.6099
3.5593
0.4026
0.8162
5.0417
0.491
0.6321
0.8408
21.9059
5.665
0.0566
17.5055
LlamaForCausalLM
bfloat16
other
3.228
11
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V2-70B
0.5085
0.5622
0.3624
0.1482
0.1744
0.8749
0.742
0.8375
0.6741
0.4813
0.6426
0.0942
0.7498
0.8707
14.6608
0.9083
0.9561
18.0801
0.8838
0.3624
0.9138
0.6092
0.6556
0.9294
0.2163
0.0616
0.6454
0.6723
0.7881
0.6426
0.8961
0.8642
0.7816
0.742
0.5622
0.9317
0.2872
0.4778
0.0043
0.0145
0.0354
0.0129
0.6737
0.8268
12.4992
0.8285
0.8957
11.2659
0.7292
0.6826
3.5366
23.7945
9.4105
0.0942
20.4612
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V2-70B
0.6761
0.5622
0.6051
0.3448
0.719
0.9013
0.932
0.8573
0.7786
0.7229
0.9195
0.0942
0.8823
0.8806
15.959
0.9154
0.9592
18.7354
0.8903
0.6051
0.9311
0.6379
0.7986
0.9357
0.6171
0.6942
0.8332
0.7715
0.852
0.9195
0.908
0.8798
0.8372
0.932
0.5622
0.9317
0.7438
0.6692
0.0844
0.4804
0.1681
0.088
0.903
0.8646
16.6439
0.8612
0.9143
12.4809
0.7623
0.6826
3.5366
23.7945
9.4105
0.0942
20.4612
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
llm-jp/llm-jp-3-1.8b
0.3377
0
0.3819
0.1887
0.2436
0.3284
0.096
0.7826
0.4689
0.5049
0.7084
0.0114
0.6217
0.8386
10.4983
0.8718
0.9322
13.2917
0.836
0.3819
0.5323
0.3707
0.5
0.2029
0.4516
0.244
0.5559
0.6559
0.2618
0.7084
-0.039
-0.0229
0.2499
0.096
0
0.2711
0.2432
0.4413
0.0108
0.1569
0.0177
0.011
0.747
0.7673
7.7539
0.7477
0.8799
9.5106
0.6747
0.5862
0.3595
8.2858
1.1461
0.0114
6.989
LlamaForCausalLM
bfloat16
apache-2.0
1.868
12
main
4
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
llm-jp/llm-jp-3-1.8b
0.1237
0
0
0.0325
0.0037
0.1929
0.002
0.5454
0
0.2174
0.3555
0.0114
0.2974
0.6812
0.8979
0.532
0.8504
5.7482
0.7302
0
0.3938
0
0
0.0018
0.1489
0
0
0
0
0.3555
-0.0091
-0.0061
0.1832
0.002
0
0.2711
0.0075
0.206
0
0
0
0
0.1627
0.6274
0.9219
0.4336
0.788
2.7747
0.4859
0.5862
0.3595
8.2858
1.1461
0.0114
6.989
LlamaForCausalLM
bfloat16
apache-2.0
1.868
12
main
0
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
llm-jp/llm-jp-3-3.7b
0.0803
0
0
0.0273
0
0
0
0.5023
0.0017
0.1166
0.2075
0.0282
0.1505
0.6636
1.815
0.5261
0.8158
2.8972
0.6196
0
0
0
0.0083
0
0.0633
0
0
0
0
0.2075
0
0
0
0
0
0.99
0
0.1358
0
0
0
0
0.1363
0.6195
2.0215
0.4514
0.7645
1.16
0.412
0.6201
0.5918
12.4251
2.8139
0.0282
10.5888
LlamaForCausalLM
bfloat16
apache-2.0
3.783
6
main
0
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
llm-jp/llm-jp-3-3.7b
0.3773
0
0.3385
0.2401
0.2516
0.3358
0.368
0.8076
0.4103
0.5701
0.8002
0.0282
0.7401
0.8481
10.9992
0.8893
0.9377
13.7621
0.8506
0.3385
0.5463
0.3333
0.5
0.21
0.4302
0.2482
0.3956
0.661
0.1618
0.8002
-0.0802
-0.1116
0.2509
0.368
0
0.99
0.2549
0.5399
0.0159
0.3012
0.0354
0.039
0.809
0.7923
9.0251
0.7851
0.8909
9.8623
0.7056
0.6201
0.5918
12.4251
2.8139
0.0282
10.5888
LlamaForCausalLM
bfloat16
apache-2.0
3.783
6
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V5-70B
0.6828
0.5884
0.5986
0.3456
0.7413
0.9067
0.94
0.8551
0.778
0.7256
0.9228
0.109
0.874
0.8703
14.2122
0.9089
0.961
19.3978
0.8932
0.5986
0.9248
0.6523
0.8542
0.9508
0.6264
0.7162
0.7823
0.7835
0.8179
0.9228
0.9035
0.8764
0.8443
0.94
0.5884
0.994
0.7665
0.6764
0.1266
0.5264
0.1062
0.0649
0.9039
0.8561
16.1865
0.8495
0.9171
12.8469
0.7686
0.6928
3.9556
26.7761
10.9092
0.109
23.2374
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V5-70B
0.6179
0.5884
0.3251
0.1656
0.7054
0.8781
0.912
0.8488
0.7735
0.5878
0.9036
0.109
0.7903
0.8621
14.0286
0.9058
0.96
18.7036
0.8914
0.3251
0.9151
0.7011
0.7653
0.9419
0.4702
0.6823
0.7942
0.7967
0.8104
0.9036
0.8971
0.8646
0.7773
0.912
0.5884
0.994
0.7285
0.5028
0
0.015
0.0177
0.0051
0.7901
0.8375
12.9669
0.8395
0.9093
12.1737
0.7587
0.6928
3.9556
26.7761
10.9092
0.109
23.2374
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
huihui-ai/aya-expanse-32b-abliterated
0.5069
0.1145
0.5276
0.293
0.4195
0.8456
0.242
0.8553
0.5806
0.6444
0.9111
0.1422
0.7483
0.8755
14.7603
0.9168
0.9593
19.3249
0.8914
0.5276
0.768
0.5661
0.1375
0.9643
0.5739
0.1327
0.8299
0.7803
0.5892
0.9111
0.8847
0.8648
0.8046
0.242
0.1145
0.2289
0.7063
0.611
0.0225
0.3978
0.0885
0.086
0.8702
0.8454
12.9915
0.8502
0.9102
11.4654
0.763
0.7193
3.6733
36.7673
14.2206
0.1422
30.6953
CohereForCausalLM
bfloat16
cc-by-nc-4.0
32.296
1
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
huihui-ai/aya-expanse-32b-abliterated
0.475
0.1145
0.2079
0.1745
0.5547
0.8416
0.53
0.852
0.3856
0.5266
0.8949
0.1422
0.6805
0.8616
13.1523
0.9141
0.958
18.0438
0.8916
0.2079
0.8858
0.5661
0.0292
0.924
0.369
0.5964
0.5879
0.2841
0.4605
0.8949
0.8679
0.8447
0.7149
0.53
0.1145
0.2289
0.513
0.5303
0.0157
0
0.0221
0.0036
0.8312
0.817
9.7503
0.8428
0.9033
10.2223
0.7596
0.7193
3.6733
36.7673
14.2206
0.1422
30.6953
CohereForCausalLM
bfloat16
cc-by-nc-4.0
32.296
1
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Kendamarron/Qwen2.5-7B-o1-ja-v0.1
0.5442
0
0.459
0.2376
0.656
0.8407
0.818
0.8318
0.718
0.4196
0.9027
0.1022
0.4001
0.8499
11.3597
0.8982
0.95
15.3472
0.8785
0.459
0.865
0.5603
0.6986
0.9285
0.4524
0.619
0.82
0.7172
0.794
0.9027
0.8655
0.8384
0.7285
0.818
0
0
0.693
0.4064
0.0217
0.3604
0.0708
0.0637
0.6714
0.7968
8.5627
0.8164
0.8912
9.4589
0.7341
0.6962
2.4642
30.5895
10.2226
0.1022
25.3452
Qwen2ForCausalLM
bfloat16
apache-2.0
7.616
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Kendamarron/Qwen2.5-7B-o1-ja-v0.1
0.3788
0
0.2193
0.1108
0.3473
0.6797
0.492
0.7495
0.6409
0.2548
0.57
0.1022
0.2738
0.8015
7.3101
0.853
0.9051
12.9384
0.7638
0.2193
0.6473
0.5316
0.5194
0.7837
0.2472
0.4872
0.6701
0.6484
0.8348
0.57
0.8664
0.8283
0.608
0.492
0
0
0.2074
0.2435
0.0023
0.0098
0
0
0.5421
0.7504
6.5096
0.7617
0.8465
7.9571
0.6197
0.6962
2.4642
30.5895
10.2226
0.1022
25.3452
Qwen2ForCausalLM
bfloat16
apache-2.0
7.616
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V5-70B
0.6992
0.5743
0.6142
0.354
0.7373
0.9171
0.942
0.8638
0.7843
0.7473
0.9257
0.2311
0.8905
0.881
15.5556
0.918
0.9605
19.5563
0.8927
0.6142
0.9386
0.6638
0.8181
0.9553
0.6762
0.7162
0.8451
0.7702
0.8242
0.9257
0.9023
0.8808
0.8574
0.942
0.5743
0.9679
0.7585
0.6754
0.1313
0.5462
0.115
0.0724
0.9051
0.8707
17.9356
0.8692
0.9187
13.0831
0.7756
0.7631
6.2427
47.0128
23.1021
0.2311
40.1601
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V5-70B
0.6391
0.5743
0.3371
0.1849
0.6952
0.8825
0.908
0.8566
0.7588
0.6848
0.917
0.2311
0.855
0.8751
14.7965
0.9156
0.9603
19.1691
0.8919
0.3371
0.9261
0.7011
0.7444
0.9339
0.572
0.6826
0.7925
0.7854
0.7704
0.917
0.9048
0.8742
0.7876
0.908
0.5743
0.9679
0.7079
0.6274
0
0.0402
0.0885
0.0014
0.7945
0.853
13.9419
0.8592
0.9075
12.0012
0.7595
0.7631
6.2427
47.0128
23.1021
0.2311
40.1601
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-22B
0.5032
0.5301
0.1435
0.1112
0.5077
0.6608
0.694
0.8241
0.7038
0.3951
0.8545
0.11
0.4045
0.8427
10.3342
0.8875
0.947
14.9637
0.871
0.1435
0.7202
0.5805
0.7333
0.7605
0.4255
0.451
0.6524
0.7607
0.7922
0.8545
0.8656
0.813
0.5016
0.694
0.5301
0.9618
0.5643
0.3554
0.0033
0
0.0088
0.0006
0.5435
0.7932
7.8123
0.807
0.8926
9.3137
0.7308
0.7055
2.7325
31.57
11.0044
0.11
26.745
MistralForCausalLM
bfloat16
apache-2.0
22.247
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-22B
0.5731
0.5301
0.4695
0.2413
0.5842
0.6965
0.778
0.8322
0.7053
0.4522
0.9052
0.11
0.4505
0.8504
11.1397
0.8943
0.9482
16.2792
0.8725
0.4695
0.619
0.5948
0.8097
0.8293
0.504
0.5001
0.751
0.7506
0.6201
0.9052
0.8505
0.8028
0.6412
0.778
0.5301
0.9618
0.6682
0.4022
0
0.3477
0.0885
0.0663
0.7038
0.814
10.2167
0.8215
0.898
10.1982
0.7407
0.7055
2.7325
31.57
11.0044
0.11
26.745
MistralForCausalLM
bfloat16
apache-2.0
22.247
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V6-70B
0.172
0.004
0.0759
0.0752
0
0.5807
0
0.4528
0.2592
0.1374
0.2458
0.0613
0.145
0.6673
11.6405
0.5159
0.8556
16.3103
0.498
0.0759
0
0.5891
0
0.9446
0.1857
0
0.3353
0
0.3714
0.2458
0.8962
0.8628
0.7974
0
0.004
0.0301
0
0.0816
0
0.0083
0
0.0034
0.3645
0.6043
11.9566
0.4149
0.8163
9.9538
0.3824
0.5839
3.487
15.5448
6.1297
0.0613
12.0015
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V6-70B
0.4109
0.004
0.5116
0.1869
0.5258
0.9063
0.004
0.4596
0.7713
0.5135
0.5761
0.0613
0.4859
0.7032
12.527
0.5841
0.8711
16.3046
0.5581
0.5116
0.9241
0.6322
0.8458
0.9464
0.6395
0.3332
0.8127
0.7936
0.7725
0.5761
0.9051
0.8761
0.8483
0.004
0.004
0.0301
0.7183
0.415
0.0201
0.2378
0.0354
0.0677
0.5737
0.5947
15.3946
0.3945
0.8099
11.1131
0.3017
0.5839
3.487
15.5448
6.1297
0.0613
12.0015
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-27B
0.5592
0
0.4962
0.2631
0.6445
0.797
0.834
0.845
0.6813
0.5775
0.9106
0.1016
0.6828
0.8693
13.4805
0.9073
0.9511
17.1329
0.8756
0.4962
0.7931
0.5287
0.7625
0.9151
0.5235
0.5922
0.7896
0.7551
0.5705
0.9106
0.8773
0.8553
0.6829
0.834
0
0
0.6968
0.5261
0.0287
0.3235
0.1416
0.0743
0.7477
0.8431
13.3477
0.8376
0.9082
11.7151
0.7596
0.6947
2.4227
27.4314
10.1745
0.1016
23.6174
Gemma2ForCausalLM
bfloat16
apache-2.0
27.227
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-27B
0.3969
0
0.1418
0.1066
0.2657
0.7119
0.544
0.7703
0.5068
0.3824
0.8344
0.1016
0.4938
0.7943
10.6865
0.8514
0.8709
10.9799
0.803
0.1418
0.7891
0.4569
0.5597
0.8525
0.3116
0.3239
0.3591
0.7525
0.4057
0.8344
0.8393
0.8005
0.494
0.544
0
0
0.2075
0.3417
0
0
0.0088
0
0.5243
0.7359
7.2999
0.7581
0.8193
7.3101
0.6688
0.6947
2.4227
27.4314
10.1745
0.1016
23.6174
Gemma2ForCausalLM
bfloat16
apache-2.0
27.227
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V6-70B
0.5955
0.5763
0.3412
0.1619
0.7109
0.869
0.856
0.7856
0.7763
0.5302
0.9059
0.0371
0.6726
0.828
12.0314
0.8193
0.9579
17.7512
0.8881
0.3412
0.9106
0.704
0.7653
0.9357
0.4409
0.682
0.7707
0.803
0.8382
0.9059
0.8907
0.8636
0.7608
0.856
0.5763
0.9839
0.7398
0.4771
0.001
0.0002
0.0442
0.0024
0.7617
0.7854
11.9288
0.6789
0.9071
11.6428
0.756
0.6204
3.009
10.8252
3.718
0.0371
8.8261
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V6-70B
0.6581
0.5763
0.5614
0.3025
0.7433
0.8893
0.938
0.7972
0.7818
0.6904
0.9215
0.0371
0.8589
0.844
13.0867
0.8515
0.9581
18.3985
0.8883
0.5614
0.8878
0.6466
0.8778
0.9508
0.5523
0.7156
0.7473
0.7904
0.847
0.9215
0.8981
0.8731
0.8292
0.938
0.5763
0.9839
0.771
0.66
0.0866
0.4089
0.0796
0.0641
0.8734
0.7967
15.2096
0.6831
0.9154
12.6876
0.7658
0.6204
3.009
10.8252
3.718
0.0371
8.8261
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
nitky/Llama-3.3-SuperSwallow-70B-Instruct-v0.1
0.3796
0.008
0.1735
0.1619
0.3598
0.8663
0.328
0.7489
0.6672
0.364
0.4192
0.0788
0.4678
0.8014
13.3892
0.8305
0.9165
16.8218
0.7892
0.1735
0.9231
0.569
0.7944
0.9106
0.4012
0.4041
0.6294
0.4949
0.8482
0.4192
0.8705
0.8346
0.765
0.328
0.008
0.0161
0.3154
0.2229
0.0113
0.0119
0.0029
0.0029
0.7806
0.7441
10.675
0.7345
0.8544
10.6318
0.6416
0.68
3.4132
18.9298
7.8674
0.0788
16.9484
LlamaForCausalLM
bfloat16
llama3.3
70.554
1
main
0
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
nitky/Llama-3.3-SuperSwallow-70B-Instruct-v0.1
0.6192
0.008
0.5926
0.2906
0.765
0.9085
0.916
0.8522
0.8049
0.7076
0.8867
0.0788
0.8605
0.8734
14.2994
0.9136
0.9587
18.0112
0.8899
0.5926
0.9311
0.6839
0.9153
0.9526
0.6025
0.7297
0.7872
0.8087
0.8293
0.8867
0.873
0.8505
0.8418
0.916
0.008
0.0161
0.8002
0.6597
0.0332
0.4326
0.0442
0.0621
0.8808
0.8503
15.4093
0.8529
0.9068
11.7516
0.7526
0.68
3.4132
18.9298
7.8674
0.0788
16.9484
LlamaForCausalLM
bfloat16
llama3.3
70.554
1
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V7-70B
0.5453
0.002
0.5805
0.204
0.7412
0.9094
0.692
0.445
0.7908
0.6994
0.8503
0.0834
0.8117
0.7037
13.7751
0.5882
0.8532
17.1006
0.5216
0.5805
0.9274
0.658
0.8486
0.9508
0.673
0.7122
0.8439
0.8024
0.8009
0.8503
0.9064
0.8765
0.8501
0.692
0.002
0.0442
0.7702
0.6136
0.0846
0.3409
0.0354
0.0849
0.474
0.5962
15.9247
0.3871
0.8089
11.536
0.283
0.6188
3.6041
20.5243
8.3439
0.0834
16.7947
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V7-70B
0.1885
0.002
0.1029
0.1003
0
0.5807
0
0.5543
0.2576
0.1368
0.2554
0.0834
0.1525
0.7238
12.6493
0.628
0.8894
16.9361
0.6524
0.1029
0
0.5891
0
0.9464
0.1858
0
0.3295
0
0.3696
0.2554
0.8963
0.8608
0.7958
0
0.002
0.0442
0
0.072
0
0.0091
0
0.0048
0.4877
0.6384
12.698
0.4814
0.835
10.3511
0.4554
0.6188
3.6041
20.5243
8.3439
0.0834
16.7947
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V3-70B
0.6086
0.5141
0.2645
0.1303
0.6963
0.8795
0.89
0.8231
0.7356
0.6515
0.8931
0.2169
0.8464
0.7982
7.9103
0.8595
0.9277
15.3441
0.8415
0.2645
0.9181
0.6667
0.7431
0.9294
0.5255
0.682
0.7818
0.7866
0.6998
0.8931
0.8995
0.8658
0.7911
0.89
0.5141
0.8675
0.7107
0.5825
0
0.0098
0.0442
0.0046
0.5931
0.8396
13.4029
0.8458
0.8974
11.2506
0.7457
0.7301
10.238
38.5793
21.6533
0.2169
34.0106
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V3-70B
0.6862
0.5141
0.5842
0.3358
0.735
0.9132
0.928
0.861
0.7708
0.765
0.9244
0.2169
0.8883
0.8634
13.326
0.9089
0.9592
19.556
0.8903
0.5842
0.9319
0.6293
0.7931
0.9535
0.703
0.7184
0.8431
0.7759
0.8127
0.9244
0.8957
0.8685
0.8542
0.928
0.5141
0.8675
0.7515
0.7037
0.0727
0.5237
0.115
0.0743
0.8934
0.8643
17.228
0.8621
0.9215
13.7566
0.7827
0.7301
10.238
38.5793
21.6533
0.2169
34.0106
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tokyotech-llm/Llama-3.1-Swallow-8B-Instruct-v0.3
0.3611
0
0.1268
0.1611
0.367
0.718
0.076
0.7517
0.4493
0.4047
0.8057
0.1121
0.4821
0.8422
10.7234
0.8861
0.8953
13.2109
0.6808
0.1268
0.6946
0.4598
0.4514
0.8311
0.3277
0.3058
0.2354
0.7443
0.3554
0.8057
0.8723
0.8329
0.6283
0.076
0
0
0.4282
0.4042
0.005
0
0.0177
0.0008
0.782
0.7877
8.2668
0.7897
0.872
9.1974
0.6503
0.7016
2.882
29.3068
11.2212
0.1121
25.4418
LlamaForCausalLM
bfloat16
llama3.1;gemma
8.03
8
main
0
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tokyotech-llm/Llama-3.1-Swallow-8B-Instruct-v0.3
0.5522
0
0.5417
0.2763
0.5582
0.8309
0.704
0.8451
0.6989
0.6087
0.8981
0.1121
0.7376
0.8686
14.075
0.9095
0.9529
17.0684
0.881
0.5417
0.8675
0.5661
0.6903
0.9249
0.4983
0.536
0.8077
0.7557
0.6749
0.8981
0.8629
0.8368
0.7003
0.704
0
0
0.5804
0.5902
0.0061
0.3579
0.0796
0.1026
0.8354
0.8305
11.3283
0.8346
0.9077
11.2207
0.7551
0.7016
2.882
29.3068
11.2212
0.1121
25.4418
LlamaForCausalLM
bfloat16
llama3.1;gemma
8.03
8
main
4
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tokyotech-llm/Llama-3.1-Swallow-8B-Instruct-v0.1
0.3946
0
0.1107
0.1616
0.0986
0.7917
0.404
0.7713
0.6382
0.4278
0.8574
0.0795
0.5897
0.8529
11.0826
0.9014
0.8715
15.087
0.7001
0.1107
0.8742
0.4511
0.7097
0.8615
0.4175
0.1124
0.562
0.7506
0.7173
0.8574
0.8298
0.782
0.6394
0.404
0
0.002
0.0847
0.2763
0.005
0
0
0
0.8031
0.7953
8.0078
0.814
0.8682
9.4689
0.6699
0.6772
2.6541
19.5345
7.9522
0.0795
17.3746
LlamaForCausalLM
bfloat16
llama3.1;gemma
8.03
15
main
0
False
v1.4.1
v0.6.3.post1
⭕ : instruction-tuned
tokyotech-llm/Llama-3.1-Swallow-8B-Instruct-v0.1
0.5516
0
0.5252
0.2719
0.5517
0.844
0.728
0.8478
0.6751
0.6265
0.9177
0.0795
0.7386
0.869
13.7124
0.9097
0.9533
16.7809
0.8825
0.5252
0.8863
0.5144
0.7167
0.924
0.562
0.5239
0.8155
0.7582
0.5709
0.9177
0.8129
0.8086
0.7216
0.728
0
0.002
0.5795
0.5788
0.0144
0.3583
0.0442
0.0868
0.8556
0.8319
11.178
0.8382
0.9094
11.446
0.7607
0.6772
2.6541
19.5345
7.9522
0.0795
17.3746
LlamaForCausalLM
bfloat16
llama3.1;gemma
8.03
15
main
4
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
tokyotech-llm/Llama-3.1-Swallow-8B-v0.1
0.5498
0.1767
0.458
0.2765
0.5443
0.7994
0.714
0.8445
0.6491
0.6164
0.8932
0.076
0.7972
0.8745
14.1823
0.9092
0.9546
16.7253
0.8832
0.458
0.8477
0.4741
0.6486
0.9088
0.4641
0.5134
0.7473
0.7645
0.6111
0.8932
0.8135
0.774
0.6417
0.714
0.1767
0.4177
0.5752
0.5878
0.0084
0.3813
0.0531
0.0661
0.8736
0.8359
12.7557
0.8304
0.9105
11.9822
0.7551
0.6736
1.8472
19.1456
7.6098
0.076
16.9134
LlamaForCausalLM
bfloat16
llama3.1
8.03
10
main
4
False
v1.4.1
v0.6.3.post1
🟢 : pretrained
tokyotech-llm/Llama-3.1-Swallow-8B-v0.1
0.392
0.1767
0.0119
0.1463
0.3783
0.5702
0.406
0.82
0.4318
0.4769
0.8181
0.076
0.6883
0.8563
11.5331
0.8956
0.9489
13.4145
0.8739
0.0119
0.7578
0.3592
0.5
0.5898
0.2864
0.3092
0.2001
0.7071
0.3927
0.8181
0.3888
0.3891
0.3631
0.406
0.1767
0.4177
0.4474
0.4561
0
0.0053
0
0
0.7264
0.7958
8.4726
0.797
0.8907
9.1623
0.7135
0.6736
1.8472
19.1456
7.6098
0.076
16.9134
LlamaForCausalLM
bfloat16
llama3.1
8.03
10
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V4-70B
0.6995
0.5843
0.5949
0.3418
0.7377
0.9094
0.938
0.8582
0.7741
0.7523
0.926
0.2777
0.8826
0.8658
13.4485
0.9094
0.9599
19.275
0.892
0.5949
0.9339
0.6379
0.8417
0.9464
0.7012
0.7165
0.7744
0.7797
0.8366
0.926
0.8956
0.8702
0.848
0.938
0.5843
0.9819
0.7589
0.6732
0.0908
0.5105
0.115
0.0859
0.9068
0.8644
17.9708
0.8608
0.9177
13.5108
0.7704
0.7709
9.7075
49.0147
27.7847
0.2777
42.6813
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V4-70B
0.6025
0.5843
0.2473
0.1413
0.692
0.8797
0.904
0.8319
0.767
0.6403
0.6614
0.2777
0.8276
0.825
9.2353
0.8785
0.9312
15.6962
0.8511
0.2473
0.9218
0.7126
0.7514
0.9321
0.5032
0.6848
0.7851
0.7999
0.7859
0.6614
0.8935
0.8608
0.7852
0.904
0.5843
0.9819
0.6992
0.5901
0
0.0008
0.0442
0.0079
0.6535
0.8465
14.7082
0.8499
0.9012
11.6846
0.7484
0.7709
9.7075
49.0147
27.7847
0.2777
42.6813
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V8-70B
0.3801
0.0863
0.2306
0.1605
0.0075
0.8672
0.184
0.7766
0.7715
0.3051
0.7549
0.0371
0.4718
0.8237
12.145
0.8008
0.9515
18.019
0.8706
0.2306
0.9198
0.704
0.8125
0.924
0.2001
0.0031
0.7239
0.7431
0.8742
0.7549
0.4392
0.8134
0.7578
0.184
0.0863
0.1767
0.012
0.2435
0.0029
0.0037
0
0
0.796
0.7788
12.1524
0.7008
0.8997
11.4443
0.7341
0.6093
2.8107
11.3818
3.7185
0.0371
9.2989
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V8-70B
0.6106
0.0863
0.563
0.2552
0.7734
0.8807
0.932
0.7965
0.8009
0.6899
0.9018
0.0371
0.8215
0.82
11.9307
0.7906
0.9585
18.4496
0.8893
0.563
0.8943
0.6983
0.9222
0.9437
0.6368
0.7343
0.7284
0.8068
0.8486
0.9018
0.8701
0.8497
0.8041
0.932
0.0863
0.1767
0.8125
0.6114
0.0316
0.3327
0.0088
0.0194
0.8833
0.8152
15.5944
0.7541
0.908
12.2901
0.7522
0.6093
2.8107
11.3818
3.7185
0.0371
9.2989
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V5-70B
0.5326
0.2329
0.1761
0.1484
0.522
0.8703
0.86
0.683
0.7323
0.5773
0.8739
0.1821
0.7518
0.7047
4.4341
0.7671
0.7518
1.6655
0.5355
0.1761
0.9196
0.6897
0.7486
0.9285
0.453
0.3225
0.749
0.7771
0.697
0.8739
0.8809
0.8572
0.7629
0.86
0.2329
0.3976
0.7214
0.5271
0
0.0054
0.0531
0
0.6835
0.7934
12.9875
0.7958
0.8349
7.7163
0.6335
0.7123
9.9828
32.947
18.2336
0.1821
29.1103
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V5-70B
0.6531
0.2329
0.5799
0.3254
0.7346
0.9022
0.93
0.8504
0.7685
0.7524
0.9261
0.1821
0.878
0.8634
12.9152
0.908
0.9453
16.767
0.8683
0.5799
0.9266
0.6293
0.7917
0.9401
0.6846
0.7153
0.8077
0.7778
0.8358
0.9261
0.8948
0.8651
0.8399
0.93
0.2329
0.3976
0.7538
0.6946
0.0456
0.5073
0.115
0.0647
0.8944
0.8553
17.2779
0.8517
0.9181
13.3587
0.7735
0.7123
9.9828
32.947
18.2336
0.1821
29.1103
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V2-70B
0.3801
0.0863
0.2306
0.1605
0.0075
0.8672
0.184
0.7766
0.7715
0.3051
0.7549
0.0371
0.4718
0.8237
12.145
0.8008
0.9515
18.019
0.8706
0.2306
0.9198
0.704
0.8125
0.924
0.2001
0.0031
0.7239
0.7431
0.8742
0.7549
0.4392
0.8134
0.7578
0.184
0.0863
0.1767
0.012
0.2435
0.0029
0.0037
0
0
0.796
0.7788
12.1524
0.7008
0.8997
11.4443
0.7341
0.6093
2.8107
11.3818
3.7185
0.0371
9.2989
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V2-70B
0.6106
0.0863
0.563
0.2552
0.7734
0.8807
0.932
0.7965
0.8009
0.6899
0.9018
0.0371
0.8215
0.82
11.9307
0.7906
0.9585
18.4496
0.8893
0.563
0.8943
0.6983
0.9222
0.9437
0.6368
0.7343
0.7284
0.8068
0.8486
0.9018
0.8701
0.8497
0.8041
0.932
0.0863
0.1767
0.8125
0.6114
0.0316
0.3327
0.0088
0.0194
0.8833
0.8152
15.5944
0.7541
0.908
12.2901
0.7522
0.6093
2.8107
11.3818
3.7185
0.0371
9.2989
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V1-70B
0.6369
0
0.5885
0.2724
0.7617
0.902
0.926
0.8407
0.7945
0.707
0.9212
0.2916
0.863
0.849
12.0075
0.8953
0.9531
17.5708
0.8806
0.5885
0.9286
0.6552
0.9083
0.9464
0.5869
0.7201
0.7855
0.8018
0.8218
0.9212
0.8727
0.8452
0.8311
0.926
0
0.002
0.8032
0.6712
0.0206
0.3236
0.0708
0.057
0.8899
0.844
15.9576
0.8394
0.906
11.9797
0.7477
0.7843
9.5767
51.8067
29.1591
0.2916
44.8902
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V1-70B
0.3376
0
0.157
0.154
0.046
0.7849
0.02
0.6217
0.6908
0.35
0.5977
0.2916
0.4885
0.7302
4.509
0.7762
0.745
0.2586
0.5326
0.157
0.625
0.6264
0.8167
0.9392
0.2227
0.0319
0.6245
0.6408
0.7455
0.5977
0.6323
0.8483
0.7905
0.02
0
0.002
0.0602
0.3388
0.007
0
0.0088
0
0.7541
0.7435
12.4429
0.7145
0.7472
2.2312
0.4634
0.7843
9.5767
51.8067
29.1591
0.2916
44.8902
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avdvanced-V9-70B
0.616
0.5703
0.3476
0.1637
0.7139
0.8719
0.882
0.8467
0.773
0.596
0.9122
0.099
0.7626
0.8615
13.1963
0.9036
0.9584
18.0828
0.8888
0.3476
0.9106
0.704
0.7653
0.9383
0.4999
0.6877
0.7621
0.8037
0.8301
0.9122
0.8933
0.8609
0.7667
0.882
0.5703
0.9779
0.7401
0.5255
0.001
0.0002
0.0354
0.0069
0.7748
0.8372
12.7899
0.8379
0.9069
11.604
0.7564
0.6875
3.4474
25.3877
9.9078
0.099
21.9987
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avdvanced-V9-70B
0.6735
0.5703
0.5677
0.3206
0.743
0.8948
0.938
0.8531
0.7865
0.7092
0.9263
0.099
0.8619
0.8703
14.3108
0.9103
0.959
19.0061
0.8895
0.5677
0.9026
0.6494
0.8778
0.9508
0.6001
0.7151
0.7584
0.7948
0.8518
0.9263
0.8999
0.871
0.8311
0.938
0.5703
0.9779
0.771
0.6655
0.0827
0.4725
0.0885
0.071
0.8883
0.8517
16.4171
0.8443
0.9166
13.0273
0.7682
0.6875
3.4474
25.3877
9.9078
0.099
21.9987
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avdvanced-V10-70B
0.6847
0.5763
0.5905
0.3351
0.7621
0.9118
0.942
0.8611
0.8006
0.7283
0.9263
0.0981
0.8397
0.8759
15.2439
0.9153
0.9604
19.7851
0.8921
0.5905
0.9284
0.681
0.8639
0.9607
0.6842
0.7348
0.8188
0.8049
0.8342
0.9263
0.9015
0.874
0.8463
0.942
0.5763
0.9699
0.7894
0.6612
0.1205
0.5047
0.0531
0.0916
0.9055
0.8655
17.8849
0.8647
0.9163
13.0299
0.7724
0.6914
3.702
22.2644
9.8094
0.0981
19.9749
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avdvanced-V10-70B
0.4266
0.5763
0.301
0.184
0.4875
0.8816
0.016
0.8512
0.3415
0.4181
0.5369
0.0981
0.5355
0.8714
14.5121
0.9128
0.9591
19.0051
0.8905
0.301
0.9156
0.5603
0.2736
0.9455
0.4953
0.3663
0.4059
0.0057
0.4619
0.5369
0.8956
0.8611
0.7836
0.016
0.5763
0.9699
0.6087
0.2233
0.0067
0.0066
0.0442
0.0099
0.8526
0.8427
13.6671
0.8483
0.9041
11.6526
0.7531
0.6914
3.702
22.2644
9.8094
0.0981
19.9749
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avdvanced-V11-70B
0.5538
0.5723
0.3075
0.1886
0.6235
0.8816
0.554
0.853
0.6771
0.4829
0.8438
0.1076
0.594
0.8714
14.1764
0.9128
0.9592
18.7328
0.8906
0.3075
0.9168
0.6609
0.7153
0.941
0.4981
0.5716
0.627
0.6313
0.751
0.8438
0.9024
0.8695
0.7868
0.554
0.5723
0.9438
0.6753
0.3566
0.0055
0.0126
0.0619
0.0118
0.8512
0.8471
13.8352
0.8531
0.9053
11.6841
0.7556
0.6979
3.9999
24.141
10.7758
0.1076
21.4745
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avdvanced-V11-70B
0.6865
0.5703
0.6066
0.3313
0.7608
0.9119
0.942
0.8603
0.8049
0.7283
0.9272
0.1076
0.843
0.877
15.066
0.9154
0.9604
20.0575
0.8916
0.6066
0.9291
0.6782
0.8819
0.9598
0.6847
0.732
0.8221
0.81
0.8326
0.9272
0.9019
0.8731
0.847
0.942
0.5703
0.9438
0.7896
0.6574
0.111
0.4933
0.0708
0.0873
0.894
0.8651
17.8159
0.8633
0.916
13.2924
0.7709
0.6979
3.9999
24.141
10.7758
0.1076
21.4745
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1