| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Xclbr7_Hyena-12b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Xclbr7/Hyena-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xclbr7/Hyena-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xclbr7__Hyena-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Xclbr7/Hyena-12b | 9dd5eb77ce8e0e05e260ae4d812631fb980527fa | 20.437243 | apache-2.0 | 1 | 12 | true | false | false | false | 1.859893 | 0.340446 | 34.044557 | 0.545718 | 34.665649 | 0.093656 | 9.365559 | 0.297819 | 6.375839 | 0.398427 | 11.070052 | 0.343916 | 27.101803 | false | false | 2024-09-19 | 2024-09-19 | 1 | Xclbr7/Arcanum-12b |
| Xclbr7_caliburn-12b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Xclbr7/caliburn-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xclbr7/caliburn-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xclbr7__caliburn-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Xclbr7/caliburn-12b | f76fa67c7ca8bf7e75540baf55972ba52a46630b | 22.808395 | mit | 0 | 12 | true | false | false | false | 1.85622 | 0.357631 | 35.763109 | 0.551863 | 35.636841 | 0.10423 | 10.422961 | 0.336409 | 11.521253 | 0.429188 | 13.781771 | 0.36752 | 29.724439 | false | false | 2024-09-14 | 2024-09-14 | 0 | Xclbr7/caliburn-12b |
| Xclbr7_caliburn-v2-12b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Xclbr7/caliburn-v2-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xclbr7/caliburn-v2-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xclbr7__caliburn-v2-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Xclbr7/caliburn-v2-12b | fa736b3b852298dd8c047ac6dcc620161df4a79b | 20.953525 | mit | 0 | 12 | true | false | false | false | 1.632194 | 0.296682 | 29.668169 | 0.514143 | 30.387967 | 0.10423 | 10.422961 | 0.326342 | 10.178971 | 0.437031 | 14.128906 | 0.378408 | 30.934176 | false | false | 2024-09-16 | 2024-09-16 | 0 | Xclbr7/caliburn-v2-12b |
| Yash21_TinyYi-7B-Test_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Yash21/TinyYi-7B-Test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Yash21/TinyYi-7B-Test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Yash21__TinyYi-7B-Test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Yash21/TinyYi-7B-Test | 7750e5de73fbcf1dcc0832b4cdabaa9713c20475 | 4.495167 | apache-2.0 | 0 | 6 | true | false | false | false | 0.763109 | 0.185649 | 18.564852 | 0.29098 | 2.267966 | 0 | 0 | 0.264262 | 1.901566 | 0.336448 | 3.222656 | 0.109126 | 1.013963 | true | false | 2024-01-06 | 2024-07-03 | 0 | Yash21/TinyYi-7B-Test |
| Youlln_1PARAMMYL-8B-ModelStock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/1PARAMMYL-8B-ModelStock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/1PARAMMYL-8B-ModelStock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__1PARAMMYL-8B-ModelStock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/1PARAMMYL-8B-ModelStock | 4ce556da5ccd1ecac8d0f3e1e94d1982f11b910d | 26.283975 | | 0 | 8 | false | false | false | false | 0.892522 | 0.537134 | 53.713369 | 0.521584 | 31.799951 | 0.147281 | 14.728097 | 0.323826 | 9.8434 | 0.440938 | 14.283854 | 0.400017 | 33.33518 | false | false | 2024-09-20 | 2024-09-20 | 1 | Youlln/1PARAMMYL-8B-ModelStock (Merge) |
| Youlln_2PRYMMAL-Yi1.5-6B-SLERP_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/2PRYMMAL-Yi1.5-6B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/2PRYMMAL-Yi1.5-6B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__2PRYMMAL-Yi1.5-6B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/2PRYMMAL-Yi1.5-6B-SLERP | b776bd3ce6784b96ff928b1d5ad51b2991909f2c | 18.979223 | apache-2.0 | 0 | 6 | true | false | false | false | 1.383177 | 0.282594 | 28.259352 | 0.466475 | 24.495644 | 0.112538 | 11.253776 | 0.307047 | 7.606264 | 0.475604 | 18.150521 | 0.316988 | 24.109781 | true | false | 2024-09-22 | 2024-09-23 | 1 | Youlln/2PRYMMAL-Yi1.5-6B-SLERP (Merge) |
| Youlln_3PRYMMAL-PHI3-3B-SLERP_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/3PRYMMAL-PHI3-3B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/3PRYMMAL-PHI3-3B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__3PRYMMAL-PHI3-3B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/3PRYMMAL-PHI3-3B-SLERP | 9396bcf1709ac8360a95a746482520fab4295706 | 24.849214 | apache-2.0 | 0 | 3 | true | false | false | false | 1.078439 | 0.36555 | 36.555007 | 0.542183 | 35.827668 | 0.154079 | 15.407855 | 0.326342 | 10.178971 | 0.464844 | 17.772135 | 0.400183 | 33.35365 | true | false | 2024-09-23 | 2024-09-23 | 1 | Youlln/3PRYMMAL-PHI3-3B-SLERP (Merge) |
| Youlln_4PRYMMAL-GEMMA2-9B-SLERP_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/4PRYMMAL-GEMMA2-9B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/4PRYMMAL-GEMMA2-9B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__4PRYMMAL-GEMMA2-9B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/4PRYMMAL-GEMMA2-9B-SLERP | 7dac3b4ab4298113ae3103d63bb284e1ac8bf4d4 | 23.550238 | apache-2.0 | 0 | 9 | true | false | false | false | 2.85266 | 0.271377 | 27.137661 | 0.592253 | 42.064172 | 0.082326 | 8.232628 | 0.330537 | 10.738255 | 0.467198 | 17.466406 | 0.420961 | 35.662308 | true | false | 2024-09-23 | 2024-09-23 | 1 | Youlln/4PRYMMAL-GEMMA2-9B-SLERP (Merge) |
| Youlln_ECE-PRYMMAL-0.5B-FT-V3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-FT-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-FT-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-FT-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL-0.5B-FT-V3 | d542b4d53888fcc8e96c32892d47ec51afc9edc9 | 4.342504 | apache-2.0 | 0 | 0 | true | false | false | false | 0.573135 | 0.164191 | 16.419101 | 0.309313 | 3.616883 | 0 | 0 | 0.25755 | 1.006711 | 0.364448 | 3.222656 | 0.116107 | 1.789672 | false | false | 2024-10-16 | 2024-10-16 | 1 | Youlln/ECE-PRYMMAL-0.5B-FT-V3 (Merge) |
| Youlln_ECE-PRYMMAL-0.5B-FT-V3-MUSR_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-FT-V3-MUSR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-FT-V3-MUSR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-FT-V3-MUSR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL-0.5B-FT-V3-MUSR | 221dc80a1acd6f7dda0644699e6d61b90a5a0a05 | 5.551291 | apache-2.0 | 0 | 0 | true | false | false | false | 1.029985 | 0.15335 | 15.334978 | 0.304115 | 5.062186 | 0.024924 | 2.492447 | 0.249161 | 0 | 0.366031 | 3.253906 | 0.164478 | 7.164229 | false | false | 2024-10-21 | 2024-10-21 | 1 | Youlln/ECE-PRYMMAL-0.5B-FT-V3-MUSR (Merge) |
| Youlln_ECE-PRYMMAL-0.5B-FT-V4-MUSR_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-FT-V4-MUSR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-FT-V4-MUSR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-FT-V4-MUSR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL-0.5B-FT-V4-MUSR | f5b268d63bb10f05a229da4f2ee9cb0882c93971 | 4.198599 | apache-2.0 | 0 | 0 | true | false | false | false | 0.948691 | 0.113757 | 11.375705 | 0.303836 | 4.949092 | 0.011329 | 1.132931 | 0.270134 | 2.684564 | 0.352885 | 1.477344 | 0.132148 | 3.571956 | false | false | 2024-10-21 | 2024-10-21 | 1 | Youlln/ECE-PRYMMAL-0.5B-FT-V4-MUSR (Merge) |
| Youlln_ECE-PRYMMAL-0.5B-SLERP-V2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL-0.5B-SLERP-V2 | 5e87669abcdc042774a63b94a13880f1acd6e15d | 4.614607 | apache-2.0 | 0 | 0 | true | false | false | false | 0.654997 | 0.161193 | 16.119341 | 0.293477 | 1.917561 | 0 | 0 | 0.274329 | 3.243848 | 0.383115 | 5.35599 | 0.109458 | 1.050901 | false | false | 2024-10-22 | 2024-10-22 | 1 | Youlln/ECE-PRYMMAL-0.5B-SLERP-V2 (Merge) |
| Youlln_ECE-PRYMMAL-0.5B-SLERP-V3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-SLERP-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-SLERP-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-SLERP-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL-0.5B-SLERP-V3 | 94bfab3b1f41458427e5f8598ceb3ec731ba1bd6 | 3.663014 | apache-2.0 | 0 | 0 | true | false | false | false | 0.637698 | 0.167014 | 16.701352 | 0.293838 | 2.319605 | 0 | 0 | 0.251678 | 0.223714 | 0.354125 | 1.765625 | 0.10871 | 0.96779 | false | false | 2024-10-22 | 2024-10-22 | 0 | Youlln/ECE-PRYMMAL-0.5B-SLERP-V3 |
| Youlln_ECE-PRYMMAL-YL-1B-SLERP-V1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-YL-1B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1 | b5cd268edb0cc5c2c6ab2c49c950e611b2b8138c | 16.404998 | apache-2.0 | 0 | 1 | true | false | false | false | 0.595341 | 0.325108 | 32.510849 | 0.420851 | 18.279511 | 0.090634 | 9.063444 | 0.291107 | 5.480984 | 0.426583 | 11.589583 | 0.293551 | 21.505615 | false | false | 2024-11-08 | 2024-11-08 | 0 | Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1 |
| Youlln_ECE-PRYMMAL-YL-1B-SLERP-V2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-YL-1B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2 | 3559f643c8d5774135a1cd8daea78fef31035679 | 16.404998 | apache-2.0 | 0 | 1 | true | false | false | false | 0.604628 | 0.325108 | 32.510849 | 0.420851 | 18.279511 | 0.090634 | 9.063444 | 0.291107 | 5.480984 | 0.426583 | 11.589583 | 0.293551 | 21.505615 | false | false | 2024-11-08 | 2024-11-08 | 0 | Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2 |
| Youlln_ECE-PRYMMAL-YL-7B-SLERP-V4_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-YL-7B-SLERP-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-YL-7B-SLERP-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-YL-7B-SLERP-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL-YL-7B-SLERP-V4 | 4939b9e24be6f03d5df1e9bb7dc1b4fd5d59404a | 10.416375 | apache-2.0 | 0 | 7 | true | false | false | false | 0.770157 | 0.25097 | 25.096965 | 0.376973 | 13.157437 | 0.026435 | 2.643505 | 0.265101 | 2.013423 | 0.37449 | 7.011198 | 0.213182 | 12.575724 | false | false | 2024-11-06 | 2024-11-06 | 0 | Youlln/ECE-PRYMMAL-YL-7B-SLERP-V4 |
| Youlln_ECE-PRYMMAL0.5-FT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL0.5-FT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL0.5-FT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL0.5-FT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL0.5-FT | 56b9fd5f26e5b6379fe4aa62e0f66b87b5c6f8e8 | 5.19551 | apache-2.0 | 0 | 0 | true | false | false | false | 0.503391 | 0.185073 | 18.507338 | 0.313209 | 5.1516 | 0 | 0 | 0.255872 | 0.782998 | 0.330125 | 1.432292 | 0.147689 | 5.298833 | false | false | 2024-10-02 | 2024-10-02 | 1 | Youlln/ECE-PRYMMAL0.5-FT (Merge) |
| Youlln_ECE-PRYMMAL0.5B-Youri_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL0.5B-Youri" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL0.5B-Youri</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL0.5B-Youri-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL0.5B-Youri | 1477d3deff98f35f523aa222bc0442278d464566 | 3.505274 | | 1 | 0 | false | false | false | false | 0.655846 | 0.144632 | 14.46318 | 0.281736 | 1.501296 | 0 | 0 | 0.243289 | 0 | 0.369656 | 4.007031 | 0.109541 | 1.060136 | false | false | 2024-10-07 | 2024-10-07 | 1 | Youlln/ECE-PRYMMAL0.5B-Youri (Merge) |
| Youlln_ECE-PRYMMAL1B-FT-V1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL1B-FT-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL1B-FT-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL1B-FT-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-PRYMMAL1B-FT-V1 | d0fc3a6e93f91c8d586eb25c9f2a4ea4ca99e9f4 | 11.923327 | apache-2.0 | 0 | 1 | true | false | false | false | 0.737596 | 0.214375 | 21.437453 | 0.403265 | 16.189386 | 0.068731 | 6.873112 | 0.278523 | 3.803132 | 0.341656 | 3.873698 | 0.274269 | 19.36318 | false | false | 2024-10-12 | 2024-10-12 | 1 | Youlln/ECE-PRYMMAL1B-FT-V1 (Merge) |
| Youlln_ECE-Qwen0.5B-FT-V2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE-Qwen0.5B-FT-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-Qwen0.5B-FT-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-Qwen0.5B-FT-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE-Qwen0.5B-FT-V2 | c87da3f19ab74854fca30f9ca71ce5c4884ef629 | 7.48657 | apache-2.0 | 0 | 0 | true | false | false | false | 0.5233 | 0.252593 | 25.259312 | 0.328971 | 7.632148 | 0.015106 | 1.510574 | 0.266779 | 2.237136 | 0.306281 | 0.885156 | 0.166556 | 7.395095 | false | false | 2024-10-11 | 2024-10-11 | 1 | Youlln/ECE-Qwen0.5B-FT-V2 (Merge) |
| Youlln_ECE.EIFFEIL.ia-0.5B-SLERP_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Youlln/ECE.EIFFEIL.ia-0.5B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE.EIFFEIL.ia-0.5B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE.EIFFEIL.ia-0.5B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Youlln/ECE.EIFFEIL.ia-0.5B-SLERP | e376ce416af881eefa778d2566d15d9a6d29e7d9 | 8.716673 | apache-2.0 | 0 | 0 | true | false | false | false | 0.604059 | 0.25614 | 25.614037 | 0.330567 | 8.405356 | 0.05287 | 5.287009 | 0.265101 | 2.013423 | 0.310219 | 0.94401 | 0.190326 | 10.0362 | true | false | 2024-10-14 | 2024-10-14 | 1 | Youlln/ECE.EIFFEIL.ia-0.5B-SLERP (Merge) |
| YoungPanda_qwenqwen_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2MoeForCausalLM | <a target="_blank" href="https://huggingface.co/YoungPanda/qwenqwen" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YoungPanda/qwenqwen</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YoungPanda__qwenqwen-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | YoungPanda/qwenqwen | 3b5d9b63076acc8988b8f7e9734cf1d78bb39c25 | 4.443749 | | 0 | 14 | false | false | false | true | 7.122669 | 0.126397 | 12.639685 | 0.337899 | 8.19478 | 0.015106 | 1.510574 | 0.25 | 0 | 0.343365 | 2.453906 | 0.116772 | 1.863549 | false | false | 2024-09-12 | | 0 | Removed |
| Yuma42_KangalKhan-RawRuby-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Yuma42/KangalKhan-RawRuby-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Yuma42/KangalKhan-RawRuby-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Yuma42__KangalKhan-RawRuby-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Yuma42/KangalKhan-RawRuby-7B | 54f56d4c6889eaf43fdd5f7d6dcef3c2ebe51929 | 20.440737 | apache-2.0 | 7 | 7 | true | false | false | true | 0.653719 | 0.547675 | 54.767461 | 0.475473 | 26.387284 | 0.063444 | 6.344411 | 0.287752 | 5.033557 | 0.394958 | 7.636458 | 0.302277 | 22.475251 | true | false | 2024-02-17 | 2024-06-26 | 1 | Yuma42/KangalKhan-RawRuby-7B (Merge) |
| ZeroXClem_L3-Aspire-Heart-Matrix-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/ZeroXClem/L3-Aspire-Heart-Matrix-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/L3-Aspire-Heart-Matrix-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__L3-Aspire-Heart-Matrix-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ZeroXClem/L3-Aspire-Heart-Matrix-8B | d63917595e911b077cff38109c74622c3ec41704 | 25.752283 | apache-2.0 | 3 | 8 | true | false | false | true | 0.787701 | 0.483353 | 48.335306 | 0.538421 | 34.307547 | 0.179003 | 17.900302 | 0.324664 | 9.955257 | 0.418708 | 13.071875 | 0.378491 | 30.94341 | true | false | 2024-11-21 | 2024-11-22 | 1 | ZeroXClem/L3-Aspire-Heart-Matrix-8B (Merge) |
| ZeroXClem_Qwen-2.5-Aether-SlerpFusion-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Qwen-2.5-Aether-SlerpFusion-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B | 23992e1be9f77d767181dc7bcb42176395f42c30 | 29.589629 | apache-2.0 | 2 | 7 | true | false | false | true | 0.676354 | 0.62616 | 62.61597 | 0.546224 | 36.011209 | 0.241692 | 24.169184 | 0.298658 | 6.487696 | 0.417781 | 11.289323 | 0.43268 | 36.964391 | true | false | 2024-11-13 | 2024-11-20 | 1 | ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B (Merge) |
| ZeroXClem_Qwen2.5-7B-HomerAnvita-NerdMix_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Qwen2.5-7B-HomerAnvita-NerdMix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix | cd87a9d7c9a9c8950af84e1f4c72fff5d4625d8a | 34.169056 | apache-2.0 | 4 | 7 | true | false | false | true | 0.784193 | 0.770765 | 77.07649 | 0.554132 | 36.579206 | 0.295317 | 29.531722 | 0.319631 | 9.284116 | 0.439052 | 14.414844 | 0.443152 | 38.127955 | true | false | 2024-11-21 | 2024-11-21 | 1 | ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix (Merge) |
| ZeroXClem_Qwen2.5-7B-HomerCreative-Mix_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ZeroXClem/Qwen2.5-7B-HomerCreative-Mix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Qwen2.5-7B-HomerCreative-Mix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Qwen2.5-7B-HomerCreative-Mix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ZeroXClem/Qwen2.5-7B-HomerCreative-Mix | 6849553db73428ca67823a06f5cfeea660f77df8 | 34.353368 | apache-2.0 | 5 | 7 | true | false | false | true | 0.729571 | 0.783504 | 78.350443 | 0.554807 | 36.770722 | 0.323263 | 32.326284 | 0.299497 | 6.599553 | 0.434958 | 13.769792 | 0.444731 | 38.303413 | true | false | 2024-11-21 | 2024-11-21 | 1 | ZeroXClem/Qwen2.5-7B-HomerCreative-Mix (Merge) |
| ZeroXClem_Qwen2.5-7B-Qandora-CySec_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ZeroXClem/Qwen2.5-7B-Qandora-CySec" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Qwen2.5-7B-Qandora-CySec</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Qwen2.5-7B-Qandora-CySec-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ZeroXClem/Qwen2.5-7B-Qandora-CySec | 6c8b513dbc61a9f704210d26124244f19f3bc4cc | 30.95351 | apache-2.0 | 3 | 7 | true | false | false | true | 0.68208 | 0.677317 | 67.73173 | 0.549002 | 36.264898 | 0.228852 | 22.885196 | 0.300336 | 6.711409 | 0.428604 | 13.408854 | 0.448471 | 38.718972 | true | false | 2024-11-12 | 2024-11-12 | 1 | ZeroXClem/Qwen2.5-7B-Qandora-CySec (Merge) |
| ZeusLabs_L3-Aethora-15B-V2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/ZeusLabs/L3-Aethora-15B-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeusLabs/L3-Aethora-15B-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeusLabs__L3-Aethora-15B-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ZeusLabs/L3-Aethora-15B-V2 | 2c601f116c37dd912c89357dbdbef879a637997e | 24.673537 | cc-by-sa-4.0 | 40 | 15 | true | false | false | true | 2.377669 | 0.720806 | 72.080635 | 0.501091 | 28.968505 | 0.079305 | 7.930514 | 0.287752 | 5.033557 | 0.387083 | 6.252083 | 0.349983 | 27.775931 | false | false | 2024-06-27 | 2024-06-27 | 1 | ZeusLabs/L3-Aethora-15B-V2 (Merge) |
| ZhangShenao_SELM-Llama-3-8B-Instruct-iter-3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/ZhangShenao/SELM-Llama-3-8B-Instruct-iter-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZhangShenao/SELM-Llama-3-8B-Instruct-iter-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZhangShenao__SELM-Llama-3-8B-Instruct-iter-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ZhangShenao/SELM-Llama-3-8B-Instruct-iter-3 | 9c95ccdeceed14a3c2881bc495101a1acca1385f | 23.652706 | mit | 5 | 8 | true | false | false | true | 0.655591 | 0.690282 | 69.028179 | 0.504609 | 29.078531 | 0.062689 | 6.268882 | 0.258389 | 1.118568 | 0.38451 | 5.497135 | 0.378324 | 30.924941 | false | false | 2024-05-25 | 2024-07-02 | 3 | meta-llama/Meta-Llama-3-8B-Instruct |
| aaditya_Llama3-OpenBioLLM-70B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/aaditya/Llama3-OpenBioLLM-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aaditya/Llama3-OpenBioLLM-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aaditya__Llama3-OpenBioLLM-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | aaditya/Llama3-OpenBioLLM-70B | 5f79deaf38bc5f662943d304d59cb30357e8e5bd | 35.004197 | llama3 | 355 | 70 | true | false | false | true | 9.657022 | 0.759674 | 75.967433 | 0.639887 | 47.147075 | 0.19864 | 19.864048 | 0.322987 | 9.731544 | 0.441719 | 14.348177 | 0.486702 | 42.966903 | false | false | 2024-04-24 | 2024-08-30 | 2 | meta-llama/Meta-Llama-3-70B |
| abacusai_Dracarys-72B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/abacusai/Dracarys-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Dracarys-72B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Dracarys-72B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | abacusai/Dracarys-72B-Instruct | 10cabc4beb57a69df51533f65e39a7ad22821370 | 42.710042 | other | 19 | 72 | true | false | false | true | 12.383464 | 0.785578 | 78.557782 | 0.694407 | 56.93552 | 0.356495 | 35.649547 | 0.39094 | 18.791946 | 0.455823 | 16.811198 | 0.545628 | 49.514258 | false | true | 2024-08-14 | 2024-08-16 | 0 | abacusai/Dracarys-72B-Instruct |
| abacusai_Liberated-Qwen1.5-14B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/abacusai/Liberated-Qwen1.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Liberated-Qwen1.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Liberated-Qwen1.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | abacusai/Liberated-Qwen1.5-14B | cc0fa5102bfee821bb5e49f082731ccb9d1fedf1 | 19.866148 | other | 21 | 14 | true | false | false | true | 4.077312 | 0.363102 | 36.310212 | 0.4948 | 28.020906 | 0.121601 | 12.160121 | 0.283557 | 4.474273 | 0.417469 | 10.316927 | 0.35123 | 27.91445 | false | true | 2024-03-05 | 2024-09-05 | 0 | abacusai/Liberated-Qwen1.5-14B |
| abacusai_Llama-3-Smaug-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/abacusai/Llama-3-Smaug-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Llama-3-Smaug-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Llama-3-Smaug-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | abacusai/Llama-3-Smaug-8B | fe54a7d42160d3d8fcc3289c8c411fd9dd5e8357 | 18.967057 | llama2 | 87 | 8 | true | false | false | true | 0.910218 | 0.486675 | 48.667535 | 0.493071 | 27.880374 | 0.079305 | 7.930514 | 0.248322 | 0 | 0.36225 | 5.047917 | 0.318484 | 24.276005 | false | true | 2024-04-19 | 2024-07-02 | 0 | abacusai/Llama-3-Smaug-8B |
abacusai_Smaug-34B-v0.1_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-34B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-34B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-34B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abacusai/Smaug-34B-v0.1
|
34d54c65a0247d5eb694973106c816d9c0ad3fc2
| 23.757347
|
apache-2.0
| 60
| 34
| true
| false
| false
| true
| 11.785941
| 0.501563
| 50.156252
| 0.535779
| 34.261661
| 0
| 0
| 0.329698
| 10.626398
| 0.397875
| 8.134375
| 0.454289
| 39.365396
| false
| true
|
2024-01-25
|
2024-06-12
| 1
|
jondurbin/bagel-34b-v0.2
|
abacusai_Smaug-72B-v0.1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-72B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-72B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-72B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abacusai/Smaug-72B-v0.1
|
a1d657156f82c24b670158406378648233487011
| 29.573654
|
other
| 467
| 72
| true
| false
| false
| false
| 38.824997
| 0.5167
| 51.670013
| 0.599563
| 43.1251
| 0.181269
| 18.126888
| 0.323826
| 9.8434
| 0.447323
| 14.415365
| 0.46235
| 40.261155
| false
| true
|
2024-02-02
|
2024-06-12
| 1
|
moreh/MoMo-72B-lora-1.8.7-DPO
|
abacusai_Smaug-Llama-3-70B-Instruct-32K_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-Llama-3-70B-Instruct-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-Llama-3-70B-Instruct-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-Llama-3-70B-Instruct-32K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abacusai/Smaug-Llama-3-70B-Instruct-32K
|
33840982dc253968f32ef3a534ee0e025eb97482
| 35.022193
|
llama3
| 21
| 70
| true
| false
| false
| true
| 13.303413
| 0.776111
| 77.611072
| 0.649311
| 49.07037
| 0.230363
| 23.036254
| 0.296141
| 6.152125
| 0.420792
| 12.432292
| 0.476479
| 41.831043
| false
| true
|
2024-06-11
|
2024-08-06
| 0
|
abacusai/Smaug-Llama-3-70B-Instruct-32K
|
abacusai_Smaug-Mixtral-v0.1_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-Mixtral-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-Mixtral-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-Mixtral-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abacusai/Smaug-Mixtral-v0.1
|
98fdc8315906b0a8b9e7f24bad89914869fcfc20
| 22.235369
|
apache-2.0
| 12
| 46
| true
| true
| false
| true
| 3.941416
| 0.555443
| 55.544289
| 0.516225
| 31.919261
| 0
| 0
| 0.301174
| 6.823266
| 0.429813
| 12.993229
| 0.335189
| 26.132166
| false
| true
|
2024-02-18
|
2024-08-30
| 0
|
abacusai/Smaug-Mixtral-v0.1
|
abacusai_Smaug-Qwen2-72B-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-Qwen2-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-Qwen2-72B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-Qwen2-72B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abacusai/Smaug-Qwen2-72B-Instruct
|
af015925946d0c60ef69f512c3b35f421cf8063d
| 41.432226
|
other
| 9
| 72
| true
| false
| false
| true
| 13.257335
| 0.78253
| 78.253035
| 0.690979
| 56.266172
| 0.374622
| 37.462236
| 0.361577
| 14.876957
| 0.440073
| 15.175781
| 0.519033
| 46.559176
| false
| true
|
2024-06-26
|
2024-07-29
| 0
|
abacusai/Smaug-Qwen2-72B-Instruct
|
abacusai_bigstral-12b-32k_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/abacusai/bigstral-12b-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/bigstral-12b-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__bigstral-12b-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abacusai/bigstral-12b-32k
|
b78a5385ec1b04d6c97f25e9ba1dff18dc98305f
| 18.072201
|
apache-2.0
| 43
| 12
| true
| false
| false
| false
| 0.965279
| 0.419381
| 41.938058
| 0.470012
| 25.556902
| 0.011329
| 1.132931
| 0.292785
| 5.704698
| 0.455979
| 15.864063
| 0.264129
| 18.236554
| true
| true
|
2024-03-06
|
2024-09-04
| 1
|
abacusai/bigstral-12b-32k (Merge)
|
abacusai_bigyi-15b_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/abacusai/bigyi-15b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/bigyi-15b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__bigyi-15b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abacusai/bigyi-15b
|
b878c15531f7aaf6cf287530f1117b1308b96dc4
| 12.976296
|
other
| 11
| 15
| true
| false
| false
| false
| 1.871122
| 0.209403
| 20.940327
| 0.43453
| 19.940223
| 0.024924
| 2.492447
| 0.309564
| 7.941834
| 0.353781
| 4.289323
| 0.300283
| 22.25362
| true
| true
|
2024-03-06
|
2024-09-17
| 1
|
abacusai/bigyi-15b (Merge)
|
abhishek_autotrain-0tmgq-5tpbg_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-0tmgq-5tpbg" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-0tmgq-5tpbg</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-0tmgq-5tpbg-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abhishek/autotrain-0tmgq-5tpbg
|
a75e1fda984e009613dca3b7846c579a37ab0673
| 4.856619
|
other
| 0
| 0
| true
| false
| false
| true
| 0.351828
| 0.195715
| 19.571515
| 0.313451
| 4.268752
| 0
| 0
| 0.251678
| 0.223714
| 0.365042
| 3.396875
| 0.11511
| 1.678856
| false
| false
|
2024-11-19
|
2024-12-03
| 2
|
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
|
abhishek_autotrain-0tmgq-5tpbg_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-0tmgq-5tpbg" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-0tmgq-5tpbg</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-0tmgq-5tpbg-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abhishek/autotrain-0tmgq-5tpbg
|
a75e1fda984e009613dca3b7846c579a37ab0673
| 4.837547
|
other
| 0
| 0
| true
| false
| false
| true
| 0.336804
| 0.195165
| 19.516549
| 0.312733
| 4.419023
| 0
| 0
| 0.259228
| 1.230425
| 0.358375
| 2.263542
| 0.114362
| 1.595745
| false
| false
|
2024-11-19
|
2024-12-04
| 2
|
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
|
abhishek_autotrain-llama3-70b-orpo-v1_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-llama3-70b-orpo-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-llama3-70b-orpo-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-llama3-70b-orpo-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abhishek/autotrain-llama3-70b-orpo-v1
|
053236c6846cc561c1503ba05e2b28c94855a432
| 14.712672
|
other
| 4
| 70
| true
| false
| false
| true
| 10.761028
| 0.423302
| 42.330239
| 0.599799
| 41.565362
| 0.004532
| 0.453172
| 0.244128
| 0
| 0.357906
| 2.571615
| 0.112201
| 1.355644
| false
| false
|
2024-05-02
|
2024-08-30
| 0
|
abhishek/autotrain-llama3-70b-orpo-v1
|
abhishek_autotrain-llama3-70b-orpo-v2_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-llama3-70b-orpo-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-llama3-70b-orpo-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-llama3-70b-orpo-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abhishek/autotrain-llama3-70b-orpo-v2
|
a2c16a8a7fa48792eb8a1f0c50e13309c2021a63
| 28.804373
|
other
| 3
| 70
| true
| false
| false
| true
| 12.527047
| 0.540606
| 54.060559
| 0.589947
| 39.882199
| 0.206949
| 20.694864
| 0.293624
| 5.816555
| 0.411333
| 9.95
| 0.481799
| 42.42206
| false
| false
|
2024-05-04
|
2024-08-21
| 0
|
abhishek/autotrain-llama3-70b-orpo-v2
|
abhishek_autotrain-llama3-orpo-v2_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-llama3-orpo-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-llama3-orpo-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-llama3-orpo-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abhishek/autotrain-llama3-orpo-v2
|
1655d0683696a5de2eb9a59c339ee469297beb9c
| 12.263693
|
other
| 3
| 8
| true
| false
| false
| true
| 0.90594
| 0.437166
| 43.716561
| 0.315938
| 4.380134
| 0.046073
| 4.607251
| 0.266779
| 2.237136
| 0.37924
| 5.104948
| 0.221825
| 13.536126
| false
| false
|
2024-04-22
|
2024-06-26
| 0
|
abhishek/autotrain-llama3-orpo-v2
|
abhishek_autotrain-vr4a1-e5mms_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-vr4a1-e5mms" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-vr4a1-e5mms</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-vr4a1-e5mms-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
abhishek/autotrain-vr4a1-e5mms
|
5206a32e0bd3067aef1ce90f5528ade7d866253f
| 18.609616
|
other
| 0
| 16
| true
| false
| false
| false
| 1.872878
| 0.214225
| 21.422492
| 0.500062
| 28.456617
| 0.138218
| 13.821752
| 0.319631
| 9.284116
| 0.389125
| 9.040625
| 0.366689
| 29.632092
| false
| false
|
2024-09-05
|
2024-09-06
| 2
|
meta-llama/Meta-Llama-3.1-8B
|
adamo1139_Yi-34B-200K-AEZAKMI-v2_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">adamo1139/Yi-34B-200K-AEZAKMI-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/adamo1139__Yi-34B-200K-AEZAKMI-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
adamo1139/Yi-34B-200K-AEZAKMI-v2
|
189b42b0dae6352fbe7165255aae851961c8e678
| 23.789585
|
apache-2.0
| 12
| 34
| true
| false
| false
| true
| 3.021574
| 0.455526
| 45.552578
| 0.538382
| 35.276425
| 0.054381
| 5.438066
| 0.332215
| 10.961969
| 0.388604
| 6.475521
| 0.451297
| 39.032949
| false
| false
|
2023-12-13
|
2024-06-26
| 0
|
adamo1139/Yi-34B-200K-AEZAKMI-v2
|
adriszmar_QAIMath-Qwen2.5-7B-TIES_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/adriszmar/QAIMath-Qwen2.5-7B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">adriszmar/QAIMath-Qwen2.5-7B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/adriszmar__QAIMath-Qwen2.5-7B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
adriszmar/QAIMath-Qwen2.5-7B-TIES
|
c89bc166dbe2a31c1fceb40ea7acdd96c5620ff5
| 5.469542
|
apache-2.0
| 0
| 7
| true
| false
| false
| false
| 1.283408
| 0.174632
| 17.46322
| 0.312638
| 5.253691
| 0
| 0
| 0.244966
| 0
| 0.409594
| 9.132552
| 0.10871
| 0.96779
| true
| false
|
2024-10-27
|
2024-10-27
| 0
|
adriszmar/QAIMath-Qwen2.5-7B-TIES
|
adriszmar_QAIMath-Qwen2.5-7B-TIES_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/adriszmar/QAIMath-Qwen2.5-7B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">adriszmar/QAIMath-Qwen2.5-7B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/adriszmar__QAIMath-Qwen2.5-7B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
adriszmar/QAIMath-Qwen2.5-7B-TIES
|
c89bc166dbe2a31c1fceb40ea7acdd96c5620ff5
| 4.963265
|
apache-2.0
| 0
| 7
| true
| false
| false
| false
| 1.306316
| 0.168537
| 16.853726
| 0.312427
| 5.019151
| 0
| 0
| 0.249161
| 0
| 0.396292
| 7.169792
| 0.106632
| 0.736924
| true
| false
|
2024-10-27
|
2024-10-27
| 0
|
adriszmar/QAIMath-Qwen2.5-7B-TIES
|
ahmeda335_13_outOf_32_pruned_layers_llama3.1-8b_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/ahmeda335/13_outOf_32_pruned_layers_llama3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ahmeda335/13_outOf_32_pruned_layers_llama3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ahmeda335__13_outOf_32_pruned_layers_llama3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
ahmeda335/13_outOf_32_pruned_layers_llama3.1-8b
|
248c420cc0a0bb8fce3a64a998ca0ce89613783c
| 4.404259
|
apache-2.0
| 0
| 5
| true
| false
| false
| true
| 0.496024
| 0.174807
| 17.480729
| 0.288326
| 1.677845
| 0
| 0
| 0.259228
| 1.230425
| 0.380323
| 4.607031
| 0.112866
| 1.429521
| false
| false
|
2024-10-21
|
2024-12-03
| 1
|
ahmeda335/13_outOf_32_pruned_layers_llama3.1-8b (Merge)
|
ai21labs_Jamba-v0.1_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
JambaForCausalLM
|
<a target="_blank" href="https://huggingface.co/ai21labs/Jamba-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ai21labs/Jamba-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ai21labs__Jamba-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
ai21labs/Jamba-v0.1
|
ce13f3fe99555a2606d1892665bb67649032ff2d
| 9.142836
|
apache-2.0
| 1171
| 51
| true
| true
| false
| true
| 10.112143
| 0.202559
| 20.255921
| 0.360226
| 10.722059
| 0.011329
| 1.132931
| 0.268456
| 2.46085
| 0.359021
| 3.710937
| 0.249169
| 16.57432
| false
| true
|
2024-03-28
|
2024-09-16
| 0
|
ai21labs/Jamba-v0.1
|
aixonlab_Aether-12b_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/aixonlab/Aether-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aixonlab/Aether-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aixonlab__Aether-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
aixonlab/Aether-12b
|
c55d08a69c74f87c18ab5afb05d46359f389c91a
| 17.882297
|
apache-2.0
| 1
| 12
| true
| false
| false
| false
| 1.866432
| 0.234683
| 23.468286
| 0.51794
| 30.551138
| 0.096677
| 9.667674
| 0.316275
| 8.836689
| 0.382865
| 7.991406
| 0.341007
| 26.77859
| false
| false
|
2024-09-24
|
2024-10-09
| 1
|
Xclbr7/Arcanum-12b
|
aixonlab_Grey-12b_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/aixonlab/Grey-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aixonlab/Grey-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aixonlab__Grey-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
aixonlab/Grey-12b
|
50f56572870c49186c3679f9949a602d2d97c046
| 23.606024
|
apache-2.0
| 0
| 12
| true
| false
| false
| false
| 1.468694
| 0.396799
| 39.679938
| 0.569896
| 38.746043
| 0.093656
| 9.365559
| 0.300336
| 6.711409
| 0.451635
| 16.254427
| 0.377909
| 30.878768
| false
| false
|
2024-10-07
|
2024-10-09
| 2
|
Xclbr7/Arcanum-12b
|
akjindal53244_Llama-3.1-Storm-8B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/akjindal53244/Llama-3.1-Storm-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akjindal53244/Llama-3.1-Storm-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akjindal53244__Llama-3.1-Storm-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
akjindal53244/Llama-3.1-Storm-8B
|
df21b06dcf534b026dd301a44a521d7253c8b94b
| 29.36525
|
llama3.1
| 168
| 8
| true
| false
| false
| true
| 0.794391
| 0.803263
| 80.326312
| 0.519633
| 31.615695
| 0.162387
| 16.238671
| 0.309564
| 7.941834
| 0.402833
| 8.820833
| 0.381233
| 31.248153
| true
| false
|
2024-08-12
|
2024-10-27
| 0
|
akjindal53244/Llama-3.1-Storm-8B
|
akjindal53244_Llama-3.1-Storm-8B_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/akjindal53244/Llama-3.1-Storm-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akjindal53244/Llama-3.1-Storm-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akjindal53244__Llama-3.1-Storm-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
akjindal53244/Llama-3.1-Storm-8B
|
df21b06dcf534b026dd301a44a521d7253c8b94b
| 29.843219
|
llama3.1
| 168
| 8
| true
| false
| false
| true
| 1.29582
| 0.805062
| 80.506168
| 0.518867
| 31.494363
| 0.166163
| 16.616314
| 0.326342
| 10.178971
| 0.402802
| 9.116927
| 0.380319
| 31.146572
| true
| false
|
2024-08-12
|
2024-11-26
| 0
|
akjindal53244/Llama-3.1-Storm-8B
|
alcholjung_llama3_medical_tuned_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/alcholjung/llama3_medical_tuned" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">alcholjung/llama3_medical_tuned</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/alcholjung__llama3_medical_tuned-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
alcholjung/llama3_medical_tuned
|
62bd457b6fe961a42a631306577e622c83876cb6
| 11.306071
| 0
| 16
| false
| false
| false
| false
| 0.910721
| 0.010566
| 1.056641
| 0.451294
| 23.265089
| 0.002266
| 0.226586
| 0.286074
| 4.809843
| 0.466021
| 16.852604
| 0.294631
| 21.625665
| false
| false
|
2024-08-14
|
2024-08-14
| 0
|
alcholjung/llama3_medical_tuned
|
allenai_Llama-3.1-Tulu-3-70B_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
allenai/Llama-3.1-Tulu-3-70B
|
c4280450c0cd91a2fb6f41a25c6a1662c6966b01
| 41.198857
|
llama3.1
| 43
| 70
| true
| false
| false
| true
| 36.59305
| 0.829117
| 82.911674
| 0.616363
| 45.365569
| 0.382175
| 38.217523
| 0.373322
| 16.442953
| 0.494833
| 23.754167
| 0.464511
| 40.501256
| false
| true
|
2024-11-20
|
2024-11-27
| 1
|
allenai/Llama-3.1-Tulu-3-70B (Merge)
|
allenai_Llama-3.1-Tulu-3-70B_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
allenai/Llama-3.1-Tulu-3-70B
|
c4280450c0cd91a2fb6f41a25c6a1662c6966b01
| 41.454527
|
llama3.1
| 43
| 70
| true
| false
| false
| true
| 38.022026
| 0.837934
| 83.793446
| 0.615685
| 45.259481
| 0.382931
| 38.293051
| 0.373322
| 16.442953
| 0.498802
| 24.316927
| 0.465592
| 40.621306
| false
| true
|
2024-11-20
|
2024-11-27
| 1
|
allenai/Llama-3.1-Tulu-3-70B (Merge)
|
allenai_Llama-3.1-Tulu-3-70B-DPO_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-70B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-70B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
allenai/Llama-3.1-Tulu-3-70B-DPO
|
6ea110f39fb660573111892a1381d3be3f826f80
| 41.204777
|
llama3.1
| 7
| 70
| true
| false
| false
| true
| 36.800749
| 0.828193
| 82.819253
| 0.61462
| 45.047181
| 0.388218
| 38.821752
| 0.375839
| 16.778523
| 0.49226
| 23.399219
| 0.463265
| 40.362736
| false
| true
|
2024-11-20
|
2024-11-27
| 1
|
allenai/Llama-3.1-Tulu-3-70B-DPO (Merge)
|
allenai_Llama-3.1-Tulu-3-70B-SFT_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-70B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-70B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
allenai/Llama-3.1-Tulu-3-70B-SFT
|
f58ab66db3a1c5dd805c6d3420b2b4f5aef30041
| 38.722611
|
llama3.1
| 4
| 70
| true
| false
| false
| true
| 27.338327
| 0.805062
| 80.506168
| 0.595144
| 42.023984
| 0.324018
| 32.401813
| 0.344799
| 12.639821
| 0.502615
| 24.49349
| 0.462434
| 40.27039
| false
| true
|
2024-11-18
|
2024-11-27
| 1
|
allenai/Llama-3.1-Tulu-3-70B-SFT (Merge)
|
allenai_Llama-3.1-Tulu-3-8B_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
allenai/Llama-3.1-Tulu-3-8B
|
63b75e0dd6eac3725319f869716b9b70c16a6a65
| 26.034998
|
llama3.1
| 95
| 8
| true
| false
| false
| true
| 0.703774
| 0.826669
| 82.666879
| 0.404983
| 16.671813
| 0.196375
| 19.637462
| 0.298658
| 6.487696
| 0.417469
| 10.45026
| 0.282663
| 20.295878
| false
| true
|
2024-11-20
|
2024-11-21
| 1
|
allenai/Llama-3.1-Tulu-3-8B (Merge)
|
allenai_Llama-3.1-Tulu-3-8B_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
allenai/Llama-3.1-Tulu-3-8B
|
50fef8756a9a4ca2010587d128aebb3a18ec897d
| 25.883225
|
llama3.1
| 95
| 8
| true
| false
| false
| true
| 0.701232
| 0.82547
| 82.546975
| 0.406083
| 16.858052
| 0.188822
| 18.882175
| 0.29698
| 6.263982
| 0.417469
| 10.516927
| 0.282081
| 20.231235
| false
| true
|
2024-11-20
|
2024-11-28
| 1
|
allenai/Llama-3.1-Tulu-3-8B (Merge)
|
allenai_Llama-3.1-Tulu-3-8B-DPO_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
allenai/Llama-3.1-Tulu-3-8B-DPO
|
002347006131d85678ea3865520bc9caad69869a
| 25.620576
|
llama3.1
| 12
| 8
| true
| false
| false
| true
| 0.670338
| 0.802938
| 80.293843
| 0.407943
| 17.426016
| 0.185801
| 18.58006
| 0.293624
| 5.816555
| 0.416135
| 10.516927
| 0.289811
| 21.090056
| false
| true
|
2024-11-20
|
2024-11-22
| 1
|
allenai/Llama-3.1-Tulu-3-8B-DPO (Merge)
|
allenai_Llama-3.1-Tulu-3-8B-RM_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForSequenceClassification
|
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-RM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B-RM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-RM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
allenai/Llama-3.1-Tulu-3-8B-RM
|
76247c00745747f820f1712949b5b37901d0f9c4
| 4.235057
|
llama3.1
| 7
| 8
| true
| false
| false
| true
| 0.736899
| 0.167014
| 16.701352
| 0.295004
| 2.64967
| 0
| 0
| 0.256711
| 0.894855
| 0.376417
| 4.252083
| 0.108211
| 0.912382
| false
| true
|
2024-11-20
|
2024-11-22
| 1
|
allenai/Llama-3.1-Tulu-3-8B-RM (Merge)
|
allenai_Llama-3.1-Tulu-3-8B-SFT_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
allenai/Llama-3.1-Tulu-3-8B-SFT
|
4ddd761e6750e04ea3d468175f78463628bba860
| 22.534
|
llama3.1
| 12
| 8
| true
| false
| false
| true
| 0.683246
| 0.74034
| 74.034008
| 0.387186
| 13.931208
| 0.114048
| 11.404834
| 0.277685
| 3.691275
| 0.426771
| 12.013021
| 0.281167
| 20.129654
| false
| true
|
2024-11-18
|
2024-11-22
| 1
|
allenai/Llama-3.1-Tulu-3-8B-SFT (Merge)
|
allenai_OLMo-1B-hf_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
OlmoForCausalLM
|
| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
|  |  |  |  |  |  | <a target="_blank" href="https://huggingface.co/allenai/OLMo-1B-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMo-1B-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMo-1B-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMo-1B-hf | 8e995430edd24416ccfa98b5b283fa07b0c9f1a9 | 6.470278 | apache-2.0 | 17 | 1 | true | false | false | false | 0.248874 | 0.218197 | 21.819661 | 0.305195 | 3.196546 | 0.007553 | 0.755287 | 0.261745 | 1.565996 | 0.409781 | 9.55599 | 0.117354 | 1.928191 | false | true | 2024-04-12 | 2024-06-12 | 0 | allenai/OLMo-1B-hf |
| allenai_OLMo-7B-Instruct-hf_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | OlmoForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMo-7B-Instruct-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMo-7B-Instruct-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMo-7B-Instruct-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMo-7B-Instruct-hf | 2ea947518df93433aa71219f29b36c72ac63be95 | 10.760857 | apache-2.0 | 2 | 7 | true | false | false | true | 1.199951 | 0.347265 | 34.726526 | 0.370647 | 13.159933 | 0.008308 | 0.830816 | 0.270973 | 2.796421 | 0.376479 | 4.326563 | 0.178524 | 8.724882 | false | true | 2024-06-04 | 2024-06-27 | 0 | allenai/OLMo-7B-Instruct-hf |
| allenai_OLMo-7B-hf_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OlmoForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMo-7B-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMo-7B-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMo-7B-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMo-7B-hf | 687d934d36a05417048d0fe7482f24f389fef6aa | 6.776151 | apache-2.0 | 12 | 6 | true | false | false | false | 0.590564 | 0.271927 | 27.192737 | 0.327913 | 5.761987 | 0.006798 | 0.679758 | 0.272651 | 3.020134 | 0.348667 | 2.083333 | 0.117271 | 1.918957 | false | true | 2024-04-12 | 2024-06-27 | 0 | allenai/OLMo-7B-hf |
| allenai_OLMoE-1B-7B-0924_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OlmoeForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMoE-1B-7B-0924" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMoE-1B-7B-0924</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMoE-1B-7B-0924-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMoE-1B-7B-0924 | 4fa3a6e09ed0e41639962f38bfba0fc532b90075 | 7.178464 | apache-2.0 | 109 | 6 | true | true | false | false | 3.076407 | 0.218471 | 21.847143 | 0.339344 | 8.308107 | 0.011329 | 1.132931 | 0.247483 | 0 | 0.348792 | 3.565625 | 0.173953 | 8.216977 | false | true | 2024-07-20 | 2024-09-30 | 0 | allenai/OLMoE-1B-7B-0924 |
| allenai_OLMoE-1B-7B-0924-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | OlmoeForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMoE-1B-7B-0924-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMoE-1B-7B-0924-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMoE-1B-7B-0924-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMoE-1B-7B-0924-Instruct | 7f1c97f440f06ce36705e4f2b843edb5925f4498 | 13.207221 | apache-2.0 | 85 | 6 | true | true | false | true | 5.487959 | 0.465218 | 46.521784 | 0.390161 | 14.571563 | 0 | 0 | 0.267617 | 2.348993 | 0.384823 | 6.069531 | 0.187583 | 9.731457 | false | true | 2024-08-13 | 2024-09-30 | 2 | allenai/OLMoE-1B-7B-0924 |
| allknowingroger_Chocolatine-24B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Chocolatine-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Chocolatine-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Chocolatine-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Chocolatine-24B | 6245b82885ca4930575dbed2932ec1d32d901c0e | 21.333145 | apache-2.0 | 1 | 24 | true | false | false | false | 6.18496 | 0.195815 | 19.581488 | 0.619126 | 45.78594 | 0 | 0 | 0.325503 | 10.067114 | 0.432323 | 12.940365 | 0.456616 | 39.623966 | true | false | 2024-09-02 | 2024-09-02 | 1 | allknowingroger/Chocolatine-24B (Merge) |
| allknowingroger_Gemma2Slerp1-2.6B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp1-2.6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp1-2.6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp1-2.6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp1-2.6B | 2d0e85a03c55abd22963c5c3a44f180bfecebf7b | 20.594692 |  | 0 | 2 | false | false | false | false | 1.194263 | 0.535435 | 53.543487 | 0.434309 | 19.770255 | 0.101964 | 10.196375 | 0.283557 | 4.474273 | 0.456167 | 16.820833 | 0.268866 | 18.762928 | false | false | 2024-12-04 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp1-2.6B (Merge) |
| allknowingroger_Gemma2Slerp1-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp1-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp1-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp1-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp1-27B | 4a5c5092f40cc161bb18ca2b9e30a653c768e062 | 36.356582 | apache-2.0 | 0 | 27 | true | false | false | false | 4.089968 | 0.718633 | 71.863323 | 0.63989 | 48.377666 | 0.249245 | 24.924471 | 0.364094 | 15.212528 | 0.476719 | 19.35651 | 0.445645 | 38.404994 | true | false | 2024-11-30 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp1-27B (Merge) |
| allknowingroger_Gemma2Slerp2-2.6B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp2-2.6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp2-2.6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp2-2.6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp2-2.6B | 12ca2fdb5dd866fbdc624057a176ad3d1f8c2293 | 21.28146 |  | 0 | 2 | false | false | false | false | 1.200011 | 0.574727 | 57.472728 | 0.430765 | 19.719839 | 0.089879 | 8.987915 | 0.305369 | 7.38255 | 0.446771 | 15.279688 | 0.269614 | 18.84604 | false | false | 2024-12-04 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp2-2.6B (Merge) |
| allknowingroger_Gemma2Slerp2-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp2-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp2-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp2-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp2-27B | 21043f6eaf40680675461825fbdfc964f4a3c4a0 | 37.340059 | apache-2.0 | 0 | 27 | true | false | false | false | 4.411521 | 0.754553 | 75.455347 | 0.655727 | 51.090234 | 0.243202 | 24.320242 | 0.369966 | 15.995526 | 0.462083 | 16.927083 | 0.462267 | 40.251921 | true | false | 2024-11-30 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp2-27B (Merge) |
| allknowingroger_Gemma2Slerp3-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp3-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp3-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp3-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp3-27B | cddd53f3b29a361be2350b76770a60b3fcc78059 | 37.166401 | apache-2.0 | 0 | 27 | true | false | false | false | 4.346639 | 0.742638 | 74.263842 | 0.649964 | 49.951521 | 0.252266 | 25.226586 | 0.354866 | 13.982103 | 0.474021 | 19.119271 | 0.464096 | 40.455083 | true | false | 2024-12-01 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp3-27B (Merge) |
| allknowingroger_Gemma2Slerp4-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp4-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp4-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp4-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp4-27B | 5c89bb96e60f0297f5bf27fc10713a4dcdd54285 | 36.650032 | apache-2.0 | 0 | 27 | true | false | false | false | 4.406683 | 0.749658 | 74.965758 | 0.652958 | 50.773762 | 0.228852 | 22.885196 | 0.366611 | 15.548098 | 0.45024 | 15.179948 | 0.464927 | 40.547429 | true | false | 2024-12-01 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp4-27B (Merge) |
| allknowingroger_GemmaSlerp-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaSlerp-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaSlerp-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaSlerp-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaSlerp-9B | 4f54819ae9c0af1f3e508f0afc88a7a734f9632d | 30.857317 | apache-2.0 | 0 | 9 | true | false | false | false | 1.60767 | 0.70432 | 70.432009 | 0.592058 | 41.556032 | 0.076284 | 7.628399 | 0.34396 | 12.527964 | 0.467323 | 17.882031 | 0.416057 | 35.117465 | true | false | 2024-10-27 | 2024-11-22 | 1 | allknowingroger/GemmaSlerp-9B (Merge) |
| allknowingroger_GemmaSlerp2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaSlerp2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaSlerp2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaSlerp2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaSlerp2-9B | e93fb8d7fad0007e463e44365a5a82d0d6facd61 | 33.502291 | apache-2.0 | 2 | 9 | true | false | false | false | 1.785003 | 0.7281 | 72.810033 | 0.598271 | 42.541033 | 0.165408 | 16.540785 | 0.352349 | 13.646532 | 0.476719 | 19.489844 | 0.42387 | 35.98552 | true | false | 2024-10-29 | 2024-11-22 | 1 | allknowingroger/GemmaSlerp2-9B (Merge) |
| allknowingroger_GemmaSlerp4-10B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaSlerp4-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaSlerp4-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaSlerp4-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaSlerp4-10B | e30d14d05730a83926263a7b0e4b1e002b6cd65a | 33.232197 | apache-2.0 | 2 | 10 | true | false | false | false | 1.918679 | 0.732622 | 73.262167 | 0.602786 | 43.328658 | 0.174471 | 17.44713 | 0.353188 | 13.758389 | 0.45399 | 15.482031 | 0.425033 | 36.114805 | true | false | 2024-10-30 | 2024-11-22 | 1 | allknowingroger/GemmaSlerp4-10B (Merge) |
| allknowingroger_GemmaSlerp5-10B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaSlerp5-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaSlerp5-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaSlerp5-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaSlerp5-10B | 7e94afcde7cc1ae88105521a831abefe8126b0d1 | 34.042524 | apache-2.0 | 2 | 10 | true | false | false | false | 2.322861 | 0.735344 | 73.534444 | 0.605448 | 43.538464 | 0.197885 | 19.78852 | 0.352349 | 13.646532 | 0.460781 | 16.764323 | 0.432846 | 36.982861 | true | false | 2024-10-30 | 2024-11-22 | 1 | allknowingroger/GemmaSlerp5-10B (Merge) |
| allknowingroger_GemmaStock1-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaStock1-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaStock1-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaStock1-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaStock1-27B | 8563301fe323c4d1060ae6f56d5737ad62a63fef | 37.43813 | apache-2.0 | 0 | 27 | true | false | false | false | 4.152741 | 0.750906 | 75.090648 | 0.656561 | 50.990136 | 0.259063 | 25.906344 | 0.364094 | 15.212528 | 0.452687 | 15.985937 | 0.472989 | 41.443189 | true | false | 2024-12-03 | 2024-12-06 | 1 | allknowingroger/GemmaStock1-27B (Merge) |
| allknowingroger_HomerSlerp1-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/HomerSlerp1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/HomerSlerp1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__HomerSlerp1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/HomerSlerp1-7B | 42e3df3d9a25d8ff0d470582395f165b2ddb83d8 | 28.622212 | apache-2.0 | 1 | 7 | true | false | false | false | 0.683282 | 0.462121 | 46.212051 | 0.551818 | 36.259863 | 0.280211 | 28.021148 | 0.317953 | 9.060403 | 0.435854 | 13.248438 | 0.450382 | 38.931368 | true | false | 2024-11-20 | 2024-11-22 | 1 | allknowingroger/HomerSlerp1-7B (Merge) |
| allknowingroger_HomerSlerp2-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/HomerSlerp2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/HomerSlerp2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__HomerSlerp2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/HomerSlerp2-7B | 210acef73da0488ea270332f5831b609298a98f0 | 28.760078 | apache-2.0 | 1 | 7 | true | false | false | false | 0.624987 | 0.448682 | 44.868172 | 0.564894 | 37.9603 | 0.285498 | 28.549849 | 0.319631 | 9.284116 | 0.435573 | 12.846615 | 0.451463 | 39.051418 | true | false | 2024-11-20 | 2024-11-22 | 1 | allknowingroger/HomerSlerp2-7B (Merge) |
| allknowingroger_HomerSlerp3-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/HomerSlerp3-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/HomerSlerp3-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__HomerSlerp3-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/HomerSlerp3-7B | 4f41686caa5bc39e3b0f075360974057486ece95 | 28.601186 | apache-2.0 | 1 | 7 | true | false | false | false | 0.617234 | 0.436267 | 43.626688 | 0.559806 | 37.290018 | 0.280967 | 28.096677 | 0.317114 | 8.948546 | 0.446177 | 14.372135 | 0.453457 | 39.27305 | true | false | 2024-11-21 | 2024-11-22 | 1 | allknowingroger/HomerSlerp3-7B (Merge) |
| allknowingroger_HomerSlerp4-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/HomerSlerp4-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/HomerSlerp4-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__HomerSlerp4-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/HomerSlerp4-7B | f2ce1f2afa3c645e26ca61ee30f24736873bafa1 | 28.616144 | apache-2.0 | 0 | 7 | true | false | false | false | 0.67907 | 0.437416 | 43.741606 | 0.557077 | 36.786834 | 0.295317 | 29.531722 | 0.319631 | 9.284116 | 0.440844 | 13.772135 | 0.447224 | 38.580452 | true | false | 2024-11-21 | 2024-11-22 | 1 | allknowingroger/HomerSlerp4-7B (Merge) |
| allknowingroger_LimyQstar-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/LimyQstar-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/LimyQstar-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__LimyQstar-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/LimyQstar-7B-slerp | 6dc557c7bfd6a6f9bc8190bc8a31c3b732deca40 | 18.672525 | apache-2.0 | 0 | 7 | true | false | false | false | 0.630324 | 0.349114 | 34.911369 | 0.502356 | 30.194567 | 0.068731 | 6.873112 | 0.298658 | 6.487696 | 0.414646 | 10.197396 | 0.310339 | 23.371011 | true | false | 2024-03-23 | 2024-06-26 | 1 | allknowingroger/LimyQstar-7B-slerp (Merge) |
| allknowingroger_Llama3.1-60B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Llama3.1-60B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Llama3.1-60B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Llama3.1-60B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Llama3.1-60B | 5fb1ddcce0bddc60949a9d0c2fc9f8326be5bc4e | 9.951594 |  | 0 | 61 | false | false | false | false | 13.491859 | 0.181452 | 18.145188 | 0.324176 | 7.784283 | 0 | 0 | 0.294463 | 5.928412 | 0.359583 | 2.18125 | 0.331034 | 25.670434 | false | false | 2024-10-08 |  | 0 | Removed |
| allknowingroger_Marco-01-slerp1-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Marco-01-slerp1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Marco-01-slerp1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Marco-01-slerp1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Marco-01-slerp1-7B | 12070d5f5bbd891024cb02c363759430ffd3dfba | 29.485317 | apache-2.0 | 0 | 7 | true | false | false | false | 0.637324 | 0.468116 | 46.811571 | 0.554094 | 36.231847 | 0.31571 | 31.570997 | 0.317114 | 8.948546 | 0.445188 | 14.648438 | 0.448305 | 38.700502 | true | false | 2024-11-22 | 2024-11-22 | 1 | allknowingroger/Marco-01-slerp1-7B (Merge) |
| allknowingroger_Meme-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Meme-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Meme-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Meme-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Meme-7B-slerp | 7836c0f4fce70286382e61003e9a05d7559365d9 | 19.326433 | apache-2.0 | 0 | 7 | true | false | false | false | 0.48245 | 0.516375 | 51.637544 | 0.466094 | 24.529486 | 0.046828 | 4.682779 | 0.286074 | 4.809843 | 0.422302 | 10.18776 | 0.281001 | 20.111185 | true | false | 2024-05-22 | 2024-06-26 | 1 | allknowingroger/Meme-7B-slerp (Merge) |
| allknowingroger_Ministral-8B-slerp_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ministral-8B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ministral-8B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ministral-8B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ministral-8B-slerp | 51c40046c0f9fead83485ae83b6c0d03f4ae47f2 | 14.838025 |  | 0 | 7 | false | false | false | false | 1.125098 | 0.19609 | 19.608971 | 0.468602 | 25.195565 | 0 | 0 | 0.312081 | 8.277405 | 0.428531 | 12.39974 | 0.311918 | 23.546469 | false | false | 2024-10-18 | 2024-10-21 | 1 | allknowingroger/Ministral-8B-slerp (Merge) |
| allknowingroger_MistralPhi3-11B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MistralPhi3-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MistralPhi3-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MistralPhi3-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MistralPhi3-11B | 3afeaf24c6306c4752c320c4fd32fa2e7694e12e | 21.627095 | apache-2.0 | 0 | 11 | true | false | false | false | 0.707038 | 0.194291 | 19.429115 | 0.623431 | 46.164629 | 0 | 0 | 0.332215 | 10.961969 | 0.426677 | 12.234635 | 0.46875 | 40.972222 | true | false | 2024-08-26 | 2024-09-02 | 1 | allknowingroger/MistralPhi3-11B (Merge) |
| allknowingroger_Mistralmash1-7B-s_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Mistralmash1-7B-s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Mistralmash1-7B-s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Mistralmash1-7B-s-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Mistralmash1-7B-s | 730b7b2867deef63961f002b6e1e70e7d416c599 | 20.901278 | apache-2.0 | 0 | 7 | true | false | false | false | 0.661188 | 0.3961 | 39.610013 | 0.527749 | 33.448554 | 0.09139 | 9.138973 | 0.294463 | 5.928412 | 0.426708 | 11.805208 | 0.329289 | 25.476507 | true | false | 2024-08-27 | 2024-09-02 | 1 | allknowingroger/Mistralmash1-7B-s (Merge) |
| allknowingroger_Mistralmash2-7B-s_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Mistralmash2-7B-s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Mistralmash2-7B-s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Mistralmash2-7B-s-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Mistralmash2-7B-s | 3b2aafa0f931f3d3103fbc96a6da4ac36f376d78 | 21.402269 | apache-2.0 | 0 | 7 | true | false | false | false | 0.665537 | 0.410188 | 41.01883 | 0.530486 | 33.298364 | 0.08006 | 8.006042 | 0.297819 | 6.375839 | 0.43725 | 13.65625 | 0.334525 | 26.058289 | true | false | 2024-08-27 | 2024-09-02 | 1 | allknowingroger/Mistralmash2-7B-s (Merge) |
| allknowingroger_MixTAO-19B-pass_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MixTAO-19B-pass" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MixTAO-19B-pass</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MixTAO-19B-pass-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MixTAO-19B-pass | a41369cfcfbada9d5387051ba616bf1432b31d31 | 20.615004 | apache-2.0 | 1 | 19 | true | false | false | false | 1.255132 | 0.381437 | 38.143681 | 0.512825 | 31.577918 | 0.060423 | 6.042296 | 0.284396 | 4.58613 | 0.478271 | 19.950521 | 0.310505 | 23.38948 | true | false | 2024-06-02 | 2024-06-26 | 1 | allknowingroger/MixTAO-19B-pass (Merge) |
| allknowingroger_MixTaoTruthful-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MixTaoTruthful-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MixTaoTruthful-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MixTaoTruthful-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MixTaoTruthful-13B-slerp | 3324d37e138c6bf0d6891e54b6dd839c8d2f35ec | 20.265564 | apache-2.0 | 0 | 12 | true | false | false | false | 0.808109 | 0.413885 | 41.388516 | 0.520734 | 32.706362 | 0.067221 | 6.722054 | 0.284396 | 4.58613 | 0.42925 | 12.85625 | 0.310007 | 23.334072 | true | false | 2024-05-25 | 2024-06-26 | 1 | allknowingroger/MixTaoTruthful-13B-slerp (Merge) |
| allknowingroger_MultiCalm-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiCalm-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiCalm-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiCalm-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiCalm-7B-slerp | 1c23540e907fab4dfe0ef66edd0003e764bfe568 | 19.459701 | apache-2.0 | 0 | 7 | true | false | false | false | 0.616573 | 0.392653 | 39.265261 | 0.512189 | 31.466483 | 0.061178 | 6.117825 | 0.282718 | 4.362416 | 0.431948 | 12.960156 | 0.303275 | 22.586067 | true | false | 2024-05-19 | 2024-06-26 | 1 | allknowingroger/MultiCalm-7B-slerp (Merge) |
| allknowingroger_MultiMash-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash-12B-slerp | 91a6d0fe6b9271000ca713ee9ab414c782ba4c50 | 20.192492 | apache-2.0 | 0 | 12 | true | false | false | false | 0.84198 | 0.397449 | 39.744877 | 0.514183 | 31.925677 | 0.081571 | 8.1571 | 0.276846 | 3.579418 | 0.443792 | 14.773958 | 0.306765 | 22.973921 | true | false | 2024-05-20 | 2024-06-26 | 1 | allknowingroger/MultiMash-12B-slerp (Merge) |
| allknowingroger_MultiMash10-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash10-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash10-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash10-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash10-13B-slerp | 6def2fd1a11d4c380a19b7a3bdf263a6b80cd8f3 | 20.376084 | apache-2.0 | 0 | 12 | true | false | false | false | 0.879359 | 0.416283 | 41.628324 | 0.518634 | 32.452502 | 0.068731 | 6.873112 | 0.286074 | 4.809843 | 0.431792 | 12.973958 | 0.311669 | 23.518765 | true | false | 2024-05-27 | 2024-06-26 | 1 | allknowingroger/MultiMash10-13B-slerp (Merge) |
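The `Average ⬆️` column for each row is the arithmetic mean of the six normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal sketch checking this against the `allenai/OLMo-1B-hf` row above (the score values are copied from that row; the variable names are illustrative):

```python
# Normalized benchmark scores from the allenai/OLMo-1B-hf row of the table.
scores = {
    "IFEval": 21.819661,
    "BBH": 3.196546,
    "MATH Lvl 5": 0.755287,
    "GPQA": 1.565996,
    "MUSR": 9.55599,
    "MMLU-PRO": 1.928191,
}

# The leaderboard's headline number is the plain mean of the six scores.
average = sum(scores.values()) / len(scores)
print(average)  # agrees with the row's listed Average ⬆️ of 6.470278
```

The same relation holds for the other rows, so the `* Raw` columns (accuracies in [0, 1]) and the normalized columns can be cross-checked independently of the stored average.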