LuckyRP-24B
LuckyRP-24B is a merge of the following models using mergekit:
- trashpanda-org/MS-24B-Mullein-v0
- cognitivecomputations/Dolphin3.0-Mistral-24B
Configuration:
The following YAML configuration was used to produce this model:
models:
  - model: trashpanda-org/MS-24B-Mullein-v0
    parameters:
      weight: 0.7
  - model: cognitivecomputations/Dolphin3.0-Mistral-24B
    parameters:
      weight: 0.3
base_model: trashpanda-org/MS-24B-Mullein-v0
tokenizer:
  source: base
parameters:
  t: 0.3
  normalize: true
dtype: bfloat16
out_dtype: bfloat16
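
The merge itself can be reproduced by pointing mergekit (e.g. its mergekit-yaml command) at the configuration above. For using the result, a minimal loading sketch with Hugging Face transformers is shown below; it assumes the repository id Vortex5/LuckyRP-24B and a standard chat template, and loads the weights in bfloat16 to match the merge's out_dtype.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id for the merged model
model_id = "Vortex5/LuckyRP-24B"

# Load tokenizer and model; bfloat16 matches the merge's out_dtype
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Simple chat-style generation (assumes the tokenizer ships a chat template)
messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))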