# Geranium-105B-v1.0.0

This is a merge of pre-trained language models created using mergekit.

## Merge Details

An homage to the 70B monster-truck-smash frankenmerges of old, back when Llama 2 was the best that open source had. Modernized with three of the best Llama 3 tunes I know.

Note: untested, but it's 105B dense and made of three very good models. I'd be shocked if it were all that bad.

### Merge Method

This model was merged using the Passthrough merge method.
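
Passthrough does no weight averaging: it simply stacks the requested layer ranges from each source into one deeper network, which is why the result is larger than any of its parents. Here is a minimal sketch of the idea in toy Python, not mergekit's actual implementation; the placeholder layer lists are illustrative:

```python
# Toy illustration of a passthrough merge: each source model contributes a
# contiguous slice of its decoder layers, and the slices are concatenated
# in order. Conceptual sketch only, not mergekit's real code.
def passthrough(sources):
    """sources: (layers, (start, end)) pairs; ranges are half-open slices."""
    merged = []
    for layers, (start, end) in sources:
        merged.extend(layers[start:end])
    return merged

# Mirrors the configuration below: three 40-layer slices, with adjacent
# slices overlapping in depth (20-40 and 40-60), the classic frankenmerge
# interleave. 40 + 40 + 40 = 120 layers in the merged model.
anubis = [f"anubis.{i}" for i in range(80)]
euryale = [f"euryale.{i}" for i in range(80)]
strawberry = [f"strawberry.{i}" for i in range(80)]
stacked = passthrough([(anubis, (0, 40)), (euryale, (20, 60)), (strawberry, (40, 80))])
assert len(stacked) == 120
```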

### Models Merged

The following models were included in the merge:

- TheDrummer/Anubis-70B-v1.1
- Sao10K/L3.3-70B-Euryale-v2.3
- sophosympatheia/Strawberrylemonade-L3-70B-v1.2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: TheDrummer/Anubis-70B-v1.1
        layer_range: [0, 40]
  - sources:
      - model: Sao10K/L3.3-70B-Euryale-v2.3
        layer_range: [20, 60]
  - sources:
      - model: sophosympatheia/Strawberrylemonade-L3-70B-v1.2
        layer_range: [40, 80]
merge_method: passthrough
dtype: bfloat16
```
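
Reproducing the merge is a matter of feeding this file to mergekit's `mergekit-yaml` command. As a sanity check on the advertised size: Llama-3-70B has 80 decoder layers, a hidden size of 8192, and a 128256-token vocabulary, so a 120-layer stack should land right around 105B parameters. A back-of-envelope estimate, taking the commonly cited 70.6B total for the base model:

```python
# Rough parameter estimate for the 120-layer passthrough stack above.
HIDDEN = 8192        # Llama-3-70B hidden size
VOCAB = 128256       # Llama-3 vocabulary size
TOTAL_70B = 70.6e9   # commonly cited Llama-3-70B parameter count
LAYERS_70B = 80      # decoder layers in the 70B base

embeddings = 2 * VOCAB * HIDDEN                      # input embeddings + LM head
per_layer = (TOTAL_70B - embeddings) / LAYERS_70B    # ~0.86B params per layer

merged_layers = 40 + 40 + 40                         # the three slices above
estimate = merged_layers * per_layer + embeddings
print(f"~{estimate / 1e9:.0f}B params")              # -> ~105B
```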
