
This model is intended to be a strong base suitable for downstream fine-tuning on a variety of tasks. Based on our internal evaluations, we believe it's one of the strongest models for most downstream tasks. You can read more about our development and evaluation process here.
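As a quick orientation, below is a minimal sketch of loading the model for generation with the Hugging Face transformers library. The prompt and generation settings are illustrative placeholders, not a recommended configuration.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OpenPipe/mistral-ft-optimized-1227"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",           # requires the accelerate package
)

# Illustrative prompt only; format inputs to match your downstream fine-tuning data.
prompt = "Summarize the following support ticket:\n..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```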

It is a hierarchical SLERP merge of teknium/OpenHermes-2.5-Mistral-7B, Intel/neural-chat-7b-v3-3, meta-math/MetaMath-Mistral-7B, and openchat/openchat-3.5-1210. berkeley-nest/Starling-LM-7B-alpha was omitted from this version of the model.
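For readers unfamiliar with the technique, the sketch below shows what SLERP (spherical linear interpolation) does to a single pair of weight tensors. It only illustrates the interpolation itself under the assumption of a single scalar mixing factor `t`; the actual per-layer weights and merge order used for this model are not documented here.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors of the same shape."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    # Angle between the two (flattened, normalized) weight vectors.
    omega = torch.arccos(torch.clamp(a_unit @ b_unit, -1.0, 1.0))
    if omega.abs() < eps:
        # Nearly parallel weights: fall back to plain linear interpolation.
        return (1 - t) * a + t * b
    so = torch.sin(omega)
    mixed = (torch.sin((1 - t) * omega) / so) * a_flat + (torch.sin(t * omega) / so) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)
```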

Model size: 7.24B params
Tensor type: BF16
Format: Safetensors

Model tree for OpenPipe/mistral-ft-optimized-1227
Finetunes: 1 model
Merges: 18 models
Quantizations: 5 models

Spaces using OpenPipe/mistral-ft-optimized-1227: 14