CollectiveCognition-v1.1-Mistral-7B and airoboros-mistral2.2-7b glued together, then finetuned with QLoRA on the PIPPA and LimaRPv3 datasets.

Description

This repo contains fp16 files of Mistral-11B-CC-Air-RP.

Model used

- teknium/CollectiveCognition-v1.1-Mistral-7B
- teknium/airoboros-mistral2.2-7b

Prompt template: Alpaca or default

Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:

Or the default template:

USER: <prompt>
ASSISTANT:
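As a quick sketch, the Alpaca template above can be filled in with a small helper (the function name and usage are illustrative, not part of the card):

```python
# Alpaca-style prompt from the model card, with a {prompt} slot.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{prompt}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Fill the Alpaca template with a user instruction (helper name is illustrative)."""
    return ALPACA_TEMPLATE.format(prompt=instruction)

print(build_prompt("Write a haiku about autumn."))
```

The resulting string is what you would pass to the model as its input; the model then generates text after the `### Response:` marker.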

The secret sauce

slices:
  - sources:
    - model: teknium/CollectiveCognition-v1.1-Mistral-7B
      layer_range: [0, 24]
  - sources:
    - model: teknium/airoboros-mistral2.2-7b
      layer_range: [8, 32]
merge_method: passthrough
dtype: float16
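The passthrough merge stacks layers 0-23 of one model on top of layers 8-31 of the other, giving 48 transformer layers instead of Mistral-7B's 32. A rough arithmetic sketch of the resulting size (the layer ranges come from the config above; the ~7.24B base parameter count and the uniform-parameters-per-layer assumption are mine, so treat the estimate as approximate):

```python
# Layer ranges from the mergekit config above.
slice_a = range(0, 24)   # CollectiveCognition layers 0-23
slice_b = range(8, 32)   # airoboros layers 8-31
total_layers = len(slice_a) + len(slice_b)
print(total_layers)  # 48 layers in the merged model

# Rough size estimate: scale Mistral-7B's parameter count by the layer ratio.
# Embeddings and the LM head are not duplicated, so the real count is a bit lower.
base_params = 7.24e9                     # assumed Mistral-7B total
approx_params = base_params * total_layers / 32
print(round(approx_params / 1e9, 1))     # roughly 10.9B, hence the "11B" name
```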

Special thanks to Sushi.

If you want to support me, you can here.

