This is an experimental model, yet it is the most powerful RNN model in the world.

Mobius RWKV r6 chat 12B 16k

Mobius is an RWKV v6 arch chat model that benefits from Matrix-Valued States and Dynamic Recurrence.

Introduction

Mobius is an RWKV v6 arch model, a state-based RNN+CNN+Transformer mixed language model pretrained on a certain amount of data. In comparison with the previously released Mobius, the improvements include:

  • Only 24 GB of VRAM needed to run this model locally with fp16;
  • Significant performance improvement in Chinese;
  • Stable support of 16K context length;
  • Function call support.

Usage

Chat format: User: xxxx\n\nAssistant: xxx\n\n

Recommended temperature and top-p: 1.0 and 0.3.
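
A minimal sketch of applying this chat format and the recommended sampling settings, assuming the community `rwkv` pip package and the RWKV world-vocab tokenizer; the model path is a placeholder, not part of this card:

```python
# Minimal sketch, assuming the `rwkv` pip package; the model path below is a
# placeholder (the package expects the path without the .pth extension).
from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

model = RWKV(model="path/to/mobius-rwkv-r6-12B", strategy="cuda fp16")
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")  # assumed world-vocab tokenizer

# Chat format from this card: "User: xxxx\n\nAssistant: xxx\n\n"
prompt = "User: Please introduce the RWKV architecture.\n\nAssistant:"

# Recommended sampling: temperature 1.0, top-p 0.3
args = PIPELINE_ARGS(temperature=1.0, top_p=0.3)
print(pipeline.generate(prompt, token_count=200, args=args))
```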

Function call format example:

System: You are a helpful assistant with access to the following functions. Use them if required -{
  "name": "get_exchange_rate",
  "description": "Get the exchange rate between two currencies",
  "parameters": {
    "type": "object",
    "properties": {
      "base_currency": {
        "type": "string",
        "description": "The currency to convert from"
      },
      "target_currency": {
        "type": "string",
        "description": "The currency to convert to"
      }
    },
    "required": [
      "base_currency",
      "target_currency"
    ]
  }
}

User: Hi, I need to know the exchange rate from USD to EUR

Assistant: xxxx

Observation: xxxx

Assistant: xxxx
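
To make the turn structure concrete, here is a short Python sketch of assembling the function-call prompt in this format; the card shows the assistant's actual call text only as "xxxx", so parsing the reply and running the real function are left to the caller, and the `append_observation` helper is an illustrative assumption:

```python
import json

# Function definition handed to the model in the System turn (from this card).
function_spec = {
    "name": "get_exchange_rate",
    "description": "Get the exchange rate between two currencies",
    "parameters": {
        "type": "object",
        "properties": {
            "base_currency": {"type": "string", "description": "The currency to convert from"},
            "target_currency": {"type": "string", "description": "The currency to convert to"},
        },
        "required": ["base_currency", "target_currency"],
    },
}

prompt = (
    "System: You are a helpful assistant with access to the following functions. "
    "Use them if required -" + json.dumps(function_spec, indent=2) + "\n\n"
    "User: Hi, I need to know the exchange rate from USD to EUR\n\n"
    "Assistant:"
)

# After the model replies with a function call, execute the function yourself and
# append the result as an Observation turn, then let the model answer the user.
def append_observation(prompt: str, assistant_reply: str, observation: str) -> str:
    return prompt + " " + assistant_reply + "\n\nObservation: " + observation + "\n\nAssistant:"

print(prompt)
```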

More details

Mobius 12B 16k is based on the RWKV v6 arch, a leading state-based RNN+CNN+Transformer mixed large language model architecture with a focus on the open-source community:

  • 10~100x training/inference cost reduction;
  • state-based, selective memory, which makes it good at grokking context;
  • community support.

Requirements

21.9 GB of VRAM to run fp16, 13.7 GB for int8, and 7.2 GB for nf4 with the Ai00 server.
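
These figures are roughly consistent with a back-of-envelope estimate of weight memory (parameter count times bytes per parameter); the sketch below shows the arithmetic, keeping in mind that the measured Ai00 numbers also include quantization metadata and runtime overhead:

```python
# Rough weight-memory estimate for a ~12B-parameter model. Real usage differs
# somewhat (the card reports 21.9G fp16 / 13.7G int8 / 7.2G nf4 with Ai00),
# since the exact parameter count, quantization metadata and runtime state vary.
params = 12e9
for name, bytes_per_param in [("fp16", 2.0), ("int8", 1.0), ("nf4", 0.5)]:
    print(f"{name}: ~{params * bytes_per_param / 1024**3:.1f} GiB of weights")
```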

Benchmark

C-Eval: 63.53; CMMLU: 76.07
