✨ **effi-13b**
effi-13b is a 13B-parameter causal decoder-only model built by AiPlanet, based on Llama-2-13b-chat-hf and finetuned on a Chain-of-Thought (CoT) dataset. It is made available under the Apache 2.0 license.
✨ **Why use effi-13b?**
- Looking for a ready-to-use chat/instruct model that can provide a rationale for a given context? effi-13b is based on Llama-2-13b-chat-hf.
💬 This is an instruct model, which may not be ideal for further finetuning. If you are interested in building your own instruct/chat model, we recommend starting from Llama-2-13b-chat-hf.
💬 You will need at least 85-100 GB of memory to swiftly run inference.
✨ **Uses**
**Direct Use**
effi-13b has been finetuned on a Chain of Thought dataset.
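As a sketch of direct use, the snippet below builds a prompt in the Llama-2 chat format that the base model (Llama-2-13b-chat-hf) expects, then shows a commented-out `transformers` pipeline call. The Hub id `aiplanet/effi-13b` and the generation parameters are assumptions; adjust them to match the published checkpoint.

```python
# Hedged sketch: prompting effi-13b via Hugging Face transformers.
# The Hub id "aiplanet/effi-13b" below is an assumption; adjust if it differs.

def build_prompt(system: str, user: str) -> str:
    """Wrap a system and user message in the Llama-2 chat format
    used by effi-13b's base model, Llama-2-13b-chat-hf."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

prompt = build_prompt(
    "You are a helpful assistant that explains its reasoning step by step.",
    "Why does ice float on water?",
)

# Loading the 13B weights needs the memory noted above;
# uncomment on suitable hardware.
# import torch
# from transformers import pipeline
# generator = pipeline("text-generation", model="aiplanet/effi-13b",
#                      torch_dtype=torch.bfloat16, device_map="auto")
# print(generator(prompt, max_new_tokens=256)[0]["generated_text"])
```

The system message above is only illustrative; any instruction that asks the model to show its chain of thought fits the finetuning objective.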
**Out-of-Scope Use**
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
**Bias, Risks, and Limitations**
effi-13b is mostly trained on English data and will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
**Recommendations**
We recommend that users of effi-13b develop guardrails and take appropriate precautions for any production use.