
Barcenas 2.7b is a small language model (SLM) with 2.7 billion parameters, based on the Phi-2 model by Microsoft.

It inherits the strong reasoning and language-understanding capabilities of Phi-2, which achieves state-of-the-art performance among base language models with fewer than 13 billion parameters.

Barcenas 2.7b also incorporates the features of the phi-2-NaturalChat model by Onymity, a version of Phi-2 fine-tuned for natural, engaging chat conversations using data based on GPT-4.

Barcenas 2.7b is designed to handle a variety of tasks, such as question answering, code generation, and natural chat, using high-quality and synthetic training data.
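A minimal usage sketch with the Hugging Face `transformers` library is shown below. The repository id `Danielbrdz/Barcenas-2.7b` is an assumption (substitute the actual model id), and the single-turn prompt format follows Phi-2's documented `Instruct:`/`Output:` QA convention, which this fine-tune may or may not override:

```python
MODEL_ID = "Danielbrdz/Barcenas-2.7b"  # assumed repository id


def build_prompt(question: str) -> str:
    # Phi-2's single-turn QA format; the chat fine-tune may use a
    # different template, so treat this as an assumption.
    return f"Instruct: {question}\nOutput:"


def generate(question: str, max_new_tokens: int = 128) -> str:
    # Imported lazily so the prompt helper above works without
    # transformers installed; first call downloads the model weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `generate("Write a Python function that reverses a string.")` would return the decoded completion, including the prompt prefix.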

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽
