
CAC-v0.1

CAC is a 6.7B-parameter large language model fine-tuned specifically for code completion. Although trained for code autocompletion, CAC can also be used for other code-related tasks such as:

  • Generation
  • Summarization
  • Translation
  • Question Answering
  • Optimization
  • Debugging
  • Code Review and more.

This is the very first version of CAC (v0.1), and it is still under development. For this version, we chose DeepSeek-Coder-6.7B as the base model.


Model Details

  • Training Data: Exclusively fine-tuned on a proprietary dataset of 1.8 billion tokens of high-quality programming problems and solutions. The dataset was generated manually and is internal to CodeMate.

  • Training Techniques: The model was fine-tuned using Flash Attention 2 with a sequence length of 8096 tokens.

  • Multilingual Support: CAC-v0.1 is proficient in multiple programming languages, including Python, C/C++, TypeScript, Java, and more.


Load the model with Transformers:

Make sure to install Transformers from the main git branch:

pip install git+https://github.com/huggingface/transformers.git

How to Prompt the Model:

This model accepts prompts in the Alpaca/Vicuna instruction format. For example:

### System Prompt
You are an intelligent programming assistant.

### User Message
Implement a linked list in C++

### Assistant
...
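If you are assembling this prompt programmatically, a minimal sketch in Python could look like the following (the variable names are illustrative, not part of the model's API):

system_prompt = "You are an intelligent programming assistant."
user_message = "Implement a linked list in C++"

# Assemble the Alpaca/Vicuna-style prompt shown above
prompt = (
    "### System Prompt\n"
    f"{system_prompt}\n\n"
    "### User Message\n"
    f"{user_message}\n\n"
    "### Assistant\n"
)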

You can also use the Mistral chat template for conversations:

<s>[INST] .... [/INST] ... </s>
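As a rough sketch, if the tokenizer published on the Hub ships a chat template, you can let Transformers apply it instead of writing the [INST] markers by hand; if no template is bundled, fall back to formatting the string manually as shown above.

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("codemateai/CodeMate-v0.1")
messages = [{"role": "user", "content": "Implement a linked list in C++"}]

# Builds the prompt string from the tokenizer's bundled chat template, if one exists
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)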

Load the model:

from transformers import AutoTokenizer, AutoModelForCausalLM

# Initialize the model and tokenizer
model_path = "codemateai/CodeMate-v0.1"
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Generate a response for a prompt built in the format described above
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
response = tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(response)
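Alternatively, here is a minimal sketch using the Transformers pipeline API; the prompt and generation parameters below are illustrative, not values recommended by CodeMate.

from transformers import pipeline

generator = pipeline("text-generation", model="codemateai/CodeMate-v0.1", device_map="auto")

# A short prompt in the Alpaca/Vicuna format described above
prompt = "### User Message\nImplement a linked list in C++\n\n### Assistant\n"
result = generator(prompt, max_new_tokens=256, do_sample=True, temperature=0.2)
print(result[0]["generated_text"])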

Limitations

This model has undergone very limited testing. CodeMate recommends additional safety testing before any real-world deployments.

For more information and updates, visit the CodeMate website.
