Added git repo link
README.md CHANGED
@@ -5,7 +5,7 @@ inference: false
# MegaBeam-Mistral-7B-512k Model

-`MegaBeam-Mistral-7B-512k` is a Large-Context LLM that supports a context length of 524,288 tokens. It was trained from [Mistral-7B Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) and can be deployed with serving frameworks such as [vLLM](https://github.com/vllm-project/vllm) and Amazon SageMaker's [DJL](https://docs.aws.amazon.com/sagemaker/latest/dg/deploy-models-frameworks-djl-serving.html) endpoint.
+`MegaBeam-Mistral-7B-512k` is a Large-Context LLM that supports a context length of 524,288 tokens. It was trained from [Mistral-7B Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) and can be deployed with serving frameworks such as [vLLM](https://github.com/vllm-project/vllm) and Amazon SageMaker's [DJL](https://docs.aws.amazon.com/sagemaker/latest/dg/deploy-models-frameworks-djl-serving.html) endpoint. Please refer to our [GitHub repo](https://github.com/awslabs/extending-the-context-length-of-open-source-llms/tree/main/megabeam-mistral-7b) for deployment and inference examples.
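
For readers who want to try the vLLM path mentioned in the added line, here is a minimal offline-inference sketch. It is not part of this commit: the Hugging Face model id, prompt, and generation settings are assumptions, and the linked GitHub repo has the maintained deployment and inference examples.

```python
# Minimal vLLM offline-inference sketch (model id and settings are assumptions;
# see the GitHub repo linked above for the maintained examples).
from vllm import LLM, SamplingParams

# Assumed Hugging Face model id for MegaBeam-Mistral-7B-512k.
llm = LLM(
    model="aws-prototyping/MegaBeam-Mistral-7B-512k",
    max_model_len=524288,  # full 512k context; lower this if GPU memory is tight
)

# Mistral-Instruct style prompt format.
prompt = "[INST] Summarize the following document: ... [/INST]"
outputs = llm.generate([prompt], SamplingParams(temperature=0.0, max_tokens=256))
print(outputs[0].outputs[0].text)
```
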
## Evaluations