mozilla / Meta-Llama-3.1-70B-Instruct-llamafile
Tags: llamafile · PyTorch · 8 languages · facebook · meta · llama · llama-3
arXiv: 2204.05149
License: llama3.1
Files and versions (branch: main)
1 contributor · History: 33 commits
Latest commit: jartine, "Update LICENSE" (579671a, verified) · 3 months ago
| File | Size | LFS | Last commit message |
|---|---|---|---|
| .gitattributes | 3.5 kB | | Quantize Q4_1 with llamafile-0.8.13 |
| LICENSE | 17.3 kB | | Update LICENSE |
| Meta-Llama-3.1-70B-Instruct.BF16.cat0.llamafile | 50 GB | LFS | Quantize BF16 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.BF16.cat1.llamafile | 50 GB | LFS | Quantize BF16 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.BF16.cat2.llamafile | 41.4 GB | LFS | Quantize BF16 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.F16.cat0.llamafile | 50 GB | LFS | Quantize F16 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.F16.cat1.llamafile | 50 GB | LFS | Quantize F16 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.F16.cat2.llamafile | 41.4 GB | LFS | Quantize F16 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q2_K.llamafile | 26.6 GB | LFS | Quantize Q2_K with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q3_K_L.llamafile | 37.4 GB | LFS | Quantize Q3_K_L with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q3_K_M.llamafile | 34.5 GB | LFS | Quantize Q3_K_M with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q3_K_S.llamafile | 31.2 GB | LFS | Quantize Q3_K_S with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q4_0.llamafile | 40.2 GB | LFS | Quantize Q4_0 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q4_1.llamafile | 44.6 GB | LFS | Quantize Q4_1 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q4_K_M.llamafile | 42.8 GB | LFS | Quantize Q4_K_M with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q4_K_S.llamafile | 40.6 GB | LFS | Quantize Q4_K_S with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q5_0.llamafile | 48.9 GB | LFS | Quantize Q5_0 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q5_1.cat0.llamafile | 50 GB | LFS | Quantize Q5_1 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q5_1.cat1.llamafile | 3.24 GB | LFS | Quantize Q5_1 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q5_K_M.cat0.llamafile | 50 GB | LFS | Quantize Q5_K_M with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q5_K_M.cat1.llamafile | 191 MB | LFS | Quantize Q5_K_M with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q5_K_S.llamafile | 48.9 GB | LFS | Quantize Q5_K_S with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q6_K.cat0.llamafile | 50 GB | LFS | Quantize Q6_K with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q6_K.cat1.llamafile | 8.13 GB | LFS | Quantize Q6_K with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q8_0.cat0.llamafile | 50 GB | LFS | Quantize Q8_0 with llamafile-0.8.13 |
| Meta-Llama-3.1-70B-Instruct.Q8_0.cat1.llamafile | 25.2 GB | LFS | Quantize Q8_0 with llamafile-0.8.13 |
| README.md | 31.4 kB | | Update README.md |

All files are flagged "Safe" by Hugging Face's file scanner and were last updated 3 months ago.
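The quantizations whose single file would exceed Hugging Face's 50 GB per-file limit (BF16, F16, Q5_1, Q5_K_M, Q6_K, Q8_0) are split into numbered .cat0, .cat1, ... parts, which apparently need to be concatenated, in order, back into one llamafile before use. Below is a minimal Python sketch of that reassembly step; the .catN naming pattern is inferred from the file list above, the output filename is an assumption, and the concatenation instructions in this repository's README.md should be treated as authoritative.

```python
import os
import stat
from pathlib import Path


def reassemble_llamafile(stem: str, out_path: str, part_dir: str = ".") -> None:
    """Concatenate <stem>.cat0.llamafile, <stem>.cat1.llamafile, ... into one file.

    The .catN naming is inferred from the repository's file listing; check the
    README before relying on this sketch.
    """
    parts = sorted(Path(part_dir).glob(f"{stem}.cat*.llamafile"))
    if not parts:
        raise FileNotFoundError(f"no .catN parts found for {stem!r} in {part_dir!r}")
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                # Stream in 1 MiB chunks; individual parts are up to 50 GB.
                while chunk := src.read(1 << 20):
                    dst.write(chunk)
    # llamafiles are self-contained executables, so mark the result executable.
    os.chmod(
        out_path,
        os.stat(out_path).st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH,
    )


if __name__ == "__main__":
    # Example: rebuild the Q6_K quantization from its two parts.
    reassemble_llamafile(
        "Meta-Llama-3.1-70B-Instruct.Q6_K",
        "Meta-Llama-3.1-70B-Instruct.Q6_K.llamafile",
    )
```

The lexicographic sort is sufficient here because no quantization has more than ten parts; with more, a numeric sort on the cat index would be needed. Once reassembled (or for the single-file quantizations such as Q4_K_M), the resulting llamafile is run directly as an executable, since the format bundles the model weights with the llama.cpp runtime.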