aws-neuron/optimum-neuron-cache
License: apache-2.0
Path: optimum-neuron-cache/neuronxcc-2.13.66.0+6dfecc895/0_REGISTRY/0.0.22/inference/llama/meta-llama/Llama-2-7b-chat-hf
8 contributors · History: 7 commits
Latest commit: "Synchronizing local compiler cache." by dacorvo (HF staff) · 891a68a (verified) · 8 months ago
Name                        Size       Last commit message                    Last modified
06b52fde14a8f28d6a3d.json   860 Bytes  Synchronizing local compiler cache.    8 months ago
5a8294b279e725cf8542.json   861 Bytes  Synchronizing local compiler cache.    8 months ago
735e2b4b3a5019f203a7.json   861 Bytes  Synchronizing local compiler cache.    8 months ago
9c9eb0f38ddf656dac4a.json   860 Bytes  Synchronizing local compiler cache.    8 months ago
be28f38f27aeb510e1c8.json   860 Bytes  Synchronizing local compiler cache.    8 months ago
c9527c01253d9424170a.json   860 Bytes  Synchronizing local compiler cache.    8 months ago
f6a3964311a50e56da2e.json   860 Bytes  Synchronizing local compiler cache.    8 months ago
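
The JSON files above are registry entries describing precompiled Neuron configurations for meta-llama/Llama-2-7b-chat-hf inference. As a minimal sketch of how such a cache is typically consumed, the example below exports the model with optimum-neuron; the shape and compiler arguments shown (batch_size, sequence_length, num_cores, auto_cast_type) are illustrative assumptions and would need to match one of the cached configurations for the public cache to be reused instead of recompiling.

```python
# Sketch only: assumes optimum-neuron is installed on a Neuron-capable instance
# (e.g. Inferentia2/Trainium) and that the chosen arguments match a cached entry.
from optimum.neuron import NeuronModelForCausalLM

compiler_args = {"num_cores": 2, "auto_cast_type": "fp16"}   # illustrative values
input_shapes = {"batch_size": 1, "sequence_length": 2048}    # illustrative values

# export=True triggers Neuron compilation; cached artifacts are looked up
# in the compiler cache before compiling from scratch.
model = NeuronModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-chat-hf",
    export=True,
    **compiler_args,
    **input_shapes,
)
```

The "Synchronizing local compiler cache." commit messages reflect uploads that push locally compiled artifacts back to this hub repository so other users with the same configuration can skip compilation.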