aws-neuron / optimum-neuron-cache
AWS Inferentia and Trainium
License: apache-2.0
optimum-neuron-cache / neuronxcc-2.13.66.0+6dfecc895 / 0_REGISTRY / 0.0.22 / inference / llama / llm-jp / llm-jp-13b-v2.0 (at revision f2a02a4)
Commit History
Synchronizing local compiler cache. · 5b590b1 (verified) · committed by htokoyo on May 23
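
The entry above is an automated cache-synchronization commit. For reference, the same commit history can also be read programmatically with the huggingface_hub client. The snippet below is a minimal sketch, assuming huggingface_hub is installed; note that it lists commits for the whole repository, not just this subfolder, so any per-path filtering would have to happen client-side.

```python
# Minimal sketch: list recent commits of the aws-neuron/optimum-neuron-cache
# repository via the huggingface_hub API. list_repo_commits returns the
# history of the whole repository, not of a single subfolder.
from huggingface_hub import HfApi

api = HfApi()
commits = api.list_repo_commits("aws-neuron/optimum-neuron-cache")

for commit in commits[:10]:
    # Each GitCommitInfo carries the full hash, title, authors and date.
    print(commit.commit_id[:7], commit.title, commit.created_at.date())
```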