
Meta Chameleon 7B

Repository for Meta Chameleon, a mixed-modal early-fusion foundation model from FAIR. See the Chameleon paper for more information.

The Chameleon collection on Hugging Face contains the 7-billion- and 30-billion-parameter model checkpoints.

[more details and usage examples coming soon]
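Until official examples land, here is a minimal, hypothetical sketch of loading the model through the `transformers` Chameleon integration (`ChameleonProcessor` / `ChameleonForConditionalGeneration`). The image path, prompt text, and generation settings are illustrative assumptions, not an official recipe.

```python
# Hedged sketch: image-to-text generation with Meta Chameleon 7B via transformers.
# The model id is real; the image path and prompt below are placeholder assumptions.

def build_prompt(text: str) -> str:
    """Append the <image> placeholder that the Chameleon processor expands
    into image tokens when an image is passed alongside the text."""
    return f"{text}<image>"

if __name__ == "__main__":
    # Heavy imports stay inside the guard so the helper above can be
    # imported without torch/transformers installed.
    import torch
    from PIL import Image
    from transformers import ChameleonForConditionalGeneration, ChameleonProcessor

    processor = ChameleonProcessor.from_pretrained("facebook/chameleon-7b")
    model = ChameleonForConditionalGeneration.from_pretrained(
        "facebook/chameleon-7b",
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )

    image = Image.open("example.jpg")  # assumed local image path
    inputs = processor(
        text=build_prompt("What do you see in this image?"),
        images=image,
        return_tensors="pt",
    ).to(model.device, dtype=torch.bfloat16)

    output = model.generate(**inputs, max_new_tokens=50)
    print(processor.decode(output[0], skip_special_tokens=True))
```

Note that Chameleon expects the `<image>` token to appear in the text prompt wherever an image should be attached; the processor replaces it with the appropriate image-token sequence.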

Citation

To cite the paper, model, or software, please use the following:

@article{Chameleon_Team_Chameleon_Mixed-Modal_Early-Fusion_2024,
  author = {Chameleon Team},
  doi = {10.48550/arXiv.2405.09818},
  journal = {arXiv preprint arXiv:2405.09818},
  title = {Chameleon: Mixed-Modal Early-Fusion Foundation Models},
  url = {https://github.com/facebookresearch/chameleon},
  year = {2024}
}

License

Use of this repository and related resources is governed by the Chameleon Research License and this repository's LICENSE file.

Model size: 7.04B parameters (Safetensors; tensor types F32 and BF16).