---
license: other
license_name: sample-code-license
license_link: LICENSE
library_name: ml-4m
---
# 4M: Massively Multimodal Masked Modeling
*David Mizrahi\*, Roman Bachmann\*, Oğuzhan Fatih Kar, Teresa Yeo, Mingfei Gao, Afshin Dehghan, Amir Zamir*
Official implementation and pre-trained models for "4M: Massively Multimodal Masked Modeling" (NeurIPS 2023).
[`Website`](https://4m.epfl.ch) | [`Paper`](https://arxiv.org/abs/2312.06647) | [`GitHub`](https://github.com/apple/ml-4m)
4M is a framework for training "any-to-any" foundation models, using tokenization and masking to scale to many diverse modalities.
Models trained using 4M can perform a wide range of vision tasks, transfer well to unseen tasks and modalities, and are flexible and steerable multimodal generative models.
## Installation
For installation instructions, please see https://github.com/apple/ml-4m.
## Usage
This model can be loaded from the Hugging Face Hub as follows:
```python
from fourm.models.fm import FM

# Download the checkpoint from the Hugging Face Hub and instantiate the model
fm = FM.from_pretrained('EPFL-VILAB/4M-7_XL_COYO700M')
```
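The loaded model can then be prepared for inference in the usual way. A minimal sketch, assuming the `FM` object returned by `from_pretrained` behaves like a regular `torch.nn.Module` (so standard device placement and eval-mode calls apply):

```python
import torch

# Assumption: fm is a standard torch.nn.Module, so .to() and .eval() apply.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
fm = fm.to(device).eval()
```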
Please see https://github.com/apple/ml-4m/blob/main/README_GENERATION.md for detailed generation instructions, and https://github.com/apple/ml-4m for other 4M model and tokenizer checkpoints.
Safetensors checkpoints are hosted under https://huggingface.co/EPFL-VILAB/4M.
## Citation
If you find this repository helpful, please consider citing our work:
```bibtex
@inproceedings{mizrahi20234m,
    title={{4M}: Massively Multimodal Masked Modeling},
    author={David Mizrahi and Roman Bachmann and O{\u{g}}uzhan Fatih Kar and Teresa Yeo and Mingfei Gao and Afshin Dehghan and Amir Zamir},
    booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
    year={2023},
}
```
## License
The model weights in this repository are released under the Sample Code license as found in the [LICENSE](LICENSE) file.