KaiChen1998 committed
Commit: 08f485c
Parent(s): edbe0e1

Update README.md

Files changed (1): README.md (+2, -2)
README.md CHANGED
@@ -2,6 +2,6 @@
 license: apache-2.0
 ---
 # MoCLE Model Card
-MoCLE is a Multi-modality Large Language Model (MLLM) with a Mixture-of-Experts (MoE) architecture for instruction customization and generalization based on [InstructBLIP](https://arxiv.org/abs/2305.06500).
+[MoCLE](https://arxiv.org/abs/2312.12379) is a Multi-modality Large Language Model (MLLM) with a Mixture-of-Experts (MoE) architecture for instruction customization and generalization based on [InstructBLIP](https://huggingface.co/docs/transformers/model_doc/instructblip).
 This repo contains the MoCLE checkpoint with 64 instruction clusters and a routing temperature of 0.1.
-Check detailed usage in our [Github repo](https://github.com/gyhdog99/mocle).
+Check detailed usage in our [Github repo](https://github.com/gyhdog99/mocle) and [Website](https://kaichen1998.github.io/projects/mocle/).
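The "routing temperature of 0.1" mentioned in the card is the temperature of the MoE gate's softmax. As a hedged sketch (assuming the common temperature-scaled gating formulation; MoCLE's exact gating is defined in the GitHub repo above, not here), dividing the expert logits by a small temperature before the softmax makes routing nearly one-hot, so each of the 64 instruction clusters commits to a specific expert:

```python
# Hedged sketch of temperature-scaled MoE routing, not MoCLE's actual code.
# A small tau (e.g. 0.1, as in this checkpoint) sharpens the softmax over
# expert logits, pushing each instruction cluster toward one expert.
import torch

def route(logits: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """Softmax over expert logits with temperature tau (lower = sharper)."""
    return torch.softmax(logits / tau, dim=-1)

# Toy logits for 3 of the 64 clusters' experts (values are illustrative).
logits = torch.tensor([1.2, 0.9, 0.1])
print(route(logits, tau=1.0))  # soft mixture across experts
print(route(logits, tau=0.1))  # nearly one-hot -> expert specialization
```

With tau = 1.0 the toy gate spreads weight across experts; at tau = 0.1 it concentrates almost all weight on the top-scoring expert, which matches the card's framing of customization (specialized experts) versus generalization (softer mixing).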