KaiChen1998 committed
Commit fe33182
1 Parent(s): ec7f4b1

Update README.md

Files changed (1): README.md +4 -0
README.md CHANGED
@@ -1,3 +1,7 @@
  ---
  license: apache-2.0
  ---
+ # MoCLE Model Card
+ MoCLE is a Multimodal Large Language Model (MLLM) with a Mixture-of-Experts (MoE) architecture for instruction customization and generalization, based on [InstructBLIP](https://arxiv.org/abs/2305.06500).
+ This repo contains the MoCLE checkpoint with 64 instruction clusters and a routing temperature of 0.05.
+ Check detailed usage in our [GitHub repo](https://github.com/gyhdog99/mocle).