KaiChen1998 committed
Commit edbe0e1
1 Parent(s): fa4029d

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -2,6 +2,6 @@
 license: apache-2.0
 ---
 # MoCLE Model Card
-MoCLE is a Multi-modality Large Language Model (MLLM) with Mixture-of-Experts (MoE) architecture for instruction customization and generalization based on [InstructBLIP](https://arxiv.org/abs/2305.06500).
+MoCLE is a Multi-modality Large Language Model (MLLM) with a Mixture-of-Experts (MoE) architecture for instruction customization and generalization based on [InstructBLIP](https://arxiv.org/abs/2305.06500).
 This repo contains the MoCLE checkpoint with 64 instruction clusters and a routing temperature of 0.1.
 Check detailed usage in our [Github repo](https://github.com/gyhdog99/mocle).