Librarian Bot: Add moe tag to model
#4 by librarian-bot - opened
This pull request aims to enrich the metadata of your model by adding a `moe` (Mixture of Experts) tag in the YAML block of your model's README.md.
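For reference, a minimal sketch of the resulting YAML block, assuming your model card already has a metadata block at the top of README.md (the surrounding fields shown here are illustrative, not from this PR):

```yaml
---
license: apache-2.0   # illustrative existing field; your metadata stays as-is
tags:
- moe                 # tag added by this pull request
---
```

If the card already has a `tags:` list, the entry is appended to it; otherwise the list is created.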
How did we find this information? We inferred that this is a `moe` model based on the following criteria:
- The model's name contains the string `moe`.
- The model indicates it uses a `moe` architecture.
- The model's base model is a `moe` model.
Why add this? Enhancing your model's metadata in this way:
- Boosts discoverability - it becomes easier to find Mixture of Experts models on the Hub.
- Helps understanding of the ecosystem - it becomes easier to see which Mixture of Experts models exist on the Hub and how they are used.
This PR comes courtesy of Librarian Bot. If you have any feedback, queries, or need assistance, please don't hesitate to reach out to @davanstrien.
mobicham changed pull request status to merged