---
license: apache-2.0
---

# switch-large-128_qmoe

This is the [google/switch-large-128](https://huggingface.co/google/switch-large-128) model, quantized to ternary precision with the QMoE framework and stored in QMoE's custom compressed format. Please see the [QMoE repository](https://github.com/IST-DASLab/qmoe) for instructions on how to use this model.
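Ternary precision here means each weight is represented by one of three values, {-α, 0, +α}, plus a per-tensor (or per-group) scale α. As a rough illustration only, the sketch below shows a simple threshold-based ternary round-to-nearest; the actual QMoE framework uses a data-dependent, GPTQ-style quantization procedure, and the `threshold` parameter and function name here are hypothetical:

```python
import numpy as np

def ternary_quantize(w: np.ndarray, threshold: float = 0.7) -> np.ndarray:
    """Illustrative ternary quantization: map weights to {-alpha, 0, +alpha}.

    NOT the QMoE algorithm; a simple magnitude-threshold heuristic.
    """
    # Weights with magnitude below delta are rounded to zero.
    delta = threshold * np.mean(np.abs(w))
    q = np.zeros_like(w)
    q[w > delta] = 1.0
    q[w < -delta] = -1.0
    # Scale alpha is the mean magnitude of the surviving weights.
    mask = q != 0
    alpha = np.abs(w[mask]).mean() if mask.any() else 0.0
    return alpha * q

w = np.random.randn(8, 8)
wq = ternary_quantize(w)
```

After quantization, every entry of `wq` is one of at most three distinct values, which is what enables the aggressive dictionary-style compression of the stored format.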