|
--- |
|
title: Groq MoA - Mixture of Agents |
|
emoji: 💻 |
|
colorFrom: blue |
|
colorTo: gray |
|
sdk: docker |
|
pinned: false |
|
license: apache-2.0 |
|
short_description: Deployment of the skapadia3214/groq-moa repo |
|
--- |
|
|
|
# Repo deployment |
|
[skapadia3214/groq-moa](https://github.com/skapadia3214/groq-moa.git) |
|
|
|
All of the information below comes directly from that repo.

Only the list of available models has been modified to incorporate the latest Llama 3.1 models.
|
|
|
# Mixture-of-Agents Demo Powered by Groq |
|
|
|
This Streamlit application showcases the Mixture-of-Agents (MoA) architecture proposed by Together AI, running on Groq-hosted LLMs. It allows users to interact with a configurable multi-agent system for enhanced AI-driven conversations.
|
|
|
![MOA Architecture](./static/moa_groq.svg) |
|
*Source: Adaptation of [Together AI Blog - Mixture of Agents](https://www.together.ai/blog/together-moa)* |
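
For readers new to the pattern, the sketch below illustrates the core MoA idea behind the app: several "proposer" models answer the prompt independently, and an "aggregator" model synthesizes their outputs into a final response. This is a minimal illustration using the Groq Python SDK, not the app's actual implementation; the model names, prompts, and single-layer structure are assumptions, while the real app adds configurable layers, temperatures, and streaming.

```python
# Minimal Mixture-of-Agents sketch using the Groq Python SDK.
# Model names and prompts are illustrative assumptions, not the app's config.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

PROPOSERS = ["llama-3.1-8b-instant", "gemma2-9b-it"]   # layer-1 "proposer" agents (assumed)
AGGREGATOR = "llama-3.1-70b-versatile"                 # final aggregation agent (assumed)

def ask(model: str, system: str, user: str) -> str:
    """Send one chat-completion request to a Groq-hosted model and return the text."""
    resp = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
    )
    return resp.choices[0].message.content

def mixture_of_agents(prompt: str) -> str:
    # Layer 1: each proposer answers the prompt independently.
    proposals = [ask(m, "You are a helpful assistant.", prompt) for m in PROPOSERS]

    # Layer 2: the aggregator synthesizes the candidate answers into one response.
    aggregator_system = (
        "You are given several candidate responses to the user's query. "
        "Synthesize them into a single, high-quality answer.\n\n"
        + "\n\n".join(f"Response {i + 1}:\n{p}" for i, p in enumerate(proposals))
    )
    return ask(AGGREGATOR, aggregator_system, prompt)

if __name__ == "__main__":
    print(mixture_of_agents("Explain the Mixture-of-Agents architecture in two sentences."))
```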
|
|
|
## Acknowledgements |
|
|
|
- [Groq](https://groq.com/) for providing the underlying language models |
|
- [Together AI](https://www.together.ai/) for proposing the Mixture of Agents architecture and providing the conceptual image |
|
- [Streamlit](https://streamlit.io/) for the web application framework |
|
|
|
## Citation |
|
|
|
This project implements the Mixture-of-Agents architecture proposed in the following paper: |
|
|
|
``` |
|
@article{wang2024mixture,
  title={Mixture-of-Agents Enhances Large Language Model Capabilities},
  author={Wang, Junlin and Wang, Jue and Athiwaratkun, Ben and Zhang, Ce and Zou, James},
  journal={arXiv preprint arXiv:2406.04692},
  year={2024}
}
|
``` |
|
|
|
For more information about the Mixture-of-Agents concept, please refer to the [original research paper](https://arxiv.org/abs/2406.04692) and the [Together AI blog post](https://www.together.ai/blog/together-moa). |