---
title: Groq MoA - Mixture of Agents
emoji: 💻
colorFrom: blue
colorTo: gray
sdk: docker
pinned: false
license: apache-2.0
app_port: 8051
short_description: Deployment of the skapadia3214/groq-moa repo
---
# Repo deployment
[skapadia3214/groq-moa](https://github.com/skapadia3214/groq-moa.git)
All of the following information comes directly from that repository.
Only the list of available models has been modified to incorporate the latest Llama 3.1 models.
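As a rough illustration of that change, a model list in the app's configuration might look like the sketch below. The variable name and file layout are hypothetical (the actual repo may structure this differently); the Llama 3.1 identifiers follow Groq's published model IDs at the time of writing.

```python
# Hypothetical sketch of the updated model list; the real variable name
# and location in skapadia3214/groq-moa may differ.
valid_model_names = [
    "llama-3.1-8b-instant",     # Llama 3.1 8B hosted on Groq
    "llama-3.1-70b-versatile",  # Llama 3.1 70B hosted on Groq
    "gemma-7b-it",
    "mixtral-8x7b-32768",
]
```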
# Mixture-of-Agents Demo Powered by Groq
This Streamlit application showcases the Mixture-of-Agents (MoA) architecture proposed by Together AI, powered by Groq-hosted LLMs. It lets users interact with a configurable multi-agent system for enhanced AI-driven conversations.
![MOA Architecture](./static/moa_groq.svg)
*Source: Adaptation of [Together AI Blog - Mixture of Agents](https://www.together.ai/blog/together-moa)*
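The core MoA loop can be sketched as follows. This is a minimal illustration of the pattern, not the repository's actual code: several "proposer" models answer the prompt independently, and an aggregator model then synthesizes their drafts into a final response. The model names are examples, and the script assumes a `GROQ_API_KEY` environment variable is set.

```python
# Minimal Mixture-of-Agents sketch using the Groq Python SDK.
# Illustrative only; the app's real implementation may differ.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

PROPOSERS = ["llama-3.1-8b-instant", "gemma-7b-it", "mixtral-8x7b-32768"]
AGGREGATOR = "llama-3.1-70b-versatile"

def ask(model: str, prompt: str) -> str:
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def mixture_of_agents(prompt: str) -> str:
    # Layer 1: each proposer answers the prompt independently.
    drafts = [ask(m, prompt) for m in PROPOSERS]
    # Layer 2: the aggregator sees all drafts and produces the final answer.
    combined = "\n\n".join(f"Response {i + 1}:\n{d}" for i, d in enumerate(drafts))
    aggregate_prompt = (
        "Synthesize the following candidate responses into a single, "
        f"high-quality answer to this question: {prompt}\n\n{combined}"
    )
    return ask(AGGREGATOR, aggregate_prompt)

print(mixture_of_agents("Explain Mixture-of-Agents in one paragraph."))
```

In the app, the number of layers and the set of agent models are configurable from the UI; the sketch above fixes a single proposer layer for brevity.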
## Acknowledgements
- [Groq](https://groq.com/) for providing the underlying language models
- [Together AI](https://www.together.ai/) for proposing the Mixture of Agents architecture and providing the conceptual image
- [Streamlit](https://streamlit.io/) for the web application framework
## Citation
This project implements the Mixture-of-Agents architecture proposed in the following paper:
```
@article{wang2024mixture,
  title={Mixture-of-Agents Enhances Large Language Model Capabilities},
  author={Wang, Junlin and Wang, Jue and Athiwaratkun, Ben and Zhang, Ce and Zou, James},
  journal={arXiv preprint arXiv:2406.04692},
  year={2024}
}
```
For more information about the Mixture-of-Agents concept, please refer to the [original research paper](https://arxiv.org/abs/2406.04692) and the [Together AI blog post](https://www.together.ai/blog/together-moa).