Update README.md
README.md CHANGED

@@ -7,6 +7,9 @@ datasets:
 - HaoyeZhang/RLAIF-V-Dataset
 ---
 
+
+<h1>A GPT-4V Level Multimodal LLM on Your Phone</h1>
+
 [GitHub](https://github.com/OpenBMB/MiniCPM-V) | [Demo](https://huggingface.co/spaces/openbmb/MiniCPM-Llama3-V-2_5)
 
@@ -16,7 +19,7 @@ datasets:
 
 * [2024.05.25] 🚀🚀🚀 MiniCPM-Llama3-V 2.5 now supports [Ollama](https://github.com/OpenBMB/ollama/tree/minicpm-v2.5/examples/minicpm-v2.5) for efficient inference. Try it now!
 * [2024.05.23] 🔍 We've released a comprehensive comparison between Phi-3-vision-128k-instruct and MiniCPM-Llama3-V 2.5, including benchmark evaluations, multilingual capabilities, and inference efficiency 🌟📊🌍🚀. Click [here](https://github.com/OpenBMB/MiniCPM-V/blob/main/docs/compare_with_phi-3_vision.md) to view more details.
-* [2024.05.23] 🔥🔥🔥 MiniCPM-V tops GitHub Trending and HuggingFace Trending! Our demo,
+* [2024.05.23] 🔥🔥🔥 MiniCPM-V tops GitHub Trending and HuggingFace Trending! Our demo, recommended by Hugging Face Gradio's official account, is available [here](https://huggingface.co/spaces/openbmb/MiniCPM-Llama3-V-2_5). Come and try it out!
 
 <br>
 