---
license: apache-2.0
datasets:
- neulab/PangeaInstruct
language:
- am
- ar
- bg
- bn
- cs
- de
- el
- en
- es
- fa
- fr
- ga
- hi
- id
- ig
- it
- iw
- ja
- jv
- ko
- nl
- mn
- ms
- no
- pl
- pt
- ro
- ru
- si
- su
- sw
- ta
- te
- th
- tr
- uk
- ur
- vi
- zh
base_model:
- Qwen/Qwen2-7B-Instruct
---
# Pangea-7B Model Card

[Pangea: A Fully Open Multilingual Multimodal LLM for 39 Languages](https://neulab.github.io/Pangea/)

🇪🇹 🇸🇦 🇧🇬 🇧🇩 🇨🇿 🇩🇪 🇬🇷 🇬🇧 🇺🇸 🇪🇸 🇮🇷 🇫🇷 🇮🇪 🇮🇳 🇮🇩 🇳🇬 🇮🇹 🇮🇱 🇯🇵 🇮🇩 🇰🇷 🇳🇱 🇲🇳 🇲🇾 🇳🇴 🇵🇱 🇵🇹 🇧🇷 🇷🇴 🇷🇺 🇱🇰 🇮🇩 🇰🇪 🇹🇿 🇱🇰 🇹🇭 🇹🇷 🇺🇦 🇵🇰 🇻🇳 🇨🇳 🇹🇼

[🏠 Homepage](https://neulab.github.io/Pangea/) | [🤗 Pangea-7B](https://huggingface.co/neulab/Pangea-7B) | [📊 PangeaIns](https://huggingface.co/datasets/neulab/PangeaInstruct) | [🧪 PangeaBench](https://huggingface.co/collections/neulab/pangea-6713c3b0d78a453906eb2ed8) | [💻 Github](https://github.com/neulab/Pangea/tree/main) | [📄 Arxiv](https://arxiv.org/abs/2410.16153) | [📕 PDF](https://arxiv.org/pdf/2410.16153) | [🖥️ Demo](https://huggingface.co/spaces/neulab/Pangea)

<img src="https://cdn-uploads.huggingface.co/production/uploads/6230d750d93e84e233882dbc/ZjVTKnIsyshWpo-PWg9gM.png" alt="Pangea logo" style="width:300px;">
## Model details

- **Model:** Pangea is a fully open-source Multilingual Multimodal Multicultural LLM.
- **Date:** Pangea-7B was trained in 2024.
- **Training Dataset:** [6M PangeaIns](https://huggingface.co/datasets/neulab/PangeaInstruct).
- **Architecture:** Pangea-7B follows the architecture of [LLaVA-NeXT](https://github.com/LLaVA-VL/LLaVA-NeXT), with a [Qwen2-7B-Instruct](https://huggingface.co/Qwen/Qwen2-7B-Instruct) backbone.
## Uses

Pangea-7B follows the architecture of [LLaVA-NeXT](https://github.com/LLaVA-VL/LLaVA-NeXT).

You can either (1) follow the same model-loading procedure as [LLaVA-NeXT](https://github.com/LLaVA-VL/LLaVA-NeXT), or (2) use the Hugging Face version of the model, [Pangea-7B-hf](https://huggingface.co/neulab/Pangea-7B-hf). An example of loading Pangea-7B-hf directly is shown in the Python code below.

### Direct Use

The Hugging Face version is intended to be used with the standard `transformers` `generate` function.
If you want to use the model with the LLaVA-NeXT codebase instead, please refer to our [original checkpoint](https://huggingface.co/neulab/Pangea-7B).
```python
# Assumes you have defined text_input (a question string) and image_path
# (the path to the image you want to ask about).
import torch
from PIL import Image
from transformers import AutoProcessor, LlavaNextForConditionalGeneration

image_input = Image.open(image_path)

# Load the model in half precision on the first GPU.
model = LlavaNextForConditionalGeneration.from_pretrained(
    "neulab/Pangea-7B-hf",
    torch_dtype=torch.float16
).to(0)
processor = AutoProcessor.from_pretrained("neulab/Pangea-7B-hf")
model.resize_token_embeddings(len(processor.tokenizer))

# Wrap the question in the Qwen2 chat template; <image> marks where
# the image features are spliced into the prompt.
text_input = f"<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n<|im_start|>user\n<image>\n{text_input}<|im_end|>\n<|im_start|>assistant\n"
model_inputs = processor(images=image_input, text=text_input, return_tensors='pt').to("cuda", torch.float16)

# Sample a response and decode it back to text.
output = model.generate(**model_inputs, max_new_tokens=1024, min_new_tokens=32, temperature=1.0, top_p=0.9, do_sample=True)
output = output[0]
result = processor.decode(output, skip_special_tokens=True, clean_up_tokenization_spaces=False)

print(result)
```
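The chat-template string above can be factored into a small helper so that prompts for text-only or multi-image queries stay consistent. This is a sketch under our own naming: `build_prompt` is not part of the Pangea release, it just reproduces the template used in the example.

```python
def build_prompt(user_text: str, num_images: int = 1,
                 system: str = "You are a helpful assistant.") -> str:
    """Build the Qwen2-style chat prompt used by Pangea-7B-hf.

    One <image> placeholder is emitted per attached image; pass
    num_images=0 for a text-only query.
    """
    image_tokens = "<image>\n" * num_images
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{image_tokens}{user_text}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_prompt("What is shown in this image?")
```

The resulting string can be passed as the `text` argument of the processor call above; the number of `<image>` placeholders must match the number of images you pass in.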

## Citing the Model

**BibTeX Citation:**

```
@article{yue2024pangeafullyopenmultilingual,
  title={Pangea: A Fully Open Multilingual Multimodal LLM for 39 Languages},
  author={Xiang Yue and Yueqi Song and Akari Asai and Seungone Kim and Jean de Dieu Nyandwi and Simran Khanuja and Anjali Kantharuban and Lintang Sutawika and Sathyanarayanan Ramamoorthy and Graham Neubig},
  year={2024},
  journal={arXiv preprint arXiv:2410.16153},
  url={https://arxiv.org/abs/2410.16153}
}
```