Tasks: Question Answering
Modalities: Text
Formats: parquet
Languages: English
Size: 10M - 100M
Tags: wikipedia
Update README.md
README.md CHANGED
@@ -21,4 +21,12 @@ configs:
   data_files:
   - split: train
     path: gpt-4/train-*
+task_categories:
+- question-answering
+language:
+- en
+tags:
+- wikipedia
 ---
+This is a Wikipedia passages dataset for ODQA (open-domain question answering) retrievers.
+Each passage has ~256 tokens, split with the gpt-4 tokenizer using tiktoken.
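
The config above declares a parquet `train` split stored under `gpt-4/train-*`. A minimal loading sketch with the `datasets` library, assuming the default config resolves to that split; the repo id below is a placeholder, since the card does not show it:

```python
from datasets import load_dataset

# "<owner>/<dataset-name>" is a placeholder: replace it with the actual Hub repo id.
passages = load_dataset("<owner>/<dataset-name>", split="train")

print(passages)     # features and row count
print(passages[0])  # one ~256-token Wikipedia passage
```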
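The description states that passages were cut to roughly 256 tokens with the gpt-4 tokenizer via tiktoken. Below is a minimal sketch of that kind of fixed-size token chunking; the function name and the simple non-overlapping boundary rule are assumptions, not the author's exact procedure:

```python
import tiktoken

def split_into_passages(text: str, max_tokens: int = 256) -> list[str]:
    """Split text into passages of at most max_tokens gpt-4 tokens.

    Hypothetical sketch: the card only says passages hold ~256 tokens from
    the gpt-4 tokenizer (tiktoken); the exact chunking rules are assumed.
    """
    enc = tiktoken.encoding_for_model("gpt-4")  # resolves to the cl100k_base encoding
    tokens = enc.encode(text)
    return [
        enc.decode(tokens[i : i + max_tokens])
        for i in range(0, len(tokens), max_tokens)
    ]
```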