albertvillanova committed
Commit: d8b8066
Parent: 720ee2f

Convert dataset sizes from base 2 to base 10 in the dataset card (#3)

- Convert dataset sizes from base 2 to base 10 in the dataset card (130ae677cfb8161f318a8225d5ead62dfa267dcd)

Files changed (1)
  1. README.md +6 -6
README.md CHANGED
@@ -85,9 +85,9 @@ dataset_info:
 - **Repository:** https://github.com/facebookresearch/EmpatheticDialogues
 - **Paper:** [Towards Empathetic Open-domain Conversation Models: a New Benchmark and Dataset](https://arxiv.org/abs/1811.00207)
 - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
-- **Size of downloaded dataset files:** 26.72 MB
-- **Size of the generated dataset:** 23.97 MB
-- **Total amount of disk used:** 50.69 MB
+- **Size of downloaded dataset files:** 28.02 MB
+- **Size of the generated dataset:** 25.13 MB
+- **Total amount of disk used:** 53.15 MB
 
 ### Dataset Summary
 
@@ -107,9 +107,9 @@ PyTorch original implementation of Towards Empathetic Open-domain Conversation M
 
 #### default
 
-- **Size of downloaded dataset files:** 26.72 MB
-- **Size of the generated dataset:** 23.97 MB
-- **Total amount of disk used:** 50.69 MB
+- **Size of downloaded dataset files:** 28.02 MB
+- **Size of the generated dataset:** 25.13 MB
+- **Total amount of disk used:** 53.15 MB
 
 An example of 'train' looks as follows.
 ```
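
For reference, the new figures are the old base-2 (MiB) values re-expressed in base-10 megabytes. A minimal Python sketch of that conversion, with the sizes hard-coded from the hunks above; the 1024² / 1000² factor is an assumption about how the numbers were recomputed:

```python
# Minimal sketch: re-express MiB-denominated sizes (base 2) in MB (base 10).
# The sizes come from the diff above; the conversion factor is an assumption.

MIB_TO_MB = 1024**2 / 1000**2  # 1.048576

sizes_mib = {
    "Size of downloaded dataset files": 26.72,
    "Size of the generated dataset": 23.97,
    "Total amount of disk used": 50.69,
}

for name, mib in sizes_mib.items():
    print(f"{name}: {mib} MiB -> {mib * MIB_TO_MB:.2f} MB")

# Expected output, matching the updated card:
# Size of downloaded dataset files: 26.72 MiB -> 28.02 MB
# Size of the generated dataset: 23.97 MiB -> 25.13 MB
# Total amount of disk used: 50.69 MiB -> 53.15 MB
```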