---
license: apache-2.0
pretty_name: OpenAI guided-diffusion 256px class-conditional unguided samples (20 samples)
size_categories:
- n<1K
---
After downloading the shards to disk, read the WebDataset like this:
```python
from webdataset import WebDataset
from typing import TypedDict, Iterable
from PIL import Image
from PIL.PngImagePlugin import PngImageFile
from io import BytesIO
from os import makedirs
Example = TypedDict('Example', {
    '__key__': str,
    '__url__': str,
    'img.png': bytes,
})

dataset = WebDataset('./wds-dataset-viewer-test/{00000..00001}.tar')

out_root = 'out'
makedirs(out_root, exist_ok=True)

it: Iterable[Example] = iter(dataset)
for ix, item in enumerate(it):
    with BytesIO(item['img.png']) as stream:
        img: PngImageFile = Image.open(stream)
        img.load()
    img.save(f'{out_root}/{ix}.png')
```
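The `{00000..00001}.tar` part of the URL is brace notation for a range of shard files. As a rough sketch of what it expands to (`expand_shards` is a hypothetical helper for illustration, not part of the webdataset API):

```python
def expand_shards(prefix: str, start: int, end: int, width: int = 5) -> list[str]:
    # Expand a '{start..end}' shard range into one zero-padded path per shard,
    # mirroring what webdataset's brace expansion produces.
    return [f'{prefix}{i:0{width}d}.tar' for i in range(start, end + 1)]

print(expand_shards('./wds-dataset-viewer-test/', 0, 1))
# → ['./wds-dataset-viewer-test/00000.tar', './wds-dataset-viewer-test/00001.tar']
```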
Or read it via the Hugging Face `datasets` library like this:
```python
from datasets import load_dataset
from datasets.dataset_dict import DatasetDict
from datasets.arrow_dataset import Dataset
from PIL.PngImagePlugin import PngImageFile
from typing import TypedDict, Iterable
from os import makedirs
class Item(TypedDict):
    index: int
    tar: str
    tar_path: str
    img: PngImageFile

dataset: DatasetDict = load_dataset('Birchlabs/wds-dataset-viewer-test')
train: Dataset = dataset['train']

out_root = 'out'
makedirs(out_root, exist_ok=True)

it: Iterable[Item] = iter(train)
for item in it:
    item['img'].save(f'{out_root}/{item["index"]}.png')
``` |
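For context on the `__key__`/`img.png` fields in the `Example` TypedDict above: a WebDataset shard is a plain tar archive in which files sharing a basename form one sample, so `sample0.img.png` yields key `sample0` with field `img.png`. A minimal sketch of writing such a shard with only the standard library (the payload here is placeholder bytes, not a real PNG):

```python
import tarfile
from io import BytesIO
from os import makedirs

# One sample, one field: basename 'sample0' becomes the sample key,
# extension 'img.png' becomes the field name when read back.
makedirs('out-shards', exist_ok=True)
payload = b'not-a-real-png'  # placeholder; a real shard would hold PNG bytes
with tarfile.open('out-shards/00000.tar', 'w') as tar:
    info = tarfile.TarInfo(name='sample0.img.png')
    info.size = len(payload)
    tar.addfile(info, BytesIO(payload))
```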