---
license: apache-2.0
pretty_name: OpenAI guided-diffusion 256px class-conditional unguided samples (20 samples)
size_categories:
- n<1K
---
Read from the WebDataset (after saving the tar shards somewhere on your disk) like this:
```python
from webdataset import WebDataset
from typing import TypedDict, Iterable
from PIL import Image
from PIL.PngImagePlugin import PngImageFile
from io import BytesIO
from os import makedirs

Example = TypedDict('Example', {
    '__key__': str,
    '__url__': str,
    'img.png': bytes,
})

dataset = WebDataset('./wds-dataset-viewer-test/{00000..00001}.tar')

out_root = 'out'
makedirs(out_root, exist_ok=True)

it: Iterable[Example] = iter(dataset)
for ix, item in enumerate(it):
    with BytesIO(item['img.png']) as stream:
        img: PngImageFile = Image.open(stream)
        # load eagerly, so the image survives the stream's closure
        img.load()
    img.save(f'{out_root}/{ix}.png')
```
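The `'img.png'` key above comes from WebDataset's naming convention: a shard is a plain tar whose members are named `<key>.<field>`, and members sharing the same `<key>` are grouped into one sample dict, keyed by the remaining extension. A minimal stdlib-only sketch of that layout (the member name and placeholder bytes here are illustrative, not this dataset's actual contents):

```python
import tarfile
from io import BytesIO

# placeholder bytes standing in for a real PNG payload
png_bytes = b'\x89PNG\r\n\x1a\n'

# write a one-sample shard: member '000000.img.png' means
# sample key '000000', field 'img.png'
buf = BytesIO()
with tarfile.open(fileobj=buf, mode='w') as tar:
    info = tarfile.TarInfo(name='000000.img.png')
    info.size = len(png_bytes)
    tar.addfile(info, BytesIO(png_bytes))

# read it back with plain tarfile to see what WebDataset would see
buf.seek(0)
with tarfile.open(fileobj=buf, mode='r') as tar:
    names = tar.getnames()
    payload = tar.extractfile('000000.img.png').read()

print(names)                 # ['000000.img.png']
print(payload == png_bytes)  # True
```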
Or read it via the Hugging Face `datasets` library like this:
```python
from datasets import load_dataset
from datasets.dataset_dict import DatasetDict
from datasets.arrow_dataset import Dataset
from PIL.PngImagePlugin import PngImageFile
from typing import TypedDict, Iterable
from os import makedirs

class Item(TypedDict):
    index: int
    tar: str
    tar_path: str
    img: PngImageFile

dataset: DatasetDict = load_dataset('Birchlabs/wds-dataset-viewer-test')
train: Dataset = dataset['train']

out_root = 'out'
makedirs(out_root, exist_ok=True)

it: Iterable[Item] = iter(train)
for item in it:
    item['img'].save(f'{out_root}/{item["index"]}.png')
```