# T5-VAE-Python (flax) (WIP)
A Transformer-VAE built with flax.
It has been trained to interpolate between lines of Python code from the [python-lines dataset](https://huggingface.co/datasets/Fraser/python-lines).
Done as part of Huggingface community training ([see forum post](https://discuss.huggingface.co/t/train-a-vae-to-interpolate-on-english-sentences/7548)).
Builds on T5, adding an autoencoder bottleneck to convert it into an MMD-VAE.
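Interpolating means encoding two code lines into latent vectors, blending them, and decoding the blend. The encode/decode steps need the trained model, but the blending itself can be sketched independently. A minimal sketch, assuming spherical interpolation (one common choice for VAE latents; the scheme actually used by this model is not specified here):

```python
import numpy as np

def slerp(z0, z1, t):
    """Spherical interpolation between two latent codes z0 and z1, t in [0, 1]."""
    u0 = z0 / np.linalg.norm(z0)
    u1 = z1 / np.linalg.norm(z1)
    omega = np.arccos(np.clip(np.dot(u0, u1), -1.0, 1.0))  # angle between codes
    if np.isclose(omega, 0.0):
        # Nearly parallel vectors: fall back to linear interpolation
        return (1 - t) * z0 + t * z1
    return (np.sin((1 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

# Hypothetical latent codes standing in for encoder outputs of two code lines
z_a = np.array([1.0, 0.0])
z_b = np.array([0.0, 1.0])
mid = slerp(z_a, z_b, 0.5)  # the halfway point, to be fed to the decoder
```

In practice each interpolation point would be decoded back into a line of Python with the model's decoder.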
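An MMD-VAE (InfoVAE) replaces the KL regulariser with a maximum mean discrepancy term that pushes the batch of latent codes toward the prior. A minimal numpy sketch of the MMD estimate with an RBF kernel (the kernel choice and `sigma` are illustrative assumptions, not this repo's exact settings):

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    """RBF kernel matrix between rows of x and rows of y."""
    d2 = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2.0 * x @ y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd(x, y, sigma=1.0):
    """Estimate of squared MMD between sample sets x and y."""
    return (rbf_kernel(x, x, sigma).mean()
            + rbf_kernel(y, y, sigma).mean()
            - 2.0 * rbf_kernel(x, y, sigma).mean())

rng = np.random.default_rng(0)
prior = rng.normal(size=(256, 8))      # samples from the N(0, I) prior
latents = rng.normal(size=(256, 8))    # stand-in for encoder outputs
loss_term = mmd(latents, prior)        # small when latents match the prior
```

During training this term is added to the reconstruction loss, so the encoder is rewarded for producing latents whose distribution matches the prior rather than for matching it per-example as in a KL-regularised VAE.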
## Setup
Follow all steps to install dependencies from https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md#tpu-vm
- [ ] Find dataset storage site.
- [ ] Ask JAX team for dataset storage.