---
license: cc0-1.0
---
|
|
|
A Reber string is generated by a finite-state grammar over a small, fixed set of characters. In the research paper that proposed the [LSTM](https://dl.acm.org/doi/10.1162/neco.1997.9.8.1735), the authors use the embedded Reber grammar as a benchmark due to its short time lags.
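To make the finite-state idea concrete, here is a minimal sketch of how such strings can be generated. The transition table is one common layout of the standard Reber graph, and the state numbering and function names are assumptions for illustration, not part of this dataset's code:

```python
import random

# One common layout of the standard Reber grammar graph (assumed numbering):
# each state maps to its allowed (symbol, next_state) transitions.
REBER = {
    0: [("B", 1)],
    1: [("T", 2), ("P", 3)],
    2: [("S", 2), ("X", 4)],
    3: [("T", 3), ("V", 5)],
    4: [("X", 3), ("S", 6)],
    5: [("P", 4), ("V", 6)],
    6: [("E", None)],  # E ends the string
}

def make_reber(rng=random):
    """Walk the graph from the start state, choosing uniformly among
    the allowed transitions, until the end symbol E is emitted."""
    state, out = 0, []
    while state is not None:
        symbol, state = rng.choice(REBER[state])
        out.append(symbol)
    return "".join(out)

def make_embedded_reber(rng=random):
    """Embedded variant: the second symbol (T or P) must be repeated
    just before the final E, creating a longer-range dependency."""
    wrapper = rng.choice("TP")
    return "B" + wrapper + make_reber(rng) + wrapper + "E"
```

Every generated string starts with `B` and ends with `E`; in the embedded variant, a model must remember the second symbol across the entire inner string to predict the second-to-last one.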
|
|
|
[Here](https://huggingface.co/datasets/harshildarji/Reber-Grammar/blob/main/dataset.pdf) is the chapter from my Master's thesis that briefly describes the dataset and includes various plots of its statistics.
|
|
|
**Data visualisations available at [about_reber](https://www.kaggle.com/harshildarji/about-reber).**
|
|
|
**Master's thesis**: [Investigating Sparsity in Recurrent Neural Networks](https://arxiv.org/abs/2407.20601)
|
|
|
```
@article{darji2024investigating,
  title={Investigating Sparsity in Recurrent Neural Networks},
  author={Darji, Harshil},
  journal={arXiv preprint arXiv:2407.20601},
  year={2024}
}
```