Update README.md
README.md
CHANGED
@@ -60,5 +60,10 @@ Please use the following BibTeX entry if you use this model in your project:
 }
 ```
 
+# Limitations
+
+Entropy-Attention Regularization mitigates lexical overfitting but does not completely remove it. We expect the model to still show biases, e.g., peculiar keywords that induce a specific prediction regardless of the context. \
+Please refer to our paper for a quantitative evaluation of this mitigation.
+
 ## License
 [GNU GPLv3](https://choosealicense.com/licenses/gpl-3.0/)