tscholak/1wnr382e
Fine-tuned weights for PICARD (Parsing Incrementally for Constrained Auto-Regressive Decoding from Language Models), based on T5-Large.
Training Data
The model was fine-tuned on the 7,000 training examples of the Spider text-to-SQL dataset. It addresses Spider's zero-shot text-to-SQL translation task, which means it is expected to generalize to SQL databases that were unseen during training.
Training Objective
This model was initialized with T5-Large and fine-tuned with the text-to-text generation objective.
Questions are always grounded in a database schema, and the model is trained to predict the SQL query that would be used to answer the question. The input to the model is composed of the user's natural language question, the database identifier, and a list of tables and their columns:
[question] | [db_id] | [table] : [column] ( [content] , [content] ) , [column] ( ... ) , [...] | [table] : ... | ...
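As an illustration, the sketch below (not part of the official PICARD codebase; the function name and schema representation are assumptions) shows how a question and a schema could be serialized into this format. Database content values, the optional "( ... )" parts, are omitted here.

```python
def serialize(question: str, db_id: str, schema: dict) -> str:
    """Serialize a question and a database schema into the model's input format.

    `schema` is assumed to map table names to lists of column names, e.g.
    {"singer": ["singer_id", "name", "country"]}. Content values are omitted
    in this sketch.
    """
    tables = " | ".join(
        f"{table} : " + " , ".join(columns)
        for table, columns in schema.items()
    )
    return f"{question} | {db_id} | {tables}"


# Example:
# serialize("How many singers do we have?", "concert_singer",
#           {"singer": ["singer_id", "name", "country"]})
# -> "How many singers do we have? | concert_singer | singer : singer_id , name , country"
```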
The model outputs the database identifier and the SQL query that will be executed on the database to answer the user's question:
[db_id] | [sql]
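A minimal sketch for splitting the model output back into its two parts (the helper name is an assumption, not part of the official tooling):

```python
def parse_output(output: str) -> tuple[str, str]:
    """Split the model output "[db_id] | [sql]" into (db_id, sql).

    Splits only on the first " | " so that any pipe characters inside
    the SQL string are preserved.
    """
    db_id, _, sql = output.partition(" | ")
    return db_id.strip(), sql.strip()


# parse_output("concert_singer | select count(*) from singer")
# -> ("concert_singer", "select count(*) from singer")
```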
Performance
Out of the box, this model achieves 65.3% exact-set-match accuracy and 67.2% execution accuracy on the Spider development set.
With the PICARD constrained decoding method (see the official PICARD implementation), performance improves to 69.1% exact-set-match accuracy and 72.9% execution accuracy on the Spider development set.
Usage
Please see the official repository for scripts and docker images that support evaluation and serving of this model.
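For a quick start without PICARD constrained decoding, the checkpoint can also be loaded directly with the Hugging Face transformers library. This is a plain beam-search sketch; the generation settings are assumptions, and accuracy will be lower than with PICARD enabled.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("tscholak/1wnr382e")
model = AutoModelForSeq2SeqLM.from_pretrained("tscholak/1wnr382e")

question = "How many singers do we have?"
schema = "concert_singer | singer : singer_id , name , country"
input_ids = tokenizer(f"{question} | {schema}", return_tensors="pt").input_ids

# Plain beam search; the PICARD incremental parser is not applied here.
outputs = model.generate(input_ids, max_length=512, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
# Expected form: "concert_singer | select count(*) from singer"
```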
Citation
@inproceedings{Scholak2021:PICARD,
  author    = "Torsten Scholak and Nathan Schucher and Dzmitry Bahdanau",
  title     = "{PICARD}: Parsing Incrementally for Constrained Auto-Regressive Decoding from Language Models",
  booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
  month     = nov,
  year      = "2021",
  publisher = "Association for Computational Linguistics",
  url       = "https://aclanthology.org/2021.emnlp-main.779",
  pages     = "9895--9901",
}