marksverdhei committed · Commit 25db008 · Parent(s): 852c810
Create README.md
README.md (added)

# T5-define

This model is trained to generate word definitions from a word and a context sentence,
using the subset of WordNet entries that have both an example and a definition.
The model uses task prompts in the format `define "[word]": [example sentence]`.

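For illustration, a prompt for a word and its context sentence can be assembled as below; the `make_prompt` helper is only an example and is not part of this repository.

```python
# Illustrative helper: build a prompt in the format described above
def make_prompt(word: str, example_sentence: str) -> str:
    return f'define "{word}": {example_sentence}'

make_prompt("noseplow", "The children hid as the noseplow drove across the street")
# -> 'define "noseplow": The children hid as the noseplow drove across the street'
```
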
To my knowledge, this is the first public model trained on a word definition task.
Similar work: [Zero-shot Word Sense Disambiguation using Sense Definition Embeddings](https://aclanthology.org/P19-1568.pdf)

This project has two objectives:
1. Explore how well the model generalizes when generating definitions for unseen words
2. Explore the utility of the word embeddings produced by definition models

How to run:
```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("marksverdhei/t5-base-define")
model = T5ForConditionalGeneration.from_pretrained("marksverdhei/t5-base-define")

prompt = 'define "noseplow": The children hid as the noseplow drove across the street'

# Tokenize the prompt and generate a definition
ids = tokenizer(prompt, return_tensors="pt").input_ids
# Drop the leading pad token and the trailing </s> token before decoding
generated_tokens = model.generate(ids)[0][1:-1]
print(tokenizer.decode(generated_tokens))
```
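
The `[1:-1]` slice removes the leading pad token and the trailing end-of-sequence token that T5 emits. An equivalent approach, assuming the same `model`, `tokenizer`, and `ids` as above, is to let the tokenizer skip special tokens during decoding:

```python
# Equivalent decoding that skips special tokens instead of slicing manually
output_ids = model.generate(ids)[0]
print(tokenizer.decode(output_ids, skip_special_tokens=True))
```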