jmeadows17 committed
Commit a6d13ee
1 Parent(s): a36cb6f

Update README.md

Files changed (1):
  1. README.md +15 -7
README.md CHANGED
@@ -1,16 +1,24 @@
  ---
  license: openrail
+ pipeline_tag: text-generation
  ---

- MathT5-large is a version of FLAN-T5-large fine-tuned for 25 epochs on 15K (LaTeX) synthetic mathematical derivations (containing 5 - 9 equations), generated using a symbolic solver.
-
- It outperforms GPT-4 and ChatGPT (paper link soon) on a derivation generation task in ROUGE, BLEU, BLEURT, and GLEU, and shows some generalisation capabilities.
+ **Overview**

+ MathT5-large is a version of FLAN-T5-large fine-tuned for 25 epochs on 15K (LaTeX) synthetic mathematical derivations (containing 5 - 9 equations), that were generated using a symbolic solver (SymPy).
+ It outperforms GPT-4 and ChatGPT (paper link soon) on a derivation generation task in ROUGE, BLEU, BLEURT, and GLEU scores, and shows some generalisation capabilities.
  It was trained on 155 physics symbols, but struggles with out-of-vocabulary symbols.


- An example prompt:
+ **Example prompt:**
+
+ ```prompt = "Given \\cos{(q)} = \\theta{(q)},
+ then derive - \\sin{(q)} = \\frac{d}{d q} \\theta{(q)},
+ then obtain (- \\sin{(q)})^{q} (\\frac{d}{d q} \\cos{(q)})^{q} = (- \\sin{(q)})^{2 q}"```
+
+ **To use MathT5 easily:**

- Given \cos{(q)} = \theta{(q)},
- then derive - \sin{(q)} = \frac{d}{d q} \theta{(q)},
- then obtain (- \sin{(q)})^{q} (\frac{d}{d q} \cos{(q)})^{q} = (- \sin{(q)})^{2 q}
+ 1. Download ```MathT5.py``` to your working directory.
+ 2. ```from MathT5 import load_model, inference```
+ 3. ```tokenizer, model = load_model("jmeadows17/MathT5-large")```
+ 4. ```inference(prompt, tokenizer, model)```
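
Putting the committed usage steps together, a minimal end-to-end sketch might look like the following. It assumes `MathT5.py` from this repository is in the working directory and exposes `load_model` and `inference` with exactly the signatures listed above; that `inference` returns the generated derivation as a string is an assumption, not something stated in the README.

```python
# Minimal usage sketch, assuming MathT5.py (from this repo) is in the working
# directory and provides load_model() and inference() as described in the README.
from MathT5 import load_model, inference

# Prompt format mirrors the committed example: a LaTeX premise followed by
# "then derive ..." / "then obtain ..." steps.
prompt = (
    "Given \\cos{(q)} = \\theta{(q)}, "
    "then derive - \\sin{(q)} = \\frac{d}{d q} \\theta{(q)}, "
    "then obtain (- \\sin{(q)})^{q} (\\frac{d}{d q} \\cos{(q)})^{q} = (- \\sin{(q)})^{2 q}"
)

# load_model() is assumed to fetch the fine-tuned checkpoint from the Hugging Face Hub.
tokenizer, model = load_model("jmeadows17/MathT5-large")

# inference() is assumed to return the model's generated derivation as text.
print(inference(prompt, tokenizer, model))
```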