---
license: openrail
pipeline_tag: text-generation
---

**To use MathT5 easily:**

1. Download `MathT5.py`.
2. `from MathT5 import load_model, inference`
3. `tokenizer, model = load_model("jmeadows17/MathT5-large")`
4. `inference(prompt, tokenizer, model)`

`MathT5.pretty_print(text, prompt=True)` renders prompts in a more readable form; pass `prompt=False` to do the same for model outputs.
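
A minimal end-to-end sketch combining these steps (assuming `MathT5.py` is in the working directory and that `inference` returns the generated derivation as a string; the short prompt below is illustrative only):

```python
from MathT5 import load_model, inference
import MathT5  # for MathT5.pretty_print

# Load the fine-tuned checkpoint from the Hugging Face Hub.
tokenizer, model = load_model("jmeadows17/MathT5-large")

# Illustrative prompt (see the full example prompt further down).
prompt = "Given \\cos{(q)} = \\theta{(q)}, then derive - \\sin{(q)} = \\frac{d}{d q} \\theta{(q)}"

# Generate a derivation and display prompt and output in readable form.
output = inference(prompt, tokenizer, model)
MathT5.pretty_print(prompt, prompt=True)
MathT5.pretty_print(output, prompt=False)
```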

**Overview**

MathT5-large is a version of FLAN-T5-large fine-tuned for 25 epochs on 15K synthetic mathematical derivations in LaTeX (each containing 4-10 equations), generated using a symbolic solver (SymPy).
It outperforms few-shot GPT-4 and ChatGPT on a derivation generation task in terms of ROUGE, BLEU, BLEURT, and GLEU scores, and shows some generalisation capabilities.
It was trained on 155 physics symbols, but struggles with out-of-vocabulary symbols. The paper is available here: https://arxiv.org/abs/2307.09998.


**Example prompt:**

```python
prompt = ("Given \\cos{(q)} = \\theta{(q)}, "
          "then derive - \\sin{(q)} = \\frac{d}{d q} \\theta{(q)}, "
          "then obtain (- \\sin{(q)})^{q} (\\frac{d}{d q} \\cos{(q)})^{q} = (- \\sin{(q)})^{2 q}")
```

Output derivations are equations separated by "and".
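
Continuing the sketch above, splitting the generated text on that separator recovers the individual equations (this assumes the separator is the literal word "and" surrounded by spaces):

```python
from MathT5 import load_model, inference

tokenizer, model = load_model("jmeadows17/MathT5-large")
output = inference(prompt, tokenizer, model)  # `prompt` as defined above

# Equations in the generated derivation are joined by the word "and".
equations = [eq.strip() for eq in output.split(" and ")]
for i, eq in enumerate(equations, start=1):
    print(f"Equation {i}: {eq}")
```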

Additional prompts can be found in `training_prompts.json`, alongside the model files.
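
A hedged sketch for loading that file (its exact JSON structure is not documented here, so inspect the result after loading):

```python
import json

# training_prompts.json ships alongside the model files on the Hub;
# its internal structure is not specified in this card.
with open("training_prompts.json") as f:
    training_prompts = json.load(f)

print(type(training_prompts))
```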

Use `jmeadows17/MathT5-base` for the lightweight version.