# BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials
Reference: R. Luu and M.J. Buehler, Adv. Science, 2023, DOI: https://doi.org/10.1002/advs.202306724
Abstract: The study of biological materials and bio-inspired materials science is well established; however, surprisingly little knowledge is systematically translated to engineering solutions. To accelerate discovery and guide insights, an open-source autoregressive transformer large language model (LLM), BioinspiredLLM, is reported. The model is finetuned with a corpus of over a thousand peer-reviewed articles in the field of structural biological and bio-inspired materials and can be prompted to recall information, assist with research tasks, and function as an engine for creativity. The model has proven that it is able to accurately recall information about biological materials and is further strengthened with enhanced reasoning ability, as well as with Retrieval-Augmented Generation (RAG) to incorporate new data during generation that can also help to traceback sources, update the knowledge base, and connect knowledge domains. BioinspiredLLM also has shown to develop sound hypotheses regarding biological materials design and remarkably so for materials that have never been explicitly studied before. Lastly, the model shows impressive promise in collaborating with other generative artificial intelligence models in a workflow that can reshape the traditional materials design process. This collaborative generative artificial intelligence method can stimulate and enhance bio-inspired materials design workflows. Biological materials are at a critical intersection of multiple scientific fields and models like BioinspiredLLM help to connect knowledge domains.
Load model:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
import torch

model_name = 'lamm-mit/BioinspiredLLM'
# Load the base model first, then attach the BioinspiredLLM adapter weights.
# The base checkpoint named below is an assumption; substitute the model the
# adapter was finetuned from if it differs.
base_model = AutoModelForCausalLM.from_pretrained('meta-llama/Llama-2-13b-chat-hf',
                                                  torch_dtype=torch.float16,
                                                  device_map='auto')
model = PeftModel.from_pretrained(base_model, model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
Generate:
```python
device = 'cuda'

def generate_response(text_input="Biological materials offer amazing",
                      num_return_sequences=1,
                      temperature=1.,
                      max_new_tokens=127,
                      num_beams=1,
                      top_k=50,
                      top_p=0.9,
                      repetition_penalty=1.,
                      eos_token_id=2,
                      verbatim=False,
                      ):
    inputs = tokenizer.encode(text_input, add_special_tokens=False, return_tensors='pt')
    if verbatim:
        print("Length of input, tokenized:", inputs.shape, inputs)
    with torch.no_grad():
        outputs = model.generate(input_ids=inputs.to(device),
                                 max_new_tokens=max_new_tokens,
                                 temperature=temperature,  # modulates the next-token probabilities
                                 num_beams=num_beams,
                                 top_k=top_k,
                                 top_p=top_p,
                                 num_return_sequences=num_return_sequences,
                                 eos_token_id=eos_token_id,
                                 do_sample=True,
                                 repetition_penalty=repetition_penalty,
                                 )
    # Decode only the newly generated tokens, stripping the prompt
    return tokenizer.batch_decode(outputs[:, inputs.shape[1]:].detach().cpu().numpy(),
                                  skip_special_tokens=True)
```
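For intuition about the `top_k` and `top_p` arguments passed to `model.generate`, the toy sketch below (not part of the repository's code; the function name is illustrative) shows how the two filters restrict the candidate token set before sampling: first keep the `top_k` most probable tokens, then keep the smallest prefix of those whose cumulative probability reaches `top_p`.

```python
import math

def top_k_top_p_filter(logits, top_k=50, top_p=0.9):
    """Return the token indices that survive top-k, then top-p (nucleus) filtering."""
    # softmax over the raw logits
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # keep the top_k most probable token indices, most probable first
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]
    # keep the smallest prefix whose cumulative probability reaches top_p
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    return kept

# With these logits, token 0 carries ~0.60 probability and token 1 ~0.22,
# so top_p=0.8 keeps only the first two candidates.
print(top_k_top_p_filter([2.0, 1.0, 0.5, -1.0, -2.0], top_k=3, top_p=0.8))
```

Lower `top_p` (or `top_k`) narrows sampling toward the most likely tokens; raising `temperature` flattens the distribution before this filtering, increasing diversity.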