
Table of Contents

  1. TL;DR
  2. Model Description
  3. Usage
  4. Training Data
  5. Citation

TL;DR

This is a Phi-1_5 model fine-tuned on the camel-ai/biology dataset. It is intended for research purposes only and should not be used in production settings.

Model Description

  • Model type: Language model
  • Language(s) (NLP): English
  • License: Apache 2.0
  • Related Models: Phi-1_5
  • Model size: 1.42B parameters (FP16 safetensors)

Usage

Below is an example script showing how to use the model with the transformers library:

Using the PyTorch model


from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

base_model = "ArtifactAI/phi-biology"

# Phi-1_5 ships custom model code, so trust_remote_code is required
model = AutoModelForCausalLM.from_pretrained(base_model, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)

def generate(prompt):
    # Wrap the question in the instruction template used during fine-tuning
    inputs = tokenizer(f'''Below is an instruction that describes a task. Write a response that appropriately completes the request. If you are adding additional white spaces, stop writing.\n\n### Instruction:\n{prompt}\n\n### Response:\n''', return_tensors="pt", return_attention_mask=False)
    # Stream generated tokens to stdout, skipping the echoed prompt
    streamer = TextStreamer(tokenizer, skip_prompt=True)
    _ = model.generate(**inputs, streamer=streamer, max_new_tokens=500)

generate("What are the common techniques used in identifying a new species, and how can scientists accurately categorize it within the existing taxonomy system?")

Training Data

The model was trained on camel-ai/biology, a dataset of question/answer pairs in which the questions were generated with the t5-base model and the answers with the GPT-3.5-turbo model.
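
To inspect the training data yourself, the dataset can be pulled straight from the Hugging Face Hub. This is a minimal sketch; the printed column names depend on the dataset's actual schema:

from datasets import load_dataset

# Download the camel-ai/biology question/answer dataset from the Hub
dataset = load_dataset("camel-ai/biology", split="train")

# Inspect the schema and a sample record
print(dataset.column_names)
print(dataset[0])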

Citation

@misc{phi-biology,
    title={phi-biology},
    author={Matthew Kenney},
    year={2023}
}