---
base_model:
  - openai-community/gpt2
language:
  - en
  - ta
license: mit
tags:
  - gpt2
  - text-generation
  - QnQ
pipeline_tag: text-generation
---

# QnQGPT Model

This is a custom GPT model based on the GPT-2 architecture.

## Model Details

- **Model Type:** GPT-2 (the snippet after this list shows how to confirm this from the model configuration)
- **Base Model:** openai-community/gpt2
- **Training Data:** [Describe your training data]
- **Use Cases:** [Describe intended use cases]
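
To confirm these details programmatically, the following minimal sketch loads only the model configuration, not the weights (it assumes the `karthikqnq/qnqgpt` checkpoint ID used in the Usage section below):

```python
from transformers import AutoConfig

# Load just the configuration of the checkpoint
config = AutoConfig.from_pretrained("karthikqnq/qnqgpt")

print(config.model_type)                              # expected: "gpt2"
print(config.n_layer, config.n_head, config.n_embd)   # GPT-2 small would be 12, 12, 768
```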

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("karthikqnq/qnqgpt")
tokenizer = AutoTokenizer.from_pretrained("karthikqnq/qnqgpt")

# Generate text from a short prompt
text = "Hello, how are"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```
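
Alternatively, the same checkpoint can be used through the high-level `pipeline` API (a minimal sketch, assuming the repository ID from the snippet above):

```python
from transformers import pipeline

# Text-generation pipeline around the same checkpoint
generator = pipeline("text-generation", model="karthikqnq/qnqgpt")
print(generator("Hello, how are", max_new_tokens=30)[0]["generated_text"])
```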

## Training Details

[Add your training details here]

## Limitations

[Add model limitations here]

## License

This model is released under the MIT License.