
QnQGPT Model

This is a custom GPT model based on the GPT-2 architecture.

Model Details

  • Model Type: GPT-2
  • Base Model: gpt2
  • Parameters: 124M (F32 safetensors)
  • Training Data: [Describe your training data]
  • Use Cases: [Describe intended use cases]
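The 124M-parameter figure can be sanity-checked locally. A minimal sketch, assuming this model keeps the stock GPT-2 small hyperparameters (the `transformers` defaults for `GPT2Config`); it instantiates randomly initialized weights, so no checkpoint download is needed:

```python
from transformers import GPT2Config, GPT2LMHeadModel

# Default GPT2Config matches GPT-2 small: 12 layers, 12 heads, 768 hidden dim
config = GPT2Config()
model = GPT2LMHeadModel(config)  # random weights, architecture only

# parameters() deduplicates the tied input/output embedding, so tied
# weights are counted once
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # ~124M
```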

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("karthikqnq/qnqgpt")
tokenizer = AutoTokenizer.from_pretrained("karthikqnq/qnqgpt")

# Generate text
text = "Hello, how are"
inputs = tokenizer(text, return_tensors="pt")
# max_new_tokens caps the generated continuation; GPT-2 has no pad token,
# so reuse eos to silence the padding warning
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    pad_token_id=tokenizer.eos_token_id,
)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
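The greedy decoding above can be swapped for sampling via `generate()` keyword arguments. A minimal sketch of the relevant knobs, using a randomly initialized GPT-2 small and arbitrary token ids so it runs without downloading the checkpoint (swap in the pretrained model and real tokenizer output for meaningful text):

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

torch.manual_seed(0)
model = GPT2LMHeadModel(GPT2Config())  # random weights; stand-in for the checkpoint
model.eval()

# Arbitrary ids from the GPT-2 vocabulary, standing in for tokenizer output
input_ids = torch.randint(0, 50257, (1, 4))

outputs = model.generate(
    input_ids,
    max_new_tokens=20,   # cap on newly generated tokens
    do_sample=True,      # sample instead of greedy argmax decoding
    top_p=0.9,           # nucleus sampling: keep the smallest set of tokens with p >= 0.9
    temperature=0.8,     # < 1.0 sharpens the distribution
    pad_token_id=50256,  # GPT-2 eos id, reused as pad
)
print(outputs.shape)  # (1, prompt length + up to 20 new tokens)
```

Lower `temperature` and `top_p` make output more conservative; raising them increases diversity at the cost of coherence.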

Training Details

[Add your training details here]

Limitations

[Add model limitations here]

License

This model is released under the MIT License.
