karthikqnq committed
Commit 8d20180
1 Parent(s): 4d42a34

Update README.md

Files changed (1): README.md (+48 -46)
README.md CHANGED
---
language: en
tags:
- gpt2
- text-generation
license: mit
base_model:
- openai-community/gpt2
---

# QnQGPT Model

QnQGPT is a custom GPT model based on the GPT-2 architecture.

## Model Details

- Model Type: GPT-2
- Base Model: gpt2
- Training Data: [Describe your training data]
- Use Cases: [Describe intended use cases]

## Usage

Load the model and tokenizer with the Hugging Face Transformers library:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("karthikqnq/qnqgpt")
tokenizer = AutoTokenizer.from_pretrained("karthikqnq/qnqgpt")

# Generate text from a short prompt
text = "Hello, how are"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2-style models have no dedicated pad token
)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```
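
For quick experiments, the same checkpoint should also work through the `pipeline` API. This is a minimal sketch: the prompt is taken from the example above, while the sampling settings (`max_new_tokens`, `temperature`, `top_p`) are illustrative assumptions rather than values from this model card.

```python
from transformers import pipeline

# Text-generation pipeline backed by the same checkpoint
generator = pipeline("text-generation", model="karthikqnq/qnqgpt")

# Sampling settings are illustrative assumptions, not tuned recommendations
output = generator(
    "Hello, how are",
    max_new_tokens=40,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(output[0]["generated_text"])
```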

## Training Details

[Add your training details here]

## Limitations

[Add model limitations here]

## License

This model is released under the MIT License.