---
license: apache-2.0
widget:
- text: |-
    <A>We should have a pet. <end>
    <B>I don't think so. <end>
    <A>Why not? <end>
    <B>Because pets make a mess. <end>
    <A>But dogs are cute! <end>
    <B>Cats are cute too. <end>
    <A>We can get a cat then. <end>
    <B>
  example_title: Sample 1
datasets:
- raincandy-u/TinyChat
pipeline_tag: text-generation
---
# raincandy-u/TinyChat-1776K
A tiny LM trained from scratch on the TinyChat dataset.
The aim is to achieve natural responses with the smallest possible model. It was trained on a dataset of English conversations at roughly the level of a three-year-old child.
Note: the model has no world knowledge, so you should not ask it factual or intellectual questions.
## Model Spec
```
from transformers import AutoConfig

# Llama-architecture configuration for the ~1,776K-parameter model
config = AutoConfig.for_model(
    model_type="llama",
    hidden_size=192,
    intermediate_size=640,
    num_attention_heads=16,
    num_hidden_layers=3,
    num_key_value_heads=4,
    tie_word_embeddings=True,
    vocab_size=2048,
    max_position_embeddings=256
)
```
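For reference, a minimal sketch of turning this config into a randomly initialized model with the standard Transformers API. This is an assumption about how such a checkpoint could be created, not the exact training script used for this model.
```
from transformers import AutoModelForCausalLM

# `config` is the AutoConfig object defined above.
# from_config builds a freshly initialized model; training happens separately.
model = AutoModelForCausalLM.from_config(config)
print(f"{model.num_parameters():,} parameters")
```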
## Template
```
<A>Hi, Tom. How are you? <end>
<B>I'm fine, thank you. And you? <end>
<A>Fine. What's your favorite color? <end>
<B>My favorite color is black. <end>
<A>Do you like cats? <end>
<B>
```
Example output:
```
Yes, I do. I like it too. They are good for me.
```
## Generation Parameters
```
top_k=40,
top_p=0.8,
temperature=1
```
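Putting the template and parameters together, a minimal inference sketch with Transformers. It assumes the checkpoint and tokenizer load directly from the `raincandy-u/TinyChat-1776K` repo; the newline separation between turns and `max_new_tokens=64` are assumptions, not values specified by the card.
```
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "raincandy-u/TinyChat-1776K"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build a prompt in the <A>/<B> template, ending with "<B>" so the model replies as B.
prompt = (
    "<A>Hi, Tom. How are you? <end>\n"
    "<B>I'm fine, thank you. And you? <end>\n"
    "<A>Fine. What's your favorite color? <end>\n"
    "<B>My favorite color is black. <end>\n"
    "<A>Do you like cats? <end>\n"
    "<B>"
)

inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(
    **inputs,
    do_sample=True,
    top_k=40,
    top_p=0.8,
    temperature=1.0,
    max_new_tokens=64,
)
# Print only the newly generated continuation, not the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```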