---
license: cc-by-nc-4.0
language:
- en
widget:
  - text: |
      Who are you?
    example_title: "Introduction"
---
![tinyviking.jpg](https://huggingface.co/phanerozoic/Tiny-Viking-1.1b-v0.1/resolve/main/tinyviking.jpg)
# TinyViking-1.1B-v0.1

TinyViking-1.1B-v0.1 is a specialized language model for generating Viking-themed content. Developed by phanerozoic, it is fine-tuned from TinyLlama-1.1B-Chat-v1.0 and optimized for environments with limited computing resources.

### Performance
TinyViking is capable of generating engaging Viking narratives, reflecting an understanding of Viking culture. However, it is not designed for general language tasks and may struggle with complex scientific or technical queries.

### Direct Use
Ideal for thematic language generation, particularly in settings like NPCs in games, where fun and thematic engagement are prioritized over detailed factual accuracy.

### Training Data
Trained on "The Saga of Grettir the Strong: Grettir's Saga" to ensure authentic thematic content.

### Custom Stopping Strings
Custom stopping strings are employed to enhance output quality:
- "},"
- "User:"
- "You:"
- "\nUser"
- "\nUser:"
- "me:"
- "user"
- "\n"

### Training Hyperparameters and Fine-Tuning Details
- **Learning Rate**: 2e-5
- **Epochs**: 1
- **Training Duration**: Approximately 5.6 minutes on an RTX 6000 Ada GPU
- **LoRA Rank**: 2048
- **LoRA Alpha**: 4096
- **LoRA Dropout**: 0.05
- **Cutoff Length**: 256
- **Batch Size**: 4 (micro batch size)
- **Warmup Steps**: 8
- **Optimizer**: adamw_torch
- **Gradient Accumulation Steps**: 1
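The LoRA rank and alpha above imply a scaling factor of alpha/r = 4096/2048 = 2 on the low-rank update. A toy sketch of the generic LoRA update rule, W' = W + (alpha/r)·BA, using pure-Python matrices (illustrative only, not the model's training code or actual weight shapes):

```python
# LoRA update sketch: delta_W = (alpha / r) * (B @ A)
# B has shape (d_out, r), A has shape (r, d_in); shapes here are toy values.
def lora_delta(A, B, r=2048, alpha=4096):
    scale = alpha / r  # 2.0 with this model's settings
    return [[scale * sum(B[i][k] * A[k][j] for k in range(len(A)))
             for j in range(len(A[0]))]
            for i in range(len(B))]
```

A rank of 2048 is unusually high for a 1.1B model; with alpha fixed at twice the rank, the adapter's contribution is scaled by a constant factor of 2 regardless of rank.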

### Limitations
Specialized in Viking dialect and narratives, TinyViking is less effective outside its thematic focus.

### Compute Infrastructure
Trained on an RTX 6000 Ada Lovelace GPU.

### Results
Successfully generates Viking-themed responses, maintaining thematic consistency while displaying improved coherence and depth over previous models due to advancements in dataset generation and parsing.

### Summary
TinyViking-1.1B-v0.1 shows an improvement in quality compared to earlier thematic models, thanks to a new dataset generation method that helps to conserve the base model's already tenuous ability to hold a conversation. While it excels in Viking-themed interactions, its specialized focus limits broader application.

### Acknowledgments
Gratitude to the TinyLlama team, whose foundational work was, as always, essential for developing TinyViking.