BramVanroy committed on
Commit
592aa01
•
1 Parent(s): fee8c8d

Update README.md

Files changed (1)
  1. README.md +11 -11
README.md CHANGED
@@ -1,6 +1,6 @@
 ---
 license: mit
-base_model: BramVanroy/fietje-2b
+base_model: BramVanroy/fietje-2
 tags:
 - trl
 - fietje
@@ -11,7 +11,7 @@ datasets:
 - BramVanroy/no_robots_dutch
 - BramVanroy/belebele_dutch
 model-index:
-- name: fietje-2b-instruct
+- name: fietje-2-instruct
   results: []
 pipeline_tag: text-generation
 inference: false
@@ -20,27 +20,27 @@ language:
 ---
 
 <p align="center" style="margin:0;padding:0">
-  <img src="https://huggingface.co/BramVanroy/fietje-2b-instruct/resolve/main/img/fietje-2b-banner-rounded.png" alt="Fietje banner" width="800" style="margin-left:auto; margin-right:auto; display:block"/>
+  <img src="https://huggingface.co/BramVanroy/fietje-2-instruct/resolve/main/img/fietje-2-banner-rounded.png" alt="Fietje banner" width="800" style="margin-left:auto; margin-right:auto; display:block"/>
 </p>
 
 <div style="margin:auto; text-align:center">
-  <h1 style="margin-bottom: 0">Fietje 2B Instruct</h1>
+  <h1 style="margin-bottom: 0">Fietje 2 Instruct</h1>
   <em>An open and efficient LLM for Dutch</em>
 </div>
 
 <blockquote class="tip" style="padding: 1.5em; border: 0">
   <p align="center" style="text-align: center; margin: 0">
-    <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b">👱‍♀️ Base version</a> -
-    <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b-instruct">🤖 Instruct version</a> (this one) -
-    <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b-chat">💬 Chat version</a> -
-    <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b-chat-GGUF">🚀 GGUF of Instruct</a>
+    <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2">👱‍♀️ Base version</a> -
+    <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2-instruct">🤖 Instruct version</a> (this one) -
+    <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2-chat">💬 Chat version</a> -
+    <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2-chat-GGUF">🚀 GGUF of Instruct</a>
   </p>
   <p align="center" style="text-align: center; margin: 0">
-    <a href="https://huggingface.co/spaces/BramVanroy/fietje-2b"><strong>Chat with Fietje here!</strong></a>
+    <a href="https://huggingface.co/spaces/BramVanroy/fietje-2"><strong>Chat with Fietje here!</strong></a>
   </p>
 </blockquote>
 
-This is the instruct version of Fietje, an SFT-tuned (instruction-tuned) variant of [the base model](https://huggingface.co/BramVanroy/fietje-2b). Fietje is an adapted version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2), tailored to Dutch text generation by training on 28B tokens. It is small and efficient, at 2.7 billion parameters, yet performs almost on par with more powerful Dutch LLMs of twice its size, such as [GEITje 7B Ultra](https://huggingface.co/BramVanroy/GEITje-7B-ultra).
+This is the instruct version of Fietje, an SFT-tuned (instruction-tuned) variant of [the base model](https://huggingface.co/BramVanroy/fietje-2). Fietje is an adapted version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2), tailored to Dutch text generation by training on 28B tokens. It is small and efficient, at 2.7 billion parameters, yet performs almost on par with more powerful Dutch LLMs of twice its size, such as [GEITje 7B Ultra](https://huggingface.co/BramVanroy/GEITje-7B-ultra).
 
 A thorough description of the creation and evaluation of Fietje, as well as usage examples, is available in [this GitHub repository](https://github.com/BramVanroy/fietje).
 
@@ -50,7 +50,7 @@ The same limitations as [phi-2](https://huggingface.co/microsoft/phi-2#limitatio
 
 ## Training and evaluation data
 
-Fietje 2B Instruct was fine-tuned from [the base model](https://huggingface.co/BramVanroy/fietje-2b) on the following datasets. The number of training samples per dataset is given in brackets, totalling 201,579 samples.
+Fietje 2 Instruct was fine-tuned from [the base model](https://huggingface.co/BramVanroy/fietje-2) on the following datasets. The number of training samples per dataset is given in brackets, totalling 201,579 samples.
 
 - [BramVanroy/ultrachat_200k_dutch](https://huggingface.co/datasets/BramVanroy/ultrachat_200k_dutch): gpt-4-1106-preview; multi-turn; fully generated (192,598)
 - [BramVanroy/no_robots_dutch](https://huggingface.co/datasets/BramVanroy/no_robots_dutch): gpt-4-1106-preview; prompt translated, answer generated; some items have system messages (8,181)
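The card itself defers to the Fietje GitHub repository for usage examples. As a quick orientation only, the sketch below shows one way the renamed instruct model could be run with Hugging Face Transformers; it is not part of this commit, and the assumption that the tokenizer ships a chat template (typical for trl SFT models) should be verified against the official examples.

```python
# Hypothetical usage sketch, not taken from the model card or this commit.
# Assumes the renamed repo id BramVanroy/fietje-2-instruct and a tokenizer
# that provides a chat template; see https://github.com/BramVanroy/fietje
# for the author's own usage examples.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BramVanroy/fietje-2-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Wat is de hoofdstad van Nederland?"},
]
# Build the prompt from the chat template and generate a short answer.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```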