phanerozoic committed on
Commit
96e8bd8
1 Parent(s): e07e01e

Update README.md

Files changed (1)
  1. README.md +7 -2
README.md CHANGED
@@ -1,3 +1,9 @@
+---
+language:
+- en
+tags:
+- text-generation-inference
+---
 license: cc-by-4.0
 
 # Mistral-Astronomy-7b-v0.2
@@ -65,5 +71,4 @@ To optimize the model's output, custom stopping strings have been employed. Thes
 The training was conducted on an RTX 6000 Ada GPU, taking approximately 40 minutes. This setup ensured efficient processing of the expanded dataset, contributing to the model's enhanced performance.
 
 ## Acknowledgments and Attribution
-Special appreciation is extended to OpenStax for "Astronomy 2e," which forms the core of the training material. The model's development is also indebted to the foundational work of the Mistral and OpenHermes 2.5 teams. This work is based on "Astronomy 2e" by OpenStax, licensed under [Creative Commons Attribution 4.0 International License (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/), with modifications for language modeling purposes. The endorsement by OpenStax or the original authors is not implied.
-
+Special appreciation is extended to OpenStax for "Astronomy 2e," which forms the core of the training material. The model's development is also indebted to the foundational work of the Mistral and OpenHermes 2.5 teams. This work is based on "Astronomy 2e" by OpenStax, licensed under [Creative Commons Attribution 4.0 International License (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/), with modifications for language modeling purposes. The endorsement by OpenStax or the original authors is not implied.
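For context, the Hugging Face Hub reads model-card metadata (license, language, tags) only from the YAML block delimited by `---` at the very top of README.md. After this commit, `license: cc-by-4.0` sits below the closing `---`, so it would render as body text rather than parsed metadata. A sketch of how the header would look with the license key inside the block (an assumption about the intended result, not part of this commit):

```yaml
---
# All keys inside the leading --- block are parsed as card metadata
license: cc-by-4.0
language:
- en
tags:
- text-generation-inference
---

# Mistral-Astronomy-7b-v0.2
```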