Add medical tag #6
by davanstrien
README.md CHANGED
@@ -2,8 +2,10 @@
 license: bigscience-bloom-rail-1.0
 datasets:
 - pubmed
-widget:
-- text:
+widget:
+- text: Photosynthesis is
+tags:
+- medical
 ---
 
 # Model Card for BioMedLM 2.7B
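Note: the `widget` entry above sets the default prompt shown in the Hub's hosted inference widget. For anyone who wants to sanity-check the prompt locally, here is a minimal sketch using `transformers`; the repo id is an assumption based on the card title, not something stated in this diff.

```python
# Minimal sketch: run the widget prompt locally with transformers.
# Assumption: the repo id "stanford-crfm/BioMedLM" matches the model
# card title; copy the exact id from the model page if it differs.
from transformers import pipeline

generator = pipeline("text-generation", model="stanford-crfm/BioMedLM")
print(generator("Photosynthesis is", max_new_tokens=30)[0]["generated_text"])
```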
@@ -145,4 +147,4 @@ BioMedLM 2.7B is a standard GPT-2 implementation (trained with Flash Attention)
 
 ## Compute Infrastructure
 
-The model was trained on [MosaicML Cloud](https://www.mosaicml.com/cloud), a platform designed for large workloads like LLMs. Using the [Composer](https://github.com/mosaicml/composer) training library and [PyTorch FSDP](https://pytorch.org/docs/stable/fsdp.html), it was easy to enable multi-node training across 128 A100-40GB GPUs, and the total run was completed in ~6.25 days.
+The model was trained on [MosaicML Cloud](https://www.mosaicml.com/cloud), a platform designed for large workloads like LLMs. Using the [Composer](https://github.com/mosaicml/composer) training library and [PyTorch FSDP](https://pytorch.org/docs/stable/fsdp.html), it was easy to enable multi-node training across 128 A100-40GB GPUs, and the total run was completed in ~6.25 days.
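For readers curious about the FSDP setup mentioned in the paragraph above, here is a minimal sketch of sharded data-parallel training with plain PyTorch FSDP. It illustrates the mechanism only and is not the authors' actual Composer configuration; the model and optimizer below are placeholders. Per the paragraph above, Composer handled this wiring in the actual run.

```python
# Minimal sketch of sharded data-parallel training with PyTorch FSDP.
# Placeholder model and optimizer; launch with e.g.
#   torchrun --nproc_per_node=8 train.py
import os

import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from transformers import GPT2LMHeadModel

dist.init_process_group(backend="nccl")          # one process per GPU
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)

model = GPT2LMHeadModel.from_pretrained("gpt2")  # stand-in for the 2.7B model
model = FSDP(model.cuda())                       # shards params, grads, optimizer state

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)  # placeholder lr
```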