Update license info to Apache 2.0
README.md
CHANGED
@@ -4,7 +4,7 @@ license: apache-2.0
 
 # 🚀 Falcon-7B
 
-**Falcon-7B is a 7B parameters causal decoder-only model built by [TII](https://www.tii.ae) and trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. It is made available under the TII Falcon LLM License.**
+**Falcon-7B is a 7B parameters causal decoder-only model built by [TII](https://www.tii.ae) and trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. It is made available under the Apache 2.0 license.**
 
 *Paper coming soon* 😊.
 
@@ -12,7 +12,7 @@ license: apache-2.0
 
 * **It outperforms comparable open-source models** (e.g., [MPT-7B](https://huggingface.co/mosaicml/mpt-7b), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1) etc.), thanks to being trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
 * **It features an architecture optimized for inference**, with FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135)) and multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)).
-* **It is made available under a license allowing commercial use**, see the details of the TII Falcon LLM License below.
+* **It is made available under a permissive Apache 2.0 license allowing for commercial use**, without any royalties or restrictions.
 
 ⚠️ **This is a raw, pretrained model, which should be further finetuned for most use cases.** If you are looking for a version better suited to taking generic instructions in a chat format, we recommend taking a look at [Falcon-7B-Instruct](https://huggingface.co/tiiuae/falcon-7b-instruct).
 
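The second bullet in the hunk above credits multiquery attention ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)) as one of Falcon's inference optimizations. As a rough illustration of the idea only (not Falcon's actual implementation; the 71-head/64-dim shapes reflect Falcon-7B's reported config, and causal masking is omitted for brevity), a minimal PyTorch sketch:

```python
import torch

def multiquery_attention(q, k, v):
    # Multiquery attention: all query heads share a single key/value head,
    # which shrinks the KV cache roughly n_heads-fold at inference time.
    # q: [batch, n_heads, seq, head_dim]; k, v: [batch, 1, seq, head_dim].
    # Causal masking is omitted here for brevity.
    scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
    return torch.softmax(scores, dim=-1) @ v  # shared KV head broadcasts over heads

batch, n_heads, seq, head_dim = 1, 71, 8, 64  # Falcon-7B reportedly uses 71 heads of dim 64
q = torch.randn(batch, n_heads, seq, head_dim)
k = torch.randn(batch, 1, seq, head_dim)  # one shared key head
v = torch.randn(batch, 1, seq, head_dim)  # one shared value head
print(multiquery_attention(q, k, v).shape)  # torch.Size([1, 71, 8, 64])
```

Note how the single key/value head broadcasts against all 71 query heads: that asymmetry is the whole trick, trading a little model quality for a much smaller memory footprint during autoregressive decoding.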
@@ -59,7 +59,7 @@ for seq in sequences:
 
 - **Developed by:** [https://www.tii.ae](https://www.tii.ae);
 - **Model type:** Causal decoder-only;
 - **Language(s) (NLP):** English and French;
-- **License:** TII Falcon LLM License;
+- **License:** Apache 2.0.
 
 ### Model Source
 
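For context, the `@@ -59,7 +59,7 @@` hunk above anchors on the model card's Python generation example (`for seq in sequences:`). A sketch along the lines of that example, using the standard `transformers` pipeline API; the prompt and sampling parameters here are illustrative rather than quoted verbatim from the card:

```python
import torch
import transformers
from transformers import AutoTokenizer

model = "tiiuae/falcon-7b"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,  # bf16 keeps the 7B weights within a single large GPU
    trust_remote_code=True,      # Falcon shipped custom modeling code at release
    device_map="auto",           # requires the `accelerate` package
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes.",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```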
@@ -198,11 +198,7 @@ Falcon-7B was trained a custom distributed training codebase, Gigatron. It uses
 
 ## License
 
-Falcon-7B is made available under the TII Falcon LLM License:
-* You can freely use our models for research and/or personal purposes;
-* You are allowed to share and build derivatives of these models, but you are required to give attribution and to share-alike with the same license;
-* For commercial use, you are exempt from royalty payments if the attributable revenues are below $1M/year; otherwise, you should enter into a commercial agreement with TII.
-
+Falcon-7B is made available under the Apache 2.0 license.
 
 ## Contact
 falconllm@tii.ae
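A quick way to confirm the relicensing took effect is to read the `license` field from the card's YAML front matter; a minimal sketch, assuming the `huggingface_hub` `ModelCard` API:

```python
from huggingface_hub import ModelCard

# Fetch the README from the Hub and parse its YAML front matter.
card = ModelCard.load("tiiuae/falcon-7b")
print(card.data.license)  # "apache-2.0" once this commit is live
```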