# Fine-Tuned Falcon 7B - Indian Law

This is a Falcon 7B model, fine-tuned on a dataset pertaining to Indian law.

## Model Description

Falcon is a family of state-of-the-art language models created by the Technology Innovation Institute in Abu Dhabi. The Falcon 7B model was fine-tuned on a dataset specifically curated with questions and answers about Indian law. The questions cover a wide range of legal topics, from the components of the Indian Constitution to the roles of various governmental positions.
## Fine-tuning details

The model was trained with a batch size of 1 and a learning rate of 2e-4 over five epochs. The maximum sequence length was set to 512 tokens, and gradient accumulation was used with a step size of 4. In total, training ran for 250 steps.
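The training script itself is not included in this repository, so the following is only a minimal sketch of how these settings could be reproduced with the Hugging Face `transformers` and `peft` libraries. The base checkpoint (`tiiuae/falcon-7b`), the placeholder dataset name, and the LoRA adapter configuration are assumptions rather than documented choices; only the hyperparameters listed above come from this card.

```python
# Hypothetical fine-tuning sketch. The base checkpoint, dataset name, and LoRA
# settings are placeholders; only the TrainingArguments values match the card.
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base_model = "tiiuae/falcon-7b"
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token  # Falcon has no pad token by default

model = AutoModelForCausalLM.from_pretrained(
    base_model, torch_dtype=torch.bfloat16, device_map="auto", trust_remote_code=True
)

# Lightweight LoRA adapter (assumed; the exact adapter setup is not documented).
model = get_peft_model(
    model,
    LoraConfig(
        r=16, lora_alpha=32, lora_dropout=0.05, bias="none",
        task_type="CAUSAL_LM", target_modules=["query_key_value"],
    ),
)

# "indian-law-qa" is a placeholder for a dataset with a "text" column of Q&A pairs.
dataset = load_dataset("indian-law-qa", split="train")
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),  # 512-token context
    batched=True,
    remove_columns=dataset.column_names,
)

args = TrainingArguments(
    output_dir="falcon-7b-indian-law",
    per_device_train_batch_size=1,   # batch size 1
    gradient_accumulation_steps=4,   # gradient accumulation step size 4
    learning_rate=2e-4,
    num_train_epochs=5,
    max_steps=250,                   # overrides the epoch count in the Trainer: stops at 250 steps
    logging_steps=25,
    bf16=True,
)

Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```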
## Intended uses

This model is intended for answering questions related to Indian law, with possible use cases including legal advice platforms, legal research tools, and educational applications.
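As an illustration, a minimal inference sketch using the `transformers` pipeline is shown below. The repository id and the Question/Answer prompt format are assumptions, since the exact prompt template used during fine-tuning is not documented here.

```python
# Hypothetical usage sketch: load the fine-tuned checkpoint and ask a question
# about Indian law. "your-username/falcon-7b-indian-law" is a placeholder for
# wherever this model is actually hosted.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "your-username/falcon-7b-indian-law"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto", trust_remote_code=True
)

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
prompt = "Question: What is the role of the President of India?\nAnswer:"
print(generator(prompt, max_new_tokens=200, do_sample=False)[0]["generated_text"])
```

Greedy decoding (`do_sample=False`) is used here so answers stay close to the fine-tuning data; sampling parameters can be adjusted for more varied responses.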
## Limitations

While the model has been trained specifically on a dataset about Indian law, it does not have a comprehensive understanding of the legal system or its broader context. The model's responses should be used as a starting point for legal research, not as definitive legal advice.