more doc
docs/README.md (CHANGED, +8 -1)
@@ -41,4 +41,11 @@ set these in `code/config.yaml`:
 * ``["embedding_options"]["expand_urls"]`` - If set to True, gets and reads the data from all the links under the provided URL. If set to False, only reads the data at the provided URL.
 * ``["embedding_options"]["search_top_k"]`` - Number of sources that the retriever returns
 * ``["llm_params"]["use_history"]`` - Whether to use history in the prompt or not
-* ``["llm_params"]["memory_window"]`` - Number of interactions to keep track of in the history
+* ``["llm_params"]["memory_window"]`` - Number of interactions to keep track of in the history
+
+
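The options above live in ``code/config.yaml`` (per the hunk header). A minimal sketch of reading them in Python; only the four option names and the file path come from this README, the surrounding file layout is assumed:

```python
# Sketch: load the documented options from code/config.yaml.
import yaml

with open("code/config.yaml") as f:
    config = yaml.safe_load(f)

expand_urls = config["embedding_options"]["expand_urls"]    # bool: follow links under the provided URL
search_top_k = config["embedding_options"]["search_top_k"]  # int: number of sources the retriever returns
use_history = config["llm_params"]["use_history"]           # bool: include chat history in the prompt
memory_window = config["llm_params"]["memory_window"]       # int: interactions kept in history
```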
+## LlamaCpp
+* https://python.langchain.com/docs/integrations/llms/llamacpp
+
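For reference, a minimal sketch of the LlamaCpp integration linked above (requires ``llama-cpp-python``); the model path and parameter values are placeholders, not taken from this repo's config:

```python
# Sketch: run a local .gguf model through LangChain's LlamaCpp wrapper.
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf",  # hypothetical local .gguf path
    n_ctx=2048,       # context window size
    temperature=0.7,
    verbose=False,
)

print(llm.invoke("What does the memory_window option control?"))
```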
+## Hugging Face Models
+* Download the ``.gguf`` files for your Local LLM from Hugging Face (Example: https://huggingface.co/TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF)
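A short sketch of fetching a ``.gguf`` file from the example repo with ``huggingface_hub``; the quantization filename below is an assumption, so check the repo's file listing for the variant you want:

```python
# Sketch: download one quantized .gguf file from the example repo.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF",
    filename="tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf",  # assumed quantization variant
    local_dir="models",
)
print(model_path)  # pass this path to LlamaCpp's model_path
```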