---
license: gpl-3.0
datasets:
- nomic-ai/gpt4all_prompt_generations
language:
- en
---
# gpt4all-lora

An autoregressive transformer trained on data curated using Atlas. This model was trained for four full epochs, while the related gpt4all-lora-epoch-2 model was trained for three. Replication instructions and data: https://github.com/nomic-ai/gpt4all
## Model Details

### Model Description
- **Developed by:** Nomic AI
- **Model type:** An autoregressive language model based on the transformer architecture.
- **Language(s):** English
- **License:** GPL-3.0
- **Finetuned from:** LLaMA
### Model Sources

- **Repository:** https://github.com/nomic-ai/gpt4all
- **Base model repository:** https://github.com/facebookresearch/llama
- **Technical report:** GPT4All: Training an Assistant-style Chatbot with Large Scale Data Distillation from GPT-3.5-Turbo
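The card lists the adapter's provenance but not how to run it. The sketch below shows one plausible way to load the LoRA adapter on top of LLaMA with Hugging Face `transformers` and `peft`. The Hub id `nomic-ai/gpt4all-lora`, the local base-weights path, and the Alpaca-style prompt template are assumptions for illustration, not details taken from this card; consult the repository linked above for the authoritative replication steps.

```python
# Minimal inference sketch for gpt4all-lora (assumptions noted inline).

# Assumed Alpaca-style instruction template; the actual training format
# is documented in the nomic-ai/gpt4all repository.
PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)


def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the assumed instruction template."""
    return PROMPT_TEMPLATE.format(instruction=instruction)


def load_model(base_weights_path: str, adapter_id: str = "nomic-ai/gpt4all-lora"):
    """Load LLaMA base weights, then apply the LoRA adapter on top.

    `base_weights_path` must point at converted LLaMA weights you are
    licensed to use; `adapter_id` is an assumed Hugging Face Hub id.
    """
    # Imported lazily so the prompt helper works without these packages.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(base_weights_path)
    base = AutoModelForCausalLM.from_pretrained(base_weights_path)
    model = PeftModel.from_pretrained(base, adapter_id)
    return tokenizer, model


if __name__ == "__main__":
    print(build_prompt("Explain LoRA fine-tuning in one sentence."))
```

For deployment, `peft` also supports folding the adapter into the base weights (`merge_and_unload()`), which removes the adapter indirection at inference time.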