---
license: other
---

WizardLM-7B with 10k+ context using Landmark Attention.

Model generated using Landmark-Attention-QLoRA: https://github.com/eugenepentland/landmark-attention-qlora

A merge of the following models:

- https://huggingface.co/TheBloke/wizardLM-7B-HF
- https://huggingface.co/eugenepentland/WizardLM-7B-Landmark-Attention-QLoRA
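
The merge itself is the standard step of folding a LoRA adapter into its base model. Below is a minimal sketch of how that can be reproduced with `peft`; the repo ids are taken from the links above, but the exact merge settings used for this model are an assumption.

```python
# Sketch: merging the Landmark-Attention QLoRA adapter into the base model.
# Assumes the adapter is a standard PEFT LoRA adapter; actual settings may differ.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("TheBloke/wizardLM-7B-HF")
adapter = PeftModel.from_pretrained(
    base, "eugenepentland/WizardLM-7B-Landmark-Attention-QLoRA"
)

# Fold the LoRA weights into the base model and save the standalone result.
merged = adapter.merge_and_unload()
merged.save_pretrained("WizardLM-7B-Landmark-Merged")
```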

Can be loaded using oobabooga's text-generation-webui; make sure the --trust-remote-code option is enabled for it to function.
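
The same flag applies when loading the model directly with `transformers`. A minimal sketch (the `model_id` below is a placeholder for this repository's id):

```python
# Sketch: loading the merged model with transformers.
# trust_remote_code=True is required because Landmark Attention ships custom modeling code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "eugenepentland/WizardLM-7B-Landmark-Attention"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    device_map="auto",  # requires the accelerate package
)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=64)[0]))
```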