Not great output, lol
I managed to get it running, but the output leaves much to be desired.
My original experiment was to feed it the first chapter of Neuromancer (14k tokens) to see if it could write a new second chapter. It started by hallucinating about a girl named Candy, complete with cited sources, then rambled about timekeepers and butterflies before getting completely lost.
For the 2nd experiment, I fed it the first couple of pages, a few dozen paragraphs. It immediately hallucinated that the prior text was a journal entry and began an entry about its time in Las Vegas in the 1990s, then another journal entry filled with acknowledgements and copyrights. Amusingly, it began describing a doctor's credentials (MD, PhD, MBA), followed by a 30+ letter acronym for some other credential it made up (CPTDSAFCetc...).
For the 3rd experiment, I fed it the opening line of the book. It actually managed to write a conversation, but it quickly devolved into an encyclopedia acid trip. :laughing:
It likes to smash words together, so I added spaces and apostrophes where I could see they belonged. Enjoy!
@PoVRAZOR : How did you manage to get it running? This is a 4-bit quantized model, so I assume some special library is needed to run it.
Maybe a dumb question (sorry for that), but how is this used in, e.g., ooba or SillyTavern? What settings/templates have to be used?