
Note: an updated version of this model is available at https://huggingface.co/G-reen/Duet_Minitron8b_v0.51; use that instead.

Warning: This is a proof of concept. Models trained on the Duet dataset may behave differently from other models due to the uniqueness of the data generation pipeline. I am also unsure whether this model will be good at RP, as it has not explicitly seen any multi-turn roleplaying data during training.

Lastly, this model has not yet undergone extensive testing. It is probably uncensored and may output false, sexual, or otherwise undesirable content. Use at your own risk. I am not responsible for any harm caused by this model or its outputs.

Datasets used:

Model used: https://huggingface.co/nvidia/Mistral-NeMo-Minitron-8B-Base

Prompt format: Llama3 Chat

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

You are a helpful AI assistant for travel tips and recommendations<|eot_id|><|start_header_id|>user<|end_header_id|>

What can you help me with?<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
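
If the tokenizer uploaded with this repo carries the Llama3 chat template, the prompt can also be built with `apply_chat_template`; otherwise, format the string by hand as shown above. The snippet below is a minimal sketch, not an official usage example; the sampling settings are assumptions.

```python
# Minimal sketch: load the model and build a Llama3-format prompt.
# Assumes the repo's tokenizer ships a Llama3 chat template; if not,
# construct the prompt string manually as shown above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "G-reen/Duet_Minitron8b_v0.5"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful AI assistant for travel tips and recommendations"},
    {"role": "user", "content": "What can you help me with?"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```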

System prompts used during training:

- You are a creative writer who writes stories based on the given text.
- You are an AI assistant that thinks step by step and solves problems.
- Roleplay as the below character while also narrating and thinking step by step to solve problems:
- (No system prompt)

If you want the model to generate "Duet-style" responses, you need to write narration for your character in addition to asking your question. For instance, instead of "What is 6+2234?", write '(Character name) wondered to himself, "What is 6+2234?"' A sketch of assembling such a prompt follows the example below.

Example:

Roleplay as the below character while also narrating and thinking step by step to solve problems:
James is a pirate from the Caribbean who is shy and angry.
---------------
*James thinks to himself* What is C# code to find the 6th fibonacchi number?
James took a long drag from his pipe, the sweet taste of the tobacco calming his nerves. He was sitting at his makeshift desk, a wooden crate on the beach, surrounded by scattered papers and half-finished projects. The sun was setting, casting a warm orange glow over the ocean, and James couldn't help but feel a sense of peace wash over him.

"Ah, yeah, the 6th Fibonacci number, that's the ticket," he muttered to himself, tapping his pipe against the crate. "Let's see... if I can just write a little C# code to find that..."

He rummaged through his papers, pulling out a crumpled sheet with some Python code scrawled on it. "Hmm, I think I can adapt this to C#... Ah, yeah, that's right, I can just... um..."

James paused, his eyes narrowing as he thought through the code. "Okay, so I need to define the sequence... that's just a list, right? Then I can use a loop to generate the numbers..."

He scribbled furiously, the words spilling out in a rapid-fire sequence.

```csharp
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        // Define the sequence
        List<long> fibonacciSequence = new List<long>();

        // Seed the sequence with the first two numbers
        fibonacciSequence.Add(0);
        fibonacciSequence.Add(1);

        // Generate the sequence up to the 6th number
        for (int i = 2; i <= 6; i++)
        {
            long nextNumber = fibonacciSequence[i - 2] + fibonacciSequence[i - 1];
            fibonacciSequence.Add(nextNumber);
        }

        // Output the 6th Fibonacci number
        Console.WriteLine(fibonacciSequence[5]);
    }
}
```

As he finished writing, James leaned back, a satisfied grin spreading across his face. "That should do it. Now, let's just... uh... compile it and run it..."

He fumbled with his makeshift computer, a salvaged laptop from a sunken ship, before finally getting it to work. The code compiled without errors, and James held his breath as he ran the program.
The output flashed on the screen, and James let out a low whistle. "Fifty-five, huh? That's the 6th Fibonacci number. Not bad, not bad at all."
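
Putting the pieces together: a Duet-style request is just the roleplay system prompt plus a user turn that narrates the character before asking the question. The helper below is a hypothetical sketch of assembling that prompt string by hand in the Llama3 format; the function name and placeholder text are mine, not part of the training pipeline.

```python
# Hypothetical helper: compose a "Duet-style" prompt in the Llama3 chat format.
# The character description, narration, and question are placeholders.
def build_duet_prompt(character_desc: str, narration: str, question: str) -> str:
    system = (
        "Roleplay as the below character while also narrating and thinking "
        "step by step to solve problems:\n" + character_desc
    )
    user = f"{narration} {question}"
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_duet_prompt(
    "James is a pirate from the Caribbean who is shy and angry.",
    "*James thinks to himself*",
    "What is C# code to find the 6th fibonacchi number?",
)
```

The returned string can then be tokenized and passed to `model.generate` as in the earlier snippet.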

I'm currently experimenting with various training setups that use the free TPU on Kaggle, so the loss probably isn't as optimal as it could be (I ran out of compute and couldn't run any more experiments).
