# Gemma2-2B-it Merged Fine-Tuned Models for Chinese & German Understanding

A lightweight language model based on Gemma2 2B, created by merging multiple fine-tuned Gemma2-2B-it versions to test multilingual conversation capabilities in specialized, low-parameter language models.

## 🤏 Models Merged

This is a merge of pre-trained language models created with [mergekit](https://github.com/arcee-ai/mergekit), using the Model Stock merge method with google/gemma-2-2b-it as the base.

The following models were included in the merge:

* [VAGOsolutions/SauerkrautLM-gemma-2-2b-it](https://huggingface.co/VAGOsolutions/SauerkrautLM-gemma-2-2b-it) (German fine-tune)
* [stvlynn/Gemma-2-2b-Chinese-it](https://huggingface.co/stvlynn/Gemma-2-2b-Chinese-it) (Chinese fine-tune)
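For intuition, here is a rough per-layer sketch of the Model Stock rule for exactly the two fine-tuned checkpoints above, based on the Model Stock paper; mergekit's `model_stock` implementation is the reference and also handles more than two models.

```python
import torch
import torch.nn.functional as F

def model_stock_layer(w0: torch.Tensor, w1: torch.Tensor, w2: torch.Tensor) -> torch.Tensor:
    """Rough per-layer sketch of Model Stock with two fine-tuned models.

    w0: pre-trained (base) weight; w1, w2: the corresponding fine-tuned weights.
    """
    d1, d2 = (w1 - w0).flatten(), (w2 - w0).flatten()
    cos = F.cosine_similarity(d1, d2, dim=0)  # angle between the two task vectors
    t = 2 * cos / (1 + cos)                   # interpolation ratio derived from that angle
    w_avg = (w1 + w2) / 2                     # average of the fine-tuned weights
    return t * w_avg + (1 - t) * w0           # pull the average back toward the base weights
```

The closer the two task vectors point in the same direction (cos → 1), the more the merged layer keeps of the fine-tuned average; as they approach orthogonality (cos → 0), the result falls back toward the base weights.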

## 🧩 Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: google/gemma-2-2b-it
  - model: VAGOsolutions/SauerkrautLM-gemma-2-2b-it
  - model: stvlynn/Gemma-2-2b-Chinese-it
merge_method: model_stock
base_model: google/gemma-2-2b-it
dtype: bfloat16
```
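To reproduce the merge, a configuration like this can be saved to a file and passed to mergekit's `mergekit-yaml` CLI. A minimal sketch, assuming mergekit is installed via `pip install mergekit` (the file name and output directory are illustrative, not part of the model card):

```python
# Sketch: invoke the mergekit-yaml CLI on the configuration above.
import subprocess

subprocess.run(
    ["mergekit-yaml", "merge-config.yaml", "./gemma2-2b-it-chinese-german", "--cuda"],
    check=True,
)
```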

## 💻 Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load the tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("AdamLucek/gemma2-2b-it-chinese-german")
model = AutoModelForCausalLM.from_pretrained(
    "AdamLucek/gemma2-2b-it-chinese-german",
    device_map="cuda",
    torch_dtype=torch.bfloat16
)

# Prepare the input text (Chinese prompt asking the model to explain quantum
# superposition and illustrate its practical importance and challenges with an example)
input_text = "请解释一下量子力学中的叠加原理,并举例说明该原理在实际应用中的重要性和挑战。"
input_ids = tokenizer(input_text, return_tensors="pt").to("cuda")

# Generate the output
outputs = model.generate(
    **input_ids,
    max_new_tokens=256,
    pad_token_id=tokenizer.eos_token_id
)

# Decode and print the generated text
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

### Output

```
## 量子叠加原理:

**叠加原理**是量子力学中一个重要的概念,它描述了量子系统在测量之前处于多个状态的可能性。

**简单来说,就是说,一个量子系统可以同时处于多个状态,直到我们测量它时,才会坍缩到一个确定的状态。**

**具体来说,我们可以用以下方式理解叠加原理:**

* **量子系统:** 比如一个原子,它可以处于多个能量状态。
* **叠加态:**  表示量子系统同时处于多个状态的概率分布。
* **测量:**  当我们测量量子系统时,它会坍缩到一个确定的状态。
* **坍缩:**  测量过程会改变量子系统的状态,使其坍缩到一个确定的状态。

**举例说明:**

想象一下一个量子系统,它可以处于两个状态:上或下。这个系统可以被描述为一个叠加态,表示它同时处于上和下两个状态的概率分布。

**如果我们没有测量这个系统,那么它就处于叠加态,同时处于上和下两个状态。**

**但是,当我们测量这个系统时
```
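The response above is the model's Chinese explanation of quantum superposition, cut off at the `max_new_tokens=256` limit. Since the merge is built on an instruction-tuned (`-it`) base, prompts can also be wrapped in the Gemma chat template. Below is a minimal sketch reusing the `tokenizer` and `model` loaded above; the German prompt is only an illustrative example, not taken from the model card.

```python
# Sketch: chat-template formatting with the tokenizer/model loaded in the Usage section.
messages = [
    # German: "Please briefly explain the superposition principle of quantum mechanics."
    {"role": "user", "content": "Erkläre bitte kurz das Superpositionsprinzip der Quantenmechanik."}
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,  # append the assistant turn so the model starts replying
    return_tensors="pt",
).to("cuda")

outputs = model.generate(input_ids, max_new_tokens=256, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```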