
Inference notebook: Sejong-Qwen-v5_inference.ipynb (Open in Colab)

Usage:

!pip install transformers einops accelerate
!pip install qwen
!pip install unsloth

from transformers import AutoTokenizer, AutoModelForCausalLM

# ν† ν¬λ‚˜μ΄μ €μ™€ λͺ¨λΈ λ‘œλ“œ
tokenizer = AutoTokenizer.from_pretrained(
    "SejongKRX/Sejong-Qwen-v5",
    trust_remote_code=True,
    use_fast=False
)
model = AutoModelForCausalLM.from_pretrained(
    "SejongKRX/Sejong-Qwen-v5",
    trust_remote_code=True
)

# μž…λ ₯ ν…μŠ€νŠΈ
input_text =  """
λ‹€μŒ 쀑 ν™”νμ˜ μ‹œκ°„κ°€μΉ˜μ— κ΄€ν•œ μ„€λͺ…μœΌλ‘œ μ˜³μ§€ μ•Šμ€ 것은 무엇인가?

A. μ›” 볡리의 경우, 맀월 μ μš©λ˜λŠ” μ΄μžμœ¨μ€ μ—°κ°„ λͺ…λͺ© μ΄μžμœ¨μ„ 1/12둜 λ‚˜λˆ„μ–΄ μ‚°μΆœν•œλ‹€.
B. 투자 μ›κΈˆ 및 기타 쑰건이 동일할 경우, 단리 방식보닀 볡리 λ°©μ‹μ—μ„œ λ°œμƒν•˜λŠ” μ΄μžκ°€ 더 크닀.
C. μΌμ‹œλΆˆλ‘œ 지급될 κΈˆμ•‘μ˜ ν˜„μž¬ κ°€μΉ˜λŠ” 미래 κ°€μΉ˜λ₯Ό 일정 κΈ°κ°„ λ™μ•ˆ ν• μΈμœ¨μ„ μ μš©ν•΄ μ‚°μΆœν•  수 μžˆλ‹€.
D. 1,000,000원을 μ—° 5% 볡리둜 2λ…„ λ™μ•ˆ μ˜ˆμΉ˜ν–ˆμ„ 경우, λ§ŒκΈ°μ— 받을 μ„Έμ „ μ΄μžλŠ” 100,000원이닀.

### μ •λ‹΅:
"""

inputs = tokenizer(input_text, return_tensors="pt")

# λͺ¨λΈμ„ μ‚¬μš©ν•˜μ—¬ ν…μŠ€νŠΈ 생성
output = model.generate(**inputs, max_new_tokens=1500)

# Decode the result
generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
print(generated_text)

Output:

λ‹€μŒ 쀑 ν™”νμ˜ μ‹œκ°„κ°€μΉ˜μ— κ΄€ν•œ μ„€λͺ…μœΌλ‘œ μ˜³μ§€ μ•Šμ€ 것은 무엇인가?

A. μ›” 볡리의 경우, 맀월 μ μš©λ˜λŠ” μ΄μžμœ¨μ€ μ—°κ°„ λͺ…λͺ© μ΄μžμœ¨μ„ 1/12둜 λ‚˜λˆ„μ–΄ μ‚°μΆœν•œλ‹€.
B. 투자 μ›κΈˆ 및 기타 쑰건이 동일할 경우, 단리 방식보닀 볡리 λ°©μ‹μ—μ„œ λ°œμƒν•˜λŠ” μ΄μžκ°€ 더 크닀.
C. μΌμ‹œλΆˆλ‘œ 지급될 κΈˆμ•‘μ˜ ν˜„μž¬ κ°€μΉ˜λŠ” 미래 κ°€μΉ˜λ₯Ό 일정 κΈ°κ°„ λ™μ•ˆ ν• μΈμœ¨μ„ μ μš©ν•΄ μ‚°μΆœν•  수 μžˆλ‹€.
D. 1,000,000원을 μ—° 5% 볡리둜 2λ…„ λ™μ•ˆ μ˜ˆμΉ˜ν–ˆμ„ 경우, λ§ŒκΈ°μ— 받을 μ„Έμ „ μ΄μžλŠ” 100,000원이닀.

### μ •λ‹΅:
D
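
Note that model.generate returns the prompt tokens followed by the completion, so the decoded string above echoes the full question before the answer ("D"). To print only the newly generated portion, you can slice off the prompt tokens before decoding; the following is a minimal sketch reusing the variables from the usage snippet above:

# Decode only the tokens generated after the prompt
prompt_length = inputs["input_ids"].shape[1]
new_tokens = output[0][prompt_length:]
answer = tokenizer.decode(new_tokens, skip_special_tokens=True)
print(answer.strip())  # e.g. "D"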

Dataset

λ³Έ λͺ¨λΈμ€ λ‹€μ–‘ν•œ 좜처의 데이터(Wikipedia 및 ν•œκ΅­μ€ν–‰μ˜ 곡곡 데이터)λ₯Ό ν™œμš©ν•˜μ—¬ ν•™μŠ΅λ˜μ—ˆμœΌλ©°, λͺ¨λ“  λ°μ΄ν„°λŠ” μ €μž‘κΆŒ 및 μ‚¬μš© 정책에 따라 적절히 μ‚¬μš©λ˜μ—ˆμŠ΅λ‹ˆλ‹€.

  • Wikipedia λ°μ΄ν„°λŠ” CC BY-SA 4.0 λΌμ΄μ„ μŠ€λ₯Ό λ”°λ¦…λ‹ˆλ‹€. μžμ„Έν•œ μ •λ³΄λŠ” μ—¬κΈ°μ—μ„œ 확인할 수 μžˆμŠ΅λ‹ˆλ‹€.
  • ν•œκ΅­μ€ν–‰μ˜ λ°μ΄ν„°λŠ” ν•œκ΅­μ€ν–‰μ˜ μ €μž‘κΆŒ λ³΄ν˜Έλ°©μΉ¨μ— 따라 μ‚¬μš©λ˜μ—ˆμŠ΅λ‹ˆλ‹€.

Uploaded model

  • Developed by: SejongKRX
  • License: apache-2.0
  • Finetuned from model : unsloth/qwen2.5-7b-bnb-4bit

This Qwen2 model was trained 2x faster with Unsloth and Hugging Face's TRL library.
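
Since Unsloth is already installed in the usage commands above, the checkpoint can also be loaded through Unsloth's FastLanguageModel wrapper for faster, lower-memory 4-bit inference. The following is a minimal sketch, not part of the original card, assuming the standard Unsloth API; values such as max_seq_length are illustrative:

from unsloth import FastLanguageModel

# Load the fine-tuned checkpoint in 4-bit via Unsloth (illustrative settings)
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="SejongKRX/Sejong-Qwen-v5",
    max_seq_length=2048,  # illustrative value; adjust to your use case
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # enable Unsloth's faster inference path

inputs = tokenizer(input_text, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=1500)
print(tokenizer.decode(output[0], skip_special_tokens=True))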
