
Viking 7B

Viking 7B is a 7B parameter decoder-only transformer pretrained on Finnish, English, Swedish, Danish, Norwegian, Icelandic and code. It has been trained on 2 trillion tokens. Viking 7B is a fully open source model and is made available under the Apache 2.0 License.

Viking was created in a collaboration between the TurkuNLP group of the University of Turku, SiloGen from Silo AI, and High Performance Language Technologies (HPLT). Training was conducted on the LUMI supercomputer, using compute resources generously provided by CSC - IT Center for Science, Finland.

This project is part of an ongoing effort to create open source large language models for non-English and especially low-resource languages like Finnish. The model is fluent in Finnish, English, and the Scandinavian languages, and is capable of basic translation between them. It is also able to understand and generate code.

Model Family

Viking is the second set of models released by LumiOpen and is available in three parameter counts:

Viking 7B

Viking 13B

Viking 33B

Model Overview

NOTE: Viking is a base model which needs further fine-tuning for most use cases.

Viking is a generative pretrained transformer using a LLaMA-like GPT architecture, and makes use of rotary positional embeddings and flash attention.

Hyperparameter Value
n_parameters 7.55B
n_layers 32
n_heads 32
d_model 4096
vocab_size 131072
sequence_length 4096
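
For illustration, the architecture hyperparameters above map onto a LLaMA-style configuration roughly as follows. This is a sketch only: the intermediate (MLP) size is an assumption and is not stated in this card.

from transformers import LlamaConfig

# Sketch of a LLaMA-style config matching the overview table above.
# intermediate_size is an assumption; it is not listed in this card.
config = LlamaConfig(
    vocab_size=131072,             # vocab_size
    hidden_size=4096,              # d_model
    num_hidden_layers=32,          # n_layers
    num_attention_heads=32,        # n_heads
    max_position_embeddings=4096,  # sequence_length
    intermediate_size=11008,       # assumed MLP width
)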

Training

Viking 7B was trained on the LUMI supercomputer, using 256 AMD MI250X GPUs. Each MI250X GPU has two Graphics Complex Dies (GCDs) for a world size of 512 during training, using activation checkpointing, a micro batch size of 1, gradient accumulation of 16, and a 3D parallelism strategy of TP=1, PP=4, DP=128.
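
As a rough sanity check of this layout (a sketch, not taken from the training code), the parallelism degrees multiply out to the stated world size:

# 256 MI250X GPUs x 2 GCDs each = 512 workers (world size)
world_size = 256 * 2
tp, pp = 1, 4                 # tensor and pipeline parallel degrees
dp = world_size // (tp * pp)  # data parallel degree
assert (world_size, dp) == (512, 128)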

Training began in September 2023 using a custom fork of the Megatron-Deepspeed framework.

Training Hyperparameters

Hyperparameter Value Comment
Precision bfloat16
Optimizer AdamW
Learning rate 3e-4 10B tokens warm-up, cosine decay to 3e-5
Weight decay 1e-1
Batch size 1024 1024 samples x 4096 tokens = 4194304 tokens
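
As an illustrative sketch only (the actual run used the Megatron-DeepSpeed fork, not this code), the schedule above, warm-up over the first 10B tokens to a peak of 3e-4 followed by cosine decay to 3e-5, could be written with standard PyTorch pieces along these lines; step counts are derived from the 4,194,304-token batches in the table.

import math
import torch

model = torch.nn.Linear(8, 8)  # placeholder standing in for the full model

tokens_per_step = 1024 * 4096                       # global batch size in tokens
warmup_steps = 10_000_000_000 // tokens_per_step    # ~10B-token warm-up
total_steps = 2_000_000_000_000 // tokens_per_step  # ~2T tokens total
peak_lr, min_lr = 3e-4, 3e-5

optimizer = torch.optim.AdamW(model.parameters(), lr=peak_lr, weight_decay=1e-1)

def lr_at(step):
    # Linear warm-up to peak_lr, then cosine decay down to min_lr.
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))

scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lr_lambda=lambda step: lr_at(step) / peak_lr
)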

Tokenizer

Viking uses a custom 128K Bloom tokenizer trained on the same English, Finnish, Swedish, Danish, Norwegian, Icelandic and code dataset used to train the model.
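
Assuming the tokenizer is shipped in the model repository (as is standard for Hugging Face checkpoints), it loads like any other:

from transformers import AutoTokenizer

# Load the Viking tokenizer from the model repository.
tokenizer = AutoTokenizer.from_pretrained("LumiOpen/Viking-7B")
print(tokenizer.vocab_size)  # expected to match the 131072-entry vocabulary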

Dataset

Viking is being trained on a 2 trillion token mixed dataset of English, Finnish, Swedish, Danish, Norwegian, Icelandic and code.

More details on the exact dataset will be published soon.

Evaluation Results

Full evaluation results will be published with the final model.

Training Checkpoints

Training Checkpoints are available as branches in the repository. Checkpoints will be released roughly every 100B tokens. The main branch will always point to the latest checkpoint. The following checkpoints are available:

The transformers library allows you to load a checkpoint from a branch as follows:

import torch
import transformers

# Load the Viking 7B weights from a specific training checkpoint branch.
branch = "2000B"
model = transformers.AutoModelForCausalLM.from_pretrained(
    "LumiOpen/Viking-7B",
    torch_dtype=torch.bfloat16 if torch.cuda.is_bf16_supported() else torch.float16,
    revision=branch,
)
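
Continuing from the snippet above, a minimal generation example might look like the following; the prompt is hypothetical, and since Viking is a base model you should expect plain continuation rather than chat-style answers.

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("LumiOpen/Viking-7B", revision=branch)
model = model.to("cuda").eval()

# Hypothetical Finnish prompt; the model continues the text.
inputs = tokenizer("Suomen pääkaupunki on", return_tensors="pt").to("cuda")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))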

Ethical Considerations and Limitations

Viking 7B is a release of a partially trained model, and special care should be taken when using any output.

Viking is an advanced language model, primarily optimized for English, Finnish, Swedish, Norwegian, Danish, Icelandic and code, with no meaningful proficiency in any other languages. As with most AI-driven systems, Viking is a product of the vast data it has been trained on, which may reflect the imperfections, biases, and idiosyncrasies of the wider web. Viking may, at times, produce outputs that can be considered inaccurate, prejudiced, or controversial. Users and developers engaging with Viking should exercise discretion and consider additional evaluation and customization to ensure the model's responses align with their specific needs and ethical standards.

License

Viking is released under the Apache 2.0 license.
