{"id": "202308230014", "lm_model_name": "cyberagent/open-calm-3b", "tokenizer_model_name": "cyberagent/open-calm-3b", "year": "2015", "sample_n": 30, "text_row": "text", "n_token": null, "useint8": true, "lora_r": 8, "lora_alpha": 16, "lora_dropout": 0.05, "lora_bias": "none", "ta_epochs": 3, "ta_batch_size": 4, "ta_warmup_steps": 200, "ta_weight_decay": 0.1, "ta_learning_rate": 0.0005} --- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - load_in_8bit: True - load_in_4bit: False - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.4.0