Quantization made by Richard Erkhov.
Experiment1-7B - bnb 4bits
- Model creator: https://huggingface.co/yam-peleg/
- Original model: https://huggingface.co/yam-peleg/Experiment1-7B/
Original model description:
---
license: apache-2.0
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- chat
---
Experiment1-7B
An experiment in testing and refining a specific training and evaluation pipeline research framework.
This experiment aims to identify potential optimizations, focusing on data engineering, architecture efficiency, and evaluation performance.
The goal is to evaluate the effectiveness of a new training / evaluation pipeline for LLMs.
The experiment will explore adjustments in data preprocessing, model training algorithms, and evaluation metrics to test methods for improvement.
More details will be shared in future experiments.