A few edits for your model card (sorry, I'm a grammar/writing nerd)
Here are edits to the first two paragraphs that clean up a few plural and descriptive grammatical errors (take them or leave them ;-):
①
The Phi-3-Mini-128K-Instruct is a 3.8-billion-parameter, advanced open model. It is trained using synthetic data and filtered publicly available websites, emphasizing high-quality and reasoning-dense properties. The model is a member of the Phi-3 family, specifically the Mini version, available in two variants, 4K and 128K, which denote the context length (in tokens) each can support.
②
The model has undergone a post-training process that utilized both supervised fine-tuning and direct preference optimization, with an emphasis on instruction following and safety measures. When evaluated against benchmarks testing common sense, language understanding, math, code, long-form contextual understanding, and logical reasoning, Phi-3-Mini-4K-Instruct exhibits robust, state-of-the-art performance among models with fewer than 13 billion parameters.
They should have run it through the model to correct this automatically.