---
license: apache-2.0
datasets:
  - MathGenie/MathCode-Pile
language:
  - en
metrics:
  - accuracy
base_model:
  - mistralai/Mistral-7B-v0.1
pipeline_tag: text-generation
tags:
  - math
---

# MathCoder2

## Introduction

The MathCoder2 models are created by continued pretraining on the MathCode-Pile dataset. They are introduced in the paper *MathCoder2: Better Math Reasoning from Continued Pretraining on Model-translated Mathematical Code*.

The mathematical pretraining dataset includes mathematical code accompanied by natural language reasoning steps, making it a valuable resource for models aimed at performing advanced mathematical reasoning tasks.
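
Below is a minimal inference sketch using the `transformers` library. The repository id `MathGenie/MathCoder2-Mistral-7B` and the prompt format are assumptions for illustration; replace them with the checkpoint and prompting style you intend to use.

```python
# Minimal text-generation sketch for a MathCoder2 checkpoint.
# Assumptions: `transformers` and `torch` are installed, a GPU is available,
# and the repo id below points at the checkpoint you want (placeholder here).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MathGenie/MathCoder2-Mistral-7B"  # assumed repo id, adjust as needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Example math question; the plain "Question/Answer" format is illustrative only.
prompt = "Question: What is the sum of the first 100 positive integers?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```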

## Evaluation

(Figure: evaluation results)