You are truly a godsend!
#16 by lcahill - opened
✔️ Tool usage with specific tool tokens and fine-tuning
✔️ Real open license
✔️ Enhanced inference library
✔️ Safetensors from the start
Thank you so much for your invaluable contribution to the community!
A few questions:
- Is the `mistral_inference` library using constrained generation to ensure that tools are called with the correct syntax? (A rough sketch of the tool-call flow I mean follows right after this list.)
- Does the model use its own 'judgment' to decide whether or not to use a given tool?
- I am getting an error on this line from the README: `from mistral_inference.model import Transformer`. Is this a versioning issue? It follows a fresh `pip install mistral_inference` on Windows (a possible workaround is sketched further below). The error:
  `ModuleNotFoundError: No module named 'mistral_inference.model'`
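For context on the first question, this is roughly the tool-call flow I have in mind, sketched with the `mistral_common` tokenizer as shown on the model card. The tool name, schema, and message below are placeholders of mine, and whether decoding is then constrained to emit valid tool-call syntax is exactly what I am asking:

```python
# Sketch only: declare a tool and tokenize a request that advertises it,
# assuming the mistral_common v3 tokenizer API from the model card.
from mistral_common.protocol.instruct.messages import UserMessage
from mistral_common.protocol.instruct.request import ChatCompletionRequest
from mistral_common.protocol.instruct.tool_calls import Function, Tool
from mistral_common.tokens.tokenizers.mistral import MistralTokenizer

tokenizer = MistralTokenizer.v3()

request = ChatCompletionRequest(
    tools=[
        Tool(
            function=Function(
                name="get_current_weather",  # placeholder tool, not from this repo
                description="Get the current weather",
                parameters={
                    "type": "object",
                    "properties": {
                        "location": {"type": "string", "description": "City name"},
                    },
                    "required": ["location"],
                },
            )
        )
    ],
    messages=[UserMessage(content="What's the weather like today in Paris?")],
)

# The encoded prompt should contain the special tool tokens; the open question
# is whether generation is also constrained to match that syntax.
tokens = tokenizer.encode_chat_completion(request).tokens
```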
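And on the import error, a minimal workaround sketch, assuming the module was simply renamed in a newer `mistral_inference` release; the `mistral_inference.transformer` path is my guess, not something I have confirmed against this repo:

```python
# Hedged guard: try the assumed newer module path first, then fall back to
# the import path shown in the README snippet above.
try:
    from mistral_inference.transformer import Transformer  # assumed newer layout
except ModuleNotFoundError:
    from mistral_inference.model import Transformer  # older layout, as in the README
```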
lcahill changed discussion status to closed