parinitarahi committed
Commit b444d92
1 Parent(s): b194e44

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -6,7 +6,7 @@ inference: false
 ---
 
 # Phi-3.5-mini-4K-Instruct ONNX models
-This repository hosts the optimized versions of [Phi-3.5-mini-4k-instruct](https://aka.ms/phi-3.5-mini-4k-instruct) to accelerate inference with ONNX Runtime.
+This repository hosts the optimized versions of [Phi-3.5-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct) to accelerate inference with ONNX Runtime.
 Optimized Phi-3.5 Mini models are published here in [ONNX](https://onnx.ai) format to run with [ONNX Runtime](https://onnxruntime.ai/) on CPU and GPU across devices, including server platforms, Windows, Linux and Mac desktops, and mobile CPUs, with the precision best suited to each of these targets.
 
 To easily get started with Phi-3.5, you can use our newly introduced ONNX Runtime Generate() API. See [here](https://aka.ms/generate-tutorial) for instructions on how to run it.
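The README text above points readers to the ONNX Runtime Generate() API; the linked tutorial is the authoritative guide, but a minimal sketch of how the API is typically driven from Python might look like the following. This assumes the `onnxruntime-genai` package is installed and that a model directory has already been downloaded from this repository (the `./phi-3.5-mini-onnx` path here is a placeholder, and exact method names can vary between package versions):

```python
# Minimal sketch of the ONNX Runtime Generate() API with a Phi-3.5 Mini ONNX model.
# Assumptions: `pip install onnxruntime-genai` and a local model folder
# (placeholder path below) containing the ONNX model and genai_config.json.
import onnxruntime_genai as og

model = og.Model("./phi-3.5-mini-onnx")   # placeholder: path to the downloaded model folder
tokenizer = og.Tokenizer(model)

# Phi-3.5 Mini uses a chat template; wrap the user message accordingly.
prompt = "<|user|>\nWhat is ONNX Runtime?<|end|>\n<|assistant|>"
input_tokens = tokenizer.encode(prompt)

params = og.GeneratorParams(model)
params.set_search_options(max_length=256)

generator = og.Generator(model, params)
generator.append_tokens(input_tokens)     # older package versions set params.input_ids instead

# Generate token by token until the model emits an end-of-sequence token.
while not generator.is_done():
    generator.generate_next_token()

print(tokenizer.decode(generator.get_sequence(0)))
```

Because generation runs a token at a time, the same loop can be adapted for streaming output by decoding each new token as it is produced.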