How to set the GPU device when doing inference
by sersoage
Hi, I am trying to use your models but I am unable to select a specific GPU when doing inference. Could you point me to a resource on how to do so?
Thanks!
Hey!
Well, you can specify the device parameter in the pipeline function. More information here: https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.pipeline.device.
This worked for me:
# device=0 pins the pipeline to the first GPU (device=-1 would run on CPU)
extractor = KeyphraseExtractionPipeline(model=model_name, device=0)
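If you need a GPU other than the first one, the device argument also accepts other indices. Here is a minimal sketch using the generic transformers.pipeline API; the task and model name are just illustrative placeholders, not from this repo:

from transformers import pipeline

# device=1 pins inference to the second GPU; use 0 for the first GPU, -1 for CPU
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # illustrative model
    device=1,
)
print(classifier("Pinning inference to a specific GPU works!"))

You can also restrict which GPUs the process sees at all with the CUDA_VISIBLE_DEVICES environment variable (e.g. CUDA_VISIBLE_DEVICES=1 python script.py), in which case device=0 inside the script refers to that GPU.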
Please check that you have installed PyTorch with GPU (CUDA) support; if not, here is the install guide: https://pytorch.org/get-started/locally/.
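To quickly verify that your PyTorch build actually sees a GPU, a small check like this (assuming nothing beyond a standard PyTorch install) should print True and at least one device:

import torch

print(torch.cuda.is_available())   # True if a CUDA-enabled build found a GPU
print(torch.cuda.device_count())   # number of GPUs PyTorch can see
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # name of GPU 0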
Hope this helps!