
How to use "trust_remote_code" with InferenceClient

#11
by shubhangikat - opened

This is the function in which I'm trying to call Hugging Face through InferenceClient. How do I send "trust_remote_code" so that I don't get the error?
import json
from huggingface_hub import InferenceClient

client = InferenceClient()  # pass token=... if needed

def hallucination(llm_response, ground_truth):
    # Call the hosted model; the response is raw bytes
    output_from_client = client.post(
        model='vectara/hallucination_evaluation_model',
        json={"text": llm_response, "text_pair": ground_truth},
    )
    raw_score = json.loads(output_from_client.decode('utf-8'))
    hallucination_score = raw_score[0]['score']
    return hallucination_score
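
In case a local fallback is useful: trust_remote_code is a load-time argument, so it can be passed to from_pretrained when the model is loaded directly with transformers. Below is a minimal sketch, assuming the predict() helper described on the model card is provided by the model's custom code (which is exactly what trust_remote_code=True allows to run); the pair ordering mirrors the hosted call above.

from transformers import AutoModelForSequenceClassification

# trust_remote_code=True lets transformers execute the model's custom code
model = AutoModelForSequenceClassification.from_pretrained(
    'vectara/hallucination_evaluation_model', trust_remote_code=True
)

def hallucination_local(llm_response, ground_truth):
    # predict() is assumed from the model's custom code; it takes a list of
    # text pairs and returns one score per pair
    scores = model.predict([(llm_response, ground_truth)])
    return float(scores[0])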
Vectara org

@shubhangikat thanks for your inquiry. We are working on it now.
