
I've frozen and exported a SavedModel, which takes as input a batch of videos that has the following format according to saved_model_cli:

The given SavedModel SignatureDef contains the following input(s):
  inputs['ims_ph'] tensor_info:
      dtype: DT_UINT8
      shape: (1, 248, 224, 224, 3)
      name: Placeholder:0
  inputs['samples_ph'] tensor_info:
      dtype: DT_FLOAT
      shape: (1, 173774, 2)
      name: Placeholder_1:0
The given SavedModel SignatureDef contains the following output(s):
  ... << OUTPUTS >> ...
Method name is: tensorflow/serving/predict

I have a TF-Serving (HTTP/REST) server running successfully locally. In my Python client code, I have two populated numpy.ndarray objects: ims of shape (1, 248, 224, 224, 3) and samples of shape (1, 173774, 2).

I am trying to run an inference against my TF model server (see client code below) but am receiving the following error: {u'error': u'JSON Parse error: Invalid value. at offset: 0'}

# I have tried the following combinations without success:
data = {"instances": [{"ims_ph": ims.tolist()}, {"samples_ph": samples.tolist()}]}
data = {"inputs": {"ims_ph": ims, "samples_ph": samples}}

r = requests.post(url="http://localhost:9000/v1/models/multisensory:predict", data=data)

The TF-Serving REST docs don't seem to indicate that any extra escaping / encoding is required here for these two input tensors. As these aren't binary data, I don't think base64 encoding is the right approach either. Any pointers to a working approach here would be greatly appreciated!

1 Answer


You should JSON-serialize the request body first and send it like this:

r = requests.post(url="http://localhost:9000/v1/models/multisensory:predict", data=json.dumps(data)) 
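For context, a fuller sketch of the client call might look like the following. It reuses the ims and samples arrays, tensor names, and URL from the question, and the columnar "inputs" format of the TF-Serving REST API. Two points explain the original error: numpy arrays are not JSON-serializable, so they need .tolist() before serialization, and passing a plain dict to requests' data= argument form-encodes it instead of sending JSON, which is why the server reports a JSON parse error at offset 0.

import json

import numpy as np
import requests

# Placeholder arrays with the shapes reported by saved_model_cli
# (in the real client, ims and samples are already populated).
ims = np.zeros((1, 248, 224, 224, 3), dtype=np.uint8)
samples = np.zeros((1, 173774, 2), dtype=np.float32)

# numpy arrays are not JSON-serializable, so convert them to nested lists first.
data = {"inputs": {"ims_ph": ims.tolist(), "samples_ph": samples.tolist()}}

url = "http://localhost:9000/v1/models/multisensory:predict"

# Either serialize the body explicitly ...
r = requests.post(url=url, data=json.dumps(data))

# ... or let requests serialize it via the json= keyword, which also sets the
# Content-Type: application/json header:
# r = requests.post(url=url, json=data)

print(r.json())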

1 Comment

It would've been helpful to understand why this is the accepted solution, i.e. some context.
