Comments (7)

jameswex commented on July 21, 2024

One way to possibly get a stack trace with the error is to create your WitWidget and save it to a variable, like "ww = WitWidget(...)", and then in another cell call ww.infer(), which is what gets called under the covers when inference needs to happen. The failure of that method call may give you a nice stack trace.
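
For example (a minimal sketch; the host, port, model name, and test_examples are placeholders, not your exact setup):

from witwidget.notebook.visualization import WitConfigBuilder, WitWidget

# Placeholder config: swap in your real examples, serving host, and model name.
config_builder = (WitConfigBuilder(test_examples)
                  .set_inference_address('dat000:8500')
                  .set_model_name('fts_test'))
ww = WitWidget(config_builder)

# In a separate cell, trigger inference directly to surface the full traceback:
ww.infer()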

If not, you could pull down the repo, build the pip package locally, and add some debugging prints to the inference logic in your locally-built version to try to capture the error. Since you're using TF Serving, the code you would want to instrument is at https://github.com/PAIR-code/what-if-tool/blob/master/utils/platform_utils.py#L160
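
For example (hypothetical debug lines, not part of the real file; drop them into call_servo() in your local copy just before the request is built):

# Hypothetical debugging additions inside call_servo(examples, serving_bundle):
print('predict_input_tensor =', serving_bundle.predict_input_tensor)
print('inferring on %d examples' % len(examples))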

zmjjmz commented on July 21, 2024

Thanks, I got a full stack trace by calling infer_impl, which gives:

TypeError                                 Traceback (most recent call last)
<ipython-input-18-ad89c1a8d5fa> in <module>
----> 1 witwidgie.infer_impl()

~/proj/dataplayground3/lib/python3.5/site-packages/witwidget/notebook/base.py in infer_impl(self)
    178     (predictions, extra_output) = (
    179       inference_utils.run_inference_for_inference_results(
--> 180         examples_to_infer, serving_bundle))
    181     infer_objs.append(predictions)
    182     extra_output_objs.append(extra_output)

~/proj/dataplayground3/lib/python3.5/site-packages/witwidget/_utils/inference_utils.py in run_inference_for_inference_results(examples, serving_bundle)
    631   """Calls servo and wraps the inference results."""
    632   (inference_result_proto, extra_results) = run_inference(
--> 633     examples, serving_bundle)
    634   inferences = wrap_inference_results(inference_result_proto)
    635   infer_json = json_format.MessageToJson(

~/proj/dataplayground3/lib/python3.5/site-packages/witwidget/_utils/inference_utils.py in run_inference(examples, serving_bundle)
    860             extra_results)
    861   else:
--> 862     return (platform_utils.call_servo(examples, serving_bundle), None)

~/proj/dataplayground3/lib/python3.5/site-packages/witwidget/_utils/platform_utils.py in call_servo(examples, serving_bundle)
    190     # utility file is bundled in the witwidget pip package which has a dep
    191     # on TensorFlow.
--> 192     request.inputs[serving_bundle.predict_input_tensor].CopyFrom(
    193       tf.compat.v1.make_tensor_proto(
    194         values=[ex.SerializeToString() for ex in examples],

TypeError: None has type NoneType, but expected one of: bytes, unicode

I'll try and debug this more on my end, but if you have any insight in the meantime it'd be helpful.

jameswex commented on July 21, 2024

You do need to provide a predict_input_tensor setting to indicate the name of the tensor from which TF Serving expects to extract the serialized tf.Example.
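
If you're using the Predict API, that wiring looks roughly like this (a minimal sketch; 'examples_tensor' is a placeholder for whatever name your model's serving signature actually uses):

# Placeholder tensor name: check your model's serving signature.
config_builder = (WitConfigBuilder(test_examples)
                  .set_inference_address('dat000:8500')
                  .set_model_name('fts_test')
                  .set_uses_predict_api(True)
                  .set_predict_input_tensor('examples_tensor'))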

How do you query your served model outside of WIT? What is the exact input format you send to the server, and what is the serving signature of the served model?

If your model is served in a way that WIT doesn't currently support sending inputs to, another option is to use the set_custom_predict_fn option of WitConfigBuilder to write your own Python function that takes the tf.Example protos, sends them to your model, and returns the results to WIT.
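
The plumbing for that looks roughly like this (a minimal sketch; custom_predict_fn and test_examples are placeholder names):

def custom_predict_fn(examples):
    # examples: list of tf.Example protos.
    # Return one prediction per example, e.g. a list of class probabilities.
    ...

config_builder = (WitConfigBuilder(test_examples)
                  .set_custom_predict_fn(custom_predict_fn))
WitWidget(config_builder)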

zmjjmz commented on July 21, 2024

I can't provide the actual feature names, but essentially it looks like this for the REST API

curl -d '{
  "inputs": {
    "a": [11, 2],
    "b": [1, 2]
  }
}' -X POST http://dat000:8501/v1/models/fts_test:predict

{
    "outputs": [
        [
            0.600828409,
            0.399171621
        ],
        [
            0.745381653,
            0.254618317
        ]
    ]
}

It's not clear to me what the predict_input_tensor setting should be here, and providing "inputs" gave me the error given above.

jameswex commented on July 21, 2024

Ah, I see, thanks. Since your model accepts feature dictionaries directly, and not a serialized tf.Example proto (as through the TF Serving ClassificationRequest or PredictRequest APIs in the PredictionService: https://github.com/tensorflow/serving/blob/5369880e9143aa00d586ee536c12b04e945a977c/tensorflow_serving/apis/prediction_service.proto#L15), the use of set_model_name and set_inference_address won't work for your case.

You can instead write your own Python function that takes a list of examples as input (which will be a list of the items in pred_df_ex in your code sample), calls the model remotely, and returns the 2D list you see as the outputs value from your curl call (outer list with one entry per input that was predicted, inner list of length 2 containing the probabilities for the negative and positive classes). For some colab examples of custom predict functions (they don't call remote servers, so not an identical use-case), see https://colab.sandbox.google.com/github/PAIR-code/what-if-tool/blob/master/WIT_Smile_Detector.ipynb#scrollTo=E5fYynA9ZpPJ and https://colab.sandbox.google.com/github/pair-code/what-if-tool/blob/master/WIT_Toxicity_Text_Model_Comparison.ipynb
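
Concretely, for the REST endpoint shown above, such a function might look like this (a minimal sketch: the feature names 'a' and 'b', scalar int64 features, and the requests library are assumptions based on the curl example, so adapt them to your real features):

import requests

PREDICT_URL = 'http://dat000:8501/v1/models/fts_test:predict'

def custom_predict_fn(examples):
    # Rebuild the columnar 'inputs' payload from the curl example:
    # one list per feature, one scalar per tf.Example (int64 assumed).
    payload = {'inputs': {
        'a': [ex.features.feature['a'].int64_list.value[0] for ex in examples],
        'b': [ex.features.feature['b'].int64_list.value[0] for ex in examples],
    }}
    resp = requests.post(PREDICT_URL, json=payload)
    resp.raise_for_status()
    # TF Serving's columnar response carries the per-example probability
    # pairs under 'outputs', which is the 2D list WIT expects back.
    return resp.json()['outputs']

Then wire it in with set_custom_predict_fn as described above.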

jameswex commented on July 21, 2024

Also, we're working on improving our documentation and will include questions like this in our upcoming FAQ. Sorry for the lack of clear documentation for use-cases such as yours.

zmjjmz commented on July 21, 2024

Ok, I was able to get this all working using a custom prediction function - thanks!
