Comments (10)
@AliGharbali can you provide your model and your custom class file if you have one? I need to reproduce your result to see what's happening.
Edit: closed issue by mistake
from mrcnn_serving_ready.
@bendangnuksung thanks for the answer; it would be great if you could reproduce the results. You can download them from here.
Hi @AliGharbali! I was able to convert your model to a TensorFlow model and a Serving model. Have you set your model save path correctly?
@bendangnuksung thanks for making this conversion so easy. I have the same observation as @AliGharbali: there are no files under the folder "variables". I was able to serve it, though. I haven't yet compared the results from serving to the 'correct' ones; I will update once I have the comparison. I am new to TensorFlow in general, so I am not sure what difference it makes whether the "variables" folder contains files or not...
@cfengai The "variables" folder is indeed empty; however, you can still serve the model and run inference with it. The "variables" folder does not always need to contain files. @AliGharbali Sorry! I misunderstood your question; I thought you meant the model save path did not contain a 'saved_model'. I understand now, thanks to @cfengai.
@bendangnuksung No problem, I will try with the empty "variables" folder and check the results. @cfengai we do indeed have the same issue; thanks for mentioning it here, and I look forward to hearing from you soon. I will keep updating this post as well!
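For comparing the served model's detections against the original model's, a minimal sketch in pure Python, assuming both sides return detections as nested lists of floats (e.g. rows of [y1, x1, y2, x2, class_id, score]); the helper name and tolerance are my own, not part of the repo:

```python
def detections_close(a, b, tol=1e-4):
    """Recursively compare two nested lists of numbers within a tolerance."""
    if isinstance(a, (int, float)) and isinstance(b, (int, float)):
        return abs(a - b) <= tol
    if isinstance(a, list) and isinstance(b, list):
        return len(a) == len(b) and all(
            detections_close(x, y, tol) for x, y in zip(a, b))
    return a == b

# Example: detections from the original model vs. the served model
original = [[24.0, 10.0, 120.0, 200.0, 1.0, 0.987]]
served = [[24.00001, 10.0, 120.0, 200.0, 1.0, 0.98701]]
print(detections_close(original, served))  # True
```

Small floating-point drift between the frozen/served graph and the in-process Keras model is normal, so an exact equality check would be too strict.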
@bendangnuksung Hi again, I successfully served the model and am now writing the client API to do object detection. Meanwhile, I have a question and would be very happy if you could spare some time for it. I want to define a "signature_name" for my model, which is not defined in the "signature_def_map". How should I properly define it? Also, if you could help me create the client API or share a useful link, that would be great.
@bendangnuksung First, I want to report that the model serves fine with single-image inference. Thanks a lot! The other thing I tried was running multiple images at a time, but that failed. After I changed IMAGES_PER_GPU=2, the converted model reports wrong meta info for the output mrcnn_detection/Reshape_1.
I was expecting something like:
outputs['mrcnn_detection/Reshape_1'] tensor_info:
    dtype: DT_FLOAT
    shape: (2, 20, 6)
    name: mrcnn_detection/Reshape_1:0
@AliGharbali The 'signature_name' has already been defined with TensorFlow's default signature name, which is 'serving_default'. You can still change the signature name in 'main.py':
# from:
sigs[signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
    tf.saved_model.signature_def_utils.predict_signature_def(
        {"input_image": input_image, 'input_image_meta': input_image_meta, 'input_anchors': input_anchors},
        {"mrcnn_detection/Reshape_1": output_detection, 'mrcnn_mask/Reshape_1': output_mask})
# to:
sigs[YOUR_CUSTOM_NAME] = \
    tf.saved_model.signature_def_utils.predict_signature_def(
        {"input_image": input_image, 'input_image_meta': input_image_meta, 'input_anchors': input_anchors},
        {"mrcnn_detection/Reshape_1": output_detection, 'mrcnn_mask/Reshape_1': output_mask})
And you can also change the names of the input and output signatures.
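For the client side, a minimal sketch of building a TensorFlow Serving REST predict request; the host, port, and model name ('mask') are assumptions to adjust for your deployment, and the input keys must match the signature names from main.py above:

```python
import json
from urllib import request  # only needed if you actually POST the body

# Hypothetical endpoint; adjust host, port, and model name for your setup.
URL = "http://localhost:8501/v1/models/mask:predict"

def build_predict_body(image, image_meta, anchors, signature="serving_default"):
    """Assemble the JSON body for TF Serving's REST predict API.
    The keys under 'inputs' must match the input names in the signature_def."""
    return json.dumps({
        "signature_name": signature,
        "inputs": {
            "input_image": image,
            "input_image_meta": image_meta,
            "input_anchors": anchors,
        },
    })

# Example with dummy nested-list tensors; real values come from MRCNN's
# image preprocessing (molded image, image meta, and anchors).
body = build_predict_body([[[[0.0]]]], [[0.0]], [[[0.0]]])
# To send it:
# req = request.Request(URL, body.encode(),
#                       {"Content-Type": "application/json"})
# response = json.loads(request.urlopen(req).read())
```

If you set a custom signature name in main.py, pass that same name as the `signature` argument here.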
@cfengai It seems the model has been structured to run inference on a single image at a time. This needs to be fixed on the OP's end, though it would be awesome if it could be fixed on our end. I would request you to open a new issue for this, as it is diverging from the original issue. Thanks!
@AliGharbali I believe the original issue has not caused any problem for the serving model, so I am closing this issue. Re-open anytime if you think it is not fixed.