
Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. (tflite_flutter_plugin · 8 comments · closed)

am15h commented on September 27, 2024

Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.

Comments (8)

mgalgs commented on September 27, 2024

How did you create your model? Have you gone through the docs for Running TF2 Detection API Models on mobile?


farazk86 commented on September 27, 2024

> How did you create your model? Have you gone through the docs for Running TF2 Detection API Models on mobile?

Hi,

Yes, I've created the tflite model as per documentation.

import tensorflow as tf

TF_PATH = "/content/tf_model.pb"  # where the frozen graph is stored
TFLITE_PATH = "./model.tflite"

# Make a converter object from the frozen TensorFlow graph (.pb)
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    TF_PATH,
    input_arrays=['input_ids'],
    output_arrays=['logits'],
)

converter.experimental_new_converter = True

# Fall back to full TF ops for anything without a TFLite builtin
# (these require the Flex delegate at runtime)
converter.target_spec.supported_ops = [
    tf.compat.v1.lite.OpsSet.TFLITE_BUILTINS,
    tf.compat.v1.lite.OpsSet.SELECT_TF_OPS,
]

tf_lite_model = converter.convert()

# Save the model.
with open(TFLITE_PATH, 'wb') as f:
    f.write(tf_lite_model)
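To see which ops in the converted file actually fall back to TensorFlow (and therefore need the Flex delegate at runtime), newer TensorFlow releases include a model analyzer; a minimal sketch, assuming TF 2.7 or later and the output path used above:

import tensorflow as tf

# Prints the ops contained in the .tflite file; any ops listed as Flex / SELECT_TF_OPS
# require the Flex delegate to be linked at inference time.
tf.lite.experimental.Analyzer.analyze(model_path="./model.tflite")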

I'm converting an ONNX model to tflite (onnx -> tensorflow -> tflite). The ONNX model works as expected when run with onnxruntime.

The converted tflite model also works when using the tflite interpreter in Python:

tflite_interpreter = tf.lite.Interpreter(model_path='/content/model.tflite')
tflite_interpreter.allocate_tensors()

input_details = tflite_interpreter.get_input_details()
output_details = tflite_interpreter.get_output_details()
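For a fuller sanity check, the same interpreter can be driven end to end with a dummy input; a minimal sketch, assuming a single input_ids tensor and simply confirming that invoke() succeeds:

import numpy as np

# Build a dummy input that matches the model's declared shape and dtype.
dummy_input = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])

tflite_interpreter.set_tensor(input_details[0]['index'], dummy_input)
tflite_interpreter.invoke()

# Read back the logits to confirm the graph ran.
logits = tflite_interpreter.get_tensor(output_details[0]['index'])
print(logits.shape)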

I only get this error when using the tflite_flutter interpreter. :(


mgalgs commented on September 27, 2024

Did you run it through export_tflite_graph_tf2.py as well? I'm pretty sure this was the exact error message I was seeing before I started using that guy. I was doing the same thing as you, just using the tflite Python API to convert my model to tflite, but it needs to go through export_tflite_graph_tf2.py before you do that. Check out my earlier issue for more details:

#59 (comment)
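For reference, the flow described above is roughly: export a TFLite-friendly SavedModel with the Object Detection API script, then convert that SavedModel. A minimal sketch with hypothetical paths, assuming the TF Object Detection API is installed:

# Step 1 (shell): export a TFLite-compatible SavedModel from a TF2 detection checkpoint.
#   python models/research/object_detection/export_tflite_graph_tf2.py \
#       --pipeline_config_path=pipeline.config \
#       --trained_checkpoint_dir=checkpoint/ \
#       --output_directory=tflite_export/

import tensorflow as tf

# Step 2: convert the exported SavedModel (written to tflite_export/saved_model).
converter = tf.lite.TFLiteConverter.from_saved_model("tflite_export/saved_model")
tflite_model = converter.convert()
with open("detector.tflite", "wb") as f:
    f.write(tflite_model)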


farazk86 commented on September 27, 2024

> Did you run it through export_tflite_graph_tf2.py as well? I'm pretty sure this was the exact error message I was seeing before I started using that guy. I was doing the same thing as you, just using the tflite Python API to convert my model to tflite, but it needs to go through export_tflite_graph_tf2.py before you do that. Check out my earlier issue for more details:
> #59 (comment)

Thanks for the link, but export_tflite_graph_tf2.py itself notes:

> NOTE: This only supports SSD meta-architectures for now.

Since my model is for text generation, not object detection, I won't be able to use the linked exporter. :(


am15h commented on September 27, 2024

Closing the issue as it does not seem to be directly related to this plugin.


anovis commented on September 27, 2024

> Did you run it through export_tflite_graph_tf2.py as well? I'm pretty sure this was the exact error message I was seeing before I started using that guy. I was doing the same thing as you, just using the tflite Python API to convert my model to tflite, but it needs to go through export_tflite_graph_tf2.py before you do that. Check out my earlier issue for more details:
> #59 (comment)
>
> Thanks for the link, but export_tflite_graph_tf2.py itself notes:
>
> > NOTE: This only supports SSD meta-architectures for now.
>
> Since my model is for text generation, not object detection, I won't be able to use the linked exporter. :(

@farazk86 did you end up finding a solution around this? I am running into the same error


farazk86 commented on September 27, 2024

> Did you run it through export_tflite_graph_tf2.py as well? I'm pretty sure this was the exact error message I was seeing before I started using that guy. I was doing the same thing as you, just using the tflite Python API to convert my model to tflite, but it needs to go through export_tflite_graph_tf2.py before you do that. Check out my earlier issue for more details:
> #59 (comment)
>
> Thanks for the link, but export_tflite_graph_tf2.py itself notes:
>
> > NOTE: This only supports SSD meta-architectures for now.
>
> Since my model is for text generation, not object detection, I won't be able to use the linked exporter. :(
>
> @farazk86 did you end up finding a solution around this? I am running into the same error

No, but I ended up handling all the TensorFlow Lite operations in Java via a Flutter platform channel; using the Java interpreter worked for me. Your error may also come from using an incorrect converter, so make sure you know which TF version your model was trained with. After tf==2.1 or tf==2.2 (I don't remember exactly which), they changed how models are converted to tflite.
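For comparison, the TF2-style conversion path looks roughly like this; a minimal sketch, assuming a SavedModel directory (hypothetical path) rather than a frozen graph, and keeping the SELECT_TF_OPS fallback:

import tensorflow as tf

# TF2-style conversion: start from a SavedModel directory instead of a frozen .pb graph.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")  # hypothetical path

# Prefer TFLite builtins, but fall back to TF (Flex) ops for anything unsupported.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]

tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)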


anovis commented on September 27, 2024

Thanks! I will take a look at that. I am using the prebuilt universal-sentence-encoder-multilingual model, and I can't find which version they used to build it other than that it was TensorFlow 2.0.

