Comments (8)

tilakrayal commented on June 2, 2024

@Yummyto,
Could you please try the steps below to resolve the issue? Also try upgrading the protobuf package:

pip install --upgrade protobuf

- Use protobuf version 3.19.4 when using Object Detection.
- Download builder.py from the [protobuf GitHub repo](https://github.com/protocolbuffers/protobuf/blob/main/python/google/protobuf/internal/builder.py).
- Place the downloaded builder.py inside your protobuf installation's google/protobuf/internal/ directory (see the sketch after the link below).

https://stackoverflow.com/questions/71759248/importerror-cannot-import-name-builder-from-google-protobuf-internal
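
As a minimal sketch of the last step, assuming you saved the downloaded file next to your script as builder.py (the filename and its location are assumptions; the target directory is discovered from the installed package rather than hard-coded):

```python
# Hypothetical helper: copy a downloaded builder.py into the installed
# protobuf package (google/protobuf/internal/). Adjust DOWNLOADED to
# wherever you actually saved the file.
import os
import shutil

import google.protobuf.internal as internal

DOWNLOADED = "builder.py"  # file downloaded from the protobuf repo
target_dir = os.path.dirname(internal.__file__)

shutil.copy(DOWNLOADED, os.path.join(target_dir, "builder.py"))
print("Copied builder.py to", target_dir)
```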

Thank you!

Yummyto commented on June 2, 2024

I got this error when I tried to install protobuf 3.19.4:

ERROR: Exception:
Traceback (most recent call last):
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\urllib3\response.py", line 437, in _error_catcher
yield
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\urllib3\response.py", line 560, in read
data = self._fp_read(amt) if not fp_closed else b""
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\urllib3\response.py", line 526, in _fp_read
return self._fp.read(amt) if amt is not None else self._fp.read()
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\cachecontrol\filewrapper.py", line 90, in read
data = self.__fp.read(amt)
File "C:\Users\Liam\Anaconda3\lib\http\client.py", line 447, in read
n = self.readinto(b)
File "C:\Users\Liam\Anaconda3\lib\http\client.py", line 491, in readinto
n = self.fp.readinto(b)
File "C:\Users\Liam\Anaconda3\lib\socket.py", line 589, in readinto
return self._sock.recv_into(b)
File "C:\Users\Liam\Anaconda3\lib\ssl.py", line 1052, in recv_into
return self.read(nbytes, buffer)
File "C:\Users\Liam\Anaconda3\lib\ssl.py", line 911, in read
return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\cli\base_command.py", line 160, in exc_logging_wrapper
status = run_func(*args)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\cli\req_command.py", line 247, in wrapper
return func(self, options, args)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\commands\install.py", line 401, in run
reqs, check_supported_wheels=not options.target_dir
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\resolution\resolvelib\resolver.py", line 93, in resolve
collected.requirements, max_rounds=try_to_avoid_resolution_too_deep
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\resolvelib\resolvers.py", line 481, in resolve
state = resolution.resolve(requirements, max_rounds=max_rounds)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\resolvelib\resolvers.py", line 348, in resolve
self._add_to_criteria(self.state.criteria, r, parent=None)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\resolvelib\resolvers.py", line 172, in _add_to_criteria
if not criterion.candidates:
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\resolvelib\structs.py", line 151, in bool
return bool(self._sequence)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\resolution\resolvelib\found_candidates.py", line 155, in bool
return any(self)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\resolution\resolvelib\found_candidates.py", line 143, in
return (c for c in iterator if id(c) not in self._incompatible_ids)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\resolution\resolvelib\found_candidates.py", line 47, in _iter_built
candidate = func()
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\resolution\resolvelib\factory.py", line 211, in _make_candidate_from_link
version=version,
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\resolution\resolvelib\candidates.py", line 303, in init
version=version,
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\resolution\resolvelib\candidates.py", line 162, in init
self.dist = self._prepare()
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\resolution\resolvelib\candidates.py", line 231, in _prepare
dist = self._prepare_distribution()
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\resolution\resolvelib\candidates.py", line 308, in _prepare_distribution
return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\operations\prepare.py", line 491, in prepare_linked_requirement
return self._prepare_linked_requirement(req, parallel_builds)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\operations\prepare.py", line 542, in _prepare_linked_requirement
hashes,
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\operations\prepare.py", line 170, in unpack_url
hashes=hashes,
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\operations\prepare.py", line 107, in get_http_url
from_path, content_type = download(link, temp_dir.path)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\network\download.py", line 147, in call
for chunk in chunks:
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\cli\progress_bars.py", line 53, in _rich_progress_bar
for chunk in iterable:
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_internal\network\utils.py", line 87, in response_chunks
decode_content=False,
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\urllib3\response.py", line 621, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\urllib3\response.py", line 586, in read
raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
File "C:\Users\Liam\Anaconda3\lib\contextlib.py", line 130, in exit
self.gen.throw(type, value, traceback)
File "C:\Users\Liam\Anaconda3\lib\site-packages\pip_vendor\urllib3\response.py", line 442, in _error_catcher
raise ReadTimeoutError(self._pool, None, "Read timed out.")
pip._vendor.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.
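
(For reference, the failure above is a read timeout while downloading from files.pythonhosted.org, not a dependency conflict, so simply retrying the install, or raising pip's timeout and retry count, usually gets past it. A hedged example mirroring the pip command earlier in the thread:

pip install --default-timeout=120 --retries 5 protobuf==3.19.4
)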

Thank you!

Yummyto commented on June 2, 2024

I have now installed protobuf 3.19.4 and added builder.py to this path (C:\Users\Liam\Anaconda3\lib\site-packages\google\protobuf\internal\__init__.py); however, I still get the same error: ImportError: cannot import name 'builder' from 'google.protobuf.internal' (C:\Users\Liam\Anaconda3\lib\site-packages\google\protobuf\internal\__init__.py). Should I copy the code of builder.py and paste it into __init__.py?
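
A quick way to confirm the placement, as a sketch: builder.py should sit alongside __init__.py as its own file (not pasted into __init__.py), and the import below should then succeed.

```python
# Sanity check: locate the installed google.protobuf.internal package and
# verify that builder.py is importable from it.
import google.protobuf.internal as internal
print(internal.__file__)  # ...\site-packages\google\protobuf\internal\__init__.py

from google.protobuf.internal import builder  # ImportError here means builder.py is still missing
print(builder.__file__)
```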

Yummyto commented on June 2, 2024

I got it working now and that problem is resolved. However, I now get a new error: AttributeError: module 'tensorflow.compat.v2' has no attribute '__internal__'. What should I do?

Thank you!

Yummyto commented on June 2, 2024

This is the error that I got:

AttributeError Traceback (most recent call last)
in <module>
2 from object_detection.utils import label_map_util
3 from object_detection.utils import visualization_utils as viz_utils
----> 4 from object_detection.builders import model_builder

~\Anaconda3\lib\site-packages\object_detection\builders\model_builder.py in <module>
35 from object_detection.meta_architectures import center_net_meta_arch
36 from object_detection.meta_architectures import context_rcnn_meta_arch
---> 37 from object_detection.meta_architectures import deepmac_meta_arch
38 from object_detection.meta_architectures import faster_rcnn_meta_arch
39 from object_detection.meta_architectures import rfcn_meta_arch

~\Anaconda3\lib\site-packages\object_detection\meta_architectures\deepmac_meta_arch.py in <module>
18 from object_detection.meta_architectures import center_net_meta_arch
19 from object_detection.models.keras_models import hourglass_network
---> 20 from object_detection.models.keras_models import resnet_v1
21 from object_detection.protos import center_net_pb2
22 from object_detection.protos import losses_pb2

~\Anaconda3\lib\site-packages\object_detection\models\keras_models\resnet_v1.py in <module>
26
27 try:
---> 28 from keras.applications import resnet # pylint: disable=g-import-not-at-top
29 except ImportError:
30 from tf_keras.applications import resnet # pylint: disable=g-import-not-at-top

~\AppData\Roaming\Python\Python37\site-packages\keras\__init__.py in <module>
18 keras.io.
19 """
---> 20 from keras import distribute
21 from keras import models
22 from keras.engine.input_layer import Input

~\AppData\Roaming\Python\Python37\site-packages\keras\distribute\__init__.py in <module>
16
17
---> 18 from keras.distribute import sidecar_evaluator

~\AppData\Roaming\Python\Python37\site-packages\keras\distribute\sidecar_evaluator.py in <module>
20 from tensorflow.python.platform import tf_logging as logging
21 from tensorflow.python.util import deprecation
---> 22 from keras.optimizers.optimizer_experimental import (
23 optimizer as optimizer_experimental,
24 )

~\AppData\Roaming\Python\Python37\site-packages\keras\optimizers\__init__.py in <module>
23
24 # Imports needed for deserialization.
---> 25 from keras import backend
26 from keras.optimizers.legacy import adadelta as adadelta_legacy
27 from keras.optimizers.legacy import adagrad as adagrad_legacy

~\AppData\Roaming\Python\Python37\site-packages\keras\backend.py in <module>
30 import tensorflow.compat.v2 as tf
31
---> 32 from keras import backend_config
33 from keras.distribute import distribute_coordinator_utils as dc
34 from keras.engine import keras_tensor

~\AppData\Roaming\Python\Python37\site-packages\keras\backend_config.py in <module>
31
32 @keras_export("keras.backend.epsilon")
---> 33 @tf.__internal__.dispatch.add_dispatch_support
34 def epsilon():
35 """Returns the value of the fuzz factor used in numeric expressions.

AttributeError: module 'tensorflow.compat.v2' has no attribute '__internal__'

tilakrayal commented on June 2, 2024

@Yummyto,
Glad the build issue was resolved. Could you please confirm whether the object detection code you are running targets TensorFlow 2.x or an older TensorFlow version? I also suspect the error indicates an issue with how you are importing TensorFlow (TF), and potentially a version mismatch. Thank you!
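
One way to confirm this, as a minimal sketch (assuming the mismatch comes from the standalone keras package under AppData\Roaming shadowing the one bundled with TensorFlow, as the traceback above suggests):

```python
# Print the versions and install locations of TensorFlow and Keras; in a
# healthy setup the keras version matches the TensorFlow version and both
# resolve to the same site-packages tree.
import tensorflow as tf
import keras

print("tensorflow", tf.__version__, tf.__file__)
print("keras", keras.__version__, keras.__file__)
```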

Yummyto commented on June 2, 2024

Thank you so much for your help. I do have a question: what should I do, since I now get this error: AttributeError: module 'tensorflow.keras.optimizers' has no attribute 'experimental'?

tilakrayal commented on June 2, 2024

@Yummyto,
I suspect you are using tf.keras.optimizers.experimental.SGD() and tf.keras.optimizers.experimental.Adam() in your code. Those experimental optimizers were available under tf.keras.optimizers.experimental in TensorFlow 2.9. You can now use the regular optimizers instead (like tf.keras.optimizers.Adam / tf.keras.optimizers.SGD).

https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/SGD
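
As a hedged illustration of that substitution (the learning-rate and momentum values below are placeholders; match them to your own training configuration):

```python
import tensorflow as tf

# Instead of: tf.keras.optimizers.experimental.Adam(learning_rate=1e-3)
adam = tf.keras.optimizers.Adam(learning_rate=1e-3)

# Instead of: tf.keras.optimizers.experimental.SGD(learning_rate=0.01, momentum=0.9)
sgd = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)
```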

Thank you!
