googleapis / python-monitoring

This library has moved to https://github.com/googleapis/google-cloud-python/tree/main/packages/google-cloud-monitoring

License: Apache License 2.0

python-monitoring's Introduction

NOTE

This github repository is archived. The repository contents and history have moved to google-cloud-python.

Python Client for Stackdriver Monitoring API

stable pypi versions

Stackdriver Monitoring API: collects metrics, events, and metadata from Google Cloud, Amazon Web Services (AWS), hosted uptime probes, and application instrumentation. Using the BindPlane service, you can also collect this data from over 150 common application components, on-premises systems, and hybrid cloud systems. Stackdriver ingests that data and generates insights via dashboards, charts, and alerts. BindPlane is included with your Google Cloud project at no additional cost.

Quick Start

In order to use this library, you first need to go through the following steps:

  1. Select or create a Cloud Platform project.
  2. Enable billing for your project.
  3. Enable the Stackdriver Monitoring API.
  4. Set up authentication.
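Once those steps are done, a quick way to confirm everything is wired up is to list the project's metric descriptors. This is a sketch, not official sample code: "my-project-id" is a placeholder, and it assumes google-cloud-monitoring is installed and application default credentials are configured.

```python
def project_path(project_id: str) -> str:
    # The client expects the project as a resource name, not a bare id.
    return f"projects/{project_id}"


if __name__ == "__main__":
    # Requires `pip install google-cloud-monitoring` plus application
    # default credentials; replace "my-project-id" with your project.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    for descriptor in client.list_metric_descriptors(
        name=project_path("my-project-id")
    ):
        print(descriptor.type)
```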

Installation

Install this library in a virtual environment using venv. venv is a tool that creates isolated Python environments. These isolated environments can have separate versions of Python packages, which allows you to isolate one project's dependencies from the dependencies of other projects.

With venv, it's possible to install this library without needing system install permissions, and without clashing with the installed system dependencies.

Code samples and snippets

Code samples and snippets live in the samples/ folder.

Supported Python Versions

Our client libraries are compatible with all current active and maintenance versions of Python.

Python >= 3.7

Unsupported Python Versions

Python <= 3.6

If you are using an end-of-life version of Python, we recommend that you update as soon as possible to an actively supported version.
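If your scripts may run under older interpreters, a small guard like the following fails fast with a clear message instead of an obscure import error. This is an illustrative helper, not part of the library; the function name is ours.

```python
import sys


def check_python_version(info=sys.version_info):
    # google-cloud-monitoring supports Python 3.7 and newer; raise early
    # on end-of-life interpreters rather than failing deep inside imports.
    if info < (3, 7):
        raise RuntimeError("google-cloud-monitoring requires Python >= 3.7")
    return True
```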

Mac/Linux

python3 -m venv <your-env>
source <your-env>/bin/activate
pip install google-cloud-monitoring

Windows

py -m venv <your-env>
.\<your-env>\Scripts\activate
pip install google-cloud-monitoring

Next Steps

python-monitoring's People

Contributors

arithmetic1728, busunkim96, chemelnucfin, crwilcox, dandhlee, daniel-sanche, daspecster, dhermes, dpebot, gcf-owl-bot[bot], google-cloud-policy-bot[bot], liyanhui1228, lukesneeringer, minherz, nicain, parthea, plamut, pravindahal, qingling128, release-please[bot], renovate-bot, shabirmean, steinwaywhw, surferjeffatgoogle, tseaver, tswast, vam-google, vinbs, ylil93, yoshi-automation


python-monitoring's Issues

Synthesis failed for python-monitoring

Hello! Autosynth couldn't regenerate python-monitoring. 💔

Please investigate and fix this issue within 5 business days. While it remains broken,
this library cannot be updated with changes to the python-monitoring API, and the library grows
stale.

See https://github.com/googleapis/synthtool/blob/master/autosynth/TroubleShooting.md
for troubleshooting tips.

Here's the output from running synth.py:

43f189a2a73/external/com_google_api_gax_java/repositories.bzl:60:5
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:217:1
DEBUG: Rule 'com_google_api_codegen' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "47e2d7649bfcef198515f1412853cd1ff784fa65e9543ef80a81ab601e4600c6"
DEBUG: Call stack for the definition of repository 'com_google_api_codegen' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:77:1
DEBUG: Rule 'com_google_protoc_java_resource_names_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "4b714b35ee04ba90f560ee60e64c7357428efcb6b0f3a298f343f8ec2c6d4a5d"
DEBUG: Call stack for the definition of repository 'com_google_protoc_java_resource_names_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:234:1
DEBUG: Rule 'protoc_docs_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "33b387245455775e0de45869c7355cc5a9e98b396a6fc43b02812a63b75fee20"
DEBUG: Call stack for the definition of repository 'protoc_docs_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:258:1
DEBUG: Rule 'rules_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "48f7e716f4098b85296ad93f5a133baf712968c13fbc2fdf3a6136158fe86eac"
DEBUG: Call stack for the definition of repository 'rules_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:42:1
DEBUG: Rule 'gapic_generator_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "bad076ad037cc7e23978af204d73abc4479a3a9fc40a016ceb4fe94c0153dcc8"
DEBUG: Call stack for the definition of repository 'gapic_generator_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:278:1
DEBUG: Rule 'com_googleapis_gapic_generator_go' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "c0d0efba86429cee5e52baf838165b0ed7cafae1748d025abec109d25e006628"
DEBUG: Call stack for the definition of repository 'com_googleapis_gapic_generator_go' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:300:1
DEBUG: Rule 'gapic_generator_php' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "8ba95eb35076a796b1dad2bb424532b7fc2610ae2f8b4e2bebaed0286fcb2a54"
DEBUG: Call stack for the definition of repository 'gapic_generator_php' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:364:1
DEBUG: Rule 'gapic_generator_ruby' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "a14ec475388542f2ea70d16d75579065758acc4b99fdd6d59463d54e1a9e4499"
DEBUG: Call stack for the definition of repository 'gapic_generator_ruby' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:408:1
DEBUG: /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/rules_python/python/pip.bzl:61:5: DEPRECATED: the pip_repositories rule has been replaced with pip_install, please see rules_python 0.1 release notes
DEBUG: Rule 'bazel_skylib' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "1dde365491125a3db70731e25658dfdd3bc5dbdfd11b840b3e987ecf043c7ca0"
DEBUG: Call stack for the definition of repository 'bazel_skylib' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:35:1
Analyzing: target //google/monitoring/v3:monitoring-v3-py (1 packages loaded, 0 targets configured)
ERROR: /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/upb/bazel/upb_proto_library.bzl:257:29: aspect() got unexpected keyword argument 'incompatible_use_toolchain_transition'
ERROR: Analysis of target '//google/monitoring/v3:monitoring-v3-py' failed; build aborted: error loading package '@com_github_grpc_grpc//': in /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/com_github_grpc_grpc/bazel/grpc_build_system.bzl: Extension file 'bazel/upb_proto_library.bzl' has errors
INFO: Elapsed time: 0.326s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (3 packages loaded, 0 targets configured)

Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-monitoring/synth.py", line 38, in <module>
    proto_output_path="google/cloud/monitoring_v3/proto"
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
    return self._generate_code(service, version, "python", False, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 204, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2021-05-04 04:22:32,787 autosynth [ERROR] > Synthesis failed
2021-05-04 04:22:32,787 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at af95e7f chore(revert): revert preventing normalization (#126)
2021-05-04 04:22:32,793 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-05-04 04:22:32,798 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 356, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 191, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 336, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 68, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

Monitoring: Can't supply a function without metric.type to timeSeries.list which prevents retrieving SLO data from Stackdriver monitoring API

  • OS type and version: Linux 4.19.67-2
  • Python version: 3.7.4
  • google-cloud-monitoring version: 0.34.0

Attempting to query for timeSeries data of an SLO, which is done by supplying a filter such as select_slo_budget("projects/[PROJECT_ID]/services/[SERVICE_ID]/serviceLevelObjectives/[SLO_ID]") or a similar selector function, as described here: https://cloud.google.com/monitoring/service-monitoring/timeseries-selectors

The monitoring client tries to be too smart and appends extra filter expressions, producing errors like:

Field filter had an invalid value of "metric.type = "" AND select_slo_burn_rate("projects/1063791683888/services/ist:bmenasha-anthos-cert-labs-1-zone-us-central1-b-central-default-reviews/serviceLevelObjectives/reviews-availability-slo", 30d)": Time series selectors cannot be used with metric type restrictions."

See google/cloud/monitoring_v3/query.py which always prefixes the filter with metric.type:

def __str__(self):
    filters = ['metric.type = "{type}"'.format(type=self.metric_type)]

.....

This is a request to be able to use the python client library to make this API call.

Code example

import datetime
import pprint
import google.cloud.monitoring_v3.query

client = google.cloud.monitoring_v3.MetricServiceClient()

metric_type =  ''
query = google.cloud.monitoring_v3.query.Query(
    client, 'bmenasha-1', '', datetime.datetime.now(), 1)
query = query.select_metrics('')
query = query.select_metrics('select_slo_burn_rate("projects/1063791683888/services/ist:bmenasha-anthos-cert-labs-1-zone-us-central1-b-central-default-reviews/serviceLevelObjectives/reviews-availability-slo", 30d)')
for time_series in query:
    pprint.pprint(time_series)

Stack trace

Traceback (most recent call last):
  File "justwork.py", line 12, in <module>
    for time_series in query:
  File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/cloud/monitoring_v3/query.py", line 438, in iter
    for ts in self._client.list_time_series(**params):
  File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/page_iterator.py", line 204, in _items_iter
    for page in self._page_iter(increment=False):
  File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/page_iterator.py", line 235, in _page_iter
    page = self._next_page()
  File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/page_iterator.py", line 526, in _next_page
    response = self._method(self._request)
  File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/retry.py", line 277, in retry_wrapped_func
    on_error=on_error,
  File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/retry.py", line 182, in retry_target
    return target()
  File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.InvalidArgument: 400 Field filter had an invalid value of "metric.type = "" AND select_slo_burn_rate("projects/1063791683888/services/ist:bmenasha-anthos-cert-labs-1-zone-us-central1-b-central-default-reviews/serviceLevelObjectives/reviews-availability-slo", 30d)": Time series selectors cannot be used with metric type restrictions.
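Until the library supports selectors natively, one possible workaround (a sketch; the helper name and bracketed resource names are placeholders, and it assumes the v0.34 list_time_series signature) is to bypass Query and pass the selector as the raw filter, with no metric.type prefix:

```python
def slo_filter(slo_name: str, lookback: str = "30d") -> str:
    # Build a bare time-series selector, without the metric.type prefix
    # that Query.__str__ would otherwise prepend.
    return f'select_slo_burn_rate("{slo_name}", {lookback})'


if __name__ == "__main__":
    # Requires google-cloud-monitoring 0.34.x and application default
    # credentials; the [PROJECT_ID]/[SERVICE_ID]/[SLO_ID] values are
    # placeholders to fill in.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    interval = monitoring_v3.types.TimeInterval()
    interval.end_time.GetCurrentTime()
    interval.start_time.seconds = interval.end_time.seconds - 3600
    for ts in client.list_time_series(
        name="projects/[PROJECT_ID]",
        filter_=slo_filter(
            "projects/[PROJECT_ID]/services/[SERVICE_ID]"
            "/serviceLevelObjectives/[SLO_ID]"
        ),
        interval=interval,
        view=monitoring_v3.enums.ListTimeSeriesRequest.TimeSeriesView.FULL,
    ):
        print(ts)
```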


Thanks!

monitoring_v3.query.Query.align raises a TypeError on execution

Environment details

  • OS type and version: macOS 11.1
  • Python version: Python 3.8.5 (anaconda)
  • pip version: pip 21.0
  • google-cloud-monitoring version: 2.0.0

Steps to reproduce

While I can execute the query

from google.cloud.monitoring_v3.query import Query
query = Query(
     client=client,
     project=project_name,
     metric_type= 'storage.googleapis.com/storage/request_count'
).select_resources(
     bucket_name_prefix=bucket_name_prefix
).select_interval(
     end_time, start_time=end_time - report_interval
)
print(next(iter(query)))

adding an alignment aggregation fails:

from google.cloud.monitoring_v3 import Aggregation
query = query.align(
    Aggregation.Aligner.ALIGN_MAX, seconds=3600
)
print(next(iter(query)))

and raises

        params = self._build_query_params(headers_only, page_size)
>       for ts in self._client.list_time_series(**params):
E       TypeError: list_time_series() got an unexpected keyword argument 'aggregation'

.../site-packages/google/cloud/monitoring_v3/query.py:446: TypeError

Indeed, while query.py prepares an aggregation parameter with

params["aggregation"] = types.Aggregation(

the v2 list_time_series method accepts no such keyword argument (hence the TypeError above).

Workaround

Directly use MetricServiceClient.list_time_series, with the aggregation specification within the request parameter:

from google.cloud.monitoring_v3 import (
    Aggregation,
    ListTimeSeriesRequest,
    TimeInterval,
)

request = dict(
    name=project_name,
    filter=f'metric.type = "storage.googleapis.com/storage/request_count" AND resource.label.bucket_name = starts_with("{bucket_name_prefix}")',
    # _to_seconds_nanos is a user-defined helper that converts a datetime
    # into a {"seconds": ..., "nanos": ...} dict.
    interval=TimeInterval(dict(
        end_time=_to_seconds_nanos(end_time),
        start_time=_to_seconds_nanos(end_time - report_interval)
    )),
    view=ListTimeSeriesRequest.TimeSeriesView.FULL,
    aggregation=Aggregation(dict(
        per_series_aligner=Aggregation.Aligner.ALIGN_SUM,
        alignment_period=dict(seconds=3600)
    ))
)
print(next(iter(client.list_time_series(request=request))))

samples.snippets.v3.cloud-client.snippets_test: test_list_time_series_aggregate failed

This test failed!

To configure my behavior, see the Build Cop Bot documentation.

If I'm commenting on this issue too often, add the buildcop: quiet label and
I will stop commenting.


commit: 2976654
buildURL: Build Status, Sponge
status: failed

Test output
args = (time_series {
  metric {
    type: "custom.googleapis.com/my_metricc24f5eed-85fd-4ca9-a085-5d82aeaca9af"
  }
  resour...91041
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py37"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.34.0 gax/1.23.0 gapic/2.0.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7fa088041a10>
request = time_series {
metric {
type: "custom.googleapis.com/my_metricc24f5eed-85fd-4ca9-a085-5d82aeaca9af"
}
resourc...8591041
}
}
value {
double_value: 3.14
}
}
}
name: "projects/python-docs-samples-tests-py37"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.34.0 gax/1.23.0 gapic/2.0.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:


state = <grpc._channel._RPCState object at 0x7fa08a2c0f10>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fa07bba8f00>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1607079964.947410511","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1062,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture(scope="module")
def write_time_series():
    @backoff.on_exception(backoff.expo, InternalServerError, max_time=120)
    def write():
        snippets.write_time_series(PROJECT_ID)
  write()

snippets_test.py:50:


.nox/py-3-7/lib/python3.7/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:70: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1094: in create_time_series
request, retry=retry, timeout=timeout, metadata=metadata,
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"

???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]

:3: InternalServerError

samples.snippets.v3.cloud-client.snippets_test: test_list_time_series_reduce failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: 2348671
buildURL: Build Status, Sponge
status: failed

Test output
args = (time_series {
  metric {
    type: "custom.googleapis.com/my_metricf7fa7079-2c6a-469a-83a0-d719bd2003e0"
  }
  resour...62823
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py38"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.0 gax/1.28.0 gapic/2.2.1')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f8c62158190>
request = time_series {
metric {
type: "custom.googleapis.com/my_metricf7fa7079-2c6a-469a-83a0-d719bd2003e0"
}
resourc...9862823
}
}
value {
double_value: 3.14
}
}
}
name: "projects/python-docs-samples-tests-py38"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.0 gax/1.28.0 gapic/2.2.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7f8c62145d30>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f8c62183800>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1621681575.665381091","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture(scope="module")
def write_time_series():
    @backoff.on_exception(backoff.expo, InternalServerError, max_time=120)
    def write():
        snippets.write_time_series(PROJECT_ID)
  write()

snippets_test.py:50:


.nox/py-3-8/lib/python3.8/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:78: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1125: in create_time_series
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"

???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]

:3: InternalServerError

stackdriver 500 error when exporting timeseries to it

I'm using a Python 3.7 Cloud Function to read entries from a Pub/Sub topic and stream metrics to Stackdriver.

I use the google-cloud-monitoring 0.34.0 library to export metrics to Stackdriver.

Steps to reproduce

Send metrics to the exporters (in my case BigQuery and Stackdriver):

import logging
import time
from google.cloud import monitoring_v3
from exporters.base import Exporter

LOGGER = logging.getLogger(__name__)
DEFAULT_METRIC_TYPE = "custom.googleapis.com/test/error_budget_burn_rate"
DEFAULT_METRIC_DESCRIPTION = ("Speed at which the error budget for a given"
                              "aggregation window is consumed")


class StackdriverExporter(Exporter):
    """Stackdriver Monitoring exporter class."""

    def __init__(self):
        self.client = monitoring_v3.MetricServiceClient()

    def export(self, data, **config):
        """Export data to Stackdriver Monitoring.

        Args:
            data (dict): Data to send to Stackdriver Monitoring.
            config (dict): Stackdriver Monitoring metric config.
                project_id (str): Stackdriver host project id.
                custom_metric_type (str): Custom metric type.
                custom_metric_unit (str): Custom metric unit.

        Returns:
            object: Stackdriver Monitoring API result.
        """
        self.create_metric_descriptor(data, **config)
        self.create_timeseries(data, **config)

    def create_timeseries(self, data, **config):
        """Create Stackdriver Monitoring timeseries.

        Args:
            data (dict): Data to send to Stackdriver Monitoring.
            config (dict): Metric config.

        Returns:
            object: Metric descriptor.
        """
        metric_type = DEFAULT_METRIC_TYPE
        series = monitoring_v3.types.TimeSeries()
        series.metric.type = config.get('metric_type', metric_type)

        # Write timeseries metric labels.
        series.metric.labels['timestamp_human'] = str(data['timestamp_human'])
        series.metric.labels['error_budget_policy_step_name'] = str(
            data['error_budget_policy_step_name'])
        series.metric.labels['measurement_window_seconds'] = str(data['measurement_window_seconds'])
        series.metric.labels['cluster_name'] = str(data['cluster_name'])
        series.metric.labels['msg_vpn_name'] = str(data['msg_vpn_name'])
        series.metric.labels['service_name'] = str(data['service_name'])
        series.metric.labels['feature_name'] = str(data['feature_name'])
        series.metric.labels['slo_name'] = str(data['slo_name'])
        series.metric.labels['alerting_burn_rate_threshold'] = str(
            data['alerting_burn_rate_threshold'])

        # Use the generic resource 'global'.
        series.resource.type = 'global'
        series.resource.labels['project_id'] = str(config['project_id'])

        # Create a new data point.
        point = series.points.add()

        # Define the end-time timestamp. (Changed to the Cloud Function
        # timestamp so that all data is ingested.)
        timestamp = time.time()
        point.interval.end_time.seconds = int(timestamp)
        point.interval.end_time.nanos = int(
            (timestamp - point.interval.end_time.seconds) * 10**9)

        # Set the metric value.
        point.value.double_value = data['error_budget_burn_rate']

        # Record the timeseries to Stackdriver Monitoring.
        project = self.client.project_path(config['project_id'])
        result = self.client.create_time_series(project, [series])
        labels = series.metric.labels
        LOGGER.debug(
            f"timestamp: {timestamp} burnrate: {point.value.double_value} "
            f"{labels['service_name']}-{labels['feature_name']}-"
            f"{labels['slo_name']}-{labels['error_budget_policy_step_name']}")
        return result

    def create_metric_descriptor(self, data, **config):
        """Create Stackdriver Monitoring metric descriptor.

        Args:
            data (dict): Data to send to Stackdriver Monitoring (unused here).
            config (dict): Metric config.
        Returns:
            object: Metric descriptor.
        """
        metric_type = DEFAULT_METRIC_TYPE
        project = self.client.project_path(config['project_id'])
        descriptor = monitoring_v3.types.MetricDescriptor()
        descriptor.type = config.get('metric_type', metric_type)
        descriptor.metric_kind = (
            monitoring_v3.enums.MetricDescriptor.MetricKind.GAUGE)
        descriptor.value_type = (
            monitoring_v3.enums.MetricDescriptor.ValueType.DOUBLE)
        descriptor.description = config.get('metric_description',
                                            DEFAULT_METRIC_DESCRIPTION)
        self.client.create_metric_descriptor(project, descriptor)
        return descriptor

Stack trace

Traceback (most recent call last):
  File "/env/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/env/local/lib/python3.7/site-packages/grpc/_channel.py", line 826, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/env/local/lib/python3.7/site-packages/grpc/_channel.py", line 729, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.INTERNAL
	details = "One or more TimeSeries could not be written: An internal error occurred.: timeSeries[0]"
	debug_error_string = "{"created":"@1583919882.548713353","description":"Error received from peer ipv4:173.194.76.95:443","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"One or more TimeSeries could not be written: An internal error occurred.: timeSeries[0]","grpc_status":13}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 383, in run_background_function
    _function_handler.invoke_user_function(event_object)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 217, in invoke_user_function
    return call_user_function(request_or_event)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 214, in call_user_function
    event_context.Context(**request_or_event.context))
  File "/user_code/main.py", line 35, in main
    streamer(json.loads(base64.b64decode(event['data']).decode('utf-8')), slo_config)
  File "/user_code/streamer.py", line 29, in streamer
    export(data, exporters)
  File "/user_code/streamer.py", line 55, in export
    ret = exporter.export(data, **config)
  File "/user_code/exporters/stackdriver.py", line 49, in export
    self.create_timeseries(data, **config)
  File "/user_code/exporters/stackdriver.py", line 96, in create_timeseries
    result = self.client.create_time_series(project, [series])
  File "/env/local/lib/python3.7/site-packages/google/cloud/monitoring_v3/gapic/metric_service_client.py", line 1039, in create_time_series
    request, retry=retry, timeout=timeout, metadata=metadata
  File "/env/local/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/env/local/lib/python3.7/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "/env/local/lib/python3.7/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "/env/local/lib/python3.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/env/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: An internal error occurred.: timeSeries[0]

After various tests, removing the metric descriptor creation seems to correct the issue!
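A workaround, if the descriptor creation must stay: the INTERNAL error appears transiently right after a descriptor is (re)created, so the write can simply be retried with exponential backoff. The sketch below shows the retry shape only; the helper name is illustrative, and in real code `retriable` would be `google.api_core.exceptions.InternalServerError` wrapped around `client.create_time_series` (`RuntimeError` is used here just to keep the sketch dependency-free).

```python
import time


def call_with_backoff(func, retriable=(RuntimeError,), max_attempts=5, base_delay=1.0):
    """Call func(), retrying on retriable exceptions with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return func()
        except retriable:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the last error
            time.sleep(base_delay * (2 ** attempt))
```

In the exporter this would wrap the write, e.g. `call_with_backoff(lambda: self.client.create_time_series(project, [series]))`.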

samples.snippets.v3.cloud-client.snippets_test: test_list_time_series_header failed

This test failed!

To configure my behavior, see the Build Cop Bot documentation.

If I'm commenting on this issue too often, add the buildcop: quiet label and
I will stop commenting.


commit: 2976654
buildURL: Build Status, Sponge
status: failed

Test output
args = (time_series {
  metric {
    type: "custom.googleapis.com/my_metricc24f5eed-85fd-4ca9-a085-5d82aeaca9af"
  }
  resour...91041
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py37"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.34.0 gax/1.23.0 gapic/2.0.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7fa088041a10>
request = time_series {
metric {
type: "custom.googleapis.com/my_metricc24f5eed-85fd-4ca9-a085-5d82aeaca9af"
}
resourc...8591041
}
}
value {
double_value: 3.14
}
}
}
name: "projects/python-docs-samples-tests-py37"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.34.0 gax/1.23.0 gapic/2.0.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:


state = <grpc._channel._RPCState object at 0x7fa08a2c0f10>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fa07bba8f00>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1607079964.947410511","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1062,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture(scope="module")
def write_time_series():
    @backoff.on_exception(backoff.expo, InternalServerError, max_time=120)
    def write():
        snippets.write_time_series(PROJECT_ID)
  write()

snippets_test.py:50:


.nox/py-3-7/lib/python3.7/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:70: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1094: in create_time_series
request, retry=retry, timeout=timeout, metadata=metadata,
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"

???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]

:3: InternalServerError

samples.snippets.v3.cloud-client.snippets_test: test_list_metric_descriptors failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: c903285
buildURL: Build Status, Sponge
status: failed

Test output
args = (name: "projects/python-docs-samples-tests-py38"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7fc4faefcbe0>
request = name: "projects/python-docs-samples-tests-py38"
, timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7fc4faeb8eb0>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fc4faf08280>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174412.869136579","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

capsys = <_pytest.capture.CaptureFixture object at 0x7fc4faf48b80>

def test_list_metric_descriptors(capsys):
  snippets.list_metric_descriptors(PROJECT_ID)

snippets_test.py:71:


snippets.py:213: in list_metric_descriptors
for descriptor in client.list_metric_descriptors(name=project_name):
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:636: in list_metric_descriptors
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func
return retry_target(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target
return target()
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Request had invalid a...entication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"

???
E google.api_core.exceptions.Unauthenticated: 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

:3: Unauthenticated

samples.snippets.v3.alerts-client.snippets_test: test_list_alert_policies failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: e6063de
buildURL: Build Status, Sponge
status: failed

Test output
capsys = <_pytest.capture.CaptureFixture object at 0x7f0447844b80>
pochan = 
def test_list_alert_policies(capsys, pochan):
    snippets.list_alert_policies(pochan.project_name)
    out, _ = capsys.readouterr()
  assert pochan.alert_policy.display_name in out

E assert 'snippets-test-enclcibdnr' in 'name display_name\n---------------------------...est-iejfpcqoox\nprojects/python-docs-samples-tests-py38/alertPolicies/6723190430256917261 snippets-test-ndmxzmxehz\n'
E + where 'snippets-test-enclcibdnr' = name: "projects/python-docs-samples-tests-py38/alertPolicies/10500225107291066435"\ndisplay_name: "snippets-test-enclci...jects/python-docs-samples-tests-py38/alertPolicies/10500225107291066435/conditions/10500225107291068640"\n}\nenabled {\n}\n.display_name
E + where name: "projects/python-docs-samples-tests-py38/alertPolicies/10500225107291066435"\ndisplay_name: "snippets-test-enclci...jects/python-docs-samples-tests-py38/alertPolicies/10500225107291066435/conditions/10500225107291068640"\n}\nenabled {\n}\n = <snippets_test.PochanFixture object at 0x7f0449cbc8e0>.alert_policy

snippets_test.py:128: AssertionError

Adding labels to the metric descriptor in snippets.py

Thanks for stopping by to let us know something could be better!

PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.

Is your feature request related to a problem? Please describe.
Currently the snippet doesn't show how to add labels to the custom metric, unlike the Java SDK snippet.

Describe the solution you'd like
Add a test custom label to the metric in the create_metric_descriptor function
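For reference, the REST representation of a `MetricDescriptor` carrying a custom label might look like the fragment below (the metric type, label key, and descriptions are illustrative, not from the existing snippet):

```json
{
  "type": "custom.googleapis.com/my_metric",
  "metricKind": "GAUGE",
  "valueType": "DOUBLE",
  "description": "An example custom metric.",
  "labels": [
    {
      "key": "environment",
      "valueType": "STRING",
      "description": "Deployment environment, e.g. prod or staging"
    }
  ]
}
```

Time series written against this descriptor would then set `series.metric.labels['environment']` on each write.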

google.api_core.exceptions.InvalidArgument: 400 One or more TimeSeries could not be written: Points must be written in order. One or more of the points specified had an older end time than the most recent point.: timeSeries[0]

These are the points, in the order I am pushing them to Google Cloud Monitoring:

  1. points {
    interval {
    end_time {
    seconds: 1602230436
    }
    }

  2. points {
    interval {
    end_time {
    seconds: 1602230437
    }
    }

  3. points {
    interval {
    end_time {
    seconds: 1602230438
    }
    }

  4. points {
    interval {
    end_time {
    seconds: 1602230438
    }
    }

  5. points {
    interval {
    end_time {
    seconds: 1602230439
    }
    }

  6. points {
    interval {
    end_time {
    seconds: 1602230487
    }
    }

  7. points {
    interval {
    end_time {
    seconds: 1602230498
    }
    }

Is it because I am pushing two points with the same end time?
Please suggest a solution as well.
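Yes: points 3 and 4 above share end time 1602230438, and within one time series each point's end time must be strictly later than the previously written point. One way to avoid the 400 is to filter the queue before writing so end times are strictly increasing. A minimal sketch, assuming points are represented here simply as epoch-second end times (the helper name and representation are illustrative, not part of the client API):

```python
def strictly_increasing(end_times):
    """Keep only points whose end time is strictly later than the last kept one."""
    kept, last = [], None
    for ts in end_times:
        if last is not None and ts <= last:
            continue  # duplicate or older point: skip it (or buffer it to write later)
        kept.append(ts)
        last = ts
    return kept
```

Alternatively, a duplicate point could have its end time bumped by at least the metric's finest resolution instead of being dropped, if every sample must be preserved.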

samples.snippets.v3.cloud-client.snippets_test: test_list_time_series_reduce failed

This test failed!

To configure my behavior, see the Build Cop Bot documentation.

If I'm commenting on this issue too often, add the buildcop: quiet label and
I will stop commenting.


commit: 2976654
buildURL: Build Status, Sponge
status: failed

Test output
args = (time_series {
  metric {
    type: "custom.googleapis.com/my_metricc24f5eed-85fd-4ca9-a085-5d82aeaca9af"
  }
  resour...91041
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py37"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.34.0 gax/1.23.0 gapic/2.0.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7fa088041a10>
request = time_series {
metric {
type: "custom.googleapis.com/my_metricc24f5eed-85fd-4ca9-a085-5d82aeaca9af"
}
resourc...8591041
}
}
value {
double_value: 3.14
}
}
}
name: "projects/python-docs-samples-tests-py37"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.34.0 gax/1.23.0 gapic/2.0.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:


state = <grpc._channel._RPCState object at 0x7fa08a2c0f10>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fa07bba8f00>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1607079964.947410511","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1062,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture(scope="module")
def write_time_series():
    @backoff.on_exception(backoff.expo, InternalServerError, max_time=120)
    def write():
        snippets.write_time_series(PROJECT_ID)
  write()

snippets_test.py:50:


.nox/py-3-7/lib/python3.7/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:70: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1094: in create_time_series
request, retry=retry, timeout=timeout, metadata=metadata,
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"

???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]

:3: InternalServerError

samples.snippets.v3.cloud-client.snippets_test: test_get_delete_metric_descriptor failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: c903285
buildURL: Build Status, Sponge
status: failed

Test output
args = (name: "projects/python-docs-samples-tests-py37/metricDescriptors/custom.googleapis.com/my_metric49d4465d-ab8a-4ce7-bc35-22c3a902d2ee"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37/metricDescriptors/custom.googlea...c49d4465d-ab8a-4ce7-bc35-22c3a902d2ee'), ('x-goog-api-client', 'gl-python/3.7.10 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f6085c2ee10>
request = name: "projects/python-docs-samples-tests-py37/metricDescriptors/custom.googleapis.com/my_metric49d4465d-ab8a-4ce7-bc35-22c3a902d2ee"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37/metricDescriptors/custom.googleapis.com/my_metric49d4465d-ab8a-4ce7-bc35-22c3a902d2ee'), ('x-goog-api-client', 'gl-python/3.7.10 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7f60843c4510>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f6085c365f0>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626170607.662749462","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

capsys = <_pytest.capture.CaptureFixture object at 0x7f6085d63150>
custom_metric_descriptor = 'projects/python-docs-samples-tests-py37/metricDescriptors/custom.googleapis.com/my_metric49d4465d-ab8a-4ce7-bc35-22c3a902d2ee'

def test_get_delete_metric_descriptor(capsys, custom_metric_descriptor):
    try:

        @backoff.on_exception(backoff.expo, (AssertionError, NotFound), max_time=60)
        def eventually_consistent_test():
            snippets.get_metric_descriptor(custom_metric_descriptor)
            out, _ = capsys.readouterr()
            assert "DOUBLE" in out

        eventually_consistent_test()
    finally:
      snippets.delete_metric_descriptor(custom_metric_descriptor)

snippets_test.py:65:


snippets.py:55: in delete_metric_descriptor
client.delete_metric_descriptor(name=descriptor_name)
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:889: in delete_metric_descriptor
request, retry=retry, timeout=timeout, metadata=metadata,
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/retry.py:290: in retry_wrapped_func
on_error=on_error,
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/retry.py:188: in retry_target
return target()
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Request had invalid a...entication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"

???
E google.api_core.exceptions.Unauthenticated: 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

:3: Unauthenticated

samples.snippets.v3.cloud-client.snippets_test: test_list_time_series_header failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: 2348671
buildURL: Build Status, Sponge
status: failed

Test output
args = (time_series {
  metric {
    type: "custom.googleapis.com/my_metricf7fa7079-2c6a-469a-83a0-d719bd2003e0"
  }
  resour...62823
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py38"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.0 gax/1.28.0 gapic/2.2.1')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f8c62158190>
request = time_series {
metric {
type: "custom.googleapis.com/my_metricf7fa7079-2c6a-469a-83a0-d719bd2003e0"
}
resourc...9862823
}
}
value {
double_value: 3.14
}
}
}
name: "projects/python-docs-samples-tests-py38"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.0 gax/1.28.0 gapic/2.2.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7f8c62145d30>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f8c62183800>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1621681575.665381091","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture(scope="module")
def write_time_series():
    @backoff.on_exception(backoff.expo, InternalServerError, max_time=120)
    def write():
        snippets.write_time_series(PROJECT_ID)
  write()

snippets_test.py:50:


.nox/py-3-8/lib/python3.8/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:78: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1125: in create_time_series
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"

???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]

:3: InternalServerError

Synthesis failed for python-monitoring

Hello! Autosynth couldn't regenerate python-monitoring. 💔

Here's the output from running synth.py:

Igoogle/api/label.proto=google/api/label.proto -Igoogle/api/launch_stage.proto=google/api/launch_stage.proto -Igoogle/api/metric.proto=google/api/metric.proto -Igoogle/protobuf/duration.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/duration_proto/google/protobuf/duration.proto -Igoogle/api/monitored_resource.proto=google/api/monitored_resource.proto -Igoogle/protobuf/struct.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/struct_proto/google/protobuf/struct.proto -Igoogle/api/resource.proto=google/api/resource.proto -Igoogle/rpc/status.proto=google/rpc/status.proto -Igoogle/type/calendar_period.proto=google/type/calendar_period.proto -Igoogle/protobuf/empty.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/empty_proto/google/protobuf/empty.proto -Igoogle/protobuf/field_mask.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/field_mask_proto/google/protobuf/field_mask.proto -Igoogle/protobuf/wrappers.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/wrappers_proto/google/protobuf/wrappers.proto google/monitoring/v3/alert.proto google/monitoring/v3/alert_service.proto google/monitoring/v3/common.proto google/monitoring/v3/dropped_labels.proto google/monitoring/v3/group.proto google/monitoring/v3/group_service.proto google/monitoring/v3/metric.proto google/monitoring/v3/metric_service.proto google/monitoring/v3/mutation_record.proto google/monitoring/v3/notification.proto google/monitoring/v3/notification_service.proto google/monitoring/v3/service.proto google/monitoring/v3/service_service.proto google/monitoring/v3/span_context.proto google/monitoring/v3/uptime.proto google/monitoring/v3/uptime_service.proto` failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional 
'--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 54 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
google/monitoring/v3/metric.proto:24:1: warning: Import google/protobuf/duration.proto is unused.
google/monitoring/v3/metric.proto:19:1: warning: Import google/api/distribution.proto is unused.
google/monitoring/v3/metric_service.proto:28:1: warning: Import google/protobuf/duration.proto is unused.
google/monitoring/v3/metric_service.proto:25:1: warning: Import google/monitoring/v3/alert.proto is unused.
google/monitoring/v3/notification_service.proto:26:1: warning: Import google/protobuf/struct.proto is unused.
google/monitoring/v3/service.proto:22:1: warning: Import google/protobuf/timestamp.proto is unused.
google/monitoring/v3/service.proto:19:1: warning: Import google/api/monitored_resource.proto is unused.
google/monitoring/v3/uptime_service.proto:24:1: warning: Import google/protobuf/duration.proto is unused.
Traceback (most recent call last):
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
    from gapic.cli import generate
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
    from gapic import generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
    from .generator import Generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
    from gapic.samplegen import manifest, samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
    from gapic.samplegen import samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
    from gapic.schema import wrappers
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
    from gapic.schema.api import API
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
    from google.api_core import exceptions  # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/monitoring/v3:monitoring-v3-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 1.080s, Critical Path: 0.85s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully

Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-monitoring/synth.py", line 38, in <module>
    proto_output_path="google/cloud/monitoring_v3/proto"
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 197, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2021-01-28 05:44:04,304 autosynth [ERROR] > Synthesis failed
2021-01-28 05:44:04,304 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 8a7e2f7 chore: reorder classes  (#73)
2021-01-28 05:44:04,309 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-28 05:44:04,315 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

Add "not equal" support to the query filter.

Is your feature request related to a problem? Please describe.
Right now we cannot filter a query by something like metric.label.response_code != 200

Describe the solution you'd like
Add "not equal" support to the query filter.

Describe alternatives you've considered
I can't find an alternative to this.
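For context, the raw Monitoring API filter language does accept `!=` comparisons; it is only the `query.Query` builder that limits filters to equality. A hedged sketch of a raw filter string that could be passed as the `filter` argument of `list_time_series` (the metric type and label here are illustrative, not taken from this issue):

```python
# Illustrative only: metric type and label are example values.
# The Monitoring time-series filter language itself accepts !=.
metric_type = "loadbalancing.googleapis.com/https/request_count"
filter_ = (
    f'metric.type = "{metric_type}" '
    'AND metric.labels.response_code != 200'
)
print(filter_)
```
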

Realistic examples (aggregation)

Hey y'all,

I'm having a hard time translating metrics queries into Python API usage. Can y'all make any recommendations?

For example, I have this query which reports the oldest unack'd message age in a PubSub subscription, sampled every minute.

fetch pubsub_subscription
| metric 'pubsub.googleapis.com/subscription/oldest_unacked_message_age'
| filter
    resource.project_id == '443684903852'
    && (resource.subscription_id == 'octoprint-events-webapp-prod')
| group_by 1m,
    [value_oldest_unacked_message_age_mean:
       max(value.oldest_unacked_message_age)]
| every 1m

When I try to compose queries with the Python API, the resulting pandas DataFrame is always empty. Are there any examples of how I'm supposed to use this library?

Here's an example:

import logging
from google.cloud import monitoring_v3
from google.cloud.monitoring_v3 import query  # query.Query is used below but was missing from the original snippet
from django.conf import settings
logger = logging.getLogger(__name__)
client = monitoring_v3.MetricServiceClient()
result = query.Query(
    client,
    settings.GCP_PROJECT_ID,
    'pubsub.googleapis.com/subscription/oldest_unacked_message_age', 
    minutes=10
).as_dataframe()
logger.info(f'***** {result}')
django             | INFO 2021-01-10 14:57:26,916 healthcheck 2714 139732496103168 ***** Empty DataFrame
django             | Columns: []
django             | Index: []
django             | INFO:     

Thanks!
Leigh
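One thing worth checking: `query.Query` applies no alignment by default, so the MQL `| group_by 1m, [... max(...)]` step has to be reproduced with the builder's `.align(...)` (plus `.select_resources(...)` for the filter) before `.as_dataframe()` returns rows. As a library-free illustration of what per-series alignment with a 1-minute `max` does (a plain-Python sketch, not the client API):

```python
from collections import defaultdict

def align_max(points, period=60):
    """Bucket (unix_ts, value) samples into `period`-second windows and keep
    the max of each window -- the shape of MQL's `group_by 1m, max(...)`."""
    buckets = defaultdict(list)
    for ts, value in points:
        buckets[ts - ts % period].append(value)
    return {window: max(values) for window, values in sorted(buckets.items())}

samples = [(0, 1.0), (30, 4.0), (61, 2.0), (119, 5.0), (120, 3.0)]
print(align_max(samples))  # {0: 4.0, 60: 5.0, 120: 3.0}
```
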

missing value_type for STRING in TimeSeriesDescriptor for QueryTimeSeriesPager

Environment details

  • OS type and version: debian 10.9
  • Python version: Python 3.7.3
  • pip version: pip 21.1.2
  • google-cloud-monitoring version: 2.2.1
project_name = 'XXXXXXXXXX'
client = QueryServiceClient(credentials=scoped_credential)
metric_name = "https_lb_rule::loadbalancing.googleapis.com/https/backend_latencies"
metric_key_name = "backend_latencies"
gcp_project_name = "projects/{}".format(project_name)
group_by = [
    "metric.cache_result",
    "metric.proxy_continent",
    "metric.response_code_class",
    "metric.response_code", 
    "metric.protocol", 
    "resource.backend_name", 
    "resource.backend_type", 
    "resource.backend_scope",
    "resource.backend_scope_type",
    "resource.backend_target_type",
    "resource.backend_target_name",
    "resource.forwarding_rule_name",
    "resource.matched_url_path_rule",
    "resource.target_proxy_name",
    "resource.url_map_name",
    "resource.region", 
    "resource.project_id"
]

query = '''fetch {}
| within 15m
| every 1m
| group_by
    [ {} ]'''.format(metric_name, ','.join(group_by))
metric_query = metric_service.QueryTimeSeriesRequest(name=gcp_project_name, query=query)
results = client.query_time_series(metric_query)

google.cloud.monitoring_v3.types.metric.TimeSeriesDescriptor does not report a value_type for STRING labels:

>>> results.time_series_descriptor.label_descriptors
[key: "metric.cache_result"
, key: "metric.proxy_continent"
, key: "metric.response_code_class"
value_type: INT64
, key: "metric.response_code"
value_type: INT64
, key: "metric.protocol"
, key: "resource.backend_name"
, key: "resource.backend_type"
, key: "resource.backend_scope"
, key: "resource.backend_scope_type"
, key: "resource.backend_target_type"
, key: "resource.backend_target_name"
, key: "resource.forwarding_rule_name"
, key: "resource.matched_url_path_rule"
, key: "resource.target_proxy_name"
, key: "resource.url_map_name"
, key: "resource.region"
, key: "resource.project_id"
]
>>> results.time_series_descriptor.label_descriptors[0].value_type
0
>>> from google.api import metric_pb2 as ga_metric
>>> ga_metric.MetricDescriptor.STRING
4
>>> 
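A likely explanation (worth verifying against the proto definitions): `label_descriptors` are `google.api.LabelDescriptor` messages, whose `ValueType` enum is not the same as `MetricDescriptor.ValueType`. In `LabelDescriptor.ValueType`, `STRING` is 0, and proto3 omits zero-valued enum fields when printing, so the "missing" value_type entries are in fact STRING. The values below are mirrored as plain dicts from google/api/label.proto and google/api/metric.proto, so treat them as an assumption to double-check:

```python
# Enum values copied from the proto definitions, not from this library's
# Python surface -- an assumption to verify against the .proto files.
LABEL_VALUE_TYPE = {"STRING": 0, "BOOL": 1, "INT64": 2}  # google/api/label.proto
METRIC_VALUE_TYPE = {                                    # google/api/metric.proto
    "VALUE_TYPE_UNSPECIFIED": 0, "BOOL": 1, "INT64": 2,
    "DOUBLE": 3, "STRING": 4, "DISTRIBUTION": 5, "MONEY": 6,
}

# The REPL session above compared a label's value_type (0) against
# MetricDescriptor.STRING (4); in the label enum, 0 already means STRING.
print(LABEL_VALUE_TYPE["STRING"], METRIC_VALUE_TYPE["STRING"])  # 0 4
```
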

MetricServiceClient returns different results for fixed interval

Environment details

  • OS type and version: macOS 10.15.6
  • Python version: 3.8.5
  • pip version: 20.2.2
  • google-cloud-monitoring version: 1.1.0

Steps to reproduce

I see different results for the same query even though all parameter values are fixed, which I can't currently explain. Any idea why this can happen? Is this expected?

  1. Execute the provided example function multiple times
  2. You will see different results even though the query uses interval with fixed start/end time

Code example

from google.cloud import monitoring_v3
import time

def list_time_series_aggregate(project_id):
    # [START monitoring_read_timeseries_align]
    client = monitoring_v3.MetricServiceClient()
    project_name = client.project_path(project_id)
    interval = monitoring_v3.types.TimeInterval()
    # fixed point in time, offset 7 days
    interval.end_time.seconds = 1607634588
    interval.end_time.nanos = 836217000
    interval.start_time.seconds = 1607029788
    interval.start_time.nanos = 836217000
    aggregation = monitoring_v3.types.Aggregation()
    aggregation.alignment_period.seconds = 604800 # 7 days

    aggregation.per_series_aligner = (
        monitoring_v3.enums.Aggregation.Aligner.ALIGN_SUM)
    aggregation.cross_series_reducer = (
        monitoring_v3.enums.Aggregation.Reducer.REDUCE_SUM)

    results = client.list_time_series(
        project_name,
        'metric.type="compute.googleapis.com/instance/cpu/utilization"',
        interval,
        monitoring_v3.enums.ListTimeSeriesRequest.TimeSeriesView.FULL,
        aggregation)

    print(list(results)[0].points[0].value.double_value)

Thx Sven
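To double-check that the interval really is fixed (and spans exactly one alignment period), the raw epoch values can be converted to UTC timestamps with the standard library alone; this sketch is independent of the client library:

```python
from datetime import datetime, timezone

# Epoch seconds taken from the snippet above (nanos ignored here).
start = datetime.fromtimestamp(1607029788, tz=timezone.utc)
end = datetime.fromtimestamp(1607634588, tz=timezone.utc)

print(start.isoformat(), "->", end.isoformat())
print("span:", end - start)  # exactly 7 days, matching the 604800s alignment_period
```
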

samples.snippets.v3.cloud-client.snippets_test: test_list_time_series_aggregate failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: 2348671
buildURL: Build Status, Sponge
status: failed

Test output
args = (time_series {
  metric {
    type: "custom.googleapis.com/my_metricf7fa7079-2c6a-469a-83a0-d719bd2003e0"
  }
  resour...62823
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py38"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.0 gax/1.28.0 gapic/2.2.1')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f8c62158190>
request = time_series {
metric {
type: "custom.googleapis.com/my_metricf7fa7079-2c6a-469a-83a0-d719bd2003e0"
}
resourc...9862823
}
}
value {
double_value: 3.14
}
}
}
name: "projects/python-docs-samples-tests-py38"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.0 gax/1.28.0 gapic/2.2.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7f8c62145d30>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f8c62183800>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1621681575.665381091","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture(scope="module")
def write_time_series():
    @backoff.on_exception(backoff.expo, InternalServerError, max_time=120)
    def write():
        snippets.write_time_series(PROJECT_ID)
  write()

snippets_test.py:50:


.nox/py-3-8/lib/python3.8/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:78: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1125: in create_time_series
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"

???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]

:3: InternalServerError

How to obtain version?


Environment details

  • OS type and version: Linux
  • Python version: 3.7.10
  • pip version: 21.1.1
  • google-cloud-monitoring version: 2.2.1
pip show google-cloud-monitoring
Name: google-cloud-monitoring
Version: 2.2.1
Summary: Stackdriver Monitoring API client library
Home-page: https://github.com/googleapis/python-monitoring
Author: Google LLC
Author-email: [email protected]
License: Apache 2.0
Location: /opt/conda/lib/python3.7/site-packages
Requires: proto-plus, google-api-core
Required-by: 

Steps to reproduce

  1. Version differences cause issues: https://stackoverflow.com/questions/64337089/attributeerror-module-google-cloud-monitoring-v3-types-has-no-attribute-metr
  2. Need to get version from library at runtime

Code example

>>> from google.cloud import monitoring_v3
>>> monitoring_v3.VERSION
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: module 'google.cloud.monitoring_v3' has no attribute 'VERSION'
>>> monitoring_v3.__version__
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: module 'google.cloud.monitoring_v3' has no attribute '__version__'

Stack trace

N/A
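Until the package exposes a version attribute, one runtime workaround is to ask the installed distribution itself. A hedged sketch using the standard library (`dist_version` and its fallback default are our own convention, not part of this library):

```python
try:
    from importlib.metadata import PackageNotFoundError, version  # Python 3.8+
except ImportError:  # Python 3.7 needs the importlib_metadata backport
    from importlib_metadata import PackageNotFoundError, version

def dist_version(dist_name, default="unknown"):
    """Return the installed version of `dist_name`, or `default` if absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return default

print(dist_version("google-cloud-monitoring"))
```
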


Thanks!

Synthesis failed for python-monitoring

Hello! Autosynth couldn't regenerate python-monitoring. 💔

Here's the output from running synth.py:

Igoogle/api/label.proto=google/api/label.proto -Igoogle/api/launch_stage.proto=google/api/launch_stage.proto -Igoogle/api/metric.proto=google/api/metric.proto -Igoogle/protobuf/duration.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/duration_proto/google/protobuf/duration.proto -Igoogle/api/monitored_resource.proto=google/api/monitored_resource.proto -Igoogle/protobuf/struct.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/struct_proto/google/protobuf/struct.proto -Igoogle/api/resource.proto=google/api/resource.proto -Igoogle/rpc/status.proto=google/rpc/status.proto -Igoogle/type/calendar_period.proto=google/type/calendar_period.proto -Igoogle/protobuf/empty.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/empty_proto/google/protobuf/empty.proto -Igoogle/protobuf/field_mask.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/field_mask_proto/google/protobuf/field_mask.proto -Igoogle/protobuf/wrappers.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/wrappers_proto/google/protobuf/wrappers.proto google/monitoring/v3/alert.proto google/monitoring/v3/alert_service.proto google/monitoring/v3/common.proto google/monitoring/v3/dropped_labels.proto google/monitoring/v3/group.proto google/monitoring/v3/group_service.proto google/monitoring/v3/metric.proto google/monitoring/v3/metric_service.proto google/monitoring/v3/mutation_record.proto google/monitoring/v3/notification.proto google/monitoring/v3/notification_service.proto google/monitoring/v3/service.proto google/monitoring/v3/service_service.proto google/monitoring/v3/span_context.proto google/monitoring/v3/uptime.proto google/monitoring/v3/uptime_service.proto` failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional 
'--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 54 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
google/monitoring/v3/metric.proto:24:1: warning: Import google/protobuf/duration.proto is unused.
google/monitoring/v3/metric.proto:19:1: warning: Import google/api/distribution.proto is unused.
google/monitoring/v3/metric_service.proto:28:1: warning: Import google/protobuf/duration.proto is unused.
google/monitoring/v3/metric_service.proto:25:1: warning: Import google/monitoring/v3/alert.proto is unused.
google/monitoring/v3/notification_service.proto:26:1: warning: Import google/protobuf/struct.proto is unused.
google/monitoring/v3/service.proto:22:1: warning: Import google/protobuf/timestamp.proto is unused.
google/monitoring/v3/service.proto:19:1: warning: Import google/api/monitored_resource.proto is unused.
google/monitoring/v3/uptime_service.proto:24:1: warning: Import google/protobuf/duration.proto is unused.
Traceback (most recent call last):
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
    from gapic.cli import generate
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
    from gapic import generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
    from .generator import Generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
    from gapic.samplegen import manifest, samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
    from gapic.samplegen import samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
    from gapic.schema import wrappers
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
    from gapic.schema.api import API
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
    from google.api_core import exceptions  # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/monitoring/v3:monitoring-v3-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 1.092s, Critical Path: 0.85s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully

Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-monitoring/synth.py", line 38, in <module>
    proto_output_path="google/cloud/monitoring_v3/proto"
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 193, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2021-01-21 05:43:50,584 autosynth [ERROR] > Synthesis failed
2021-01-21 05:43:50,584 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 8a7e2f7 chore: reorder classes  (#73)
2021-01-21 05:43:50,591 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-21 05:43:50,597 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

can't import monitoring_v3 in python 3.9.6

A nasty error occurs with the conda-forge channel: the build pulled the latest Python version (3.9.6) along with

google-api-core 1.31.0
google-auth 1.33.0
google-cloud-bigquery 2.21.0
google-cloud-bigquery-storage 2.2.1
google-cloud-bigtable 2.3.1
google-cloud-core 1.7.1
google-cloud-monitoring 2.4.0
google-crc32c 1.1.2
google-resumable-media 1.3.1
googleapis-common-protos 1.53.0
grpc-google-iam-v1 0.12.3

Then the import throws

from google.cloud import monitoring_v3

Traceback (most recent call last):
  from google.cloud import monitoring_v3
  File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/__init__.py", line 17, in <module>
    from .services.alert_policy_service import AlertPolicyServiceClient
  File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/__init__.py", line 16, in <module>
    from .client import AlertPolicyServiceClient
  File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/client.py", line 33, in <module>
    from google.cloud.monitoring_v3.services.alert_policy_service import pagers
  File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/pagers.py", line 27, in <module>
    from google.cloud.monitoring_v3.types import alert
  File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/types/__init__.py", line 16, in <module>
    from .alert import AlertPolicy
  File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/types/alert.py", line 18, in <module>
    from google.cloud.monitoring_v3.types import common
  File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/types/common.py", line 48, in <module>
    class ServiceTier(proto.Enum):
  File "/opt/conda/envs/env/lib/python3.9/site-packages/proto/enums.py", line 72, in __new__
    cls = super().__new__(mcls, name, bases, attrs)
  File "/opt/conda/envs/env/lib/python3.9/enum.py", line 288, in __new__
    enum_member = new(enum_class, *args)
TypeError: int() argument must be a string, a bytes-like object or a number, not 'dict'

It's easy to fix by pinning python=3.7.10, in case anyone is wondering.
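Before pinning, it can help to confirm whether a given environment is affected. The diagnostic below simply attempts the failing import and prints the versions involved; it is a sketch, assuming nothing beyond the standard library plus whatever is (or isn't) installed:

```python
import importlib
import sys


def diagnose():
    """Attempt the failing import and summarize the environment."""
    print("Python:", sys.version.split()[0])
    for pkg in ("proto", "google.cloud.monitoring_v3"):
        try:
            mod = importlib.import_module(pkg)
            print(pkg, "OK", getattr(mod, "__version__", "(no __version__)"))
        except Exception as exc:
            # On affected environments this is the TypeError shown above.
            print(pkg, "FAILED:", type(exc).__name__, exc)


if __name__ == "__main__":
    diagnose()
```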

GA Release

Package name: google-cloud-monitoring
Current release: alpha
Proposed release: GA

Instructions

Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.

Required

  • 28 days elapsed since last alpha/beta release with new API surface
  • Server API is GA
  • Package API is stable, and we can commit to backward compatibility
  • All dependencies are GA

Optional

  • Most common / important scenarios have descriptive samples
  • Public manual methods have at least one usage sample each (excluding overloads)
  • Per-API README includes a full description of the API
  • Per-API README contains at least one "getting started" sample using the most common API scenario
  • Manual code has been reviewed by API producer
  • Manual code has been reviewed by a DPE responsible for samples
  • 'Client Libraries' page is added to the 'APIs & Reference' section of the product's documentation on Cloud Site

Synthesis failed for python-monitoring

Hello! Autosynth couldn't regenerate python-monitoring. 💔

Here's the output from running synth.py:

ope_struct_54__handle_cancellation_from_core.tp_print = 0;
                                                                                    ^~~~~~~~
                                                                                    tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132284:72: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
   __pyx_type_7_cython_6cygrpc___pyx_scope_struct_55__schedule_rpc_coro.tp_print = 0;
                                                                        ^~~~~~~~
                                                                        tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132290:65: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
   __pyx_type_7_cython_6cygrpc___pyx_scope_struct_56__handle_rpc.tp_print = 0;
                                                                 ^~~~~~~~
                                                                 tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132296:67: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
   __pyx_type_7_cython_6cygrpc___pyx_scope_struct_57__request_call.tp_print = 0;
                                                                   ^~~~~~~~
                                                                   tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132302:71: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
   __pyx_type_7_cython_6cygrpc___pyx_scope_struct_58__server_main_loop.tp_print = 0;
                                                                       ^~~~~~~~
                                                                       tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132308:59: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
   __pyx_type_7_cython_6cygrpc___pyx_scope_struct_59_start.tp_print = 0;
                                                           ^~~~~~~~
                                                           tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132314:74: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
   __pyx_type_7_cython_6cygrpc___pyx_scope_struct_60__start_shutting_down.tp_print = 0;
                                                                          ^~~~~~~~
                                                                          tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132320:62: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
   __pyx_type_7_cython_6cygrpc___pyx_scope_struct_61_shutdown.tp_print = 0;
                                                              ^~~~~~~~
                                                              tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132326:74: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
   __pyx_type_7_cython_6cygrpc___pyx_scope_struct_62_wait_for_termination.tp_print = 0;
                                                                          ^~~~~~~~
                                                                          tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: In function 'PyObject* __Pyx_decode_c_bytes(const char*, Py_ssize_t, Py_ssize_t, Py_ssize_t, const char*, const char*, PyObject* (*)(const char*, Py_ssize_t, const char*))':
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:136866:45: warning: 'PyObject* PyUnicode_FromUnicode(const Py_UNICODE*, Py_ssize_t)' is deprecated [-Wdeprecated-declarations]
         return PyUnicode_FromUnicode(NULL, 0);
                                             ^
In file included from bazel-out/host/bin/external/local_config_python/_python3/_python3_include/unicodeobject.h:1026:0,
                 from bazel-out/host/bin/external/local_config_python/_python3/_python3_include/Python.h:97,
                 from bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:4:
bazel-out/host/bin/external/local_config_python/_python3/_python3_include/cpython/unicodeobject.h:551:42: note: declared here
 Py_DEPRECATED(3.3) PyAPI_FUNC(PyObject*) PyUnicode_FromUnicode(
                                          ^~~~~~~~~~~~~~~~~~~~~
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: In function 'void __pyx_f_7_cython_6cygrpc__unified_socket_write(int)':
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:72692:3: warning: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Wunused-result]
   (void)(write(__pyx_v_fd, ((char *)"1"), 1));
   ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: At global scope:
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:144607:1: warning: 'void __Pyx_PyAsyncGen_Fini()' defined but not used [-Wunused-function]
 __Pyx_PyAsyncGen_Fini(void)
 ^~~~~~~~~~~~~~~~~~~~~
Target //google/monitoring/v3:monitoring-v3-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 4.248s, Critical Path: 4.02s
INFO: 8 processes: 8 linux-sandbox.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 790, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/root/.cache/synthtool/python-monitoring/synth.py", line 32, in <module>
    v3_library = gapic.py_library(
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 45, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 182, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 27, in run
    return subprocess.run(
  File "/usr/local/lib/python3.9/subprocess.py", line 524, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2020-12-05 03:08:23,957 autosynth [ERROR] > Synthesis failed
2020-12-05 03:08:23,957 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 2976654 chore(deps): update ubuntu docker tag to v20.10 (#52)
2020-12-05 03:08:23,963 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2020-12-05 03:08:23,968 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/usr/local/lib/python3.9/subprocess.py", line 456, in check_returncode
    raise CalledProcessError(self.returncode, self.args, self.stdout,
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

Monitoring: Provide ability to filter by user labels.

Is your feature request related to a problem? Please describe.
It would be very useful to have the ability to filter based on user labels, e.g. a label that we have added to a GKE container that ends up in the user_labels section in Stackdriver. This kind of filtering is possible via the UI/API AFAIK, using the metadata.user_labels field.

Describe the solution you'd like
Expose this field in the Python SDK.

Describe alternatives you've considered
Using the API directly, but it's always better to have this functionality in the SDK.
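For what it's worth, the `filter` argument of `list_time_series` is a plain string, so user labels can already be matched with the `metadata.user_labels` path from the Monitoring filter syntax. The sketch below only builds such a string; the metric type and label values are made up for illustration:

```python
def build_filter(metric_type, user_labels):
    """Build a Monitoring filter string that also matches user labels.

    metadata.user_labels."key"="value" is the filter path for labels
    attached to monitored resources (e.g. GKE containers).
    """
    parts = ['metric.type = "{}"'.format(metric_type)]
    for key, value in sorted(user_labels.items()):
        parts.append('metadata.user_labels."{}" = "{}"'.format(key, value))
    return " AND ".join(parts)


# Hypothetical metric and label; the resulting string can be passed as the
# `filter` argument of MetricServiceClient.list_time_series.
flt = build_filter(
    "kubernetes.io/container/cpu/core_usage_time",
    {"team": "payments"},
)
```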

Error when using paging to read timeseries

I'm getting the following error when using list(timeseries_response) to iterate over the pages in the response. The error happens only on big time-series responses containing lots of data.

The code I'm using is:

good_ts = client.list_time_series(parent, filter, measurement_window, monitoring_v3.enums.ListTimeSeriesRequest.TimeSeriesView.FULL, aggregation)
good_ts = list(good_ts)

Traceback:

  File "/usr/local/lib/python3.7/site-packages/slo_generator/backends/stackdriver.py", line 68, in good_bad_ratio
    good_ts = list(good_ts)
  File "/usr/local/lib/python3.7/site-packages/google/api_core/page_iterator.py", line 212, in _items_iter
    for page in self._page_iter(increment=False):
  File "/usr/local/lib/python3.7/site-packages/google/api_core/page_iterator.py", line 243, in _page_iter
    page = self._next_page()
  File "/usr/local/lib/python3.7/site-packages/google/api_core/page_iterator.py", line 534, in _next_page
    response = self._method(self._request)
  File "/usr/local/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 206, in retry_target
    last_exc,
  File "<string>", line 3, in raise_from
google.api_core.exceptions.RetryError: Deadline of 600.0s exceeded while calling functools.partial(<function _wrap_unary_errors.<locals>.error_remapped_callable at 0x106da29e0>, filter: "project=\"<ANONYMIZED>\" AND resource.type=\"generic_task\" AND metric.name=\"logging.googleapis.com/user/job_execution_completed\""
interval {
  start_time {
    seconds: 1586785806
    nanos: 266461849
  }
  end_time {
    seconds: 1587390606
    nanos: 266461849
  }
}
aggregation {
  alignment_period {
    seconds: 604800
  }
  per_series_aligner: ALIGN_SUM
  cross_series_reducer: REDUCE_SUM
}
name: "projects/<ANONYMIZED>"
, metadata=[('x-goog-request-params', 'name=projects/<ANONYMIZED>'), ('x-goog-api-client', 'gl-python/3.7.5 grpc/1.27.2 gax/1.16.0 gapic/0.34.0')]), last exception: 504 Deadline Exceeded
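One workaround for blowing the 600 s retry deadline on very large responses is to split the query interval into smaller windows and issue one `list_time_series` call per window. The chunking itself is plain arithmetic; this is a sketch, and the one-day window size is an assumption to tune for your data volume:

```python
def split_interval(start, end, max_window):
    """Yield (window_start, window_end) pairs covering [start, end].

    All values are epoch seconds; each window is at most `max_window`
    seconds long, so each request returns a bounded amount of data.
    """
    cur = start
    while cur < end:
        nxt = min(cur + max_window, end)
        yield cur, nxt
        cur = nxt


# The 7-day interval from the failing request above, split into 1-day windows;
# each (start, end) pair would become its own TimeInterval in a request.
windows = list(split_interval(1586785806, 1587390606, 86400))
```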

samples.snippets.v3.cloud-client.quickstart_test: test_quickstart failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: c903285
buildURL: Build Status, Sponge
status: failed

Test output
args = (time_series {
  metric {
    type: "custom.googleapis.com/my_metric"
  }
  resource {
    type: "gce_instance"
    la...34416
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py36"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py36'), ('x-goog-api-client', 'gl-python/3.6.13 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-6/lib/python3.6/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f1bcf72cef0>
request = time_series {
  metric {
    type: "custom.googleapis.com/my_metric"
  }
  resource {
    type: "gce_instance"
    lab...3734416
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py36"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py36'), ('x-goog-api-client', 'gl-python/3.6.13 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-6/lib/python3.6/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7f1bcf73a128>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f1bcf725588>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626167412.739682685","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >

.nox/py-3-6/lib/python3.6/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

capsys = <_pytest.capture.CaptureFixture object at 0x7f1bcf761198>

def test_quickstart(capsys):
    @backoff.on_exception(backoff.expo, AssertionError, max_time=60)
    def eventually_consistent_test():
        quickstart.run_quickstart(PROJECT)
        out, _ = capsys.readouterr()
        assert "wrote" in out
    eventually_consistent_test()

quickstart_test.py:32:


.nox/py-3-6/lib/python3.6/site-packages/backoff/_sync.py:94: in retry
    ret = target(*args, **kwargs)
quickstart_test.py:28: in eventually_consistent_test
    quickstart.run_quickstart(PROJECT)
quickstart.py:39: in run_quickstart
    client.create_time_series(request={"name": project_name, "time_series": [series]})
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1105: in create_time_series
    request, retry=retry, timeout=timeout, metadata=metadata,
.nox/py-3-6/lib/python3.6/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
    return wrapped_func(*args, **kwargs)
.nox/py-3-6/lib/python3.6/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Request had invalid a...entication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"

???
E google.api_core.exceptions.Unauthenticated: 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

:3: Unauthenticated
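The UNAUTHENTICATED failure above usually means the test environment lacks valid Application Default Credentials. A small pre-flight check can make that obvious before the quickstart runs; this is a sketch that only reports, it doesn't fix anything:

```python
def check_adc():
    """Report whether Application Default Credentials can be loaded."""
    try:
        import google.auth
        import google.auth.exceptions
    except ImportError:
        return "google-auth is not installed"
    try:
        credentials, project = google.auth.default()
    except google.auth.exceptions.DefaultCredentialsError as exc:
        return "no application default credentials: {}".format(exc)
    return "ok (project={})".format(project)


print(check_adc())
```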

Unit test alert policies

Is your feature request related to a problem? Please describe.

Feature request: support unit tests on alert policies without a real GCP project_id or actual time series in Google Cloud Monitoring.

Describe the solution you'd like

I'd like to have a way to unit test my alert policies.

  1. I have alert policies stored in my repository as JSON that can be loaded into memory as an AlertPolicy object.
  2. I'd like to have unit tests to make sure the conditions are expressed correctly.
  3. The idea is to create a fake time series, apply the alert conditions, then verify whether the alert would fire (or not) as expected.
  4. Create different test cases with different fake time series to cover all the logic in the alert conditions.

Describe alternatives you've considered

I searched the code here but didn't find anything related to evaluating alert conditions against a time series. It seems hard to create unit tests without an MQL evaluation function.

Additional context

I just wanted to unit test alert policies before sending them to Google Cloud Monitoring. I don't want to wake up oncalls at 3am because of buggy alert conditions.
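There is no public MQL evaluator in this library, so true offline evaluation isn't available. For simple metric-threshold conditions, though, the firing logic can be approximated in a few lines and unit tested against fake data. This is a toy sketch: the function name, condition shape, and parameters are simplified inventions, not the real `AlertPolicy` schema:

```python
def threshold_fires(points, comparison, threshold, duration_points):
    """Return True if `duration_points` consecutive samples violate the
    threshold, roughly mirroring a metric-threshold condition.

    `points` is a list of numeric samples, oldest first.
    `comparison` is one of ">" or "<".
    """
    ops = {">": lambda v: v > threshold, "<": lambda v: v < threshold}
    violate = ops[comparison]
    streak = 0
    for value in points:
        streak = streak + 1 if violate(value) else 0
        if streak >= duration_points:
            return True
    return False


# Fake time series: CPU jumps above 0.9 for three consecutive samples,
# so a ">" 0.9 condition with a 3-sample duration should fire.
assert threshold_fires([0.2, 0.95, 0.97, 0.99, 0.3], ">", 0.9, 3)
assert not threshold_fires([0.2, 0.95, 0.3, 0.97, 0.3], ">", 0.9, 3)
```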

Synthesis failed for python-monitoring

Hello! Autosynth couldn't regenerate python-monitoring. 💔

Here's the output from running synth.py:

rg/packages/30/9e/f663a2aa66a09d838042ae1a2c5659828bb9b41ea3a6efa20a20fd92b121/Jinja2-2.11.2-py2.py3-none-any.whl
  Saved ./Jinja2-2.11.2-py2.py3-none-any.whl
Collecting MarkupSafe==1.1.1 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 5))
  Using cached https://files.pythonhosted.org/packages/b2/5f/23e0023be6bb885d00ffbefad2942bc51a620328ee910f64abe5a8d18dd1/MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
  Saved ./MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting protobuf==3.13.0 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/30/79/510974552cebff2ba04038544799450defe75e96ea5f1675dbf72cc8744f/protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
  Saved ./protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting pypandoc==1.5 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 7))
  Using cached https://files.pythonhosted.org/packages/d6/b7/5050dc1769c8a93d3ec7c4bd55be161991c94b8b235f88bf7c764449e708/pypandoc-1.5.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmpfs/tmp/tmpurm5icul/setuptools-tmp/setuptools/__init__.py", line 6, in <module>
        import distutils.core
      File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/_distutils_hack/__init__.py", line 82, in create_module
        return importlib.import_module('._distutils', 'setuptools')
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/importlib/__init__.py", line 126, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
    ModuleNotFoundError: No module named 'setuptools._distutils'
    
    ----------------------------------------
 (  Cache entry deserialization failed, entry ignored
Command "python setup.py egg_info" failed with error code 1 in /tmpfs/tmp/pip-build-udrtb0av/pypandoc/
)
ERROR: no such package '@gapic_generator_python_pip_deps//': pip_import failed: Collecting click==7.1.2 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/d2/3d/fa76db83bf75c4f8d338c2fd15c8d33fdd7ad23a9b5e57eb6c5de26b430e/click-7.1.2-py2.py3-none-any.whl
  Saved ./click-7.1.2-py2.py3-none-any.whl
Collecting google-api-core==1.22.1 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 2))
  Using cached https://files.pythonhosted.org/packages/e0/2d/7c6c75013105e1d2b6eaa1bf18a56995be1dbc673c38885aea31136e9918/google_api_core-1.22.1-py2.py3-none-any.whl
  Saved ./google_api_core-1.22.1-py2.py3-none-any.whl
Collecting googleapis-common-protos==1.52.0 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 3))
  Using cached https://files.pythonhosted.org/packages/03/74/3956721ea1eb4bcf7502a311fdaa60b85bd751de4e57d1943afe9b334141/googleapis_common_protos-1.52.0-py2.py3-none-any.whl
  Saved ./googleapis_common_protos-1.52.0-py2.py3-none-any.whl
Collecting jinja2==2.11.2 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/30/9e/f663a2aa66a09d838042ae1a2c5659828bb9b41ea3a6efa20a20fd92b121/Jinja2-2.11.2-py2.py3-none-any.whl
  Saved ./Jinja2-2.11.2-py2.py3-none-any.whl
Collecting MarkupSafe==1.1.1 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 5))
  Using cached https://files.pythonhosted.org/packages/b2/5f/23e0023be6bb885d00ffbefad2942bc51a620328ee910f64abe5a8d18dd1/MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
  Saved ./MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting protobuf==3.13.0 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/30/79/510974552cebff2ba04038544799450defe75e96ea5f1675dbf72cc8744f/protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
  Saved ./protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting pypandoc==1.5 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 7))
  Using cached https://files.pythonhosted.org/packages/d6/b7/5050dc1769c8a93d3ec7c4bd55be161991c94b8b235f88bf7c764449e708/pypandoc-1.5.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmpfs/tmp/tmpurm5icul/setuptools-tmp/setuptools/__init__.py", line 6, in <module>
        import distutils.core
      File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/_distutils_hack/__init__.py", line 82, in create_module
        return importlib.import_module('._distutils', 'setuptools')
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/importlib/__init__.py", line 126, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
    ModuleNotFoundError: No module named 'setuptools._distutils'
    
    ----------------------------------------
 (  Cache entry deserialization failed, entry ignored
Command "python setup.py egg_info" failed with error code 1 in /tmpfs/tmp/pip-build-udrtb0av/pypandoc/
)
INFO: Elapsed time: 2.689s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
FAILED: Build did NOT complete successfully (0 packages loaded)

Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-monitoring/synth.py", line 35, in <module>
    include_protos=True,
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 46, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 183, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2020-08-31 05:21:05,341 autosynth [ERROR] > Synthesis failed
2020-08-31 05:21:05,341 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 6ea7842 chore: release 1.1.0 (#48)
2020-08-31 05:21:05,346 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2020-08-31 05:21:05,351 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Removing google/__pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 690, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 539, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 670, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 375, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 273, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

Import error

While using the function to monitor CPU utilization via Cloud Functions, we got the error:
cannot import name 'monitoring_v3' from 'google.cloud' (unknown location)

samples.snippets.v3.cloud-client.snippets_test: test_list_time_series failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: 2348671
buildURL: Build Status, Sponge
status: failed

Test output
args = (time_series {
  metric {
    type: "custom.googleapis.com/my_metricf7fa7079-2c6a-469a-83a0-d719bd2003e0"
  }
  resour...62823
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py38"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.0 gax/1.28.0 gapic/2.2.1')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f8c62158190>
request = time_series {
metric {
type: "custom.googleapis.com/my_metricf7fa7079-2c6a-469a-83a0-d719bd2003e0"
}
resourc...9862823
}
}
value {
double_value: 3.14
}
}
}
name: "projects/python-docs-samples-tests-py38"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py38'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.0 gax/1.28.0 gapic/2.2.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7f8c62145d30>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f8c62183800>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1621681575.665381091","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture(scope="module")
def write_time_series():
    @backoff.on_exception(backoff.expo, InternalServerError, max_time=120)
    def write():
        snippets.write_time_series(PROJECT_ID)
  write()

snippets_test.py:50:


.nox/py-3-8/lib/python3.8/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:78: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1125: in create_time_series
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"

???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]

:3: InternalServerError
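The fixture above wraps the write in `backoff.on_exception(backoff.expo, InternalServerError, max_time=120)`. The same retry-with-exponential-backoff pattern can be sketched in plain Python; the toy decorator below is illustrative, not the `backoff` library, and `flaky_write` is a stand-in for `snippets.write_time_series`:

```python
import time

def retry_with_backoff(max_tries=4, base_delay=0.01, exceptions=(Exception,)):
    """Minimal stand-in for backoff.on_exception(backoff.expo, ...)."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            for attempt in range(max_tries):
                try:
                    return func(*args, **kwargs)
                except exceptions:
                    if attempt == max_tries - 1:
                        raise  # out of retries: surface the last error
                    # Exponential delay: base, 2*base, 4*base, ...
                    time.sleep(base_delay * (2 ** attempt))
        return wrapper
    return decorator

calls = {"n": 0}

@retry_with_backoff(max_tries=5, exceptions=(RuntimeError,))
def flaky_write():
    """Fails twice, then succeeds -- mimics a transient 500 from the API."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("500 Internal error encountered")
    return "ok"

print(flaky_write())  # succeeds on the third attempt
```

The real `backoff.expo` also adds jitter and a `max_time` cutoff; this sketch only shows the core retry loop.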

Expose MetricServiceAsyncClient

Is your feature request related to a problem? Please describe.

The currently exposed API has only MetricServiceClient for writing metrics. The problem is that this client is synchronous, and each call takes roughly 40 ms to complete, which can be critical where performance matters. To solve this, it's necessary to create some sort of wrapper (e.g. similar to what the logging library does).

Describe the solution you'd like

By the looks of it, there is already a code-generated MetricServiceAsyncClient (see https://github.com/googleapis/python-monitoring/blob/master/google/cloud/monitoring_v3/services/metric_service/async_client.py). Does it work as expected? Can it also be exposed?

It would be good if this library came with an async option out of the box.

Describe alternatives you've considered

Creating a self-made wrapper to push the metrics in a background thread is definitely an option.
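As a sketch of that alternative, blocking writes can be off-loaded to a thread pool from asyncio so callers don't pay the ~40 ms synchronously. `write_metric_blocking` below is a dummy stand-in for a synchronous `MetricServiceClient.create_time_series` call, not the real client:

```python
import asyncio
import time

def write_metric_blocking(value):
    """Stand-in for a synchronous metric-write RPC."""
    time.sleep(0.01)  # simulate the network round trip
    return f"wrote {value}"

async def write_metric(value):
    # Off-load the blocking call to the default thread pool so the
    # event loop is not stalled for the duration of the RPC.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, write_metric_blocking, value)

async def main():
    # Several writes now overlap instead of running back to back;
    # gather preserves argument order in the result list.
    return await asyncio.gather(*(write_metric(i) for i in range(3)))

print(asyncio.run(main()))  # prints ['wrote 0', 'wrote 1', 'wrote 2']
```

A first-class async client would avoid the thread pool entirely, but this pattern works with any synchronous client today.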

Failure to import the module (No module named 'google.cloud.monitoring_v3.proto.metric_pb2')

Describe your environment

(opentelemetry-bug) ant@ferrum opentelemetry % python
Python 3.8.3 (default, May 19 2020, 13:54:14)
[Clang 10.0.0 ] :: Anaconda, Inc. on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import platform
>>> platform.architecture()
('64bit', '')
>>> platform.mac_ver()
('10.15.7', ('', '', ''), 'x86_64')

Steps to reproduce

mkdir opentelemetry-bug && cd opentelemetry-bug && python3 -m venv opentelemetry-bug
source opentelemetry-bug/bin/activate
pip install --quiet opentelemetry-exporter-cloud-monitoring
echo "from opentelemetry.exporter.cloud_monitoring import CloudMonitoringMetricsExporter" > bug.py
python bug.py

Observe the error:

Traceback (most recent call last):
  File "bug.py", line 1, in <module>
    from opentelemetry.exporter.cloud_monitoring import CloudMonitoringMetricsExporter
  File "/Users/ant/t/opentelemetry-bug/opentelemetry-bug/lib/python3.8/site-packages/opentelemetry/exporter/cloud_monitoring/__init__.py", line 9, in <module>
    from google.cloud.monitoring_v3.proto.metric_pb2 import TimeSeries
ModuleNotFoundError: No module named 'google.cloud.monitoring_v3.proto.metric_pb2'

What is the expected behavior?
The import should not fail.

What is the actual behavior?
An exception is raised.

Additional context
My guess is that the issue is due to the google-cloud-monitoring 2.0.0 release on 2020-10-06 (https://pypi.org/project/google-cloud-monitoring/). I tried pinning the version to <2 in the virtual environment, and that helped:

pip install --quiet "google-cloud-monitoring<2"
python bug.py

This worked without any error.
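A more defensive variant of the pin is an import shim that tries the 2.x layout first and falls back to the 1.x path. The module paths come from the traceback above; the snippet only reports which layout, if any, is installed, so it runs even without the package:

```python
# Probe which google-cloud-monitoring import layout is available.
# Paths are taken from the traceback above; nothing here calls the API.
try:
    from google.cloud.monitoring_v3 import TimeSeries  # 2.x layout
    layout = "2.x"
except ImportError:
    try:
        # 1.x layout, removed in the 2.0.0 release
        from google.cloud.monitoring_v3.proto.metric_pb2 import TimeSeries
        layout = "1.x"
    except ImportError:
        layout = "not installed"

print(layout)
```

An exporter written this way would keep working on both sides of the 2.0 break instead of requiring a version pin.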

samples.snippets.v3.alerts-client.snippets_test: test_enable_alert_policies failed



commit: e6063de
buildURL: Build Status, Sponge
status: failed

Test output
capsys = <_pytest.capture.CaptureFixture object at 0x7fa0c7aa5810>
pochan = 
@pytest.mark.flaky(rerun_filter=delay_on_aborted, max_runs=5)
def test_enable_alert_policies(capsys, pochan):
    # These sleep calls are for mitigating the following error:
    # "409 Too many concurrent edits to the project configuration.
    # Please try again."
    # Having multiple projects will void these `sleep()` calls.
    # See also #3310
    time.sleep(2)
    snippets.enable_alert_policies(pochan.project_name, True)
    out, _ = capsys.readouterr()
    assert (
        "Enabled {0}".format(pochan.project_name) in out
        or "{} is already enabled".format(pochan.alert_policy.name) in out
    )

    time.sleep(2)
    snippets.enable_alert_policies(pochan.project_name, False)
    out, _ = capsys.readouterr()
  assert (
        "Disabled {}".format(pochan.project_name) in out
        or "{} is already disabled".format(pochan.alert_policy.name) in out
    )

E assert ('Disabled projects/python-docs-samples-tests-py37' in 'Policy projects/python-docs-samples-tests-py37/alertPolicies/14818226130155215502 is already disabled\nPolicy projects/python-docs-samples-tests-py37/alertPolicies/2404774893716709115 is already disabled\n' or 'projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319 is already disabled' in 'Policy projects/python-docs-samples-tests-py37/alertPolicies/14818226130155215502 is already disabled\nPolicy projects/python-docs-samples-tests-py37/alertPolicies/2404774893716709115 is already disabled\n')
E + where 'Disabled projects/python-docs-samples-tests-py37' = <built-in method format of str object at 0x7fa0c9cebcf0>('projects/python-docs-samples-tests-py37')
E + where <built-in method format of str object at 0x7fa0c9cebcf0> = 'Disabled {}'.format
E + and 'projects/python-docs-samples-tests-py37' = <snippets_test.PochanFixture object at 0x7fa0c7bb5350>.project_name
E + and 'projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319 is already disabled' = <built-in method format of str object at 0x7fa0c9c00120>('projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319')
E + where <built-in method format of str object at 0x7fa0c9c00120> = '{} is already disabled'.format
E + and 'projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319' = name: "projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319"\ndisplay_name: "snippets-test-ugrikrch..."projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319/conditions/486425858163223454"\n}\nenabled {\n}\n.name
E + where name: "projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319"\ndisplay_name: "snippets-test-ugrikrch..."projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319/conditions/486425858163223454"\n}\nenabled {\n}\n = <snippets_test.PochanFixture object at 0x7fa0c7bb5350>.alert_policy

snippets_test.py:149: AssertionError

samples.snippets.v3.uptime-check-client.snippets_test: test_update_uptime_config failed



commit: c903285
buildURL: Build Status, Sponge
status: failed

Test output
args = (parent: "projects/python-docs-samples-tests"
uptime_check_config {
  display_name: "zvleqikmpb"
  monitored_resource ...path: "/"
    port: 80
    request_method: GET
  }
  period {
    seconds: 300
  }
  timeout {
    seconds: 10
  }
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/python-docs-samples-tests'), ('x-goog-api-client', 'gl-python/3.7.10 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7fab9ffb24d0>
request = parent: "projects/python-docs-samples-tests"
uptime_check_config {
display_name: "zvleqikmpb"
monitored_resource {... path: "/"
port: 80
request_method: GET
}
period {
seconds: 300
}
timeout {
seconds: 10
}
}

timeout = None
metadata = [('x-goog-request-params', 'parent=projects/python-docs-samples-tests'), ('x-goog-api-client', 'gl-python/3.7.10 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7fab9ffb2a50>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fab9ffe49b0>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626170630.632382368","description":"Error received from peer ipv4:74.125.197.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

capsys = <_pytest.capture.CaptureFixture object at 0x7fab9ffa3dd0>

def test_update_uptime_config(capsys):
    # create and delete happen in uptime fixture.
    new_display_name = random_name(10)
    new_uptime_check_path = "/" + random_name(10)
  with UptimeFixture() as fixture:

snippets_test.py:72:


snippets_test.py:42: in __enter__
self.project_name, display_name=random_name(10)
snippets.py:43: in create_uptime_check_config_get
new_config = client.create_uptime_check_config(request={"parent": project_name, "uptime_check_config": config})
../../../../google/cloud/monitoring_v3/services/uptime_check_service/client.py:608: in create_uptime_check_config
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Request had invalid a...entication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"

???
E google.api_core.exceptions.Unauthenticated: 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

:3: Unauthenticated

KeyError: 'WhichOneof' while converting time series data to dataframe

Using the latest (2.0.1) version of the API and I just can't make this simple example work:

query = google.cloud.monitoring_v3.query.Query(
    monitoring_v3.services.metric_service.client.MetricServiceClient(),
    project_name,
    metric_type='storage.googleapis.com/authz/acl_operations_count',
    end_time=None,
    days=7,
)

query.as_dataframe()

I'm getting the following error:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
~/miniconda3/lib/python3.8/site-packages/proto/message.py in __getattr__(self, key)
    559         try:
--> 560             pb_type = self._meta.fields[key].pb_type
    561             pb_value = getattr(self._pb, key)

KeyError: 'WhichOneof'

During handling of the above exception, another exception occurred:

AttributeError                            Traceback (most recent call last)
<ipython-input-19-a95d81571ff4> in <module>
      8 print(type(query))
      9 
---> 10 query.as_dataframe()

~/miniconda3/lib/python3.8/site-packages/google/cloud/monitoring_v3/query.py in as_dataframe(self, label, labels)
    533         :returns: A dataframe where each column represents one time series.
    534         """
--> 535         return _dataframe._build_dataframe(self, label, labels)
    536 
    537     def __deepcopy__(self, memo):

~/miniconda3/lib/python3.8/site-packages/google/cloud/monitoring_v3/_dataframe.py in _build_dataframe(time_series_iterable, label, labels)
     94     for time_series in time_series_iterable:
     95         pandas_series = pandas.Series(
---> 96             data=[_extract_value(point.value) for point in time_series.points],
     97             index=[
     98                 point.interval.end_time.ToNanoseconds() for point in time_series.points

~/miniconda3/lib/python3.8/site-packages/google/cloud/monitoring_v3/_dataframe.py in <listcomp>(.0)
     94     for time_series in time_series_iterable:
     95         pandas_series = pandas.Series(
---> 96             data=[_extract_value(point.value) for point in time_series.points],
     97             index=[
     98                 point.interval.end_time.ToNanoseconds() for point in time_series.points

~/miniconda3/lib/python3.8/site-packages/google/cloud/monitoring_v3/_dataframe.py in _extract_value(typed_value)
     47 def _extract_value(typed_value):
     48     """Extract the value from a TypedValue."""
---> 49     value_type = typed_value.WhichOneof("value")
     50     return typed_value.__getattribute__(value_type)
     51 

~/miniconda3/lib/python3.8/site-packages/proto/message.py in __getattr__(self, key)
    563             return marshal.to_python(pb_type, pb_value, absent=key not in self)
    564         except KeyError as ex:
--> 565             raise AttributeError(str(ex))
    566 
    567     def __ne__(self, other):

AttributeError: 'WhichOneof'

Is it me, or should this work?
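The failure mode in the traceback can be reproduced without the library: the wrapper's `__getattr__` only resolves message *fields*, so protobuf methods such as `WhichOneof` fall through to a `KeyError` that is re-raised as `AttributeError`. The class below is a toy stand-in for a proto-plus message wrapper, not the real implementation:

```python
class WrappedMessage:
    """Toy stand-in for a proto-plus message wrapper (illustrative only)."""
    _fields = {"double_value": 3.14}

    def __getattr__(self, key):
        # Only known field names are forwarded, mirroring the
        # proto/message.py __getattr__ shown in the traceback above.
        try:
            return self._fields[key]
        except KeyError as ex:
            raise AttributeError(str(ex))

typed_value = WrappedMessage()
print(typed_value.double_value)       # field access works: 3.14
try:
    typed_value.WhichOneof("value")   # method lookup fails as in the report
except AttributeError as ex:
    print("AttributeError:", ex)
```

This suggests the library's `_extract_value` helper was written against the raw protobuf API and broke when 2.x started returning wrapped messages.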

MQL query exception

Trying to run a simple query using the SDK after MQL support was added:

def test_mql_expression(mql_query):
    client = QueryServiceClient()
    project = f'projects/{PROJECT_ID}'
    request = QueryTimeSeriesRequest(name=project, query=str(mql_query))
    result = client.query_time_series(request)
    print(result)
    return result
test_mql_expression('fetch global::custom.googleapis.com/metric_test')

Getting the following exception:

Traceback (most recent call last):
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 73, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/grpc/_channel.py", line 946, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/grpc/_channel.py", line 849, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
        status = StatusCode.INVALID_ARGUMENT
        details = "One or more errors parsing the query."
        debug_error_string = "{"created":"@1618853686.419251000","description":"Error received from peer ipv6:[2a00:1450:4007:807::200a]:443","file":"src/core/lib/surface/call.cc","file_line":1068,"grpc_message":"One or more errors parsing the query.","grpc_status":3}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/ocervello/.virtualenvs/sabre/bin/promql-convert", line 33, in <module>
    sys.exit(load_entry_point('promql-to-mql', 'console_scripts', 'promql-convert')())
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/click-8.0.0rc1-py3.8.egg/click/core.py", line 1041, in __call__
    return self.main(*args, **kwargs)
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/click-8.0.0rc1-py3.8.egg/click/core.py", line 971, in main
    rv = self.invoke(ctx)
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/click-8.0.0rc1-py3.8.egg/click/core.py", line 1309, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/click-8.0.0rc1-py3.8.egg/click/core.py", line 715, in invoke
    return callback(*args, **kwargs)
  File "/Users/ocervello/Workspace/dev/promgrafana-cloudops/promql_to_mql/cli.py", line 164, in convert
    result = test_mql_expression(expression)
  File "/Users/ocervello/Workspace/dev/promgrafana-cloudops/promql_to_mql/test.py", line 51, in test_mql_expression
    result = client.query_time_series(request)
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/google/cloud/monitoring_v3/services/query_service/client.py", line 381, in query_time_series
    response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 75, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.InvalidArgument: 400 One or more errors parsing the query.

Note that this query works in the Metrics Explorer query editor, so I'm not sure what's going on here.

module error for 'TimeSeries'

from google.cloud import monitoring_v3

series = monitoring_v3.TimeSeries()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'TimeSeries'

Getting this error while trying to create time series data.
Please help with this.

samples.snippets.v3.alerts-client.snippets_test: test_enable_alert_policies failed

Note: #150 was also for this test, but it was closed more than 10 days ago, so I didn't mark it as flaky.


commit: c903285
buildURL: Build Status, Sponge
status: failed

Test output
args = (update_mask {
  paths: "enabled"
}
alert_policy {
  name: "projects/python-docs-samples-tests-py37/alertPolicies/4045...thon-docs-samples-tests-py37/alertPolicies/4045340882692684728/conditions/4045340882692681945"
  }
  enabled {
  }
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'alert_policy.name=projects/python-docs-samples-tests-py37/alertPolicies/4045340882692684728'), ('x-goog-api-client', 'gl-python/3.7.10 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f580c501490>
request = update_mask {
paths: "enabled"
}
alert_policy {
name: "projects/python-docs-samples-tests-py37/alertPolicies/40453...python-docs-samples-tests-py37/alertPolicies/4045340882692684728/conditions/4045340882692681945"
}
enabled {
}
}

timeout = None
metadata = [('x-goog-request-params', 'alert_policy.name=projects/python-docs-samples-tests-py37/alertPolicies/4045340882692684728'), ('x-goog-api-client', 'gl-python/3.7.10 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7f580c501090>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f580c4968c0>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626170581.487223623","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

capsys = <_pytest.capture.CaptureFixture object at 0x7f580c4e8450>
pochan = <snippets_test.PochanFixture object at 0x7f580de09e10>

@pytest.mark.flaky(rerun_filter=delay_on_aborted, max_runs=5)
def test_enable_alert_policies(capsys, pochan):
    # These sleep calls are for mitigating the following error:
    # "409 Too many concurrent edits to the project configuration.
    # Please try again."
    # Having multiple projects will void these `sleep()` calls.
    # See also #3310
    time.sleep(2)
    snippets.enable_alert_policies(pochan.project_name, True, "name='{}'".format(pochan.alert_policy.name))
    out, _ = capsys.readouterr()
    assert (
        "Enabled {0}".format(pochan.project_name) in out
        or "{} is already enabled".format(pochan.alert_policy.name) in out
    )

    time.sleep(2)
  snippets.enable_alert_policies(pochan.project_name, False, "name='{}'".format(pochan.alert_policy.name))

snippets_test.py:168:


snippets.py:89: in enable_alert_policies
client.update_alert_policy(alert_policy=policy, update_mask=mask)
../../../../google/cloud/monitoring_v3/services/alert_policy_service/client.py:832: in update_alert_policy
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Request had invalid a...entication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"

???
E google.api_core.exceptions.Unauthenticated: 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

:3: Unauthenticated

Synthesis failed for python-monitoring

Hello! Autosynth couldn't regenerate python-monitoring. 💔

Please investigate and fix this issue within 5 business days. While it remains broken,
this library cannot be updated with changes to the python-monitoring API, and the library grows
stale.

See https://github.com/googleapis/synthtool/blob/master/autosynth/TroubleShooting.md
for troubleshooting tips.

Here's the output from running synth.py:

l/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:77:1
DEBUG: Rule 'com_google_protoc_java_resource_names_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "4b714b35ee04ba90f560ee60e64c7357428efcb6b0f3a298f343f8ec2c6d4a5d"
DEBUG: Call stack for the definition of repository 'com_google_protoc_java_resource_names_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:234:1
DEBUG: Rule 'protoc_docs_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "33b387245455775e0de45869c7355cc5a9e98b396a6fc43b02812a63b75fee20"
DEBUG: Call stack for the definition of repository 'protoc_docs_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:258:1
DEBUG: Rule 'rules_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "48f7e716f4098b85296ad93f5a133baf712968c13fbc2fdf3a6136158fe86eac"
DEBUG: Call stack for the definition of repository 'rules_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:42:1
DEBUG: Rule 'gapic_generator_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "fe995def6873fcbdc2a8764ef4bce96eb971a9d1950fe9db9be442f3c64fb3b6"
DEBUG: Call stack for the definition of repository 'gapic_generator_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:278:1
DEBUG: Rule 'com_googleapis_gapic_generator_go' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "c0d0efba86429cee5e52baf838165b0ed7cafae1748d025abec109d25e006628"
DEBUG: Call stack for the definition of repository 'com_googleapis_gapic_generator_go' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:300:1
DEBUG: Rule 'gapic_generator_php' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "3dffc5c34a5f35666843df04b42d6ce1c545b992f9c093a777ec40833b548d86"
DEBUG: Call stack for the definition of repository 'gapic_generator_php' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:364:1
DEBUG: Rule 'gapic_generator_csharp' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "4db430cfb9293e4521ec8e8138f8095faf035d8e752cf332d227710d749939eb"
DEBUG: Call stack for the definition of repository 'gapic_generator_csharp' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:386:1
DEBUG: Rule 'gapic_generator_ruby' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "a14ec475388542f2ea70d16d75579065758acc4b99fdd6d59463d54e1a9e4499"
DEBUG: Call stack for the definition of repository 'gapic_generator_ruby' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:400:1
DEBUG: /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/rules_python/python/pip.bzl:61:5: DEPRECATED: the pip_repositories rule has been replaced with pip_install, please see rules_python 0.1 release notes
DEBUG: Rule 'bazel_skylib' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "1dde365491125a3db70731e25658dfdd3bc5dbdfd11b840b3e987ecf043c7ca0"
DEBUG: Call stack for the definition of repository 'bazel_skylib' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:35:1
Analyzing: target //google/monitoring/v3:monitoring-v3-py (1 packages loaded, 0 targets configured)
ERROR: /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/upb/bazel/upb_proto_library.bzl:257:29: aspect() got unexpected keyword argument 'incompatible_use_toolchain_transition'
ERROR: Analysis of target '//google/monitoring/v3:monitoring-v3-py' failed; build aborted: error loading package '@com_github_grpc_grpc//': in /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/com_github_grpc_grpc/bazel/grpc_build_system.bzl: Extension file 'bazel/upb_proto_library.bzl' has errors
INFO: Elapsed time: 0.368s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (5 packages loaded, 4 targets configured)
FAILED: Build did NOT complete successfully (5 packages loaded, 4 targets configured)

Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-monitoring/synth.py", line 38, in <module>
    proto_output_path="google/cloud/monitoring_v3/proto"
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
    return self._generate_code(service, version, "python", False, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 204, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2021-04-27 04:22:37,719 autosynth [ERROR] > Synthesis failed
2021-04-27 04:22:37,720 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at af95e7f chore(revert): revert preventing normalization (#126)
2021-04-27 04:22:37,725 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-04-27 04:22:37,729 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 356, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 191, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 336, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 68, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

samples.snippets.v3.cloud-client.snippets_test: test_list_time_series failed

This test failed!

To configure my behavior, see the Build Cop Bot documentation.

If I'm commenting on this issue too often, add the buildcop: quiet label and
I will stop commenting.


commit: 2976654
buildURL: Build Status, Sponge
status: failed

Test output
args = (time_series {
  metric {
    type: "custom.googleapis.com/my_metricc24f5eed-85fd-4ca9-a085-5d82aeaca9af"
  }
  resour...91041
      }
    }
    value {
      double_value: 3.14
    }
  }
}
name: "projects/python-docs-samples-tests-py37"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.34.0 gax/1.23.0 gapic/2.0.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7fa088041a10>
request = time_series {
metric {
type: "custom.googleapis.com/my_metricc24f5eed-85fd-4ca9-a085-5d82aeaca9af"
}
resourc...8591041
}
}
value {
double_value: 3.14
}
}
}
name: "projects/python-docs-samples-tests-py37"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests-py37'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.34.0 gax/1.23.0 gapic/2.0.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:


state = <grpc._channel._RPCState object at 0x7fa08a2c0f10>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fa07bba8f00>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1607079964.947410511","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1062,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >

.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture(scope="module")
def write_time_series():
    @backoff.on_exception(backoff.expo, InternalServerError, max_time=120)
    def write():
        snippets.write_time_series(PROJECT_ID)
  write()

snippets_test.py:50:


.nox/py-3-7/lib/python3.7/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:70: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1094: in create_time_series
request, retry=retry, timeout=timeout, metadata=metadata,
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"

???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]

:3: InternalServerError
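The fixture in the traceback already guards against transient INTERNAL errors by retrying write_time_series with backoff.on_exception(backoff.expo, InternalServerError, max_time=120). For reference, a stdlib-only sketch of the same retry pattern (the helper name retry_on is hypothetical, not part of google-cloud-monitoring or the backoff package):

```python
import functools
import time


def retry_on(exc_type, max_time=120.0, base=1.0):
    """Retry the wrapped call on exc_type with exponential backoff.

    Stdlib-only stand-in for the backoff.on_exception(backoff.expo, ...)
    decorator used by the test fixture above; retry_on is a hypothetical
    helper written here for illustration only.
    """
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            deadline = time.monotonic() + max_time
            delay = base
            while True:
                try:
                    return fn(*args, **kwargs)
                except exc_type:
                    now = time.monotonic()
                    if now >= deadline:
                        raise  # out of time: surface the last error
                    time.sleep(min(delay, deadline - now))
                    delay *= 2  # exponential growth, like backoff.expo
        return wrapper
    return decorator
```

In the failing test this is what lets a flaky CreateTimeSeries call (the 500 "Please retry after a few seconds" response above) be retried for up to two minutes before the test is actually marked failed.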

Synthesis failed for python-monitoring

Hello! Autosynth couldn't regenerate python-monitoring. 💔

Here's the output from running synth.py:

Cloning into 'working_repo'...
Switched to branch 'autosynth'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/synth.py.
On branch autosynth
nothing to commit, working tree clean
HEAD detached at FETCH_HEAD
nothing to commit, working tree clean
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:6aec9c34db0e4be221cdaf6faba27bdc07cfea846808b3d3b964dfce3a9a0f9b
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/monitoring/artman_monitoring.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/group_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/group_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/metric.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/metric.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/alert_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/alert_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/dropped_labels.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/dropped_labels.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/group.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/group.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/common.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/common.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/notification.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/notification.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/metric_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/metric_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/service_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/service_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/alert.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/alert.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/uptime_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/uptime_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/mutation_record.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/mutation_record.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/uptime.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/uptime.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/span_context.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/span_context.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/notification_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/notification_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/metric_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/group_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/alert_policy_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/service_monitoring_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/notification_channel_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/uptime_check_service_client.py.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/monitoring_v3/proto/common_pb2.py.
synthtool > No replacements made in google/cloud/monitoring_v3/gapic/alert_policy_service_client.py for pattern then a new `\[CONDITION_ID\]` is created.
, maybe replacement is not longer needed?
synthtool > Replaced '                ::\n\n' in google/cloud/monitoring_v3/gapic/alert_policy_service_client.py.
synthtool > No replacements made in google/cloud/monitoring_v3/proto/metric_service_pb2.py for pattern ^(\s+)have an ``id`` label:  ::      resource.type =
.*, maybe replacement is not longer needed?
synthtool > Replaced 'from google.cloud.monitoring_v3.gapic import notification_channel_service_client\n' in google/cloud/monitoring_v3/__init__.py.
synthtool > Replaced 'notification_channel_service_client.NotificationChannelServiceClient' in google/cloud/monitoring_v3/__init__.py.
.coveragerc
.flake8
.github/CONTRIBUTING.md
.github/ISSUE_TEMPLATE/bug_report.md
.github/ISSUE_TEMPLATE/feature_request.md
.github/ISSUE_TEMPLATE/support_request.md
.github/PULL_REQUEST_TEMPLATE.md
.github/release-please.yml
.gitignore
.kokoro/build.sh
.kokoro/continuous/common.cfg
.kokoro/continuous/continuous.cfg
.kokoro/docs/common.cfg
.kokoro/docs/docs.cfg
.kokoro/presubmit/common.cfg
.kokoro/presubmit/presubmit.cfg
.kokoro/publish-docs.sh
.kokoro/release.sh
.kokoro/release/common.cfg
.kokoro/release/release.cfg
.kokoro/trampoline.sh
CODE_OF_CONDUCT.md
CONTRIBUTING.rst
LICENSE
MANIFEST.in
docs/_static/custom.css
docs/_templates/layout.html
docs/conf.py.j2
noxfile.py.j2
renovate.json
setup.cfg
Running session blacken
Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
pip install black==19.3b0
Error: pip is not installed into the virtualenv, it is located at /tmpfs/src/git/autosynth/env/bin/pip. Pass external=True into run() to explicitly allow this.
Session blacken failed.
synthtool > Failed executing nox -s blacken:

None
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
  File "/tmpfs/src/git/autosynth/working_repo/synth.py", line 99, in <module>
    s.shell.run(["nox", "-s", "blacken"], hide_output=False)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.

Synthesis failed

Google internal developers can see the full log here.

int() argument must be a string, a bytes-like object or a number, not 'dict'

Environment details

  • OS type and version: macOS Big Sur
  • Python version: Python 3.6.10
  • pip version: 21.0.1
  • google-cloud-monitoring version: 2.2.0

Steps to reproduce

  1. Just use the following import in any file: from google.cloud import monitoring_v3.

Code example

from google.cloud import monitoring_v3

Stack trace

  File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/__init__.py", line 18, in <module>
    from .services.alert_policy_service import AlertPolicyServiceClient
  File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/__init__.py", line 18, in <module>
    from .client import AlertPolicyServiceClient
  File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/client.py", line 35, in <module>
    from google.cloud.monitoring_v3.services.alert_policy_service import pagers
  File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/pagers.py", line 29, in <module>
    from google.cloud.monitoring_v3.types import alert
  File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/types/__init__.py", line 18, in <module>
    from .alert import AlertPolicy
  File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/types/alert.py", line 21, in <module>
    from google.cloud.monitoring_v3.types import common
  File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/types/common.py", line 51, in <module>
    class ServiceTier(proto.Enum):
  File "venv/lib/python3.6/site-packages/proto/enums.py", line 72, in __new__
    cls = super().__new__(mcls, name, bases, attrs)
  File ".pyenv/versions/3.6.10/lib/python3.6/enum.py", line 201, in __new__
    enum_member = __new__(enum_class, *args)
TypeError: int() argument must be a string, a bytes-like object or a number, not 'dict'

The problem is in the file google/cloud/monitoring_v3/types/common.py, in the class ServiceTier(proto.Enum). When I remove the class attribute _pb_options = {"deprecated": True}, everything works correctly.
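The failure can be reproduced without proto-plus at all: the stdlib enum machinery treats a plain dict-valued class attribute such as _pb_options as a prospective enum member, and building an int-backed member from a dict raises exactly this TypeError. A minimal sketch (ServiceTier here is a stand-in for the generated class, not the real proto.Enum subclass):

```python
import enum

# Minimal reproduction sketch (assumption: this mirrors the proto.Enum
# failure mode): a dict-valued class attribute that is neither a _sunder_
# name nor a descriptor is collected as an enum member, so the metaclass
# tries to build an int member from the dict and fails at class-creation
# time, matching the "enum_member = __new__(enum_class, *args)" frame in
# the stack trace above.
try:
    class ServiceTier(enum.IntEnum):  # stand-in for the generated class
        SERVICE_TIER_UNSPECIFIED = 0
        _pb_options = {"deprecated": True}  # picked up as a member value
    error = None
except TypeError as exc:
    error = exc  # int() argument must be a string, ... not 'dict'

print(error)
```

This is consistent with the reporter's observation: with the dict attribute removed, no bogus member is created and the class builds normally.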

Thanks!
