datadog / datadog-api-client-go

Golang client for the Datadog API

Home Page: http://datadoghq.dev/datadog-api-client-go/

License: Apache License 2.0

Go 94.29% Shell 0.02% Gherkin 4.76% Python 0.38% Jinja 0.54%
datadog datadog-api golang openapi

datadog-api-client-go's Introduction

datadog-api-client-go

This repository contains a Go API client for the Datadog API.

Requirements

  • Go 1.19+

Layout

This repository contains per-major-version API client packages. Right now, Datadog has two API versions, v1 and v2, plus the shared datadog package.

The API v1 Client

The client library for Datadog API v1 is located in the api/datadogV1 directory. Import it with

import "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"

The API v2 Client

The client library for Datadog API v2 is located in the api/datadogV2 directory. Import it with

import "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"

The Datadog Package

The datadog package for Datadog API is located in the api/datadog directory. Import it with

import "github.com/DataDog/datadog-api-client-go/v2/api/datadog"

Getting Started

Here's an example creating a user:

package main

import (
    "context"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    ctx := context.WithValue(
        context.Background(),
        datadog.ContextAPIKeys,
        map[string]datadog.APIKey{
            "apiKeyAuth": {
                Key: os.Getenv("DD_CLIENT_API_KEY"),
            },
            "appKeyAuth": {
                Key: os.Getenv("DD_CLIENT_APP_KEY"),
            },
        },
    )

    body := *datadogV2.NewUserCreateRequest(
        *datadogV2.NewUserCreateData(
            *datadogV2.NewUserCreateAttributes("[email protected]"),
            datadogV2.UsersType("users"),
        ),
    )

    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    usersApi := datadogV2.NewUsersApi(apiClient)

    resp, r, err := usersApi.CreateUser(ctx, body)
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error creating user: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
        os.Exit(1)
    }
    responseData := resp.GetData()
    fmt.Fprintf(os.Stdout, "User ID: %s\n", responseData.GetId())
}

Save it as example.go, then run go get github.com/DataDog/datadog-api-client-go/v2. Set the DD_CLIENT_API_KEY and DD_CLIENT_APP_KEY environment variables to your Datadog credentials, then run go run example.go.

Unstable Endpoints

This client includes access to Datadog API endpoints while they are in an unstable state and may undergo breaking changes. An extra configuration step is required to enable these endpoints:

    configuration.SetUnstableOperationEnabled("<APIVersion>.<OperationName>", true)

where <APIVersion> is the API version (for example, v1 or v2) and <OperationName> is the name of the method used to interact with that endpoint (for example, GetLogsIndex or UpdateLogsIndex).

Changing Server

When talking to a different server, like the EU instance, change the ContextServerVariables:

    ctx = context.WithValue(ctx,
        datadog.ContextServerVariables,
        map[string]string{
            "site": "datadoghq.eu",
    })

Disable compressed payloads

If you want to disable GZIP compressed responses, set the compress flag on your configuration object:

    configuration.Compress = false

Enable requests logging

If you want to enable requests logging, set the debug flag on your configuration object:

    configuration.Debug = true

Enable retry

If you want to enable retries when a request is rate-limited (HTTP status 429), set EnableRetry to true:

    configuration.RetryConfiguration.EnableRetry = true

The default maximum number of retries is 3; you can change it with MaxRetries:

    configuration.RetryConfiguration.MaxRetries = 3

Configure proxy

If you want to configure a proxy, set the HTTP_PROXY and HTTPS_PROXY environment variables, or set a custom HTTPClient with a proxy configured on the configuration object:

    proxyUrl, _ := url.Parse("http://127.0.0.1:80")
    configuration.HTTPClient = &http.Client{
        Transport: &http.Transport{Proxy: http.ProxyURL(proxyUrl)},
    }

Pagination

Several listing operations have a pagination method to help consume all the items available. For example, to retrieve all your incidents:

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	configuration.SetUnstableOperationEnabled("v2.ListIncidents", true)
	apiClient := datadog.NewAPIClient(configuration)
	incidentsApi := datadogV2.NewIncidentsApi(apiClient)

	resp, _ := incidentsApi.ListIncidentsWithPagination(ctx, *datadog.NewListIncidentsOptionalParameters())
	for paginationResult := range resp {
		if paginationResult.Error != nil {
			fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.ListIncidentsWithPagination`: %v\n", paginationResult.Error)
		}
		responseContent, _ := json.MarshalIndent(paginationResult.Item, "", "  ")
		fmt.Fprintf(os.Stdout, "%s\n", responseContent)
	}

}

Encoder/Decoder

By default, datadog-api-client-go uses the Go standard library encoding/json to encode and decode data. As an alternative, users can opt in to goccy/go-json by specifying the Go build tag goccy_gojson.

In comparison, goccy/go-json showed a significant decrease in CPU time with an increase in memory overhead. For further benchmark information, see the goccy/go-json benchmark section.

Documentation

Developer documentation for API endpoints and models is available on GitHub Pages. Released versions are available on pkg.go.dev.

Contributing

As most of the code in this repository is generated, we will only accept PRs for files that are not modified by our code-generation machinery (changes to the generated files would get overwritten). We happily accept contributions to files that are not autogenerated, such as tests and development tooling.

Author

[email protected]

datadog-api-client-go's People

Contributors

api-clients-generation-pipeline[bot], arvindth, beardeddragon5, bkabrda, code-lucidal58, dependabot[bot], dnaeon, grokify, gzussa, hantingzhang2, houqp, jirikuncar, jmini, kraney, loveisgrief, mcristina422, nathanbaulch, nkzou, nmische, nmuesch, nouemankhal, sebastien-rosset, semtexzv, skarimo, spacether, therve, thiagoarrais, urandom2, wing328, zippolyte


datadog-api-client-go's Issues

Cannot download the client with dd-trace-go

Describe the bug

I want to use the API client with dd-trace-go in a Go module, but dependency resolution fails with the following error.

go: finding module for package gopkg.in/DataDog/dd-trace-go.v1/ddtrace/tracer
go: finding module for package github.com/DataDog/datadog-api-client-go/api/v2/datadog
go: found github.com/DataDog/datadog-api-client-go/api/v2/datadog in github.com/DataDog/datadog-api-client-go v1.0.0-beta.22
go: found gopkg.in/DataDog/dd-trace-go.v1/ddtrace/tracer in gopkg.in/DataDog/dd-trace-go.v1 v1.31.1
go: finding module for package gopkg.in/DataDog/dd-trace-go.v1/contrib/testing
github.com/kamatama41/mymodule imports
        github.com/DataDog/datadog-api-client-go/api/v2/datadog tested by
        github.com/DataDog/datadog-api-client-go/api/v2/datadog.test imports
        github.com/DataDog/datadog-api-client-go/tests imports
        gopkg.in/DataDog/dd-trace-go.v1/contrib/testing: module gopkg.in/DataDog/dd-trace-go.v1@latest found (v1.31.1), but does not contain package gopkg.in/DataDog/dd-trace-go.v1/contrib/testing

To Reproduce

project

.
├── go.mod
└── main.go

go.mod

module github.com/kamatama41/mymodule

go 1.16

main.go

package main

import (
	"github.com/DataDog/datadog-api-client-go/api/v2/datadog"
	"gopkg.in/DataDog/dd-trace-go.v1/ddtrace/tracer"
)

func main() {
	tracer.Start()
	configuration := datadog.NewConfiguration()
	_ = datadog.NewAPIClient(configuration)
}

then run go mod tidy

Expected behavior
No error is returned and go.mod is tidied up.

Environment and Versions (please complete the following information):

  • Go 1.16
  • datadog-api-client-go v1.0.0-beta.22
  • dd-trace-go.v1 v1.31.1

No longer possible to cross compile while using library.

Describe the bug
As of commit 34ff0ef, it is no longer possible to cross compile an application that uses datadog-api-client-go.

To Reproduce

This can be reproduced using the examples shipped with the library, so, on an amd64 system:

You can use any GOARCH that is not your current arch, or any GOOS that is not your current OS.

Explicitly disabling cgo by using CGO_ENABLED=0 works as well.

What you get is:

go build github.com/DataDog/zstd: build constraints exclude all Go files in .../go/pkg/mod/github.com/!data!dog/[email protected]

Expected behavior
The example should build.

Environment and Versions (please complete the following information):
Current git head, and in fact any version after v1.15.0.

Using go 1.18.

Additional context
My specific use case is developing on a darwin system and attempting to cross compile to linux. I can make do with Docker, but that's decidedly frustrating when this worked previously.

In a lot of ways, I suspect the right fix would be in zstd: an alternate version of the code that only compiles when cgo is disabled and returns an error from all of the functions that rely on zstd, indicating a lack of support.

Alternatively, this could be done with a bit less work, and a hair more ugliness, inside datadog-api-client-go, by stubbing out the Content-Encoding == zstd1 case into a function in a separate file and providing a non-cgo version that always returns an error.

Missing fields in MetricQueryMetadata struct

Describe the bug
There are a couple of fields missing in the MetricQueryMetadata struct. These are also missing in the documentation.

Here is the actual response I am getting from the GET /api/v1/query endpoint:

{
    "end": 1622526239000,
    "attributes": {},
    "metric": "custom.load.1",
    "interval": 120,
    "tag_set": [
        "key1:tag1"
    ],
    "start": 1622504760000,
    "length": 96,
    "query_index": 0,
    "aggr": null,
    "scope": "key1:tag1",
    "pointlist": [
    ],
    "expression": "custom.load.1{key1:tag1}",
    "unit": null,
    "display_name": "custom.load.1"
}

As you can see, these fields are missing from the struct and the documentation:

  • attributes
  • tag_set
  • query_index

Environment and Versions (please complete the following information):
A clear and precise description of your setup:

  • version for this project in use: v1.0.0-beta.22

GetUsageBillableSummary returns nil ApplicationSecurityHostTop99

Describe the bug

The GetUsageBillableSummary function does not return the correct value for the ApplicationSecurityHostTop99 field. It appears to be due to a bug on the following line:

https://github.com/DataDog/datadog-api-client-go/blob/master/.generator/schemas/v1/openapi.yaml#L15448

The field name is missing the 'p' on the end. It should be application_security_host_top99p rather than application_security_host_top99.

To Reproduce
Steps to reproduce the behavior:

  1. Call the GetUsageBillableSummary function against an account that monitors application security hosts.
  2. Read the UsageBillableSummaryHour field from the UsageBillableSummaryResponse.
  3. Read the UsageBillableSummaryKeys field from the UsageBillableSummaryHour.
  4. Read the ApplicationSecurityHostTop99 field from the UsageBillableSummaryKeys.

Expected behavior
The ApplicationSecurityHostTop99 field is a non-nil *UsageBillableSummaryBody.

Actual behavior
The ApplicationSecurityHostTop99 field is nil.

Screenshots
Postman screenshot showing the response of the billable-summary API includes the 'p' on the end of application_security_host_top99p:
image

Environment and Versions (please complete the following information):
A DataDog account that uses application security hosts.

`DashboardsApi.GetDashboard()` returns error "Required field height missing"

Describe the bug
Using function DashboardsApi.GetDashboard() to query a new flow style dashboard. For some cases (especially after dragging the groups in 2-column mode) the response is 200, but the client returns an error "Required field height missing".

To Reproduce
Steps to reproduce the behavior:

  1. Create a new style dashboard
  2. Create multiple groups
  3. Drag groups in high-density mode into 2nd column
  4. Save the dashboard and query it with the DashboardsApi.GetDashboard() API; an error is returned despite the HTTP 200 response
  5. Looking at the JSON in the response body, the layout of one widget is "layout":{"is_column_break":true}, which is missing x/y/height/width

Expected behavior
The dashboard is queried successfully.


Review stale PR to unblock issue #1565 for TF Provider for Datadog

Is your feature request related to a problem? Please describe.
This issue is similar to #1621. Namely, there's a feature request for TF Provider for Datadog to add a new resource DataDog/terraform-provider-datadog#1565 which is blocked since CRUD API for managing Confluent Cloud integration is unavailable at the moment.

Describe the solution you'd like
It looks like the corresponding PR has been generated already so we'd like someone from @integrations-tools-and-libraries to review & merge it (cc @therve @jirikuncar @zippolyte @nmuesch @skarimo): #1648

image

Thanks!

How to validate API key with EU site?

Describe the bug

Some of this repo states:

set the environment variable DATADOG_HOST to https://api.datadoghq.eu or override this value directly when creating your client.

Setting DATADOG_HOST doesn't work for me. Where/how can I set the host when creating my client?

Expected behavior

Clear and concise documentation on how to use EU DD host.

GetSLOHistory cannot unmarshal when using optional target param

Describe the bug
Making an apiClient.ServiceLevelObjectivesApi.GetSLOHistory call with the optional target parameter returns an error when unmarshaling: custom is not a valid SLOTimeframe

using the api v1 client

import datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"

To Reproduce
Using this snippet,

sloId := "a04..."
target := float64(65)

to := time.Now()
from := to.Add(-time.Hour * 24 * 7)

ctx := datadog.NewDefaultContext(context.Background())

configuration := datadog.NewConfiguration()
configuration.SetUnstableOperationEnabled("GetSLOHistory", true)

optionalParams := datadog.GetSLOHistoryOptionalParameters{
    Target: &target,
}

apiClient := datadog.NewAPIClient(configuration)
resp, _, err := apiClient.ServiceLevelObjectivesApi.GetSLOHistory(
    ctx, sloId, from.UTC().Unix(), to.UTC().Unix(), optionalParams)
if err != nil {
    fmt.Println("error:", err)
}

The output error message is:

custom is not a valid SLOTimeframe

Unmarshaling is failing at

err = a.client.decode(&localVarReturnValue, localVarBody, localVarHTTPResponse.Header.Get("Content-Type"))
if err != nil {
    newErr := GenericOpenAPIError{
        body:  localVarBody,
        error: err.Error(),
    }

Expected behavior
I expect the function to be able to unmarshal and return data

Environment and Versions (please complete the following information):

  • golang 1.16
  • github.com/DataDog/datadog-api-client-go v1.0.0-beta.22

Additional context
Making the same query from curl/Postman produces valid JSON.

Looking at the json result, I saw a new entry in thresholds:

"thresholds": {
            "7d": {
                "warning": 97.0,
                "warning_display": "97.",
                "target": 95.0,
                "target_display": "95.",
                "timeframe": "7d"
            },
            "30d": {
                "target": 89.99,
                "target_display": "89.99",
                "timeframe": "30d"
            },
            "custom": {
                "target": 65.0,
                "target_display": "65.",
                "timeframe": "custom"
            }
        },

where a timeframe value is the string "custom". This is probably what causes the error when unmarshaling into the SLOThreshold struct:

type SLOThreshold struct {
    // The target value for the service level indicator within the corresponding timeframe.
    Target float64 `json:"target"`
    // A string representation of the target that indicates its precision. It uses trailing zeros to show significant decimal places (e.g. `98.00`). Always included in service level objective responses. Ignored in create/update requests.
    TargetDisplay *string `json:"target_display,omitempty"`
    Timeframe SLOTimeframe `json:"timeframe"`

where "custom" is not a valid value for the allowedSLOTimeframeEnumValues enum

var allowedSLOTimeframeEnumValues = []SLOTimeframe{
    "7d",
    "30d",
    "90d",
}

Examples shown in docs will result in "ctx declared but not used"

Describe the bug

Take the most basic example, to validate an API key: https://github.com/DataDog/datadog-api-client-go/blob/master/api/v1/datadog/docs/AuthenticationApi.md#example

ctx is not used and will result in error ctx declared but not used.

I can guess that ctx is supposed to be used in place of context.Background(), but is this really good documentation, for every single example in this repo and on docs.datadoghq.com/api/v1?

Expected behavior

Example that is ready to be copy, pasted and work without error 🙂

CloudQuery Source Plugin?

Hi Team, hopefully this is right place to ask, if not, I'd appreciate if you can direct me.

I'm the founder of cloudquery.io, a high performance open source ELT framework.

Our users have asked for a Datadog source plugin, and we already shipped a first version.

As we have limited capacity to maintain all plugins, we are usually looking for the official vendor to help maintain it (similar to terraform provider).

I was curious if this would be an interesting collaboration, where we can help with the initial version (already implemented) and you will help maintain it?

This will give your users the ability to sync/ELT Datadog APIs to any of their data lakes/data warehouses/databases easily using any of the growing list of CQ destination plugins.

Best,
Yevgeny

LogsAggregateRequest response returns null values for log timeseries requests

Describe the bug
JSON marshaling of aggregate log timeseries response is broken and always returns null values for the timeseries.

To Reproduce
Steps to reproduce the behavior:

  1. Execute this example.
  2. Notice null values.

Expected behavior
Values are returned for each timeseries that is produced by the query.

Screenshots
image

Environment and Versions (please complete the following information):
A clear and precise description of your setup:

  • datadog-api-client-go v1.14.0
  • golang 1.16.5


UpdateMonitor() uses a different but similar "monitor" object compared to other Monitor functions

Gzip ContentEncoding not working as intended

Describe the bug
Gzip Content Encoding on v1 Submit log does not seem to be working

To Reproduce
Steps to reproduce the behavior:

  1. Use the example Go code with the gzip ContentEncoding. The only difference is applying some HTTPLogItems.
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"
)

var service = "test_service"
var hostname = "test_host"
var source = "test"
var message = "{\"test_key\": \"test_message\"}"

func main() {
	ctx := datadog.NewDefaultContext(context.Background())

	body := []datadog.HTTPLogItem{{
		Ddsource: &source,
		Hostname: &hostname,
		Service:  &service,
		Message:  &message,
	}} // []HTTPLogItem | Log to send (JSON format).
	contentEncoding := datadog.ContentEncoding("gzip") // ContentEncoding | HTTP header used to compress the media-type. (optional)
	ddtags := "env:prod,user:my-user"                  // string | Log tags can be passed as query parameters with `text/plain` content type. (optional)
	optionalParams := datadog.SubmitLogOptionalParameters{
		ContentEncoding: &contentEncoding,
		Ddtags:          &ddtags,
	}

	configuration := datadog.NewConfiguration()

	apiClient := datadog.NewAPIClient(configuration)
	resp, r, err := apiClient.LogsApi.SubmitLog(ctx, body, optionalParams)
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
	// response from `SubmitLog`: interface{}
	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from LogsApi.SubmitLog:\n%s\n", responseContent)
}
  2. Run the code
  3. Check the output
  4. Get the following error:
Error when calling `LogsApi.SubmitLog`: Required field code missing
Full HTTP response: &{400 Bad Request 400 HTTP/2.0 2 0 map[Content-Length:[2] Content-Type:[application/json] Date:[Thu, 02 Sep 2021 21:10:57 GMT]] {{}} 2 [] false false map[] 0xc000190200 0xc00032f8c0}
Response from LogsApi.SubmitLog:
null

Expected behavior
I believe the expected behavior is that the JSON body payload would be gzip-compressed and sent to the DD APIs.

Environment and Versions (please complete the following information):
A clear and precise description of your setup:

  • go version go1.16.6 darwin/amd64
  • github.com/DataDog/datadog-api-client-go v1.3.0

Additional context
The code works perfectly if you remove the gzip encoding optional parameter.
I believe this issue is caused by the Content-Encoding header being added for gzip while the body never gets gzip-compressed.

Required field errors missing in UpdateDashboard operation

Describe the bug
When using the client to update a dashboard with a bunch of QueryValue definitions, I get a panic: Required field errors missing, but there isn't even an errors property on any of the objects.

To Reproduce
Build a simple dashboard with a QueryValue definition and run

	_, _, err = apiClient.DashboardsApi.UpdateDashboard(ctx, dashboardId, body)
	if err != nil {
		panic(err)
	}

and you'll get the stated error

Expected behavior
The dashboard is updated

  • version for this project in use: 1.14.0

Paths with whitespace are escaped incorrectly

When issuing requests that include spaces in parameters, this client escapes parameters incorrectly. I observed this when trying to get a PagerDuty service object:

https://docs.datadoghq.com/api/v1/pagerduty-integration/#get-a-single-service-object

If the service_name is foo - bar, the request issued is GET /api/v1/integration/pagerduty/configuration/services/foo+-+bar but should be GET /api/v1/integration/pagerduty/configuration/services/foo%20-%20bar. This may affect other routes.

I ran into this via https://github.com/terraform-providers/terraform-provider-datadog.

I've submitted a fix to openapi-generator and just wanted to open this issue to track the fix here:

OpenAPITools/openapi-generator#6618

Support for Synthetics CI trigger endpoints

Is your feature request related to a problem? Please describe.
I'd like to use the Go library to trigger and poll the results of Synthetic CI tests:

It appears that API code is updated from a repo:

(But I don't believe that's a public repository to open a pull-request for)

Describe the solution you'd like
Support API spec updates to generate Go code to support the following endpoints:

SyntheticsAssertionOperator - `validatesJSONPath` is missing

This bug is related to this issue on the terraform provider,
https://github.com/terraform-providers/terraform-provider-datadog/issues/566

The current list of Operators is:

// List of SyntheticsAssertionOperator
SYNTHETICSASSERTIONOPERATOR_CONTAINS             SyntheticsAssertionOperator = "contains"
SYNTHETICSASSERTIONOPERATOR_DOES_NOT_CONTAIN     SyntheticsAssertionOperator = "doesNotContain"
SYNTHETICSASSERTIONOPERATOR_IS                   SyntheticsAssertionOperator = "is"
SYNTHETICSASSERTIONOPERATOR_IS_NOT               SyntheticsAssertionOperator = "isNot"
SYNTHETICSASSERTIONOPERATOR_LESS_THAN            SyntheticsAssertionOperator = "lessThan"
SYNTHETICSASSERTIONOPERATOR_MATCHES              SyntheticsAssertionOperator = "matches"
SYNTHETICSASSERTIONOPERATOR_DOES_NOT_MATCH       SyntheticsAssertionOperator = "doesNotMatch"
SYNTHETICSASSERTIONOPERATOR_VALIDATES            SyntheticsAssertionOperator = "validates"
SYNTHETICSASSERTIONOPERATOR_IS_IN_MORE_DAYS_THAN SyntheticsAssertionOperator = "isInMoreThan"

Having this new type will make it possible to import probes containing a test on the values of a JSON response.

SYNTHETICSASSERTIONOPERATOR_JSON_PATH           SyntheticsAssertionOperator = "validatesJSONPath"

Datadog client does not support querying scalar values using `api/v2/query/scalar`

Note:
If you have a feature request, you should contact support so the request can be properly tracked.

Noted.

Is your feature request related to a problem? Please describe.
I want to be able to get scalar value for a query so that I can use it to evaluate SLO/SLI (related).

Based on my understanding:

image

URL: POST | https://app.datadoghq.com/api/v2/query/scalar

Request

{
   "meta":{
      "dd_extra_usage_params":{
         
      }
   },
   "data":[
      {
         "type":"scalar_request",
         "attributes":{
            "formulas":[
               {
                  "formula":"query1"
               }
            ],
            "queries":[
               {
                  "query":"avg:system.load.1{*}.rollup(avg, 1200)",
                  "data_source":"metrics",
                  "name":"query1",
                  "aggregator":"avg"
               }
            ],
            "from":1638368994000,
            "to":1638372594000
         }
      }
   ],
   "_authentication_token":"XXX"
}

Response

{
   "meta":{
      "res_type":"scalar",
      "responses":[
         
      ]
   },
   "data":[
      {
         "type":"scalar_response",
         "attributes":{
            "columns":[
               {
                  "type":"number",
                  "meta":{
                     "unit":null
                  },
                  "values":[
                     0.8087170141667361
                  ],
                  "name":"query1"
               }
            ]
         }
      }
   ]
}

I want to obtain this value programmatically using the Go client library. The v1 version of the client only supports querying time series values.
If I use the same Datadog Notebook as above,
URL: POST | https://app.datadoghq.com/api/v2/query/timeseries

Request

{
   "meta":{
      "dd_extra_usage_params":{
         
      }
   },
   "data":[
      {
         "type":"timeseries_request",
         "attributes":{
            "queries":[
               {
                  "query":"avg:system.load.1{*}.rollup(avg, 1200)",
                  "data_source":"metrics",
                  "name":"query1"
               }
            ],
            "from":1638368994000,
            "to":1638372594000,
            "interval":20000,
            "formulas":[
               {
                  "formula":"query1"
               }
            ]
         }
      }
   ],
   "_authentication_token":"XXX"
}

Response

{
   "meta":{
      "res_type":null,
      "responses":[
         {
            "is_millisecond":true,
            "res_type":"time_series",
            "from_date":1638368994000,
            "to_date":1638372594000,
            "interval":1200000
         }
      ]
   },
   "data":[
      {
         "type":"timeseries_response",
         "attributes":{
            "series":[
               {
                  "unit":null,
                  "group_tags":[
                     
                  ],
                  "query_index":0
               }
            ],
            "times":[
               1638370800000,
               1638372000000
            ],
            "values":[
               [
                  0.811864,
                  0.80557
               ]
            ]
         }
      }
   ]
}

If you take the average of the values in the time series query response above, (0.80557 + 0.811864) / 2 = 0.808717, which is the value you get in the Query Value (scalar) response.

This means that if I had to use the time series query API to mimic the behavior of Query Value, I would have to do the above calculation myself.

Describe the solution you'd like
I want Datadog client to support api/v2/query/scalar so that I can get the same behavior as the UI.

Describe alternatives you've considered
No real alternatives as of now. If I had to mimic the behavior of Query Value using the time series API, I would have to implement the calculation part myself, which Datadog does for me when I use the UI.

Additional context
keptn/integrations#23
Please correct me if I misunderstood something 🙏

RUM application API

Is your feature request related to a problem? Please describe.

We would like to be able to manage RUM applications through the API, so we can manage RUM applications with terraform (related issue: DataDog/terraform-provider-datadog#1058).
The documentation (and this client) seems to only describe endpoints related to RUM events, not applications.
If the endpoints were documented and added to the client, we could add it to the terraform provider.

Describe the solution you'd like

That CRUD endpoints for RUM applications are documented and added to this client (like the /api/v1/rum/projects that seems to be used in the portal today).

Describe alternatives you've considered

Additional context

SubmitMetrics does not comply with http.Error spec, crashes on any spec-compliant error

Describe the bug
The http stdlib has this to say about http.Error*:

func Error
func Error(w ResponseWriter, error string, code int)
Error replies to the request with the specified error message and HTTP code. It does not otherwise end the request; the caller should ensure no further writes are done to w. The error message should be plain text.

However, if you follow this rule for a server which can receive Datadog metrics**, the library will crash when it receives any error generated this way. It expects JSON-formatted error bodies and tries to parse any string it receives as JSON; for an error like:

http.Error(wr, "Client Error: Headers", http.StatusBadRequest)

SubmitMetrics fails with the following message:

level=error msg="invalid character 'C' looking for beginning of value"

*An example of using net/http this way can be viewed at pkg.go.dev/net/http#example-Hijacker
**In my case the server is a proxy which lets us limit our client's access-control exceptions to a single URL we control rather than a selection of Datadog-controlled URLs. Code snippets abstract this away but are simplified versions of the real production code.

To Reproduce
Steps to reproduce the behavior:

1 Create a proxy or other endpoint which accepts HTTP traffic. It can be run anywhere - localhost:80 is fine. Set the endpoint to return a plain-text error message like the above snippet. (https://gist.github.com/jkopczyn/068fd2edd8d259a94d18e0229030d629 should be sufficient, ignoring the request entirely for brevity.)
2. Create an instance of the library which directs a SubmitMetrics request to that endpoint. Configure the library to submit metrics to a nonstandard URL. (https://gist.github.com/jkopczyn/f9dc116cc4c03b6ccaeee25d9313f902 should be sufficient; the metric body doesn't matter.)
3. Run the server and client locally, watching logs of the client.
4. The invalid character 'C' looking for beginning of value message will immediately appear.

Expected behavior
All plain-text error messages SHOULD be accepted. Other messages MAY be accepted; ideally they wouldn't be, but that's a breaking change to the API and doesn't seem crucial to do until and unless a new major version of the API is coming out.

Environment and Versions (please complete the following information):
A clear and precise description of your setup:
Go version 1.16 through 1.18 should make no difference here; different parts of the setup use both, and testing code used during development made no effort to match them. This behavior was consistent throughout.

Gzip ContentEncoding not working as intended

Reopening, as this still does not work per my example code below, using the master module of the datadog-api-client-go repo.

Describe the bug
Gzip Content Encoding on v1 Submit log does not seem to be working

To Reproduce
Steps to reproduce the behavior:

  1. Use the example Go code with the gzip ContentEncoding; the only difference is applying some HTTPLogItems:
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"
)

var service = "test_service"
var hostname = "test_host"
var source = "test"
var message = "{\"test_key\": \"test_message\"}"

func main() {
	ctx := datadog.NewDefaultContext(context.Background())

	body := []datadog.HTTPLogItem{{
		Ddsource: &source,
		Hostname: &hostname,
		Service:  &service,
		Message:  &message,
	}} // []HTTPLogItem | Log to send (JSON format).
	contentEncoding := datadog.ContentEncoding("gzip") // ContentEncoding | HTTP header used to compress the media-type. (optional)
	ddtags := "env:prod,user:my-user"                  // string | Log tags can be passed as query parameters with `text/plain` content type. (optional)
	optionalParams := datadog.SubmitLogOptionalParameters{
		ContentEncoding: &contentEncoding,
		Ddtags:          &ddtags,
	}

	configuration := datadog.NewConfiguration()

	apiClient := datadog.NewAPIClient(configuration)
	resp, r, err := apiClient.LogsApi.SubmitLog(ctx, body, optionalParams)
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
	// response from `SubmitLog`: interface{}
	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from LogsApi.SubmitLog:\n%s\n", responseContent)
}
  2. Run the code
  3. Check the output
  4. Get Error
Error when calling `LogsApi.SubmitLog`: Required field code missing
Full HTTP response: &{400 Bad Request 400 HTTP/2.0 2 0 map[Content-Length:[2] Content-Type:[application/json] Date:[Thu, 02 Sep 2021 21:10:57 GMT]] {{}} 2 [] false false map[] 0xc000190200 0xc00032f8c0}
Response from LogsApi.SubmitLog:
null

Expected behavior
I believe the expected behavior is that the JSON body payload would be gzip-compressed and sent to the DD APIs.

Environment and Versions (please complete the following information):
A clear and precise description of your setup:

  • go version go1.16.6 darwin/amd64
  • github.com/DataDog/datadog-api-client-go v1.3.0

Additional context
The code works perfectly if you remove the gzip encoding optional parameter.
I believe this issue is caused by the gzip header being added while the body never actually gets gzip-compressed.

Missing pagination support for SearchIncidents method

Describe the bug
The SearchIncidents API method supports pagination, but this is not properly exposed through the Datadog Go client. It's also not present in the documentation on the website.

To Reproduce
Steps to reproduce the behavior:

  1. Submit cURL request to API
# Required query arguments
export query="CHANGE_ME"
curl -X GET "https://api.datadoghq.com/api/v2/incidents/search?query=${query}" \
-H "Accept: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
  2. Add the pagination parameters page[size] & page[offset]
export query="CHANGE_ME"
curl -X GET "https://api.datadoghq.com/api/v2/incidents/search?query=${query}&page[size]=1&page[offset]=0" \
-H "Accept: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
  3. See that pagination support is possible for the SearchIncidents endpoint.

Expected behavior
The apiSearchIncidentsRequest should include both page[size] & page[offset] parameters similar to apiListIncidentsRequest since both endpoints support pagination.

Screenshots
Screenshot 2023-03-27 at 5 43 43 PM

Environment and Versions (please complete the following information):
Production Datadog instance, version 2.11.0 of the datadog-api-client-go package.

Add hasExtendedTitle support for security monitoring rule

Note:
If you have a feature request, you should contact support so the request can be properly tracked.

Is your feature request related to a problem? Please describe.
The DataDog Security monitoring rule API includes the hasExtendedTitle field. This is not reflected by the client package, and thus can't be implemented in the DataDog Terraform provider right now. This blocks us from using the Security monitoring rule resource in the DataDog Terraform provider, relying on a third party REST API provider instead.

Describe the solution you'd like
Support the field in the client API.

Describe alternatives you've considered
Use a different client.

Additional context
image

error-tracking alert cannot be parsed

Describe the bug

I get the following error when importing error-tracking alerts in terraform-provider-datadog.

$ terraform import datadog_monitor.my_monitor 123456789
│ Error: object contains unparsed element: map[created:2022-03-14T03:14:44.712215+00:00 created_at:1.647227684e+12 creator:map[email:xxx@xxx handle:xxx@xxx id:xxx name:genki SUGAWARA] deleted:<nil> id:xxx message:@xxx modified:2022-03-14T04:33:12.454800+00:00 multi:true name:test {{[@issue.id].name}} options:map[enable_logs_sample:true escalation_message: groupby_simple_monitor:false include_tags:true new_host_delay:300 notify_audit:false notify_no_data:false restriction_query:<nil> silenced:map[] thresholds:map[critical:1]] org_id:xxx overall_state:No Data overall_state_modified:2022-03-14T03:18:48+00:00 priority:<nil> query:error-tracking-traces("env:staging @issue.age:<=300000").rollup("count").by("@issue.id").last("5m") > 1 restricted_roles:<nil> tags:[] type:error-tracking alert]

This appears to be due to the fact that datadog-api-client-go does not support error-tracking alerts.

https://github.com/DataDog/terraform-provider-datadog/blob/ada5d0ffa90353ba9586cad4555a57db7ef67464/datadog/internal/utils/utils.go#L51-L56

To Reproduce

Execute the following command in terraform:

$ terraform import datadog_monitor.my_monitor 123456789
# import error-tracking monitor:
# {
#   "restricted_roles": null,
#   "tags": [],
#   "deleted": null,
#   "query": "error-tracking-traces(\"env:staging @issue.age:<=300000\").rollup(\"count\").by(\"@issue.id\").last(\"5m\") > 1",
#   ...
#   "type": "error-tracking alert",

Expected behavior

Can parse "type": "error-tracking alert"

Environment and Versions (please complete the following information):

  • Terraform v1.0.11
  • terraform-provider-datadog v3.9.0
    • github.com/DataDog/datadog-api-client-go v1.10.0

Additional context

Support for Post Event endpoint

My team would like to use the go library to post events to the event stream. However, it seems that the post method is missing. Is there any estimated timeframe for when support for this might be added? Thanks!

502 Bad Gateway for LogsApi.ListLogs

From https://github.com/DataDog/datadog-api-client-go/blob/master/api/v2/datadog/docs/LogsApi.md#ListLogs, it says "// LogsListRequest | (optional)".
But if I don't specify the body:
optionalParams := datadog.ListLogsOptionalParameters{ // Body: &body, }
I get a "502 Bad Gateway" error. After some research, the root cause is here:
https://github.com/DataDog/datadog-api-client-go/blob/master/api/v2/datadog/client.go#L466
The value of "body" is nil, but after err = json.NewEncoder(bodyBuf).Encode(body), the bodyBuf contains "null", which causes the 502 error. I simply added
if reflect.ValueOf(body).IsNil() { return nil, nil }
at https://github.com/DataDog/datadog-api-client-go/blob/master/api/v2/datadog/client.go#L451 and everything works. What should the correct solution be?

New v2 CIVisibility api doesn't return an error from a pagination function

Describe the bug
When the CIVisibilityTestsApi.ListCIAppTestEventsWithPagination func is called, all errors are swallowed.

https://github.com/DataDog/datadog-api-client-go/blob/master/api/datadogV2/api_ci_visibility_tests.go#L243-L301

To Reproduce
Steps to reproduce the behavior:

  • Run the function with no API credentials, which results in a 403 in the logs. But no error is returned from the function.

Expected behavior
I would expect errors to be caught in the async func and bubbled up to the calling function using channels or some other method.

Environment and Versions (please complete the following information):
v2 2.5.0

Many documented api_client.MetricsApi.*(...).Execute() calls have more arguments than needed

Describe the bug
Documentation issue:
Many api_client.MetricsApi.*(...).Execute() examples have more arguments than needed
https://github.com/DataDog/datadog-api-client-go/blob/master/api/v1/datadog/docs/MetricsApi.md

Eg :
resp, r, err := api_client.MetricsApi.QueryMetrics(ctx, from, to, query).Execute()

Where (a *MetricsApiService) QueryMetrics takes only a _context.Context arg.

func (a *MetricsApiService) QueryMetrics(ctx _context.Context) apiQueryMetricsRequest {

To Reproduce
Steps to reproduce the behavior:

  1. Go to https://github.com/DataDog/datadog-api-client-go/blob/master/api/v1/datadog/docs/MetricsApi.md
  2. Check references to api_client.MetricsApi.*(...).Execute()
  • ListActiveMetrics
  • MetricsApi
  • QueryMetrics
  • ...
  3. See too many args

Expected behavior
That the documentation reflects the implemented code

README and some examples are out to date

Describe the bug
The CreateUser example in the README, as well as the code under examples/v2/users/CreateUser.go, seems to be out of date. When you add the dependency to a Go mod project, you don't have access to the common package.

For maintainer: Please add the documentation tag

To Reproduce
On an empty project, run: go get github.com/DataDog/datadog-api-client-go
The common package is not accessible, rendering the example unusable.

Fix
The example below fixes the README. I can open a PR for that, but I wanted to get some context first: are the examples somehow auto-generated? If so, I can update those in the PR as well.

package main

import (
	"context"
	"fmt"
	"github.com/DataDog/datadog-api-client-go/api/v2/datadog"
	"os"
)

func main() {
	ctx := context.WithValue(
		context.Background(),
		datadog.ContextAPIKeys,
		map[string]datadog.APIKey{
			"apiKeyAuth": {
				Key: os.Getenv("DD_CLIENT_API_KEY"),
			},
			"appKeyAuth": {
				Key: os.Getenv("DD_CLIENT_APP_KEY"),
			},
		},
	)

	body := *datadog.NewUserCreateRequest(*datadog.NewUserCreateData(*datadog.NewUserCreateAttributes("[email protected]"), datadog.UsersType("users")))

	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)

	resp, r, err := apiClient.UsersApi.CreateUser(ctx, body)
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error creating user: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
	responseData := resp.GetData()
	fmt.Fprintf(os.Stdout, "User ID: %s", responseData.GetId())
}

Add Logs Restrictions Query support

Note:
If you have a feature request, you should contact support so the request can be properly tracked.

Is your feature request related to a problem? Please describe.
We'd like to use the Log Restrictions Queries feature via Terraform which would necessitate it being added to the Go client. It's not currently implemented here.

Describe the solution you'd like
Support for that API implemented in the client.

Describe alternatives you've considered
N/A

Additional context
N/A

Can't import module without pulling in unwanted test dependencies

As an end-user, I'm interested in packages github.com/DataDog/datadog-api-client-go/api/v1/datadog and github.com/DataDog/datadog-api-client-go/api/v2/datadog, which have pleasantly few external dependencies. However, the go.mod also contains many undesirable additional dependencies needed by github.com/DataDog/datadog-api-client-go/tests/..., which are not actually needed to use the library, just to test it.

Please create a separate go module for the test packages to isolate their dependencies. Look at the go.mod file of gopls for an example:

Its development version uses replace to refer to the containing module:

module golang.org/x/tools/gopls
go 1.16
require (
    // ...
	golang.org/x/tools v0.1.0
    // ...
)
replace golang.org/x/tools => ../

While release versions use an exact version instead:

module golang.org/x/tools/gopls
go 1.16
require (
    // ...
	golang.org/x/tools v0.1.1-0.20210504170620-03ebc2c9fca8
    // ...
)

This way, users don't need to depend on, e.g., a release candidate of dd-trace-go.

API does not work for enterprise subdomain xxx.datadoghq.com

Describe the bug
When we use the client with enterprise Datadog, which typically lives at a subdomain such as https://xxx.datadoghq.com, there is an error that states "Error calling : the variable site in the server URL has invalid value https://xxx.datadoghq.com. Must be [ ]"

severity/medium

To Reproduce
Steps to reproduce the behavior:
The error is reproducible by setting DD_SITE to something like abcd.datadoghq.com.

Expected behavior
API should support all datadoghq.com sites.

Screenshots

Environment and Versions (please complete the following information):

Additional context

OpenAPI definition lacks the Submit Metrics endpoint.

Describe the bug
The OpenAPI definition for the Submit Metric datadog API endpoint is missing. Given this missing endpoint, it appears that one cannot submit metrics using this API client.


To Reproduce
Steps to reproduce the behavior:

  1. Navigate to the documentation for Submit Metric and observe that it is a documented endpoint.
  2. Navigate to API v1 openapi.yml and observe that the Submit Metric endpoint is not described.
  3. Navigate to API v2 openapi.yml and observe that the Submit Metric endpoint is not described (expected as the Submit metric endpoint is a V1 endpoint.)

Expected behavior
First I expect the https://docs.datadoghq.com/api/latest documentation to match the openapi description. Thus, I expect to find the Submit Metric endpoint listed as a viable endpoint in the openapi config. It is concerning that the documentation does not match the openapi.

If there is an alternative to the Submit Metric, I would expect that to be called out in https://docs.datadoghq.com/api/latest. If this is an intentional omission, I would similarly expect that to be called out, both in the documentation and in the OpenApi definition (I don't know if the latter is actually possible.)

Screenshots
N/A

Environment and Versions (please complete the following information):
My plan was to submit metrics to datadog when users execute actions from a binary cli on end-user machines. Thus running a datadog agent on an end user machine (internal to the company) is prohibitive and seems like overkill.

Additional context
N/A

DataDog Synthetics API Assertion Operator for Matches Regex and Does Not Match Regex

Note:
If you have a feature request, you should contact support so the request can be properly tracked.

Screenshot 2023-01-13 at 11 08 41 AM

Is your feature request related to a problem? Please describe.
The problem is that not all manually created tests are portable to programmatic test creation via platforms like Terraform or frameworks like Python. The API should handle ALL cases available in the Web UI.

Describe the solution you'd like
Add the following operators

SYNTHETICSASSERTIONOPERATOR_MATCHES_REGEX        SyntheticsAssertionOperator = "matchesRegex"
SYNTHETICSASSERTIONOPERATOR_DOES_NOT_MATCH_REGEX SyntheticsAssertionOperator = "doesNotMatchRegex"

Describe alternatives you've considered
There is no way to programmatically add this type of assertion operator other than through the DataDog API. The online documentation does not clearly outline which operator values are available to use; I had to read through this code base. I also don't see unit tests for regex.

Additional context
A potential test data would be:

{
  type     = "body"
  operator = "validatesJSONPath"
  target   = {
    jsonpath    = "$.PhoneNumber"
    operator    = "matchesRegex"
    targetvalue = "\d+"
  }
}

SyntheticsGlobalVariableParserType is missing "x_path"

Describe the bug
The API returns the value x_path as the parser type, but the client fails with the error "x_path is not a valid SyntheticsGlobalVariableParserType".

I encountered this when using the terraform integration and tried to import a multistep HTTP test. Shortened example response:

{
    "config": {…},
        "steps": [
            {
                "assertions": […],
                "extractedValues": [
                    {
                        "name": "FORM_CS",
                        "parser": {
                            "type": "x_path",
                            "value": "//form[@name=\"form2\"]/input[@name=\"cs\"]/@value"
                        },
                        "type": "http_body"
                    },
                    …
                ],
                "id": "…",
                "name": "…",
                "request": {…},
                "subtype": "http"
            }, …
        ],
    "locations": […],
    "message": "…",
    "monitor_id": …,
    "name": "…",
    "options": {…},
    "public_id": "…",
    "status": "paused",
    "subtype": "multi",
    "tags": […],
    "type": "api"
}

To Reproduce

  1. Create a multistep API test, including extraction of a value using x_path.
  2. Import in terraform-datadog provider.

Expected behavior
Import should work

Environment and Versions (please complete the following information):

  • Terraform v0.14.11
  • provider registry.terraform.io/datadog/datadog v3.2.0

BETA // New monitor `slo alert` type is missing

The new type of monitor is not on the list:
https://github.com/DataDog/datadog-api-client-go/blob/master/api/v1/datadog/api/openapi.yaml#L3261

But the API can return it:

[2] pry(main)> @dog.get_monitor 19079344
=> ["200",
 {"restricted_roles"=>nil,
  "tags"=>[],
  "deleted"=>nil,
  "query"=>"error_budget(\"8a8b101bcede5114ae94943267d58c8b\").over(\"7d\") > 100",
  "message"=>"fooo",
  "id"=>19079344,
  "multi"=>false,
  "name"=>"[TMP] Budget alert on SLO: [SLO] - Server Availability",
  "created"=>"2020-07-03T12:28:25.179891+00:00",
  "created_at"=>1593779305000,
  "creator"=>{"id"=>1000011058, "handle"=>"", "name"=>"chussenot", "email"=>""},
  "org_id"=>1000001108,
  "modified"=>"2020-07-03T12:28:25.179891+00:00",
  "overall_state_modified"=>nil,
  "overall_state"=>"No Data",
  "type"=>"slo alert",
  "options"=>
   {"notify_audit"=>false, "locked"=>false, "silenced"=>{}, "include_tags"=>true, "thresholds"=>{"critical"=>100.0}, "new_host_delay"=>300, "notify_no_data"=>false}}]

Can you check that it's on the openapi future specs @therve?

Cheers,

Convention not followed

Describe the bug
https://github.com/DataDog/datadog-api-client-go/blob/v1.1.0/api/v1/datadog/docs/SyntheticsApi.md#GetAPITest

CreateSyntheticsAPITest and CreateSyntheticsBrowserTest are the only two functions that contain Synthetics in the function name. Other functions don't have this redundancy, like GetAPITest and GetBrowserTest.

To Reproduce
Look at the docs

Expected behavior
Either they should all have Synthetics in the name, or none should. Don't make it inconsistent.

Screenshots
image

Additional context
Looks like this is going to be another breaking change...
#881
#885

Support incident attachments

Is your feature request related to a problem? Please describe.

This is a feature request to add support for incident attachments. Our specific use case is grabbing the postmortem_url for a specific incident, which is exposed in the API like so:

{
  "included": [
      ...
    {
      "type": "incident_attachments",
      "id": "XXX",
      "attributes": {
        "modified": "2022-05-05T20:21:25.194582+00:00",
        "attachment_type": "postmortem",
        "attachment": {
          "documentUrl": "https://app.datadoghq.com/notebook/XXX",
          "title": "Postmortem IR-XXX"
        }
      },
      "relationships": {
        "last_modified_by_user": {
          "data": {
            "type": "users",
            "id": "XXX"
          }
        }
      }
    }
  ],
  ...
}

Right now, it seems like we only deserialize the user attribute.

Describe the solution you'd like

We ended up implementing these structs:

type IncludedResp struct {
	Type       string
	ID         string
	Attributes map[string]interface{}
}

type IncidentResp struct {
	Included []IncludedResp
}

And this deserialization logic:

for _, incl := range data.Included {
	if incl.Type != "incident_attachments" {
		// handle err
	}
	attachType, ok := incl.Attributes["attachment_type"]
	if !ok || attachType != "postmortem" {
		// handle err
	}
	attach, ok := incl.Attributes["attachment"].(map[string]interface{})
	if !ok || attach["documentUrl"] == "" {
		// handle err
	}
	docURL, ok := attach["documentUrl"].(string)
	if !ok {
		// handle err
	}
}

Describe alternatives you've considered

See above.

/cc @mrhwick

LegendLayout and LegendColumns not supported

Is your feature request related to a problem? Please describe.
I was trying to fix an issue in the datadog terraform provider to add support for the above mentioned Timeseries definition arguments, but it appears this API package doesn't support them either.

Describe the solution you'd like
Add support for LegendLayout and LegendColumns to the timeseries dashboard object

Describe alternatives you've considered
The workaround right now is simply to not use those options in terraform.

multi is not a valid SyntheticsTestDetailsSubType error

While using the Datadog API client, we get the error multi is not a valid SyntheticsTestDetailsSubType when the account contains multi-step synthetic tests - https://docs.datadoghq.com/synthetics/multistep/?tab=requestoptions.

I assume it might be because multi is not supported in here - https://github.com/DataDog/datadog-api-client-go/blob/v1.0.0-beta.14/api/v1/datadog/model_synthetics_test_details_sub_type.go#L20

We are using github.com/DataDog/datadog-api-client-go v1.0.0-beta.14

Error log

2021-01-26T02:26:16.042-0800	ERROR	controllers.Datadog	Error when calling SyntheticsApi.ListTests	{"datadog": "default/datadog-sample", "responseError": "json: unsupported type: func() (io.ReadCloser, error)", "error": "multi is not a valid SyntheticsTestDetailsSubType"}
github.com/go-logr/zapr.(*zapLogger).Error
	/Users/ffrancis/workspace/go/pkg/mod/github.com/go-logr/[email protected]/zapr.go:128
guidewire.com/oculus/datadog.(*datadogClient).ListSyntheticTests
	/Users/ffrancis/workspace/oculus-operator/datadog/datadog_client.go:85
guidewire.com/oculus/controllers.(*DatadogReconciler).isSyntheticTestPresent
	/Users/ffrancis/workspace/oculus-operator/controllers/datadog_controller.go:108
guidewire.com/oculus/controllers.(*DatadogReconciler).Reconcile
	/Users/ffrancis/workspace/oculus-operator/controllers/datadog_controller.go:79
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler
	/Users/ffrancis/workspace/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:256
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem
	/Users/ffrancis/workspace/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:232
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker
	/Users/ffrancis/workspace/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:211
k8s.io/apimachinery/pkg/util/wait.JitterUntil.func1
	/Users/ffrancis/workspace/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:152
k8s.io/apimachinery/pkg/util/wait.JitterUntil
	/Users/ffrancis/workspace/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:153
k8s.io/apimachinery/pkg/util/wait.Until
	/Users/ffrancis/workspace/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:88

CIAppTestsAggregationBucketsResponse bucket type incompatibility

Describe the bug
If you group by a facet that is a non-string type, like a number, the client is not able to deserialize into the type CIAppTestsBucketResponse.By which is map[string]string

To Reproduce
Make this request:

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// branch and commit are undefined in the original snippet; placeholders added here.
	branch, commit := "main", "abc1234"
	body := datadogV2.CIAppTestsAggregateRequest{
		Filter: &datadogV2.CIAppTestsQueryFilter{
			From:  datadog.PtrString("now-30d"),
			Query: datadog.PtrString(fmt.Sprintf("@git.branch:%s @git.commit.sha:%s", branch, commit)),
			To:    datadog.PtrString("now"),
		},
		GroupBy: []datadogV2.CIAppTestsGroupBy{
			{
				Facet: "@test.status",
			},
			{
				Facet: "@test.service",
			},
			{
				Facet: "@ci.pipeline.number",
				Limit: datadog.PtrInt64(1),
				Sort:  &datadogV2.CIAppAggregateSort{Order: datadogV2.CIAPPSORTORDER_DESCENDING.Ptr()},
			},
		},
		Options: &datadogV2.CIAppQueryOptions{
			Timezone: datadog.PtrString("UTC"),
		},
		Page: &datadogV2.CIAppQueryPageOptions{
			Limit: datadog.PtrInt32(100),
		},
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewCIVisibilityTestsApi(apiClient)
	resp, r, err := api.AggregateCIAppTestEvents(ctx, body)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityTestsApi.AggregateCIAppTestEvents`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `CIVisibilityTestsApi.AggregateCIAppTestEvents`:\n%s\n", responseContent)
}

resp.Data.Buckets[0].By is empty because it has a number value (@ci.pipeline.number) in it and the type is map[string]string.

If I dump the raw value I get the following, but I'm unable to get anything from the typed map.

{
  "by": {
    "@ci.pipeline.number": 3018,
    "@test.service": "e2e-tests",
    "@test.status": "skip"
  },
  "computes": {
    "c0": 36
  }
}

Expected behavior
CIAppTestsBucketResponse.By type should be map[string]interface{}

Monthly Usage Attribution API response model does not contain all supported metric types

Describe the bug
The API response model for the GetMonthlyUsageAttribution API does not specify fields for all types of MonthlyUsageAttributionSupportedMetrics. For example, the DBM metrics are supported as request parameters, but the response model does not contain the corresponding fields.

To Reproduce
Call the GetMonthlyUsageAttribution API using datadogV1.MONTHLYUSAGEATTRIBUTIONSUPPORTEDMETRICS_DBM_HOSTS_USAGE as the fields parameter.

Expected behavior
The MonthlyUsageAttributionResponse struct should contain the DBM host data, but doesn't due to the field missing.

Environment and Versions (please complete the following information):
github.com/DataDog/datadog-api-client-go/v2 v2.5.0

GetUsageTopAvgMetrics not accepting month param

Describe the bug

I used the example code from GetUsageTopAvgMetrics, but seeing the following error:

./main.go:30:67: too many arguments in call to api_client.UsageMeteringApi.GetUsageTopAvgMetrics
        have (context.Context, time.Time)
        want (context.Context)

To Reproduce
Steps to reproduce the behavior:

  1. Go to GetUsageTopAvgMetrics and copy the example code as main.go
  2. Update DD_CLIENT_API_KEY, DD_CLIENT_APP_KEY
  3. Replace "Get-Date" string with time.Now()
  4. Run go run main.go

Expected behavior
Run without an error

Environment and Versions (please complete the following information):
A clear and precise description of your setup:

  • macOS Mojave v10.14.6
  • go version go1.14.2 darwin/amd64

Please add example to authenticate using application key, and api key

Note:
If you have a feature request, you should contact support so the request can be properly tracked.

Is your feature request related to a problem? Please describe.

Describe the solution you'd like
It is not clear how to authenticate the Go API client with Datadog.

To save time, please add an example of how to authenticate.
I read several sites and tried exporting various environment variables without success. The Java API SDK has some authentication examples, but the Go one does not.

