datadog / dd-trace-dotnet

.NET Client Library for Datadog APM

Home Page: https://docs.datadoghq.com/tracing/

License: Apache License 2.0

C# 76.03% PowerShell 0.13% Shell 0.23% Batchfile 0.03% C++ 19.28% CMake 0.10% Perl 0.07% C 3.71% Objective-C 0.01% Assembly 0.01% Dockerfile 0.13% CSS 0.01% JavaScript 0.01% HTML 0.17% ASP.NET 0.02% Visual Basic .NET 0.01% Pawn 0.02% TSQL 0.04% Logos 0.01%
datadog apm tracing opentracing dotnet profiling

dd-trace-dotnet's Introduction

Datadog APM .NET Client Libraries


This repository contains the sources for the client-side components of the Datadog product suite for Application Telemetry Collection and Application Performance Monitoring for .NET Applications.

Datadog .NET Tracer: A set of .NET libraries that let you trace any piece of your .NET code. It automatically instruments supported libraries out-of-the-box and also supports custom instrumentation to instrument your own code.

This library powers Distributed Tracing, Application Security Management, Continuous Integration Visibility, Dynamic Instrumentation and more.

Datadog .NET Continuous Profiler: Libraries that automatically profile your application. Documentation.

Downloads

  • Windows and Linux installers: GitHub release (latest SemVer)
  • Datadog.Trace: NuGet package
  • Datadog.Trace.OpenTracing: NuGet package

Build status

  • Build status on master: Build

  • Benchmarks dashboard on master: Dashboard

Copyright

Copyright (c) 2017 Datadog https://www.datadoghq.com

License

See license information.

Contact us

Security Vulnerabilities

If you have found a security issue, please contact the security team directly at [email protected].

Other feedback

If you have questions, feedback, or feature requests, reach our support.

dd-trace-dotnet's People

Contributors

andrewlock, anna-git, bmermet, bobuva, bouwkast, cbeauchesne, chrisnas, colin-higgins, daniel-romano-dd, dd-caleb, dudikeleti, duncanista, e-n-0, github-actions[bot], gleocadie, greenmatan, kevingosse, kr-igor, link04, lucaspimentel, macrogreg, nachoechevarria, omerraviv, pablomartinezbernardo, pierotibou, robertpi, shurivich, tonyredondo, vandonr, zacharycmontoya


dd-trace-dotnet's Issues

DataDog.Trace.OpenTracing should support `ISpan.Log` or no-op

When using a framework instrumented with OpenTracing, any time an ISpan.Log call is made, it produces an "ISpan.Log is not implemented by Datadog.Trace" DEBUG message.

This is due to the fact that, as the error message describes, this feature is simply not implemented by DataDog at this time:

public ISpan Log(DateTimeOffset timestamp, IEnumerable<KeyValuePair<string, object>> fields)
{
_log.Debug("ISpan.Log is not implemented by Datadog.Trace");
return this;
}

In systems where lots of data is logged, this makes the debug log quite noisy. More importantly, ISpan.Log is a rather important part of the OpenTracing ecosystem as a whole, since that is often how framework-specific logs and critical events are recorded in the context of a trace.

Describe the solution you'd like
Ideally DataDog should implement the ISpan.Log interface and have it propagate log data to individually traced operations.

If that isn't feasible for whatever reason, either log nothing, or log only the first N occurrences of the issue so the same message isn't repeated endlessly by the frameworks that call the ISpan.Log interface.

Describe alternatives you've considered
Wrapping Datadog.Trace.OpenTracing in our own NuGet package in order to elide the calls to Datadog.Trace.OpenTracing.OpenTracingSpan.Log.
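The "log only the first N occurrences" alternative could look something like the following sketch. This is hypothetical: the counter and MaxLogWarnings cap are illustrative, and only _log and the Log signature come from the snippet quoted above, not from the library's actual internals.

```csharp
// Hypothetical sketch of rate-limiting the warning; the counter and the
// MaxLogWarnings cap are illustrative, not Datadog.Trace internals.
private const int MaxLogWarnings = 5;
private static int _logWarningCount;

public ISpan Log(DateTimeOffset timestamp, IEnumerable<KeyValuePair<string, object>> fields)
{
    // Only the first few calls emit the debug message; later calls are silent.
    if (Interlocked.Increment(ref _logWarningCount) <= MaxLogWarnings)
    {
        _log.Debug("ISpan.Log is not implemented by Datadog.Trace");
    }

    return this;
}
```

The Interlocked increment keeps the guard correct when multiple threads log concurrently.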

Set global tags

Is there any way to set global tags for a trace, as is available in the Node (dd-trace-js) and Go (dd-trace-go) libraries?
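For reference, later versions of Datadog.Trace added an equivalent. A hedged sketch (the exact API surface varies by tracer version; the DD_TAGS environment variable is the configuration-based equivalent):

```csharp
// Sketch assuming a newer Datadog.Trace version that exposes GlobalTags on
// TracerSettings; tag keys/values here are illustrative.
using Datadog.Trace;
using Datadog.Trace.Configuration;

var settings = TracerSettings.FromDefaultSources();
settings.GlobalTags["env"] = "staging";    // applied to every span
settings.GlobalTags["team"] = "payments";

// How the settings are applied depends on the version
// (a Tracer constructor in older releases, Tracer.Configure in newer ones).
Tracer.Configure(settings);
```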

Make TraceId/SpanId public

Is your feature request related to a problem? Please describe.
I want to forward the TraceId and SpanId, but I can't access them.
I saw in the source that the IDs are public on the SpanContext, but the SpanContext isn't public on the Span.
How can I access them?

Describe the solution you'd like
Being able to access the TraceId/SpanId of the current span or scope.
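For reference, later versions of Datadog.Trace expose these directly; a sketch assuming such a version:

```csharp
// Sketch assuming a newer Datadog.Trace version: the static
// CorrelationIdentifier class surfaces the active trace/span IDs so they
// can be forwarded or stamped onto log events.
using System;
using Datadog.Trace;

var traceId = CorrelationIdentifier.TraceId;  // 0 when no span is active
var spanId = CorrelationIdentifier.SpanId;
Console.WriteLine($"trace_id={traceId} span_id={spanId}");
```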

Instrument ASP.NET lifecycle events without requiring NuGet package

Hi -- for ASP.NET integration running in IIS (Web Forms, MVC, Web API, etc.) using automatic instrumentation, is adding the NuGet package to the solution/project optional or required?

In other words, can we just install the regular DD agent and the APM agent (installed via MSI)?

Thanks.

build.ps1 doesn't work

Describe the bug

When I tried to build by running build.ps1, I hit some issues.

To Reproduce
Steps to reproduce the behavior:

  1. Clone a fresh copy of the repo
  2. Run build.ps1

Expected behavior

I just want all of the projects to build successfully.


Sending traces to the agent in k8s

Describe the bug

Getting the following error trying to send traces to my DD agent running in Kubernetes

{"Timestamp":"2018-11-01T14:24:57.5749038+00:00","Level":"Error","MessageTemplate":"An error occurred while sending traces to the agent at {Endpoint}","Exception":"System.Net.Http.HttpRequestException: Cannot assign requested address ---> System.Net.Sockets.SocketException: Cannot assign requested address\n at System.Net.Http.ConnectHelper.ConnectAsync(String host, Int32 port, CancellationToken cancellationToken)\n --- End of inner exception stack trace ---\n at System.Net.Http.ConnectHelper.ConnectAsync(String host, Int32 port, CancellationToken cancellationToken)\n at System.Threading.Tasks.ValueTask`1.get_Result()\n at System.Net.Http.HttpConnectionPool.CreateConnectionAsync(HttpRequestMessage request, CancellationToken cancellationToken)\n at System.Threading.Tasks.ValueTask`1.get_Result()\n at System.Net.Http.HttpConnectionPool.WaitForCreatedConnectionAsync(ValueTask`1 creationTask)\n at System.Threading.Tasks.ValueTask`1.get_Result()\n at System.Net.Http.HttpConnectionPool.SendWithRetryAsync(HttpRequestMessage request, Boolean doRequestAuth, CancellationToken cancellationToken)\n at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)\n at System.Net.Http.HttpClient.FinishSendAsyncBuffered(Task`1 sendTask, HttpRequestMessage request, CancellationTokenSource cts, Boolean disposeCts)\n at Datadog.Trace.Agent.Api.SendAsync[T](T value, Uri endpoint)","Properties":{"Endpoint":"http://localhost:8126/v0.3/traces","SourceContext":"Datadog.Trace.Agent.Api"}}

To Reproduce

string ddAgentAddr = String.IsNullOrWhiteSpace(Environment.GetEnvironmentVariable("DD_AGENT_ADDR")) ? "http://localhost:8126" : $"http://{Environment.GetEnvironmentVariable("DD_AGENT_ADDR")}";
System.Uri uri = new System.Uri(ddAgentAddr);
Tracer.Create(uri);

Expected behavior
Traces show up fine if I run my service locally and send to a local DD agent. I'm sure the agent runs fine in k8s, as my other services, using different libraries, are able to send traces to it.

Runtime environment (please complete the following information):

  • Instrumentation mode: manual with NuGet
  • Tracer version: tried both 0.4.1-beta and 0.5.0-beta

Additional context

I'm passing http://192.168.64.30:8126 as the URI, but I notice the error contains "Endpoint":"http://localhost:8126/v0.3/traces". Not sure if the default of localhost:8126 is overriding the value passed to Tracer.Create?

In trace.agent.log, there are no error messages


2018-11-01 14:40:03 INFO (trace_writer.go:97) - flushed trace payload to the API, time:253.26402ms, size:307 bytes
2018-11-01 14:40:13 INFO (trace_writer.go:97) - flushed trace payload to the API, time:163.809104ms, size:555 bytes
2018-11-01 14:40:18 INFO (service_mapper.go:59) - total number of tracked services: 0
2018-11-01 14:40:18 INFO (stats_writer.go:265) - flushed stat payload to the API, time:247.369233ms, size:742 bytes
2018-11-01 14:40:28 INFO (stats_writer.go:265) - flushed stat payload to the API, time:274.774159ms, size:419 bytes
2018-11-01 14:40:38 INFO (stats_writer.go:265) - flushed stat payload to the API, time:298.825843ms, size:493 bytes
2018-11-01 14:40:58 INFO (api.go:324) - [lang:go lang_version:1.11beta1 interpreter:gc-amd64-linux tracer_version:v1.0] -> traces received: 3, traces dropped: 0, traces filtered: 0, traces amount: 1205 bytes, services received: 0, services amount: 0 bytes
2018-11-01 14:40:58 INFO (api.go:324) - [lang:nodejs lang_version:v9.4.0 interpreter:v8 tracer_version:0.6.0] -> traces received: 2, traces dropped: 0, traces filtered: 0, traces amount: 3145 bytes, services received: 0, services amount: 0 bytes

I can see that other services are able to send to the DD agent.
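The "Cannot assign requested address" pattern in Kubernetes usually means the tracer is still targeting localhost inside the pod, where no agent is listening. A hedged sketch of wiring the agent address through an environment variable instead (DD_AGENT_HOST is the conventional name in Datadog tracers; in k8s it is typically populated with the node IP via the downward API, fieldRef: status.hostIP, so the tracer reaches the node-local agent DaemonSet):

```csharp
// Sketch: resolve the agent host from the environment instead of defaulting
// to localhost; inside a pod, localhost is the pod itself, not the node
// where the agent DaemonSet listens.
using System;

var agentHost = Environment.GetEnvironmentVariable("DD_AGENT_HOST") ?? "localhost";
var agentUri = new Uri($"http://{agentHost}:8126");
Tracer.Create(agentUri);  // same entry point as the repro above
```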

automatic instrumentation under linux does not work

Describe the bug
I saw on your homepage that automatic instrumentation is now available and would like to try it, but either it doesn't work or I'm doing something wrong.

I can't find any exceptions anywhere.

Runtime environment (please complete the following information):

  • Instrumentation mode: automatic with deb packages
  • Tracer version: 0.5.1-beta
  • OS: Linux / Docker image ( microsoft/dotnet:2.1-runtime-deps-stretch-slim )
  • CLR: .Net Core 2.1

Additional context
I followed this guide:

https://docs.datadoghq.com/tracing/setup/dotnet/#add-the-datadog-trace-clrprofiler-managed-nuget-package

Below are some lines from my Dockerfile:

# Install Datadog APM package
RUN mkdir -p /tmp
RUN curl -L https://github.com/DataDog/dd-trace-dotnet/releases/download/v0.5.1-beta/datadog-dotnet-apm_0.5.1_amd64.deb --output /tmp/datadog_apm.deb
RUN sudo dpkg -i /tmp/datadog_apm.deb


# Set up Datadog APM
ENV CORECLR_ENABLE_PROFILING=1
ENV CORECLR_PROFILER={846F5F1C-F9AE-4B07-969E-05C26BC060D8}
ENV CORECLR_PROFILER_PATH=/opt/datadog/Datadog.Trace.ClrProfiler.Native.so
ENV DD_INTEGRATIONS=/opt/datadog/integrations.json

And I have these NuGet packages installed:

Datadog.Trace 0.5.1-beta
Datadog.Trace.ClrProfilerManaged 0.5.1-beta

Cannot overwrite DefaultServiceName when using OpenTracer

Describe the bug

The OpenTracingSpan has functionality to overwrite the resource name and span type, but overwriting the service name is missing:

https://github.com/DataDog/dd-trace-csharp/blob/5e7656e/src/Datadog.Trace.OpenTracing/OpenTracingSpan.cs#L116-L139

Expected behavior

Calling scope.Span.SetTag(DatadogTags.ServiceName, "a.new.service.name"); should overwrite the default service name defined in OpenTracingTracerFactory.CreateTracer(new Uri(_configuration["datadog:endpointUri"]), "default.service.name");

Additional context

According to the unit tests defined here: https://github.com/DataDog/dd-trace-csharp/blob/0107d72a650c6536412b8db7b3393226418cfba3/src/Datadog.Trace.Tests/SpanBuilderTests.cs#L117-L142 this behavior works for the normal Datadog tracing span, but it was seemingly forgotten in OpenTracingSpan.

I can't use the Datadog tracer directly, because I use OpenTracing for EF Core and those two libraries don't seem to mix well at the moment.

Profiler not writing to log file on Linux

Describe the bug
I am trying to run the APM in a Docker container for a .NET Core API. When I check the log file to see if any events have been recorded, it doesn't exist, and neither does the /var/log/datadog directory.

To Reproduce
Sample Dockerfile

ENV CORECLR_ENABLE_PROFILING 1
ENV CORECLR_PROFILER {846F5F1C-F9AE-4B07-969E-05C26BC060D8}
ENV CORECLR_PROFILER_PATH /opt/datadog/Datadog.Trace.ClrProfiler.Native.so
ENV DD_INTEGRATIONS /opt/datadog/integrations.json
ENV DD_TRACE_LOG_PATH /var/log/datadog/dotnet-profiler.log

RUN curl -LO https://github.com/DataDog/dd-trace-dotnet/releases/download/v0.5.2-beta/datadog-dotnet-apm_0.5.2_amd64.deb \
	&& dpkg -i ./datadog-dotnet-apm_0.5.2_amd64.deb

Nuget Packages included in Project:

    <PackageReference Include="Datadog.Trace" Version="0.5.2-beta" />
    <PackageReference Include="Datadog.Trace.ClrProfiler.Managed" Version="0.5.2-beta" />

Environment Variables after container is run:

printenv
CORECLR_PROFILER={846F5F1C-F9AE-4B07-969E-05C26BC060D8}
ASPNETCORE_URLS=http://*:8080
DD_INTEGRATIONS=/opt/datadog/integrations.json
CORECLR_ENABLE_PROFILING=1
CORECLR_PROFILER_PATH=/opt/datadog/Datadog.Trace.ClrProfiler.Native.so
DD_TRACE_LOG_PATH=/var/log/datadog/dotnet-profiler.log

Verification that ClrProfiler installed correctly:

opt/datadog# ls
Datadog.Trace.ClrProfiler.Native.so  integrations.json

Runtime environment (please complete the following information):

  • Instrumentation mode: Automatic with Nuget Package
  • Tracer version: 0.5.2
  • OS: NAME="Debian GNU/Linux", VERSION_ID="9", VERSION="9 (stretch)"
  • CLR: .NetCore 2.1

Let me know if you need any more information from me. The Docker image in question is for a production application, so I can't provide the image itself.

APM causes error 500 in IIS site

Describe the bug
When I install DD APM for .NET Core x64 v4.1.0.0, my app suddenly starts returning 500 errors (Internal Server Error). My app pool hosts an app that runs on port 5007. If I start the application from cmd with "dotnet app.dll", it loads up just fine and I can access the web app as expected. So under IIS it doesn't work, but with the dotnet CLI it does.

To Reproduce
  1. Have a .NET Core 4.1.0.0 app running on port 5007
  2. Install APM
  3. Stop/start IIS using the commands net stop was /y and net start w3svc provided by the guide
  4. Try to access the website

Expected behavior
My default IIS app to work


Runtime environment (please complete the following information):

  • Instrumentation mode: tried both
  • Tracer version: DatadogDotNetTracing-0.7.0-x64.msi
  • OS: Windows Server 2016 Datacenter
  • CLR: .NetCore 4.1.0.0

Additional context
I changed the install to include the NuGet package Datadog.Trace.ClrProfiler.Managed, but it just ends up telling me that "website name cannot be empty".

HttpHeadersCodec.Extract Throws ArgumentException because a default trace-id isn't found

Describe the bug
I'm using Datadog.Trace.OpenTracing with OpenTracing and OpenTracing.Contrib.NetCore to attempt some automatic tracing.

When my MVC app is the "top" of the execution chain, the DD tracer attempts to extract two headers, x-datadog-parent-id and x-datadog-trace-id, fails, and throws an ArgumentException.

https://github.com/DataDog/dd-trace-csharp/blob/8d49793b905dff599671858aa76c99cd3726e417/src/Datadog.Trace.OpenTracing/HttpHeadersCodec.cs

To Reproduce
Steps to reproduce the behavior:

  • Use OpenTracing.Contrib to inject a Tracer into an MVC web api
  • Manually instrument a method

Expected behavior
If header extraction fails, assume this is the root application: set a default x-datadog-trace-id and allow an empty x-datadog-parent-id.
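That expected behavior could be sketched roughly as follows. The names here (BuildContext in particular) are illustrative, not the library's actual internals; the key point is that the OpenTracing API treats a null return from Extract as "no incoming context", so the tracer would start a new root span instead of throwing.

```csharp
// Hypothetical sketch of the proposed fallback: missing propagation headers
// mean "no parent" rather than an error.
ISpanContext Extract(ITextMap headers)
{
    string traceId = null, parentId = null;

    foreach (var entry in headers)
    {
        if (entry.Key.Equals("x-datadog-trace-id", StringComparison.OrdinalIgnoreCase))
            traceId = entry.Value;
        else if (entry.Key.Equals("x-datadog-parent-id", StringComparison.OrdinalIgnoreCase))
            parentId = entry.Value;
    }

    if (traceId == null || parentId == null)
        return null;  // root of the trace: nothing to attach to

    return BuildContext(ulong.Parse(traceId), ulong.Parse(parentId));  // BuildContext is illustrative
}
```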

Runtime environment (please complete the following information):

  • Instrumentation mode: manual with NuGet package
  • Tracer version: 0.5.0
  • OS: Windows 10
  • CLR: .NET Core 2.1

Stack trace

System.ArgumentException: x-datadog-parent-id should be set.
   at Datadog.Trace.OpenTracing.HttpHeadersCodec.Extract(Object carrier)
   at Datadog.Trace.OpenTracing.OpenTracingTracer.Extract[TCarrier](IFormat`1 format, TCarrier carrier)
   at OpenTracing.Contrib.NetCore.AspNetCore.HostingEventProcessor.ProcessEvent(String eventName, Object arg) in C:\projects\csharp-netcore\src\OpenTracing.Contrib.NetCore\AspNetCore\HostingEventProcessor.cs:line 50
   at OpenTracing.Contrib.NetCore.AspNetCore.AspNetCoreDiagnostics.OnNext(String eventName, Object untypedArg) in C:\projects\csharp-netcore\src\OpenTracing.Contrib.NetCore\AspNetCore\AspNetCoreDiagnostics.cs:line 51
   at OpenTracing.Contrib.NetCore.Internal.DiagnosticListenerObserver.System.IObserver<System.Collections.Generic.KeyValuePair<System.String,System.Object>>.OnNext(KeyValuePair`2 value) in C:\projects\csharp-netcore\src\OpenTracing.Contrib.NetCore\Internal\DiagnosticListenerObserver.cs:line 46

An error occurred on application exit

Describe the bug
I'm working on a C# application with Datadog integration.
When I terminate my app using Ctrl-C, it leads to the following error:

Unhandled Exception: System.AggregateException: One or more errors occurred. (An attempt was made to transition a task to a final state when it had already completed.) ---> System.InvalidOperationException: An attempt was made to transition a task to a final state when it had already completed.
at Datadog.Trace.Agent.AgentWriter.FlushAndCloseAsync()
--- End of inner exception stack trace ---
at System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
at Datadog.Trace.Tracer.CurrentDomain_UnhandledException(Object sender, UnhandledExceptionEventArgs e)

Looks like FlushAndCloseAsync is called twice in this case:

  1. The Console.CancelKeyPress event handler inside Tracer.cs.
  2. The ProcessExit event handler inside Tracer.cs.
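A hypothetical sketch of a fix (not the actual Tracer internals): make the flush idempotent, so whichever handler fires second becomes a no-op instead of re-completing an already-finished task.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sketch: run the flush at most once, so when both
// CancelKeyPress and ProcessExit fire, the second call does nothing.
class ShutdownGuard
{
    private int _flushed;

    public Task FlushAndCloseOnceAsync(Func<Task> flushAndClose)
    {
        // Interlocked.Exchange keeps the guard safe even if both handlers
        // run concurrently; only the first caller performs the flush.
        return Interlocked.Exchange(ref _flushed, 1) == 0
            ? flushAndClose()
            : Task.CompletedTask;
    }
}
```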

To Reproduce
Steps to reproduce the behavior:

  1. Start console app.
  2. Terminate by ctrl-c

Runtime environment (please complete the following information):

  • Tracer version: 0.3.2
  • OS: Windows 10
  • CLR: .Net Core 2.1

An error occurred while sending traces to the agent

Describe the bug
Sometimes (not every time) we get the following error message:

System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> System.IO.IOException: The server returned an invalid or unrecognized response.
at System.Net.Http.HttpConnection.SendAsyncCore(HttpRequestMessage request, CancellationToken cancellationToken)
--- End of inner exception stack trace ---
at System.Net.Http.HttpConnection.SendAsyncCore(HttpRequestMessage request, CancellationToken cancellationToken)
at System.Net.Http.HttpConnectionPool.SendWithRetryAsync(HttpRequestMessage request, Boolean doRequestAuth, CancellationToken cancellationToken)
at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
at System.Net.Http.HttpClient.FinishSendAsyncBuffered(Task`1 sendTask, HttpRequestMessage request, CancellationTokenSource cts, Boolean disposeCts)
at Datadog.Trace.Agent.Api.SendAsync[T](T value, Uri endpoint)

To Reproduce
I don't know; it's completely random. :(

Runtime environment (please complete the following information):

  • Instrumentation mode: manual
  • Tracer version: 0.3.0
  • OS: Linux / Docker image ( microsoft/dotnet:2.1-runtime-deps-stretch-slim )
  • CLR: .Net Core 2.1

Additional context
If possible, see request 167415 with DD support.

I also tried enabling "isDebugEnabled", but I can't see any other log messages (using NLog).

Datadog.Trace.ClrProfiler.Native.Tests DEBUG Problem

Hi,
I'm a C++ beginner.
I used VS2017 to run Datadog.Trace.ClrProfiler.Native.Tests and debug metadata_builder_test.cpp; metadata_dispenser_->OpenScope on "Samples.ExampleLibrary.dll" returns ERROR_FILE_NOT_FOUND.
I don't know why. Can you help me?

ADO.NET auto-tracing should support other DbCommand.ExecuteXXX methods

Are you requesting automatic instrumentation for a framework or library? Please describe.

  • Framework or library name : SqlClient
  • Library type: System.Data.SqlClient.SqlCommand

Describe the solution you'd like
The current instrumentation only supports ExecuteReader. The other execute methods ExecuteXmlReader, ExecuteScalar and ExecuteNonQuery should also be supported.

Instrumenting filesystem classes

Hi ,
I see that the CLR profiler prohibits instrumenting .NET Framework classes.
Is there a plan to support filesystem classes? Or can someone guide me on how to do this?

Thanks,
Hemant.

Make SamplingPriority on SpanContext public

Is your feature request related to a problem? Please describe.

internal SamplingPriority? SamplingPriority { get; }

I'm trying to forward this information to the next service (over Kafka), but the SamplingPriority property on the SpanContext is internal. Is there a reason for that?
Is it possible to change it?

Describe the solution you'd like
Make it public instead of internal.
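As a workaround that avoids touching SpanContext internals, the OpenTracing API can serialize the full propagation state (trace ID, parent ID, and sampling priority) into a plain dictionary that can ride along as Kafka message headers. A sketch assuming an OpenTracing ITracer (`tracer`) and an active ISpan (`span`) are in scope:

```csharp
using System.Collections.Generic;
using OpenTracing;
using OpenTracing.Propagation;

// Inject writes the propagation headers into a carrier we control.
var headers = new Dictionary<string, string>();
tracer.Inject(span.Context, BuiltinFormats.TextMap, new TextMapInjectAdapter(headers));

// `headers` now contains the propagation keys (x-datadog-trace-id,
// x-datadog-parent-id, x-datadog-sampling-priority, ...) and can be copied
// onto the outgoing Kafka message; the consuming side calls tracer.Extract
// with a TextMapExtractAdapter to continue the trace.
```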

Custom Span Tagging w/ Middleware - aspnetcore 2.1

I'm working with the Datadog.Trace.ClrProfiler.Managed nuget package, version 0.5.2-beta on an ASP.NET Core 2.1 application, that is running on Kubernetes.

The DataDog Agent is running in a DaemonSet and is configured properly for APM (as far as I've observed so far).

I'm trying to set span tags using middleware with:

Create an ApplicationBuilder extension method:

using System.Threading.Tasks;
using Datadog.Trace;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

namespace MyCoolService.Middleware
{
    public class DataDogTracing
    {
        private readonly RequestDelegate _next;
        private string _serviceName;

        public DataDogTracing(RequestDelegate next, string serviceName)
        {
            _next = next;
            _serviceName = serviceName;
        }
        public async Task InvokeAsync(HttpContext context)
        {
            using (var scope = Tracer.Instance.StartActive("aspnet_core_mvc.request", serviceName:_serviceName))
            {
                var span = scope.Span;
                span.Type = SpanTypes.Web;
                span.ResourceName = context.Request.Path;
                span.SetTag(Tags.HttpMethod, context.Request.Method);
                span.SetTag("sourcecategory", "sourcecode");
                span.SetTag("source", "csharp");

                await _next(context);
            }
        }
    }

    public static class DataDogTracingExtensions
    {
        public static IApplicationBuilder UseDataDogTracing(this IApplicationBuilder builder, string serviceName = "web")
        {
            return builder.UseMiddleware<DataDogTracing>(serviceName);
        }
    }
}

And then in Startup.cs:

        public void Configure(IApplicationBuilder app, IHostingEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }
            else
            {
                app.UseHsts();
            }

            app.UseMvc();

            app.UseDataDogTracing("myCoolService");
        }

This kind of works; I see traces showing up in our APM trace list, but it seems like the entire InvokeAsync block isn't being recorded or flushed at all. I do know the block is invoked on each request, since I've stepped through the code locally.

Any ideas?

How to get a method signature

I see that the internal SqlDataReader ExecuteReader(CommandBehavior behavior, string method) signature is 20 02 0C 52 08 0B 52 5B 0E, but using CFF Explorer I find the signature 20 02 12 87 00 11 82 54 0E. How do I convert between them?

I want to write some code to wrap MySqlConnector via Datadog.Trace.ClrProfiler.Native.dll. How can I get a method signature from another assembly?

thanks.

Environment variables missing?

On Windows Server, with .NET web apps running under IIS, I installed APM via the MSI installer. I see the registry keys (System\CurrentControlSet\Services\W3SVC\Environment and System\CurrentControlSet\Services\WAS\Environment). However, when running SET at the command prompt, the only DD-related variable is DD_INTEGRATIONS (which points to a non-existent file). Do I need to add the others, or is the registry value sufficient? Note: in our case we want to automatically instrument all IIS web apps.

Could not load file or assembly System.CodeDom

Describe the bug
I updated from 0.3.1-beta to 0.5-beta and got an exception in all our services (Linux/Windows):

System.Reflection.ReflectionTypeLoadException: Unable to load one or more of the requested types.
Could not load file or assembly 'System.CodeDom, Version=4.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'. The system cannot find the file specified.
Could not load file or assembly 'System.CodeDom, Version=4.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'. The system cannot find the file specified.
   at System.Reflection.RuntimeModule.GetTypes(RuntimeModule module)
   at System.Reflection.RuntimeModule.GetTypes()
   at System.Reflection.Assembly.GetTypes()
   at flowfact.services.attributes.ServiceRegister.<>c__DisplayClass3_0.<FindTypesByAttribute>b__1(Assembly s)
   at System.Linq.Enumerable.WhereSelectArrayIterator`2.MoveNext()
   at System.Linq.Enumerable.SelectManySingleSelectorIterator`2.ToList()
   at flowfact.services.attributes.ServiceRegister.FindTypesByAttribute(Type[] types)
System.IO.FileNotFoundException: Could not load file or assembly 'System.CodeDom, Version=4.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'. The system cannot find the file specified.
File name: 'System.CodeDom, Version=4.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'
System.IO.FileNotFoundException: Could not load file or assembly 'System.CodeDom, Version=4.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'. The system cannot find the file specified.

File name: 'System.CodeDom, Version=4.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'

Runtime environment (please complete the following information):

  • Instrumentation mode: manual
  • Tracer version: 0.5.0-beta
  • OS: Linux / Docker image ( microsoft/dotnet:2.1-runtime-deps-stretch-slim )
  • CLR: .Net Core 2.1

Additional context
I don't know what the exact problem is, but I think the lines below produce the error.
With 0.3.1-beta everything works fine (I already downgraded); with 0.5-beta I can reproduce the exception above.

private static List<(Type Type, object[] CustomAttributes)> FindTypesByAttribute(params Type[] types)
{
    try
    {
        var allTypes = AppDomain.CurrentDomain.GetAssemblies()
            .Where(s => !s.FullName.StartsWith("Microsoft") && !s.FullName.StartsWith("System."))
            .Select(s => s.GetTypes()
                .Select(type => (Type: type, CustomAttributes: type.GetCustomAttributes(true)))
                .Where(z => z.CustomAttributes.Length > 0 && z.CustomAttributes.Any(t => types.Any(i => i == t.GetType())))
            );

        return allTypes.SelectMany(s => s).ToList();
    }
    catch (Exception ex)
    {
        Logger.Error(ex);
    }
    return null;
}
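One common workaround on the consuming side: ReflectionTypeLoadException still carries the types that did load, so a defensive replacement for s.GetTypes() can keep those and skip only the unresolvable ones. A sketch of that standard pattern (a mitigation in the scanning code, not a fix for the tracer itself):

```csharp
using System;
using System.Linq;
using System.Reflection;

static class AssemblyExtensions
{
    // Returns the types that could be loaded from an assembly, swallowing
    // the per-type load failures (e.g. a missing System.CodeDom reference)
    // that make Assembly.GetTypes() throw ReflectionTypeLoadException.
    public static Type[] GetLoadableTypes(this Assembly assembly)
    {
        try
        {
            return assembly.GetTypes();
        }
        catch (ReflectionTypeLoadException ex)
        {
            // ex.Types contains null entries for the types that failed.
            return ex.Types.Where(t => t != null).ToArray();
        }
    }
}
```

In FindTypesByAttribute above, replacing s.GetTypes() with s.GetLoadableTypes() would let the scan continue past the assembly that references System.CodeDom.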

UnexpectedElasticsearchClientException when using Elasticsearch.net 5.x

I installed trace agent 0.7 to test it out. After installing, every call to Elasticsearch fails; after uninstalling the trace agent, things returned to normal. Something being injected into the code causes a cast exception. All the calls to Elasticsearch are async. This is a C# .NET 4.7.2 app running in IIS on Windows 2016. I am using Elasticsearch.Net 5.6.1 and NEST 5.6.1 to make my outbound API calls.

I see different types of UnexpectedElasticsearchClientException in my logs:

Unable to cast object of type 'System.Threading.Tasks.Task`1[Elasticsearch.Net.ElasticsearchResponse`1[Nest.SearchResponse`1[MyNameSpace.MyClass]]]' to type 'System.Threading.Tasks.Task`1[Nest.SearchResponse`1[MyNameSpace.MyClass]]'.

and

Unable to cast object of type 'System.Threading.Tasks.Task`1[Elasticsearch.Net.ElasticsearchResponse`1[Nest.MultiGetResponse]]' to type 'System.Threading.Tasks.Task`1[Nest.MultiGetResponse]'.

Both have the same stack trace.

Stack trace:

at Elasticsearch.Net.Transport`1.<RequestAsync>d__15`1.MoveNext() in c:\Projects\elastic\net-5\src\Elasticsearch.Net\Transport\Transport.cs:line 151
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task)
   at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()
   at Nest.ElasticClient.<Nest-IHighLevelToLowLevelDispatcher-DispatchAsync>d__261`4.MoveNext() in c:\Projects\elastic\net-5\src\Nest\ElasticClient.cs:line 76
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()

If I log into datadog and look at the trace with the error I see the above exceptions and this is all that is recorded in the stack in datadog.

 at CallElasticsearchAsync(Object , Object , CancellationToken )
   at Datadog.Trace.ClrProfiler.Integrations.ElasticsearchNetIntegration.<CallElasticsearchAsyncInternal>d__15`1.MoveNext()

Alpine build of Datadog.Trace.ClrProfiler.Native

Is your feature request related to a problem? Please describe.
Datadog.Trace.ClrProfiler.Native is built explicitly for glibc-based Linux x64 and does not work in Alpine (musl-based) containers.

Describe the solution you'd like
An alpine build of Datadog.Trace.ClrProfiler.Native so it will work in .Net Core alpine containers.

COR_ENABLE_PROFILING not mentioned in documentation

Describe the bug

The documentation doesn't clearly mention that environment variables COR_ENABLE_PROFILING and COR_PROFILER need to be set in order to enable auto-tracing for non-IIS applications.

Perhaps the MSI should set these as well as DD_INTEGRATIONS.

Exit 80131506 after installing Datadog csharp

Describe the bug
After installing the Datadog C# tracer, some of my applications won't start up, exiting with code 80131506.

To Reproduce
Steps to reproduce the behavior:

  1. Install DataDog tracer from https://github.com/DataDog/dd-trace-csharp/releases
  2. Restart IIS using steps on https://docs.datadoghq.com/tracing/setup/dotnet/
  3. Load application
  4. App Pool crashes before first page is loaded

Expected behavior
The application works the same after installing Datadog tracer.

Screenshots
None

Runtime environment (please complete the following information):

  • Instrumentation mode: automatic with msi installer
  • Tracer version: 0.5.0
  • OS: Windows Server 2016 Data Center
  • CLR: .NET 4.0.30319 (reported in IIS), .NET 4.7 installed on machine

Additional context
Event Viewer Logs

Application Error:
Faulting application name: w3wp.exe, version: 10.0.14393.0, time stamp: 0x57899b8a
Faulting module name: clr.dll, version: 4.7.3190.0, time stamp: 0x5b693b48
Exception code: 0xc0000005
Fault offset: 0x000000000056604b
Faulting process id: 0x4d8
Faulting application start time: 0x01d47a9f85863416
Faulting application path: c:\windows\system32\inetsrv\w3wp.exe
Faulting module path: C:\Windows\Microsoft.NET\Framework64\v4.0.30319\clr.dll
Report Id: 769c85ef-ad3c-45f2-9441-e4d4ce2a7381
Faulting package full name: 
Faulting package-relative application ID: 

.NET Runtime:
Application: w3wp.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an internal error in the .NET Runtime at IP 00007FFD0A77604B (00007FFD0A210000) with exit code 80131506.

Attempted to change GC settings, but it didn't help

  • tried <gcServer enabled="true" />
  • tried <gcConcurrent enabled="false" />

Oddly, this doesn't happen for all of my .NET applications; some work fine.

  • both applications are compiled with .NET 4.5.2

Debugging with sysinternals/procmon showed attempts to write to ProgramData\Datadog .NET Tracer\logs, but ProgramData\Datadog .NET Tracer didn't exist...?

  • After manually creating ProgramData\Datadog .NET Tracer\logs, a dotnet-profiler.log was created, but only had [info] level messages in it

Going to try attaching with Windbg, but I'm a bit out of my depth on this.

Can't get SpanContext from Span (for use in childOf of Tracer.{StartActive,StartSpan})

Are you requesting automatic instrumentation for a framework or library?
No

Is your feature request related to a problem?
I'm using manual instrumentation and I cannot get a SpanContext from a Span (using v0.6.0-beta and dotnet --version = 2.2.101).

Describe the solution you'd like
It seems Span does have an underlying SpanContext, but it's not exposed (internal visibility). Exposing it would let me create child traces, which I currently can't.

Describe alternatives you've considered
Either changing the signatures of Tracer.StartActive and Tracer.StartSpan to take a Span instead of a SpanContext, or adding new overloads.

Additional context
N/A

Wrong span duration using 0.8.0-beta

Describe the bug
Ever since we switched to Datadog.Trace version 0.8.0-beta, we've been getting wrong tracing results in .NET Core apps in Docker containers. The reported durations are much higher than the actual ones. I debugged it a little and found that TraceContext was changed to use _utcStart.AddTicks(_stopwatch.ElapsedTicks) instead of _start.Add(_sw.Elapsed) in version 0.7.1-beta.

I wrote a simple app to reproduce the issue. This is what it does:

static async Task Main(string[] args)
{
    var utcStart = DateTimeOffset.UtcNow;
    var stopwatch = Stopwatch.StartNew();
    await Task.Delay(100);
    var diff = utcStart.AddTicks(stopwatch.ElapsedTicks) - utcStart;
    Console.WriteLine("{0}", stopwatch.ElapsedTicks);
    Console.WriteLine("diff using ticks: {0}ms", diff.TotalMilliseconds);
    Console.WriteLine("diff using offset: {0}ms", (DateTimeOffset.UtcNow - utcStart).TotalMilliseconds);
}

Here are results of running this app on Linux:

dotnet run
101802724
diff using ticks: 10179.6821ms
diff using offset: 114.2126ms

Windows

dotnet run
293109
diff using ticks: 29.309ms
diff using offset: 110.9422ms

The Windows results are also kind of weird.
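The mismatch follows from units: Stopwatch.ElapsedTicks counts in units of Stopwatch.Frequency (ticks per second, platform-dependent), while DateTimeOffset.AddTicks expects 100 ns ticks, so the raw tick count must be converted first. A minimal sketch of a correct conversion:

```csharp
using System;
using System.Diagnostics;

// Stopwatch ticks are in units of Stopwatch.Frequency, which varies by platform;
// DateTime/TimeSpan ticks are always 100 ns. Convert before calling AddTicks.
var utcStart = DateTimeOffset.UtcNow;
var stopwatch = Stopwatch.StartNew();
// ... work ...
long timeSpanTicks = (long)(stopwatch.ElapsedTicks
                            * (TimeSpan.TicksPerSecond / (double)Stopwatch.Frequency));
var end = utcStart.AddTicks(timeSpanTicks);

// Equivalently, Stopwatch.Elapsed performs this conversion internally:
var endViaElapsed = utcStart + stopwatch.Elapsed;
```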

Runtime environment (please complete the following information):

  • Instrumentation mode: manual
  • Tracer version: 0.8.0-beta
  • OS: Linux, docker, kubernetes
  • CLR: .NET Core 2.1

AutoMapper throws EntryPointNotFoundException when profiler is enabled on Linux

Describe the bug

Unhandled Exception: System.EntryPointNotFoundException: Entry point was not found.
   at AutoMapper.IMemberMap.get_UseDestinationValue()
   at AutoMapper.Mappers.Internal.CollectionMapperExpressionFactory.<MapCollectionExpression>g__UseDestinationValue|1_0(<>c__DisplayClass1_0& )
   at AutoMapper.Mappers.Internal.CollectionMapperExpressionFactory.MapCollectionExpression(IConfigurationProvider configurationProvider, ProfileMap profileMap, IMemberMap memberMap, Expression sourceExpression, Expression destExpression, Expression contextExpression, Type ifInterfaceType, MapItem mapItem)
   at AutoMapper.Mappers.CollectionMapper.MapExpression(IConfigurationProvider configurationProvider, ProfileMap profileMap, IMemberMap memberMap, Expression sourceExpression, Expression destExpression, Expression contextExpression)
   at AutoMapper.Execution.ExpressionBuilder.MapExpression(IConfigurationProvider configurationProvider, ProfileMap profileMap, TypePair typePair, Expression sourceParameter, Expression contextParameter, IMemberMap propertyMap, Expression destinationParameter)
   at AutoMapper.Execution.TypeMapPlanBuilder.CreatePropertyMapFunc(IMemberMap memberMap, Expression destination, MemberInfo destinationMember)
   at AutoMapper.Execution.TypeMapPlanBuilder.TryPropertyMap(PropertyMap propertyMap)
   at AutoMapper.Execution.TypeMapPlanBuilder.CreateAssignmentFunc(Expression destinationFunc)
   at AutoMapper.Execution.TypeMapPlanBuilder.CreateMapperLambda(HashSet`1 typeMapsPath)
   at AutoMapper.TypeMap.Seal(IConfigurationProvider configurationProvider)
   at AutoMapper.MapperConfiguration.Seal()
   at AutoMapper.MapperConfiguration..ctor(MapperConfigurationExpression configurationExpression)
   at AutoMapper.Mapper.Initialize(Action`1 config)
   at OurService.Api.Startup.ConfigureServices(IServiceCollection services) in /data/workspace/our_service/src/OurService.Api/Startup.cs:line 198
--- End of stack trace from previous location where exception was thrown ---
   at Microsoft.AspNetCore.Hosting.ConventionBasedStartup.ConfigureServices(IServiceCollection services)
   at Microsoft.AspNetCore.Hosting.Internal.WebHost.EnsureApplicationServices()
   at Microsoft.AspNetCore.Hosting.Internal.WebHost.Initialize()
   at Microsoft.AspNetCore.Hosting.WebHostBuilder.Build()
   at OurService.Api.Program.Main(String[] args) in /data/workspace/our_service/src/OurService.Api/Program.cs:line 14

To Reproduce

We introduced automated instrumentation in the service like described here: https://docs.datadoghq.com/tracing/languages/dotnet/?tab=netcoreonlinux

During our e2e tests and on staging, Datadog is disabled using the environment variable DD_TRACE_ENABLED=false. On production it was enabled simply by switching the flag to DD_TRACE_ENABLED=true; this caused the error above.

The line in question was validating whether the AutoMapper configuration is correct: Mapper.Configuration.AssertConfigurationIsValid();

The implementation of that function can be found here: https://github.com/AutoMapper/AutoMapper/blob/d9e9e7cbdfc46e4b762a9815f025ef20a52597fc/src/AutoMapper/Mapper.cs#L179

Expected behavior
It should work with AutoMapper.

Runtime environment (please complete the following information):

  • Instrumentation mode: automatic on .net core linux
  • Tracer version: 0.8.2
  • OS: linux in docker container
  • CLR: .NET Core 2.2

Additional context

  • Automapper version: 8.0.0

Allow excluding of values in Redis commands

Are you requesting automatic instrumentation for a framework or library? Please describe.
No, this is regarding an existing instrumentation.

Is your feature request related to a problem? Please describe.
We're trying to use the built-in instrumentation for ServiceStack.Redis, but it is tagging spans with the entire command (including the data we cache) and we'd like to exclude that data, as it'll be confidential in most cases. Also, it makes the traces pretty large as we're caching multi-MB values.

Describe the solution you'd like
It would be great to have the ability to exclude any sent values for SET commands, just keeping the command itself (e.g. SET the:cache:key <excluded-value>).
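A sketch of the requested behavior (this is not the integration's actual code, just an illustration of the desired redaction): keep the SET command and key, but replace the value payload with a placeholder before the command text is attached to the span as a tag.

```csharp
using System;
using System.Text.RegularExpressions;

static class RedisCommandRedactor
{
    // Replace everything after "SET <key> " with a placeholder. Singleline lets
    // the redaction cover multi-line (e.g. multi-MB JSON) values as well.
    public static string RedactSetValue(string command) =>
        Regex.Replace(
            command,
            @"^(SET\s+\S+\s+).*$",
            "$1<excluded-value>",
            RegexOptions.IgnoreCase | RegexOptions.Singleline);
}

// RedisCommandRedactor.RedactSetValue("SET the:cache:key {huge json blob}")
// returns "SET the:cache:key <excluded-value>"
```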

WCF client integration

Are you requesting automatic instrumentation for a framework or library? Please describe.

  • Framework or library name : System.ServiceModel
  • Library type: SOAP client
  • Library version: 4.5.3

Is your feature request related to a problem? Please describe.
We noticed that tracing for WCF is enabled for the full framework, but disabled for .NET Standard 2.0. It would be great if the support could be extended to .NET Standard, or at least .NET Core.
Note that only WCF clients are supported through .NET Standard, not services.

Describe the solution you'd like
We would like to be able to trace outgoing WCF client connections.

Describe alternatives you've considered
We're currently looking at manually tracing those events, but if the existing support could be extended, it would make our life that much easier.

Additional context

DynamicMethodBuilder.CreateMethodCallDelegate BUG

When I use DynamicMethodBuilder.CreateMethodCallDelegate and delegateType is an Action, it throws a "Value cannot be null." exception at dynamicMethod.CastClass(returnType);.

Looking at line 78, the code

returnType = null; // before modification

should be

returnType = typeof(void);

I tried this modification and it runs well.
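The proposed fix is consistent with reflection's behavior: for an Action delegate, the Invoke method's return type is typeof(void), never null, so typeof(void) is the right sentinel for "no return value". A minimal illustration:

```csharp
using System;

// An Action delegate's Invoke has ReturnType == typeof(void); reflection never
// reports a null return type, which is why null reaching CastClass throws.
var invoke = typeof(Action).GetMethod("Invoke");
Console.WriteLine(invoke.ReturnType == typeof(void)); // prints True
```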

childOf field of StartSpan doesn't work

Describe the bug
Setting childOf spancontext of Tracer.Instance.StartSpan(name, childSpan..) does not work. I have passed the correct "x-datadog-trace-id" and "x-datadog-parent-id" values from another service and I believe I've extracted and constructed the span context correctly to be passed in the childOf field.

To Reproduce
Steps to reproduce the behavior:

public class Headerz: IHeaderCollection
{
    private IDictionary<string, string> dict = new Dictionary<string, string>();

    public string Get(string name)
    {
        return dict[name];
    }

    public void Set(string name, string value)
    {
        dict.Add(name, value);
    }
}

var cc = new Headerz();
var x = context.RequestHeaders;
foreach (var item in x)
{

    cc.Set(item.Key, item.Value);
}

Log.Information(cc.Get("x-datadog-trace-id"));
Log.Information(cc.Get("x-datadog-parent-id"));

System.Uri uri = new System.Uri("http://localhost:8126");
Tracer.Create(uri);
var parentSpan = Datadog.Trace.Propagators.HeaderCollectionPropagator.Extract(cc);

// Sanity check
Console.WriteLine(parentSpan.SpanId);

var span = Datadog.Trace.Tracer.Instance.StartSpan("jobname", parentSpan, "anothername");
... Do work
span.Finish();
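One thing worth double-checking in the repro (an assumption about the propagator's behavior: it may probe headers that aren't present, and dict[name] throws KeyNotFoundException for missing keys, while dict.Add throws on duplicates). A null-tolerant header collection would look like:

```csharp
using System.Collections.Generic;

// A header collection whose Get tolerates missing keys (returns null) and whose
// Set tolerates overwrites, unlike the indexer/Add combination used above.
class Headers
{
    private readonly IDictionary<string, string> dict =
        new Dictionary<string, string>();

    public string Get(string name) =>
        dict.TryGetValue(name, out var value) ? value : null;

    public void Set(string name, string value) => dict[name] = value;
}
```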

Expected behavior
Expected the span to show up in Datadog. I can see the span if I do not set a childOf span context.

Runtime environment (please complete the following information):

  • Instrumentation mode: NuGet package

  • Tracer version:
    (version shown in an attached screenshot)

  • OS: macOS HS

  • CLR: Not sure.

Reliably deal with Agent's payload size limitation

Describe the bug
Log message
An error occurred while sending traces to the agent at http://localhost:8126/v0.3/traces

ERROR MESSAGE
Response status code does not indicate success: 413 (Request Entity Too Large).

ERROR STACK
System.Net.Http.HttpRequestException: Response status code does not indicate success: 413 (Request Entity Too Large).
at System.Net.Http.HttpResponseMessage.EnsureSuccessStatusCode()
at Datadog.Trace.Agent.Api.<SendAsync>d__8`1.MoveNext()

To Reproduce
Steps to reproduce the behavior:

  1. Install the automatic instrumentation
  2. Wait 5min
  3. Errors start appearing

Expected behavior
No errors 😄

Runtime environment (please complete the following information):

  • Instrumentation mode: automatic
  • Tracer version: both 0.7.0 and 0.7.1 (installed on top of 0.7.0 and I don't know how to check which version is running)
  • OS: Windows Server 2016 Datacenter
  • CLR: .NET Framework 4.6.1

Additional context
Not all messages are failing, most are working correctly.
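One plausible shape of a fix (an assumption about the approach, not the tracer's actual implementation) is to split each batch of traces into chunks whose estimated serialized size stays under the agent's payload limit, sending each chunk as its own request instead of one oversized payload:

```csharp
using System;
using System.Collections.Generic;

static class PayloadChunker
{
    // Greedily pack items into chunks whose estimated total size stays under
    // maxBytes. An item larger than maxBytes still gets its own chunk; a real
    // implementation would have to drop or truncate such traces.
    public static IEnumerable<List<T>> ChunkBySize<T>(
        IEnumerable<T> items, Func<T, int> estimatedSize, int maxBytes)
    {
        var chunk = new List<T>();
        var chunkBytes = 0;
        foreach (var item in items)
        {
            var size = estimatedSize(item);
            if (chunk.Count > 0 && chunkBytes + size > maxBytes)
            {
                yield return chunk;
                chunk = new List<T>();
                chunkBytes = 0;
            }
            chunk.Add(item);
            chunkBytes += size;
        }
        if (chunk.Count > 0)
            yield return chunk;
    }
}
```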

System.NullReferenceException on trace 1.3

Describe the bug
After updating from version 1.0 to version 1.3 of the tracer (both the MSI and the NuGet packages were updated, because of a StackOverflowException on Azure Storage), the AspNetCoreMvc2Integration is giving errors on my MVC calls.

System.NullReferenceException: Object reference not set to an instance of an object.
at System.Object.GetType()
at Datadog.Trace.ClrProfiler.Emit.ObjectExtensions.TryGetPropertyValue[TResult](Object source, String propertyName, TResult& value)
at Datadog.Trace.ClrProfiler.Integrations.AspNetCoreMvc2Integration.Rethrow(Object context, Int32 opCode)

System.NullReferenceException: Object reference not set to an instance of an object.
at System.Object.GetType()
at Datadog.Trace.ClrProfiler.Integrations.Interception.ParamsToTypes(Object[] objectsToCheck)
at Datadog.Trace.ClrProfiler.Integrations.AspNetCoreMvc2Integration.Rethrow(Object context, Int32 opCode)
at Microsoft.AspNetCore.Mvc.Internal.ResourceInvoker.ResultNext[TFilter,TFilterAsync](State& next, Scope& scope, Object& state, Boolean& isCompleted)
at Microsoft.AspNetCore.Mvc.Internal.ResourceInvoker.InvokeAlwaysRunResultFilters()
at Microsoft.AspNetCore.Mvc.Internal.ResourceInvoker.InvokeFilterPipelineAsync()
at Microsoft.AspNetCore.Mvc.Internal.ResourceInvoker.InvokeAsync()
at Microsoft.AspNetCore.Builder.RouterMiddleware.Invoke(HttpContext httpContext)
at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware.Invoke(HttpContext context)
at Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware.Invoke(HttpContext context)
at Swashbuckle.AspNetCore.SwaggerUI.SwaggerUIMiddleware.Invoke(HttpContext httpContext)
at Swashbuckle.AspNetCore.Swagger.SwaggerMiddleware.Invoke(HttpContext httpContext, ISwaggerProvider swaggerProvider)
at Microsoft.AspNetCore.Builder.Extensions.MapWhenMiddleware.Invoke(HttpContext context)

To Reproduce
Any call on our test environment is producing this.

Expected behavior
No errors and a working application.

Screenshots
(screenshot)

  • Instrumentation mode: automatic with msi installer
  • Tracer version: 1.3.0
  • OS: Windows Server 2012 R2
  • CLR: .NET Core 2.2

Additional context
I'm running my asp.net core application as a Windows Service.

Automatic instrumentation fails after setup

Describe the bug
After setting up .NET tracing per the Datadog documentation, the web application starts logging errors non-stop. Error example:
[2018-09-18 14:25:41] [Error] An error occured while sending traces to the agent at http://localhost:8126/v0.3/traces System.Net.Http.HttpRequestException: Response status code does not indicate success: 400 (Bad Request). at System.Net.Http.HttpResponseMessage.EnsureSuccessStatusCode() at Datadog.Trace.Agent.Api.<SendAsync>d__8`1.MoveNext()

And at the same time there is an error on trace-agent side:
2018-09-19 08:11:26 ERROR (receiver.go:386) - cannot decode v0.3 traces payload: msgp: attempted to decode type "nil" with method for "str"

Attempts to turn off automatic instrumentation do not work (there is no clear documentation on how to turn it off):

  1. Set apm_config:enabled: false
  2. Turn off trace agent
  3. Do iis_reset
  4. Redeploy the application
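For completeness, the CLR-level switch that stops any profiler (including this tracer) from being loaded into a process is the COR_ENABLE_PROFILING variable; this is independent of agent-side settings like apm_config. The snippet below is shown with POSIX syntax as an assumption-laden sketch; for IIS these are machine- or service-level environment variables, followed by an iisreset:

```shell
# Clearing/zeroing COR_ENABLE_PROFILING prevents the CLR from loading a profiler.
export COR_ENABLE_PROFILING=0
echo "COR_ENABLE_PROFILING=$COR_ENABLE_PROFILING"
```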

To Reproduce
Steps to reproduce the behavior:

  1. Follow the documentation available at official website.
  2. Find yourself in quite a pickle.

Expected behavior
Application does not throw exceptions after being set up.

Runtime environment (please complete the following information):

  • Instrumentation mode: automatic
  • Tracer version: 0.3.0
  • OS: Windows Server 2012 R2
  • CLR: .NET Framework 4.5.1

Additional context
Our application includes the MsgPack library. Maybe it conflicts with the one used by the injected CLR code Datadog is using.

Exit 80131506 after installing Datadog csharp

Describe the bug
After installing the Datadog C# tracer, some of my applications won't start up, exiting with code 80131506.

To Reproduce
Steps to reproduce the behavior:

  1. Install DataDog tracer from https://github.com/DataDog/dd-trace-csharp/releases
  2. Restart IIS using steps on https://docs.datadoghq.com/tracing/setup/dotnet/
  3. Load application
  4. App Pool crashes before first page is loaded

Expected behavior
The application works the same after installing Datadog tracer.

Screenshots
None

Runtime environment (please complete the following information):

  • Instrumentation mode: automatic with msi installer
  • Tracer version: 0.5.0
  • OS: Windows Server 2016 Data Center
  • CLR: .NET 4.0.30319 (reported in IIS), .NET 4.7 installed on machine

Additional context
Event Viewer Logs

Application Error:
Faulting application name: w3wp.exe, version: 10.0.14393.0, time stamp: 0x57899b8a
Faulting module name: clr.dll, version: 4.7.3190.0, time stamp: 0x5b693b48
Exception code: 0xc0000005
Fault offset: 0x000000000056604b
Faulting process id: 0x4d8
Faulting application start time: 0x01d47a9f85863416
Faulting application path: c:\windows\system32\inetsrv\w3wp.exe
Faulting module path: C:\Windows\Microsoft.NET\Framework64\v4.0.30319\clr.dll
Report Id: 769c85ef-ad3c-45f2-9441-e4d4ce2a7381
Faulting package full name: 
Faulting package-relative application ID: 

.NET Runtime:
Application: w3wp.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an internal error in the .NET Runtime at IP 00007FFD0A77604B (00007FFD0A210000) with exit code 80131506.

Attempted to change GC settings, but it didn't help

  • tried <gcServer enabled="true" />
  • tried <gcConcurrent enabled="false" />

Oddly, this doesn't happen for all of my .NET applications; some work fine.

  • both applications are compiled with .NET 4.5.2

Checked ProgramData\Datadog .NET Tracer\logs\dotnet-profiler.log, but it only had [info] level messages in it

I was able to attach a remote debugger to the failing process and received the following stack trace:

"Access Violation"
clr.dll!ProfToEEInterfaceImpl::GetModuleFlags(class Module *)
clr.dll!ProfToEEInterfaceImpl::GetModuleInfo2(unsigned __int64,unsigned char const * *,unsigned long,unsigned long *,unsigned short * const,unsigned __int64 *,unsigned long *)
Datadog.Trace.ClrProfiler.Native.dll!00007ffcd9ec1bee()

I was able to narrow down the culprit to applications that had ClearScript as a dependency. After removing ClearScript, those applications ran without error with Datadog .NET installed.

While I was able to do a bunch of work on my application to remove ClearScript, it looks like Datadog is mishandling dependencies in certain scenarios. It's calling clr.dll/GetModuleInfo2() with an invalid module id.

Overriding HttpClientHandler.SendAsync() causes StackOverflowException

Describe the bug
When using dd-trace-dotnet in a Linux Docker container with a .NET Core application that uses the WindowsAzure.Storage NuGet package (9.3.3 - latest), a StackOverflowException is thrown whenever the application attempts to use any async methods from the WindowsAzure.Storage package.

To Reproduce
Steps to reproduce the behavior:
ConsoleApp1.csproj

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.2</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Datadog.Trace.ClrProfiler.Managed" Version="1.1.0" />
    <PackageReference Include="WindowsAzure.Storage" Version="9.3.3" />
  </ItemGroup>

</Project>

Program.cs

using Microsoft.WindowsAzure.Storage;
using System;
using System.Threading.Tasks;

namespace ConsoleApp1
{
    class Program
    {
        private static string AzureStorageConnectionString = "My-Azure-Storage-ConnectionString";
        private static string ContainerName = "some-container-name";

        static async Task Main(string[] args)
        {
            var storageAccount = CloudStorageAccount.Parse(AzureStorageConnectionString);
            var cloudBlobClient = storageAccount.CreateCloudBlobClient();
            var container = cloudBlobClient.GetContainerReference(ContainerName);

            Console.WriteLine($"Checking if Container ({ContainerName}) Exists...");
            var containerExists = await container.ExistsAsync();
            Console.WriteLine($"Container ({ContainerName}) Exists: {containerExists}");

            Console.WriteLine("Shutting Down.");
        }
    }
}

Dockerfile

FROM mcr.microsoft.com/dotnet/core/sdk:2.2 as builder
WORKDIR /src
COPY . .
RUN dotnet restore
RUN dotnet build -c Release --no-restore
RUN dotnet publish -c Release --no-build -o /app

FROM mcr.microsoft.com/dotnet/core/aspnet:2.2 AS release
WORKDIR /app
ENV ASPNETCORE_URLS=http://*:80

#Setup Datadog APM
RUN mkdir -p /tmp
RUN curl -L https://github.com/DataDog/dd-trace-dotnet/releases/download/v1.1.0/datadog-dotnet-apm_1.1.0_amd64.deb --output /tmp/datadog_apm.deb
RUN dpkg -i /tmp/datadog_apm.deb
ENV CORECLR_ENABLE_PROFILING=1
ENV CORECLR_PROFILER={846F5F1C-F9AE-4B07-969E-05C26BC060D8}
ENV CORECLR_PROFILER_PATH=/opt/datadog/Datadog.Trace.ClrProfiler.Native.so
ENV DD_INTEGRATIONS=/opt/datadog/integrations.json
 
COPY --from=builder /app .

CMD ["dotnet", "ConsoleApp1.dll"]

Expected behavior
When running in Visual Studio, the application should complete successfully, and display in the Console, whether or not the container exists in the Storage Account. This works correctly.

When running in Docker, having built as above with the dd-trace-dotnet...
docker build --tag dd-trace-agent-test .
docker run --rm dd-trace-agent-test
I expect the application to also run through correctly, displaying in the console whether or not the container exists in the Storage Account. This is where the application throws a StackOverflowException.

Screenshots
Working Visual Studio Debug Console...
dd-trace-agent_VS

Failed when running in Docker with dd-trace-dotnet
dd-trace-agent_Docker

Additional context
When running in Docker, but not including the dd-trace-dotnet, by commenting out the dd-trace-dotnet setup in the Dockerfile but still keeping the nuget reference in the application, the application does work correctly.

This isn't just an issue with the await container.ExistsAsync() method; it also happens with await container.CreateIfNotExistsAsync(), await container.CreateAsync(), etc.
It also happens if, assuming the container is already created, you try to upload a blob, etc.

var blob = container.GetBlockBlobReference("my-blob");
await blob.UploadTextAsync("Testing");

Extract span context

How can I extract a spancontext from the context?

ie in Golang

opentracing.GlobalTracer().Extract(
			opentracing.HTTPHeaders,
			opentracing.HTTPHeadersCarrier(header))

Thanks.
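With the C# OpenTracing API, the equivalent looks roughly like this (a sketch, assuming the Datadog tracer from Datadog.Trace.OpenTracing has been registered as the global tracer; TextMapExtractAdapter and BuiltinFormats come from the OpenTracing package):

```csharp
using System.Collections.Generic;
using OpenTracing;
using OpenTracing.Propagation;
using OpenTracing.Util;

// Extract a parent span context from incoming HTTP headers, then start a child.
var headers = new Dictionary<string, string>
{
    // populated from the incoming request
    ["x-datadog-trace-id"] = "1234567890",
    ["x-datadog-parent-id"] = "987654321",
};

ISpanContext parentContext = GlobalTracer.Instance.Extract(
    BuiltinFormats.HttpHeaders,
    new TextMapExtractAdapter(headers));

using (IScope scope = GlobalTracer.Instance
    .BuildSpan("my.operation")
    .AsChildOf(parentContext)
    .StartActive(finishSpanOnDispose: true))
{
    // ... do work ...
}
```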

Make resource names consistent across web framework integrations

Describe the bug
Traces from automatic instrumentation don't look like the traces from the Java library; see screenshots below.

Expected behavior
They should be the same.

Screenshots
java:
(screenshot)

.NetCore:
(screenshot)

Runtime environment (please complete the following information):

  • Instrumentation mode: automatic with deb packages
  • Tracer version: 0.5.1-beta
  • OS: Linux / Docker image ( microsoft/dotnet:2.1-runtime-deps-stretch-slim )
  • CLR: .Net Core 2.1

Allow setting sampling priority for spans

Are you requesting automatic instrumentation for a framework or library? Please describe.
I'd like both Datadog.Trace and Datadog.Trace.OpenTracing packages to support this feature.

Is your feature request related to a problem? Please describe.
I'd like to be able to pass sampling priority for some spans like described here:

https://docs.datadoghq.com/tracing/getting_further/trace_sampling_and_storage/#priority-sampling-for-distributed-tracing

Describe the solution you'd like
I think a span.SetTag(...) overload that parses a sampling.priority key with an int value would be enough.
Also for distributed tracing it would be necessary to update Propagators in Datadog.Trace to pass it through x-datadog-sampling-priority header and HttpHeadersCodec in Datadog.Trace.OpenTracing package.

100% error rate when response content is null

One of my API calls gets a 100% error rate in Datadog; the stack trace shown on the trace in Datadog is below. The only thing different about this call is that the content is null (it's a mock endpoint for the moment). The method returns an HttpResponseMessage whose content is null, with a status code of 204 No Content.

at System.Web.Http.Controllers.ApiControllerActionSelector.ActionSelectorCacheItem.SelectAction(HttpControllerContext controllerContext)
   at System.Web.Http.ApiController.ExecuteAsync(HttpControllerContext controllerContext, CancellationToken cancellationToken)
   at Datadog.Trace.ClrProfiler.Integrations.AspNetWebApi2Integration.<ExecuteAsyncInternal>d__2.MoveNext()

w3wp.exe crashes with ExecutionEngineException (0x80131506)

Describe the bug
We're experiencing an issue where our application pool crashes, primarily
when we make new deploys. We observe the following events in the Event
Viewer:

Application: w3wp.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an internal error in the
.NET Runtime at IP 00007FFCDC1D9FB5 (00007FFCDC060000) with exit code
80131506.

and

Faulting application name: w3wp.exe, version: 10.0.14393.0, time stamp:
0x57899b8a
Faulting module name: clr.dll, version: 4.7.3362.0, time stamp: 0x5c2fcfd4
Exception code: 0xc0000005
Fault offset: 0x0000000000179fb5
Faulting process id: 0x1df0
Faulting application start time: 0x01d50fe325b81c08
Faulting application path: c:\windows\system32\inetsrv\w3wp.exe
Faulting module path:
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\clr.dll
Report Id: 7710cbc6-6198-4108-b674-473fae9f3e02
Faulting package full name:
Faulting package-relative application ID:

To Reproduce
We don't have a consistent repro, but it happens when we deploy a new set of DLLs into an already running application pool with our deployment tool (Octopus Deploy).

Expected behavior
We expect the W3WP to continue running, which allows any existing keepalives to continue living and be served by the updated application.

Runtime environment:

  • Instrumentation mode: Automatic with msi installer and nuget package
  • Tracer version: started failing with NuGet package 1.2.0 (previously using 1.0.0) from what we can tell, using either of the native MSIs 1.1.0 and 1.2.0.
  • OS: Windows Server 2016 R2
  • CLR: .NET Framework 4.6.2

From what we observed: MSI 1.1.0, Nuget 1.0.0 doesn't crash. MSI 1.1.0, Nuget 1.2.0 crashes, as well does MSI 1.2.0 and Nuget 1.2.0.

Additional context
We've gathered two memory dumps, one in each crashing setup. If it crashes, it has crashed while JITting MySql.Web.Common.SchemaManager.GetSchemaVersion(System.String)

The stack trace below is quite noisy and potentially confusing because it is "scraped".
A line to highlight is 0000008e09f3aae0 00007ffcdc1d9fb5 clr!AssemblySpec::InitializeSpec+0x42 ====> Exception cxr@0000008e09f39d00 which triggers the application's termination.

Are any symbols available for the native profiling assembly so that the symbols can be resolved better?

OS Thread Id: 0x1bdc (0)
Current frame: ntdll!NtTerminateProcess+0x14
Child-SP         RetAddr          Caller, Callee
0000008e09f38810 00007ffce8691eaf ntdll!RtlExitUserProcess+0xbf, calling ntdll!NtTerminateProcess
0000008e09f38840 00007ffce6bbcfea kernel32!ExitProcessImplementation+0xa, calling ntdll!RtlExitUserProcess
0000008e09f38858 000001e57a704070 000001e57a704070, calling 000001e57a70c06c
0000008e09f38870 00007ffcdcaced68 mscoreei!RuntimeDesc::ShutdownAllActiveRuntimes+0x287, calling kernel32!ExitProcessImplementation
0000008e09f388f0 00007ffcdc0656f9 clr!EEHeapFreeInProcessHeap+0x45, calling kernel32!HeapFreeStub
0000008e09f389f0 00007ffcdcac202f mscoreei!ComUtil::IUnknownCommon<ICLRRuntimeHostInternal,mpl::null_type,mpl::null_type,mpl::null_type,mpl::null_type,mpl::null_type,mpl::null_type,mpl::null_type,mpl::null_type,mpl::null_type>::QueryInterface+0x73
0000008e09f38a20 00007ffcdcace1bf mscoreei!CLRRuntimeInfoImpl::GetInterfaceInternal+0x37d, calling mscoreei!_security_check_cookie
0000008e09f38ae0 00007ffcdcace0ae mscoreei!CLRRuntimeInfoImpl::GetInterface+0x11f, calling mscoreei!CLRRuntimeInfoImpl::GetInterfaceInternal
0000008e09f38b38 000001e57a704070 000001e57a704070, calling 000001e57a70c06c
0000008e09f38b60 00007ffcdcaceee4 mscoreei!CLRRuntimeHostInternalImpl::ShutdownAllRuntimesThenExit+0x14, calling mscoreei!RuntimeDesc::ShutdownAllActiveRuntimes
0000008e09f38b68 000001e57a704070 000001e57a704070, calling 000001e57a70c06c
0000008e09f38b90 00007ffcdc22ab85 clr!EEPolicy::ExitProcessViaShim+0x95
0000008e09f38be0 00007ffcdc22aae5 clr!SafeExitProcess+0x9d, calling clr!EEPolicy::ExitProcessViaShim
0000008e09f38e00 00007ffcdc5070e6 clr!EEPolicy::GetActionOnFailureNoHostNotification+0x2e, calling clr!EEPolicy::GetFinalAction
0000008e09f38e30 00007ffcdc507076 clr!EEPolicy::GetActionOnFailure+0x26, calling clr!EEPolicy::GetActionOnFailureNoHostNotification
0000008e09f38e50 00007ffcdc3ac20f clr!CLRVectoredExceptionHandlerPhase3+0x2fa34b, calling clr!GetCurrentIP
0000008e09f38e60 00007ffcdc507383 clr!EEPolicy::HandleFatalError+0x15c, calling clr!SafeExitProcess
0000008e09f38e70 00007ffcdc3ac20f clr!CLRVectoredExceptionHandlerPhase3+0x2fa34b, calling clr!GetCurrentIP
0000008e09f38f10 00007ffcdc22ac63 clr!AdjustContextForWriteBarrier+0xb7, calling clr!_security_check_cookie
0000008e09f39160 00007ffce4ba3eef KERNELBASE!WaitForSingleObjectEx+0x8f, calling ntdll!NtWaitForSingleObject
0000008e09f39200 00007ffce6bb33f3 kernel32!SortCompareString+0x1f3, calling kernel32!_security_check_cookie
0000008e09f39240 00007ffce6bb33f3 kernel32!SortCompareString+0x1f3, calling kernel32!_security_check_cookie
0000008e09f39360 00007ffce4b8658f KERNELBASE!StrCmpW+0x9f, calling ntdll!LdrpDispatchUserCallTarget
0000008e09f393c0 00007ffcdc1d5492 clr!FusionCompareStringN+0xe4, calling shlwapi!StrCmpWStub
0000008e09f393e0 00007ffcdc065433 clr!ClrFlsGetValue+0x23
0000008e09f39440 00007ffcdc3ac222 clr!CLRVectoredExceptionHandlerPhase3+0x2fa35e, calling clr!EEPolicy::HandleFatalError
0000008e09f39470 00007ffcdc0b4be4 clr!CLRVectoredExceptionHandlerPhase2+0x2d, calling clr!CLRVectoredExceptionHandlerPhase3
0000008e09f394c0 00007ffcdc0b4bab clr!CLRVectoredExceptionHandler+0x94, calling clr!CLRVectoredExceptionHandlerPhase2
0000008e09f394f0 00007ffcdc0b4912 clr!SaveCurrentExceptionInfo+0x72, calling clr!ClrFlsSetValue
0000008e09f39520 00007ffcdc0b4adf clr!CLRVectoredExceptionHandlerShim+0xa3, calling clr!CLRVectoredExceptionHandler
0000008e09f39550 00007ffce86e5e90 ntdll!RtlpCallVectoredHandlers+0x104, calling ntdll!LdrpDispatchUserCallTarget
0000008e09f39560 00007ffce87cb5e0 ntdll!LdrpVectorHandlerList, calling ntdll!__PchSym_+0x18c
0000008e09f395f0 00007ffce86bfa1b ntdll!RtlDispatchException+0x6b, calling ntdll!RtlpCallVectoredHandlers
0000008e09f398e0 00007ffce86ade98 ntdll!RtlpAllocateHeapInternal+0xf8, calling ntdll!RtlpLowFragHeapAllocFromContext
0000008e09f39920 00007ffcdc1d2ab6 clr!StringCchPrintfW+0x4a, calling MSVCR120_CLR0400!vsnwprintf
0000008e09f39930 00007ffcdc1d9836 clr!StringCchCatW+0x5a, calling clr!StringCopyWorkerW
0000008e09f39970 00007ffcdc1e7444 clr!CAssemblyName::GetCustomDisplayName+0x370, calling clr!_security_check_cookie
0000008e09f39a30 00007ffcdc06568a clr!EEHeapAllocInProcessHeap+0x46, calling ntdll!RtlAllocateHeap
0000008e09f39a60 00007ffcdc1e67c6 clr!CBindingInput::Init+0x12f, calling clr!_security_check_cookie
0000008e09f39b00 00007ffce4ba3eef KERNELBASE!WaitForSingleObjectEx+0x8f, calling ntdll!NtWaitForSingleObject
0000008e09f39b20 00007ffce4ba3eef KERNELBASE!WaitForSingleObjectEx+0x8f, calling ntdll!NtWaitForSingleObject
0000008e09f39b60 00007ffce4ba3eef KERNELBASE!WaitForSingleObjectEx+0x8f, calling ntdll!NtWaitForSingleObject
0000008e09f39b70 00007ffce86ade98 ntdll!RtlpAllocateHeapInternal+0xf8, calling ntdll!RtlpLowFragHeapAllocFromContext
0000008e09f39c00 00007ffce6bb33f3 kernel32!SortCompareString+0x1f3, calling kernel32!_security_check_cookie
0000008e09f39c40 00007ffce6bb33f3 kernel32!SortCompareString+0x1f3, calling kernel32!_security_check_cookie
0000008e09f39cf0 00007ffce87296ea ntdll!KiUserExceptionDispatch+0x3a, calling ntdll!RtlDispatchException
0000008e09f3aae0 00007ffcdc1d9fb5 clr!AssemblySpec::InitializeSpec+0x42 ====> Exception cxr@0000008e09f39d00
0000008e09f39d00 00007ffcdc0656f9 clr!EEHeapFreeInProcessHeap+0x45, calling kernel32!HeapFreeStub
0000008e09f39fc0 00007ffcdc1d9a55 clr!Wrapper<IAssemblyLocation * __ptr64,&DoNothing<IAssemblyLocation * __ptr64>,&DoTheRelease<IAssemblyLocation>,0,&CompareDefault<IAssemblyLocation * __ptr64>,2,1>::~Wrapper<IAssemblyLocation * __ptr64,&DoNothing<IAssemblyLocation * __ptr64>,&DoTheRelease<IAssemblyLocation>,0,&CompareDefault<IAssemblyLocation * __ptr64>,2,1>+0x52
0000008e09f3a000 00007ffcdc1e56fe clr!FusionBind::RemoteLoad+0x2ce, calling clr!ETWTraceStartup::StartupTraceEvent
0000008e09f3a050 00007ffcdc1d2170 clr!SString::EqualsCaseInsensitive+0xc0, calling clr!_security_check_cookie
0000008e09f3a0d0 00007ffcdc1c5960 clr!CAssemblyName::Release+0x52
0000008e09f3a100 00007ffcdc1c543a clr!Wrapper<IAssemblyName * __ptr64,&DoNothing<IAssemblyName * __ptr64>,&DoTheRelease<IAssemblyName>,0,&CompareDefault<IAssemblyName * __ptr64>,2,1>::~Wrapper<IAssemblyName * __ptr64,&DoNothing<IAssemblyName * __ptr64>,&DoTheRelease<IAssemblyName>,0,&CompareDefault<IAssemblyName * __ptr64>,2,1>+0x52
0000008e09f3a120 00007ffcdc0abb11 clr!SetupThreadNoThrow+0x29, calling clr!GetThread
0000008e09f3a140 00007ffcdc1e5410 clr!AssemblySpec::LoadAssembly+0x244, calling clr!Wrapper<IApplicationContext * __ptr64,&DoNothing<IApplicationContext * __ptr64>,&DoTheRelease<IApplicationContext>,0,&CompareDefault<IApplicationContext * __ptr64>,2,1>::~Wrapper<IApplicationContext * __ptr64,&DoNothing<IApplicationContext * __ptr64>,&DoTheRelease<IApplicationContext>,0,&CompareDefault<IApplicationContext * __ptr64>,2,1>
0000008e09f3a190 00007ffcdc06a2b0 clr!SystemDomain::GetAppDomainAtId+0x40, calling clr!AppDomain::CanThreadEnter
0000008e09f3a380 00007ffcdc1ad75b clr!ComparePtr::CompareHelper+0x2b
0000008e09f3a3b0 00007ffcdc1ad720 clr!HashMap::LookupValue+0x1f1
0000008e09f3a420 00007ffcdc0a5def clr!SString::IsRepresentation+0x2d, calling clr!SString::ScanASCII
0000008e09f3a450 00007ffcdc0bff2d clr!SString::GetCompatibleString+0x6d, calling clr!SString::IsRepresentation
0000008e09f3a480 00007ffcdc1d533b clr!SString::CompareCaseInsensitive+0xfb, calling clr!_security_check_cookie
0000008e09f3a4c0 00007ffce86ade98 ntdll!RtlpAllocateHeapInternal+0xf8, calling ntdll!RtlpLowFragHeapAllocFromContext
0000008e09f3a510 00007ffcdc1d3511 clr!MDInternalRO::GetAssemblyProps+0xdc, calling clr!CMiniMdTemplate<CMiniMd>::getNameOfAssembly
0000008e09f3a540 00007ffcdc065226 clr!CrstBase::Enter+0x6a, calling ntdll!RtlTryEnterCriticalSection
0000008e09f3a580 00007ffcdc06518c clr!CrstBase::Leave+0x30, calling ntdll!RtlLeaveCriticalSection
0000008e09f3a590 00007ffcdc0c21cc clr!CPackedLen::SafeGetData+0xc, calling clr!CPackedLen::SafeGetLength
0000008e09f3a5b0 00007ffcdc0652e1 clr!ClrFlsIncrementValue+0x29
0000008e09f3a5c0 00007ffcdc1f7786 clr!CustomAttributeParser::GetData+0x1b, calling clr!CPackedLen::SafeGetData
0000008e09f3a5f0 00007ffcdc1f77ad clr!CustomAttributeParser::GetNonNullString+0xe, calling clr!CustomAttributeParser::GetString
0000008e09f3a610 00007ffcdc06568a clr!EEHeapAllocInProcessHeap+0x46, calling ntdll!RtlAllocateHeap
0000008e09f3a620 00007ffcdc1f65a1 clr!CustomAttributeParser::GetNonEmptyString+0x9, calling clr!CustomAttributeParser::GetNonNullString
0000008e09f3a650 00007ffcdc1f64f5 clr!ParseKnownCaNamedArgs+0x1e5, calling clr!_security_check_cookie
0000008e09f3a710 00007ffcdc1d5203 clr!stricmpUTF8+0x54, calling clr!SString::~SString
0000008e09f3a780 00007ffcdc1d5194 clr!BaseAssemblySpec::IsMscorlib+0xa0, calling clr!_security_check_cookie
0000008e09f3a810 00007ffcdc06568a clr!EEHeapAllocInProcessHeap+0x46, calling ntdll!RtlAllocateHeap
0000008e09f3a840 00007ffcdc065784 clr!operator new+0x24
0000008e09f3a850 00007ffcdc1c943a clr!id_InternalBaseIdentityValue_ApplyDeltas<Windows::Isolation::Rtl::_IDENTITY_ATTRIBUTE,unsigned char (__cdecl*)(_RTL_ALLOCATION_LIST * __ptr64,_LUNICODE_STRING * __ptr64,_LUNICODE_STRING * __ptr64,_LUNICODE_STRING * __ptr64,void * __ptr64)>+0x8f3, calling clr!BCL::CSmartArrayHolder<BCL::CDefaultObjectTraits<CInternalIdentityAttribute,BCL::Nt::CCallDisposition>,BCL::Nt::CDefaultArrayBufferTraits<unsigned __int64>,BCL::Nt::CCallDisposition>::~CSmartArrayHolder<BCL::CDefaultObjectTraits<CInternalIdentityAttribute,BCL::Nt::CCallDisposition>,BCL::Nt::CDefaultArrayBufferTraits<unsigned __int64>,BCL::Nt::CCallDisposition>
0000008e09f3a860 00007ffcdc1c5f40 clr!BaseAssemblySpec::CompareEx+0x179, calling MSVCR120_CLR0400!strcmp
0000008e09f3a870 00007ffcdc0a5876 clr!CStructArray::Grow+0x8a, calling clr!operator new
0000008e09f3a890 00007ffcdc1adc27 clr!DomainAssemblyCache::CompareBindingSpec+0x33, calling clr!BaseAssemblySpec::CompareEx
0000008e09f3a8d0 00007ffcdc070ebd clr!StgPoolReadOnly::GetString+0x3d
0000008e09f3a8f0 00007ffcdc1aef1e clr!CStructArray::AppendThrowing+0x13, calling clr!CStructArray::Grow
0000008e09f3a910 00007ffcdc1b5752 clr!CMiniMdTemplate<CMiniMdRW>::CommonGetNameOfCustomAttribute+0x1f2, calling clr!CMiniMdTemplate<CMiniMdRW>::getNameOfTypeRef
0000008e09f3a920 00007ffcdc1aef5e clr!CStructArray::Append+0x26, calling clr!CStructArray::AppendThrowing
0000008e09f3a970 00007ffcdc0656f9 clr!EEHeapFreeInProcessHeap+0x45, calling kernel32!HeapFreeStub
0000008e09f3a990 00007ffcdc1b8064 clr!CMiniMdTemplate<CMiniMd>::getCustomAttributeForToken+0x74, calling clr!CMiniMdBase::SearchTableForMultipleRows
0000008e09f3a9a0 00007ffcdc065749 clr!operator delete+0x29
0000008e09f3a9d0 00007ffcdc0a589b clr!CStructArray::Clear+0x2e, calling clr!operator delete
0000008e09f3a9f0 00007ffcdc1b57fd clr!CMiniMdTemplate<CMiniMdRW>::getValueOfCustomAttribute+0x6d
0000008e09f3aa00 00007ffcdc1ae754 clr!HENUMInternal::ClearEnum+0x18, calling clr!CStructArray::Clear
0000008e09f3aa20 00007ffcdc07036b clr!SimpleRWLock::EnterRead+0x7b, calling clr!GetThread
0000008e09f3aa30 00007ffcdc1d07ef clr!Holder<SimpleRWLock * __ptr64,&SimpleRWLock::AcquireReadLock,&SimpleRWLock::ReleaseReadLock,0,&CompareDefault<SimpleRWLock * __ptr64>,2,1>::~Holder<SimpleRWLock * __ptr64,&SimpleRWLock::AcquireReadLock,&SimpleRWLock::ReleaseReadLock,0,&CompareDefault<SimpleRWLock * __ptr64>,2,1>+0x27, calling clr!GetThread
0000008e09f3aa70 00007ffcdc1ee805 clr!PEFile::GetMDImportWithRef+0x95, calling clr!Holder<SimpleRWLock * __ptr64,&SimpleRWLock::AcquireReadLock,&SimpleRWLock::ReleaseReadLock,0,&CompareDefault<SimpleRWLock * __ptr64>,2,1>::~Holder<SimpleRWLock * __ptr64,&SimpleRWLock::AcquireReadLock,&SimpleRWLock::ReleaseReadLock,0,&CompareDefault<SimpleRWLock * __ptr64>,2,1>
0000008e09f3aac0 00007ffcdc083efb clr!CMDSemReadWrite::~CMDSemReadWrite+0x1c, calling clr!UTSemReadWrite::UnlockRead
0000008e09f3aad0 00007ffcdc1d9f97 clr!AssemblySpec::InitializeSpec+0x24, calling clr!PEFile::GetMDImportWithRef
0000008e09f3ab20 00007ffcdc0f4115 clr!AssemblySpec::AssemblySpec+0x45, calling clr!memset
0000008e09f3ab30 00007ffcdc1f6652 clr!CaNamedArg::InitBoolField+0x3f, calling clr!CaNamedArg::Init
0000008e09f3ab60 00007ffcdc0f4b4d clr!AppDomain::LoadDomainNeutralModuleDependency+0x102, calling clr!AssemblySpec::InitializeSpec
0000008e09f3aca0 00007ffcdc149479 clr!DomainFile::Activate+0xfff5ce4d, calling clr!AppDomain::LoadDomainNeutralModuleDependency
0000008e09f3acc0 00007ffcdc06518c clr!CrstBase::Leave+0x30, calling ntdll!RtlLeaveCriticalSection
0000008e09f3ad30 00007ffcdc1d71e1 clr!FileLoadLock::CompleteLoadLevel+0x81, calling clr!StressLog::LogOn
0000008e09f3ad60 00007ffcdc1d7312 clr!DomainFile::DoIncrementalLoad+0xbf, calling clr!DomainFile::Activate
0000008e09f3adb0 00007ffcdc1d777c clr!AppDomain::TryIncrementalLoad+0xdd, calling clr!DomainFile::DoIncrementalLoad
0000008e09f3ae00 00007ffcdc071ec8 clr!ClassLoader::CheckAccessMember+0x38, calling clr!ClassLoader::CanAccessClass
0000008e09f3aeb0 00007ffcdc1d2170 clr!SString::EqualsCaseInsensitive+0xc0, calling clr!_security_check_cookie
0000008e09f3afb0 00007ffcdc089a63 clr!CompareTypeTokens+0x2b0, calling clr!CompareTypeTokens
0000008e09f3b000 00007ffcdc0656f9 clr!EEHeapFreeInProcessHeap+0x45, calling kernel32!HeapFreeStub
0000008e09f3b020 00007ffcdc07fbae clr!SimpleRWLock::EnterWrite+0x5a, calling clr!GetThread
0000008e09f3b030 00007ffcdc08533b clr!MethodTable::MethodDataObject::`vector deleting destructor'+0x4b
0000008e09f3b060 00007ffce86d96f8 ntdll!RtlDeleteCriticalSection+0x48, calling ntdll!memset
0000008e09f3b070 00007ffcdc080afb clr!MethodDataCache::Insert+0xc7, calling clr!GetThread
0000008e09f3b080 00007ffcdc065226 clr!CrstBase::Enter+0x6a, calling ntdll!RtlTryEnterCriticalSection
0000008e09f3b090 00007ffcdc1d284f clr!CrstBase::Destroy+0x70, calling ntdll!RtlDeleteCriticalSection
0000008e09f3b0c0 00007ffcdc06518c clr!CrstBase::Leave+0x30, calling ntdll!RtlLeaveCriticalSection
0000008e09f3b0d0 00007ffcdc0656f9 clr!EEHeapFreeInProcessHeap+0x45, calling kernel32!HeapFreeStub
0000008e09f3b0f0 00007ffcdc065226 clr!CrstBase::Enter+0x6a, calling ntdll!RtlTryEnterCriticalSection
0000008e09f3b100 00007ffcdc0a4a89 clr!DeadlockAwareLock::ReleaseBlockingLock+0x9, calling clr!GetThread
0000008e09f3b130 00007ffcdc1d6e65 clr!ListLockEntry::DeadlockAwareEnter+0x52, calling clr!DeadlockAwareLock::ReleaseBlockingLock
0000008e09f3b1a0 00007ffcdc1d7475 clr!AppDomain::LoadDomainFile+0x155, calling clr!AppDomain::TryIncrementalLoad
0000008e09f3b230 00007ffcdc06518c clr!CrstBase::Leave+0x30, calling ntdll!RtlLeaveCriticalSection
0000008e09f3b260 00007ffcdc1ec8fd clr!AppDomain::LoadDomainFile+0xcb, calling clr!AppDomain::LoadDomainFile
0000008e09f3b2c0 00007ffcdc1df5ec clr!DomainFile::EnsureLoadLevel+0x38, calling clr!AppDomain::LoadDomainFile
0000008e09f3b2f0 00007ffcdc1f6f6a clr!DomainFile::TryEnsureActive+0x4a, calling clr!DomainFile::EnsureLoadLevel
0000008e09f3b340 00007ffcdc082789 clr!MemberLoader::GetDescFromMemberRef+0x473, calling clr!CrstBase::Leave
0000008e09f3b390 00007ffcdc1f708b clr!DomainFile::PropagateNewActivation+0x8b, calling clr!DomainFile::TryEnsureActive
0000008e09f3b3b0 00007ffcdc065226 clr!CrstBase::Enter+0x6a, calling ntdll!RtlTryEnterCriticalSection
0000008e09f3b3f0 00007ffcdc06518c clr!CrstBase::Leave+0x30, calling ntdll!RtlLeaveCriticalSection
0000008e09f3b420 00007ffcdc1f6f9f clr!Module::AddActiveDependency+0x1bc, calling clr!DomainFile::PropagateNewActivation
0000008e09f3b4b0 00007ffcdc083704 clr!CEEInfo::resolveToken+0x677
0000008e09f3b560 00007ffcdc1ad720 clr!HashMap::LookupValue+0x1f1
0000008e09f3b600 00007ffcdc1d253d clr!PEFile::Equals+0x5d, calling clr!PEImage::Equals
0000008e09f3b640 00007ffcdc1d5091 clr!AssemblySpecBindingCache::StoreAssembly+0x8a, calling clr!_security_check_cookie
0000008e09f3b660 00007ffcd979b164 clrjit!Compiler::verMakeTypeInfo+0xb0 [f:\dd\ndp\clr\src\jit\importer.cpp:4288]
0000008e09f3b6d0 00007ffcd97dc517 clrjit!Compiler::impImportCall+0x976 [f:\dd\ndp\clr\src\jit\importer.cpp:6902], calling clrjit!Compiler::impPushOnStack [f:\dd\ndp\clr\src\jit\importer.cpp:80]
0000008e09f3b730 00007ffcd97ae2b6 clrjit!Compiler::fgWalkTreePre+0x5b [f:\dd\ndp\clr\src\jit\compiler.hpp:2872], calling clrjit!GenTreeVisitor<GenericTreeWalker<0,1,0,0,1> >::WalkTree [f:\dd\ndp\clr\src\jit\compiler.h:9618]
0000008e09f3b780 00007ffce4b7f14a KERNELBASE!MultiByteToWideChar+0xfa, calling KERNELBASE!_security_check_cookie
0000008e09f3b810 00007ffcd97c5339 clrjit!Compiler::impResolveToken+0x69 [f:\dd\ndp\clr\src\jit\importer.cpp:277]
0000008e09f3b840 00007ffcd9795ce9 clrjit!Compiler::impImportBlockCode+0x3a33 [f:\dd\ndp\clr\src\jit\importer.cpp:12857], calling clrjit!Compiler::impResolveToken [f:\dd\ndp\clr\src\jit\importer.cpp:263]
0000008e09f3b860 00007ffcdc083efb clr!CMDSemReadWrite::~CMDSemReadWrite+0x1c, calling clr!UTSemReadWrite::UnlockRead
0000008e09f3b870 00007ffcdc065226 clr!CrstBase::Enter+0x6a, calling ntdll!RtlTryEnterCriticalSection
0000008e09f3b890 00007ffcdc0fd473 clr!MDInternalRW::GetAssemblyProps+0xfe, calling clr!CMDSemReadWrite::~CMDSemReadWrite
0000008e09f3b8b0 00007ffcdc06518c clr!CrstBase::Leave+0x30, calling ntdll!RtlLeaveCriticalSection
0000008e09f3b980 00007ffcdc1d4f61 clr!PEAssembly::HasBindableIdentity+0x9, calling clr!PEFile::GetFlags
0000008e09f3b9b0 00007ffcdc1d4f45 clr!PEAssembly::CanUseWithBindingCache+0x16, calling clr!PEAssembly::HasBindableIdentity
0000008e09f3b9e0 00007ffcdc1d4934 clr!AppDomain::LoadDomainAssemblyInternal+0x129, calling clr!APIThreadStress::~APIThreadStress
0000008e09f3ba40 00007ffcd9786a4c clrjit!Compiler::fgMorphTree+0x4c [f:\dd\ndp\clr\src\jit\morph.cpp:15669], calling clrjit!GenTreeUseEdgeIterator::GenTreeUseEdgeIterator [f:\dd\ndp\clr\src\jit\gentree.cpp:8748]
0000008e09f3ba50 00007ffcdc0fd319 clr!CMiniMdTemplate<CMiniMdRW>::getPublicKeyOfAssembly+0x6d
0000008e09f3ba60 00007ffcdc083efb clr!CMDSemReadWrite::~CMDSemReadWrite+0x1c, calling clr!UTSemReadWrite::UnlockRead
0000008e09f3ba90 00007ffcdc0fd473 clr!MDInternalRW::GetAssemblyProps+0xfe, calling clr!CMDSemReadWrite::~CMDSemReadWrite
0000008e09f3bac0 00007ffcdc1c5f40 clr!BaseAssemblySpec::CompareEx+0x179, calling MSVCR120_CLR0400!strcmp
0000008e09f3baf0 00007ffcdc1adc27 clr!DomainAssemblyCache::CompareBindingSpec+0x33, calling clr!BaseAssemblySpec::CompareEx
0000008e09f3bb10 00007ffcdc1d37de clr!PEFile::IsStrongNamed+0x76, calling clr!Wrapper<IAssemblyName * __ptr64,&DoNothing<IAssemblyName * __ptr64>,&DoTheRelease<IAssemblyName>,0,&CompareDefault<IAssemblyName * __ptr64>,2,1>::~Wrapper<IAssemblyName * __ptr64,&DoNothing<IAssemblyName * __ptr64>,&DoTheRelease<IAssemblyName>,0,&CompareDefault<IAssemblyName * __ptr64>,2,1>
0000008e09f3bb20 00007ffcdc1ad75b clr!ComparePtr::CompareHelper+0x2b
0000008e09f3bb50 00007ffcdc1ad720 clr!HashMap::LookupValue+0x1f1
0000008e09f3bc00 00007ffcdc1d380d clr!LookupMap<Module * __ptr64>::TrySetElement+0xe, calling clr!LookupMapBase::GetElementPtr
0000008e09f3bc30 00007ffcdc0bd826 clr!Module::GetAssemblyIfLoaded+0xe7, calling clr!_security_check_cookie
0000008e09f3bca0 00007ffcdc1d37de clr!PEFile::IsStrongNamed+0x76, calling clr!Wrapper<IAssemblyName * __ptr64,&DoNothing<IAssemblyName * __ptr64>,&DoTheRelease<IAssemblyName>,0,&CompareDefault<IAssemblyName * __ptr64>,2,1>::~Wrapper<IAssemblyName * __ptr64,&DoNothing<IAssemblyName * __ptr64>,&DoTheRelease<IAssemblyName>,0,&CompareDefault<IAssemblyName * __ptr64>,2,1>
0000008e09f3bf40 00007ffcdc070ebd clr!StgPoolReadOnly::GetString+0x3d
0000008e09f3bf80 00007ffcdc0b619f clr!MDInternalRW::GetNameOfTypeDef+0xaf, calling clr!CMiniMdTemplate<CMiniMdRW>::getNamespaceOfTypeDef
0000008e09f3bf90 00007ffcdc0bcc56 clr!ConstructKeyCallbackCompare::UseKeys+0x64, calling MSVCR120_CLR0400!strcmp
0000008e09f3bfc0 00007ffcdc0bcbec clr!EEClassHashTable::ConstructKeyFromData+0x109, calling clr!_security_check_cookie
0000008e09f3bff0 00007ffcdc083efb clr!CMDSemReadWrite::~CMDSemReadWrite+0x1c, calling clr!UTSemReadWrite::UnlockRead
0000008e09f3c060 00007ffcdc0bcd5a clr!EEClassHashTable::FindItem+0xb5, calling clr!EEClassHashTable::ConstructKeyFromData
0000008e09f3c0e0 00007ffcdc0bd42d clr!ClassLoader::GetClassValue+0x123, calling clr!EEClassHashTable::FindItem
0000008e09f3c160 00007ffcdc07ff08 clr!ClassLoader::LoadTypeDefThrowing+0x100, calling clr!_security_check_cookie
0000008e09f3c210 00007ffcdc07013e clr!SigPointer::GetTypeHandleThrowing+0x64a, calling clr!_security_check_cookie
0000008e09f3c2b0 00007ffcdc0bd34e clr!ClassLoader::LoadTypeHandleThrowing+0x137, calling clr!ClassLoader::LoadTypeDefThrowing
0000008e09f3c360 00007ffcdc070ebd clr!StgPoolReadOnly::GetString+0x3d
0000008e09f3c3b0 00007ffcdc0bd984 clr!Module::StoreTypeRef+0x44, calling clr!LookupMapBase::GetElementPtr
0000008e09f3c3f0 00007ffcdc0bd934 clr!ClassLoader::LoadTypeDefOrRefThrowing+0x3af, calling clr!Module::StoreTypeRef
0000008e09f3c430 00007ffce86ade98 ntdll!RtlpAllocateHeapInternal+0xf8, calling ntdll!RtlpLowFragHeapAllocFromContext
0000008e09f3c4b0 00007ffcdbf5bef2 MSVCR120_CLR0400!shortsort+0x62
0000008e09f3c4f0 00007ffcdbf5be78 MSVCR120_CLR0400!qsort+0x328, calling MSVCR120_CLR0400!_security_check_cookie
0000008e09f3c550 00007ffcd7c22918 *** WARNING: Unable to verify checksum for Datadog.Trace.ClrProfiler.Native.dll
*** ERROR: Symbol file could not be found.  Defaulted to export symbols for Datadog.Trace.ClrProfiler.Native.dll - 
Datadog_Trace_ClrProfiler_Native!IsProfilerAttached+0x373d8, calling ntdll!RtlAllocateHeap
0000008e09f3c580 00007ffcd7befa03 Datadog_Trace_ClrProfiler_Native!IsProfilerAttached+0x44c3, calling Datadog_Trace_ClrProfiler_Native!IsProfilerAttached+0x37394
0000008e09f3c5e0 00007ffcd7c2140c Datadog_Trace_ClrProfiler_Native!IsProfilerAttached+0x35ecc, calling kernel32!HeapFreeStub
0000008e09f3c610 00007ffcd7bc6cf4 Datadog_Trace_ClrProfiler_Native!DllCanUnloadNow+0x13d84, calling Datadog_Trace_ClrProfiler_Native!IsProfilerAttached+0x4480
0000008e09f3c660 00007ffcd7c2140c Datadog_Trace_ClrProfiler_Native!IsProfilerAttached+0x35ecc, calling kernel32!HeapFreeStub
0000008e09f3c690 00007ffcd7bc6ff2 Datadog_Trace_ClrProfiler_Native!DllCanUnloadNow+0x14082, calling Datadog_Trace_ClrProfiler_Native!IsProfilerAttached+0x4480
0000008e09f3c720 00007ffcdc078ffb clr!CEEInfo::getArgClass+0x1bb, calling clr!SigPointer::GetTypeHandleThrowing
0000008e09f3c7b0 00007ffcd979c98f clrjit!Compiler::impImportBlock+0x7d [f:\dd\ndp\clr\src\jit\importer.cpp:16079], calling clrjit!Compiler::impImportBlockCode [f:\dd\ndp\clr\src\jit\importer.cpp:9841]
0000008e09f3c840 00007ffcd97bfbb5 clrjit!ExpandArray<unsigned char>::EnsureCoversInd+0x104 [f:\dd\ndp\clr\src\inc\expandarray.h:206], calling clrjit!memset
0000008e09f3c880 00007ffcd9799e04 clrjit!Compiler::impImport+0x2f7 [f:\dd\ndp\clr\src\jit\importer.cpp:17163], calling clrjit!Compiler::impImportBlock [f:\dd\ndp\clr\src\jit\importer.cpp:15994]
0000008e09f3c8d0 00007ffcd97a1609 clrjit!Compiler::compCompile+0x99 [f:\dd\ndp\clr\src\jit\compiler.cpp:4386], calling clrjit!Compiler::impImport [f:\dd\ndp\clr\src\jit\importer.cpp:17050]
0000008e09f3c910 00007ffcd97a1c8e clrjit!Compiler::fgFindBasicBlocks+0x175 [f:\dd\ndp\clr\src\jit\flowgraph.cpp:6316], calling clrjit!__security_check_cookie [f:\dd\vctools\crt\crtw32\misc\amd64\amdsecgs.asm:45]
0000008e09f3c9b0 00007ffcd97a1097 clrjit!Compiler::compCompileHelper+0x2a7 [f:\dd\ndp\clr\src\jit\compiler.cpp:6025], calling clrjit!Compiler::compCompile [f:\dd\ndp\clr\src\jit\compiler.cpp:4360]
0000008e09f3ca60 00007ffcd97a2243 clrjit!Compiler::compCompile+0x24b [f:\dd\ndp\clr\src\jit\compiler.cpp:5359], calling clrjit!Compiler::compCompileHelper [f:\dd\ndp\clr\src\jit\compiler.cpp:5708]
0000008e09f3caa0 00007ffcd979b92e clrjit!Compiler::compInit+0x48a [f:\dd\ndp\clr\src\jit\compiler.cpp:1970], calling clrjit!ExpandArray<LclSsaVarDsc>::Init [f:\dd\ndp\clr\src\inc\expandarray.h:52]
0000008e09f3cb20 00007ffcd979beba clrjit!jitNativeCode+0x26a [f:\dd\ndp\clr\src\jit\compiler.cpp:6666], calling clrjit!Compiler::compCompile [f:\dd\ndp\clr\src\jit\compiler.cpp:5108]
0000008e09f3cce0 00007ffcd9784dd2 clrjit!CILJit::compileMethod+0xa2 [f:\dd\ndp\clr\src\jit\ee_il_dll.cpp:315], calling clrjit!jitNativeCode [f:\dd\ndp\clr\src\jit\compiler.cpp:6540]
0000008e09f3cd50 00007ffcdc074e5c clr!invokeCompileMethodHelper+0xce
0000008e09f3cdc0 00007ffcdc0750df clr!invokeCompileMethod+0x97, calling clr!invokeCompileMethodHelper
0000008e09f3ce30 00007ffcdc074fb2 clr!CallCompileMethodWithSEHWrapper+0x52, calling clr!invokeCompileMethod
0000008e09f3cec0 00007ffcdc074bd6 clr!UnsafeJitFunction+0x7ea, calling clr!CallCompileMethodWithSEHWrapper
0000008e09f3d080 00007ffcd7bc054d Datadog_Trace_ClrProfiler_Native!DllCanUnloadNow+0xd5dd, calling Datadog_Trace_ClrProfiler_Native!IsProfilerAttached+0x25b38
0000008e09f3d330 00007ffce86ade98 ntdll!RtlpAllocateHeapInternal+0xf8, calling ntdll!RtlpLowFragHeapAllocFromContext
0000008e09f3d380 00007ffcdc08494d clr!CMiniMdTemplate<CMiniMdRW>::getSignatureOfStandAloneSig+0x6d
0000008e09f3d3c0 00007ffcdc0848b6 clr!MDInternalRW::GetSigFromToken+0x86, calling clr!CMiniMdTemplate<CMiniMdRW>::getSignatureOfStandAloneSig
0000008e09f3d400 00007ffcdc07abd4 clr!COR_ILMETHOD_DECODER::COR_ILMETHOD_DECODER+0x20c
0000008e09f3d410 00007ffcdc072c59 clr!ETW::MethodLog::MethodJitting+0x81, calling clr!CLRException::HandlerState::CleanupTry
0000008e09f3d430 00007ffcdc07753e clr!Module::GetIL+0x42, calling clr!PEDecoder::CheckILMethod
0000008e09f3d460 00007ffcdc51ca41 clr!EEToProfInterfaceImpl::JITCompilationStarted+0x85
0000008e09f3d4b0 00007ffcdc074188 clr!MethodDesc::MakeJitWorker+0x498, calling clr!UnsafeJitFunction
0000008e09f3d4f0 00007ffcdc084d14 clr!MDInternalRW::GetCountWithTokenKind+0x9c, calling clr!CMDSemReadWrite::~CMDSemReadWrite
0000008e09f3d5c0 00007ffcdc07286c clr!validateTokenSig+0xec, calling clr!validateOneArg
0000008e09f3d5f0 00007ffcdc0848b6 clr!MDInternalRW::GetSigFromToken+0x86, calling clr!CMiniMdTemplate<CMiniMdRW>::getSignatureOfStandAloneSig
0000008e09f3d630 00007ffcdc07ac00 clr!COR_ILMETHOD_DECODER::COR_ILMETHOD_DECODER+0x278, calling clr!validateTokenSig
0000008e09f3d6e0 00007ffcdc06ee3f clr!MethodDesc::DoPrestub+0x94c, calling clr!MethodDesc::MakeJitWorker
0000008e09f3d7f0 00007ffc7d3b4cc9 (MethodDesc 00007ffc7d392eb8 +0x9 System.Configuration.ConfigurationValues.GetConfigValue(Int32)), calling (MethodDesc 00007ffc7d303c28 +0 System.Collections.Specialized.NameObjectCollectionBase.BaseGet(Int32))
0000008e09f3d810 00007ffc7d3b4b83 (MethodDesc 00007ffc7d39c738 +0x13 System.Configuration.ConfigurationValues+ConfigurationElementsCollection+<System-Collections-IEnumerable-GetEnumerator>d__2..ctor(Int32)), calling (MethodDesc 00007ffc7caf6708 +0 System.Object..ctor())
0000008e09f3d818 00007ffc7ce040a0 (MethodDesc 00007ffc7cc5a028 +0x10 System.Runtime.InteropServices.HandleRef..ctor(System.Object, IntPtr)), calling clr!JIT_CheckedWriteBarrier
0000008e09f3d820 00007ffc7d3b4c89 (MethodDesc 00007ffc7d392f28 +0x9 System.Configuration.ConfigurationValues.get_Item(Int32)), calling (MethodDesc 00007ffc7d392eb8 +0 System.Configuration.ConfigurationValues.GetConfigValue(Int32))
0000008e09f3d830 00007ffc7d13a4ee (MethodDesc 00007ffc7d2db348 +0xae System.Web.ImpersonationContext.RestoreImpersonation()), calling (MethodDesc 00007ffc7cbb2a50 +0 System.IntPtr.op_Inequality(IntPtr, IntPtr))
0000008e09f3d840 00007ffc7d3b5059 (MethodDesc 00007ffc7d304a78 +0xa9 System.Configuration.ConfigurationElementCollection.SetReadOnly()), calling (MethodDesc 00007ffc7d304a78 +0xb1 System.Configuration.ConfigurationElementCollection.SetReadOnly())
0000008e09f3d8d0 00007ffcdc06e4db clr!PreStubWorker+0x3cc, calling clr!MethodDesc::DoPrestub
0000008e09f3d960 00007ffc7d169c51 (MethodDesc 00007ffc7d26a798 +0x91 System.Configuration.RuntimeConfigurationRecord.GetRuntimeObject(System.Object)), calling (MethodDesc 00007ffc7d26a798 +0x9f System.Configuration.RuntimeConfigurationRecord.GetRuntimeObject(System.Object))
0000008e09f3da50 00007ffc7cde1b35 (MethodDesc 00007ffc7cc5a5f8 +0x125 System.Collections.Hashtable.get_Item(System.Object))
0000008e09f3db10 00007ffc7cde1b35 (MethodDesc 00007ffc7cc5a5f8 +0x125 System.Collections.Hashtable.get_Item(System.Object))
0000008e09f3db20 00007ffc7d142239 (MethodDesc 00007ffc7d303c08 +0x9 System.Collections.Specialized.NameObjectCollectionBase.BaseGet(System.String)), calling (MethodDesc 00007ffc7d303b68 +0 System.Collections.Specialized.NameObjectCollectionBase.FindEntry(System.String))
0000008e09f3dc20 00007ffcdc064825 clr!ThePreStub+0x55, calling clr!PreStubWorker
0000008e09f3dcd0 00007ffc84afe77f (MethodDesc 00007ffc82402d88 +0x1f MySql.Web.Common.SchemaManager.CheckSchema(System.String, System.Collections.Specialized.NameValueCollection)), calling 00007ffc7e2f6628 (stub for MySql.Web.Common.SchemaManager.GetSchemaVersion(System.String))
0000008e09f3dd20 00007ffc84afe6c1 (MethodDesc 00007ffc824044b0 +0x1d1 MySql.Web.Security.MySQLRoleProvider.Initialize(System.String, System.Collections.Specialized.NameValueCollection)), calling 00007ffc7e2f6618 (stub for MySql.Web.Common.SchemaManager.CheckSchema(System.String, System.Collections.Specialized.NameValueCollection))
0000008e09f3dd70 00007ffc7d9223ae (MethodDesc 00007ffc7d304378 +0x2ae System.Web.Configuration.ProvidersHelper.InstantiateProvider(System.Configuration.ProviderSettings, System.Type))
0000008e09f3de00 00007ffc7d920cb8 (MethodDesc 00007ffc7d304398 +0x98 System.Web.Configuration.ProvidersHelper.InstantiateProviders(System.Configuration.ProviderSettingsCollection, System.Configuration.Provider.ProviderCollection, System.Type)), calling (MethodDesc 00007ffc7d304378 +0 System.Web.Configuration.ProvidersHelper.InstantiateProvider(System.Configuration.ProviderSettings, System.Type))
0000008e09f3de60 00007ffc7d9200cd (MethodDesc 00007ffc7d8870d8 +0x1fd System.Web.Security.Roles.Initialize()), calling (MethodDesc 00007ffc7d8870e8 +0 System.Web.Security.Roles.InitializeSettings(System.Web.Configuration.RoleManagerSection))
0000008e09f3ded0 00007ffc7d91fe99 (MethodDesc 00007ffc7d886eb8 +0x9 System.Web.Security.Roles.get_CacheRolesInCookie()), calling (MethodDesc 00007ffc7d8870d8 +0 System.Web.Security.Roles.Initialize())
0000008e09f3df00 00007ffc7d91fc3b (MethodDesc 00007ffc7d7b4bd0 +0xfb System.Web.Security.RoleManagerModule.OnEnter(System.Object, System.EventArgs)), calling (MethodDesc 00007ffc7d886eb8 +0 System.Web.Security.Roles.get_CacheRolesInCookie())
0000008e09f3df10 00007ffc7d8c8c99 (MethodDesc 00007ffc7d8b1478 +0x19 System.Web.EtwTrace.IsTraceEnabled(Int32, Int32)), calling clr!JIT_GetSharedNonGCStaticBase_InlineGetAppDomain
0000008e09f3df50 00007ffc7d8d1358 (MethodDesc 00007ffc7d7bbd10 +0x98 System.Web.HttpApplication+SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute())
0000008e09f3df60 00007ffc7d8c28d5 (MethodDesc 00007ffc7d8b2430 +0xa5 DomainNeutralILStubClass.IL_STUB_PInvoke(IntPtr, Int32 ByRef, Boolean ByRef, Int32 ByRef))
0000008e09f3dfa0 00007ffc7d8d1270 (MethodDesc 00007ffc7d5146c0 +0xe0 System.Web.HttpApplication.ExecuteStepImpl(IExecutionStep))
0000008e09f3dfb0 00007ffc7d8d10f6 (MethodDesc 00007ffc7d43b558 +0x16 System.Web.HttpContext.BeginCancellablePeriod()), calling (MethodDesc 00007ffc7cbbb858 +0 System.Threading.Volatile.Read(Int64 ByRef))
0000008e09f3dfc0 00007ffc7d67ad73 (MethodDesc 00007ffc7d242cc0 +0x13 System.Web.HttpRuntime.get_UseIntegratedPipeline()), calling clr!JIT_GetSharedNonGCStaticBase_InlineGetAppDomain
0000008e09f3dfe0 00007ffc7d8d0e2b (MethodDesc 00007ffc7d5146d0 +0x5b System.Web.HttpApplication.ExecuteStep(IExecutionStep, Boolean ByRef)), calling (MethodDesc 00007ffc7d5146c0 +0 System.Web.HttpApplication.ExecuteStepImpl(IExecutionStep))
0000008e09f3dff0 00007ffc7d8d0b9f (MethodDesc 00007ffc7d513e80 +0x1f System.Web.HttpApplication.get_CurrentModuleContainer()), calling (MethodDesc 00007ffc7d43b788 +0 System.Web.HttpContext.get_CurrentModuleIndex())
0000008e09f3e000 00007ffc7d8d0d81 (MethodDesc 00007ffc7d7bbbe0 +0x11 System.Web.PipelineModuleStepContainer.GetNextEvent(System.Web.RequestNotification, Boolean, Int32)), calling (MethodDesc 00007ffc7d7bbbc0 +0 System.Web.PipelineModuleStepContainer.GetStepArray(System.Web.RequestNotification, Boolean))
0000008e09f3e030 00007ffc7d8ca873 (MethodDesc 00007ffc7d8b4308 +0x4e3 System.Web.HttpApplication+PipelineStepManager.ResumeSteps(System.Exception)), calling (MethodDesc 00007ffc7d5146d0 +0 System.Web.HttpApplication.ExecuteStep(IExecutionStep, Boolean ByRef))
0000008e09f3e050 00007ffc7d91baf4 (MethodDesc 00007ffc7d7b4778 +0x14 System.Web.Security.WindowsAuthenticationModule.get_IsEnabled()), calling clr!JIT_GetSharedNonGCStaticBase_InlineGetAppDomain
0000008e09f3e130 00007ffc7d8c9eb0 (MethodDesc 00007ffc7d514820 +0x70 System.Web.HttpApplication.BeginProcessRequestNotification(System.Web.HttpContext, System.AsyncCallback)), calling (MethodDesc 00007ffc7d514700 +0 System.Web.HttpApplication.ResumeSteps(System.Exception))
0000008e09f3e140 00007ffc7d8c90a1 (MethodDesc 00007ffc7d750ed0 +0xc1 System.Web.Hosting.IIS7WorkerRequest.SynchronizeVariables(System.Web.HttpContext)), calling (MethodDesc 00007ffc7d43b748 +0 System.Web.HttpContext.get_AreResponseHeadersSent())
0000008e09f3e180 00007ffc7d8c2c57 (MethodDesc 00007ffc7d242cf0 +0x177 System.Web.HttpRuntime.ProcessRequestNotificationPrivate(System.Web.Hosting.IIS7WorkerRequest, System.Web.HttpContext)), calling (MethodDesc 00007ffc7d514820 +0 System.Web.HttpApplication.BeginProcessRequestNotification(System.Web.HttpContext, System.AsyncCallback))
0000008e09f3e190 00007ffcdc0650f9 clr!ThreadNative::GetCurrentThread+0x9, calling clr!GetThread
0000008e09f3e1b0 00007ffc7d8c2a9a (MethodDesc 00007ffc7d242ce0 +0x2a System.Web.HttpRuntime.ProcessRequestNotification(System.Web.Hosting.IIS7WorkerRequest, System.Web.HttpContext)), calling clr!JIT_GetSharedGCStaticBase_InlineGetAppDomain
0000008e09f3e1f0 00007ffc7d8c07e0 (MethodDesc 00007ffc7d6dd130 +0x360 System.Web.Hosting.PipelineRuntime.ProcessRequestNotificationHelper(IntPtr, IntPtr, IntPtr, Int32)), calling (MethodDesc 00007ffc7d242ce0 +0 System.Web.HttpRuntime.ProcessRequestNotification(System.Web.Hosting.IIS7WorkerRequest, System.Web.HttpContext))
0000008e09f3e350 00007ffc7d6a7123 (MethodDesc 00007ffc7d6dd120 +0x13 System.Web.Hosting.PipelineRuntime.ProcessRequestNotification(IntPtr, IntPtr, IntPtr, Int32)), calling (MethodDesc 00007ffc7d6dd130 +0 System.Web.Hosting.PipelineRuntime.ProcessRequestNotificationHelper(IntPtr, IntPtr, IntPtr, Int32))
0000008e09f3e390 00007ffc7d6a6a32 (MethodDesc 00007ffc7d8b0618 +0x52 DomainNeutralILStubClass.IL_STUB_ReversePInvoke(Int64, Int64, Int64, Int32))

Linux duration span issue still happens with .Net Core HttpClient

Describe the bug
The span duration issue on Linux that was outlined in #293 still appears when using HttpClient. The average duration of some HttpClient spans is 100 to 1000 times larger than the duration of the service they are actually calling.

To Reproduce
Steps to reproduce the behavior:
Call an API from a .NET Core application on Linux using HttpClient with Datadog APM enabled.
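As a minimal check (a sketch; the URL is illustrative, not from the original report), the wall-clock time of the call can be compared against the span duration that the tracer reports for the same request:

```csharp
// Sketch: time an outbound HttpClient call directly and compare the
// result against the HttpClient span duration shown in APM.
using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

class Repro
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            var sw = Stopwatch.StartNew();
            // Any instrumented outbound call will do; URL is illustrative.
            await client.GetAsync("http://localhost:5000/api/values");
            sw.Stop();
            // With APM enabled, the span for this request should be close
            // to sw.Elapsed; per this report it is 100-1000x larger.
            Console.WriteLine(sw.Elapsed);
        }
    }
}
```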

Expected behavior
Durations of HttpClient spans should closely match the durations of the services they are calling.

Screenshots
Most of the HttpClient calls are calling other services in the list.

Runtime environment (please complete the following information):

  • Automatic instrumentation, .NET Core on Linux
  • Tracer version: 1.0.0
  • OS: Linux x64 (Debian)
  • CLR: .NET Core 2.0

Let me know if more information is required. The original fix in 0.8.2 worked great; you might have just missed a spot.

Exception when instrumenting application with multiple AppDomains

Describe the bug
When creating an AppDomain in a C# application, the following exception is thrown:
System.Runtime.Serialization.SerializationException: Type 'Datadog.Trace.Scope' in assembly 'Datadog.Trace, Version=0.5.1.0, Culture=neutral, PublicKeyToken=def86d061d0d2eeb' is not marked as serializable.

To Reproduce
Steps to reproduce the behavior:

  1. Instrument an application which uses AppDomain.CreateDomain(...)
  2. Run it
  3. Get the exception
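The steps above can be sketched as follows (a minimal .NET Framework sketch, assuming the non-serializable Datadog.Trace.Scope is stored in the logical call context by automatic instrumentation; any cross-domain operation that marshals the call context then hits the serializer):

```csharp
// Sketch of the failing pattern (.NET Framework only): crossing an
// AppDomain boundary marshals the logical call context via binary
// serialization, so a non-[Serializable] entry such as
// Datadog.Trace.Scope triggers a SerializationException.
using System;

class Program
{
    static void Main()
    {
        // With the profiler attached, an active Scope may already sit
        // in the call context before this point.
        AppDomain child = AppDomain.CreateDomain("child");

        // The callback itself is harmless; the marshalled call context
        // is what fails to serialize.
        child.DoCallBack(() => Console.WriteLine("running in child domain"));

        AppDomain.Unload(child);
    }
}
```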

Expected behavior
No exception should be thrown.

Runtime environment (please complete the following information):

  • Automatic with MSI install
  • Tracer version: 0.5.2-beta
  • OS: Windows Server 2016
  • CLR: .NET Framework 4.6.2


Installing Datadog.Trace 1.1.0 via NuGet fails to install LibLog 5.0.6

I have a .NET Framework 4.5.2 project that I'm upgrading from 0.7.1-beta to 1.1.0.

My installation is failing with the following:

Attempting to gather dependency information for multiple packages with respect to project 'SkySlope.DataFeed.Api', targeting '.NETFramework,Version=v4.5.2'
Gathering dependency information took 10.85 sec
Attempting to resolve dependencies for multiple packages.
Resolving dependency information took 0 ms
Resolving actions install multiple packages
Retrieving package 'Datadog.Trace 1.1.0' from 'nuget.org'.
Retrieving package 'LibLog 5.0.6' from 'nuget.org'.
Retrieving package 'MsgPack.Cli 1.0.1' from 'nuget.org'.
  GET https://api.nuget.org/v3-flatcontainer/datadog.trace/1.1.0/datadog.trace.1.1.0.nupkg
Removed package 'Datadog.Trace 0.7.1-beta' from 'packages.config'
  GET https://api.nuget.org/v3-flatcontainer/msgpack.cli/1.0.1/msgpack.cli.1.0.1.nupkg
Successfully uninstalled 'Datadog.Trace 0.7.1-beta' from SkySlope.DataFeed.Api
  OK https://api.nuget.org/v3-flatcontainer/msgpack.cli/1.0.1/msgpack.cli.1.0.1.nupkg 12ms
Installing MsgPack.Cli 1.0.1.
Removed package 'MsgPack.Cli 1.0.0' from 'packages.config'
Successfully uninstalled 'MsgPack.Cli 1.0.0' from SkySlope.DataFeed.Api
Install failed. Rolling back...
Package 'LibLog 5.0.6' does not exist in project 'SkySlope.DataFeed.Api'
Package 'MsgPack.Cli 1.0.0' already exists in folder 'C:\skyslope\skyslope-datafeed\packages'
Added package 'MsgPack.Cli 1.0.0' to 'packages.config'
Package 'Datadog.Trace 0.7.1-beta' already exists in folder 'C:\skyslope\skyslope-datafeed\packages'
Added package 'Datadog.Trace 0.7.1-beta' to 'packages.config'
Package 'LibLog 5.0.6' does not exist in folder 'C:\skyslope\skyslope-datafeed\packages'
Executing nuget actions took 527.85 ms
Update-Package : Could not install package 'LibLog 5.0.6'. You are trying to install this package into a project that targets '.NETFramework,Version=v4.5.2', but the package does not contain any 
assembly references or content files that are compatible with that framework. For more information, contact the package author.
At line:1 char:1
+ Update-Package Datadog.Trace
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Update-Package], Exception
    + FullyQualifiedErrorId : NuGetCmdletUnhandledException,NuGet.PackageManagement.PowerShellCmdlets.UpdatePackageCommand
 
Time Elapsed: 00:00:11.4693147

Any ideas on how to proceed? We'll fall back to 1.0.0 for now, since that doesn't have LibLog as a dependency.

AspNet integration creates spans that don't wrap any subspans

Description
We are running the AdoNet integration and the AspNet integration together. For WebForms requests, a span is created that covers the duration of the page's execution, but it does not contain any of the database queries performed as part of that execution. Separate spans are created for the database queries, but they are orphans (no originating page span contains them).

To Reproduce
Create an AspNet WebForms solution and run it in an IIS application pool set to Integrated mode.

Expected behavior
Any spans/SQL queries executed as part of an AspNet WebForms request should appear as subspans of the request span.

Screenshots
image

Runtime environment:

  • Instrumentation mode: automatic with msi installer and AspNet integration using the NuGet package
  • Tracer version: 1.0.0 (awaiting bug-fixed 1.1.0)
  • OS: Windows Server 2016 version 1607
  • CLR: .NET Framework 4.6.2

Additional context
This is probably due to AsyncLocal being used (incorrectly, in an ASP.NET WebForms context) to store the scope in the ScopeManager. The ExecutionContext doesn't flow between ASP.NET pipeline events, so in that scenario the scope could be stored in HttpContext.Items instead. See aspnet/AspNetKatana#31 (comment) and https://stackoverflow.com/questions/43391498/asynclocal-value-is-null-after-being-set-from-within-application-beginrequest/43500585#43500585
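A sketch of the suggested workaround (not the tracer's actual implementation; names are hypothetical): fall back to HttpContext.Items when a request context exists, since it survives pipeline-event boundaries where AsyncLocal does not.

```csharp
// Hypothetical scope store for ASP.NET (System.Web) hosts: prefer
// HttpContext.Items, which is tied to the request rather than the
// ExecutionContext, and so is visible across pipeline events.
using System.Web;

static class RequestScopeStore
{
    // Private key object avoids collisions with other Items entries.
    private static readonly object Key = new object();

    public static void SetActive(object scope)
    {
        var ctx = HttpContext.Current;
        if (ctx != null)
        {
            ctx.Items[Key] = scope; // survives BeginRequest -> EndRequest
        }
    }

    public static object GetActive()
    {
        return HttpContext.Current?.Items[Key];
    }
}
```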

About some suggestions

1. On .NET Framework the GAC, and on .NET Core the DOTNET_ADDITIONAL_DEPS environment variable, can make an assembly loadable without a NuGet reference.
2. I think having the native C++ profiler rewrite the IL at method call sites may not be a good approach; we could rewrite the target method's IL instead. (To handle async methods, we can use the Profiler API to define a renamed copy of the target method, rewrite the target method's IL to call into the managed profiler DLL, and have the managed code call the renamed copy.)

Instrumentation for .NET Windows Service

Hi, I couldn't find this information in your docs, so I thought I'd ask here.

We have a .NET Windows Service (based on .NET Framework 4.7.2, built with the TopShelf library, running under Windows Server 2016). The Windows Service performs background processing, which in turn makes database calls, web requests, etc.

Is it possible to use the Datadog .NET tracer to automatically instrument a .NET Windows Service, so that its methods, internal calls, and uncaught exceptions can be traced? If so, how should I go about it, and what steps are involved?
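For reference, a sketch of one way this is typically done for .NET Framework services (the service name is a placeholder; the CLSID is the Datadog profiler's COM class ID): after installing the tracer MSI, set the CLR profiling environment variables on the service's registry entry so the profiler attaches at startup, then restart the service.

```bat
:: Sketch, assuming the tracer MSI is installed. "MyWindowsService" is a
:: hypothetical service name. The Environment value is REG_MULTI_SZ, with
:: \0 separating the two variables.
REG ADD "HKLM\SYSTEM\CurrentControlSet\Services\MyWindowsService" ^
    /v Environment /t REG_MULTI_SZ ^
    /d "COR_ENABLE_PROFILING=1\0COR_PROFILER={846F5F1C-F9AE-4B07-969E-05C26BC060D8}" /f

:: Restart the service so the new environment takes effect.
net stop MyWindowsService
net start MyWindowsService
```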
