Comments (6)
Done.
from aws-otel-community.
Hi joebowbeer,
It's on our roadmap to support AWS Lambda, but we do not support it today. Stay tuned for updates soon!
Meanwhile, see our public roadmap here: https://github.com/orgs/aws-observability/projects/4
from aws-otel-community.
Hello,
I spent some time working through integrating Lambda <> OpenTelemetry in a NodeJS environment. Below are some of the areas that I think could see improvement:
Timing of OTEL auto-instrumentation.
We followed a similar approach to what is described in the guide Tracing with the AWS Distro for OpenTelemetry JavaScript SDK, although my setup file is named telemetry.js instead of tracing.js.
When working with simple examples, it's easy to add a line similar to the following at the top of your code. This allows OpenTelemetry to intercept calls to require in order to automatically instrument supported modules.

```js
require('./telemetry.js');
```
Unfortunately this approach can fall apart pretty quickly if you add babel/webpack into the mix to minimize the size of your lambda functions. A recommended alternative is to launch your application with a node argument that allows telemetry.js to load before your main code executes. The command typically looks something like this:

```sh
node --require ./telemetry.js src/index.js
```
This turned out to be mildly challenging in a Lambda environment. I was eventually able to launch our NodeJS lambda functions using this approach, but it involved:
- Creating a new lambda layer that included our telemetry.js setup, a custom telemetry-wrapper execution wrapper, and all the node_modules dependencies required to run said telemetry.js file. Note: if you are building a new layer from scratch, make sure your dependencies fall under nodejs/node_modules.
telemetry-wrapper:

```bash
#!/bin/bash
# the path to the interpreter and all of the originally intended arguments
args=("$@")
# the extra options to pass to the interpreter
extra_args=("--require" "/opt/telemetry.js")
# insert the extra options
args=("${args[@]:0:$#-1}" "${extra_args[@]}" "${args[@]: -1}")
# start the runtime with the extra options
exec "${args[@]}"
```
- Adding a custom environment variable to our Lambda to execute the custom execution wrapper:

```
AWS_LAMBDA_EXEC_WRAPPER=/opt/telemetry-wrapper
```
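The argument splice in the telemetry-wrapper script can be exercised outside of Lambda to see exactly where the extra flags land (a standalone sketch; `splice` is a local helper, and in a real invocation the runtime supplies the interpreter arguments):

```shell
#!/bin/bash
# Demonstrates the splice performed by telemetry-wrapper: the extra
# --require flag is inserted just before the last original argument
# (the runtime's entry script), leaving everything else untouched.
set -euo pipefail

splice() {
  local args=("$@")
  local extra_args=("--require" "/opt/telemetry.js")
  # keep all but the last arg, append the extras, then the last arg
  args=("${args[@]:0:$#-1}" "${extra_args[@]}" "${args[@]: -1}")
  printf '%s\n' "${args[@]}"
}

splice node --flag1 /var/runtime/index.js
```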
OTEL Exporters need to flush before shutting down
It's not clear to me if this is a responsibility for AWS or OTEL, but the OTEL exporters don't seem to have the opportunity to flush any pending spans before a lambda shuts down. I was able to solve this with the following call:

```js
await api.trace.getTracerProvider().getDelegate().shutdown();
```
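To illustrate why the flush matters before the execution environment freezes, here is a dependency-free mock of the buffering behavior (`FakeBatchExporter` and `handler` are hypothetical; real OTEL batch span processors behave analogously):

```javascript
// A batching exporter buffers spans and only sends them on shutdown/flush.
// If the Lambda freezes before the flush, buffered spans are lost.
class FakeBatchExporter {
  constructor() { this.buffer = []; this.sent = []; }
  export(span) { this.buffer.push(span); }          // span sits in memory
  async shutdown() {                                // drain buffer on flush
    this.sent.push(...this.buffer);
    this.buffer = [];
  }
}

async function handler(exporter) {
  exporter.export('handler-span');
  // Without this await, the span would still be sitting in the buffer
  // when the execution environment freezes.
  await exporter.shutdown();
  return 'ok';
}

(async () => {
  const exporter = new FakeBatchExporter();
  await handler(exporter);
  console.log(exporter.sent); // [ 'handler-span' ]
})();
```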
While doing research to find a working solution to the timing issue, I found Sentry's approach to Lambdas pretty clean; hopefully this use case is something that OpenTelemetry gives some attention to in the future.
Connecting OTEL to the external X-Ray trace.
If you turn on X-Ray support for a Lambda, you automatically get high-level instrumentation out of the box. If you set up the aws-otel-collector and use OTEL libraries to export traces to X-Ray, you can get detailed instrumentation. If you have both X-Ray support enabled at the Lambda level and your own OTEL instrumentation, this will result in two separate traces being generated for every invocation of the Lambda function. I managed to work around this by abusing the AWSXRayPropagator class. :-)
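For reference, the _X_AMZN_TRACE_ID environment variable carries the same Root;Parent;Sampled format as the X-Amzn-Trace-Id HTTP header, which is why handing it to the propagator as a mock header works at all. A minimal stdlib-only sketch of that format (`parseXrayTraceHeader` is a hypothetical helper, not part of any OTEL package):

```javascript
// Parses an X-Ray trace header of the form:
//   Root=1-5759e988-bd862e3fe1be46a994272793;Parent=53995c3f42cd8ad8;Sampled=1
function parseXrayTraceHeader(header) {
  const fields = {};
  for (const part of header.split(';')) {
    const [key, value] = part.split('=');
    if (key && value !== undefined) fields[key.trim()] = value.trim();
  }
  return fields;
}

const parsed = parseXrayTraceHeader(
  'Root=1-5759e988-bd862e3fe1be46a994272793;Parent=53995c3f42cd8ad8;Sampled=1'
);
console.log(parsed.Root);    // 1-5759e988-bd862e3fe1be46a994272793
console.log(parsed.Sampled); // 1
```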
Recording exceptions does not seem to work
For some reason, recording exceptions does not seem to work as expected. I haven't had time to dig deeper into what's going on, but I don't see any indication of the error or stack trace in X-Ray.
Other thoughts:
I found this discussion interesting, but I'm not sure if a custom tracer would have helped with the issues I ran into:
It would be nice if this opentelemetry plugin supported a Lambda environment:
Below is an example that demonstrates some of the topics discussed above:
```js
import "core-js/stable";
import "regenerator-runtime/runtime";

const path = require('path');
const { NoRecordingSpan } = require('@opentelemetry/core');
const api = require('@opentelemetry/api');
const deployment = require('../package.json');
const axios = require('axios');

function instrumentHandler(handler) {
  return async (event, context, callback) => {
    const tracer = api.trace.getTracer(deployment.name, deployment.version);
    // Currently AWSXRayPropagator expects to be passed HTTP headers,
    // not a Lambda environment map.
    const mockHttpRequestHeaders = {
      'X-Amzn-Trace-Id': process.env._X_AMZN_TRACE_ID
    };
    // propagate remote AWS X-Ray span to current execution context
    return await api.context.with(api.propagation.extract(mockHttpRequestHeaders), async () => {
      const remoteSpan = new NoRecordingSpan(api.context.active());
      const handlerName = path.basename(process.env._HANDLER);
      const handlerSpan = tracer.startSpan(handlerName, {
        parent: remoteSpan,
        kind: api.SpanKind.CONSUMER
      });
      let handlerReturn = null;
      let handlerError = null;
      try {
        handlerReturn = await tracer.withSpan(handlerSpan, async () => {
          return handler(event, context, callback);
        });
      } catch (error) {
        handlerError = error;
        // FIXME: Recorded exceptions do not make it to X-Ray, attached to the
        // respective span. By allowing the exception to bubble up, Lambda's
        // X-Ray integration will ultimately record the exception at a
        // higher-level span.
        handlerSpan.recordException(error);
      }
      handlerSpan.end();
      // ensure exporter(s) have a chance to flush spans before
      // the lambda fn shuts down or freezes
      await api.trace.getTracerProvider().getDelegate().shutdown();
      if (handlerError) {
        throw handlerError;
      }
      return handlerReturn;
    });
  };
}

// eslint-disable-next-line no-unused-vars
export const handler = instrumentHandler(async (event, context, callback) => {
  console.log(process.env);
  const response = await axios.get('https://ifconfig.co/json');
  console.log(response.data);
});
```
from aws-otel-community.
After further investigation, it seems that the approach highlighted above likely does not handle the scenario in which Lambdas are frozen, and potentially never thawed. See the following open issue for details:
open-telemetry/opentelemetry-js#1739
from aws-otel-community.
Is an external-type Lambda extension needed?
from aws-otel-community.
@joebowbeer - Possibly, it looks like some effort has been put towards using Lambda extensions for OTEL support. TBH I'm not familiar enough with Lambda's life-cycle or how the OTEL Collector works to know what additional challenges might be involved in the extension approach.
Maybe someone from one of the following projects can provide recommendations.
https://github.com/open-telemetry/opentelemetry-lambda-extension
https://github.com/aws-observability/aws-otel-lambda
from aws-otel-community.