Lambda Auto-Instrumentation

Automatically instrument your Lambdas with OpenTelemetry

The OpenTelemetry community provides standalone instrumentation Lambda layers for the following languages:

  • Java
  • JavaScript
  • Python

These can be added to your Lambda function through the AWS console to automatically instrument your application. These layers do not include the Collector, which is required unless you configure an external Collector instance to receive your data.

Add the ARN of the OTel Collector Lambda layer

See the Collector Lambda layer guidance to add the layer to your application and configure the Collector. We recommend adding this layer first.

Language Requirements

Java

The Lambda layer supports the Java 8, 11, and 17 (Corretto) Lambda runtimes. For more information about supported Java versions, see the OpenTelemetry Java documentation.

Note: The Java auto-instrumentation agent is included in the Lambda layer. Automatic instrumentation has a notable impact on startup time on AWS Lambda, so you will generally need to use it together with provisioned concurrency and warmup requests to serve production traffic without causing timeouts on initial requests while the agent initializes.

By default, the OTel Java agent in the Layer will try to auto-instrument all the code in your application. This can have a negative impact on the Lambda cold startup time.

We recommend that you only enable auto-instrumentation for the libraries/frameworks that are used by your application.

To enable only specific instrumentations, you can use the following environment variables:

  • OTEL_INSTRUMENTATION_COMMON_DEFAULT_ENABLED: when set to false, disables auto-instrumentation in the Layer, requiring each instrumentation to be enabled individually.

  • OTEL_INSTRUMENTATION_<NAME>_ENABLED: set to true to enable auto-instrumentation for a specific library or framework. Replace <NAME> with the name of the instrumentation that you want to enable. For the list of available instrumentations, see Suppressing specific agent instrumentation.

For example, to only enable auto-instrumentation for Lambda and the AWS SDK, you would set the following environment variables:

OTEL_INSTRUMENTATION_COMMON_DEFAULT_ENABLED=false
OTEL_INSTRUMENTATION_AWS_LAMBDA_ENABLED=true
OTEL_INSTRUMENTATION_AWS_SDK_ENABLED=true
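If you manage function configuration with the AWS CLI, these variables could be set as follows. This is a sketch: my-function is a placeholder name, and note that update-function-configuration replaces the function's entire environment variable map, so include any variables you already use.

```shell
# Sketch: enable only the Lambda and AWS SDK instrumentations.
# "my-function" is a placeholder. This call replaces the whole
# environment map, so merge in any existing variables as well.
aws lambda update-function-configuration \
  --function-name my-function \
  --environment "Variables={OTEL_INSTRUMENTATION_COMMON_DEFAULT_ENABLED=false,OTEL_INSTRUMENTATION_AWS_LAMBDA_ENABLED=true,OTEL_INSTRUMENTATION_AWS_SDK_ENABLED=true}"
```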

JavaScript

The Lambda layer supports Node.js v14+ Lambda runtimes. For more information about supported JavaScript and Node.js versions, see the OpenTelemetry JavaScript documentation.

Python

The Lambda layer supports the Python 3.8 and Python 3.9 Lambda runtimes. For more information about supported Python versions, see the OpenTelemetry Python documentation and the package on PyPI.

Configure AWS_LAMBDA_EXEC_WRAPPER

Change the entry point of your application by setting AWS_LAMBDA_EXEC_WRAPPER=/opt/otel-handler for Node.js or Java, and AWS_LAMBDA_EXEC_WRAPPER=/opt/otel-instrument for Python. These wrapper scripts invoke your Lambda application with the auto-instrumentation package applied.
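As a sketch, the wrapper could be configured via the AWS CLI. The function name is a placeholder, and this call replaces the function's entire environment variable map, so include any existing variables.

```shell
# Sketch: set the wrapper for a Python function ("my-function" is
# a placeholder). For Node.js or Java, use /opt/otel-handler.
# Note: this replaces the function's entire environment map.
aws lambda update-function-configuration \
  --function-name my-function \
  --environment "Variables={AWS_LAMBDA_EXEC_WRAPPER=/opt/otel-instrument}"
```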

Add the ARN of Instrumentation Lambda Layer

To enable the OTel auto-instrumentation in your Lambda function, you need to add and configure the instrumentation and Collector layers, and then enable tracing.

  1. Open the Lambda function you intend to instrument in the AWS console.
  2. In the Layers in Designer section, choose Add a layer.
  3. Under Specify an ARN, paste the layer ARN, and then choose Add.

Find the most recent instrumentation layer release for your language and use its ARN, changing the <region> tag to the region your Lambda is in.

Note: Lambda layers are a regionalized resource, meaning that they can only be used in the Region in which they are published. Make sure to use the layer in the same region as your Lambda functions. The community publishes layers in all available regions.
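The layer can also be attached with the AWS CLI instead of the console. The ARN below is a placeholder, not a real release; substitute the ARN of the layer release you looked up, and list the Collector layer too if you use one, since the --layers option replaces the function's full layer list.

```shell
# Sketch: attach the instrumentation layer by ARN. The ARN shown is
# a placeholder; use the real ARN for your language, version, and
# region. --layers replaces the complete list of attached layers,
# so list every layer the function needs (e.g. the Collector layer).
aws lambda update-function-configuration \
  --function-name my-function \
  --layers arn:aws:lambda:<region>:<account-id>:layer:<layer-name>:<version>
```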

Configure your SDK exporters

The default exporters used by the Lambda layers work without any changes when an embedded Collector with gRPC/HTTP receivers is present; the environment variables do not need to be updated. However, protocol support and default values vary by language, as documented below.

Java

OTEL_EXPORTER_OTLP_PROTOCOL=grpc (supports grpc, http/protobuf, and http/json)
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317

JavaScript

The OTEL_EXPORTER_OTLP_PROTOCOL environment variable is not supported; the hard-coded exporter uses the http/protobuf protocol.
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318

Python

OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf (supports http/protobuf and http/json)
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318

Publish your Lambda

Publish a new version of your Lambda to deploy the new changes and instrumentation.
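For example, with the AWS CLI (the function name is a placeholder):

```shell
# Sketch: publish a new immutable version of the updated function
# so the layer and environment changes take effect for consumers
# that invoke by version or alias.
aws lambda publish-version --function-name my-function
```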