To get started with Engine for serverless environments, you will need to:

  1. Instrument your function to respond with a tracing extension that follows the Apollo Tracing format.
  2. Configure and deploy the Engine proxy Docker container as a standalone server. Because the Engine proxy is stateful, it must be deployed separately from your cloud function, not alongside it.
  3. Send requests to your service. You're all set up!

Note: The remainder of these docs is framed specifically for AWS Lambda, for which we have a special “origin” type, but other cloud functions are supported with the standard HTTP invocation. For non-AWS cloud functions, see the standalone docs for instructions on setting up the Engine proxy as a standalone API Gateway to your cloud function.

The Engine proxy will invoke the Lambda function as if it were called from API Gateway, and the function should return a value suitable for API Gateway.

We suggest using Node.js, but any runtime supported by Lambda can be used.

Supported Node servers: Apollo Server (Express, Hapi, Koa, Restify, and Lambda)

The only available option for running the Engine proxy with a function on Lambda is to run the proxy in a standalone Docker container. The proxy is required because it captures and aggregates the trace data from each Lambda instance’s GraphQL responses and sends it to the Engine service.

Enable Apollo Tracing

You will need to instrument your Lambda function with a tracing package that follows the Apollo Tracing format. Engine relies on receiving data in this format to create its performance telemetry reports.

For Node with Apollo Server, pass the tracing: true option to the Apollo Server GraphQL integration function, as shown in our Node setup instructions.
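Conceptually, enabling tracing wraps each resolver so that nanosecond timings are recorded and attached to the response in the Apollo Tracing shape. Apollo Server does this for you; the sketch below only illustrates the idea, and the names are ours, not Apollo Server internals.

```javascript
// Illustrative-only sketch of resolver timing in the Apollo Tracing shape.
function withTiming(resolvers) {
  const trace = {
    version: 1,
    startTime: new Date().toISOString(),
    endTime: null,
    duration: 0,
    execution: { resolvers: [] },
  };
  const start = process.hrtime.bigint(); // nanosecond clock
  const wrapped = {};
  for (const [name, fn] of Object.entries(resolvers)) {
    wrapped[name] = (...args) => {
      const t0 = process.hrtime.bigint();
      const result = fn(...args);
      // Record when this resolver started (relative to the request) and
      // how long it took, both in nanoseconds, per the tracing format.
      trace.execution.resolvers.push({
        path: [name],
        startOffset: Number(t0 - start),
        duration: Number(process.hrtime.bigint() - t0),
      });
      return result;
    };
  }
  const finish = () => {
    trace.endTime = new Date().toISOString();
    trace.duration = Number(process.hrtime.bigint() - start);
    return trace;
  };
  return { wrapped, finish };
}
```

The resulting object is what ends up under extensions.tracing in each response, and it is this data that the proxy extracts and forwards to Engine.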

Configure the Proxy

Get your API Key

First, get your Engine API key by creating a service in the Engine UI. You will need to log in and click “Add Service” to receive your API key.

Create your Proxy’s Config.json

The proxy uses a JSON object to get configuration information.

Create a JSON configuration file:

  "apiKey": "<ENGINE_API_KEY>",
  "logging": {
    "level": "INFO"
  "origins": [
      "lambda": {
  "frontends": [
      "host": "",
      "port": 3001,
      "endpoints": ["/graphql"]

Configuration options:

  1. apiKey : The API key for the Engine service you want to report data to.
  2. logging.level : Logging level for the proxy. Supported values are DEBUG, INFO, WARN, ERROR.
  3. origin.lambda.functionArn : The Lambda function to invoke, in the form: arn:aws:lambda:xxxxxxxxxxx:xxxxxxxxxxxx:function:xxxxxxxxxxxxxxxxxxx
  4. origin.lambda.awsAccessKeyId : Your AWS Access Key ID. If not provided, the proxy will fall back to the AWS_ACCESS_KEY_ID/AWS_SECRET_KEY environment variables, and then to the EC2 instance profile.
  5. origin.lambda.awsSecretAccessKey : Your Secret Access Key.
 : The hostname the proxy should be available on. For Docker, this should always be
  7. frontend.port : The port the proxy should bind to.
  8. frontend.endpoints : The endpoints on which your app serves GraphQL. This defaults to ["/graphql"].

For full configuration details see Proxy config.
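Before launching the container, it can help to sanity-check the config for the fields described above. The helper below is our own sketch, not part of Engine, and it only covers the options listed in this guide.

```javascript
// Hypothetical sanity check (not part of Engine) for a proxy config object,
// covering only the options described in this guide.
function checkEngineConfig(config) {
  const problems = [];
  if (!config.apiKey) problems.push("apiKey is required");
  for (const origin of config.origins || []) {
    if (origin.lambda && !origin.lambda.functionArn) {
      problems.push("origin.lambda.functionArn is required");
    }
  }
  for (const frontend of config.frontends || []) {
    if (typeof frontend.port !== "number") {
      problems.push("frontend.port must be a number");
    }
  }
  return problems; // empty array means no problems found
}
```

You could run this over the parsed contents of your config file before passing it to the container.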

Run the Proxy (Docker Container)

The Engine proxy is a Docker image that you will deploy and manage separately from your server.

If you have a working Docker installation, run the following in your shell (with the variables replaced with the correct values for your environment; engine_docker_image stands in for the Engine proxy image you are using) to run the Engine proxy:

docker run --env "ENGINE_CONFIG=$(cat "${engine_config_path}")" \
  -p "${proxy_frontend_port}:${proxy_frontend_port}" \
  "${engine_docker_image}"
This will make the Engine proxy available at http://localhost:3001.

It does not matter where you choose to deploy and manage your Engine proxy. We run our own on Amazon’s EC2 Container Service.

We recognize that almost every team using Engine has a slightly different deployment environment, and encourage you to contact us with feedback or for help if you encounter problems running the Engine proxy.

View Metrics in Engine

Once your server is set up, navigate to your new Engine service in the Engine UI. Start sending requests to your Node server to start seeing performance metrics!
