Output Exporter from a task

Note: This feature is primarily designed to help you process a task's output and update variables in your playbook execution, so that the updated values are used by any subsequent task that references those variables.

You can write a transformer function (the equivalent of a lambda function) that parses your task result and extracts relevant key-value pairs from it. These key-value pairs can then be injected into a playbook during its run as global variables.

Instructions for writing a transformer function:

  • The function must be named transform
  • The input is a dictionary (in this case, the task result)
  • The output of the function must be a dictionary
  • Any import statements should be written inside the function, at its start (see the skeleton below)
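Putting these rules together, a transformer has the following shape (a skeleton only; the payload keys depend on the task type):

def transform(payload):
  import json  # imports go at the start of the function

  # inspect the payload dictionary and pull out the values you need
  # ...

  return {'variable_name': 'value'}  # must return a dictionary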
For example, here's a transformer that extracts a playbook name from an API task result:

# payload is the api task result
def transform(payload):
  response_body = payload['api_response']['response_body']
  # pick the first workflow and its first playbook
  workflow = response_body['workflows'][0]
  playbook = workflow['playbooks'][0]

  return {'playbook_name': playbook['name']}

Here's the API task result JSON passed to the transform() function above:

{
  "type": "API_RESPONSE",
  "source": "API",
  "task_local_variable_set": {},
  "api_response": {
    "request_method": "POST",
    "request_url": "https://localhost/get",
    "response_status": "200",
    "response_headers": {
      "Vary": "Accept-Encoding, Accept, Cookie, Origin",
      "Content-Type": "application/json",
      "Transfer-Encoding": "chunked",
      "Connection": "keep-alive",
      "X-Content-Type-Options": "nosniff",
      "Allow": "POST, OPTIONS",
      "Referrer-Policy": "same-origin",
      "Content-Encoding": "gzip",
      "Cross-Origin-Opener-Policy": "same-origin"
    },
    "response_body": {
      "workflows": [
        {
          "type": "DYNAMIC_ALERT",
          "playbooks": [
            {
              "is_active": true,
              "name": "traffic_surge_dynamic_alert_use_case"
            }
          ],
          "is_active": true,
          "name": "traffic_surge_dynamic_alert_use_case_1"
        }
      ]
    }
  }
}
Running transform() on this payload produces the following output:

{
  "success": true,
  "output": {
    "$playbook_name": "traffic_surge_dynamic_alert_use_case"
  }
}
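Notice that each exported key is surfaced as a global variable with a $ prefix: playbook_name becomes $playbook_name. Before wiring a transformer into a playbook, you can exercise it locally against a captured task result. A minimal sketch, assuming the payload above has been saved to api_task_result.json (a hypothetical file name):

import json

def transform(payload):
  response_body = payload['api_response']['response_body']
  workflow = response_body['workflows'][0]
  playbook = workflow['playbooks'][0]
  return {'playbook_name': playbook['name']}

# Load a locally saved copy of the API task result shown above.
with open('api_task_result.json') as f:
  payload = json.load(f)

print(transform(payload))  # {'playbook_name': 'traffic_surge_dynamic_alert_use_case'}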

Let's take another example to show how context can be extracted from log sources like CloudWatch.

def transform(payload):
  import json
  
  # Take the first log row and scan its columns for the @message field.
  log_row = payload['logs']['rows'][0]
  log_row_columns = log_row['columns']
  log_value_dict = {}
  for c in log_row_columns:
    if c['name'] == '@message':
      # The @message value is itself a JSON string, so decode it.
      log_value = c['value']
      log_value_dict = json.loads(log_value)
  return {'pod_name': log_value_dict['kubernetes']['pod_name']}

Here's the Log task result JSON passed to the transform() function above:

{
  "type": "LOGS",
  "source": "CLOUDWATCH",
  "task_local_variable_set": {},
  "logs": {
    "rows": [
      {
        "columns": [
          {
            "name": "@timestamp",
            "value": "2024-08-12 10:36:57.958"
          },
          {
            "name": "@message",
            "value": "{\"time\":\"2024-08-12T10:36:57.958739717Z\",\"stream\":\"stdout\",\"_p\":\"F\",\"log\":\"Consumer{rdkafka-38ae90cc-7f9c-4165-9a50-7e075a34c8b2}: Polling on client\",\"kubernetes\":{\"pod_name\":\"raw-monitor-transactions-consumer-ingest-6849f7d948-xhc9l\",\"namespace_name\":\"deployment\",\"pod_id\":\"6db45769-e426-4140-9f44-14f4cda5994c\",\"host\":\"ip-172-16-58-66.us-west-2.compute.internal\",\"container_name\":\"ingest\",\"docker_id\":\"fe93dbad50e5d897b18075e49fe88d3f4cc6072f5edd2d39aff57ed5b9401e7f\",\"container_hash\":\"277357190350.dkr.ecr.us-west-2.amazonaws.com/prototype@sha256:cde1460af3cc6d6ef7e9304b4f562c1c1ab406a849981e884707705e84c50500\",\"container_image\":\"277357190350.dkr.ecr.us-west-2.amazonaws.com/prototype:v1.567\"}}"
          }
        ]
      }
    ]
  }
}
Running transform() on this payload produces the following output:

{
  "success": true,
  "output": {
    "$pod_name": "raw-monitor-transactions-consumer-ingest-6849f7d948-xhc9l"
  }
}
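Real log payloads are not always this tidy: the @message column may be missing, or its value may not be valid JSON. A more defensive sketch of the same transformer, assuming the CloudWatch payload shape shown above (the guard clauses are illustrative, not part of the platform's contract):

def transform(payload):
  import json

  for row in payload.get('logs', {}).get('rows', []):
    for column in row.get('columns', []):
      if column.get('name') != '@message':
        continue
      try:
        message = json.loads(column.get('value', ''))
      except (TypeError, ValueError):
        continue  # skip values that aren't valid JSON
      if not isinstance(message, dict):
        continue
      pod_name = message.get('kubernetes', {}).get('pod_name')
      if pod_name:
        return {'pod_name': pod_name}
  # Nothing matched; export no variables rather than raising.
  return {}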