Data Export to S3

Easily integrate OpsLevel data into your data warehouse to create your own reports.

Data Export works by regularly exporting full and incremental snapshots of your OpsLevel account data to an AWS S3 bucket of your choice. Data is exported as one .csv.gz file per table.

Setup

1. First email us at [email protected]. We will provide you with the following details:

  • Our AWS user (%OPSLEVEL-USER%)
  • Our AWS account ID (%OPSLEVEL-ACCOUNT-ID%)
  • A processing bucket name (%PROCESSING-BUCKET%)
  • An external ID (%EXTERNAL-ID%)

2. Log in to your AWS account.

3. Create a new S3 bucket where you want to receive files (%DESTINATION-BUCKET%).

4. Create a role in your account for OpsLevel to assume (%CUSTOMER-ROLE-TO-ASSUME%).

5. Add this policy to the new role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::%DESTINATION-BUCKET%/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::%PROCESSING-BUCKET%/*"
      ]
    }
  ]
}

6. Add this trust relationship to the new role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::%OPSLEVEL-ACCOUNT-ID%:user/%OPSLEVEL-USER%"
      },
      "Condition": {"StringEquals": {"sts:ExternalId": "%EXTERNAL-ID%"}},
      "Action": "sts:AssumeRole"
    }
  ]
}

7. You will then need to provide us with:

  • Your AWS account ID
  • Destination bucket name (%DESTINATION-BUCKET%)
  • The name of the role you created for OpsLevel to assume (%CUSTOMER-ROLE-TO-ASSUME%)

For a more complete explanation of these steps, you can read the AWS Guide.
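
If you prefer to script steps 3 through 6, the following is a minimal sketch using boto3. It assumes you substitute the %...% values from step 1; every name below is a placeholder, and bucket creation outside us-east-1 needs a LocationConstraint.

# A minimal sketch of steps 3-6 using boto3. The names below stand in for the
# %...% values described above; substitute your real values before running.
import json

import boto3

DESTINATION_BUCKET = "my-opslevel-exports"        # %DESTINATION-BUCKET%
PROCESSING_BUCKET = "opslevel-processing-bucket"  # %PROCESSING-BUCKET% (provided by OpsLevel)
OPSLEVEL_ACCOUNT_ID = "111111111111"              # %OPSLEVEL-ACCOUNT-ID% (provided by OpsLevel)
OPSLEVEL_USER = "opslevel-export"                 # %OPSLEVEL-USER% (provided by OpsLevel)
EXTERNAL_ID = "example-external-id"               # %EXTERNAL-ID% (provided by OpsLevel)
ROLE_NAME = "opslevel-data-export"                # %CUSTOMER-ROLE-TO-ASSUME%

s3 = boto3.client("s3")
iam = boto3.client("iam")

# Step 3: the bucket that will receive the exported files.
# (Outside us-east-1, pass CreateBucketConfiguration={"LocationConstraint": "<region>"}.)
s3.create_bucket(Bucket=DESTINATION_BUCKET)

# Steps 4 and 6: the role OpsLevel will assume, created with the trust relationship above.
trust_policy = {
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"AWS": f"arn:aws:iam::{OPSLEVEL_ACCOUNT_ID}:user/{OPSLEVEL_USER}"},
      "Condition": {"StringEquals": {"sts:ExternalId": EXTERNAL_ID}},
      "Action": "sts:AssumeRole"
    }
  ]
}
iam.create_role(RoleName=ROLE_NAME, AssumeRolePolicyDocument=json.dumps(trust_policy))

# Step 5: the access policy, attached inline to the role.
access_policy = {
  "Version": "2012-10-17",
  "Statement": [
    {"Effect": "Allow", "Action": ["s3:PutObject"], "Resource": [f"arn:aws:s3:::{DESTINATION_BUCKET}/*"]},
    {"Effect": "Allow", "Action": ["s3:GetObject"], "Resource": [f"arn:aws:s3:::{PROCESSING_BUCKET}/*"]}
  ]
}
iam.put_role_policy(RoleName=ROLE_NAME, PolicyName="opslevel-data-export", PolicyDocument=json.dumps(access_policy))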

Push Types

There are two types of pushes you can expect: full and incremental. Full pushes are a complete dump of all the data associated with your account. Incremental pushes contain only the changes since the last push.

Full pushes happen weekly; incremental pushes happen daily.

After ingesting the initial full dump, you should be able to keep your system up to date by ingesting only the incremental files. However, we recommend refreshing from a full dump at a regular interval.

For incremental pushes, a deleted record is indicated by a row containing its id, with the deleted_at column set to the time the record was deleted. All other columns are blank for the deleted entry.
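
As an illustration of how incrementals can be consumed, here is a minimal sketch that folds one incremental file into an in-memory copy of a table. It only assumes the conventions above (an id column and blank columns on deleted rows); the file path is hypothetical.

# A minimal sketch of applying one incremental .csv.gz file to an in-memory copy of a table.
# Rows with deleted_at set are removed; all other rows are upserted by id.
import csv
import gzip

def apply_incremental(path, table):
    with gzip.open(path, mode="rt", newline="") as f:
        for row in csv.DictReader(f):
            if row.get("deleted_at"):
                table.pop(row["id"], None)  # deleted record: only id and deleted_at are populated
            else:
                table[row["id"]] = row      # new or updated record

# Start from the weekly full dump, then apply each daily incremental in order.
services = {}
apply_incremental("services.csv.gz", services)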

Schema

We upload files into the destination bucket with a path of the form: /v1/:push_type/:table_name/:date/:timestamp/

For example: /v1/full/services/2020-07-14/22:00:00/services.csv.gz
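
As a rough sketch, you could discover new files by listing the destination bucket under the v1/ prefix and splitting each key on "/". The bucket name below is a placeholder, and the leading slash is stripped in case keys are written with one.

# A minimal sketch of listing exported files and parsing the path components.
import boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

for page in paginator.paginate(Bucket="my-opslevel-exports", Prefix="v1/"):
    for obj in page.get("Contents", []):
        # e.g. v1/full/services/2020-07-14/22:00:00/services.csv.gz
        _, push_type, table_name, date, timestamp, filename = obj["Key"].lstrip("/").split("/")
        print(push_type, table_name, date, timestamp, filename)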

Examples

Here are some examples of the files you will receive.

services.csv

id,name,alias,product,description,owner_id,language,framework,tier_index,lifecycle_index,html_url,created_at,updated_at,deleted_at
83,Transaction Processor,transaction_processor,NULL,NULL,5,Python,Flask,NULL,NULL,https://app.opslevel.com/services/transaction_processor,2020-02-19 13:57:11,2020-02-19 13:57:11,
84,Reconciliator,reconciliator,Payment,Responsible for reconciling various transactions,5,Ruby,Rails,1,NULL,https://app.opslevel.com/services/reconciliator,2020-02-19 13:57:42,2020-02-19 13:57:42,
85,,,,,,,,,,,,,2020-02-19 13:57:42

check_results.csv

id,status,message,check_id,service_id,is_applied,created_at,updated_at,deleted_at
1,passed,Success,1,1,1,2018-11-16 22:02:09,2018-11-16 22:02:09,
2,failed,The service does not have a logs tool.,1,1,0,2018-11-16 22:02:44,2018-11-16 22:02:44,

teams.csv

id,name,alias,manager_id,responsibilities,html_url,created_at,updated_at,deleted_at
8,Discovery Team,discovery_team,NULL,NULL,https://app.opslevel.com/teams/discovery_team,2018-10-05 21:05:25,2019-03-12 18:20:52,
11,Support,support,12,Supporting our product,https://app.opslevel.com/teams/support,2018-11-06 14:58:05,2019-03-12 18:20:52,

There are many more tables than shown here; essentially all of the data in your OpsLevel account is exported.
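
For a quick local look at one of these files, a sketch using pandas (the path is a local copy of the example file from the Schema section; pandas handles the gzip compression and treats the literal string NULL as a missing value by default):

# A minimal sketch of loading one exported table for ad-hoc inspection.
import pandas as pd

services = pd.read_csv("v1/full/services/2020-07-14/22:00:00/services.csv.gz")
print(services[["id", "name", "owner_id", "tier_index", "deleted_at"]].head())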

Next Steps

Now that you will be receiving regular data exports from OpsLevel, you will need to configure your data pipeline to ingest them. We get the data to you; how you use it is up to you.

With our Data Export feature, you can integrate OpsLevel’s service ownership and operational maturity data into your data warehouse. This lets you create fully customized reports that join OpsLevel data against other data in your business, such as support tickets, security incidents, or costs.