Cron Job Integration

When a new data subject request is made, you can set up a "Cron Job" integration that will block the request from completing until a script is run to process it. Your script can list the requests that currently need to be processed, run the job to erase, opt out, or find data related to each user, and then POST back to the Transcend API to notify that the job has been completed for that request.

This integration is similar to the Server Webhook integration, except that your script uses our API to list active requests instead of receiving a webhook notification when new requests are made. Using the "Cron Job" integration, your team can run the script manually when it is most convenient, or put the script on a schedule to batch jobs at a time when the database is not actively being used.

We have examples on GitHub. There is a full working Python example and a Ruby example.

  1. Go to your Integrations to connect the "Cron Job" integration type.
  2. Give your integration a title (e.g. "Data Warehouse Script").
  3. Click Connect to create the integration.
  4. Navigate to this integration in your Integrations and select the "Manage Datapoints" tab to configure what types of requests your server should be notified about.
  5. Create a new API key for this integration and copy the key. This will be used to authenticate your requests to the Transcend API.
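As a rough sketch, your script might read the API key from an environment variable and send it as a Bearer token on every call. The base URL and header shown here are assumptions to verify against your environment:

import os

import requests

# The API key created in step 5, stored in an environment variable.
TRANSCEND_API_KEY = os.environ["TRANSCEND_API_KEY"]

# Assumed base URL for the Transcend API; adjust if your organization
# uses a different gateway.
API_BASE = "https://api.transcend.io"

# Authenticate every call with the API key as a Bearer token.
SESSION = requests.Session()
SESSION.headers.update({"Authorization": f"Bearer {TRANSCEND_API_KEY}"})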

Your script can list the active data subject requests that need to be processed.

See the API reference.

GET /v1/data-silo/<data-silo-id>/pending-requests/<action-type>

You can use offset-based pagination if needed. The default page size is 50.

GET /v1/data-silo/<data-silo-id>/pending-requests/<action-type>?limit=20&offset=50
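For illustration, building on the session above, a script could page through all pending requests for a given data silo like this. The data silo ID is taken from the sample response below, and the action-type value is illustrative; use the action types configured for your integration:

def list_pending_requests(data_silo_id, action_type, page_size=50):
    """Yield every pending request for one data silo and action type."""
    offset = 0
    while True:
        response = SESSION.get(
            f"{API_BASE}/v1/data-silo/{data_silo_id}/pending-requests/{action_type}",
            params={"limit": page_size, "offset": offset},
        )
        response.raise_for_status()
        items = response.json()["items"]
        if not items:
            # An empty page means there is nothing left to process.
            break
        yield from items
        offset += page_size


# Example: loop over every pending erasure request for this integration.
for item in list_pending_requests("88971fdc-5570-4f0e-9481-a9936df1fc09", "ERASURE"):
    print(item["requestId"], item["type"], item["identifier"])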

This route returns a JSON response:

{
  "items": [
    {
      "identifier": "test+attributes@transcend.io",
      "type": "email",
      "coreIdentifier": "test+attributes@transcend.io",
      "dataSiloId": "88971fdc-5570-4f0e-9481-a9936df1fc09",
      "requestId": "f41670b1-2439-4389-9943-4cfaa8553381",
      "requestCreatedAt": "2022-09-28T06:14:58.043Z",
      "nonce": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c",
      "daysUntilOverdue": 28,
      "attributes": [
        {
          "key": "client_handle",
          "values": ["acme"]
        }
      ]
    },
    {
      "identifier": "+123412345",
      "type": "phone",
      "coreIdentifier": "test+access@transcend.io",
      "dataSiloId": "88971fdc-5570-4f0e-9481-a9936df1fc09",
      "requestId": "ebb31ea5-0e9b-4aae-85f6-00a4527af137",
      "nonce": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lenp6enp6eiIsImlhdCI6MTUxNjIzOTAyMn0.shMmFXcM3VqXGZryrgoIWlY_7Rhbpk76CeFh98iP4M8",
      "requestCreatedAt": "2022-09-30T06:14:57.824Z",
      "daysUntilOverdue": 30,
      "attributes": []
    },
    {
      "identifier": "email",
      "type": "test+erasure@transcend.io",
      "coreIdentifier": "test+erasure@transcend.io",
      "dataSiloId": "88971fdc-5570-4f0e-9481-a9936df1fc09",
      "nonce": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c",
      "requestId": "6e093ca0-e73d-401d-9c00-2f051894533a",
      "requestCreatedAt": "2022-09-30T06:14:57.945Z",
      "daysUntilOverdue": 30,
      "attributes": []
    }
  ]
}

Here, identifier is the lookup key, type is the unique name of the identifier defined in your settings, and nonce is an ID you will need to pass back when you upload data.

Using items[*].type and items[*].identifier, implement the request against your database or internal system (see the sketch after this list). This part of the process is unique to your business. It may involve:

  • returning or removing rows from a database
  • returning or removing files from a filesystem
  • replacing fields containing personal data with anonymized placeholders
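As an example, a hypothetical erasure handler for the items returned above might look like the following. The database, table, and column names are placeholders for your own schema:

import sqlite3  # stand-in for whatever database or storage system you use


def process_erasure(item, connection):
    """Hypothetical handler: erase rows keyed on the request identifier.

    Map each identifier type (e.g. "email" or "phone") to the appropriate
    lookup column in your own schema.
    """
    if item["type"] == "email":
        connection.execute(
            "DELETE FROM users WHERE email = ?", (item["identifier"],)
        )
    elif item["type"] == "phone":
        connection.execute(
            "DELETE FROM users WHERE phone_number = ?", (item["identifier"],)
        )
    connection.commit()


connection = sqlite3.connect("example.db")  # placeholder database
for item in list_pending_requests("88971fdc-5570-4f0e-9481-a9936df1fc09", "ERASURE"):
    process_erasure(item, connection)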

Please consult with your Transcend account representative on recommendations or guidelines for this process.

Once your server has successfully completed the processing of the request, you must send a POST request to Transcend that indicates that processing has been completed. In the case of a Data Subject Access Request, this will also include uploading any data associated with the end user.

Please refer to this guide for information on responding to DSRs.
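The exact endpoint and payload are documented in the guide above. As a rough shape, the confirmation call sends the nonce from the listing response back to Transcend; the URL below is a placeholder, and sending the nonce as a header is an assumption to check against that guide:

def notify_request_completed(item):
    """Sketch only: tell Transcend that this request has been processed.

    Replace the placeholder URL with the endpoint from the DSR guide. For
    access requests, the body would also include the data found for the user.
    """
    response = SESSION.post(
        "<completion-endpoint-from-the-dsr-guide>",  # placeholder URL
        headers={"x-transcend-nonce": item["nonce"]},  # assumption: nonce sent as a header
        json={},
    )
    response.raise_for_status()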

Depending on the type of request and the region where the data subject resides, the law requires a response within a specified period of time, normally ranging from 2 weeks to 2 months. With this in mind, running your script at least once per week should normally be sufficient for any type of request.

We recommend running the script once per day, during the hours when the database is under the least load.
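For example, a crontab entry that runs the script once per day at 3:00 AM (an illustrative off-peak hour; the interpreter and script paths are hypothetical) could look like:

# Run the DSR processing script daily at 3:00 AM
0 3 * * * /usr/bin/python3 /opt/transcend/process_requests.py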