Cron Job integration

When a new Data Subject Request (DSR) is made, you can set up a "Cron Job" integration that blocks the request from completing until a script is run to process it. Your script can list the set of requests that currently need to be processed, run the job to erase, opt out, or find data related to that user, and then POST back to the Transcend API to notify it that the job has been completed for that request.

This integration is similar to the Server Webhook integration, except that your script uses our API to list active requests instead of receiving a webhook notification when new requests are made. Using the "Cron Job" integration, your team can run the script manually when it is most convenient, or put it on a schedule to batch jobs at a time when the database is not actively being used.

👍

Check out our examples

We have examples on GitHub, including a full working Python example.

What's involved

There are five steps to integrating your server with Transcend:

1. Create the data silo on Transcend
Using the Transcend Admin Dashboard, create a new data silo for your script.

2. List active requests using our API
Your script can list the active Data Subject Requests (DSRs) that need to be processed.

3. Look up and operate on user data
Your script will need to find the user specified by the API and perform an operation such as retrieving or deleting their personal data.

4. Notify the Transcend API of completion
Use our API to notify Transcend when the server has completed processing. For an access request, this means uploading data. For an erasure or opt out request, this means notifying Transcend that the job has been completed.

5. Hosting or running your script
Once you've verified that your script works, you will need to set up a reminder to run it manually, or host it on whatever cron job scheduler your team uses.

1. Create the data silo on Transcend

  1. Go to your Data Map to connect the "Cron Job" integration type.
  2. Give your silo a title (e.g. "Data Warehouse Script").
  3. Click Connect to create the data silo and store the newly created API key.
  4. Click "View in Data Map" and drop down the "Manage Datapoints" tab to configure what types of requests your server should be notified about.

2. List active requests using our API

GET <<yourOrganizationSombraURL>>/v1/data-silo/<data-silo-id>/pending-requests/<action-type>

You can use offset-based pagination if needed. The default page size is 50.

GET <<yourOrganizationSombraURL>>/v1/data-silo/<data-silo-id>/pending-requests/<action-type>?limit=20&offset=50

This route returns a JSON response:

{
    "items": [
        {
            "identifier": "[email protected]",
            "type": "email",
            "nonce": "eyJhbGciOiJIUzM4NCIsInR5cCI6IkpXVCJ9.eyJlbmNyeXB0ZWRDRUtDb250ZXh0IjoiZXlKaGJHY2lPaUpJVXpNNE5DSXNJblI1Y0NJNklrcFhWQ0o5LmV5SmxibU55ZVhCMFpXUkRSVXNpT2lKRVFuWXdZMGxSZWtOR09Hb3ZNbXRHTTJ4TE5XTnRZemxVYkZCRFJsbDRTRXA1Y2xCNmRHSnpPRUkwU21veWJVMXJhMjFxZFZFOVBTSXNJbkpsY1hWbGMzUkpaQ0k2SWprd01UQTNNMlJqTFRjeU1EY3ROR0ZqTUMwNE1EQTJMVEl4TVdJNU9HTTNPVEU1TWlJc0ltTnZjbVZKWkdWdWRHbG1hV1Z5SWpvaWRHVnpkRUIwY21GdWMyTmxibVF1YVc4aUxDSnBZWFFpT2pFMk1UazVNVEV5TnpNc0ltVjRjQ0k2TVRZek5UY3lNalEzTTMwLnFteExiRTZ5R3JrMjIyRjJITU4xT082VXFrZ2pnV2dNS2RBMGx0VXZHY18xVGxjQUt1RXFyZFlWdGpqUUdqX28iLCJyZXF1ZXN0SWQiOiI5MDEwNzNkYy03MjA3LTRhYzAtODAwNi0yMTFiOThjNzkxOTIiLCJkYXRhU2lsb0lkIjoiOTk5YjMwMjItNWFlZC00NjE1LTlmN2UtY2Q3ZGNlOTUwYWQ3IiwiaWF0IjoxNjIwMDY5MzkzLCJleHAiOjE2MzU4ODA1OTN9.ECNPZtemuPFkCqI0CqwZ2shK7UHk9hOzrc8uwQygDOkcB5o_HHNckJEqHarUXr3P"
        }
    ]
}

Here, identifier is the lookup key, type is the unique name of the identifier defined in your settings, and nonce is an ID you will need to pass back when you upload data.
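
As a rough sketch, the following Python script pages through the pending requests for a silo using the route above. The Sombra URL, data silo ID, API key, and the "ERASURE" action type are placeholders, and the Bearer-token Authorization header is an assumption; adjust these to match your organization's configuration.

import requests

SOMBRA_URL = "https://yourOrganizationSombraURL"  # placeholder: your organization's Sombra URL
DATA_SILO_ID = "<data-silo-id>"                   # placeholder: from the Admin Dashboard
API_KEY = "<api-key>"                             # the key saved when the silo was connected
PAGE_SIZE = 50                                    # the default page size

def list_pending_requests(action_type):
    """Yield every pending request for one action type (e.g. "ERASURE")."""
    offset = 0
    while True:
        response = requests.get(
            f"{SOMBRA_URL}/v1/data-silo/{DATA_SILO_ID}/pending-requests/{action_type}",
            headers={"Authorization": f"Bearer {API_KEY}"},  # assumed auth scheme
            params={"limit": PAGE_SIZE, "offset": offset},
        )
        response.raise_for_status()
        items = response.json()["items"]
        yield from items
        if len(items) < PAGE_SIZE:  # a short page means we've reached the end
            break
        offset += PAGE_SIZE

for item in list_pending_requests("ERASURE"):
    print(item["identifier"], item["type"])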

3. Look up and operate on user data

Using the items[*].type and items[*].identifier, implement the request type on your database or internal system. This part of the process will be unique to your business. It may involve:

  • returning or removing rows from a database
  • returning or removing files from a filesystem
  • replacing fields containing personal data with anonymized placeholders

Please consult with your Transcend account representative on recommendations or guidelines for this process.
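
As an illustration only, here is a minimal sketch of the erasure case against a SQLite database, assuming an email identifier; the database file, table names, and column names are all hypothetical.

import sqlite3

def erase_user(identifier, identifier_type):
    """Erase or anonymize one user; all table/column names here are hypothetical."""
    if identifier_type != "email":
        raise ValueError(f"No handler for identifier type: {identifier_type}")
    conn = sqlite3.connect("warehouse.db")  # placeholder database
    try:
        with conn:  # commits on success, rolls back on error
            # Hard-delete rows that are purely personal data
            conn.execute("DELETE FROM user_events WHERE email = ?", (identifier,))
            # Anonymize personal fields on rows that must be retained
            conn.execute(
                "UPDATE orders SET customer_name = 'REDACTED' WHERE email = ?",
                (identifier,),
            )
    finally:
        conn.close()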

4. Notify the Transcend API of completion

Once your server has successfully processed the request, you must send a POST request to Transcend indicating that processing is complete. In the case of a Data Subject Access Request, this also includes uploading any data associated with the end user.

Please refer to this guide for information on responding to DSRs.
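
As a hedged sketch only (the guide above is the source of truth for the exact route and payload), the example below assumes completion is reported by POSTing to /v1/data-silo with the nonce from step 2 passed in an x-transcend-nonce header.

import requests

SOMBRA_URL = "https://yourOrganizationSombraURL"  # placeholder
API_KEY = "<api-key>"                             # same key as in step 2

def notify_completed(nonce, profiles=None):
    # Assumption: completion is reported by POSTing to /v1/data-silo with the
    # request's nonce in an x-transcend-nonce header; confirm the exact route
    # and payload in the guide linked above.
    response = requests.post(
        f"{SOMBRA_URL}/v1/data-silo",
        headers={
            "Authorization": f"Bearer {API_KEY}",  # assumed auth scheme
            "x-transcend-nonce": nonce,            # nonce returned in step 2
        },
        # Access requests upload the user's data here; for erasure or opt-out
        # requests, an empty profile list signals that the job is done.
        json={"profiles": profiles or []},
    )
    response.raise_for_status()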

5. Hosting or running your script

Depending on the type of request and the region where the data subject resides, the law requires a response within a specified period of time, normally ranging from 2 weeks to 2 months. With this in mind, running your script at least once per week should normally be sufficient for any type of request.

Please consult with your Transcend account representative for recommendations or guidelines on this process. The standard recommendation is to run the script once per day during the hours when the database is under the least load.
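
For example, if your team uses cron, a crontab entry like the following (with hypothetical paths) runs the script every day at 03:00, typically a low-traffic window:

0 3 * * * /usr/bin/python3 /opt/transcend/process_dsr_requests.py >> /var/log/dsr.log 2>&1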