Responding to a Data Subject Request

Send your server's processed DSR response to Transcend.

After you have set up your Server Webhook Integration or a Cron Job Integration to receive notifications for new data subject requests, you need to POST back to our API to indicate that the request was successfully completed. For most requests, this is a simple notification. In the case of a Data Subject Access Request (DSAR), a request by the data subject to access their personal data from your organization, you should respond with any data tied to that particular user.
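To give a sense of the overall flow, here is a minimal sketch of a webhook handler. Express and the processRequest helper are assumptions for illustration; the POST back to our API is covered in detail below.

const express = require('express');
const app = express();

app.use(express.json());

// A minimal sketch, assuming an Express server and a hypothetical
// `processRequest` helper that gathers or erases the user's data.
app.post('/transcend-webhook', (req, res) => {
  // Keep the nonce: you must echo it back when you POST the results
  const nonce = req.headers['x-transcend-nonce'];

  // Acknowledge the webhook right away
  res.sendStatus(200);

  // Process asynchronously, then POST the results back to Sombra
  processRequest(req.body, nonce).catch((err) => console.error(err));
});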

End-to-End Encryption

In an access request, the data from your systems gets bundled by Transcend for the end user to download. The data is encrypted before it arrives at Transcend by our encryption gateway, called Sombra. You don't need to know more than this, but if you're curious about how Sombra works, you can read more here: Sombra (End-to-End Encryption)

API domain

Rather than integrating with an endpoint like api.transcend.io, you'll be using a URL specific to your organization. If you're hosting Sombra on-premise, the URL is unique to your company; you can find it here. Otherwise, you will send requests to <<yourOrganizationSombraURL>>.

Uploading data for a Data Subject Access Request

For data you control in your databases, we expect your server to POST the data back to your Sombra gateway; that data will be included in the data download report sent back to the data subject.

A data silo contains a set of datapoints, which are categories of data on your server. You need to upload one file per datapoint, or report that datapoint as not found. You can upload to many datapoints at once with a JSON Bulk Upload, or you can upload to a single datapoint with a Single File Upload. JSON Bulk Upload only works for JSON data, whereas Single File Upload works for any file type such as an image or CSV.

JSON Bulk Upload

Endpoint reference: Respond to an access request (JSON bulk upload)

You can upload to many datapoints at once when the data uploaded to each datapoint is valid JSON.

POST <<yourOrganizationSombraURL>>/v1/data-silo

JSON Body Fields

| Parameter | Type | Description |
| --- | --- | --- |
| profiles | Array | A list of JSON objects with keys profileId and profileData. Your server may find one profile for the data subject, several, or none at all, so the profiles array can have zero, one, or many entries. |
| profiles[i].profileId | String | A unique identifier for this profile. This should be the same as extras.profile.identifier from the webhook body. |
| profiles[i].profileData | Object | An arbitrary JSON object with the data payload. The keys of the JSON should match the datapoint keys you defined for this data silo. |
| status (optional) | "READY" | Overrides the silo to a "ready" state. This is also set automatically once all datapoints have been reported. |

Request Headers

| Header | Value |
| --- | --- |
| authorization | "Bearer DATA_SILO_API_KEY" |
| x-sombra-authorization (on-premise Sombra only) | "Bearer SOMBRA_API_KEY" |
| x-transcend-nonce | The nonce from the original webhook, found in the webhook request header x-transcend-nonce |
| content-type | "application/json" |

Reporting values not found

For a data silo to be considered "ready" for the user to download, Transcend expects each datapoint of the data silo to receive data, or otherwise be reported as having no data found.

In a JSON Bulk Upload, you can indicate that a datapoint was not found by returning null, undefined, [], or {} (a Python client's None serializes to null). See Example A below.

You can also override the silo to be "ready" by passing the status: "READY" option. See Example B below.

const request = require('request');

/***
 For a data silo with four datapoints:
 - name
 - score
 - interests
 - resume
 ***/

// Example A: Post all data including null values for non-existing fields
request.post('<<yourOrganizationSombraURL>>/v1/data-silo', {
  headers: {
    'authorization': 'Bearer <<dataSiloApiKey>>',
    'x-sombra-authorization': 'Bearer <<sombraApiKey>>',
    'x-transcend-nonce': 'NONCE_FROM_WEBHOOK_REQUEST_HEADERS',
    'content-type': 'application/json'
  },
  body: {
    'profiles': [{
      'profileId': 'ben.farrell', // this should be the same as extras.profile.identifier from the webhook body
      'profileData': {
        'name': 'Ben Farrell', // these top-level keys are registered datapoints
        'score': 3.8,
        'interests': 'Privacy Tech',
        'resume': null,
      }
    }]
  },
  json: true
});

// Example B: set `status: 'READY'` to report the remaining datapoints as having no data found
request.post('<<yourOrganizationSombraURL>>/v1/data-silo', {
  headers: {
    'authorization': 'Bearer <<dataSiloApiKey>>',
    'x-sombra-authorization': 'Bearer <<sombraApiKey>>',
    'x-transcend-nonce': 'NONCE_FROM_WEBHOOK_REQUEST_HEADERS',
    'content-type': 'application/json'
  },
  body: {
    'profiles': [{
      'profileId': 'ben.farrell', // this should be the same as extras.profile.identifier from the webhook body
      'profileData': {
        'name': 'Ben Farrell',
        'score': 3.8,
        'interests': 'Privacy Tech', // resume is missing
      }
    }],
    'status': 'READY' // request status is overridden as ready
  },
  json: true
});
curl -X POST <<yourOrganizationSombraURL>>/v1/data-silo \
 -H "authorization: Bearer <<dataSiloApiKey>>" \
 -H "x-sombra-authorization: Bearer <<sombraApiKey>>" \
 -H "x-transcend-nonce: NONCE_FROM_WEBHOOK_REQUEST_HEADERS" \
 -H "content-type: application/json" \
 -d '{"profiles": [{"profileId": "ben.farrell", "profileData": {"name": "Ben Farrell", "score": 3.8}}]}'

📘

Be sure to authenticate your request

Read more about authenticating with the Transcend API.

Single File Upload

Endpoint reference: Respond to an access request (file upload)

You can also upload a file to a specific datapoint for a given profile.

POST <<yourOrganizationSombraURL>>/v1/datapoint

Request Headers

| Header | Value |
| --- | --- |
| authorization | "Bearer DATA_SILO_API_KEY" |
| x-sombra-authorization (on-premise Sombra only) | "Bearer SOMBRA_API_KEY" |
| x-transcend-nonce | The nonce from the original webhook, found in the webhook request header x-transcend-nonce |
| x-transcend-datapoint-name | The datapoint to upload this file to (e.g. profile_picture). This should match a datapoint key you defined for this data silo |
| x-transcend-profile-id | A unique identifier for this profile. This should be the same as extras.profile.identifier from the webhook body. |

const request = require('request');
const { createReadStream } = require('fs'); 

const transcendRequest = request({
  url: '<<yourOrganizationSombraURL>>/v1/datapoint',
  method: 'POST',
  headers: {
    'authorization': 'Bearer <<dataSiloApiKey>>',
    'x-sombra-authorization': 'Bearer <<sombraApiKey>>',
    'x-transcend-nonce': 'NONCE_FROM_WEBHOOK_REQUEST_HEADERS',
    'x-transcend-datapoint-name': 'profile_picture',
    'x-transcend-profile-id': 'ben_farrell' // this should be the same as extras.profile.identifier from the webhook body
  }
});

// Option 1: stream from a local file
createReadStream('path/to/profile_picture.jpg').pipe(transcendRequest);

// Option 2: stream from Amazon S3
const AWS = require('aws-sdk');

const s3 = new AWS.S3();
s3
    .getObject({ Bucket: 'my-bucket', Key: 'file-key' })
    .createReadStream()
    .on('error', (err) => console.error(err))
    .pipe(transcendRequest);
curl -X POST <<yourOrganizationSombraURL>>/v1/datapoint \
 -H "authorization: Bearer <<dataSiloApiKey>>" \
 -H "x-sombra-authorization: Bearer <<sombraApiKey>>" \
 -H "x-transcend-nonce: NONCE_FROM_WEBHOOK_REQUEST_HEADERS" \
 -H "x-transcend-datapoint-name: cool_video" \
 -H "x-transcend-profile-id: ben_farrell" \
 -H "content-type: video/mp4" \
 --data-binary @frag_bunny.mp4

Mixed Example

In the example below, we upload a combination of table data from a database and files from a storage system like S3. In our scenario:

  1. A database column, profile_picture, contains the key of an image file stored on Amazon S3
  2. Our database query does not return object keys where data is not found, so some datapoints are not reported to Transcend (see: Reporting Values Not Found). We set the explicit 'status': 'READY' flag to indicate the unreported datapoints have no data found.
// HTTP request library
const request = require('request-promise');

// Amazon SDK used to connect to S3
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Your custom db implementation
const db = require('./db');

// Store your secrets somewhere secure
const TRANSCEND_AUTH = {
  'authorization': 'Bearer <<dataSiloApiKey>>',
  'x-sombra-authorization': 'Bearer <<sombraApiKey>>',
};

/**
 * Resolve the datapoints for a given user
 *
 * @param profileIdentifier - The id of the user
 * @param nonce - The nonce we send you in the webhook, used to map your response back to the datapoint metadata
 * @returns A promise that resolves the access request back to Transcend
 */
async function performAccessRequest(profileIdentifier, nonce) {
  // Lookup the user in the db
  const user = db.get('user').byId(profileIdentifier);

  // Split off profile picture from remaining data
  const { profile_picture, ...remainingUser } = user;
         
  // First we post the profile picture jpeg image
  await s3
    .getObject({ Bucket: 'my-bucket', Key: profile_picture })
    .createReadStream()
    .on('error', (err) => console.error(err))
    .pipe(request({
      url: '<<yourOrganizationSombraURL>>/v1/datapoint',
      method: 'POST',
      headers: {
        ...TRANSCEND_AUTH,
        'x-transcend-nonce': nonce,
        'x-transcend-datapoint-name': 'profile_picture',
        'x-transcend-profile-id': profileIdentifier,
        'content-type': 'image/jpeg', // indicate the file type
      }
    }));
 
  // Only after the first request completes do we send the remaining
  await request.post('<<yourOrganizationSombraURL>>/v1/data-silo', {
    headers: {
      ...TRANSCEND_AUTH,
      'x-transcend-nonce': nonce,
      'content-type': 'application/json'
    },
    body: {
      'profiles': [{
        'profileId': profileIdentifier, // same as extras.profile.identifier from webhook body 
        'profileData': remainingUser
      }],
      'status': 'READY' // Our db does not return undefined values so we explicitly indicate completion
    },
    json: true
  });
}

Adding New Datapoints

At some point, the data you collect may change and you'll want to add a new datapoint so you can upload new types of files to the download report. The easiest way to do this is to include a new unique key in the "profileData" object (see: JSON Bulk Upload), as sketched below. We will automatically detect when your data silo starts responding with new data types, and prompt the maintainer of that data silo to add a description and categorization of that datapoint, as well as make the final call as to whether that datapoint should be shown in the download report.
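As a minimal sketch, suppose the silo already reports name and score, and you start collecting a favorite_color field (a hypothetical key used only for illustration). Including the new key in your next bulk upload is all that is needed for Transcend to detect it:

const request = require('request');

request.post('<<yourOrganizationSombraURL>>/v1/data-silo', {
  headers: {
    'authorization': 'Bearer <<dataSiloApiKey>>',
    'x-sombra-authorization': 'Bearer <<sombraApiKey>>',
    'x-transcend-nonce': 'NONCE_FROM_WEBHOOK_REQUEST_HEADERS',
    'content-type': 'application/json'
  },
  body: {
    'profiles': [{
      'profileId': 'ben.farrell',
      'profileData': {
        'name': 'Ben Farrell',
        'score': 3.8,
        'favorite_color': 'green' // new, previously unregistered datapoint key
      }
    }]
  },
  json: true
});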

Fulfill an Erasure Request (DSER) or Opt Out Request

Endpoint reference: Respond to an erasure request

When a DSER or opt-out request is received, Transcend sends your server a webhook notification, and your server then deletes data or opts the data subject out in your own databases. When your server is done, it should notify Transcend about which profiles were erased or modified.

PUT <<yourOrganizationSombraURL>>/v1/data-silo

JSON Body Fields

| Parameter | Type | Description |
| --- | --- | --- |
| profiles | Array | A list of JSON objects, each with the key profileId |
| profiles[i].profileId | String | A unique identifier for this profile. This should be the same as extras.profile.identifier from the webhook body. |

Request Headers

| Header | Value |
| --- | --- |
| authorization | "Bearer DATA_SILO_API_KEY" |
| x-sombra-authorization (on-premise Sombra only) | "Bearer SOMBRA_API_KEY" |
| x-transcend-nonce | The nonce from the original webhook, found in the webhook request header x-transcend-nonce |
| content-type | "application/json" |

const request = require('request');

request.put('<<yourOrganizationSombraURL>>/v1/data-silo', {
  headers: {
    'authorization': 'Bearer <<dataSiloApiKey>>',
    'x-sombra-authorization': 'Bearer <<sombraApiKey>>',
    'x-transcend-nonce': 'NONCE_FROM_WEBHOOK_REQUEST_HEADERS',
    'content-type': 'application/json'
  },
  body: {
    profiles: [{ // Profiles without the data
      profileId: 'ben.farrell', // this should be the same as extras.profile.identifier from the webhook
    }]
  },
  json: true
});

Additional Status Codes

🚧

Handling Duplicate Erasure Requests

Occasionally, there may be a reason that a webhook for an ERASURE request has to be re-attempted. Transcend automatically re-sends webhooks every day until your server reports back a response. When an ERASURE webhook is re-sent after you have already erased that person's account, you may no longer be able to look them up by their coreIdentifier. In that case, it is important that your server responds to the webhook with HTTP status code 204, indicating that no user was found and thus no erasure needs to be performed again. Note that you can respond to the webhook directly with a 204 instead of a 200; you do not need to follow up with another request to the API.
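Below is a minimal sketch of this behavior in a webhook handler. Express and the lookupUser helper are assumptions for illustration, not part of the Transcend API.

const express = require('express');
const app = express();

app.use(express.json());

// A minimal sketch, assuming an Express server and a hypothetical
// `lookupUser` helper that returns null when no account exists.
app.post('/transcend-webhook', async (req, res) => {
  const user = await lookupUser(req.body.coreIdentifier);

  if (!user) {
    // Already erased (e.g. a re-sent ERASURE webhook): a 204 tells
    // Transcend no user was found, so no erasure needs to be repeated.
    return res.sendStatus(204);
  }

  // Otherwise acknowledge the webhook, erase the user's data, and
  // PUT back to <<yourOrganizationSombraURL>>/v1/data-silo when done.
  res.sendStatus(200);
});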