Respond to a privacy request

After you have set up your Server Webhook Integration or Cron Job Integration to receive notifications for new data subject requests, you need to POST back to our API to indicate that the request was successfully completed. For most requests, this is a simple notification. In the case of an access request, you should respond with any data tied to that particular user.


In an access request, the data from your systems is bundled by Transcend for the end user to download. The data is encrypted before arriving at Transcend by our encryption gateway, called Sombra. You don't need to know more than this, but if you're curious about how Sombra works, you can read more here: Sombra (End-to-End Encryption).


Rather than integrating with an endpoint like api.transcend.io, you'll use a URL specific to your organization. If you're self-hosting Sombra, the URL is unique to your company, and you can find it here. Otherwise, you will send requests to {{yourOrganizationSombraURL}}.


For data you control in your databases, we expect your server to POST data back to your Sombra gateway, and that data will be included in the data download report sent back to the data subject.

A data silo contains a set of datapoints, which are categories of data on your server. You need to upload one file per datapoint, or report that datapoint as not found. You can upload to many datapoints at once with a JSON Bulk Upload, or you can upload to a single datapoint with a Single File Upload. JSON Bulk Upload only works for JSON data, whereas Single File Upload works for any file type, such as an image or CSV.
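The per-datapoint rule can be sketched as a small helper that assembles a JSON Bulk Upload body, reporting any datapoint with no data as null. This is an illustrative sketch: the helper name and datapoint keys here are hypothetical, not part of the Transcend API.

```javascript
// Build the JSON Bulk Upload body for one profile, given the datapoint
// keys defined for the data silo. Any datapoint your server has no data
// for is reported as null, so the silo can still reach "ready".
function buildBulkUploadBody(profileId, results, datapointKeys) {
  const profileData = {};
  for (const key of datapointKeys) {
    // An undefined result means "no data found" for that datapoint
    profileData[key] = results[key] !== undefined ? results[key] : null;
  }
  return { profiles: [{ profileId, profileData }] };
}
```

A body built this way can be POSTed as-is to the /v1/data-silo endpoint described below.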


Endpoint reference: Respond to an access request (JSON bulk upload)

You can upload to many datapoints at once when the data uploaded to each datapoint is valid JSON.

POST {{yourOrganizationSombraURL}}/v1/data-silo

Parameter | Type | Description
profiles | Array | A list of JSON objects with keys profileId and profileData. Your server may find one profile for the data subject, several profiles, or none at all, so the profiles array can have zero, one, or many entries.
profiles[i].profileId | String | A unique identifier for this profile. This should be the same as extras.profile.identifier from the webhook body.
profiles[i].profileData | Object | An arbitrary JSON object with the data payload. The keys of the JSON should match the datapoint keys you defined for this data silo.

Header | Value
authorization | "Bearer {{dataSiloApiKey}}"
x-sombra-authorization (self-hosted Sombra only) | "Bearer {{sombraApiKey}}"
x-transcend-nonce | The nonce from the original webhook. This is in the webhook request header x-transcend-nonce.

For a data silo to be considered "ready" for the user to download, Transcend expects each datapoint of a data silo to receive data, or otherwise be reported as having no data found.

In a JSON Bulk Upload, you can indicate that a datapoint was not found by returning null, undefined, [], {}, or None. See Example A below.

You can also override the silo to be "ready" by passing the status: "READY" option. See Example B below.

const request = require('request');

/***
For a data silo with four datapoints:
- name
- score
- interests
- resume
***/

// Example A: Post all data including null values for non-existing fields
request.post('{{yourOrganizationSombraURL}}/v1/data-silo', {
  headers: {
    authorization: 'Bearer {{dataSiloApiKey}}',
    'x-sombra-authorization': 'Bearer {{sombraApiKey}}',
    'x-transcend-nonce': 'NONCE_FROM_WEBHOOK_REQUEST_HEADERS',
    'content-type': 'application/json',
  },
  body: {
    profiles: [
      {
        // this should be the same as extras.profile.identifier from the webhook body
        profileId: 'ben.farrell',
        profileData: {
          name: 'Ben Farrell', // these top-level keys are registered datapoints
          score: 3.8,
          interests: 'Privacy Tech',
          resume: null,
        },
      },
    ],
  },
  json: true,
});
// Example B: you can set `status: 'READY'` to fill in the remainder as null
request.post('{{yourOrganizationSombraURL}}/v1/data-silo', {
  headers: {
    authorization: 'Bearer {{dataSiloApiKey}}',
    'x-sombra-authorization': 'Bearer {{sombraApiKey}}',
    'x-transcend-nonce': 'NONCE_FROM_WEBHOOK_REQUEST_HEADERS',
    'content-type': 'application/json',
  },
  body: {
    profiles: [
      {
        // this should be the same as extras.profile.identifier from the webhook body
        profileId: 'ben.farrell',
        profileData: {
          name: 'Ben Farrell',
          score: 3.8,
          interests: 'Privacy Tech', // resume is missing
        },
      },
    ],
    status: 'READY', // request status is overridden as ready
  },
  json: true,
});
curl -X 'POST' {{yourOrganizationSombraURL}}/v1/data-silo \
  -H "authorization: Bearer {{dataSiloApiKey}}" \
  -H "x-sombra-authorization: Bearer {{sombraApiKey}}" \
  -H "x-transcend-nonce: NONCE_FROM_WEBHOOK_REQUEST_HEADERS" \
  -H "content-type: application/json" \
  -d '{"profiles": [{
        "profileId": "ben.farrell",
        "profileData": {"name": "Ben Farrell", "score": 3.8}
      }]}'

Read more about authenticating with the Transcend API.


Endpoint reference: Respond to an access request (file upload)

In some cases, you may have a lot of data to upload for a particular datapoint, such as a large table with many rows (e.g., an activities or audit table). You can page over your database and upload the data in chunks. Transcend will compose those chunks into a single large file that is streamed from Transcend's cloud to the data subject's file system.

POST {{yourOrganizationSombraURL}}/v1/datapoint-chunked

Header | Value
authorization | "Bearer {{dataSiloApiKey}}"
x-sombra-authorization (self-hosted Sombra only) | "Bearer {{sombraApiKey}}"
x-transcend-nonce | The nonce from the original webhook. This is in the webhook request header x-transcend-nonce.

const request = require('request');

async function batchAndSendDataToTranscend(
  db, // your database or data store
  identifier, // the user identifier to query (extras.profile.identifier in webhook)
  nonce // from webhook headers
) {
  let hasMore = true;
  let offset = 0;
  const PAGE_SIZE = 1000;
  while (hasMore) {
    const data = await db.model('activity').findAll({
      where: { userId: identifier },
      order: [['createdAt', 'DESC']],
      limit: PAGE_SIZE,
      offset,
    });
    hasMore = data.length === PAGE_SIZE;
    offset += PAGE_SIZE;
    await request.post({
      url: '{{yourOrganizationSombraURL}}/v1/datapoint-chunked',
      json: true,
      headers: {
        authorization: 'Bearer {{dataSiloApiKey}}',
        'x-sombra-authorization': 'Bearer {{sombraApiKey}}',
        'x-transcend-nonce': nonce,
        'content-type': 'application/json',
      },
      body: {
        dataPointName: 'activities',
        data,
        isLastPage: !hasMore,
      },
    });
  }
}

Endpoint reference: Respond to an access request (file stream upload)

You can also upload a file to a specific datapoint for a given profile.

POST {{yourOrganizationSombraURL}}/v1/datapoint

Header | Value
authorization | "Bearer {{dataSiloApiKey}}"
x-sombra-authorization (self-hosted Sombra only) | "Bearer {{sombraApiKey}}"
x-transcend-nonce | The nonce from the original webhook. This is in the webhook request header x-transcend-nonce.
x-transcend-datapoint-name | The datapoint to upload this file to (e.g. profile_picture). The key should match a datapoint key you defined for this data silo.

const request = require('request');
const { createReadStream } = require('fs');

const transcendRequest = request({
  url: '{{yourOrganizationSombraURL}}/v1/datapoint',
  method: 'POST',
  headers: {
    authorization: 'Bearer {{dataSiloApiKey}}',
    'x-sombra-authorization': 'Bearer {{sombraApiKey}}',
    'x-transcend-nonce': 'NONCE_FROM_WEBHOOK_REQUEST_HEADERS',
    'x-transcend-datapoint-name': 'profile_picture',
    // this should be the same as extras.profile.identifier from the webhook body
    'x-transcend-profile-id': 'ben_farrell',
  },
});

createReadStream('path/to/profile_picture.jpg').pipe(transcendRequest);

// Alternatively, stream the file from Amazon S3
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

s3.getObject({ Bucket: 'my-bucket', Key: 'file-key' })
  .createReadStream()
  .on('error', (err) => console.error(err))
  .pipe(transcendRequest);
curl -X 'POST' {{yourOrganizationSombraURL}}/v1/datapoint \
  -H "authorization: Bearer {{dataSiloApiKey}}" \
  -H "x-sombra-authorization: Bearer {{sombraApiKey}}" \
  -H "x-transcend-nonce: NONCE_FROM_WEBHOOK_REQUEST_HEADERS" \
  -H "x-transcend-datapoint-name: cool_video" \
  -H "x-transcend-profile-id: ben_farrell" \
  -H "content-type: video/mp4" \
  --data-binary @frag_bunny.mp4

In the example below, we upload a combination of table data from a database and files from a storage system like S3. In our scenario:

  1. A database column, profile_picture, refers to a key where an image file is stored on Amazon S3.
  2. Our database query does not return object keys where data is not found, so some datapoints are not reported to Transcend (see: Reporting Values Not Found). We set the explicit 'status': 'READY' flag to indicate that the unreported datapoints have no data found.
// HTTP request library
const request = require('request-promise');
// Amazon sdk used to connect to s3
const AWS = require('aws-sdk');
const s3 = new AWS.S3();
// Your custom db implementation
const db = require('./db');

// Store your secrets somewhere secure
const TRANSCEND_AUTH = {
  authorization: 'Bearer DATA_SILO_API_KEY',
  'x-sombra-authorization': 'Bearer SOMBRA_API_KEY',
};

/**
 * Resolve the datapoints for a given user
 *
 * @param profileIdentifier - The id of the user
 * @param nonce - The nonce we send you in the webhook,
 *   used to map your response back to the datapoint metadata
 * @returns A promise that resolves the access request back to Transcend
 */
async function performAccessRequest(profileIdentifier, nonce) {
  // Lookup the user in the db
  const user = db.get('user').byId(profileIdentifier);
  // Split off profile picture from remaining data
  const { profile_picture, ...remainingUser } = user;

  // First we post the profile picture jpeg image
  const upload = request({
    baseUrl: 'https://multi-tenant.sombra.transcend.io',
    uri: '/v1/datapoint',
    method: 'POST',
    headers: {
      ...TRANSCEND_AUTH,
      'x-transcend-nonce': nonce,
      'x-transcend-datapoint-name': 'profile_picture',
      'x-transcend-profile-id': profileIdentifier,
      'content-type': 'image/jpeg', // indicate the file type
    },
  });
  s3.getObject({ Bucket: 'my-bucket', Key: profile_picture })
    .createReadStream()
    .on('error', (err) => console.error(err))
    .pipe(upload);
  // Only after the first request completes do we send the remaining data
  await upload;

  await request.post({
    baseUrl: 'https://multi-tenant.sombra.transcend.io',
    uri: '/v1/data-silo',
    headers: {
      ...TRANSCEND_AUTH,
      'x-transcend-nonce': nonce,
      'content-type': 'application/json',
    },
    body: {
      profiles: [
        {
          // same as extras.profile.identifier from webhook body
          profileId: profileIdentifier,
          profileData: remainingUser,
        },
      ],
      // Our db does not return undefined values so we explicitly indicate completion
      status: 'READY',
    },
    json: true,
  });
}

At some point the data you collect may change, and you'll want to add a new datapoint so you can upload new types of files to the download report. The easiest way to do this is to include a new unique key in the "profileData" object (see: JSON Bulk Upload). We will automatically detect when your data silo starts responding with new data types and prompt the maintainer of that data silo to add a description and categorization of that datapoint, as well as make the final call as to whether that datapoint should be shown in the download report.
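For instance, a bulk upload body that introduces a new, previously unregistered key alongside existing datapoints might look like the sketch below (the `linkedin_url` key is hypothetical; the existing keys follow the earlier examples):

```javascript
// Sketch: a JSON Bulk Upload body that starts reporting a new datapoint
// key ("linkedin_url"). Transcend detects the unfamiliar key and prompts
// the silo maintainer to describe and categorize it.
const body = {
  profiles: [
    {
      profileId: 'ben.farrell',
      profileData: {
        name: 'Ben Farrell', // existing, registered datapoint
        linkedin_url: 'https://linkedin.com/in/example', // new datapoint key
      },
    },
  ],
};
```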


Endpoint reference: Respond to an erasure request

When a DSER (erasure request) or opt-out is received, Transcend sends your server a webhook notification, and your server then deletes data or opts the data subject out in your own databases. When your server is done, it should notify Transcend about which profiles were erased or modified.

PUT {{yourOrganizationSombraURL}}/v1/data-silo

Parameter | Type | Description
profiles | Array | A list of JSON objects with key profileId

Header | Value
authorization | "Bearer {{dataSiloApiKey}}"
x-sombra-authorization (self-hosted Sombra only) | "Bearer {{sombraApiKey}}"
x-transcend-nonce | The nonce from the original webhook. This is in the webhook request header x-transcend-nonce.

const request = require('request');

request.put('{{yourOrganizationSombraURL}}/v1/data-silo', {
  headers: {
    authorization: 'Bearer {{dataSiloApiKey}}',
    'x-sombra-authorization': 'Bearer {{sombraApiKey}}',
    'x-transcend-nonce': 'NONCE_FROM_WEBHOOK_REQUEST_HEADERS',
  },
  body: {
    profiles: [
      // Profiles without the data
      {
        // this should be the same as extras.profile.identifier from the webhook
        profileId: 'ben.farrell',
      },
    ],
  },
  json: true,
});


Occasionally, a webhook for an ERASURE request may have to be re-attempted. Transcend will automatically re-send webhooks every day until your server reports back a response. When an ERASURE webhook is re-sent after you have already erased that person's account, you may no longer be able to look them up by their coreIdentifier. In that case, it is important that your server respond to the webhook with HTTP status code 204, indicating that no user was found and thus no erasure needs to be performed again. Note that you can respond to the webhook directly with a 204 instead of a 200; you do not need to follow up with another request to the /v1/data-silo endpoint.
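This branch can be sketched as an Express-style webhook handler. This is a hedged sketch, not Transcend's reference implementation: the handler shape and the `findUserById` / `eraseUser` helpers are hypothetical stand-ins for your own code, passed in as parameters so the logic is easy to test.

```javascript
// Handle an ERASURE webhook, including the re-sent case where the user
// was already erased. findUserById and eraseUser are your own functions.
async function handleErasureWebhook(req, res, findUserById, eraseUser) {
  const identifier = req.body.extras.profile.identifier;
  const user = await findUserById(identifier);
  if (!user) {
    // User already erased (e.g. a re-sent webhook): a 204 tells Transcend
    // no user was found, with no follow-up request needed.
    return res.sendStatus(204);
  }
  await eraseUser(user);
  // 200 acknowledges receipt; follow up with PUT /v1/data-silo when done.
  return res.sendStatus(200);
}
```

In an Express app this would be mounted as the webhook route, with `findUserById` and `eraseUser` bound to your database layer.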