DSR Metrics API
As you build out your DSR Automation workflows, you may find a need to programmatically pull in health stats and metrics. For example, you may want to set up a metric for your on-call program that can alert when certain integrations in your workflow start erroring out and need to be addressed.
This document will walk you through using Transcend's GraphQL API to pull in metrics that you can use to monitor DSR Automation health.
You can test any of these queries in your browser by first logging into your Transcend account, then visiting our GraphQL Playground and pasting in the queries from this guide. The Transcend EU backend is the default and lives at https://api.transcend.io/graphql; the US backend lives at https://api.us.transcend.io/graphql.
In order to complete the following actions in this guide, you will need to create an API key under Developer Tools -> API Keys with the following scopes:
- View Data Map
- View Incoming Requests
You will then need to construct a GraphQL client using your language of choice. In JavaScript, this may look like:
```javascript
import { GraphQLClient } from 'graphql-request';

const transcendUrl = 'https://api.transcend.io'; // for EU hosting
// const transcendUrl = 'https://api.us.transcend.io'; // for US hosting

const client = new GraphQLClient(`${transcendUrl}/graphql`, {
  headers: {
    Authorization: `Bearer ${process.env.TRANSCEND_API_KEY}`,
  },
});
```
Many metrics use the analyticsData query. This query takes the name of a metric and a date range, and returns the list of metrics of that type.
This is done with a single GraphQL query:
```graphql
query DsrMetricsAnalyticsData($input: AnalyticsInput!) {
  analyticsData(input: $input) {
    series {
      name
      points {
        key
        value
      }
    }
    metadata {
      summaryValue
    }
  }
}
```
In JavaScript you would define this as a gql query:
```javascript
import { gql } from 'graphql-request';

const DSR_METRICS_QUERY = gql`
  query DsrMetricsAnalyticsData($input: AnalyticsInput!) {
    analyticsData(input: $input) {
      series {
        name
        points {
          key
          value
        }
      }
      metadata {
        summaryValue
      }
    }
  }
`;
```
The first thing you may want to monitor is the number of DSRs that came in for a specific request type within a specific date range. For example, how many erasure requests vs access requests came in over the last week.
```javascript
await client.request(DSR_METRICS_QUERY, {
  input: {
    dataSource: 'REQUESTS_BY_TYPE',
    startDate: '2024-10-15T07:00:00.000Z',
    endDate: '2024-10-25T06:59:59.999Z',
    isTestRequest: false,
  },
});
```
This will return a response like this:
```json
{
  "data": {
    "analyticsData": {
      "series": [
        {
          "name": "requests",
          "points": [
            { "key": "ACCESS", "value": 1266 },
            { "key": "CONTACT_OPT_OUT", "value": 105337 },
            { "key": "ERASURE", "value": 10265 },
            { "key": "SALE_OPT_OUT", "value": 2566 }
          ]
        }
      ],
      "metadata": {
        "summaryValue": 119434
      }
    }
  }
}
```
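Most metrics backends want flat name/value pairs rather than a nested series. A minimal sketch of flattening the `analyticsData` payload from the response above into a plain object (the helper name is our own, not part of Transcend's API):

```javascript
// Flatten the first series' points into a plain { key: value } object so
// the per-type counts are easy to emit to a metrics system.
function pointsToCounts(analyticsData) {
  return Object.fromEntries(
    analyticsData.series[0].points.map(({ key, value }) => [key, value]),
  );
}
```

With the response above, `pointsToCounts(response.data.analyticsData)` would yield an object where `ERASURE` maps to 10265.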
Another stat you may want to monitor is the number of requests in a specific status. For example, of the requests that came in over the last week, how many are completed, canceled, or pending deletion.
```javascript
await client.request(DSR_METRICS_QUERY, {
  input: {
    dataSource: 'REQUESTS_BY_STATUS',
    startDate: '2024-10-15T07:00:00.000Z',
    endDate: '2024-10-25T06:59:59.999Z',
    isTestRequest: false,
  },
});
```
This will return a response like this:
```json
{
  "data": {
    "analyticsData": {
      "series": [
        {
          "name": "requests",
          "points": [
            { "key": "APPROVING", "value": 5 },
            { "key": "CANCELED", "value": 7197 },
            { "key": "COMPILING", "value": 107 },
            { "key": "COMPLETED", "value": 105757 },
            { "key": "DOWNLOADABLE", "value": 72 },
            { "key": "ENRICHING", "value": 34 },
            { "key": "FAILED_VERIFICATION", "value": 4006 },
            { "key": "REQUEST_MADE", "value": 2 },
            { "key": "REVOKED", "value": 228 },
            { "key": "SECONDARY", "value": 1235 },
            { "key": "SECONDARY_COMPLETED", "value": 791 }
          ]
        }
      ],
      "metadata": {
        "summaryValue": 119434
      }
    }
  }
}
```
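For on-call alerting, you may want to reduce this status breakdown to a single alertable number. A minimal sketch; which statuses count as "needs attention" is an assumption for illustration, so adjust the grouping to match your own workflow:

```javascript
// Statuses we treat as likely needing human attention. This grouping is an
// assumption, not something defined by Transcend's API.
const NEEDS_ATTENTION = ['APPROVING', 'FAILED_VERIFICATION', 'REQUEST_MADE'];

// Sum the counts for those statuses from the analyticsData payload.
function countNeedsAttention(analyticsData) {
  return analyticsData.series[0].points
    .filter((point) => NEEDS_ATTENTION.includes(point.key))
    .reduce((sum, point) => sum + point.value, 0);
}
```

Emitting this single count on a schedule makes it straightforward to set a threshold alert in your monitoring system.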
Another stat you may want to monitor is the number of requests submitted for different data subject types. For example, of the requests that came in over the last week, how many came from employees vs end users.
```javascript
await client.request(DSR_METRICS_QUERY, {
  input: {
    dataSource: 'REQUESTS_BY_SUBJECT',
    startDate: '2024-10-15T07:00:00.000Z',
    endDate: '2024-10-25T06:59:59.999Z',
    isTestRequest: false,
  },
});
```
This will return a response like this:
```json
{
  "data": {
    "analyticsData": {
      "series": [
        {
          "name": "requests",
          "points": [
            { "key": "Customer", "value": 119317 },
            { "key": "Employee", "value": 117 }
          ]
        }
      ],
      "metadata": {
        "summaryValue": 119434
      },
      "cachedAt": "2024-10-25T04:40:04.988Z"
    }
  }
}
```
Going a step deeper than request-level stats, you may want to understand which of your DSR integrations are running successfully vs encountering issues.
The first step is to grab all of the Data Silo/Integration IDs that are used in your DSR Automation workflows:
```graphql
query GetDataSilosForMetrics($offset: Int!) {
  dataSilos(first: 100, offset: $offset, filterBy: { isLive: true }) {
    nodes {
      id
      title
      type
      connectedActions
      outerType
      teams {
        name
      }
      owners {
        name
        email
      }
    }
  }
}
```
This will return a response like this:
```json
{
  "data": {
    "dataSilos": {
      "nodes": [
        {
          "id": "6d69c5a1-3e32-44e5-8f4c-8ef1bb0fe73b",
          "title": "Server Webhook",
          "type": "server",
          "connectedActions": ["ACCESS", "ERASURE"],
          "outerType": null,
          "teams": [],
          "owners": []
        },
        {
          "id": "3ee53d18-6322-4f18-8bb4-4e6aed3ca67b",
          "title": "Typeform",
          "type": "typeform",
          "connectedActions": ["ACCESS"],
          "outerType": null,
          "teams": [{ "name": "IT" }],
          "owners": [{ "name": "Skip B", "email": "skip@acme.com" }]
        },
        {
          "id": "af23e505-316e-4be5-8cdf-c92a89df7aca",
          "title": "TikTok Ads",
          "type": "promptAPerson",
          "connectedActions": [],
          "outerType": "tiktokAds",
          "teams": [],
          "owners": []
        }
      ]
    }
  }
}
```
This route is paginated with a maximum page size of 100. You will likely have fewer than 100 integrations enabled for DSR Automation, but if you need to paginate, it may look like:
```javascript
const GET_DATA_SILOS = gql`
  query GetDataSilosForMetrics($first: Int!, $offset: Int!) {
    dataSilos(first: $first, offset: $offset, filterBy: { isLive: true }) {
      nodes {
        id
        title
        type
        connectedActions
        outerType
        teams {
          name
        }
        owners {
          name
          email
        }
      }
    }
  }
`;

const data = [];
const MAX_PAGE = 100;
let offset = 0;
let pageSize = MAX_PAGE;
while (pageSize === MAX_PAGE) {
  const paged = await client.request(GET_DATA_SILOS, {
    first: MAX_PAGE,
    offset,
  });
  data.push(...paged.dataSilos.nodes);
  pageSize = paged.dataSilos.nodes.length;
  offset += MAX_PAGE;
}
```
For each integration you can then query the dataSiloSummary endpoint to determine the statuses of requests for that integration. This lets you understand how many integration jobs are erroring out vs actively processing vs successfully processed.
```graphql
query DataSiloSummary($dataSiloId: ID!) {
  dataSiloSummary(id: $dataSiloId) {
    requestsSucceededTotal
    requestsAwaitingTotal
    requestsQueuedTotal
    requestsErrorsTotal
    pluginErrorsTotal
    lookupProcessErrorsTotal
    lookupProcessIndexedTotal
  }
}
```
This will return a response like this:
```json
{
  "data": {
    "dataSiloSummary": {
      "requestsSucceededTotal": 1146,
      "requestsAwaitingTotal": 0,
      "requestsQueuedTotal": 4,
      "requestsErrorsTotal": 12,
      "pluginErrorsTotal": 0,
      "lookupProcessErrorsTotal": 0,
      "lookupProcessIndexedTotal": 0
    }
  }
}
```