Databricks Integration Setup for DSR Automation

With the Transcend Databricks and Databricks Lakehouse integrations, you can fulfill DSRs directly against a Databricks or Databricks Lakehouse database by running Databricks operations defined in a custom JSON payload that specifies the desired data actions for each datapoint.

The first step in setting up DSRs against the Databricks Lakehouse database is creating, in the data silo, the datapoints that should be queried. We typically recommend creating a datapoint for each collection in the database that stores personal data (or any collections you want to action DSRs against). For example, say there is a collection called Chat History that contains all the messages sent back and forth with a customer. You could create a datapoint for Chat History in the data silo and enable the specific data actions needed. If you're using Structured Discovery, you can enable the Datapoint Schema Discovery plugin to create the datapoints for you automatically.
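To make the data actions concrete, below is a rough sketch of the kind of Databricks SQL that access and erasure actions might run against a table backing the Chat History datapoint. The table and column names (`chat_history`, `customer_email`) are illustrative assumptions, not part of the integration; the actual operations are driven by the JSON payload you configure for each datapoint.

```sql
-- Illustrative only: hypothetical table and column names.
-- An access action might export the data subject's messages:
SELECT message_id, sent_at, body
FROM chat_history
WHERE customer_email = :subject_email; -- identifier supplied by the DSR

-- An erasure action might delete (or anonymize) the same rows:
DELETE FROM chat_history
WHERE customer_email = :subject_email;
```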

Pro tip: Check out the Transcend Terraform Provider for options on managing data silos and datapoints in code.
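As a rough illustration of what that can look like, here is a minimal sketch using the Transcend Terraform Provider. The resource and attribute names shown (`transcend_data_silo`, `transcend_data_point`, and their arguments) are assumptions based on the provider's public documentation; verify them against the Terraform Registry before use.

```hcl
terraform {
  required_providers {
    transcend = {
      source = "transcend-io/transcend"
    }
  }
}

# Assumed resource and attribute names -- confirm against the provider docs.
resource "transcend_data_silo" "databricks" {
  type  = "database" # the silo type/catalog identifier for your integration is an assumption here
  title = "Databricks Lakehouse"
}

resource "transcend_data_point" "chat_history" {
  data_silo_id = transcend_data_silo.databricks.id
  name         = "chat_history"
  title        = "Chat History"
}
```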