Google Cloud Platform and BigQuery Integration
Transcend supports a full GCP integration that scans GCP projects to identify data storage systems that may contain personal data. Transcend also supports a DSR integration for BigQuery databases, as well as scanning your database to discover and classify your data. This guide provides an overview of how the integrations work, along with detailed setup instructions.
Transcend’s GCP integration automates the process of identifying data stores across your Google Cloud infrastructure, including services like BigQuery, Cloud SQL, and Cloud Storage.
The integration's data silo discovery plugin works by programmatically scanning each project to surface the cloud services configured for it, using Google's services.list method.
For each service discovered, the integration recommends a data silo representing that service. More than one data silo may be recommended for the same service if it is used in multiple projects. For example, if two BigQuery instances are used in two different projects, the integration will recommend two BigQuery data silos. In this way, a silo is recommended for each distinct data store.
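The recommendation logic described above can be sketched as a simple filter over each project's enabled services. This is an illustrative sketch, not Transcend's actual implementation: the response shape mirrors the Service Usage API's services.list method, and the set of data-store service names is an example, not an official list.

```python
# Hypothetical sketch of the silo-recommendation logic described above.
# DATA_STORE_SERVICES is an illustrative subset, not an official list.
DATA_STORE_SERVICES = {
    "bigquery.googleapis.com",   # BigQuery
    "sqladmin.googleapis.com",   # Cloud SQL
    "storage.googleapis.com",    # Cloud Storage
}

def recommend_silos(projects_to_services):
    """Return one recommended silo per (project, data-store service) pair."""
    recommendations = []
    for project_id, services in projects_to_services.items():
        for svc in services:
            if svc["config"]["name"] in DATA_STORE_SERVICES and svc["state"] == "ENABLED":
                recommendations.append({
                    "project": project_id,
                    "service": svc["config"]["name"],
                })
    return recommendations

# Two projects both using BigQuery -> two BigQuery silos are recommended.
scan = {
    "project-a": [{"config": {"name": "bigquery.googleapis.com"}, "state": "ENABLED"}],
    "project-b": [
        {"config": {"name": "bigquery.googleapis.com"}, "state": "ENABLED"},
        {"config": {"name": "compute.googleapis.com"}, "state": "ENABLED"},
    ],
}
print(recommend_silos(scan))
```

Note how Compute Engine is skipped: only services that represent data stores become silo recommendations, and each project contributes its own recommendation.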
The integration is authenticated with a service user created for a dedicated GCP project. Using a service user account to connect the integration is the more secure option, as it allows sensitive permissions to be assigned without giving a human user the same permissions. Additionally, it doesn't count as a user seat in the Google organization. Continue to the next section for details about authentication and setting up the integration.
Transcend uses a client credentials method to connect to your organization's Google Cloud Platform projects. There are a few steps involved to generate credentials specific to your Google organization.
- You have access to your organization's Google Cloud Console, with permissions to create a new project and provision a service account.
- You have access to the Google Admin Console, with permissions to modify Security Settings for your organization.
Provision a Project and Service Account
- Create a new project. Create a dedicated project for the integration in your organization's Google Cloud Console, and enable the following APIs depending on which integration you are trying to connect:
- GCP: Cloud Resource Manager API and Service Usage API
- BigQuery: BigQuery API
If a project was previously created for another Transcend Google integration, there's no need to create another project. Feel free to use the existing project.
- Create a service user account. Transcend recommends creating a dedicated service user account to connect this integration, even if another service user has been configured for another Transcend integration. Creating a service user with limited scope for each integration reduces the risk of superpowered accounts.
Navigate to the "IAM & Admin" tab for the desired project, select "Service Accounts", then select Create Service Account. Give the service account a name you'll remember, for example, "transcend-integration".
- You don't need to grant this service account any specific IAM roles or permissions.
- Make note of the email address associated with this service account — you'll need it when connecting the integration.
- [GCP] Source the client ID. Once the service account is created, select "Enable G Suite Domain-wide Delegation", and make note of the unique Client ID, as you will need to refer to it later.
- Generate a private key. A public/private key pair for this account is needed for the Transcend connection form. To create the key:
  - Visit the "Keys" tab in the service account's settings page and select Add Key. Make sure to select JSON as the key type.
  - This will download a key file to your computer. You will need the JSON key file during the connection phase for the integration; Transcend only supports key files generated in the JSON format.
- Grant permissions. Give the newly created service user access to every GCP project you would like Transcend to scan, or whose BigQuery data you want Transcend to query.
  - For each project, navigate to the IAM section and select + Add to add a user for the project.
  - Enter the email address of the user account (not the service account!) and assign a role:
    - GCP: resourcemanager.projects.get and serviceusage.services.list permissions.
    - BigQuery: more information can be found in the next section.
  - Save the permissions and repeat for each additional project desired.
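Under the hood, a grant like the one above amounts to adding a member to a role binding in the project's IAM policy document (the same JSON structure used by the Resource Manager getIamPolicy/setIamPolicy methods). The following is a hedged sketch of that merge; the role and member values are examples, and the helper is not part of any Google library.

```python
# Illustrative helper: merge a member into the binding for `role` in a
# Resource Manager-style IAM policy document, creating the binding if absent.
def add_member_to_role(policy, role, member):
    for binding in policy.setdefault("bindings", []):
        if binding["role"] == role:
            if member not in binding["members"]:
                binding["members"].append(member)
            return policy
    policy["bindings"].append({"role": role, "members": [member]})
    return policy

# Example: grant a BigQuery role to the integration's service account.
policy = {"bindings": [{"role": "roles/viewer", "members": ["user:admin@example.com"]}]}
add_member_to_role(
    policy,
    "roles/bigquery.jobUser",
    "serviceAccount:transcend-integration@my-project.iam.gserviceaccount.com",
)
print(policy["bindings"][1])
```

In practice you would make this change in the console as described above, or via gcloud; the sketch only shows what the resulting policy looks like.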
BigQuery has many predefined roles that can be used to fit your needs, such as:
- BigQuery Admin — Grants full admin access
- BigQuery Data Viewer — Grants read-only access
- BigQuery Data Editor — Grants read and write access
- BigQuery Data Owner — Grants full access
- BigQuery Job User — Grants access to run query jobs
These are just a sample of the predefined roles that Google has provided. There is also the option of creating your own custom role:
- Navigate to the Roles section, which can be found under IAM & Admin
- Click Create Role
- Edit all the necessary information (title, description, etc.)
- Add the permissions that you want the account to have. You can find the common permissions here
More information regarding BigQuery roles can be found in their documentation. At a minimum, the service account requires:
- BigQuery Job User — In order to create query jobs with the BigQuery API
- BigQuery Data Viewer — To have read access to BigQuery datasets and tables, enabling schema discovery and classification, and access-based privacy requests
Note: Privacy requests that require modifying data will require the BigQuery Data Editor role instead of the BigQuery Data Viewer role, to allow both read and write access.
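The role requirements above reduce to a simple rule: Job User always, plus Data Viewer for read-only (access) requests or Data Editor when requests must modify data. A small illustrative helper (not part of Transcend's product) makes the rule explicit:

```python
# Illustrative mapping of the minimum BigQuery roles described above.
def minimum_roles(supports_modification: bool) -> list:
    roles = ["roles/bigquery.jobUser"]  # always needed to run query jobs
    if supports_modification:
        roles.append("roles/bigquery.dataEditor")  # read + write (erasure requests)
    else:
        roles.append("roles/bigquery.dataViewer")  # read-only (access requests)
    return roles

print(minimum_roles(False))  # access-only requests
print(minimum_roles(True))   # requests that modify data
```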
[GCP] Allowlist the Service Account
Once a dedicated service account is provisioned, the next step is to give it access to call the appropriate APIs in the Google organization.
- Go to your organization's Google Admin Console
- From the navigation menu, select Security > Access and data controls > API Controls.
- Select Manage Domain Wide Delegation.
- Add a new "API Client", and in the form enter the Client ID of the Service Account noted in Step 3 of the previous section.
- Add the following OAuth scope depending on what integration you are trying to connect, and then click "Authorize":
- GCP:
https://www.googleapis.com/auth/cloud-platform.read-only
- BigQuery:
https://www.googleapis.com/auth/bigquery
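Once the scope is authorized, domain-wide delegation lets the service account act on behalf of a user in your organization. As a hedged sketch of what a client typically does with such a grant (assuming the google-auth Python package; the file name and admin email are placeholders, and this is not necessarily how Transcend's backend is implemented):

```python
# Scopes matching the domain-wide delegation grants described above.
GCP_SCOPES = ["https://www.googleapis.com/auth/cloud-platform.read-only"]
BIGQUERY_SCOPES = ["https://www.googleapis.com/auth/bigquery"]

def delegated_credentials(key_file, admin_email, scopes):
    """Build credentials that impersonate an admin user via the `subject`
    mechanism of domain-wide delegation. Requires the google-auth package."""
    from google.oauth2 import service_account

    creds = service_account.Credentials.from_service_account_file(key_file, scopes=scopes)
    # with_subject() is what ties the service account to the admin user.
    return creds.with_subject(admin_email)
```

This is why the connection form (next section) asks for an administrator account email in addition to the service account's own credentials.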
Complete Transcend's Connection Form
To complete authentication for the integration, navigate back to the Transcend dashboard and enter the following fields in the integration connection form:
_GCP_
- Administrator Account Email Address
- Email address of a user that can access Google Cloud resources and service usage. This is usually an admin, account owner, or someone who has been granted access to your GCP resources.
- Service Account Email Address
  - This is the email address of the service user that was created for the integration. It looks similar to gcp-project@gcp-project.iam.gserviceaccount.com.
- Service Account Private Key
- This comes from the JSON key file downloaded during setup. The integration connection form does not take the entire JSON object in the file, only the value of the private key field. To obtain the private key:
  - Open the file in a text editor (TextEdit, VS Code, etc.)
  - Look for the private_key field and copy everything between the quotes
  - Paste the key value into the connection form
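The extraction step above can be sketched in a few lines. The field names below match the standard GCP JSON key format; the key material shown is a placeholder, not a real key.

```python
import json

def extract_private_key(key_file_contents: str) -> str:
    """Pull only the private_key value out of a downloaded JSON key file."""
    key = json.loads(key_file_contents)
    return key["private_key"]

# Placeholder key file resembling the standard GCP service-account format.
sample = json.dumps({
    "type": "service_account",
    "client_email": "transcend-integration@my-project.iam.gserviceaccount.com",
    "private_key": "-----BEGIN PRIVATE KEY-----\nMIIEv...\n-----END PRIVATE KEY-----\n",
})
print(extract_private_key(sample).startswith("-----BEGIN PRIVATE KEY-----"))
```

Copying the value programmatically avoids accidentally pasting the entire JSON object, which the form does not accept.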
_BigQuery_
- Google Cloud Project ID
  - Enter the project ID that contains your BigQuery database
- Service Account's JSON Key File
  - Upload the JSON key file that was downloaded when generating the private key
Connect the integration.
Once the integration is authenticated, navigate to the Configuration tab and enable the data silo discovery plugin to programmatically discover the GCP resources used across projects in your organization's account. The plugin is specifically looking for data storage systems like databases, data warehouses and object/file storage systems.
Once the scan is complete, select View Data Inventory to review and approve the discovered GCP resources.
The discovered resources are available for review by selecting X Resources Found. From there, review each service to decide if it should be approved as a data silo. Resources can be configured for content classification and privacy requests after they have been approved.
Once a discovered data silo has been approved and added to Data Inventory, it can be configured to further scan the individual resource to identify and classify the information stored within. This is particularly valuable for databases and data storage systems, where Content Classification can programmatically identify datapoints, provide classification recommendations, and identify personal data. To enable content classification for a resource, simply navigate to the Configuration tab of the desired data silo and enable the Datapoint Discovery plugin.