Setting up BigQuery in Google Cloud Platform


BigQuery is a serverless data warehouse from Google that lets you ingest, store, analyze, and visualize large amounts of data quickly, easily, and at low cost.

The CloudM / BigQuery integration allows you to export audit logs from CloudM Automate to BigQuery.


In order to set up the integration, you will need to:

  • Select a project in the Google Cloud Console,
  • Enable the BigQuery API for that project,
  • Create a Service Account and IAM policy so that CloudM can access BigQuery. CloudM authenticates using a JSON key file.


Enable the BigQuery API in the Google Cloud Console

  1. Sign in to the Google Cloud Console (as an admin),
  2. Select the relevant project (using the drop-down arrow next to the currently selected project name),
  3. Use the menu to go to APIs & Services > Library,
  4. Search for BigQuery API,
  5. Select the Enable option.
    • If the API is already enabled, the option will say Manage instead, and a tick icon confirms that the API is enabled.
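You can also confirm whether the BigQuery API is enabled programmatically via Google's Service Usage API. A minimal sketch of the relevant endpoint (no network call is made here, and "my-project" is a placeholder, not a real project ID):

```python
# Sketch: build the Service Usage REST URL that reports whether the
# BigQuery API is enabled for a project. An authenticated GET request
# to this URL returns a JSON body whose "state" field is "ENABLED"
# or "DISABLED".
SERVICE = "bigquery.googleapis.com"

def service_status_url(project_id: str) -> str:
    """Return the Service Usage endpoint for the BigQuery API."""
    return (
        "https://serviceusage.googleapis.com/v1/"
        f"projects/{project_id}/services/{SERVICE}"
    )

# "my-project" is a placeholder project ID.
print(service_status_url("my-project"))
```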


Create a Service Account

  1. In the Google Cloud Console, with the same project selected that you enabled the API for, use the menu to go to IAM & Admin > Service Accounts,
  2. Click + Create Service Account,
  3. Give the Service Account a unique name,
  4. Copy the Service Account ID (it looks like an email address),
  5. Select Create and Continue,
  6. Skip the two optional sections and select Done.
  7. The account will appear on the Service Accounts screen,
  8. Click on the Service Account name,
  9. Select the Keys tab,
  10. Select Add Key > Create New Key,
  11. Select JSON,
  12. Select Create,
  13. Save the JSON file to your desktop.
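The downloaded key is a JSON file, and before uploading it to CloudM you can sanity-check that it contains the fields a Google service-account key always has. A minimal sketch (the field list reflects Google's standard key format; the example key dict is synthetic and non-functional):

```python
import json

# Fields present in every Google service-account JSON key.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def validate_key(key: dict) -> list[str]:
    """Return a list of problems found in a parsed service-account key."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - key.keys())]
    if key.get("type") not in (None, "service_account"):
        problems.append(f"unexpected type: {key['type']!r}")
    return problems

# Synthetic example; in practice you would parse the file you saved,
# e.g. validate_key(json.load(open("key.json"))).
sample = {
    "type": "service_account",
    "project_id": "my-project",  # placeholder project ID
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
    "client_email": "cloudm@my-project.iam.gserviceaccount.com",
}
print(validate_key(sample))  # an empty list means the key looks complete
```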


Add Permissions

  1. Go to IAM & Admin > IAM,
  2. Select Add,
  3. Paste the Service Account ID into the New Principals field,
  4. Add BigQuery Admin as the role (the Project Browser role must also be set, if not already enabled),
  5. Select Save.
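In IAM policy terms, the steps above add policy bindings for the service account. The sketch below illustrates the member string and bindings involved (the email address is a placeholder; `roles/bigquery.admin` and `roles/browser` are the role IDs behind the BigQuery Admin and Browser display names):

```python
# Sketch: construct the IAM member string and policy bindings that the
# console steps above create. Service accounts are addressed in IAM
# policies as "serviceAccount:<email>".
ROLES = ["roles/bigquery.admin", "roles/browser"]

def bindings_for(service_account_email: str) -> list[dict]:
    member = f"serviceAccount:{service_account_email}"
    return [{"role": role, "members": [member]} for role in ROLES]

# Placeholder service-account email, not a real account.
for binding in bindings_for("cloudm@my-project.iam.gserviceaccount.com"):
    print(binding["role"], "->", binding["members"][0])
```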


How to get the Project Id

To obtain the Project ID (which you will need to configure BigQuery in CloudM Automate), open the project picker in the Google Cloud Console, or go to the Manage Resources page.

Then, note the Project ID of the project where your BigQuery data is located.
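If you already downloaded the JSON key earlier, the Project ID is also recorded inside it, so you can read it from there instead of the console. A minimal sketch (the key content below is synthetic):

```python
import json

# Sketch: the service-account key saved earlier stores the project ID
# in its "project_id" field.
def project_id_from_key(key_json: str) -> str:
    return json.loads(key_json)["project_id"]

# Synthetic key content standing in for the real file, which you would
# read with open("key.json").read().
sample_key = '{"type": "service_account", "project_id": "my-project"}'
print(project_id_from_key(sample_key))  # -> my-project
```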


Configure BigQuery in CloudM Automate

  1. As an admin (with the BigQuery - Edit Global Settings permission), select Settings > BigQuery,
  2. Select the Enable button,
  3. Add the BigQuery Project ID,
  4. Upload the JSON key - this will populate the Service Account field,
  5. Select Test Connection,
  6. Select Populate - CloudM will populate the datasets from BigQuery,
  7. Select the Default Dataset that you want to export to from the drop-down menu,
  8. Select Update,
  9. Go to Logs > BigQuery Export,
  10. Use the Default Dataset drop-down to choose which dataset the logs will be exported to,
  11. If Force Overwrite is checked, the table will be truncated and the whole audit log will be exported; otherwise, only recent log events will be appended to those already exported,
  12. Select Export to export the logs to BigQuery.
    • Please note that this is a manual process. You will need to repeat it periodically to keep the exported results up to date.
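The Force Overwrite behaviour matches BigQuery's standard write dispositions: overwriting corresponds to WRITE_TRUNCATE and appending to WRITE_APPEND. How CloudM implements the export internally is not documented here; the sketch below only illustrates the standard BigQuery semantics:

```python
# Sketch: BigQuery load and query jobs take a writeDisposition that
# controls what happens to an existing destination table.
# Force Overwrite behaves like WRITE_TRUNCATE (replace the table's
# contents); the default append behaves like WRITE_APPEND.
def write_disposition(force_overwrite: bool) -> str:
    return "WRITE_TRUNCATE" if force_overwrite else "WRITE_APPEND"

print(write_disposition(True))   # -> WRITE_TRUNCATE
print(write_disposition(False))  # -> WRITE_APPEND
```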