Google BigQuery

As an alternative to Elasticsearch exports, Cyberwatch can export its data to Google BigQuery. Data exported to BigQuery can then be consumed by various data visualization tools. The exported tables have the same names and data as the Elasticsearch indexes exported by Cyberwatch.

Configuring the Google Cloud Platform environment

First of all, you need to create a dataset for receiving Cyberwatch’s data:

  1. Go to the Google Cloud Platform console.
  2. Select the project that will own the exported data.
  3. In the BigQuery product, open the SQL workspace.
  4. In the Explorer, click the three-dot menu next to the project's name, and pick Create dataset.
  5. Fill in the form, and note the ID of the dataset, as it will be needed later.
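Equivalently, if the Google Cloud SDK is installed and authenticated, the dataset can be created from the command line with the `bq` tool. This is only a sketch: the project ID, dataset ID, and location below are placeholders to replace with your own values.

```shell
# Create a dataset to receive Cyberwatch's exports.
# "my-project" and "cyberwatch_export" are example IDs.
bq mk \
  --dataset \
  --description "Cyberwatch data exports" \
  --location EU \
  my-project:cyberwatch_export
```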

Next, you need a service account for Cyberwatch with at least the following roles:

  • BigQuery Data Editor,
  • BigQuery Job User.

The roles can be granted when the account is created, or afterwards in the IAM & Admin product, menu IAM.
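The same service account and role bindings can also be set up with the gcloud CLI. A sketch, assuming an authenticated gcloud installation; the account name, project ID, and key file path are placeholders:

```shell
# Create a dedicated service account for Cyberwatch (names are examples).
gcloud iam service-accounts create cyberwatch-export \
  --project my-project \
  --display-name "Cyberwatch BigQuery export"

# Grant the two required roles on the project.
gcloud projects add-iam-policy-binding my-project \
  --member "serviceAccount:cyberwatch-export@my-project.iam.gserviceaccount.com" \
  --role roles/bigquery.dataEditor
gcloud projects add-iam-policy-binding my-project \
  --member "serviceAccount:cyberwatch-export@my-project.iam.gserviceaccount.com" \
  --role roles/bigquery.jobUser

# Generate the JSON key whose content will be pasted into Cyberwatch.
gcloud iam service-accounts keys create cyberwatch-key.json \
  --iam-account cyberwatch-export@my-project.iam.gserviceaccount.com
```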

If no service account exists for Cyberwatch:

  1. In the IAM & Admin product, go to the Service Accounts menu.
  2. Create a service account with the two roles listed above.
  3. Create a JSON key for the account and download it.
  4. In Cyberwatch, go to menu Settings > Stored credentials.
  5. Click Add, and select type Google Cloud Platform.
  6. Paste the content of the JSON key and save.
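Before pasting the key into Cyberwatch, it can help to check that the file really is a service account JSON key and to read the project ID off it, since that ID is also needed later in the BigQuery configuration form. A minimal sketch; the field names match what Google places in every service account key, but the helper itself is ours:

```python
import json

# Fields present in every Google Cloud service account JSON key.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email", "token_uri"}

def check_service_account_key(raw: str) -> str:
    """Validate a JSON key before pasting it into Cyberwatch.

    Returns the project ID, which is also asked for in the
    Administration > Google BigQuery form.
    """
    key = json.loads(raw)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"not a service account key, missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"unexpected key type: {key['type']}")
    return key["project_id"]
```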

Configuring Cyberwatch to export to BigQuery

  1. Go to section Administration, sub-menu Google BigQuery.
  2. Fill in the form with your service account key, the ID of your project, and the ID of your dataset.

Saving the configuration immediately triggers a data export. The export may take several dozen minutes, depending on the amount of data to export.

As long as a configuration is present, the export will be triggered daily. To disable the BigQuery export, click Delete on the configuration page.
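Once an export has completed, the tables can be queried like any other BigQuery data, for example from the `bq` CLI. The dataset and table names below are placeholders; the actual table names match the Elasticsearch indexes exported by Cyberwatch:

```shell
# List the exported tables in the dataset created earlier.
bq ls my-project:cyberwatch_export

# Run a standard SQL query against one of them
# ("assets" is a hypothetical table name).
bq query --use_legacy_sql=false \
  'SELECT COUNT(*) FROM `my-project.cyberwatch_export.assets`'
```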