v0.39.0.1 / Administration Guide / BigQuery

Working with Google BigQuery in Metabase


This page provides information on how to create and manage a connection to a Google BigQuery dataset.

Prerequisites

You’ll need to have a Google Cloud Platform account with a project you would like to use in Metabase. Consult the Google Cloud Platform documentation for how to create and manage a project. This project should have a BigQuery dataset for Metabase to connect to.

Google Cloud Platform: creating a service account and JSON file

You’ll first need a service account JSON file that Metabase can use to access your BigQuery dataset. Service accounts are intended for non-human users (such as applications like Metabase) to authenticate (who am I?) and authorize (what can I do?) their API calls.

To create the service account JSON file, follow Google’s documentation on setting up a service account for your BigQuery dataset. Here’s the basic flow:

  1. Create service account. From your Google Cloud Platform project console, open the main sidebar menu on the left, go to the IAM & Admin section, and select Service account. The console will list existing service accounts, if any. At the top of the screen, click on + CREATE SERVICE ACCOUNT.

  2. Fill out the service account details. Name the service account, and add a description (the service account ID will populate once you add a name). Then click the Create button.

  3. Grant the service account access to this project. You’ll need to add roles to the service account so that Metabase will have permission to view and run queries against your dataset. Make sure you add the following roles to the service account:

    • BigQuery Data Viewer
    • BigQuery Metadata Viewer
    • BigQuery Job User (distinct from BigQuery User)

    For more information on roles in BigQuery, see Google Cloud Platform’s documentation.

  4. Create key. Once you have assigned roles to the service account, click on the Create Key button, and select JSON for the key type. The JSON file will download to your computer.

You can only download the key once. If you delete the key, you’ll need to create another service account with the same roles.
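The console steps above can also be sketched with the gcloud CLI. This is a hedged sketch, not part of the official guide: the project ID and service account name (`metabase`) are placeholders, and you should verify the commands against Google's current CLI reference before running them.

```shell
# Placeholder project ID -- substitute your own.
PROJECT_ID="my-project"

# Steps 1-2: create the service account.
gcloud iam service-accounts create metabase \
    --project "$PROJECT_ID" \
    --display-name "Metabase"

# Step 3: grant the three BigQuery roles listed above.
for role in roles/bigquery.dataViewer roles/bigquery.metadataViewer roles/bigquery.jobUser; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
      --member "serviceAccount:metabase@${PROJECT_ID}.iam.gserviceaccount.com" \
      --role "$role"
done

# Step 4: create and download the JSON key file.
gcloud iam service-accounts keys create metabase-key.json \
    --iam-account "metabase@${PROJECT_ID}.iam.gserviceaccount.com"
```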

Metabase: adding a BigQuery dataset

Once you have created and downloaded your service account JSON file for your BigQuery dataset, head over to your Metabase instance, click on the settings cog, and select Admin to bring up Admin mode. In the Databases section, click on the Add database button in the upper right.

On the ADD DATABASE page, select BigQuery from the Database type dropdown. Metabase will present you with the relevant configuration settings to fill out:

Settings

Name

Name is the title of your database in Metabase.

Dataset ID

Each BigQuery dataset will have a Dataset ID. You can find this ID via the Google Cloud Console. If you’re not sure where to find the Dataset ID, see Google’s documentation on getting information on datasets.

When entering the Dataset ID, omit the Project ID prefix. For example, if your ID is project_name:dataset_id, only enter dataset_id.
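To make the prefix rule above concrete, here is a small illustrative helper (not part of Metabase itself) that strips a `project_name:` prefix if one is present:

```python
def strip_project_prefix(dataset_id: str) -> str:
    """Drop a leading 'project_name:' prefix, returning just the dataset ID."""
    return dataset_id.split(":", 1)[-1]

print(strip_project_prefix("project_name:dataset_id"))  # dataset_id
print(strip_project_prefix("dataset_id"))               # dataset_id
```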

Service account JSON file

Upload the service account JSON file you created when following the steps above. The JSON file contains the credentials your Metabase application will need to read and query your dataset, as defined by the roles you added to the service account. If you need to add additional roles, you have to create another service account, download the JSON file, and upload the file to Metabase.
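If you prefer to script this instead of using the admin UI, Metabase also exposes a REST endpoint (`POST /api/database`) for adding databases. The sketch below only assembles the request body; the exact `details` keys shown (`dataset-id`, `service-account-json`) are assumptions about the BigQuery driver and may differ between Metabase versions, so check your instance's API before relying on them.

```python
import json

def build_bigquery_payload(name: str, dataset_id: str, key_json: str) -> dict:
    """Assemble a request body for POST /api/database (keys are assumptions)."""
    return {
        "name": name,
        "engine": "bigquery",
        "details": {
            "dataset-id": dataset_id,          # omit any project prefix
            "service-account-json": key_json,  # the full JSON key file, as a string
        },
    }

# A stand-in for the downloaded key file's contents.
key_json = json.dumps({"type": "service_account",
                       "client_email": "metabase@example.iam.gserviceaccount.com"})
payload = build_bigquery_payload("My BigQuery", "dataset_id", key_json)
print(payload["engine"])  # bigquery

# To actually send it you would need an authenticated session token, e.g.:
# requests.post(f"{base_url}/api/database", json=payload,
#               headers={"X-Metabase-Session": token})
```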

Use the Java Virtual Machine (JVM) timezone

Default: Disabled

We suggest you leave this off unless you’re doing manual timezone casting in many or most of your queries with this data.

Automatically run queries when doing simple filtering and summarizing

Default: Enabled

When this slider is on, Metabase will automatically run queries when users do simple explorations with the Summarize and Filter buttons when viewing a table or chart. You can turn this off if querying this database is slow. This setting doesn’t affect drill-throughs or SQL queries.

This is a large database, so let me choose when Metabase syncs and scans

Default: Disabled

By default, Metabase does a lightweight hourly sync and an intensive daily scan of field values. If you have a large database, we recommend turning this on and reviewing when and how often the field value scans happen.

Save your database configuration

When you’re done, click the Save button. A modal should pop up, informing you that your database has been added.

You can click on Explore this data to see some automatic explorations of your data, or click I’m good thanks to stay in the Admin Panel.

Give Metabase some time to sync with your BigQuery dataset, then exit Admin mode, click on Browse Data, find your database, and start exploring your data.

Using Legacy SQL

As of version 0.30.0, Metabase tells BigQuery to interpret SQL queries as Standard SQL. If you prefer using Legacy SQL instead, you can tell Metabase to do so by including a #legacySQL directive at the beginning of your query, for example:
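A minimal query of this form might look like the following (the dataset and table names are placeholders; note that Legacy SQL uses square-bracket table references):

```sql
#legacySQL
SELECT *
FROM [my_dataset.my_table]
```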

Troubleshooting

If you’re having trouble with your BigQuery connection, you can check out this troubleshooting guide, or visit Metabase’s discussion forum to see if someone has encountered and resolved a similar issue.

Further reading

  • Managing databases.
  • Metadata editing.
  • Creating segments and metrics.
  • Setting data access permissions.