Supported editions for this feature: Enterprise Standard and Enterprise Plus; Education Standard and Education Plus.
Gmail logs store records for each stage of a message in the Gmail delivery process. To analyze Gmail flow through the delivery process, assign Gmail logs to a dataset in a BigQuery project. After the Gmail logs are assigned, you can review reports.
Note: Gmail logs created before you set up Gmail Logs in BigQuery can't be exported to BigQuery.
Assign Gmail logs to a BigQuery dataset
- Sign in to your Google Admin console. Sign in using your administrator account (does not end in @gmail.com).
- From the Admin console home page, go to Apps > Google Workspace > Gmail > Setup > Email Logs in BigQuery.
- Click Enable.
- Enter a description that will appear within the setting’s summary.
- Select the BigQuery project you want to use for Gmail logs. Choose a project to which you have write access.
- Enter a dataset name where Gmail logs are stored, or use the default name gmail_logs_dataset.
- (Optional) Click Restrict the dataset to a specific geographic location, then select a location (for example, United States).
- Click Save.
- After saving settings, go back to your BigQuery project. A dataset with this information is now in the project:
- The standard roles: project owners, project editors, and project viewers
- Four service accounts that are designated dataset editors:
- [email protected]: Writes the logs.
- [email protected]: Writes the logs.
- [email protected]: Automatically restores the template table if it's accidentally removed.
- [email protected]: Updates the schema in the future.
Note: Do not remove these service accounts or change their roles. These are required accounts.
- To verify these service accounts are added, point to the new dataset and click the Down arrow next to the dataset name.
- Click Share dataset. Daily email logs are now exported to BigQuery.
Changes can take up to 24 hours but typically happen more quickly.
daily_ table
After you turn on email logs in BigQuery, a new table named daily_ is added to the dataset. This table is a template that provides the schema for the daily tables. After the daily_ template is created, daily tables are automatically created in your dataset. The logs are then available for use.
What you should know about the daily_ table:
- It's always empty and never expires.
- Don't remove, modify, rename, or add data to this table.
- It's a date-partitioned table. Actual data is written to a table named daily_YYYYMMDD, based on the GMT time when an event occurs.
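As an illustration of the naming rule above, here is a minimal sketch (the helper function is hypothetical, not part of any Google API) that maps an event's GMT timestamp to the daily table the event lands in:

```python
from datetime import datetime, timezone, timedelta

def daily_table_name(event_time: datetime) -> str:
    """Return the daily_YYYYMMDD table an event is written to,
    based on the GMT (UTC) time when the event occurs."""
    return "daily_" + event_time.astimezone(timezone.utc).strftime("%Y%m%d")

# An event at 23:30 GMT on 2024-03-01 lands in daily_20240301.
print(daily_table_name(datetime(2024, 3, 1, 23, 30, tzinfo=timezone.utc)))

# An event at 01:00 local time in UTC+2 on 2024-03-02 is still
# 23:00 GMT on 2024-03-01, so it lands in daily_20240301 as well.
local = datetime(2024, 3, 2, 1, 0, tzinfo=timezone(timedelta(hours=2)))
print(daily_table_name(local))
```

Note that the table boundary follows GMT, not your local time zone, so events near local midnight may land in a different daily table than expected.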
Gmail log queries
Example queries
Try some example queries for Gmail logs in BigQuery. The examples are common use cases for Gmail logs.
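For instance, a common use case is counting log rows per subject for a single day. The following is a hedged sketch only: my-project is a placeholder project ID, the dataset name is the default from the setup steps above, and the field name follows the message_info.subject schema field referenced later in this article.

```python
# Hypothetical standard SQL query over one daily Gmail log table.
# Replace my-project with your own BigQuery project ID.
query = """
SELECT
  message_info.subject AS subject,
  COUNT(*) AS events
FROM `my-project.gmail_logs_dataset.daily_20240301`
GROUP BY subject
ORDER BY events DESC
LIMIT 10
"""
print(query)
```

You would run this query in the BigQuery console (or via the bq CLI or client libraries) against your own project and dataset.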
Custom queries
Compose your own, custom queries using the schema for Gmail logs in BigQuery.
SQL dialects for queries
BigQuery supports two SQL dialects for queries: GoogleSQL (standard SQL) and legacy SQL.
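BigQuery's two dialects, legacy SQL and standard SQL (GoogleSQL), write the same date-range scan over the daily tables quite differently. This sketch only builds the query strings for comparison; the project and dataset names are placeholders:

```python
# Legacy SQL: date-sharded tables are selected with TABLE_DATE_RANGE.
legacy_sql = """
SELECT COUNT(*) AS events
FROM (TABLE_DATE_RANGE([my-project:gmail_logs_dataset.daily_],
                       TIMESTAMP('2024-03-01'),
                       TIMESTAMP('2024-03-07')))
"""

# Standard SQL (GoogleSQL): a table wildcard plus a _TABLE_SUFFIX filter.
standard_sql = """
SELECT COUNT(*) AS events
FROM `my-project.gmail_logs_dataset.daily_*`
WHERE _TABLE_SUFFIX BETWEEN '20240301' AND '20240307'
"""

print(legacy_sql)
print(standard_sql)
```

New queries generally use standard SQL; the wildcard-plus-_TABLE_SUFFIX pattern is the standard SQL replacement for legacy SQL's TABLE_DATE_RANGE function.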
Data might be truncated for some fields
It's important to note that BigQuery has a maximum row size limit of 1 MB. For this reason, some fields are truncated to keep each log row under 1 MB minus 1 KB, so it can be inserted successfully into BigQuery. The 1 KB is intentionally left as a buffer.
The following fields might be truncated if the log is too long, or if the number of triggered rules (triggered_rule_info) in the log is too large:
- message_info.subject
- message_info.source.from_header_displayname
- message_info.triggered_rule_info.string_match.match_expression
- message_info.triggered_rule_info.string_match.matched_string
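Google doesn't document the exact truncation algorithm. As a rough sketch of the idea, byte-budget truncation of a UTF-8 string (all names here are hypothetical) looks like this:

```python
MAX_ROW_BYTES = 1024 * 1024        # BigQuery's 1 MB row size limit
ROW_BUDGET = MAX_ROW_BYTES - 1024  # keep rows under 1 MB minus the 1 KB buffer

def truncate_field(value: str, max_bytes: int) -> str:
    """Trim a string so its UTF-8 encoding fits within max_bytes,
    dropping any multi-byte character that would otherwise be split."""
    encoded = value.encode("utf-8")
    if len(encoded) <= max_bytes:
        return value
    # errors="ignore" discards a trailing partial multi-byte sequence.
    return encoded[:max_bytes].decode("utf-8", errors="ignore")

print(truncate_field("a long subject line", 6))  # prints "a long"
```

The key detail is that the limit is measured in bytes, not characters, so a subject full of multi-byte characters is cut shorter (in characters) than an ASCII one.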
For more information, see Schema for Gmail logs in BigQuery.
Related information
Sandbox expiration
The expiration time for these BigQuery sandbox objects is 60 days:
- Tables
- Partitions in partitioned tables
- Views
You can change the default table expiration time.
If a table expires or is removed, it can be restored within 2 days.