
According to the BigQuery docs, a partitioned table can have up to 2,500 partitions:

Maximum number of partitions per partitioned table — 2,500

Yet:

$ bq query --destination_table grey-sort-challenge:partitioning_magic.nyc_taxi_trips_partitioned --time_partitioning_field pickup_datetime --use_legacy_sql=false "select * from \`grey-sort-challenge.partitioning_magic.nyc_taxi_trips\`" 

Error in query string: Error processing job 'grey-sort-challenge:bqjob_r37b076ef0d3c4338_000001626c539e6a_1': Too many partitions produced by query, allowed 2000, query produces at least 2372 partitions

Is it 2000, or 2500?

Comments:

  • I filed a bug to track the issue. Commented Mar 28, 2018 at 15:29
  • See the following question to see how to get around this limit by partitioning by week/month/year: stackoverflow.com/a/56125049/132438 Commented May 14, 2019 at 8:06

1 Answer


The maximum number of partitions supported in a table is 4000.
2000 is the number of partitions we allow a single operation (in this case, a query job) to produce.

Quotas & Limits - Partitioned tables

  • Maximum number of partitions per partitioned table — 4,000

  • Maximum number of partitions modified by a single job — 2,000

    Each job operation (query or load) can affect a maximum of 2,000 partitions. Any query or load job that affects more than 2,000 partitions is rejected by Google BigQuery.

  • Maximum number of partition modifications per day per table — 5,000

    You are limited to a total of 5,000 partition modifications per day for a partitioned table. A partition can be modified by using an operation that appends to or overwrites data in the partition. Operations that modify partitions include: a load job, a query that writes results to a partition, or a DML statement (INSERT, DELETE, UPDATE, or MERGE) that modifies data in a partition.

    More than one partition may be affected by a single job. For example, a DML statement can update data in multiple partitions (for both ingestion-time and partitioned tables). Query jobs and load jobs can also write to multiple partitions but only for partitioned tables. Google BigQuery uses the number of partitions affected by a job when determining how much of the quota the job consumes. Streaming inserts do not affect this quota.

  • Maximum rate of partition operations — 50 partition operations every 10 seconds
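Given the 2,000-partitions-per-job limit above, one common workaround (not mentioned in the answer itself, so treat this as a sketch) is to split a large backfill into several query jobs, each filtered to a date range that touches at most 2,000 daily partitions. The table and column names below are the ones from the question; the batching helper is illustrative:

```python
# Sketch: split a backfill of a day-partitioned table into query jobs
# that each write at most 2,000 partitions. Each printed WHERE clause
# would be appended to a separate bq query job against
# grey-sort-challenge.partitioning_magic.nyc_taxi_trips.
from datetime import date, timedelta

MAX_PARTITIONS_PER_JOB = 2000  # BigQuery's per-job partition limit

def date_batches(start, end, batch_days=MAX_PARTITIONS_PER_JOB):
    """Yield half-open (lo, hi) date ranges, each spanning at most
    batch_days daily partitions."""
    lo = start
    while lo < end:
        hi = min(lo + timedelta(days=batch_days), end)
        yield lo, hi
        lo = hi

# The error message reported "at least 2372 partitions" -> two jobs.
start = date(2009, 1, 1)                 # illustrative start date
end = start + timedelta(days=2372)
for lo, hi in date_batches(start, end):
    print(f"WHERE pickup_datetime >= '{lo}' AND pickup_datetime < '{hi}'")
```

Each range stays under the per-job limit, and together the jobs cover the full table; the daily-modifications quota (5,000) still has to accommodate the total number of partitions written.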


5 Comments

That's too restrictive though. Imagine I have a standard table with 10 years worth of data and I want to partition it! With these limits I'd have to split it into two tables of 5 years.
Is there any workaround to this that doesn't involve slicing and dicing the original table?
The 2500 limit on a single table is something we are actively looking to increase across the board. At present, we are doing it on an as needed basis and are happy to do it for your project if you can provide us the project number.
Today this has been increased to 4000. cloud.google.com/bigquery/quotas#partitioned_tables
See the following question to see how to get around this limit by partitioning by week/month/year: stackoverflow.com/a/56125049/132438
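The comments above suggest coarser partition granularity (week/month/year) as a way around the limits. A quick back-of-the-envelope check shows why that works for the "10 years of data" scenario raised in the first comment:

```python
# Rough arithmetic behind the week/month/year workaround: a 10-year
# table partitioned by day brushes up against the limits, while the
# same table partitioned by month is nowhere near them.
years = 10
daily = years * 366      # worst case: ~3,660 daily partitions
monthly = years * 12     # 120 monthly partitions

print(daily, monthly)
# A single "select *" backfill into daily partitions exceeds the
# 2,000-partitions-per-job limit; into monthly partitions it does not.
print(daily > 2000, monthly > 2000)
```

So a 10-year daily-partitioned table fits under the 4,000-partition table limit, but no single query job can populate it in one shot; monthly partitioning sidesteps both constraints.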
