Idan February 2016

BigQuery "copy table" not working for small tables

I am trying to copy a BigQuery table to another table in the same dataset using the API. Copying big tables works just fine, but when copying small tables with only a handful of rows (1-10), the destination table comes out empty (it is created, but with 0 rows). I get the same result using both the API and the BigQuery management console.

The issue reproduces for any table in any dataset I have. It looks like either a bug or intended behavior.

I could not find any "minimum rows" restriction in the docs... am I missing something?
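For reference, the copy request I'm issuing looks roughly like this (a minimal sketch of the jobs.insert request body from the BigQuery v2 REST API; the project and dataset IDs here are illustrative, not my real ones):

```python
def build_copy_job_body(project_id, dataset_id, source_table, dest_table):
    """Build a jobs.insert request body for a table copy
    (configuration.copy shape from the BigQuery v2 REST API)."""
    return {
        "configuration": {
            "copy": {
                "sourceTable": {
                    "projectId": project_id,
                    "datasetId": dataset_id,
                    "tableId": source_table,
                },
                "destinationTable": {
                    "projectId": project_id,
                    "datasetId": dataset_id,
                    "tableId": dest_table,
                },
                # Create the destination table if it doesn't exist yet.
                "createDisposition": "CREATE_IF_NEEDED",
                "writeDisposition": "WRITE_TRUNCATE",
            }
        }
    }

copy_body = build_copy_job_body("my-project", "my_dataset",
                                "video_content_events", "copy111")
```

The job completes without errors; only the row count of the destination is wrong.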

EDIT: Screenshots

Original table: video_content_events with 2 rows

Copy table: copy111 with 0 rows

Answers


DoIT International February 2016

There is no minimum record count required to copy a table, whether within the same dataset or to a different one. This applies to both the API and the BigQuery UI. I just replicated your scenario by creating a new table with just 2 records, and I was able to successfully copy it to another table using the UI.

Attaching a screenshot.


Sean Chen February 2016

How are you populating the small tables? Are you perchance using streaming insert (the bq insert command-line tool, or the tabledata.insertAll API method)? If so, per the documentation, data can take up to 90 minutes to become copyable/exportable:

https://cloud.google.com/bigquery/streaming-data-into-bigquery#dataavailability
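To be concrete about which path I mean, a streaming insert as a REST request body looks roughly like this (a sketch of the tabledata.insertAll shape; row contents and insertIds are illustrative):

```python
def build_insert_all_body(rows):
    """Wrap rows in the tabledata.insertAll request format.
    The per-row insertId enables best-effort de-duplication server-side."""
    return {
        "kind": "bigquery#tableDataInsertAllRequest",
        "rows": [
            {"insertId": str(i), "json": row}
            for i, row in enumerate(rows)
        ],
    }

insert_req = build_insert_all_body([{"event": "play"}, {"event": "stop"}])
```

Rows inserted this way land in the streaming buffer first, not directly in managed storage.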

I won't get super detailed, but the reason is that our copy and export operations are optimized to work on materialized files. Data within our streaming buffers is stored in a completely different system, and thus isn't picked up until the buffers are flushed into the traditional storage mechanism. That said, we are working on removing the copy/export delay.

If you aren't using streaming insert to populate the table, then definitely contact support/file a bug here.

Post Status

Asked in February 2016
Viewed 2,712 times
Voted 10
Answered 2 times
