with BigQuery, export some data from BigQuery into this bucket. Using the
command-line client, execute:
$ bq extract bigquery-e2e:reference.zip_codes
gs://<bucket>/zip_codes.csv
Waiting on bqjob_r14fa7ecfc2d8c12d_00000140a539e06f_1
… (25s)
Current status: DONE
Refresh the content lists by reloading the page. You can see that a file named
zip_codes.csv appears in the bucket. Clicking this file downloads it to
your machine. This is how you can use Google Cloud Storage to get data in
and out of BigQuery.
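Behind the scenes, bq extract submits an extract job to the BigQuery jobs.insert API. As a rough sketch, the request body the command builds looks like the following (build_extract_config is a hypothetical helper for illustration, not part of any client library; the table and bucket names are the ones used above):

```python
# Sketch of the extract-job configuration that `bq extract` submits via
# the BigQuery jobs.insert API. Field names (sourceTable, destinationUris,
# destinationFormat) follow the BigQuery v2 REST API.

def build_extract_config(project_id, dataset_id, table_id, destination_uri):
    """Return a jobs.insert request body for a CSV extract job."""
    return {
        "configuration": {
            "extract": {
                "sourceTable": {
                    "projectId": project_id,
                    "datasetId": dataset_id,
                    "tableId": table_id,
                },
                # One or more gs:// URIs to write the exported data to.
                "destinationUris": [destination_uri],
                "destinationFormat": "CSV",
            }
        }
    }

config = build_extract_config(
    "bigquery-e2e", "reference", "zip_codes", "gs://<bucket>/zip_codes.csv")
```

The command-line transcript above is equivalent to inserting this job and polling it until its status reaches DONE.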
Like BigQuery, Google Cloud Storage has a Python-based command-line
tool that is convenient to use for some tasks. The tool is installed as part of
the Google Cloud SDK and uses the credentials you configured when you set
up the SDK. You can use the tool to inspect the file you created by extracting
data from BigQuery.
$ gsutil ls
gs://317752944021/
$ gsutil ls gs://317752944021/
gs://317752944021/zip_codes.csv
$ gsutil cat -r 0-300 gs://317752944021/zip_codes.csv
zip,type,primary_city,state,county,timezone,area_codes,latitude,longitu>
00501,UNIQUE,Holtsville,NY,Suffolk County,America/New_York,631,40.81,-7>
00544,UNIQUE,Holtsville,NY,Suffolk County,America/New_York,631,40.81,-7>
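Because the extracted file is ordinary CSV, it can also be inspected with standard tools once downloaded. A minimal sketch using Python's csv module (the sample row here is abbreviated to the fields fully visible in the transcript above, since the terminal truncates the rest):

```python
# Minimal sketch: reading rows from the extracted zip_codes.csv with the
# Python standard library. The in-memory sample stands in for the real
# downloaded file and includes only the fields visible in the transcript.
import csv
import io

sample = io.StringIO(
    "zip,type,primary_city,state,county\n"
    "00501,UNIQUE,Holtsville,NY,Suffolk County\n"
    "00544,UNIQUE,Holtsville,NY,Suffolk County\n"
)

# DictReader uses the header row as field names.
reader = csv.DictReader(sample)
rows = list(reader)
print(rows[0]["primary_city"])  # -> Holtsville
```

To read the real file, replace the StringIO object with `open("zip_codes.csv")`.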
Now you have both BigQuery and Google Cloud Storage set up with the
ability to move data between them. For additional information on Google
Cloud Storage, see the service documentation at
https://developers.google.com/storage/.
Development Environment
This section covers the additional libraries and tools required to work
through the code examples in this book. Because most of the examples in this book