destination table specified will do the trick. The only downside is that you
will have to pay for storage of, and queries over, your copy of the data
(queries over the original AdSense data are currently free). However, given
the volume of data involved, the cost should not be significant.
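As a minimal sketch of such a copy (the dataset and table names here are
hypothetical placeholders; the actual AdSense-backed table name depends on
your account):
# Hypothetical names: substitute your own destination dataset/table and
# the AdSense-backed source table.
$ bq query \
    --destination_table=my_dataset.adsense_copy \
    --allow_large_results \
    'SELECT * FROM [adsense_dataset.daily_report]'
Specifying --allow_large_results along with a destination table lets the
query save its full result set into a table you own, which you can then
query without limits on the source.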
Google Cloud Storage
Throughout this book we have used GCS as a way to move data in and
out of the Google Cloud. The most common use case for GCS is to host
static content in web applications and for sharing large binary files. In these
scenarios it is useful to inspect GCS access logs to understand how the
content is used. GCS supports configuring buckets so that operations on
objects in the bucket are eventually exported to a different logging bucket.
These files can be imported into BigQuery, so you can analyze GCS logs
without having to download and process the files. This section discusses
how to set up and manage this process.
Setting up logging in GCS involves choosing a bucket that will store your
logs and then using the gsutil tool to configure the bucket you want to
track so that its access and usage logs are written to the logging bucket.
The full details are available at:
https://developers.google.com/storage/docs/accesslogs
Here are the gsutil commands you need to start collecting logs:
$ LOG_BUCKET="bigquery-e2e"
$ gsutil mb gs://${LOG_BUCKET}
$ gsutil acl ch -g cloud-storage-analytics@google.com:W \
    gs://${LOG_BUCKET}
$ LOG_PREFIX="chapters/14/log"
$ SERVING_BUCKET="my-serving-bucket"
$ gsutil logging set on \
    -b gs://${LOG_BUCKET} \
    -o ${LOG_PREFIX} \
    gs://${SERVING_BUCKET}
The first command is required only if the logging bucket does not already
exist. The second command grants write access on the logging bucket to
cloud-storage-analytics@google.com, the service group that Google Cloud
Storage uses to deliver the logs. The final command turns on logging for
the serving bucket, directing its access and usage logs into the logging
bucket under the specified object prefix.
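To confirm the configuration took effect, you can read the logging
settings back with the same tool:
$ gsutil logging get gs://${SERVING_BUCKET}
This prints the logging bucket and object prefix currently applied to the
serving bucket.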
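As noted earlier, once log objects start appearing in the logging bucket,
they can be loaded straight into BigQuery. Here is a rough sketch (the
dataset name and date are illustrative; the schema file is the one Google
publishes for GCS usage logs, per the access logs documentation):
# The "logs" dataset and the date are illustrative placeholders.
$ wget http://storage.googleapis.com/pub/cloud_storage_usage_schema_v0.json
$ bq load --skip_leading_rows=1 \
    logs.usage_2014_01_15 \
    "gs://${LOG_BUCKET}/${LOG_PREFIX}_usage_2014_01_15*" \
    ./cloud_storage_usage_schema_v0.json
The --skip_leading_rows flag skips the CSV header row in each log file,
and the trailing wildcard picks up all of that day's usage log objects,
which are named with the configured prefix followed by _usage_ and a
timestamp.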