Waiting on bqjob_r1acbcee37ed9abeb_000001407034129d_1 … (26s)
Current status: DONE
$ curl -H "$(python auth.py)" \
  -H "If-None-Match: ${ETAG}" \
  "${TABLE_URL}?fields=etag,lastModifiedTime"
{
  "etag": "\"yBc8hy8wJ370nDaoIj0ElxNcWUg/rlB4v5eu0LBEfFBRBW7-oRFArSQ\"",
  "lastModifiedTime": "1376270939965"
}
Batch Requests
It is common to want to request a number of results from an API in parallel.
Maybe you have a list of jobs or tables that you'd like to look up. Although
you could do this by setting up and sending multiple API requests, there is
also a built-in Google API mechanism called a batch request. Batch requests
enable you to send multiple separate requests and get the responses back for
all of them at once.
The raw HTTP mechanism for batch requests is a little bit tricky. You can
read more about it at https://developers.google.com/api-client-library/python/guide/batch.
Here you see just the Python version, which wraps a lot of the complexity
in an easy-to-use API.
Creating a batch request is simple; just create a new
BatchHttpRequest(), add the individual API requests you want to make
to the batch request, and then call execute(). If you provide a callback
method, you get one callback for each response, providing the ID of the
request and the response. This can be convenient so that you don't have to
try to separate the responses on your own.
The following Python code can read the metadata for two tables in a single
HTTP request:
>>> from apiclient.http import BatchHttpRequest
>>> batch = BatchHttpRequest()
>>> def batch_callback(request_id, response, exception):
...   if exception is not None:
...     print "Exception: %s" % (exception,)
...   else:
...     print "Request %s: %s" % (request_id, response)