Sunday, 24 January 2016

Google API Access Using OAuth 2.0, Python and Raspberry Pi

This is a post about using Python to retrieve data from a Google API.  For a long time I've been aware that there were lots of Google APIs that gave access to lots of delicious data but I've not got around to playing with them.  Finally I found the time.

I decided to try to get access to the Blogger API (i.e. to give information about this blog!) using OAuth 2.0 and Python.  The process of accessing the API is similar to that for the Fitbit API I blogged about recently.  So in simple terms the procedure is:

  1. Register your app, specify access to the Blogger API and get your credentials.
  2. Use OAuth 2.0 to get permission from the user to access their data (and get an authorisation code).
  3. Swap your authorisation code for access and refresh tokens and use them to access the API.
  4. Periodically get a new access token using the refresh token.
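The google-api-python-client module used later in this post handles step 4 for you, but it's worth seeing what it's doing under the hood.  Here's a minimal sketch of the refresh exchange: a POST to Google's OAuth 2.0 token endpoint with your credentials and the refresh token.  The client ID, secret and token values below are obviously placeholders:

```python
# Sketch of step 4: building the POST request that swaps a refresh
# token for a fresh access token. Values are placeholders.
try:
    from urllib.parse import urlencode   # Python 3
except ImportError:
    from urllib import urlencode         # Python 2 (as on my Pi)

TOKEN_URL = 'https://www.googleapis.com/oauth2/v4/token'

def build_refresh_request(client_id, client_secret, refresh_token):
    """Return the URL and form-encoded body for a token refresh POST."""
    payload = {
        'client_id': client_id,
        'client_secret': client_secret,
        'refresh_token': refresh_token,
        'grant_type': 'refresh_token',
    }
    return TOKEN_URL, urlencode(payload)

url, body = build_refresh_request('my-id', 'my-secret', 'my-refresh-token')
print(url)
```

POSTing that body to the token URL gets you back a JSON document containing a new access token and its expiry time; the library does exactly this for you when the old token expires.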

On their overview pages, Google recommend the use of pre-defined libraries to authenticate and access the APIs.  Who am I to argue, so this is what I did!

Part 1 - Register Your App
To do this go to the Google Developer Console and log in (I assume you've got a Google account).  You'll see a screen that looks like the image below.  Click on the project list and select "Create a project":

From the Developer Console select "Enable and manage APIs".  Select the API you want (in my case Blogger V3) and then "Enable API".

You'll then be prompted to create credentials for the project (these are the OAuth 2.0 credentials).  To do this I basically followed the Wizard that Google takes you through to specify what you want.  In summary this was:
  • Select "Go to Credentials"
  • When asked "Where will you be calling the API from?" specified "Other UI".
  • When asked "What data will you be accessing?" specified "User data".
  • This told me I needed OAuth 2.0 credentials and so I clicked "Create client ID".
This took me to a place where I could define what details the user (in this case always me) would see when they're asked to authenticate access from my app.  I specified "Blogger API Application" as the name of the application.

Click "Continue" and your OAUTH2.0 credentials are created.  At this point there's the option to download a file containing your credentials.  Do this and rename the file to be "client_secrets.json", (you'll need this later).  

Part 2 - Everything Else!
As stated before, the nice people from Google recommend that you use pre-defined software modules to authenticate and access the API.  Sounds like a cunning plan so, as a Python man, the first thing I did was run this command on my Raspberry Pi to download and install the Python module for Google API access:

sudo pip install --upgrade google-api-python-client

(I'm pretty sure pip was either already installed on my Pi or I installed it in the dim and distant past).

I then created a directory to hold my Python script to authenticate and use the API.  For me this was:


In this directory I placed the client_secrets.json file I downloaded in the earlier step.

I then set about writing the Python script required to authenticate and access the API.  However to my resounding joy I found that there were loads of pre-written scripts on the interweb, including one for the blogger API.  Never one to look a gift horse in the mouth* I set about cribbing** a pre-written script.

(*-An English saying meaning if someone offers you something then take it.  **-Another English saying meaning flagrant copying).

There are stacks of documentation here telling you how to use the Google Python module for API access.  After a quick look at this I selected "Samples" then "Sample Applications" which took me to a Github page.  On here there is a stack of pre-written Python scripts for Google API access including one for Blogger API use.  It's written by Joe Gregorio, so all credit to him and zero credit to me.

I copied the script and pasted it into a Nano editor.  To create this file I did:

sudo nano

Then I ran the script using the command:

sudo python --noauth_local_webserver

(The --noauth_local_webserver switch was because I was running the script from a remote SSH session).

Here's a screenshot of what I saw in the SSH session:

So I've redacted some sensitive stuff in the image above, but you copy the URL that the script presents and paste it into a browser.  You're shown a page like the one below and you click "Allow" to permit access from your application to your data:

You're then served a page with an authorisation code on it (redacted below).

Copy this and paste it back onto the command line for the Python script.  The script then continues, accesses the API and prints the result!

So there you go.  The script looks at my Blogger account, gets the blogs object for my user object, then for each of my blog objects prints out all the post objects.

If you look in the directory from which you ran the script you can see that a file called blogger.dat is created.  This contains your current access and refresh tokens and so is used and overwritten when new tokens are needed.
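If you're curious about what's inside blogger.dat, it's a JSON document written by the library's credential storage.  Here's a sketch of how you could peek at the tokens it holds; the field names follow oauth2client's storage format, and the values in the sample are made up for illustration:

```python
import json

# A sample of the sort of JSON kept in blogger.dat (illustrative
# values only - field names follow oauth2client's storage format).
sample = '''{
  "access_token": "ya29.EXAMPLE",
  "refresh_token": "1/EXAMPLE",
  "token_expiry": "2016-01-24T12:00:00Z",
  "invalid": false
}'''

def peek_tokens(raw):
    """Pull the interesting fields out of a blogger.dat-style file."""
    creds = json.loads(raw)
    return creds['access_token'], creds['refresh_token'], creds['token_expiry']

access, refresh, expiry = peek_tokens(sample)
print('Access token expires at %s' % expiry)
```

In practice you'd read the file with `open('blogger.dat').read()` rather than a hard-coded string, but don't go hand-editing it: the library owns that file and rewrites it whenever it refreshes the tokens.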

To show I'm not a complete cribber and to learn a bit more I set about adding extra lines to the script to get stats relating to my blog.

For this you need the Python API module reference which is here.  Here's a screen shot:

In the script you can see an object called "service" is created using this line:

service, flags = sample_tools.init(
      argv, 'blogger', 'v3', __doc__, __file__,
      scope='https://www.googleapis.com/auth/blogger')

This can then be used to create objects to access different methods within the API.  Such as:

blogs = service.blogs()

...which is then used to get the list of blogs on the user's Blogger account using:

# Retrieve the list of Blogs this user has write privileges on
thisusersblogs = blogs.listByUser(userId='self').execute()

Then it iterates through each blog using:

# List the posts for each blog this user has
for blog in thisusersblogs['items']:

So before it does this you can create an object for page views using:

pageviews = service.pageViews()

(See the Python module documentation referenced above.)  Then within the "for blog" loop you can do:

print('The stats for %s:' % blog['name'])
request = pageviews.get(blogId=blog['id'],range='all')
views_doc = request.execute()
print (views_doc)

The range='all' parameter specifies that you want stats for the lifetime of the blog.  The options are:

      30DAYS - Page view counts from the last thirty days.
      7DAYS - Page view counts from the last seven days.
      all - Total page view counts from all time.

...and overall you get output like this:

The stats for Paul's Geek Dad Blog:
{u'kind': u'blogger#page_views', u'counts': [{u'count': u'94521', u'timeRange': u'ALL_TIME'}], u'blogId': u'123456678902'}

The stats for Paul's Blog:
{u'kind': u'blogger#page_views', u'counts': [{u'count': u'22', u'timeRange': u'ALL_TIME'}], u'blogId': u'123456678902'}
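Note that the counts come back as strings inside a list, so to do anything numeric with them (graphs, totals and the like) you need to dig them out.  A small sketch using the structure of the output above, with the placeholder values as printed:

```python
# The pageViews result structure from above (placeholder values).
views_doc = {
    u'kind': u'blogger#page_views',
    u'blogId': u'123456678902',
    u'counts': [{u'count': u'94521', u'timeRange': u'ALL_TIME'}],
}

def total_views(doc, time_range='ALL_TIME'):
    """Return the page view count for the given timeRange as an int."""
    for entry in doc['counts']:
        if entry['timeRange'] == time_range:
            return int(entry['count'])
    return 0  # no entry for that range

print('All-time views: %d' % total_views(views_doc))
```

From there it's a short step to, say, logging the count to a file each day and plotting how the blog is doing over time.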