Hi fellow geeks, today we’ll be using Flask, a web framework written in Python for developing web applications, and Flask-RESTful for creating a RESTful API that saves and retrieves data from Azure Storage.

While we could write our API using Flask alone, Flask-RESTful provides additional features that aid in the quick implementation of REST-based services and more. Let’s get started.

We’ll be creating this application on a Windows machine, but regardless of which operating system you use, a virtual environment (installable as a Python package) for isolating your dependencies is a wonderful way to develop applications in Python.

To configure a virtual environment for the operating system of your choice, visit here and set it up. A typical prompt after activation looks something like this:

(Your environment name) C:\Users\

We’ve already set one up, so we’ll run the command below to install the needed Python dependencies.

pip install flask flask-restful pillow azure azure-storage

Please note that to install azure on a Windows machine you might first have to install the Microsoft Visual C++ Compiler for Python, which can be downloaded from here. We’ll be using Visual Studio Code for developing this API, so if you haven’t installed it yet I’d highly recommend that you do; you can download the Visual Studio Code setup from here. After installing it, you may have to configure Python support in Visual Studio Code; to do that, take a look at this url. Finally, you’ll also need to configure Visual Studio Code to debug Flask applications so that you can step in and out and inspect objects in greater detail; for configuring Flask debugging, visit this url. That takes care of all the necessary configuration.

We’ll be creating this API to hold a user registration model in Azure Storage, saving the full record in Azure blobs. We’ll also save the data to an Azure table for lookups and searches, so that we can find a particular user quickly. And quick means exactly that: “QUICK”. Azure Table storage is unlike any relational table store. The important concept to get your head around is that there are no relationships here, and data can be saved redundantly (though relational-style modeling can be approximated as well; see here and here). In an Azure table, each entity is addressed by a combination of two keys: a partition key and a row key. The partition key in fact determines how the cloud storage system scales and distributes data, so it is important to think about and decide upfront which properties we would use for these keys. Anyway, our model schema could be something simple like this:

User Model

  • User Name
  • First Name
  • Last Name
  • Email Address
  • Cell No
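The mapping from this model to an Azure Table entity can be sketched as below, where the user name serves as both PartitionKey and RowKey, just as the full listing later does (the sample values are made up for illustration; a larger app might partition by something like region or tenant instead):

```python
def to_table_entity(user):
    """Map the user model to an Azure Table entity dict.

    Using userName for both keys makes a point lookup by user name
    a single-entity read, the fastest query Azure Table storage offers.
    """
    return {
        'PartitionKey': user['userName'],
        'RowKey': user['userName'],
        'FirstName': user['firstName'],
        'LastName': user['lastName'],
        'EmailAddress': user['emailAddress'],
        'CellNo': user['cellNo'],
    }

entity = to_table_entity({
    'userName': 'jdoe', 'firstName': 'John', 'lastName': 'Doe',
    'emailAddress': 'jdoe@example.com', 'cellNo': '555-0199',
})
```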

It is worth mentioning that Azure Table storage isn’t the only means of storage available in Azure; we can also use blobs and file shares to save data. In our demo application, we’ll be saving a JSON file to hold our model data. Azure blobs can hold any type of data, whether audio or video files, JSON or text files, and so forth. Blobs come in three types, namely block blobs, page blobs, and append blobs; we’ll be using block blobs in our application. To save a blob we first need to create a container which will hold these files. To create the blob container and table we can either use the Azure portal or the Azure Storage Explorer software (download it from here).
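Before the upload itself (shown in the full listing later), the blob name and JSON payload for a user can be derived with nothing but the standard library. This is only a sketch with made-up sample values:

```python
import json

def to_blob(user):
    """Derive the blob name and JSON payload for a user model.

    Naming the blob after the user (e.g. 'jdoe.json') lets a later
    GET list the container and load each user's file individually.
    """
    blob_name = user['userName'] + '.json'
    payload = json.dumps(user)
    return blob_name, payload

name, payload = to_blob({'userName': 'jdoe', 'firstName': 'John',
                         'lastName': 'Doe',
                         'emailAddress': 'jdoe@example.com',
                         'cellNo': '555-0199'})
```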

If you’re creating the storage account using the portal, this can be done by going to All Resources -> Storage -> Storage Account – blob, file, table, queue. You’ll have to name your storage account; I’ll be naming mine “pythonstorageaccount”, and I’ll also create a new resource group to hold this storage account, named “Peyton resource group“. Additionally, remember to check Pin to dashboard so that you can quickly navigate to it in the future. Click the pin on the dashboard to get to your “pythonstorageaccount“, then click Access keys; from here you can get your connection strings and account keys. This key can be added in Azure Storage Explorer by right-clicking the “Local and attached” accounts tree and selecting Connect to Azure storage. This will let you use Storage Explorer to verify your storage operations, or even perform them, from your desktop. Next, right-click Blob Containers in Storage Explorer and create a container named “pythonblobstorecontainer“. That’s it, we’re ready to upload a new blob into this container.
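If you’d rather script the container creation than click through Storage Explorer, the same step can be done from Python with the azure-storage SDK we installed earlier. This is a sketch; plug in your own account name and key from the Access keys blade:

```python
def ensure_container(account_name, account_key,
                     container='pythonblobstorecontainer'):
    """Create the blob container if it doesn't already exist.

    Mirrors the right-click -> Create Blob Container step in Storage
    Explorer. The import is deferred so this sketch loads even where
    azure-storage isn't installed.
    """
    from azure.storage.blob import BlockBlobService
    svc = BlockBlobService(account_name=account_name,
                           account_key=account_key)
    # returns True if the container was newly created, False if it existed
    return svc.create_container(container)
```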

Here’s the complete code listing:

from flask import Flask, jsonify, request
from flask_restful import Resource, Api
import json
from azure.storage.blob import BlockBlobService
from azure.storage.table import TableService, Entity

accountname = 'pythonstorageaccount'
accountkey = ''  # your storage account key from the Access keys blade
tablename = 'flaskrestdemo'

block_blob_service = BlockBlobService(account_name=accountname, account_key=accountkey)
table_service = TableService(account_name=accountname, account_key=accountkey)

app = Flask(__name__)
app.config['DEBUG'] = True
api = Api(app)

class FlaskRestApi(Resource):

    container = 'pythonblobstorecontainer'
    def get(self):
        bloblist = []
        generator = block_blob_service.list_blobs(self.container)
        for blob in generator:
            data = block_blob_service.get_blob_to_text(self.container, blob.name)
            user = json.loads(str(data.content))
            bloblist.append({
                'user': user
            })
        blobres = bloblist
        tasklist = []
        tasks = table_service.query_entities(tablename, filter=None, select=None)
        for task in tasks:
            tasklist.append({
                'UserName': task['RowKey'],
                'FirstName': task['FirstName'],
                'LastName': task['LastName'],
                'EmailAddress': task['EmailAddress'],
                'CellNo': task['CellNo']
            })
        tableres = tasklist
        return jsonify({
            'BlobResponse': blobres,
            'TableResponse': tableres
        })
    def post(self):
        res = {}
        try:
            # get the posted user data
            userdata = request.get_data(as_text=True)
            userModel = json.loads(userdata)
            # save to blob
            blob_file_name = userModel['userName'] + '.json'
            block_blob_service.create_blob_from_text(self.container, blob_file_name, userdata)

            # save to table
            task = Entity()
            task.PartitionKey = userModel['userName']
            task.RowKey = userModel['userName']
            task.FirstName = userModel['firstName']
            task.LastName = userModel['lastName']
            task.EmailAddress = userModel['emailAddress']
            task.CellNo = userModel['cellNo']
            table_service.insert_entity(tablename, task)

            res = {
                'message': 'save successful'
            }
        except Exception:
            res = {
                'message': 'save failed'
            }
        return jsonify(res)
    def put(self):
        # get the updated user data
        userModel = json.loads(request.get_data(as_text=True))
        # save to table
        task = Entity()
        task.PartitionKey = userModel['userName']
        task.RowKey = userModel['userName']
        task.FirstName = userModel['firstName']
        task.LastName = userModel['lastName']
        task.EmailAddress = userModel['emailAddress']
        task.CellNo = userModel['cellNo']
        table_service.update_entity(tablename, task, if_match='*')
        return jsonify({'message': 'update successful'})

api.add_resource(FlaskRestApi, '/flaskrestapi', endpoint = 'flaskrest')

if __name__ == '__main__':
    app.run(port=8080)
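With the server running on port 8080, the API can be exercised from another Python process using only the standard library. The user values below are made up for illustration, and register_user isn’t called here since it needs the server to be up:

```python
import json
from urllib import request as urlrequest

# hypothetical registration payload matching the fields post() expects
user = {
    'userName': 'pkrebs',
    'firstName': 'Peyton',
    'lastName': 'Krebs',
    'emailAddress': 'pkrebs@example.com',
    'cellNo': '555-0100',
}
body = json.dumps(user).encode('utf-8')

def register_user(url='http://localhost:8080/flaskrestapi'):
    """POST the user to the running API and return its JSON response."""
    req = urlrequest.Request(url, data=body,
                             headers={'Content-Type': 'application/json'})
    with urlrequest.urlopen(req) as resp:
        return json.loads(resp.read().decode('utf-8'))
```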

That’s it for today’s blog; we’ll be covering more interesting topics in the future. Till then, happy coding!!


By |2018-10-25T08:03:50+00:00September 3rd, 2018|Technology|0 Comments
