Running Dredd Hooks as a Flask App in the Open Event Server

The Open Event Server has been based on the Flask micro-framework since its initial phases. After implementing the API documentation, we decided to implement Dredd testing for the Open Event API.

After isolating each request in Dredd testing, the real challenge was binding the database engine to the Dredd hooks. Since we have been using the Flask-SQLAlchemy db.Model base class for building all the models, Flask, being a micro-framework itself, came to our rescue: we could easily bind the database engine to a Flask app. Conventionally, Dredd hooks are written in pure Python, but we will be running them as a self-contained Flask app.

How do we initialise this Flask app in our Dredd hooks? The Flask app can be initialised in the before_all hook as shown below:

def before_all(transaction):
    app = Flask(__name__)
    app.config.from_object('config.TestingConfig')

The database can be bound to the app as follows:

def before_all(transaction):
    app = Flask(__name__)
    app.config.from_object('config.TestingConfig')
    db.init_app(app)
    Migrate(app, db)

The challenge now is how to bind the application context when applying the database fixtures. In a normal Flask application this can be done as follows:

with app.app_context():
    # perform your operation

While for unit tests in Python:

with app.test_request_context():
    # perform tests

But as all the hooks are separate from each other, dredd-hooks-python supports the idea of a single stash object where you can store all the desired variables (it does not have to be a list, nor does it have to be named stash).

The app and db can be added to stash as shown below:

@hooks.before_all
def before_all(transaction):
    app = Flask(__name__)
    app.config.from_object('config.TestingConfig')
    db.init_app(app)
    Migrate(app, db)
    stash['app'] = app
    stash['db'] = db

These variables stored in the stash can then be used efficiently, as below:

@hooks.before_each
def before_each(transaction):
    with stash['app'].app_context():
        db.engine.execute("drop schema if exists public cascade")
        db.engine.execute("create schema public")
        db.create_all()

and in many other such hooks.
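
For example, a minimal sketch (an assumption for illustration, not the project's actual hooks file) of a teardown hook that reuses the same stash once the whole Dredd run has finished:

@hooks.after_all
def after_all(transaction):
    # reuse the app stored in the stash by before_all
    with stash['app'].app_context():
        db.session.remove()
        # drop the tables created for the test run
        db.drop_all()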

Related Links:
1. Testing Your API Documentation With Dredd: https://matthewdaly.co.uk/blog/2016/08/08/testing-your-api-documentation-with-dredd/
2. Dredd Tutorial: https://help.apiary.io/api_101/dredd-tutorial/
3. Dredd Docs: http://dredd.readthedocs.io/


Building the Meilix Generator with Flask

Meilix Generator is a web app which is used to trigger the Travis build of Meilix and mail the user the link to the generated ISO. The Meilix Generator web app is based on Flask. This blog shows how easy it is to build a web app, render HTML templates in it, and call and pass data to various functions. Here I used Flask, the Python framework, to render the HTML templates and handle requests for various purposes (mentioned later in the article) without coding everything from scratch, thanks to the import facilities of Flask.

What is Flask?

Flask is a Python micro web framework based on Werkzeug and the Jinja 2 template engine. It is used as the backbone of the web app. It gives us a whole set of Python tools with which we can easily build a web app. It is "micro" because it ships with no tools or libraries of its own. It comes with minimal requirements, and whoever needs more can import different libraries and use them. I used several imports for Meilix Generator, such as render_template, send_from_directory, etc.

Implementation (The use case in Meilix Generator)

First of all, the installation process: we will do the installation in a virtual environment. We prefer a virtual environment to isolate the Python working environment, since some programs require different Python versions to work.
Install virtual environment 

sudo pip install virtualenv

Now go to the project folder, create a virtual environment there (for example with virtualenv venv) and activate it using

. venv/bin/activate

Now install Flask

pip install flask
Creating your project

Now it’s time to create a simple project in the directory.
Let's use HTML for the frontend. In the folder, create styles.css for styling and an index.html template for the frontend of the page. We will make one app.py file, which would look similar to this:

from flask import Flask, render_template

app = Flask(__name__)

@app.route('/')
def index():
    """Index page"""
    return render_template("index.html")

if __name__ == '__main__':
    app.run()

Flask looks for the / (root) path, and here the root returns the main template (index.html) from the index function.

Running it to view the page:

export FLASK_DEBUG=1 FLASK_APP=app.py
flask run

You will find your page at http://127.0.0.1:5000

More options (how else it can help you)

  • Add more HTML template options and refer to them in app.py
  • Easily use the GitHub API from a different .py file (this file should be imported into app.py) to fetch data, e.g. from https://api.github.com/users/user_name, which returns the user's name, repos, followers and much more useful information, as sketched below.
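
Below is a minimal, hedged sketch of such a helper module (the file name github_info.py and the function name get_user_info are placeholders, not the actual Meilix Generator code); it only assumes the requests library and the public GitHub users endpoint:

# github_info.py -- hypothetical helper module
import requests

def get_user_info(user_name):
    """Fetch a GitHub user's public profile and return a few fields."""
    response = requests.get("https://api.github.com/users/" + user_name)
    data = response.json()
    return {
        "name": data.get("name"),
        "public_repos": data.get("public_repos"),
        "followers": data.get("followers"),
    }

In app.py this could then be imported with from github_info import get_user_info and its result passed to render_template().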

How I used this idea for FOSSASIA (Meilix Generator)

I used Flask as the backbone of the Meilix Generator project. First, I used from ... import statements to pull in the various libraries needed for the project and then wrote several functions around them. Let's understand the concept using a few examples:

from flask import Flask, render_template

@app.route('/about')
def about():
    # About page
    return render_template("about.html")

or

from flask import Flask, send_from_directory

@app.route('/uploads/<filename>')
def uploaded_file(filename):
    return send_from_directory(app.config['UPLOAD_FOLDER'], filename)

For more details, the file app.py can be found here in the Meilix Generator repository, where we used the above idea.

Important Links and Repositories:


Mailing Attachments Using Terminal in Open Event Android

The latest version of the Open Event Android App Generator, v2, lacked the feature of mailing the generated APK to the email ID that is entered at the start of the app generation process. This also includes mailing the error logs in case of APK generation failure.

This is an important feature for the app generator because the app generation process takes time. Users would otherwise have to wait for the app to be generated before they can download the APK. To avoid this, the generator can automatically email the APK as soon as it is generated.

I took up this issue a few days back and started working on it. I started by thinking about the ways it could be implemented, which required some discussions with the mentors and co-developers. We finalised the following options:

  • Using Sendgrid
  • Using SMTP

I will be discussing the implementation of both of them in this blog. The code for APK mailing starts with the call to Notification.send in generator.py:

if completed and apk_path and not error:
   Notification.send(
       to=self.creator_email,
       subject='Your android application for %s has been generated ' % self.event_name,
       message='Hi,<br><br>'
               'Your android application for the \'%s\' event has been generated. '
               'And apk file has been attached along with this email.<br><br>'
               'Thanks,<br>'
               'Open Event App Generator' % self.event_name,
       file_attachment=apk_path,
       via_api=self.via_api
   )
else:
   Notification.send(
       to=self.creator_email,
       subject='Your android application for %s could not be generated ' % self.event_name,
       message='Hi,<br><br> '
               'Your android application for the \'%s\' event could not be generated. '
               'The error message has been provided below.<br><br>'
               '<code>%s</code><br><br>'
               'Thanks,<br>'
               'Open Event App Generator' % (self.event_name, str(error) if error else ''),
       file_attachment=apk_path,
       via_api=self.via_api
   )

This leads me to the Notification class in Notification.py. It has three functions:

1. send(to, subject, message, file_attachment, via_api)
2. send_mail_via_smtp(payload)
3. send_email_via_sendgrid(payload)

As the name suggests, the first function:

send(to, subject, message, file_attachment, via_api)

mainly decides which service (out of SMTP and Sendgrid) should be used to send the email, on the basis of the input parameters (especially the 'EMAIL_SERVICE' parameter that has to be set in config.py).
The function signature looks as follows:

send(to, subject, message, file_attachment, via_api)
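
A minimal, hedged sketch of what the dispatch inside send() might look like (the payload keys follow the attributes used by the SMTP function below; the FROM_EMAIL config key and the exact structure are assumptions, not the repository's actual code):

@staticmethod
def send(to, subject, message, file_attachment=None, via_api=True):
    # assemble a payload dict that both mailing back-ends understand
    payload = {
        'from': current_app.config['FROM_EMAIL'],  # assumed config key
        'to': to,
        'subject': subject,
        'message': message,
        'attachment': file_attachment,
    }
    # EMAIL_SERVICE in config.py decides which back-end is used
    if current_app.config['EMAIL_SERVICE'] == 'smtp':
        Notification.send_mail_via_smtp(payload)
    else:
        Notification.send_email_via_sendgrid(payload)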

It is in send() that the other two functions are called. If the email service is smtp, it calls Notification.send_mail_via_smtp(payload); otherwise, Notification.send_email_via_sendgrid(payload) is called.
The Sendgrid function is pretty straightforward:

@staticmethod
def send_email_via_sendgrid(payload):

   key = current_app.config['SENDGRID_KEY']
   if not key:
       logger.info('Sendgrid key not defined')
       return
   headers = {
       "Authorization": ("Bearer " + key)
   }
   requests.post(
       "https://api.sendgrid.com/api/mail.send.json",
       data=payload,
       headers=headers
   )

It requires a personal Sendgrid key, which is read from the config.py file. Apart from that, it handles some errors by writing logs in the Celery tasks. The main line in the function that initiates the email is a POST request made using the Python library 'requests'. The request is made as follows:

 requests.post(
       "https://api.sendgrid.com/api/mail.send.json",
       data=payload,
       headers=headers
   )

The send_mail_via_smtp(payload) function looks for some configuration values before sending the mail:

@staticmethod
def send_mail_via_smtp(payload):
   """
   Send email via SMTP
   :param config:
   :param payload:
   :return:
   """
   smtp_encryption = current_app.config['SMTP_ENCRYPTION']
   if smtp_encryption == 'tls':
       smtp_encryption = 'required'
   elif smtp_encryption == 'ssl':
       smtp_encryption = 'ssl'
   elif smtp_encryption == 'tls_optional':
       smtp_encryption = 'optional'
   else:
       smtp_encryption = 'none'
   config = {
       'host': current_app.config['SMTP_HOST'],
       'username': current_app.config['SMTP_USERNAME'],
       'password': current_app.config['SMTP_PASSWORD'],
       'encryption': smtp_encryption,
       'port': current_app.config['SMTP_PORT'],
   }
   mailer_config = {
       'transport': {
           'use': 'smtp',
           'host': config['host'],
           'username': config['username'],
           'password': config['password'],
           'tls': config['encryption'],
           'port': config['port']
       }
   }

   mailer = Mailer(mailer_config)
   mailer.start()
   message = Message(author=payload['from'], to=payload['to'])
   message.subject = payload['subject']
   message.plain = strip_tags(payload['message'])
   message.rich = payload['message']
   message.attach(payload['attachment'])
   mailer.send(message)
   mailer.stop()

It uses the Marrow Mailer Python library to send emails with attachments (the APK). This Python library can be installed using
pip install marrow.mailer
To use Marrow Mailer, you instantiate a marrow.mailer.Mailer object with the configuration, then pass Message instances to the Mailer instance's send() method.

You can refer to the following guides for more information about sending emails through the command line:
https://github.com/marrow/mailer is the official repository of Marrow Mailer.
https://pypi.python.org/pypi/marrow.mailer
More detailed information on sending emails using Sendgrid can be found here: https://www.daveperrett.com/articles/2013/03/19/setting-up-sendmail-with-sendgrid-on-ubuntu/


Keep updating Build status in Meilix Generator

One of the problems we faced while working on Meilix Generator was providing the user with the status of the custom ISO build in the Meilix Generator web app, so we came up with the idea of checking the status of the link generated by the web app. If the link is available, the status code will be 200; otherwise it will be 404.

We have used a Python script to check the status of the URL. For generating the URL, we use the tag name as a variable to build the URL of the unique event the user wants the ISO for, and the date helps in the generation of the link; the rest of the link remains the same.

import os
import datetime

tag = os.environ["TRAVIS_TAG"]
date = datetime.datetime.now().strftime('%Y%m%d')
url = "https://github.com/xeon-zolt/meilix/releases/download/" + tag + "/meilix-zesty-" + date + "-i386.iso"

Now we will use urllib to monitor the status of the link.

# urllib imports (Python 3)
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def status():
    req = Request(url)
    try:
        response = urlopen(req)
    except HTTPError as e:
        return 'Building Your Iso'
    except URLError as e:
        return 'We failed to reach the server.'
    else:
        return 'Build Successful : ' + url

After checking the status, the next step was to update it dynamically on the status page.

So we'll use a status function in the Flask app, which is queried by JavaScript to get the status of the link at regular intervals.

Flask :

@app.route('/now')
def status_url():
    return (status())

 

JavaScript:

<script type="text/javascript">
let url = "/now"

function getstatus(url) {
    fetch(url).then(function(response) {
        return response.text()
    }).then(function(text) {
        console.log("status", text)
        document.querySelector("div#status").innerHTML = text
    })
}

window.onload = function() {
    // show the status once on load, then refresh it every 30 seconds
    getstatus(url)
    window.setInterval(getstatus.bind(null, url), 30 * 1000)
}
</script>

This covers the various steps to inform the user whether the build is ready or not.

Resource


Flask App to Upload Wallpaper On the Server for Meilix Generator

We had the problem of getting a wallpaper from the user through Meilix Generator and using that wallpaper with the Meilix build scripts to generate the ISO. So the wallpaper had to be hosted on the server and downloaded by Travis CI during the build to include it in the ISO.

A solution is to render HTML templates and access data sent by POST using the request object from Flask. redirect and url_for will be used to redirect the user once the upload is done, and send_from_directory will help us host the file that the user just uploaded under /uploads, from where it will be downloaded by Travis for building the ISO.

We start by creating the HTML form marked with enctype=multipart/form-data.

<form action="upload" method="post" enctype="multipart/form-data">
        <input type="file" name="file"><br /><br />
        <input type="submit" value="Upload">
 </form>

 

First, we need to import the required modules. The most important is werkzeug's secure_filename().

import os
from flask import Flask, render_template, request, redirect, url_for, send_from_directory
from werkzeug.utils import secure_filename

app = Flask(__name__)

Now, we'll define where to upload files and the types of files allowed for uploading. The path to the upload directory on the server is defined in app.config['UPLOAD_FOLDER'], which is uploads/ here.

app.config['UPLOAD_FOLDER'] = 'uploads/'
app.config['ALLOWED_EXTENSIONS'] = set(['png', 'jpg', 'jpeg'])

 

This function checks for a valid extension for the wallpaper, which in this case means png, jpg or jpeg, as defined above in app.config.

def allowed_file(filename):
    return '.' in filename and \
           filename.rsplit('.', 1)[1] in app.config['ALLOWED_EXTENSIONS']

 

After getting the name of the uploaded file from the user, we use the above function to check that it is an allowed file type and store the name in the variable filename; after that, the file is moved to the upload folder to save it.

The upload function checks that the file name is safe and removes unsupported characters (using secure_filename()), and then moves the file from a temporary location to the upload folder. After moving, it renames the file to 'wallpaper' so that the download link is always the same; we have used this link in the Meilix build script to download the wallpaper from the server.

@app.route('/upload', methods=['POST'])
def upload():
    file = request.files['file']
    if file and allowed_file(file.filename):
        filename = secure_filename(file.filename)
        file.save(os.path.join(app.config['UPLOAD_FOLDER'], filename))
        # rename the saved file so the download link is always the same
        os.rename(os.path.join(app.config['UPLOAD_FOLDER'], filename),
                  os.path.join(app.config['UPLOAD_FOLDER'], 'wallpaper'))
        filename = 'wallpaper'

At this point, we have only uploaded the wallpaper and renamed the uploaded file to 'wallpaper'. We cannot access the file outside the server; that would result in a 403 error. To make it available, the uploaded file needs to be registered and then hosted using the code snippet below.

We can also register uploaded_file as build_only rule and use the SharedDataMiddleware.

@app.route('/uploads/<filename>')
def uploaded_file(filename):
    return send_from_directory(app.config['UPLOAD_FOLDER'],filename)

The hosted wallpaper is used by Meilix in Travis CI to generate the ISO, using the download link, which stays the same for every uploaded wallpaper.

Why should we use the secure_filename() function?

Just imagine someone sends the following information as the filename to your app:

filename = "../../../../home/username/.sh"

 

If the number of ../ segments is correct and you joined this with your UPLOAD_FOLDER, the attacker might be able to modify a file on the server's filesystem that he or she should not be able to modify.

Now, let's look at how the function works:

>>> secure_filename('../../../../home/username/.sh')
'home_username_.sh'

Improving the uploads

We can add validation on the size of the file to be uploaded, so that if a user tries to upload a file that is too big it does not increase the load on the server.

from flask import Flask, Request
app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 16 * 1024 * 1024

Resources


Managing Related Endpoints in Permission Manager of Open Event API Server

The Open Event API Server has its own permission manager to manage permissions for the different endpoints, and some of the remaining gaps were filled by the new helper method has_access. The next challenge for the permission manager was to support the situation where many related endpoints point to the same resource.
Example:

  • /users-events-roles/<int:users_events_role_id>/user or
  • /event-invoices/<int:event_invoice_id>/user

Both endpoints point to the Users API, where they fetch the record of a single user, and for this we apply the permission "is_user_itself". This permission ensures that the logged-in user is the same user whose record is requested through the API, and for this we need the "user_id" as the "id" in the permission function is_user_itself.
Thus there was a need to add the ability in the permission manager to fetch this user_id from different models for different endpoints. For example, if we consider the above endpoints, we need the ability to get user_id from the UsersEventsRole and EventInvoice models and pass it to the permission function so that it can use it for the check.

Adding support

To add support for multiple keys, we have to look for two things.

  • fetch_key_url
  • model

These two are the key attributes for adding this feature: fetch_key_url takes a comma-separated list of keys which will be matched against view_kwargs, and model receives an array of model classes which will be used to fetch the related records.
This snippet provides the main logic:

for index, mod in enumerate(model):
   if is_multiple(fetch_key_url):
       f_url = fetch_key_url[index]
   else:
       f_url = fetch_key_url
   try:
       data = mod.query.filter(getattr(mod, fetch_key_model) == view_kwargs[f_url]).one()
    except NoResultFound:
       pass
   else:
       found = True

if not found:
   return NotFoundError({'source': ''}, 'Object not found.').respond()

In the above snippet:

  • We iterate through the models list.
  • We check whether fetch_key_url has multiple keys or not.
  • We get the key from fetch_key_url on the basis of whether it contains multiple keys or a single key.
  • We attempt to get the object from the model for the respective iteration.
  • If there is a record/object in the database, then it's our data and we skip further processing.
  • Else we continue iterating till we get the object or reach the end.

To use multiple mode

Instead of providing a single model to the model option of the permission manager, provide an array of models. It is also optional to provide comma-separated values to fetch_key_url.
Now there can be a scenario where you want to fetch the resource from the database model using different keys present in your view_kwargs.
For example, consider these endpoints:

  1. `/notifications/<notification_id>/event`
  2. `/orders/<order_id>/event`

Since they point to the same resource, if you want to ensure that the logged-in user is the organizer, then you can use these two options as:

  1. fetch_key_url=”notification_id, order_id”
  2. model=[Notification, Order]

The permission manager will always match indexes in both options; the first key of fetch_key_url will only be used with the first entry of model, and so on.
Also, fetch_key_url is an optional parameter, and even in multiple mode you can provide a single value. But if you do provide multiple comma-separated values, make sure you provide all of them, so that the number of values in fetch_key_url and model is equal.
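
As an illustration, here is a hedged sketch of how this might be wired up on a resource class (the decorator name api.has_permission and the fetch/fetch_as parameters are assumptions based on the description above, not an exact quote of the server code):

decorators = (api.has_permission(
    'is_user_itself',
    fetch='user_id',      # column to read from the fetched object (assumed parameter name)
    fetch_as='id',        # name under which it is passed to is_user_itself (assumed parameter name)
    model=[UsersEventsRole, EventInvoice],
    fetch_key_url='users_events_role_id,event_invoice_id'),)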

Resources


Custom Data Layer in Open Event API Server

Open Event API Server uses the flask-rest-jsonapi module to implement the JSON API. This module provides a good logical abstraction in the data layer.
The data layer is a CRUD interface between the resource manager and the data. It is a very flexible system that can use any ORM or data storage. The default layer you get in flask-rest-jsonapi is the SQLAlchemy ORM layer, and the API Server makes use of this default alchemy layer almost everywhere, except in the email verification part that I worked on.

To add support for user email verification in the API Server, there was a need to create an endpoint for POST /v1/users/<int:user_id>/verify.
Clearly, here we are working on a single resource, i.e. a specific user record. This requires us to use ResourceDetail, and the only issue was that there is no POST method or view in the ResourceDetail class. To solve this, I created a custom data layer, which enables me to redefine all methods and views by inheriting from the abstract base class. A custom data layer must inherit from flask_rest_jsonapi.data_layers.base.Base.

Creating Custom Layer

To handle the email verification process, a custom layer was created at app/api/data_layers/VerifyUserLayer.py:

from flask_rest_jsonapi.data_layers.base import Base


class VerifyUserLayer(Base):
    def create_object(self, data, view_kwargs):
        user = safe_query(self, User, 'id', view_kwargs['user_id'], 'user_id')
        s = get_serializer()
        try:
            data = s.loads(data['token'])
        except Exception:
            raise UnprocessableEntity({'source': 'token'}, "Invalid Token")

        if user.email == data[0]:
            user.is_verified = True
            save_to_db(user)
            return user
        else:
            raise UnprocessableEntity({'source': 'token'}, "Invalid Token")

Using custom layer in API

We can easily provide the custom layer to an API resource using one of the properties of the resource class:

data_layer = {
   'class': VerifyUserLayer,
   'session': db.session
}

This is all we have to provide for the custom layer; now all CRUD methods will be directed to our custom data layer.

Solution to our issue
Setting up a custom layer gives us the ability to create our own resource methods, i.e. to modify the view for POST requests, allowing us to verify registered users in the API Server.
On setting up the data layer, all I need to do is create a ResourceList using this layer and with the required permissions:

class VerifyUser(ResourceList):

   methods = ['POST', ]
   decorators = (jwt_required,)
   schema = VerifyUserSchema
   data_layer = {
       'class': VerifyUserLayer,
       'session': db.session
   }

This enables me to use the custom layer, VerifyUserLayer, for the ResourceList resource.
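
Finally, a hedged sketch of how this resource could be registered on the router so that it answers at the endpoint mentioned above (the view name 'verify_user' is an assumed placeholder, and the /v1 prefix is assumed to come from the API blueprint):

# api is the flask_rest_jsonapi Api instance of the application
api.route(VerifyUser, 'verify_user', '/users/<int:user_id>/verify')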

Resources


Using Flask SocketIO Library in the Apk Generator of the Open Event Android App

Recently the Flask-SocketIO library was used in the APK generator of the Open Event Android App, as it gives access to low-latency bi-directional communication between the client and the server. The client side of the APK generator is written in JavaScript, which let us use the official Socket.IO client library to establish a permanent connection to the server.

The main purpose of using the library was to display logs to the user while the app generation process goes on. This gives the user additional help in checking what the mistake in their uploaded file is in case an error occurs. Here the library establishes a connection between the server and the client, so that during the app generation process the server can send real-time logs to the client, which are then displayed on the frontend.

To use this library we first need to install it using pip:

pip install flask-socketio

This library depends on an asynchronous service, which can be selected from among the following:

  1. Eventlet
  2. Gevent
  3. Flask development server based on Werkzeug

Among the three listed, eventlet is the best performing option, as it has support for long polling and WebSocket transports.

The next thing was importing this library in the Flask application, i.e. the APK generator of the app, as follows:

from flask_socketio import SocketIO

current_app = create_app()
socketio = SocketIO(current_app)

if __name__ == '__main__':
    socketio.run(current_app)

The main use of the above statement (socketio.run(current_app)) is that it encapsulates the startup of the web server. When the application is run in debug mode, it is preferable to use the Werkzeug development server. However, in production mode the use of the eventlet web server or the gevent web server is recommended.

We wanted to show the status messages currently presented to the user in the form of logs. For this, the generator.py file, which produces the status messages, was looked at first, and these messages were sent to the client side of the generator by establishing a connection between them using this library. The code written on the server side to send messages to the client was as follows:

from flask_socketio import send

def handle_message(self, message):
    # 'self' here is the generator instance that owns the identifier
    if message is not None:
        namespace = "/" + self.identifier
        send(message, namespace=namespace)

As we can see from the above code, the message has to be sent to a particular namespace. The main idea of using a namespace is that if more than one user were using the generator at the same time, the send() method would otherwise send the messages to all connected clients, which would lead to the status messages getting mixed up; hence it is important to use a namespace so that a message is sent only to that particular client.

Once the server sent the messages to the client, we needed to add functionality on our client side to receive the messages and update the logs area with the content received. For that, the socket connection is established once the generate button is clicked, which generates the identifier that is used as the namespace for that process.

socket = io.connect(location.protocol + "//" + location.host + "/" + identifier, {reconnection: false});

This piece of code establishes the socket connection and lets the client side receive and send messages from and to the server.

Next thing was receiving messages from the server. For this the following code snippet was added:

socket.on('message', function(message) {
    $buildLog.append(message);
});

This piece of code receives the messages sent by the server for unnamed events. Once the connection is established and messages are received, the logs are updated, with each message being appended, so as to show the user real-time information about the build status of their app.

This was how the idea of showing logs in the APK generator was implemented, by making the required changes in the server and client side code using the Flask-SocketIO library.

Related Links:


Using Marshmallow Fields in Open Event API Server

The next-gen Open Event API Server provides API endpoints to fetch data, and to modify and update it. These endpoints have been written using flask-rest-jsonapi, which is a Flask extension to build APIs around the specification provided by JSON API 1.0.

flask-rest-jsonapi's data abstraction layer lets us expose the resources in a flexible way. This is achieved by using Marshmallow fields via marshmallow-jsonapi. This blog post explains how we use marshmallow fields for building API endpoints in the Open Event API Server.

The marshmallow library is used to serialize, deserialize and validate input data. Marshmallow uses classes to define output schemas, which makes it easier to reuse, configure and also extend the schemas. When we write the API schema for any database model from the Open Event models, all the columns have to be added as schema fields in the API class.

This is the API Server’s event schema using marshmallow fields:
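
Since the full schema is quite long, here is a condensed, hedged sketch using only the fields discussed below (the real EventSchema in the repository has many more fields, and the Meta view names here are assumed placeholders):

from marshmallow_jsonapi.flask import Schema
from marshmallow_jsonapi import fields


class EventSchema(Schema):
    class Meta:
        type_ = 'event'
        self_view = 'v1.event_detail'          # assumed view name
        self_view_kwargs = {'id': '<id>'}

    id = fields.Str(dump_only=True)
    identifier = fields.Str(dump_only=True)
    name = fields.Str(required=True)
    external_event_url = fields.Url(allow_none=True)
    ends_at = fields.DateTime(required=True, timezone=True)
    latitude = fields.Float(validate=lambda n: -90 <= n <= 90, allow_none=True)
    is_map_shown = fields.Bool(default=False)
    privacy = fields.Str(default="public")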

Marshmallow provides Field classes for the various types of data, and you can pass a number of parameters when creating a field object. The ones used in the API Server are described below; for the rest, you can read more in the marshmallow docs.

Let's take a look at each of these fields. Each of the following snippets shows how a field is written in the API schema.

identifier = fields.Str(dump_only=True)
  • This is a field of data-type String.
  • dump_only: This field will be skipped during deserialization as it is set to True here. Setting this to True essentially means marking `identifier` as read-only (for the HTTP API).
name = fields.Str(required=True)
  • This again is a field of data-type String.
  • This is a required field, and a ValidationError is raised if it is found missing during deserialization. Taking a look at the database backend:
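
For illustration, the corresponding column definition in the database model would look roughly like this (a hedged sketch; the actual model file may differ slightly):

name = db.Column(db.String, nullable=False)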

Since this field is set to non-nullable in the database model, it is made required in API Schema.

 external_event_url = fields.Url(allow_none=True)
  • This is a field of datatype URL.
  • Since this is not a required field, NULL values are allowed for this field in the database model. To reflect the same in the API, we have to add allow_none=True (if missing=None is not set, allow_none defaults to False).
ends_at = fields.DateTime(required=True, timezone=True)
  • Field of datatype DateTime
  • It is a required field for an event and the time used here is timezone aware.
latitude = fields.Float(validate=lambda n: -90 <= n <= 90, allow_none=True)
  • Field of datatype Float.
  • In marshmallow fields, we can run validators on the input value of a field using the validate parameter. The validator returns a boolean which, when False, raises a ValidationError. These validators are called during deserialization.
is_map_shown = fields.Bool(default=False)
  • Field of datatype boolean.
  • The default value for a marshmallow field can be set using default. Here, the is_map_shown attribute is set to False by default for an event.
privacy = fields.Str(default="public")
  • privacy is set to "public" by default.

When the input value for a field is missing during serialization, the default value will be used. This parameter can be either a value or a callable.

As described in the examples above, you can write a field as fields.<data-type>(*parameters to the marshmallow.fields.Field constructor*).

The parameters passed to the class constructor must reflect the column definition in the database model, or else you might run into unexpected errors. An example from Open Event development: null values were not being allowed to be posted even for nullable columns. This behaviour occurred because allow_none defaults to False in the schema and has to be explicitly set to True in order to receive null values. (Issue for the same: Make non-required attributes nullable, and the pull request made for the fix.)

Fields represent a database model column and are serialized and deserialized so that they can be used in any format, like the JSON objects we use in the API server. Each field corresponds to an attribute of the object type, like location, starts-at, ends-at or event-url for an event. marshmallow allows us to define data types for the fields, validate input data and reinforce column-level constraints from the database model.

This list is not exhaustive of all the parameters available for marshmallow fields. To read further about them and marshmallow, check out their documentation.

Additional Resources

Code involved in API Server:


Working With Inter-related Resource Endpoints In Open Event API Server

In this blog post I will discuss how the inter-related resource endpoints work. These are endpoints which involve two resource objects that are also related to the same third resource object.


The discussion in this post is about the endpoints related to the Sessions model. In the API server, there exists a relationship between event and sessions. Apart from that, a session also has relationships with microlocations, tracks, speakers and session-types. Let's take a look at the endpoints related to the above.

`/v1/tracks/<int:track_id>/sessions` is a list endpoint which can be used to list and create the sessions related to a particular track of an event. To get the list, we define the query() method in the ResourceList class as such:
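
Since the original snippet is not reproduced here, a hedged sketch of what such a query() method looks like (model and helper names are taken from the surrounding text; the exact code in the repository may differ):

def query(self, view_kwargs):
    query_ = self.session.query(Session)
    if view_kwargs.get('track_id') is not None:
        # filter tracks by the id passed in the URL, then join with the sessions
        track = safe_query(self, Track, 'id', view_kwargs['track_id'], 'track_id')
        query_ = query_.join(Track).filter(Track.id == track.id)
    return query_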

The query method is executed for GET requests, so the if clause looks for track_id in the view_kwargs dict. When a request is made to `/v1/tracks/<int:track_id>/sessions`, the track id will be present as 'track_id' in view_kwargs. The tracks are filtered on the id passed here and then joined in the query with all the session objects from the database.

For the POST method, we need to take the track_id from view_kwargs and pass it into the track_id field of the database model. This is achieved by using flask-rest-jsonapi's before_create_object() method. The implementation for track_id is the following:
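
Again, a hedged sketch of the before_create_object() hook described here (not an exact quote of the server code):

def before_create_object(self, data, view_kwargs):
    if view_kwargs.get('track_id') is not None:
        # make sure a track with the given id exists before creating the session
        track = safe_query(self, Track, 'id', view_kwargs['track_id'], 'track_id')
        data['track_id'] = track.id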

When a POST request is made to `/v1/tracks/<int:track_id>/sessions`, the view_kwargs dict will have 'track_id' in it. So if track_id is present in the URL params, we first ensure that a track with the passed id exists, and only then proceed to create a sessions object under the given track. The safe_query() method is a generic custom method written to check for such things: the model is passed along with the filter attribute and a field to include in the error message, and the method throws an ObjectNotFound exception if no such object exists for the given id.

We also need to take care of permissions for these endpoints. As the decorators are called even before schema validation, it was difficult to get the event_id for permissions without adding highly endpoint-specific code to the permission manager core, which would lead to a loss of generality. So the leave_if parameter of the permission check was used to overcome this issue. Since the permissions manager isn't fully developed yet, this is to be changed in the improved implementation of the permissions manager.

Similar implementations were done for microlocations and session types. They are not all explained in this blog post; for the extended code, take a look at the source code of this schema.

For the speaker relation, a few things were different. This is because speakers and sessions have a many-to-many relationship. Let's take a look at this:

As it is a many-to-many relationship, an association table is used with flask-sqlalchemy. So for the query() method, the same association table is queried after extracting the speaker_id from the view_kwargs dict.
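
A hedged sketch of the speaker branch of the query() method on the sessions list resource (again, the names follow the text above and the real code may differ):

def query(self, view_kwargs):
    query_ = self.session.query(Session)
    if view_kwargs.get('speaker_id') is not None:
        speaker = safe_query(self, Speaker, 'id', view_kwargs['speaker_id'], 'speaker_id')
        # join through the speakers relationship backed by the association table
        query_ = query_.join(Session.speakers).filter(Speaker.id == speaker.id)
    return query_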

For a POST request on `/v1/speakers/<int:speaker_id>/sessions`, flask-rest-jsonapi's after_create_object() method is used to insert the relationship into the association table. The parameters of this method are: self, obj, data, view_kwargs.

Now view_kwargs contains the URL parameters, so we check for speaker_id in view_kwargs. If it is present, then before proceeding to insert data we ensure that a speaker exists with that id, using the safe_query() method described above. After that, the obj argument of the method is used; it contains the object that was created in the previous method. So once the sessions object has been created and we are sure that a speaker exists with the given speaker_id, the speaker just has to be appended to obj.speakers so that this relationship tuple is inserted into the association table.
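
A hedged sketch of this after_create_object() hook (not the repository's exact code):

def after_create_object(self, obj, data, view_kwargs):
    if view_kwargs.get('speaker_id') is not None:
        speaker = safe_query(self, Speaker, 'id', view_kwargs['speaker_id'], 'speaker_id')
        # appending to the relationship adds a row to the association table
        obj.speakers.append(speaker)
        self.session.commit()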

This updates the association table, 'speakers_session' in this case. The other such endpoints are being worked upon in a similar fashion and will be consolidated into a set of robust APIs, along with the improved permissions manager for the Open Event API Server.

Additional Resources

Code, Issues and Pull Request involved
