Open Event Server – Pages API

This article illustrates how the Pages API has been designed and implemented on the server side, i.e., FOSSASIA’s Open Event Server. The Pages endpoint is used to create static pages, such as an “About” page or any other page that does not need to be updated frequently and only has to display fixed content.

Parameters

  1. name – This stores the name of the page.
      1. Type – String
      2. Required – Yes
  2. title – This stores the title of the page.
      1. Type – String
      2. Required – No
  3. url – This stores the url of the page.
      1. Type – String
      2. Required – Yes
  4. description – This stores the description of the page.
      1. Type – String
      2. Required – Yes
  5. language – This stores the language of the page.
      1. Type – String
      2. Required – No
  6. index – This stores the position of the page.
      1. Type – Integer
      2. Required – No
      3. Default – 0
  7. place – Location where the page will be placed.
      1. Type – String
      2. Required – No
      3. Accepted Values – ‘footer’ and ‘event’

These are the allowed parameters for the endpoint.

Model

Let’s see how we model this API. The ORM model looks like this:

class Page(db.Model):
    """Page model class"""
    __tablename__ = 'pages'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String, nullable=False)
    title = db.Column(db.String)
    url = db.Column(db.String, nullable=False)
    description = db.Column(db.String)
    place = db.Column(db.String)
    language = db.Column(db.String)
    index = db.Column(db.Integer, default=0)

As you can see, we created a table called “pages”. This table has 8 columns, 7 of which are the parameters mentioned above. The column “id” is an Integer column and the primary key, which differentiates between the various entries in the table.


API

We support the following operations:

  1. GET all the pages in the database
  2. POST create a new page
  3. GET details of a single page as per id
  4. PATCH a single page by id
  5. DELETE a single page by id

To implement this, we first add the routes in our Python file as follows:

api.route(PageList, 'page_list', '/pages')
api.route(PageDetail, 'page_detail', '/pages/<int:id>')

Then we define these classes to handle the requests. The first route looks as follows:

class PageList(ResourceList):
   """
   List and create page
   """
   decorators = (api.has_permission('is_admin', methods="POST"),)
   schema = PageSchema
   data_layer = {'session': db.session,
                 'model': Page}

As can be seen above, a POST request to this endpoint requires the user to be an admin. The resource uses the Page model described above to list pages and create a new page.

The second route is:

class PageDetail(ResourceDetail):
   """
   Page detail by id
   """
   schema = PageSchema
   decorators = (api.has_permission('is_admin', methods="PATCH,DELETE"),)
   data_layer = {'session': db.session,
                 'model': Page}

This route requires the user to be an admin for PATCH and DELETE requests. It uses the Page model and serves the detail GET, PATCH and DELETE requests for a single page.

To summarise, our APIs are:

  • GET /v1/pages{?sort,filter}
  • POST /v1/pages{?sort,filter}
  • GET /v1/pages/{page_id}
  • PATCH /v1/pages/{page_id}
  • DELETE /v1/pages/{page_id}
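
For instance, a page can be created with a single POST request. The snippet below is a hypothetical example using Python’s requests library; the localhost URL and the JWT token are placeholders, and the JSON API payload shape is assumed from the schema above.

import requests

# Hypothetical JSON API payload for the Pages endpoint; the attribute
# names follow the parameters listed above.
payload = {
    "data": {
        "type": "page",
        "attributes": {
            "name": "About Page",       # required
            "url": "/about",            # required
            "description": "About us",  # required
            "language": "English",
            "index": 0,
            "place": "footer"           # 'footer' or 'event'
        }
    }
}

# POST /v1/pages requires admin rights, so an admin's token is assumed.
response = requests.post(
    "http://localhost:5000/v1/pages",
    json=payload,
    headers={
        "Content-Type": "application/vnd.api+json",
        "Authorization": "JWT <admin-token>"
    }
)
print(response.status_code)  # 201 if the page was created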


Open Event Server – Change a Column from NULL to NOT NULL

FOSSASIA’s Open Event Server uses Alembic migration files to handle all database operations and updates. Whenever the database is changed, a corresponding Python migration script is made so that the databases of other developers migrate accordingly as well. But we often forget that the automatically generated script usually just adds/deletes columns or alters column properties. It does not handle the migration of the existing data in those columns. This can lead to huge data loss or errors during migration.

For example:

def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.alter_column('ticket_holders', 'lastname',
                    existing_type=sa.VARCHAR(),
                    nullable=False)
    # ### end Alembic commands ###

Here, the goal was to change the column “lastname” of the table “ticket_holders” from nullable to not nullable. The script that Alembic autogenerated just uses op.alter_column().

It does not account for the already existing data. So, if the column has any entries which are null, this migration will fail with an error saying that the column contains null entries and hence cannot be made “NOT NULL”.

How to Handle This?

Before altering the column definition, we can follow these steps:

  1. Look for all the null entries in the column
  2. Give some arbitrary default value to those
  3. Now we can safely alter the column definition

Let’s see how we can achieve this. For connecting with the database we will use SQLAlchemy. First, we get a reference to the table and the corresponding column that we wish to alter.

ticket_holders_table = sa.sql.table('ticket_holders',
                                    sa.Column('lastname', sa.VARCHAR()))

 

Since we need the “lastname” column from the table “ticket_holders”, we specify both in the method arguments.

Now, we will give an arbitrary default value to all the originally null entries in the column. In this case, I chose to use a space character.

op.execute(ticket_holders_table.update()
               .where(ticket_holders_table.c.lastname.is_(None))
               .values({'lastname': op.inline_literal(' ')}))

op.execute() can execute raw SQL commands as well, but we chose to go with SQLAlchemy constructs, which build the SQL command from our modular input. One example of a complex SQL command being executed directly is:

op.execute('INSERT INTO event_types(name, slug) SELECT DISTINCT event_type_id, lower(replace(regexp_replace(event_type_id, \'& |,\', \'\', \'g\'), \' \', \'-\')) FROM events where not exists (SELECT 1 FROM event_types where event_types.name=events.event_type_id) and event_type_id is not null;')

Now that we have handled all the null data, it is safe to alter the column definition. So we proceed to execute the final statement –

op.alter_column('ticket_holders', 'lastname',
                    existing_type=sa.VARCHAR(),
                    nullable=False)
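
Putting the three steps together, the complete upgrade() function of the migration script reads as follows (assembled from the snippets above):

import sqlalchemy as sa
from alembic import op


def upgrade():
    # 1. Get a lightweight reference to the table and the column to alter
    ticket_holders_table = sa.sql.table(
        'ticket_holders', sa.Column('lastname', sa.VARCHAR()))

    # 2. Replace all existing NULL entries with a default value
    op.execute(ticket_holders_table.update()
               .where(ticket_holders_table.c.lastname.is_(None))
               .values({'lastname': op.inline_literal(' ')}))

    # 3. Now the column definition can safely be altered
    op.alter_column('ticket_holders', 'lastname',
                    existing_type=sa.VARCHAR(),
                    nullable=False)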

Now the entire migration script will run without any error. The final outcome would be –

  1. All the null “lastname” entries are replaced by a space character
  2. The “lastname” column is now a NOT NULL column.


Implementing Database Migrations to Badgeyay

The Badgeyay project is divided into two parts: a front-end written in Ember.js and a back-end REST API written in Python.

We have integrated PostgreSQL as the object-relational database in Badgeyay, and we use the SQLAlchemy SQL toolkit and Object Relational Mapper to work with the database from Python. Since the back-end uses the Flask microframework, we use Flask-SQLAlchemy, a Flask extension that adds support for SQLAlchemy and its ORM.

One of the challenging jobs is to manage the changes we make to the models and propagate these changes to the database. For this purpose, I added migration support to Flask-SQLAlchemy using the Flask-Migrate extension.

In this blog post, I will discuss how this was done step by step in my Pull Request.

First, let’s understand database models, migrations, and the Flask-Migrate extension. Then we will move on to adding migrations using Flask-Migrate.

What are Database Models?

A database model defines the logical design and structure of a database, including the relationships and constraints that determine how data can be stored and accessed. Presently, we have User and File models in the project.

What are Migrations?

Database migration is the process of propagating schema changes. Migrations enable us to track the modifications we make to the models and apply these adjustments to the database. For example, if we later change a field in one of the models, all we need to do is create and run a migration, and the database will replicate the change.

What is Flask Migrate?

Flask-Migrate is an extension that handles SQLAlchemy database migrations for Flask applications using Alembic. The database operations are made available through the Flask command-line interface or through the Flask-Script extension.

Now let’s add support for migration in Badgeyay.

Step 1:

pip install flask-migrate

 

Step 2:

We will need to edit run.py so that it looks like this:

import os
from flask import Flask
from flask_migrate import Migrate  # import Flask-Migrate

from api.db import db
from api.config import config

......

db.init_app(app)
migrate = Migrate(app, db)  # this object lets us run migration commands
......

@app.before_first_request
def create_tables():
    db.create_all()

if __name__ == '__main__':
    app.run()

 

Step 3:

Creation of Migration Directory.

 export FLASK_APP=run.py
 flask db init

 

This will create a migrations directory in the backend API folder.

└── migrations
    ├── README
    ├── alembic.ini
    ├── env.py
    ├── script.py.mako
    └── versions

 

Step 4:

We create our first migration with the following command.

flask db migrate
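
This writes a new revision script into the versions folder. A freshly generated script is a skeleton along the following lines (a sketch; the revision identifiers here are placeholders, and the actual generated operations depend on the models):

"""empty message

Revision ID: <generated id>
Revises: None
"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic
revision = '<generated id>'
down_revision = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    # op.create_table(...) calls for the User and File models appear here
    pass
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    # the corresponding op.drop_table(...) calls appear here
    pass
    # ### end Alembic commands ###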

 

Step 5:

We apply the migration with the following command.

flask db upgrade

 

Now we are all done with setting up migrations in Flask-SQLAlchemy for handling database changes in the Badgeyay repository. We can verify the migration by checking the tables in the database.

This is how I added migrations to Flask-SQLAlchemy for handling database changes using the Flask-Migrate extension in my Pull Request.

Resources:

  • PostgreSQL Docs
  • Flask-Migrate Docs
  • SQLAlchemy Docs
  • Flask-SQLAlchemy Docs

Database Listener for User Centric Events

Badgeyay is an open-source utility developed by FOSSASIA to generate badges for conferences and events. The project is separated into two components to ease maintainability: the front-end, written in Ember, and the back-end, written in Flask. The database supporting the back-end is PostgreSQL.

Now comes the problem: whenever a user registers in the database, they should receive a verification mail saying they have successfully registered on the platform. For this we have to listen to database events on the User model. This approach extends well beyond sending a greeting or verification mail: we can use it to trigger any service that depends on user registration, such as subscribing the user to a set of services based on the plan chosen during registration, and much more.

These types of requirements cannot be handled by normal relationships between tables and other entities; there has to be logic in place to support such functionality. The challenges in tackling the problem are as follows:

  • Listen to the insert_action in User model
  • Extracting the details necessary for the logic
  • Execute particular logic

Procedure

  1. Attaching insert_action listener to the User model. This function will get triggered whenever an entity is saved in the User model.


@db.event.listens_for(User, "after_insert")
def logic(mapper, connection, target):
    ......
  2. When the function gets triggered, we extract the details of the saved user that are necessary for the logic. As we are currently sending a greeting mail, we only need the user’s email. target is the actual saved user, passed as an argument to the listener function by the library.


msg = {}
msg['subject'] = "Welcome to Badgeyay"
msg['recipient'] = target.email
msg['body'] = "It's good to have you onboard with Badgeyay. Welcome to " \
              "FOSSASIA Family."
sendMail(msg)
  3. The details are then passed to the sendMail() function, which uses the flask-mail library to send the mail to the recipient.
def sendMail(message):
    if message and message.get('recipient'):
        try:
            msg = Message(
                subject=message['subject'],
                sender=app.config['MAIL_USERNAME'],
                recipients=[message['recipient']],
                body=message['body'])
            Mail(app).send(msg)
        except Exception as e:
            return jsonify(
                Response(500).exceptWithMessage(
                    str(e),
                    'Unable to send the mail'))
        return jsonify(
            Response(200).generateMessage(
                'Mail Sent'))
    else:
        return jsonify(
            Response(403).generateMessage(
                'No data received'))
  4. This sends a mail to the user who has registered on the application.

Similarly, we can attach different logic according to the needs of the application.

 

The Pull Request for the above functionality is at this Link

Topics Involved

Working on the issue involved the following topics:

  • Configuring the mail service to allow access for insecure apps
  • Sending mail from flask-mail to the end user
  • Attaching a listener to listen for database changes
  • Extracting data from the saved object via SQLAlchemy

Resources

  • Sending mails programmatically
  • Flask-Mail documentation
  • Listening to database events
  • Enabling GMAIL access to send mails to recipients

Adding a Last Modified At column in Open Event Server

This blog article illustrates how, with the help of SQLAlchemy, a fully functional last modified at column can be added to the Open Event Server database. To illustrate the process, the article discusses adding the column to the Sessions API. Since last modified at is a time field that needs to be updated every time a user successfully updates a session, the logic to implement is slightly more complex than the mere addition of a column to the table.

The first obvious step consists of adding the column to the database table. To achieve this, the column has to be added to the model for the sessions table, as well as to the schema.

In app/api/schema/sessions.py:

...
class SessionSchema(Schema):
   """
   Api schema for Session Model
   """
   ...
   last_modified_at = fields.DateTime(dump_only=True)
   ...

And in app/models/sessions.py:

import pytz
...

class Session(db.Model):
   """Session model class"""
   __tablename__ = 'sessions'
   __versioned__ = {
       'exclude': []
   }
   ...
   last_modified_at = db.Column(db.DateTime(timezone=True),
                                default=datetime.datetime.utcnow)

   def __init__(self, ..., last_modified_at=None):
       # inside __init__ method
       ...
       self.last_modified_at = datetime.datetime.now(pytz.utc)
       ...

NOTE: The users of the Open Event Organizer Server operate in multiple time zones, and hence it is important for all the times to be in sync. The Open Event database therefore maintains all times in the UTC timezone (Python’s pytz module takes care of converting the user’s local time into UTC while storing, thus unifying the timezones). It directly follows that the time needs to be timezone aware, which is why timezone=True is passed while defining the column.

Next, while initialising an object of this class, the last modified time is the time of creation, and hence datetime.datetime.now(pytz.utc) is set as the initial value, which stores the current time in the UTC timezone.

Finally, the logic for updating the last modified at column every time any other value changes for a session record needs to be implemented. SQLAlchemy provides inbuilt support for detecting update and insert events, which has been used to achieve the goal. To quote the official SQLAlchemy docs, “SQLAlchemy includes an event API which publishes a wide variety of hooks into the internals of both SQLAlchemy Core and ORM.”

@event.listens_for(Session, 'after_update')
def receive_after_update(mapper, connection, target):
  target.last_modified_at = datetime.datetime.now(pytz.utc)

The listens_for() decorator is used to register the event listener according to the arguments passed to it. In our case, it fires whenever a record in the Session model (the sessions table) is updated.

The corresponding function defined below the decorator, receive_after_update(mapper, connection, target), is then called, with the Session model (table) as the registered target of the event. It sets the value of last_modified_at to the current time in the UTC timezone, as expected.

Lastly, since the changes have been made to the database schema, the migration file needs to be generated, and the database upgraded to alter its structure.

The sequence of steps to be followed on the CLI will be

> python manage.py db migrate
> python manage.py db upgrade
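
The autogenerated migration for this change would contain an op.add_column() call along these lines (a sketch; the exact revision identifiers and column options come from autogeneration):

import sqlalchemy as sa
from alembic import op


def upgrade():
    # add the timezone-aware column to the sessions table
    op.add_column('sessions', sa.Column('last_modified_at',
                                        sa.DateTime(timezone=True),
                                        nullable=True))


def downgrade():
    # dropping the column reverses the change
    op.drop_column('sessions', 'last_modified_at')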


Copying Event in Open Event API Server

The Event Copy feature of the Open Event API Server provides the ability to create an exact copy of an event with just one API call. This feature creates a complete copy of the event by copying the related objects as well, like tracks, sponsors, micro-locations, etc. The API is based on a simple method: an object is first expunged from the current DB session and then made transient with make_transient(). The next step is to remove the unique identifying columns like “id” and “identifier”, generate a new identifier, and save the new record. The process seems simple but becomes a little more complex when you have to copy the associated media files and multiple related objects, while ensuring that no orders, attendees or access_codes relations are copied.

Initial Step

The first step in copying the event is to get the event object:

identifier = 'identifier'  # default lookup key (assumed initialisation)
if view_kwargs.get('identifier').isdigit():
   identifier = 'id'

event = safe_query(db, Event, identifier, view_kwargs['identifier'], 'event_'+identifier)

The next step is to fetch all objects related to this event.

Creating the new event

After expunging the current event object from db.session, we need to remove its “id” attribute and regenerate the “identifier” of the event.

db.session.expunge(event)  # expunge the object from session
make_transient(event)
delattr(event, 'id')
event.identifier = get_new_event_identifier()
db.session.add(event)
db.session.commit()

Updating related object with new event

The newly created event has a new “id” and “identifier”. This new “id” is set in the foreign key columns of the related objects, thus establishing their relationship with the new event.

for ticket in tickets:
   ticket_id = ticket.id
   db.session.expunge(ticket)  # expunge the object from session
   make_transient(ticket)
   ticket.event_id = event.id
   delattr(ticket, 'id')
   db.session.add(ticket)
   db.session.commit()

Finishing up

The step of updating related objects is repeated for every related model to complete the copy. Thus a new event is created, with all related objects copied, through a single endpoint. Since the sequence is the same for each model, it can be factored into a helper, as the sketch below shows.
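
The function below is our own sketch, not code from the repository, generalising the expunge/make_transient/re-attach loop shown above:

def copy_related_objects(objects, new_event_id):
    """Attach copies of the given related objects to the new event."""
    for obj in objects:
        db.session.expunge(obj)      # detach the object from the session
        make_transient(obj)          # mark it as a brand new, unsaved object
        obj.event_id = new_event_id  # point the copy at the new event
        delattr(obj, 'id')           # let the database assign a fresh id
        db.session.add(obj)
    db.session.commit()

# usage: copy_related_objects(tickets, event.id)
#        copy_related_objects(sponsors, event.id)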

References

  • How to clone a SQLAlchemy object: https://stackoverflow.com/questions/28871406/how-to-clone-a-sqlalchemy-db-object-with-new-primary-key


How User Event Roles relationship is handled in Open Event Server

Users and Events are the most important parts of FOSSASIA’s Open Event Server. As the project has evolved, the way user event roles are implemented has gone through many changes. When the Open Event Organizer Server was first decoupled to serve as an API server, user event roles, like all other models, were served as a separate API to provide a data layer above the database for making changes to the entries. Whenever a new role invite was accepted, a POST request was made to the Users Events Roles table to insert the new entry. Whenever there was a change in the role of a user for a particular event, a PATCH request was made. Permissions ensured that a user could insert only his/her own user id and not someone else’s entry.

def before_create_object(self, data, view_kwargs):
        """
        method to create object before post
        :param data:
        :param view_kwargs:
        :return:
        """
        if view_kwargs.get('event_id'):
            event = safe_query(self, Event, 'id', view_kwargs['event_id'], 'event_id')
            data['event_id'] = event.id

        elif view_kwargs.get('event_identifier'):
            event = safe_query(self, Event, 'identifier', view_kwargs['event_identifier'], 'event_identifier')
            data['event_id'] = event.id
        email = safe_query(self, User, 'id', data['user'], 'user_id').email
        invite = self.session.query(RoleInvite).filter_by(email=email).filter_by(role_id=data['role'])\
                .filter_by(event_id=data['event_id']).one_or_none()
        if not invite:
            raise ObjectNotFound({'parameter': 'invite'}, "Object: not found")

    def after_create_object(self, obj, data, view_kwargs):
        """
        method to create object after post
        :param data:
        :param view_kwargs:
        :return:
        """
        email = safe_query(self, User, 'id', data['user'], 'user_id').email
        invite = self.session.query(RoleInvite).filter_by(email=email).filter_by(role_id=data['role'])\
                .filter_by(event_id=data['event_id']).one_or_none()
        if invite:
            invite.status = "accepted"
            save_to_db(invite)
        else:
            raise ObjectNotFound({'parameter': 'invite'}, "Object: not found")


Initially, when a POST request was sent to the Users Events Roles API endpoint, we would first check whether a role invite from the organizer existed for that particular combination of user, event and role. Only if it existed would we make an entry in the database; otherwise we would raise an “Object: not found” error. After the entry was made in the database, we would update the role_invites table to change the status of the role invite.

Later it was decided that a separate API endpoint was not needed. Since API endpoints are all user accessible and may cause permission problems, it was decided that user event roles would be handled entirely through the model instead of a separate API. Also, the workflow wasn’t very clear for a user. So we settled on a workflow where the role_invites table is first updated with the new status, and after the update has been made, we insert an entry into the users_events_roles table with the data that we get from the role_invites table.

When a role invite is accepted, SQLAlchemy’s add() and commit() are used to insert a new entry into the table. When a role is changed for a particular user, we make a query, update the values and save them back into the table. So the entire process is handled at the data layer level rather than the API level.

The code implementation is as follows:

def before_update_object(self, role_invite, data, view_kwargs):
        """
        Method to edit object
        :param role_invite:
        :param data:
        :param view_kwargs:
        :return:
        """
        user = User.query.filter_by(email=role_invite.email).first()
        if user:
            if not has_access('is_user_itself', id=user.id):
                raise UnprocessableEntity({'source': ''}, "Only users can edit their own status")
        if not user and not has_access('is_organizer', event_id=role_invite.event_id):
            raise UnprocessableEntity({'source': ''}, "User not registered")
        if not has_access('is_organizer', event_id=role_invite.event_id) and (len(data.keys())>1 or 'status' not in data):
            raise UnprocessableEntity({'source': ''}, "You can only change your status")

    def after_update_object(self, role_invite, data, view_kwargs):
        user = User.query.filter_by(email=role_invite.email).first()
        if 'status' in data and data['status'] == 'accepted':
            role = Role.query.filter_by(name=role_invite.role_name).first()
            event = Event.query.filter_by(id=role_invite.event_id).first()
            uer = UsersEventsRoles.query.filter_by(user=user).filter_by(event=event).filter_by(role=role).first()
            if not uer:
                uer = UsersEventsRoles(user, event, role)
                save_to_db(uer, 'Role Invite accepted')


In the above code, there are two main functions: before_update_object, which gets executed before the entry in the role_invites table is updated, and after_update_object, which gets executed after.

In before_update_object, we verify that the user is accepting or rejecting his/her own role invite and not someone else’s. We also ensure that the user is allowed to update only the status of the role invite and not any other sensitive data like the role_name or email. If the user tries to edit any field except status, an error is shown, unless the user has organizer access, in which case he/she can edit the other fields of the role_invites table as well. The has_access() helper function performs these permission checks.

In after_update_object we make the entry in the users_events_roles table. From the role_invite parameter of after_update_object we can get the exact values of the newly updated row in the table. We use this data to find the user, event and role associated with the invite. Then we create a UsersEventsRoles object with the user, event and role as constructor parameters, and use the save_to_db helper function to save the new entry to the database. The save_to_db function uses the session.add() and session.commit() functions of flask-sqlalchemy to add the new entry directly to the database.
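
For reference, the helper is essentially a thin wrapper of the following form (a simplified sketch based on the description above; the real helper also logs the message):

def save_to_db(item, msg=''):
    """Add a model instance to the session and commit it to the database."""
    db.session.add(item)
    db.session.commit()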

Thus, we maintain the flow of the user event roles relationship. All the database entries and operations related to the users_events_roles table remain encapsulated from the client, so that clients can use the various API features without thinking about the complications of the implementation.

 


DetachedInstanceError: Dealing with Celery, Flask’s app context and SQLAlchemy in the Open Event Server

In the Open Event Server project, we chose to go with celery for async background tasks. From the official website:

What is celery?

Celery is an asynchronous task queue/job queue based on distributed message passing.

What are tasks?

The execution units, called tasks, are executed concurrently on a single or more worker servers using multiprocessing.

After the tasks had been set up, an error constantly came up whenever a task was called.

The error was:

DetachedInstanceError: Instance <User at 0x7f358a4e9550> is not bound to a Session; attribute refresh operation cannot proceed

The above error usually occurs when you try to access the session object after it has been closed. It may have been closed by an explicit session.close() call or after committing the session with session.commit().
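
A minimal illustration of how the error can arise, independent of celery (a hypothetical snippet):

user = db.session.query(User).first()  # user is bound to the session
db.session.commit()  # with expire_on_commit=True, attributes are expired
db.session.close()   # the instance is now detached from the session

print(user.email)  # raises DetachedInstanceError: the expired attribute
                   # cannot be refreshed without a session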

The celery tasks in question were performing some database operations, so the first thought was that these operations might be causing the error. To test this theory, the celery task was changed to:

@celery.task(name='lorem.ipsum')
def lorem_ipsum():
    pass

But sadly, the error still remained. This proved that the celery task itself was fine and that the session was being closed whenever the celery task was called. The method in which the celery task was being called was of the following form:

def restore_session(session_id):
    session = DataGetter.get_session(session_id)
    session.deleted_at = None
    lorem_ipsum.delay()
    save_to_db(session, "Session restored from Trash")
    update_version(session.event_id, False, 'sessions_ver')


In our app, the app context was not being passed whenever a celery task was initiated. Thus the celery task, whenever called, closed the previous app_context, eventually closing the session along with it. The solution is to follow the pattern suggested at http://flask.pocoo.org/docs/0.12/patterns/celery/.

from celery import Celery
from flask import current_app


def make_celery(app):
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    task_base = celery.Task

    class ContextTask(task_base):
        abstract = True

        def __call__(self, *args, **kwargs):
            if current_app.config['TESTING']:
                with app.test_request_context():
                    return task_base.__call__(self, *args, **kwargs)
            with app.app_context():
                return task_base.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery

celery = make_celery(current_app)


The __call__ method ensures that the celery task is provided with a proper app context to work with.
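
With this pattern in place, a task that performs database operations runs inside a valid app context. For example, the restore logic from above could be moved into a task like this (a hypothetical sketch reusing the names from the earlier snippet):

@celery.task(name='session.restore')
def restore_session_task(session_id):
    # Runs inside app.app_context(), so the database session stays usable.
    session = DataGetter.get_session(session_id)
    session.deleted_at = None
    save_to_db(session, "Session restored from Trash")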

 
