Using Flask-REST-JSONAPI’s Resource Manager In Open Event API Server

For the next-gen Open Event API Server, we are using flask-rest-jsonapi to write all the API endpoints. Flask-REST-JSONAPI is based on the JSON API 1.0 specification for JSON object responses.

In this blog post, I describe how I wrote the API schema and endpoints for an already existing database model in the Open Event API Server. Following this post, you can learn how to write similar classes for your own database models. Let's dive into how the API schema is defined for any Resource in the Open Event API Server. A Resource, here, is an object based on a database model; it provides a link between the data layer and your logical data abstraction. The Resource Manager provides three classes:

  1. Resource List
  2. Resource Detail
  3. Resource Relationship

(We'll take a look at the Speakers API.)
First, let's look at the already implemented Speaker model:

class Speaker(db.Model):
    """Speaker model class"""
    __tablename__ = 'speaker'

    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String, nullable=False)
    photo = db.Column(db.String)
    website = db.Column(db.String)
    organisation = db.Column(db.String)
    is_featured = db.Column(db.Boolean, default=False)
    sponsorship_required = db.Column(db.Text)

    def __init__(self,
                 name=None,
                 photo_url=None,
                 website=None,
                 organisation=None,
                 is_featured=False,
                 sponsorship_required=None):
        self.name = name
        self.photo = photo_url
        self.website = website
        self.organisation = organisation
        self.is_featured = is_featured
        self.sponsorship_required = sponsorship_required

 

Here’s the Speaker API Schema:

class SpeakerSchema(Schema):
   class Meta:
       type_ = 'speaker'
       self_view = 'v1.speaker_detail'
       self_view_kwargs = {'id': '<id>'}
   id = fields.Str(dump_only=True)
   name = fields.Str(required=True)
   photo_url = fields.Url(allow_none=True)
   website = fields.Url(allow_none=True)
   organisation = fields.Str(allow_none=True)
   is_featured = fields.Boolean(default=False)
   sponsorship_required = fields.Str(allow_none=True)
class SpeakerList(ResourceList):  
   schema = SpeakerSchema
   data_layer = {'session': db.session,
                 'model': Speaker}
class SpeakerDetail(ResourceDetail):
   schema = SpeakerSchema
   data_layer = {'session': db.session,
                 'model': Speaker}
class SpeakerRelationship(ResourceRelationship):
   schema = SpeakerSchema
   data_layer = {'session': db.session,
                 'model': Speaker}

 

The last piece of code lists the actual endpoints in the __init__ file for flask-rest-jsonapi:

api.route(SpeakerList, 'speaker_list', '/events/<int:event_id>/speakers', '/sessions/<int:session_id>/speakers', '/users/<int:user_id>/speakers')
api.route(SpeakerDetail, 'speaker_detail', '/speakers/<int:id>')
api.route(SpeakerRelationship, 'speaker_event', '/speakers/<int:id>/relationships/event')
api.route(SpeakerRelationship, 'speaker_user', '/speakers/<int:id>/relationships/user')
api.route(SpeakerRelationship, 'speaker_session', '/speakers/<int:id>/relationships/sessions')

 

How to write an API schema from a database model?

Each column of the database model is a field in the API schema. These are marshmallow fields and can be of several data types – String, Integer, Float, DateTime, Url.

Three class definitions follow the Schema class.

  • List:
    The SpeakerList class is the basis of these endpoints:
api.route(SpeakerList, 'speaker_list', '/events/<int:event_id>/speakers',
          '/sessions/<int:session_id>/speakers',                                                             
          '/users/<int:user_id>/speakers')

This class contains methods that generate a list of speakers based on the id passed in view_kwargs. Let's say that '/sessions/<int:session_id>/speakers' is requested. As the view_kwargs here contains session_id, the query methods in the SpeakerList class will fetch a list of speaker profiles related to the session identified by session_id.

flask-rest-jsonapi allows the GET and POST methods for a ResourceList. When using these endpoints for POST, the before_create_object and before_post methods can be written; when defined in the Speaker's class, they override the defaults from the base ResourceList class in flask-rest-jsonapi/resource.py.
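For instance, the SpeakerList shown above could be extended with such hooks. This is only a sketch, not the actual Open Event code; the normalisation and default shown here are purely illustrative:

# Sketch only: builds on the SpeakerSchema / Speaker / db imports shown earlier.
# `before_post` runs before the POST request is processed; `before_create_object`
# is a data-layer hook that runs just before the Speaker row is inserted.
class SpeakerList(ResourceList):

    def before_post(self, args, kwargs, data=None):
        # illustrative: tidy up the incoming name before it reaches the data layer
        if data and data.get('name'):
            data['name'] = data['name'].strip()

    def before_create_object(self, data, view_kwargs):
        # illustrative: make sure is_featured always has a value
        data.setdefault('is_featured', False)

    schema = SpeakerSchema
    data_layer = {'session': db.session,
                  'model': Speaker,
                  'methods': {'before_create_object': before_create_object}}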

  • Detail: 

SpeakerDetail class provides these endpoints:

 api.route(SpeakerDetail, 'speaker_detail', '/speakers/<int:id>')

The Resource Detail provides methods to facilitate the GET, PATCH and DELETE requests for these endpoints. Methods like before_get_object, before_update_object and after_update_object are derived from the ResourceDetail class. The endpoints return an object of the resource, selected via the view_kwargs, in a JSON response.
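Again as a sketch (not the actual Open Event code), a detail-level hook could be attached like this; the tidy-up shown is illustrative:

# Sketch only: a data-layer hook that runs before a PATCH is applied to a Speaker.
class SpeakerDetail(ResourceDetail):

    def before_update_object(self, speaker, data, view_kwargs):
        # illustrative: keep the organisation name tidy before saving
        if data.get('organisation'):
            data['organisation'] = data['organisation'].strip()

    schema = SpeakerSchema
    data_layer = {'session': db.session,
                  'model': Speaker,
                  'methods': {'before_update_object': before_update_object}}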

  • Relationship:

SpeakerRelationship class, as you might have guessed, provides:

api.route(SpeakerRelationship, 'speaker_event', '/speakers/<int:id>/relationships/event')

api.route(SpeakerRelationship, 'speaker_user', '/speakers/<int:id>/relationships/user')

api.route(SpeakerRelationship, 'speaker_session', '/speakers/<int:id>/relationships/sessions')

SpeakerRelationship class provides methods to create, update and delete relationships between the speaker and related resources – events, users and sessions in this case.

The above is a bare-bones API schema example. The actual implementation in the Open Event Server also has many helper methods to cater to our specific needs.


Documenting Open Event API Using API-Blueprint

FOSSASIA's Open Event Server API documentation is written using API Blueprint. API Blueprint is a language used to describe an API in a blueprint file (or a set of files). To work with the blueprint, the Apiary editor is used; it renders the API Blueprint and presents the result as human-readable API documentation. We create the API blueprint manually.

Using API Blueprint:
We create the API blueprint by first adding the name and metadata of the API we aim to design. This step looks like this:

FORMAT: V1
HOST: https://api.eventyay.com

# Open Event API Server

The Open Event API Server

# Group Authentication

The API uses JWT Authentication to authenticate users to the server. For authentication, you need to be a registered user. Once you have registered yourself as a user, you can send a request to get the access_token. This access_token then needs to be used in the Authorization header while sending a request, in the following manner: `Authorization: JWT <access_token>`


An API blueprint starts with the metadata; here FORMAT and HOST are the defined metadata. The FORMAT keyword specifies the version of API Blueprint, and HOST defines the host for the API.

The heading starts with # and the first heading is regarded as the name of the API.

NOTE – All headings start with one or more # symbols. The number of symbols indicates the level of the heading: one # = top level, ## = second level, and so on. This is in compliance with the normal markdown format.
Following the heading section comes the description of the API. Further headings are used to break up the description section.

Resource Groups:
—————————–
    By using the group keyword at the start of a heading, we create a group of related resources. In the example below we have created a Group Users.

# Group Users

For using the API you (mostly) need to register as a user. Registering gives you access to all non-admin API endpoints. After registration, you need to create your JWT access token to send requests to the API endpoints.


| Parameter | Description | Type | Required |
|:----------|-------------|------|----------|
| `name`  | Name of the user | string | - |
| `password` | Password of the user | string | **yes** |
| `email` | Email of the user | string | **yes** |

 

Resources:
——————
    In the Group Users we have created a resource, Users Collection. The heading specifies, inside square brackets after the heading, the URI used to access the resource. We have used parameters in the resource URI here, which brings us to how parameters are added to the URI. The code below shows how to add parameters to the resource URI.

## Users Collection [/v1/users{?page%5bsize%5d,page%5bnumber%5d,sort,filter}]
+ Parameters
    + page%5bsize%5d (optional, integer, `10`) - Maximum number of resources in a single paginated response.
    + page%5bnumber%5d (optional, integer, `2`) - Page number to be fetched for the paginated response.
    + sort (optional, string, `email`) - Sort the resources according to the given attribute in ascending order. Append '-' to sort in descending order.
    + filter (optional, string, ``) - Filter according to the flask-rest-jsonapi filtering system. Please refer: http://flask-rest-jsonapi.readthedocs.io/en/latest/filtering.html for more.

 

Actions:
————–
    An action is specified with a sub-heading inside a resource, giving the name of the action followed by the HTTP method inside square brackets.
    Before we go further, let us discuss what a payload is. A payload is an HTTP transaction message including its description and any additional assets, such as an entity-body validation schema.

There are two payloads inside an Action:

  1. Request: It is a payload containing one specific HTTP Request, with Headers and an optional body.
  2. Response: It is a payload containing one HTTP Response.

A payload may have an identifier: a string for a request payload or an HTTP status code for a response payload.

+ Request

    + Headers

            Accept: application/vnd.api+json

            Authorization: JWT <Auth Key>

+ Response 200 (application/vnd.api+json)


Types of HTTP methods for Actions:

  • GET – In this action, we simply send header data like Accept and Authorization and no body. Along with it we can send some GET parameters like page[size]. There are two cases for GET: List and Detail. For example, if we consider users, a GET for List retrieves information about all users in the response, while Detail retrieves information about a particular user.

The API Blueprint implementations of both the GET List and Detail request and response are as follows.

### List All Users [GET]
Get a list of Users.

+ Request

    + Headers

            Accept: application/vnd.api+json

            Authorization: JWT <Auth Key>

+ Response 200 (application/vnd.api+json)

        {
            "meta": {
                "count": 2
            },
            "data": [
                {
                    "attributes": {
                        "is-admin": true,
                        "last-name": null,
                        "instagram-url": null,

 

### Get Details [GET]
Get a single user.

+ Request

    + Headers

            Accept: application/vnd.api+json

            Authorization: JWT <Auth Key>

+ Response 200 (application/vnd.api+json)

        {
            "data": {
                "attributes": {
                    "is-admin": false,
                    "last-name": "Doe",
                    "instagram-url": "http://instagram.com/instagram",

 

  • POST – In this action, apart from the header information, we also need to send data. The data must conform to the JSON API specification. A POST body for a users API would look something like this:
### Create User [POST]
Create a new user using an email, password and an optional name.

+ Request (application/vnd.api+json)

    + Headers

            Authorization: JWT <Auth Key>

    + Body

            {
              "data":
              {
                "attributes":
                {
                  "email": "example@example.com",
                  "password": "password",


A POST request with this data would create a new entry in the table and then return, in JSON API format, the particular entry that was made along with the id assigned to it.
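For illustration, a client could send the same POST from Python with the requests library. This is a sketch: the endpoint follows the blueprint above (HOST plus /v1/users) and the token is a placeholder.

import json
import requests

headers = {
    "Content-Type": "application/vnd.api+json",
    "Authorization": "JWT <access_token>",   # placeholder token
}
payload = {
    "data": {
        "type": "user",
        "attributes": {
            "email": "example@example.com",
            "password": "password",
        }
    }
}
# POST the JSON API document to the Users Collection endpoint
response = requests.post("https://api.eventyay.com/v1/users",
                         headers=headers, data=json.dumps(payload))
print(response.status_code, response.json()["data"]["id"])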

  • PATCH – In this action, we change or update an already existing entry in the database. So it has header data like all other requests and body data which is almost similar to POST, except that it also needs to mention the id of the entry we are trying to modify.
### Update User [PATCH]
+ `id` (integer) - ID of the record to update **(required)**

Update a single user by setting the email, password and/or name.

Authorized user should be same as user in request body or must be admin.

+ Request (application/vnd.api+json)

    + Headers

            Authorization: JWT <Auth Key>

    + Body

            {
              "data": {
                "attributes": {
                  "password": "password1",
                  "avatar_url": "http://example1.com/example1.png",
                  "first-name": "Jane",
                  "last-name": "Dough",
                  "details": "example1",
                  "contact": "example1",
                  "facebook-url": "http://facebook.com/facebook1",
                  "twitter-url": "http://twitter.com/twitter1",
                  "instagram-url": "http://instagram.com/instagram1",
                  "google-plus-url": "http://plus.google.com/plus.google1",
                  "thumbnail-image-url": "http://example1.com/example1.png",
                  "small-image-url": "http://example1.com/example1.png",
                  "icon-image-url": "http://example1.com/example1.png"
                },
                "type": "user",
                "id": "2"
              }
            }

Just like in POST, after we have updated our entry, we get back the new updated entry in the database as the response.

  • DELETE – In this action, we delete an entry from the database. The entry in our case is soft-deleted by default, which means that instead of deleting it from the database, we set the deleted_at column to the time of deletion. For deleting, we just need to send header data and a DELETE request to the proper endpoint. If deleted successfully, we get a response with the message "Object successfully deleted".
### Delete User [DELETE]
Delete a single user.

+ Request

    + Headers

            Accept: application/vnd.api+json

            Authorization: JWT <Auth Key>

+ Response 200 (application/vnd.api+json)

        {
          "meta": {
            "message": "Object successfully deleted"
          },
          "jsonapi": {
            "version": "1.0"
          }
        }


How do we check the document after manually writing all of this? We can use the Apiary website to render it, or simply use a different renderer. How? Check out my next blog post on Apiary and Aglio.

Learn more about API Blueprint here: https://apiblueprint.org/


Open Event Server: Working with Migration Files

FOSSASIA's Open Event Server uses Alembic migration files to handle all database operations and updates. From creating tables to updating tables and the database, everything works with the help of the migration files.
However, we often miss that automatically generated migration files mainly drop and add columns rather than just changing them. One example of this would be:

def upgrade():
    ### commands auto generated by Alembic - please adjust! ###
    op.add_column('session', sa.Column('submission_date', sa.DateTime(), nullable=True))
    op.drop_column('session', 'date_of_submission')

Here, the idea was to change has_session_speakers (string) to is_session_speakers_enabled (boolean), which resulted in the column being dropped entirely and a new boolean column being created. We realize that, on doing so, all the data under has_session_speakers is lost.

How to solve that? Here are two ways we can follow up:

  • op.alter_column:
    ———————————-

When the update is as simple as changing a column name, we can use this. As discussed above, if we migrate directly after changing a column in our model, the automatically generated migration would drop the old column and create a new column with the changes. But doing this in production would cause a huge loss of data, which we don't want. Suppose we just want to rename the column start_time to starts_at; we don't want the entire column to be dropped. An alternative is to use op.alter_column. The two necessary parameters of op.alter_column are the table name and the column you are willing to alter. The other parameters carry the new changes. Some of the commonly used parameters are:

  1. nullable – Optional: specify True or False to alter the column's nullability.
  2. new_column_name – Optional: specify a string name here to indicate the new name within a column rename operation.
  3. type_ – Optional: a TypeEngine type object to specify a change to the column's type. For SQLAlchemy types that also indicate a constraint (i.e. Boolean, Enum), the constraint is also generated.
  4. autoincrement – Optional: set the AUTO_INCREMENT flag of the column; currently understood by the MySQL dialect.
  5. existing_type – Optional: a TypeEngine type object to specify the previous type. This is required for all column alter operations that don't otherwise specify a new type, as well as for when nullability is being changed on a column.

    So, for example, if you want to change a column name from "start_time" to "starts_at" in the events table, you would write:
    op.alter_column('events', 'start_time', new_column_name='starts_at')
def upgrade():
    ### commands auto generated by Alembic - please adjust! ###
    op.alter_column('sessions_version', 'end_time', new_column_name='ends_at')
    op.alter_column('sessions_version', 'start_time', new_column_name='starts_at')
    op.alter_column('events_version', 'end_time', new_column_name='ends_at')
    op.alter_column('events_version', 'start_time', new_column_name='starts_at')


Here, sessions_version and events_version are the tables whose columns start_time and end_time are renamed to starts_at and ends_at using the op.alter_column parameter new_column_name.

  • op.execute:
    ——————–

Now, with alter_column, most alterations to a column's name, constraints or type are achievable. But there can be a separate scenario when changing the column properties. Suppose a table has a column "aspect_ratio", a string column with the values "on" and "off", and we want to convert its type to Boolean True/False. Just changing the column type using the alter_column() function won't work, since we also need to modify the whole data. So, sometimes we need to execute raw SQL commands. To do that, we can use the op.execute() function.
The way it is done:

def upgrade():
    ### commands auto generated by Alembic - please adjust! ###
    op.execute("ALTER TABLE image_sizes ALTER full_aspect TYPE boolean USING CASE "
               "full_aspect WHEN 'on' THEN TRUE ELSE FALSE END", execution_options=None)

    op.execute("ALTER TABLE image_sizes ALTER icon_aspect TYPE boolean USING CASE "
               "icon_aspect WHEN 'on' THEN TRUE ELSE FALSE END", execution_options=None)

    op.execute("ALTER TABLE image_sizes ALTER thumbnail_aspect TYPE boolean USING CASE "
               "thumbnail_aspect WHEN 'on' THEN TRUE ELSE FALSE END", execution_options=None)

A slightly more advanced use of the op.execute() command would be:

op.alter_column('events', 'type', new_column_name='event_type_id')
op.alter_column('events_version', 'type', new_column_name='event_type_id')
op.execute("INSERT INTO event_types(name, slug) SELECT DISTINCT event_type_id, "
           "lower(replace(regexp_replace(event_type_id, '& |,', '', 'g'), ' ', '-')) "
           "FROM events WHERE NOT EXISTS (SELECT 1 FROM event_types "
           "WHERE event_types.name = events.event_type_id) AND event_type_id IS NOT NULL")
op.execute("UPDATE events SET event_type_id = (SELECT id FROM event_types "
           "WHERE event_types.name = events.event_type_id)")
op.execute("ALTER TABLE events ALTER COLUMN event_type_id TYPE integer "
           "USING event_type_id::integer")

In this example:

  • op.alter_column() renames the column type to event_type_id in the events table.
  • op.execute() then does the following:
    1. Inserts into the name column of the event_types table the value of event_type_id (which previously contained the name of the event type) from the events table, and inserts into the slug column of event_types the same value with all letters lowercased, "& " and "," changed to "", and spaces changed to "-".
    2. Checks whether a type with that name already exists, so as to disallow any duplicate entries in the event_types table.
    3. Checks whether event_type_id is null, because name in the event_types table cannot be null.
    4. Finally, updates events so that event_type_id holds the id of the matching event_types row and converts the column to integer.

You can learn more on Alembic migrations here: http://alembic.zzzcomputing.com/en/latest/ops.html


Adding Global Search and Extending Bookmark Views in Open Event Android

When we design an application it is essential that the design and feature set enables the user to find all relevant information she or he is looking for. In the first versions of the Open Event Android App it was difficult to find the Sessions and Speakers related to a certain Track. It was only possible to search for them individually. The user also could not view bookmarks on the Main Page but had to go to a separate tab to view them. These were some capabilities I wanted to add to the app.

In this post I will outline the concepts and advantages of a Global Search and a Home Screen in the app. I took inspiration from the Google I/O 2017 App  that had these features already. And, I am demonstrating how I added a Home Screen which also enabled users to view their bookmarks on the Home Screen itself.

Global Search v/s Local Search

(Screenshots: Local Search vs. Global Search)

If we observe the above screenshots closely, we can see a stark difference in the capabilities of each search.
In the Local Search we are only able to search within the Tracks section and nothing else.
This is fixed in the Global Search page, which exists along with the new home screen.
As all the results a user might need are obtained from a single search, it improves the overall user experience of the app. Another noticeable gap in the previous iteration of the application was that a user had to go to a separate tab to view his/her bookmarks. It would be better for the app to have a home page detailing all the event's/conference's details as well as displaying the user's bookmarks.

New Home

(Screenshots: Home screen, Home screen with Bookmarks, and a demo of the new Home screen)

The above screenshots and GIFs indicate the functioning and the UI/UX of the new home screen within the app.
Currently I am working to further improve the way the bookmarks are displayed.
The new home screen provides the user with the event details, i.e. FOSSASIA 2017 in this case. This would be different for each conference/event, and the data is fetched from the open-event-orga server (the first part of the project) if it doesn't already exist in the JSON files provided in the assets folder of the application. All the event information is populated from the JSON files provided in the assets folder in the app directory structure:

  • config.json
  • sponsors.json
  • microlocations.json
  • event.json(this stores the information that we see on the home screen)
  • sessions.json
  • speakers.json
  • track.json

All the file names are descriptive enough to denote what each of them stores. I hope that I have put forward why the addition of a new Home with bookmarks, along with the Global Search feature, was a neat addition to the app.

Link to PR for this feature : https://github.com/fossasia/open-event-android/pull/1565


Open Event Server: No (no-wrap) Ellipsis using jquery!

Yes, the title says it all, i.e., enabling multi-line ellipsis. This was used to solve an issue (#3059) on FOSSASIA's Open Event Server project: keeping the session abstract view within 200 characters.

There is one way to ellipsize a paragraph in HTML/CSS, and that is by using the text-overflow property:

.div_class{
    white-space: nowrap;
    overflow: hidden;
    text-overflow: ellipsis;
}

But the downside of this is the one-line ellipsis. Eg: My name is Medozonuo. I am…..

And here you might pretty much want to ellipsize after a few characters across multiple lines, given that your div space is small and you do want to wrap your paragraph. Or maybe not.

So jquery to the rescue.

There are two ways you can easily do this multiple line ellipsis:

1) Height-Ellipsis (Using the do-while loop):

//script:
if ($('.div_class').height() > 100) {
    var words = $('.div_class').html().split(/\s+/);
    words.push('...');

    do {
        words.splice(-2, 1);
        $('.div_class').html( words.join(' ') );
    } while($('.div_class').height() > 100);
}

Here, you check the div content's height, split the paragraph after that certain height and add "…", the do-while making sure that the paragraph spans multiple lines and not one single line. But watch out for that infinite loop.

2) Length-Ellipsis (Using substring function):  

//script:
$.each($('.div_class'), function() {
        if ($(this).html().length > 100) {
               var cropped_words = $(this).html();
               cropped_words = cropped_words.substring(0, 200) + "...";
               $(this).html(cropped_words);
        }
 });

Here, you check the length/characters rather than the height, take the substring of the content from the 0th to the 200th character and then add the extra "…".

This is exactly how I used it in the code.

$.each($('.short_abstract'), function() {
    if ($(this).html().length > 200) {
        var words = $(this).html();
        words = words.substring(0, 200) + "...";
        $(this).html(words);
    }
});


So ellipsizing paragraphs by height or by length can be done using jQuery, as shown.


DetachedInstanceError: Dealing with Celery, Flask’s app context and SQLAlchemy in the Open Event Server

In the Open Event Server project, we chose to go with Celery for async background tasks. From the official website:

What is celery?

Celery is an asynchronous task queue/job queue based on distributed message passing.

What are tasks?

The execution units, called tasks, are executed concurrently on a single or more worker servers using multiprocessing.

After the tasks had been set up, an error constantly came up whenever a task was called

The error was:

DetachedInstanceError: Instance <User at 0x7f358a4e9550> is not bound to a Session; attribute refresh operation cannot proceed

The above error usually occurs when you try to access an instance bound to a session after that session has been closed. The session may have been closed by an explicit session.close() call or after committing it with session.commit().
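To see the failure in isolation, here is a tiny standalone sketch (not project code) that reproduces the same error:

from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    email = Column(String)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add(User(email='test@example.com'))
session.commit()

user = session.query(User).first()
session.commit()   # expires the loaded attributes (expire_on_commit=True by default)
session.close()    # detaches the instance from its session

print(user.email)  # DetachedInstanceError: attribute refresh operation cannot proceed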

The celery tasks in question were performing some database operations, so the first thought was that maybe these operations might be causing the error. To test this theory, the celery task was changed to:

@celery.task(name='lorem.ipsum')
def lorem_ipsum():
    pass

But sadly, the error still remained. This proved that the celery task was fine and that the session was being closed whenever the celery task was called. The method in which the celery task was being called was of the following form:

def restore_session(session_id):
    session = DataGetter.get_session(session_id)
    session.deleted_at = None
    lorem_ipsum.delay()
    save_to_db(session, "Session restored from Trash")
    update_version(session.event_id, False, 'sessions_ver')


In our app, the app context was not being passed whenever a celery task was initiated. Thus, the celery task, whenever called, closed the previous app_context, eventually closing the session along with it. The solution to this error is to follow the pattern suggested at http://flask.pocoo.org/docs/0.12/patterns/celery/.

def make_celery(app):
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    task_base = celery.Task

    class ContextTask(task_base):
        abstract = True

        def __call__(self, *args, **kwargs):
            if current_app.config['TESTING']:
                with app.test_request_context():
                    return task_base.__call__(self, *args, **kwargs)
            with app.app_context():
                return task_base.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery

celery = make_celery(current_app)


The __call__ method ensures that the celery task is provided with a proper app context to work with.
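With this in place, a task can safely touch the database. A minimal sketch (the task name and import paths are illustrative, not the actual Open Event code):

from app.models import db                 # assumed Flask-SQLAlchemy handle
from app.models.session import Session    # assumed model module

@celery.task(name='session.restore')
def restore_session_task(session_id):
    # Because ContextTask wraps __call__ in app.app_context(), the SQLAlchemy
    # session is usable here without raising DetachedInstanceError.
    session = Session.query.get(session_id)
    session.deleted_at = None
    db.session.commit()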

 


Building the Scheduler UI

{ Repost from my personal blog @ https://blog.codezero.xyz/building-the-scheduler-ui }

If you hadn’t already noticed, Open Event has got a shiny new feature. A graphical and an Interactive scheduler to organize sessions into their respective rooms and timings.

As you can see in the above screenshot, we have a timeline on the left and a lot of session boxes to its right. All the boxes are resizable and drag-and-droppable. The columns represent the different rooms (a.k.a. micro-locations). The sessions can be dropped into their respective rooms. Above the timeline is a toolbar that controls the date. The timeline can be changed for each date by clicking on the respective date button.

The Clear overlaps button automatically checks the timeline and removes any sessions that are overlapping each other. The removed sessions are moved to the unscheduled sessions pane on the left.

The Add new micro-location button can be used to instantly add a new room. A modal dialog would open and the micro-location will be instantly added to the timeline once saved.

The Export as iCal allows the organizer to export all the sessions of that event in the popular iCalendar format which can then be imported into various calendar applications.

The Export as PNG button saves the entire timeline as a PNG image file, which can then be printed by the organizers or circulated via other means if necessary.

Core Dependencies

The scheduler makes use of some javascript libraries for the implementation of most of the core functionality

  • Interact.js – For drag-and-drop and resizing
  • Lodash – For array/object manipulations and object cloning
  • jQuery – For DOM Manipulation
  • Moment.js – For date time parsing and calculation
  • Swagger JS – For communicating with our API that is documented according to the swagger specs.
Retrieving data via the API

The Swagger JS client is used to obtain the sessions data via the API. The client is asynchronously initialized on page load and can be accessed from anywhere using the javascript function initializeSwaggerClient.

The swagger initialization function accepts a callback which is called once the client has been initialized; if the client is already initialized, the callback is called right away.

var swaggerConfigUrl = window.location.protocol + "//" + window.location.host + "/api/v2/swagger.json";  
window.swagger_loaded = false;  
function initializeSwaggerClient(callback) {  
    if (!window.swagger_loaded) {
        window.api = new SwaggerClient({
            url: swaggerConfigUrl,
            success: function () {
                window.swagger_loaded = true;
                if (callback) {
                    callback();
                }

            }
        });
    } else {
        if (callback) {
            callback();
        }
    }
}

For getting all the sessions of an event, we can do,

initializeSwaggerClient(function () {  
    api.sessions.get_session_list({event_id: eventId}, function (sessionData) {
        var sessions = sessionData.obj;  // Here we have an array of session objects
    });
});

In a similar fashion, all the micro-locations of an event can also be loaded.

Processing the sessions and micro-locations

Each session object is looped through; its start time and end time are parsed into moment objects, its duration is calculated, and its distance from the top of the timeline is calculated in pixels. The new object, with this additional information, is stored in an in-memory data store, keyed by the date of the event, for use in the timeline.

The time configuration is specified in a separate time object.

var time = {  
    start: {
        hours: 0,
        minutes: 0
    },
    end: {
        hours: 23,
        minutes: 59
    },
    unit: {
        minutes: 15,
        pixels: 48,
        count: 0
    },
    format: "YYYY-MM-DD HH:mm:ss"
};

The smallest unit of measurement is 15 minutes and 48px === 15 minutes in the timeline.

Each day of the event is stored in a separate array in the form Do MMMM YYYY (e.g. 2nd May 2013).

The array of micro-location objects is sorted alphabetically by the room name.

Displaying sessions and micro-locations on the timeline

According to the day selected, the sessions for that day are displayed on the timeline. Based on its time, the distance of each session div from the top of the timeline is calculated in pixels and the session box is positioned absolutely. The height of the session in pixels is calculated from its duration and set.

For pixels-minutes conversion, the following are used.

/**
 * Convert minutes to pixels based on the time unit configuration
 * @param {number} minutes The minutes that need to be converted to pixels
 * @returns {number} The pixels
 */
function minutesToPixels(minutes) {  
    minutes = Math.abs(minutes);
    return (minutes / time.unit.minutes) * time.unit.pixels;
}

/**
 * Convert pixels to minutes based on the time unit configuration
 * @param {number} pixels The pixels that need to be converted to minutes
 * @returns {number} The minutes
 */
function pixelsToMinutes(pixels) {  
    pixels = Math.abs(pixels);
    return (pixels / time.unit.pixels) * time.unit.minutes;
}
Adding interactivity to the session elements

Interact.js is used to provide interactive capabilities such as drag-drop and resizing.

To know how to use Interact.js, you can checkout some previous blog posts on the same, Interact.js + drag-drop and Interact.js + resizing.

Updating the session information in database on every change

We have to update the session information in the database whenever a session is moved or resized. Every time that happens, a jQuery event is triggered on $(document) along with the session object as the payload.

We listen to this event, and make an API request with the new session object to update the session information in the database.


The scheduler UI is more complex than described in this blog post. To know more about it, you can check out the scheduler's javascript code at app/static/js/admin/event/scheduler.js.


Unit Testing

There are many stories about unit testing. Developers sometimes say that they don't write tests because they write good quality code. Does that make sense, if no one is infallible?

At university only a few teachers talk about unit testing, and they only show basic examples. They require a few tests to be written to finish the final project, but nobody really teaches us the importance of unit testing.

I have also always wondered what benefits it can bring. As time is a really important factor in our work, it often happens that we simply drop this part of the development process to get "more time", rather than spending time on writing "stupid" tests. But now I know that it is a vicious circle.

Customers' requirements do not help us. They put high pressure on us to show visible results, not a few statistics about coverage status. None of them cares about some strange numbers. So, as I mentioned above, we usually focus on building new features and get rid of tests. It may seem to save time, but it doesn't.

In reality, tests save us a lot of time because we can identify and fix bugs very quickly. If a bug occurs because of someone's change, we don't have to spend long hours trying to figure out what is going on. That's why we need tests.

It is especially visible in huge open source projects. The FOSSASIA organization has about 200 contributors. In the Open Event project we have about 20 active developers, who generate many lines of code every single day. Much of that code changes over and over again and interferes with other code.

Let me provide you with a simple example. In our team we have about 7 pull requests per day. As I mentioned above, we want to make our code high quality and free of bugs, but without testing, identifying whether a pull request causes a bug is a very difficult task. Fortunately, Travis CI does this boring job for us. It is a great tool which takes our tests and runs them on every PR to check if bugs occur. It helps us to notice bugs quickly and maintain our project very well.

What is unit testing?

Unit testing is a software development method in which the smallest testable parts of an application are tested

Why do we need writing unit tests?

Let me point out the arguments why unit testing is really important while developing a project.

  • To prove that our code works properly

If a developer adds another condition, the test checks whether the method returns correct results. You simply don't need to wonder if something is wrong with your code.

  • To reduce amount of bugs

It lets you know what input parameters a function should get and what results it should return. You simply don't write unused code.

  • To save development time

Developers don't waste time checking, on every change, whether their code still works correctly.

  • Unit tests help to understand software design
  • To provide quick feedback about the method which you are testing
  • To help document the code

How to write unit test in Python

In my work I write unit tests in Python. I am going to share my sample code with you now.

  • Import module unittest
  • Choose function to test
  • Write unit test
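Before looking at a real test from the project, here is a minimal, self-contained sketch of those three steps (the function under test is a hypothetical helper, not project code):

import unittest


def slugify(text):
    """Hypothetical function under test."""
    return text.strip().lower().replace(' ', '-')


class TestSlugify(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Open Event Server"), "open-event-server")


if __name__ == '__main__':
    unittest.main()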

Example OpenEvent test in Python

class TestPagesUrls(OpenEventTestCase):

    def setUp(self):
        self.app = Setup.create_app()

    def test_if_urls_exist(self):
        """Test all urls via GET method"""
        with app.test_request_context():
            for rule in app.url_map.iter_rules():
                if excluded_paths(rule):
                    status_code = self.app.get(request.url[:-1] + str(rule).replace('//', '/'),
                                               follow_redirects=True).status_code
                    self.assertTrue(status_code in [200, 302, 401])

 

I wanted to check that all views exist, but doing that by hand would require a lot of time. That's why I wondered how to avoid writing many similar tests. Finally, based on our list of routes, I was able to write one test which checks the status code of every page.

If any response returns a status_code different from 200, 302 or 401, the test fails. This result means that something is wrong. Simple, isn't it? Try to test it manually… This one short test covers about 40 use cases…

This example shows the incredible value of unit tests! If a developer introduces a bug in a response, he receives an error that something is wrong with a view. Travis CI allows us to reject all wrong pull requests and merge only those which fulfill our quality requirements.

Fixing an error is one part, but finding a bug is an even harder task. The ability to detect a bug at an early stage of the development process reduces the cost of the software.

 


How can you get access to the Instagram API?


First of all, you need to know that the Instagram API uses the OAuth 2.0 protocol. OAuth 2.0 provides a specific authorization flow for web apps, desktop apps and mobile apps. Instagram requires authentication before handing out information from its API, but don't be afraid, it's very simple.

Pre Requirements:

  • A created account on Instagram
  • A registered client (you can create your own client here)

(Screenshot: Instagram Developer Documentation)

Requirements:

CLIENT_ID - 79e1a142dbeabd57a3308c52ad43e31d
CLIENT_SECRET - 34a6834081c44c20bd11e0a112a6adg1
REDIRECT_URI - http://127.0.0.1:8001/iCallback

You can get above information from https://www.instagram.com/developer/clients/manage/

CODE - You need to open page https://api.instagram.com/oauth/authorize/?client_id=CLIENT-ID&redirect_uri=REDIRECT-URI&response_type=code
https://api.instagram.com/oauth/authorize/?client_id=79e1a142dbeabd57a3308c52ad43e31d&redirect_uri=http://127.0.0.1:8001/iCallback&response_type=code

You will be redirected to

http://your-redirect-uri?code=CODE

In my case it looks like this:

http://127.0.0.1:8001/iCallback?code=2e122f3d76e8125b8b4982f16ed623c2

Now we have all the information needed to get an access token!

 curl -F 'client_id=CLIENT_ID'  
      -F 'client_secret=CLIENT_SECRET' 
      -F 'grant_type=authorization_code'  
      -F 'redirect_uri=REDIRECT_URI' 
      -F 'code=CODE'  https://api.instagram.com/oauth/access_token

If everything is OK, you should receive:

 { "access_token": "fb2e77d.47a0479900504cb3ab4a1f626d174d2d",
   "user": { "id": "1574083",
   "username": "rafal_kowalski",
   "full_name": "Rafal Kowalski",
   "profile_picture": "..." } } 

In Open Event we used it to get all media from Instagram, for example to use an image as a background on an event details page:

curl 'https://api.instagram.com/v1/users/self/media/recent/?access_token=ACCESS_TOKEN'

 


Adding revisioning to SQLAlchemy Models

{ Repost from my personal blog @ https://blog.codezero.xyz/adding-revisioning-to-sqlalchemy-models }

In an application like Open Event, where a single piece of information can be edited by multiple users, it’s always good to know who changed what. One should also be able to revert to a previous version if needed. That is where revisioning comes into picture.

We use SQLAlchemy as the database toolkit and ORM. So, we wanted a versioning tool that would work well with our existing setup. That’s when I came across SQLAlchemy-Continuum – a versioning extension for SQLAlchemy.

Installation

The installation of the module is just like any other python library. (don’t forget to add it to your requirements.txt file, if you have one)

pip install SQLAlchemy-Continuum
Setup

Now, it’s time to enable SQLAlchemy-Continuum for the required models.

Let’s consider an Event model.

import sqlalchemy as sa

class Event(Base):
    __tablename__ = 'events'
    id = sa.Column(sa.Integer, primary_key=True, autoincrement=True)
    name = sa.Column(sa.String)
    start_time = sa.Column(sa.DateTime, nullable=False)
    end_time = sa.Column(sa.DateTime, nullable=False)
    description = sa.Column(sa.Text)
    schedule_published_on = sa.Column(sa.DateTime)

We need to do three things to enable SQLAlchemy-Continuum.

  1. Call make_versioned() before the model(s) is/are defined.
  2. Add __versioned__ = {} to all the models that we want to be versioned
  3. Call configure_mappers from SQLAlchemy after declaring all the models.
import sqlalchemy as sa
from sqlalchemy_continuum import make_versioned

# Must be called before defining all the models
make_versioned()

class Event(Base):

    __tablename__ = 'events'
    __versioned__ = {}  # Must be added to all models that are to be versioned

    id = sa.Column(sa.Integer, primary_key=True, autoincrement=True)
    name = sa.Column(sa.String)
    start_time = sa.Column(sa.DateTime, nullable=False)
    end_time = sa.Column(sa.DateTime, nullable=False)
    description = sa.Column(sa.Text)
    schedule_published_on = sa.Column(sa.DateTime)

# Must be called after defining all the models
sa.orm.configure_mappers()

SQLAlchemy-Continuum creates two tables:

  1. events_version which stores the version history for the Event model linked to the transaction table via a foreign key
  2. transaction which stores information about each transaction (like the user who performed the transaction, transaction datetime, etc)

SQLAlchemy-Continuum also adds listeners to Event to record all create, update, delete actions.

Usage

All the CRUD (Create, read, update, delete) operations can be done as usual. SQLAlchemy-Continuum takes care of creating a version record for each CUD operation. The versions can be easily accessed using the versions property that is now available in the Event model.

event = Event(name="FOSSASIA 2017", description="Open source conference in asia")
session.add(event)  # Event added to transaction
session.commit() # Transaction committed and event record created

# This would have created the first version record which can be accessed
event.versions[0].name == "FOSSASIA 2017"

# Let's make some changes to the record.
event.name = "FOSSASIA 2016"
session.add(event)
session.commit()

# This would have created the second version record which can be accessed
event.versions[1].name == "FOSSASIA 2016"

# The first version record still remains and can be accessed
event.versions[0].name == "FOSSASIA 2017"
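
Since the introduction mentioned reverting to a previous version, here is a short sketch based on the SQLAlchemy-Continuum documentation (illustrative, not the exact Open Event code):

# Revert the event to its first recorded version.
old_version = event.versions[0]      # the "FOSSASIA 2017" snapshot
old_version.revert()                 # copies the old values back onto `event`
session.commit()

event.name == "FOSSASIA 2017"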

So, that’s how basic versioning can be implemented in SQLAlchemy using SQLAlchemy-Continuum.
