Implementing Database Migrations to Badgeyay

The Badgeyay project is divided into two parts: a front-end written in Ember.js and a back-end REST API written in Python.

We have integrated PostgreSQL as the object-relational database in Badgeyay, and we use the SQLAlchemy SQL toolkit and Object Relational Mapper to work with the database from Python. Since the back-end uses the Flask microframework, we use Flask-SQLAlchemy, a Flask extension that adds support for SQLAlchemy and its ORM.

One of the challenging jobs is to manage the changes we make to the models and propagate them to the database. For this purpose, I have added migrations to Flask-SQLAlchemy for handling database changes, using the Flask-Migrate extension.

In this blog post, I will discuss how I added these migrations in my pull request.

First, let's understand database models, migrations, and the Flask-Migrate extension. Then we will move on to adding migrations using Flask-Migrate. Let's get started and understand it step by step.

What are Database Models?

A database model defines the logical design and structure of a database, including the relationships and constraints that determine how data can be stored and accessed. Presently, we have a User and a File model in the project.

What are Migrations?

A database migration is the process of propagating schema changes to the database. Migrations enable us to track the modifications we make to the models and apply those changes to the database. For example, if we later change a field in one of the models, all we need to do is create and run a migration, and the database will reflect the change.
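As a simplified, hypothetical sketch (the field names below are illustrative, not the real Badgeyay columns), adding a new column to an existing model changes only the Python code; a migration is what propagates it to the existing database:

from api.db import db  # the shared SQLAlchemy instance used in Badgeyay

class User(db.Model):
    __tablename__ = 'users'
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(80), nullable=False)
    # Hypothetical new field: adding it here changes only the model;
    # a migration is needed to propagate it to the existing database.
    photo_url = db.Column(db.String(255))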

What is Flask Migrate?

Flask-Migrate is an extension that handles SQLAlchemy database migrations for Flask applications using Alembic. The database operations are made available through the Flask command-line interface or through the Flask-Script extension.
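Badgeyay uses the Flask command-line interface, as shown in the steps below. For reference, with the Flask-Script extension the same commands could be exposed roughly like this (a sketch based on the Flask-Migrate documentation, not code from Badgeyay):

from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand

from run import app     # assumes run.py exposes the Flask app
from api.db import db

migrate = Migrate(app, db)
manager = Manager(app)
manager.add_command('db', MigrateCommand)  # exposes 'python manage.py db migrate' etc.

if __name__ == '__main__':
    manager.run()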

Now let’s add support for migration in Badgeyay.

Step 1:

pip install flask-migrate

 

Step 2:

We will need to edit run.py so that it looks like this:

import os
from flask import Flask
from flask_migrate import Migrate  # import Flask-Migrate

from api.db import db
from api.config import config

# ......

db.init_app(app)
migrate = Migrate(app, db)  # this allows us to run migrations
# ......

@app.before_first_request
def create_tables():
    db.create_all()

if __name__ == '__main__':
    app.run()

 

Step 3:

Create the migration directory:

 export FLASK_APP=run.py
 flask db init

 

This will create the migrations directory in the backend API folder.

└── migrations
    ├── README
    ├── alembic.ini
    ├── env.py
    ├── script.py.mako
    └── versions

 

Step 4:

We will create our first migration with the following command.

flask db migrate
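This generates a new script under migrations/versions/. As a rough, hypothetical illustration (continuing the made-up photo_url example from above; the revision ids are also made up), an auto-generated script looks like this:

"""empty message

Revision ID: 1a2b3c4d5e6f
Revises:
"""
from alembic import op
import sqlalchemy as sa

revision = '1a2b3c4d5e6f'
down_revision = None

def upgrade():
    # change detected by comparing the models with the current database
    op.add_column('users', sa.Column('photo_url', sa.String(length=255), nullable=True))

def downgrade():
    op.drop_column('users', 'photo_url')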

 

Step 5:

We will apply the migration with the following command.

flask db upgrade

 

Now we are done setting up migrations for Flask-SQLAlchemy to handle database changes in the Badgeyay repository. We can verify the migration by checking the tables in the database.
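For example, one way to check the tables programmatically (a sketch, assuming run.py exposes the app object) is SQLAlchemy's inspector; Flask-Migrate also records the applied revision in an alembic_version table:

from sqlalchemy import inspect

from run import app   # assumes run.py exposes the Flask app
from api.db import db

with app.app_context():
    inspector = inspect(db.engine)
    print(inspector.get_table_names())  # model tables plus 'alembic_version'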

This is how I added migrations to Flask-SQLAlchemy for handling database changes using the Flask-Migrate extension in my pull request.

Resources:

  • PostgreSQL Docs    – Link
  • Flask Migrate Docs  – Link
  • SQLAlchemy Docs  – Link
  • Flask SQLAlchemy Docs – Link

Database Listener for User Centric Events

Badgeyay is an open-source utility developed by FOSSASIA to generate badges for conferences and events. The project is separated into two components for easier maintainability: the frontend, written in Ember, and the backend, written in Flask. The database chosen to support the backend is PostgreSQL.

Now comes the problem: whenever a user is registered in the database, they should receive a verification mail confirming that they are successfully registered on the platform. For this we have to listen to the database events on the User model. This issue extends beyond only sending a greeting or verification mail to the user; we can use it to trigger any service that depends on user registration, such as subscribing the user to a set of services based on the plan chosen at registration, and many more.

These types of requirements cannot be handled by a normal relationship between tables and other entities; there has to be logic in place to support such functionality. The challenges in tackling the problem are as follows:

  • Listen to the insert_action on the User model
  • Extract the details necessary for the logic
  • Execute the particular logic

Procedure

  1. Attach an insert_action listener to the User model. This function will get triggered whenever an entity is saved in the User model.

@db.event.listens_for(User, "after_insert")
def logic(mapper, connection, target):
    # ......
  2. When the function gets triggered, extract the details of the saved user that are necessary for the logic. As we are currently sending a greeting mail to the user, we only need the user's email. target is the actual saved user, passed as an argument to the listener function by the library.

msg = {}
msg['subject'] = "Welcome to Badgeyay"
msg['receipent'] = target.email
msg['body'] = "It's good to have you onboard with Badgeyay. Welcome to " \
"FOSSASIA Family."
sendMail(msg)
  3. The details are then passed to the sendMail() function, which uses the flask-mail library to send the mail to the recipient.

def sendMail(message):
    if message and message.receipent:
        try:
            msg = Message(
                subject=message.subject,
                sender=app.config['MAIL_USERNAME'],
                recipients=[message.receipent],
                body=message.body)
            Mail(app).send(msg)
        except Exception as e:
            return jsonify(
                Response(500).exceptWithMessage(
                    str(e),
                    'Unable to send the mail'))
        return jsonify(
            Response(200).generateMessage(
                'Mail Sent'))
    else:
        return jsonify(
            Response(403).generateMessage(
                'No data received'))
  4. This will send a mail to the user who has registered on the application.

Similarly, we can attach separate logic according to the needs of the application.
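For instance, a minimal sketch of another listener (hypothetical, not part of this pull request) that reacts whenever an existing user row is updated could look like this; the import path for the User model is assumed:

from api.db import db
from api.models import User  # assumed import path for the User model

@db.event.listens_for(User, "after_update")
def notify_profile_update(mapper, connection, target):
    # Hypothetical logic: react whenever a user row is updated,
    # e.g. queue a "profile changed" notification for target.email.
    print("User {} was updated".format(target.email))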

 

The Pull Request for the above functionality is at this Link

Topics Involved

Working on this issue involved the following topics:

  • Configuring the mail service to allow access from less secure apps
  • Sending mail to the end user with flask-mail
  • Attaching a listener to listen for database changes
  • Extracting data from the object saved in the database by SQLAlchemy

Resources

  • Sending Mails Programmatically –  Link
  • Flask Mail Documentation – Link
  • Listening to database events – Link
  • Enabling access to GMAIL to send mails to recipient – Link

Adding multiple email support for users on Open Event Server

The Open Event Server enables organizers to manage events from concerts to conferences and meet-ups. It offers features for events with several tracks and venues. Event managers can create invitation forms for speakers and build schedules in a drag and drop interface. The event information is stored in a database. The system provides API endpoints to fetch the data, and to modify and update it.

The Open Event Server is based on the JSON API 1.0 specification and hence built on top of flask-rest-jsonapi (for building REST APIs) and Marshmallow (for schemas).

In this blog, we will talk about how to add support for multiple emails per user in Open Event Server. The focus is on the model and schema creation for this support.

Model Creation

For the UserEmail, we’ll make our model as follows

from app.models import db

class UserEmail(db.Model):
    """user email model class"""
    __tablename__ = 'user_emails'
    id = db.Column(db.Integer, primary_key=True)
    email = db.Column(db.String(120), unique=True, nullable=False)
    verified = db.Column(db.Boolean, default=False)
    user_id = db.Column(db.Integer, db.ForeignKey('users.id', ondelete='CASCADE'))
    user = db.relationship("User", backref="emails", foreign_keys=[user_id])

    def __init__(self, email=None, user_id=None):
        self.email = email
        self.user_id = user_id

    def __str__(self):
        return 'User:' + unicode(self.user_id).encode('utf-8') + ' email: ' + unicode(self.email).encode('utf-8')

    def __unicode__(self):
        return unicode(self.id)

Now, let’s try to understand the attributes of this model.

  1. id is the most important column, required in every model; it is the primary key and uniquely identifies a UserEmail object.
  2. email is the attribute that is required, hence it is unique and non-nullable.
  3. verified is used to record whether an email is verified or not (thus it is a boolean).
  4. user_id specifies the id of the user whose email is contained in the UserEmail object.
  5. Finally, user is a relationship to the user with id user_id; the emails associated via User.id == user_id are exposed through the emails backref on the User model.

Schema Creation

For the model UserEmail, we’ll make our schema UserEmailSchema as follows

from marshmallow_jsonapi import fields
from marshmallow_jsonapi.flask import Schema, Relationship

from app.api.helpers.utilities import dasherize


class UserEmailSchema(Schema):
    """API Schema for user email Model"""

    class Meta:
        """Meta class for user email API schema"""
        type_ = 'user-emails'
        self_view = 'v1.user_emails_detail'
        self_view_kwargs = {'id': '<id>'}
        inflect = dasherize

    id = fields.Str(dump_only=True)
    email = fields.Email(allow_none=False)
    user_id = fields.Integer(allow_none=False)
    user = Relationship(attribute='user',
                        self_view='v1.user_email',
                        self_view_kwargs={'id': '<id>'},
                        related_view='v1.user_detail',
                        related_view_kwargs={'user_id': '<id>'},
                        schema='UserSchema',
                        type_='user')

  • Marshmallow-jsonapi provides a simple way to produce JSON API-compliant data in any Python web framework.

Now, let’s try to understand the schema UserEmailSchema

  1. id : same as in the model, id uniquely identifies a UserEmail object.
  2. email : same as in the model, email is required, so allow_none is set to False.
  3. user_id : the id of the user whose email is contained in a UserEmailSchema object.
  4. user : a relationship exposing the attributes of the user to whom this email belongs.

So, we saw how to add multiple email support for users on Open Event Server. We only needed to create a model and its schema to add this feature. Similarly, to add support for any database model in the project, we create a model and a schema with all of the model's attributes. The schema is created following the JSON API 1.0 specification using Marshmallow.
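For completeness, a rough sketch of how such a schema is typically exposed through flask-rest-jsonapi resources is shown below; the class names and import paths here are assumptions for illustration and may differ from the actual Open Event Server code:

from flask_rest_jsonapi import ResourceDetail, ResourceList

from app.models import db
from app.models.user_email import UserEmail              # assumed import path
from app.api.schema.user_emails import UserEmailSchema   # assumed import path


class UserEmailList(ResourceList):
    """List and create user emails"""
    schema = UserEmailSchema
    data_layer = {'session': db.session,
                  'model': UserEmail}


class UserEmailDetail(ResourceDetail):
    """Detail, update and delete a single user email"""
    schema = UserEmailSchema
    data_layer = {'session': db.session,
                  'model': UserEmail}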


How to use Realm in SUSI Android to Save Data

Sometimes we need to store information locally on the device so that we can use it offline and query it faster. Initially, SQLite was the only option for storing information on the device. But working with SQLite is sometimes difficult, it can make code harder to understand, and SQL queries can take a long time. Now we have Realm, a better alternative to SQLite. Realm is a lightweight mobile database and a good substitute for SQLite. Realm has its own C++ core and stores data in a universal, table-based format. This allows data access from multiple languages as well as a range of queries. In this blog post, I will show why we used Realm and how we save data in SUSI Android using Realm.

“How about performance? Well, we’re glad you asked 🙂 For all the API goodness & development productivity we give you, we’re still up to 100x faster than some SQLite ORMs and on average ~10x faster than raw SQLite and common ORMs for typical operations.” (compare: https://blog.realm.io/realm-for-android/)

Advantages of Realm over SQLite are following:

  • It is faster than SQLite, as explained on the Realm blog. One of the reasons Realm is faster than SQLite is that the traditional SQLite + ORM abstraction is leaky, because the ORM simply converts objects and their methods into SQL statements. Realm, on the other hand, is an object database, meaning your objects directly reflect your database.
  • It is easier to use, as it uses objects for storing data. When we use SQLite we need boilerplate code to convert values to and from the database, set up mappings between classes and tables, fields and columns, foreign keys, etc. In Realm, data is directly exposed as objects and can be queried without any conversion.

Prerequisites

To include this library in your project you need

  • Android studio version 1.5.1 or higher.
  • JDK version 7.0 or higher.
  • Android API level 9 or higher.

How to use realm in Android

To use Realm in your project, add the library dependency in the project-level build.gradle file

 dependencies {
       classpath "io.realm:realm-gradle-plugin:3.3.1"
   }

and build.gradle(module) file.

apply plugin: 'realm-android'
dependencies {
    compile 'io.realm:android-adapters:1.3.0'
}

Now you have to initialize Realm in your Application class. Setting a default configuration there will ensure that it is available in the rest of your code.

RealmConfiguration realmConfiguration = new RealmConfiguration.Builder(this)
                                                              .deleteRealmIfMigrationNeeded().build();
Realm.setDefaultConfiguration(realmConfiguration);

Now we need to create a model class. A model class is used to save data in Realm and retrieve saved data, and it must extend the RealmObject class. For example:

public class Person extends RealmObject {
   private String name;
   public String getName() {
       return name;
   }
   public void setName(String name) {
       this.name = name;
   }
}

Fields in the model class define the columns; for example, 'name' is a column name. Methods like setName() are used to save data and getName() to retrieve saved data.

Now create an instance of the Realm in the activity where you want to use it. It will be used to read data from the Realm and write data to the Realm.

Realm realm = Realm.getInstance(this);

Before you start a new transaction you must call beginTransaction(), which begins a write transaction on the database.

realm.beginTransaction();

To write data to the Realm you need to create an instance of the model class. createObject() is used to create an instance of a RealmObject subclass; since our model class extends RealmObject, we use createObject() to create an instance of it.

Person person = realm.createObject(Person.class);

Write data to realm.

person.setName("MSDHONI");

After that you must call commitTransaction(), which ends the transaction and persists the changes.

realm.commitTransaction();

Reading data from Realm is easier than writing data to it. You need to create an instance of the Realm.

Realm realm = Realm.getInstance(this);

To create a query, use the where() method and pass the class of the object you want to query. After creating the query, you can fetch all data using the findAll() method.

realm.where(Person.class).findAll();


Filtering List with Search Manager in Connfa Android App

It is a good practice to provide the facility to filter lists in Android apps to improve the user experience. It often becomes very unpleasant to scroll through the entire list when you want to reach a certain data point. Recently I modified the Connfa app to read the list of speakers from the Open Event Format. In this blog I describe how to add a filtering facility to lists with Search Manager.

First, we declare the search menu so that the widget appears in it.

<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android">
    <item android:id="@+id/search"
        android:title="Search"
        android:icon="@drawable/search"
        android:showAsAction="collapseActionView ifRoom"
        android:actionViewClass="android.widget.SearchView" />
</menu>

In the above menu item, the collapseActionView attribute allows your SearchView to expand to take up the whole action bar and to collapse back down into a normal action bar item when not in use. Now we create the searchable configuration, which defines how the SearchView behaves.

<?xml version="1.0" encoding="utf-8"?>
<searchable
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:label="@string/app_name"
    android:hint="Search friend">
</searchable>

Also add this to the activity it will be used with, via a <meta-data> tag in the manifest file. Then associate the searchable configuration with the SearchView in the activity class:

@Override
public boolean onCreateOptionsMenu(Menu menu) {
    MenuInflater inflater = getMenuInflater();
    inflater.inflate(R.menu.search_menu, menu);

    SearchManager searchManager = (SearchManager)
                            getSystemService(Context.SEARCH_SERVICE);
    searchMenuItem = menu.findItem(R.id.search);
    searchView = (SearchView) searchMenuItem.getActionView();

    searchView.setSearchableInfo(searchManager.
                            getSearchableInfo(getComponentName()));
    searchView.setSubmitButtonEnabled(true);
    searchView.setOnQueryTextListener(this);

    return true;
}

Implement SearchView.OnQueryTextListener in the activity; we need to override two new methods now:

@Override
public boolean onQueryTextSubmit(String searchText) {
  
  return true;
}

@Override
public boolean onQueryTextChange(String searchedText) {

   if (mSpeakersAdapter != null) {
       lastSearchRequest = searchedText;
       mSpeakersAdapter.getFilter().filter(searchedText);
   }
   return true;
}

Find the complete implementation here.

 

References

Android Search View documentation – https://developer.android.com/reference/android/widget/SearchView.html


Using Data Access Object to Store Information

We often need to store information received from the network so that we can retrieve it later. Although we can store and read data directly, using a data access object (DAO) lets us perform data operations without exposing the details of the database. Using a data access object is also a best practice in software engineering. Recently I modified the Connfa app to store the data received in the Open Event format. In this blog, I describe how to use a data access object.

The goal is to abstract and encapsulate all access to the data and provide an interface. This is called the Data Access Object pattern. In a nutshell, the DAO “knows” which data source (a database, a flat file or even a web service) to connect to and is specific to that data source. It makes no difference to the application whether it accesses a relational database or parses XML files (through a DAO). The DAO is usually able to create an instance of a data object (“to read data”) and also to persist data (“to save data”) to the data source.

Consider the example from the Connfa app in which we get the tracks from the API and store them in an SQL database. We use a DAO to create a layer between the model and the database. AbstractEntityDAO is an abstract class which has the functions to perform CRUD operations. We extend it to implement them in our DAO model. Here is the TrackDao structure:

public class TrackDao extends AbstractEntityDAO<Track, Long> {

    public static final String TABLE_NAME = "table_track";

    @Override
    protected String getSearchCondition() {
        return "_id=?";
    }
    
    ...
}

Find the complete class, with the detailed methods that implement search conditions, get key columns, create instances etc., here.

Here is a general method to get the data from the database, where getFacade() returns the facade object that represents the underlying data source.

public List<ClassToSave> getAllSafe() {
   ILAPIDBFacade facade = getFacade();
   try {
       facade.open();
       return getAll();

   } finally {
       facade.close();
   }
}

Now we can create an instance and use these methods instead of using SQL operations directly. This function gets the data and sorts it accordingly:

private TrackDao mTrackDao;
 public List<Track> getTracks() {
   List<Track> tracks = mTrackDao.getAllSafe();
   Collections.sort(tracks, new Comparator<Track>() {
       @Override
       public int compare(Track track, Track track2) {
           return Double.compare(track.getOrder(), track2.getOrder());
       }
   });
   return tracks;
}


How User Event Roles relationship is handled in Open Event Server

Users and Events are the most important parts of FOSSASIA's Open Event Server. As the project has evolved, the way user event roles are implemented has gone through many changes. When the Open Event Organizer Server was first decoupled to serve as an API server, it was decided that user event roles, like all other models, would be served as a separate API to provide a data layer above the database for making changes to the entries. Whenever a new role invite was accepted, a POST request was made to the Users Events Roles table to insert the new entry. Whenever there was a change in the role of a user for a particular event, a PATCH request was made. Permissions were set up so that a user could insert only his/her own user id and not someone else's entry.

    def before_create_object(self, data, view_kwargs):
        """
        method to create object before post
        :param data:
        :param view_kwargs:
        :return:
        """
        if view_kwargs.get('event_id'):
            event = safe_query(self, Event, 'id', view_kwargs['event_id'], 'event_id')
            data['event_id'] = event.id

        elif view_kwargs.get('event_identifier'):
            event = safe_query(self, Event, 'identifier', view_kwargs['event_identifier'], 'event_identifier')
            data['event_id'] = event.id
        email = safe_query(self, User, 'id', data['user'], 'user_id').email
        invite = self.session.query(RoleInvite).filter_by(email=email).filter_by(role_id=data['role'])\
                .filter_by(event_id=data['event_id']).one_or_none()
        if not invite:
            raise ObjectNotFound({'parameter': 'invite'}, "Object: not found")

    def after_create_object(self, obj, data, view_kwargs):
        """
        method to create object after post
        :param data:
        :param view_kwargs:
        :return:
        """
        email = safe_query(self, User, 'id', data['user'], 'user_id').email
        invite = self.session.query(RoleInvite).filter_by(email=email).filter_by(role_id=data['role'])\
                .filter_by(event_id=data['event_id']).one_or_none()
        if invite:
            invite.status = "accepted"
            save_to_db(invite)
        else:
            raise ObjectNotFound({'parameter': 'invite'}, "Object: not found")


Initially, when a POST request was sent to the User Event Roles API endpoint, we would first check whether a role invite from the organizer existed for that particular combination of user, event and role. Only if it existed would we make an entry in the database; otherwise we would raise an “Object: not found” error. After the entry was made in the database, we would update the role_invites table to change the status of the role invite.

Later it was decided that we need not make a separate API endpoint. Since API endpoints are all user accessible and may cause problems with permissions, it was decided that user event roles would be handled entirely through the model instead of a separate API. Also, the workflow wasn't very clear for a user. So we decided on a workflow where the role_invites table is first updated with the particular status, and after the update has been made, we make an entry in the user_event_roles table with the data that we get from the role_invites table.

When a role invite is accepted, SQLAlchemy's add() and commit() are used to insert a new entry into the table. When a role is changed for a particular user, we query the row, update the values and save it back into the table. So the entire process is handled at the data layer level rather than the API level.
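As a minimal sketch of that update pattern (assuming the shared flask-sqlalchemy session; the variable names here are illustrative, not the exact Open Event Server code):

from app.models import db  # assumed import path for the flask-sqlalchemy instance

# fetch the existing row, update it and save it back
uer = UsersEventsRoles.query.filter_by(user=user).filter_by(event=event).first()
uer.role = new_role
db.session.add(uer)
db.session.commit()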

The code implementation is as follows:

    def before_update_object(self, role_invite, data, view_kwargs):
        """
        Method to edit object
        :param role_invite:
        :param data:
        :param view_kwargs:
        :return:
        """
        user = User.query.filter_by(email=role_invite.email).first()
        if user:
            if not has_access('is_user_itself', id=user.id):
                raise UnprocessableEntity({'source': ''}, "Only users can edit their own status")
        if not user and not has_access('is_organizer', event_id=role_invite.event_id):
            raise UnprocessableEntity({'source': ''}, "User not registered")
        if not has_access('is_organizer', event_id=role_invite.event_id) and (len(data.keys())>1 or 'status' not in data):
            raise UnprocessableEntity({'source': ''}, "You can only change your status")

    def after_update_object(self, role_invite, data, view_kwargs):
        user = User.query.filter_by(email=role_invite.email).first()
        if 'status' in data and data['status'] == 'accepted':
            role = Role.query.filter_by(name=role_invite.role_name).first()
            event = Event.query.filter_by(id=role_invite.event_id).first()
            uer = UsersEventsRoles.query.filter_by(user=user).filter_by(event=event).filter_by(role=role).first()
            if not uer:
                uer = UsersEventsRoles(user, event, role)
                save_to_db(uer, 'Role Invite accepted')


In the above code, there are two main functions: before_update_object, which gets executed before the entry in the role_invites table is updated, and after_update_object, which gets executed after.

In before_update_object, we verify that the user is accepting or rejecting his/her own role invite and not someone else's. We also ensure that the user is allowed to update only the status of the role invite and not other sensitive data like the role_name or email. If the user tries to edit any field other than status, an error is shown, unless the user has organizer access, in which case he/she can edit the other fields of the role_invites table as well. The has_access() helper function performs these permission checks.

In after_update_object we make the entry in the users events roles table. From the role_invite parameter we can get the exact values of the newly updated row in the table. We use the data of this role invite to find the user, event and role associated with it. Then we create a UsersEventsRoles object with the user, event and role as constructor parameters, and use the save_to_db helper function to save the new entry to the database. The save_to_db function uses flask-sqlalchemy's session.add() and session.commit() to add the new entry directly to the database.
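For reference, here is a minimal sketch of what such a save_to_db helper can look like, assuming the shared flask-sqlalchemy db object; the actual helper in Open Event Server may differ (for example, it can also log the message and handle failures differently):

from app.models import db  # assumed import path for the flask-sqlalchemy instance


def save_to_db(item, msg="Saved to db"):
    """Persist an item using the shared SQLAlchemy session."""
    # 'msg' mirrors the log message passed in the snippets above; logging is omitted here.
    try:
        db.session.add(item)
        db.session.commit()
    except Exception:
        # roll back so the session stays usable after a failed commit
        db.session.rollback()
        raise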

Thus, we maintain the flow of the user event roles relationship. All database entries and operations related to the users-events-roles table remain encapsulated from the client, so users can use the various API features without thinking about the complications of the implementation.

 


Save Chat Messages using Realm in SUSI iOS

Fetching data from the server each time causes network load and makes the app depend on the server and the network in order to display data. We use an offline database to store chat messages so that we can show messages to the user even when the network is not present, which makes for a better user experience. Realm is used as the data storage solution due to its ease of use, speed and efficiency. The reasons for choosing Realm to save messages received from the server locally in SUSI iOS are mentioned below.

The major upsides of Realm are:

  • It’s absolutely free of charge.
  • Fast and easy to use.
  • Unlimited use.
  • Works on its own persistence engine for speed and performance.

Below are the steps to install and use Realm in the iOS Client:

Installation:

  • Install Cocoapods
  • Run `pod repo update` in the root folder
  • In your Podfile, add use_frameworks! and pod ‘RealmSwift’ to your main and test targets.
  • From the command line run `pod install`
  • Use the `.xcworkspace` file generated by Cocoapods in the project folder alongside `.xcodeproj` file

After installation we start by importing `Realm` in the `AppDelegate` file and start configuring Realm as below:

func initializeRealm() {
        var config = Realm.Configuration(schemaVersion: 1,
            migrationBlock: { _, oldSchemaVersion in
                if (oldSchemaVersion < 0) {
                    // Nothing to do!
                }
        })
        config.fileURL = config.fileURL?.deletingLastPathComponent().appendingPathComponent("susi.realm")
        Realm.Configuration.defaultConfiguration = config
}

Next, let’s head over to creating a few models which will be used to save the data to the DB as well as to retrieve that data so it can be easily used. Since the SUSI server has a number of action types, we will cover some of the action types, their models, and how they are used to store and retrieve data. Below are the currently available action types that the server supports.

enum ActionType: String {
  case answer
  case websearch
  case rss
  case table
  case map 
  case anchor
}

Let’s start with the creation of the base model called `Message`. To make it a RealmObject, we import `RealmSwift` and inherit from `Object`

class Message: Object {
  dynamic var queryDate = NSDate()
  dynamic var answerDate = NSDate()
  dynamic var message: String = ""
  dynamic var fromUser = true
  dynamic var actionType = ActionType.answer.rawValue
  dynamic var answerData: AnswerAction?
  dynamic var mapData: MapAction?
  dynamic var anchorData: AnchorAction?
}

Let’s study these properties of the message one by one.

  • `queryDate`: saves the date-time the query was made
  • `answerDate`: saves the date-time the query response was received
  • `message`: stores the query/message that was sent to the server
  • `fromUser`: a boolean which keeps track who created the message
  • `actionType`: stores the action type
  • `answerData`, `rssData`, `mapData`, `anchorData` are the data objects that actually store the respective action’s data

To initialize this object, we need to create a method that takes as input the data received from the server.

// saves query and answer date
if let queryDate = data[Client.ChatKeys.QueryDate] as? String,
let answerDate = data[Client.ChatKeys.AnswerDate] as? String {
  message.queryDate = dateFormatter.date(from: queryDate)! as NSDate
  message.answerDate = dateFormatter.date(from: answerDate)! as NSDate
}

if let type = action[Client.ChatKeys.ResponseType] as? String,
  let data = answers[0][Client.ChatKeys.Data] as? [[String : AnyObject]] {
  if type == ActionType.answer.rawValue {
     message.message = action[Client.ChatKeys.Expression] as! String
     message.actionType = ActionType.answer.rawValue
    message.answerData = AnswerAction(action: action)
  } else if type == ActionType.map.rawValue {
    message.actionType = ActionType.map.rawValue
    message.mapData = MapAction(action: action)
  } else if type == ActionType.anchor.rawValue {
    message.actionType = ActionType.anchor.rawValue
    message.anchorData = AnchorAction(action: action)
    message.message = message.anchorData!.text
  }
}

Since the response from the server for a particular query might contain numerous action types, we loop inside a method to capture all those action types and save each one of them. Because there are multiple action types, we need a list containing all the messages created for them. For each action in the loop, the corresponding data is saved into its specific object.

Let’s discuss the individual action objects now.

1)   AnswerAction
class AnswerAction: Object {
  dynamic var expression: String = ""
  convenience init(action: [String : AnyObject]) {
    self.init()
    if let expression = action[Client.ChatKeys.Expression] as? String {
      self.expression = expression
    }
  }
}

This is the simplest action type implementation. It contains a single property, `expression`, which is a string. To initialize it, we take the action object, extract the value for the expression key and save it.

if type == ActionType.answer.rawValue {
  message.message = action[Client.ChatKeys.Expression] as! String
  message.actionType = ActionType.answer.rawValue
  // pass action object and save data in `answerData`
  message.answerData = AnswerAction(action: action)
}

Above is the way an answer action is checked and data saved inside the `answerData` variable.

2)   MapAction

class MapAction: Object {
  dynamic var latitude: Double = 0.0
  dynamic var longitude: Double = 0.0
  dynamic var zoom: Int = 13

  convenience init(action: [String : AnyObject]) {
    self.init()
    if let latitude = action[Client.ChatKeys.Latitude] as? String,
    let longitude = action[Client.ChatKeys.Longitude] as? String,
    let zoom = action[Client.ChatKeys.Zoom] as? String {
      self.longitude = Double(longitude)!
      self.latitude = Double(latitude)!
      self.zoom = Int(zoom)!
    }
  }
}

This action implementation contains three properties: `latitude`, `longitude` and `zoom`. Since the server sends the values as strings, each of them needs to be converted to its respective type using force-casting. Default values are provided for each property in case an illegal value comes from the server.

3)   AnchorAction

class AnchorAction: Object {
  dynamic var link: String = ""
  dynamic var text: String = ""

  convenience init(action: [String : AnyObject]) {
    self.init()
    if let link = action[Client.ChatKeys.Link] as? String,
    let text = action[Client.ChatKeys.Text] as? String {
      self.link = link
      self.text = text
    }
  }
}

Here, the link to the openstreetmap website is saved in order to retrieve the image for displaying.

Finally, we need to call the API, create the message object and use the `write` block of a Realm instance to save it into the DB.

if success {
  self.collectionView?.performBatchUpdates({
    for message in messages! {
    // realm write block
      try! self.realm.write {
        self.realm.add(message)
        self.messages.append(message)
        let indexPath = IndexPath(item: self.messages.count - 1, section: 0)
        self.collectionView?.insertItems(at: [indexPath])
      }
   }
}, completion: { (_) in
    self.scrollToLast()
  })
}

Each message is appended to the list of message items and inserted into the collection view. Below is the output of the Realm Browser, which is a UI for viewing the database.


Persistence Layer in Open Event Organizer Android App

Open Event Organizer is an event managing Android app whose core features are attendee check-in by QR code scan and data sync with the Open Event API Server. As an event can be large, the app has to deal with a large amount of data. Hence, to avoid repetitive network requests for fetching the data, the app maintains a local database containing all the required data, and this database is synced with the server. Android provides the android.database.sqlite package, which contains the API needed to use a database on Android. But it is really not a good practice to use SQLite queries everywhere in the app. That is where a persistence layer comes in: it works between the database and the business logic. Open Event Organizer uses Raizlabs's DbFlow, an ORM-based Android database library, for this. I will be talking about its implementation in the app in this blog.

First of all, you declare the base class of the database, which Android uses to create the database for the app. You declare all the base constants here. The class looks like this:

@Database(
   name = OrgaDatabase.NAME,
   version = OrgaDatabase.VERSION,
   ...
)
public class OrgaDatabase {
   public static final String NAME = "orga_database";
   public static final int VERSION = 2;
   ...
}

OrgaDatabase.java
app/src/main/java/org/fossasia/openevent/app/data/db/configuration/OrgaDatabase.java

Initialize the database in the Application class using the FlowManager provided by the library. Doing this in the Application class ensures that the library finds the generated DbFlow code.

FlowManager.init(
   new FlowConfig.Builder(context)
       .addDatabaseConfig(
           new DatabaseConfig.Builder(OrgaDatabase.class)
           ...
           .build()
       )
       .build());

OrgaApplication.java
app/src/main/java/org/fossasia/openevent/app/OrgaApplication.java

The database is created now. For table creation, DbFlow uses model classes, which must be annotated using the annotations provided by the library. The basic annotations are @Table, @PrimaryKey, @Column, @ForeignKey, etc.

For example, the Attendee class in the app looks like:

@Table(database = OrgaDatabase.class)
public class Attendee ... {

   @PrimaryKey
   public long id;

   @Column
   public boolean checkedIn;
   ...
   ...
   @ForeignKey(
       onDelete = ForeignKeyAction.CASCADE,
       onUpdate = ForeignKeyAction.CASCADE)
   public Order order;
   ...
}

Attendee.java
app/src/main/java/org/fossasia/openevent/app/data/models/Attendee.java

This will create a table named attendee with the annotated columns and relationships. Now comes the part of accessing data from the database. The Open Event app uses RxJava support for the DbFlow library, which enables asynchronous data access. The getItems method from DatabaseRepository looks like:

public <T> Observable<T> getItems(Class<T> typeClass, SQLOperator... conditions) {
   return RXSQLite.rx(SQLite.select()
       .from(typeClass)
       .where(conditions))
       .queryList()
       .flattenAsObservable(items -> items);
}

 

The method returns an observable emitting the items from the result. For data saving, the method looks like:

DatabaseDefinition database = FlowManager.getDatabase(OrgaDatabase.class);
FastStoreModelTransaction<T> transaction = FastStoreModelTransaction
   .insertBuilder(FlowManager.getModelAdapter(itemClass))
   .addAll(items)
   .build();
database.executeTransaction(transaction);

 

And for updating data, the method looks like:

ModelAdapter<T> modelAdapter = FlowManager.getModelAdapter(classType);
modelAdapter.update(item);

DatabaseRepository.java
app/src/main/java/org/fossasia/openevent/app/data/db/DatabaseRepository.java

DbFlow provides DirectModelNotifier, which is used to get notified of database changes anywhere in the app. The Open Event app uses PublishSubjects to send notifications on database change events. The implementation of the DatabaseChangeListener in the app looks like:

public class DatabaseChangeListener<T> ... {
   private PublishSubject<ModelChange<T>> publishSubject = PublishSubject.create();
   private DirectModelNotifier.ModelChangedListener<T> modelModelChangedListener;
   ...
   public void startListening() {
       modelModelChangedListener = new DirectModelNotifier.ModelChangedListener<T>() {
           @Override
           public void onTableChanged(@Nullable Class<?> aClass, @NonNull BaseModel.Action action) {
               // No action to be taken
           }
           @Override
           public void onModelChanged(@NonNull T model, @NonNull BaseModel.Action action) {
               publishSubject.onNext(new ModelChange<>(model, action));
           }
       };
       DirectModelNotifier.get().registerForModelChanges(classType, modelModelChangedListener);
   }
   ...
}

DatabaseChangeListener.java
app/src/main/java/org/fossasia/openevent/app/data/db/DatabaseChangeListener.java

The class is used in the app to get notified of data changes and to update the required local data fields using the data from the item emitted by the class's publishSubject. This is used wherever the same data is accessed in more than one place. For example, there are two fragments, AttendeesFragment and AttendeeCheckInFragment, from which an attendee's check-in status can be toggled. So when the status is toggled from AttendeeCheckInFragment, the change must be reflected in AttendeesFragment's attendee list. This is carried out using a DatabaseChangeListener in the AttendeesPresenter, which provides the attendee list to the AttendeesFragment. On a change in the attendee's check-in status, the AttendeesPresenter's attendeeListener listens for the change and updates the attendee in the list accordingly.

Links:
1. Raizlabs’s DbFlow , an ORM Android Database Library Github Repo Link
2. DbFlow documentation
3. Android database managing API android.database.sqlite


Selecting Best persistent storage for Phimpme Android and how to use it

As we progress with the Phimpme Android app, I added the account manager part, which deals with connecting other accounts to Phimpme and showing a list of connected accounts.

We need persistent storage to store all the details such as username, full name, profile image URL and access token (to access the API). I researched various Object Relational Mappers (ORMs) such as:

  1. DBFlow: https://github.com/Raizlabs/DBFlow
  2. GreenDAO: https://github.com/greenrobot/greenDAO
  3. SugarORM: http://satyan.github.io/sugar/
  4. Requery: https://github.com/requery/requery

as well as NoSQL databases such as the Realm database: https://github.com/realm/realm-java.

After reading several blog posts benchmarking these ORMs and databases, I came to know that the Realm database is better in terms of write speed and ease of use.

Steps to integrate Realm Database:

  • Installation of Realm database in android

Following these steps https://realm.io/docs/java/latest/#installation quickly sets up Realm in Android. Add

classpath "io.realm:realm-gradle-plugin:3.3.2"

in the project-level build.gradle file and add

apply plugin: 'realm-android' 

in the app-level build.gradle file. That's it for setting up Realm.

  • Generating required Realm models

First, decide what you need to store in your database. In the case of Phimpme, I went through the account section and noted down what needs to be there: profile image URL, username, full name and account indicator image name. The image below illustrates this better.

This is the Realm Model class I made in Kotlin to store name, username and access token for accessing API.

open class AccountDatabase(
       @PrimaryKey var name: String = "",
       var username: String = "",
       var token: String = ""
) : RealmObject()

  • Writing data in database

In the account manager, I created an add account option which opens a dialog with a list of accounts. Currently, Twitter is working: when the onSuccess function is invoked in AccountPickerFragment, I start a Twitter session and store the values in the database. Writing data to the database:

// Begin realm transaction
realm.beginTransaction();

// Creating Realm object for AccountDatabase Class
account = realm.createObject(AccountDatabase.class,
       accountsList[0]);

account.setUsername(session.getUserName());
account.setToken(String.valueOf(session.getAuthToken()));
realm.commitTransaction();

The begin and commit block is necessary. Another way of doing this is to use Realm's executeTransaction function.

  • Use Separate Database Helper class for Database operations

It's good to use a separate class for all the database operations needed in the project. I created a DatabaseHelper class and added a function to query for the results we need. Querying the database:

public RealmResults<AccountDatabase> fetchAccountDetails(){
   return realm.where(AccountDatabase.class).findAll();
}

It gives all of the results stored in the database, as shown below.

  • Problems I faced with annotation processor while using Kotlin and Realm together

The Kotlin annotation processor was not running due to the wrong plugin order. This issue https://github.com/realm/realm-java/pull/2568 helped me in solving that: I added apply plugin: 'kotlin-kapt' in the app gradle file and moved apply plugin: 'realm-android' below it in the order.
