Serializing Java objects for REST API Requests in Open Event Organizer App

Open Event Organizer App is a client-side application which uses a REST API for network requests. The server supports sending and receiving data only in the JSONAPI spec, so we needed to serialize Java models into JSON objects and deserialize JSON data into Java models following the JSONAPI spec. To achieve this, we followed the steps below.

Specifications

We will be using jasminb/jsonapi-converter, which handles request/response parsing of models following the JSONAPI spec, together with the Retrofit Jackson converter plugin for serializing JSON to Java models and vice versa.

Let’s create a Java model. We are using some annotations provided by the Lombok library to avoid writing boilerplate code. The @JsonNaming annotation is used to apply the KebabCaseStrategy while serializing fields.

@Data
@Type("order")
@AllArgsConstructor
@JsonNaming(PropertyNamingStrategy.KebabCaseStrategy.class)
@Table(database = OrgaDatabase.class, allFields = true)
public class Order {

@PrimaryKey
@Id(LongIdHandler.class)
public Long id;

public float amount;
public String completedAt;
public String identifier;
public String paidVia;
public String paymentMode;
public String status;

@Relationship("event")
@ForeignKey(stubbedRelationship = true, onDelete = ForeignKeyAction.CASCADE)
public Event event;

public Order() { }
}

In the NetworkModule class, there is a method providesMappedClasses() containing a list of classes that need to be serialized/deserialized. We need to add the above model to the list. This list is then provided to a singleton instance of JSONAPIConverterFactory through Dagger. JSONAPIConverterFactory uses the Retrofit ObjectMapper and maps the classes that are handled by this instance.

@Provides
Class[] providesMappedClasses() {
return new Class[]{Event.class, Attendee.class, Ticket.class, Order.class};
}
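
For context, here is a hedged sketch of how the mapped classes and the shared ObjectMapper could be wired into the converter factory via Dagger; the provider name and exact signature are assumptions rather than the app's actual code:

@Provides
@Singleton
JSONAPIConverterFactory providesJSONAPIConverterFactory(ObjectMapper objectMapper, Class[] mappedClasses) {
    // jasminb's factory maps only the listed classes, using the shared Jackson ObjectMapper
    return new JSONAPIConverterFactory(objectMapper, mappedClasses);
}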

Further, various serialization properties can be set while building the singleton ObjectMapper instance. Adding a property here ensures that it is applied to all the classes mapped by JSONAPIConverterFactory. For example, we disable the serialization feature that throws an exception and fails whenever empty beans are encountered.

@Provides
@Singleton
ObjectMapper providesObjectMapper() {
return new ObjectMapper()
.disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES)
.disable(SerializationFeature.FAIL_ON_EMPTY_BEANS)
// Handle constant breaking changes in API by not including null fields
// TODO: Remove when API stabilizes and/or need to include null values is there
.setSerializationInclusion(JsonInclude.Include.NON_ABSENT);
}

Resources

  1. Github Repository for jsonapi-converter https://github.com/jasminb/jsonapi-converter
  2. Github repository for Jackson Retrofit Plugin https://github.com/square/retrofit/tree/master/retrofit-converters/jackson
  3. Official Website for Project Lombok https://projectlombok.org/

Github Repository for Open-Event-Orga-App https://github.com/fossasia/open-event-orga-app


Adding device names’ support for check-ins to Open Event Server

The Open Event Server provides backend support to Open Event Organizer Android App which is used to check-in attendees in an event. When checking in attendees, it is important for any event organizer to keep track of the device that was used to check someone in. For this, we provide an option in the Organizer App settings to set the device name. But this device name should have support in the server as well.

The problem is to be able to add device name data corresponding to each check-in time. Currently, the attendees model has an attribute called `checkin-times`, which is a CSV of time strings. For each value in the CSV, there has to be a corresponding device name value. This can be achieved by providing a similar CSV key-value pair for `device-name-checkin`.
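
For example, with hypothetical values, the two attributes line up item by item, with `-` standing in for a missing device name:

"checkin-times": "2018-08-19T10:00:00,2018-08-19T14:30:00",
"device-name-checkin": "Organizer Phone,-"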

The constraints that we need to check for while handling device names are as follows:

  • If there’s `device_name_checkin` in the request, there must be `is_checked_in` and `checkin_times` in the data as well.
  • Number of items in checkin_times csv in data should be equal to the length of the device_name_checkin csv.
  • If there’s checkin_times in data, and device-name-checkin is absent, it must be set to `-` indicating no set device name.
if 'device_name_checkin' in data and data['device_name_checkin'] is not None:
    if 'is_checked_in' not in data or not data['is_checked_in']:
        raise UnprocessableEntity(
            {'pointer': '/data/attributes/device_name_checkin'},
            "Attendee needs to be checked in first"
        )
    elif 'checkin_times' not in data or data['checkin_times'] is None:
        raise UnprocessableEntity(
            {'pointer': '/data/attributes/device_name_checkin'},
            "Check in Times missing"
        )
    elif len(data['checkin_times'].split(",")) != len(data['device_name_checkin'].split(",")):
        raise UnprocessableEntity(
            {'pointer': '/data/attributes/device_name_checkin'},
            "Check in Times missing for the corresponding device name"
        )

if 'checkin_times' in data:
    if 'device_name_checkin' not in data or data['device_name_checkin'] is None:
        data['device_name_checkin'] = '-'

The case is a little different for a PATCH request since we need to check for the number of items differently like this:

if 'device_name_checkin' in data and data['device_name_checkin'] is not None:
    if obj.device_name_checkin is not None:
        data['device_name_checkin'] = '{},{}'.format(obj.device_name_checkin,
                                                     data['device_name_checkin'])
    if len(data['checkin_times'].split(",")) != len(data['device_name_checkin'].split(",")):
        raise UnprocessableEntity(
            {'pointer': '/data/attributes/device_name_checkin'},
            "Check in Time missing for the corresponding device name")

Since we expect only the latest value to be present in a PATCH request, we first add it to the object by formatting using:

'{},{}'.format(obj.device_name_checkin, data['device_name_checkin'])

and then compare the lengths of the obtained CSVs for check-in times and device names, so that corresponding to each check-in time we have either a device name or the default fill-in value '-'.
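
As a quick worked example with made-up values, the merge and the length check behave like this:

# Hypothetical values: one existing check-in plus one new one from the PATCH request
existing_device_names = "Organizer Phone"          # obj.device_name_checkin
new_device_name = "Front Desk Tablet"              # data['device_name_checkin']
merged = '{},{}'.format(existing_device_names, new_device_name)
# merged == "Organizer Phone,Front Desk Tablet"; it must have as many items as checkin_times
checkin_times = "2018-08-19T10:00:00,2018-08-19T14:30:00"
assert len(merged.split(",")) == len(checkin_times.split(","))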

That’s all. Read the full code here.


Resources

  1. SQLAlchemy Docs
    https://docs.sqlalchemy.org/en/latest/
  2. Alembic Docs
    http://alembic.zzzcomputing.com/en/latest/
  3. Flask REST JSON API Classical CRUD operation
    https://flask-rest-jsonapi.readthedocs.io/en/latest/quickstart.html#classical-crud-operations

Fetch Five Star Skill Rating from getSkillList API in SUSI.AI Android

SUSI.AI used to have a thumbs up/down rating system, which has now been replaced by a five star skill rating system. The user can now rate a skill on a five star scale. The UI components include a rating bar, and below the rating bar is a section that displays the skill rating statistics – the total number of ratings, the average rating and a graph showing the percentage of users who rated the skill with five stars, four stars and so on.

SUSI.AI Skills are rules that are defined in SUSI Skill Data repo which are basically the processed responses that SUSI returns to the user queries. When a user queries something from the SUSI Android app, a query to SUSI Server is made which in turn fetches data from SUSI Skill Data and returns a JSON response to the app. Similarly, to get skill ratings, a call to the ‘/cms/getSkillList.json’ API is made. In this API, the server checks the SUSI Skill Data repo for the skills and returns a JSON response consisting of all the required information like skill name, author name, description, ratings, etc. to the app. Then, this JSON response is parsed to extract individual fields to display the appropriate information in the skill details screen of the app.

API Information

The endpoint to fetch skills is ‘/cms/getSkillList.json’
The endpoint takes three parameters as input –

  • model – It tells the model to which the skill belongs. The default value is set to general.
  • group – It tells the group(category) to which the skill belongs. The default value is set to All.
  • language – It tells the language to which the skill belongs. The default value is set to en.

Since all skills have to be fetched, this API is called for every group individually. For instance, call “https://api.susi.ai/cms/getSkillList.json?group=Knowledge” to get all skills in group “Knowledge”. Similarly, call for other groups.

Here is a sample response for a skill named 'Capital' from the group Knowledge:

"capital": {
      "model": "general",
      "group": "Knowledge",
      "language": "en",
      "developer_privacy_policy": null,
      "descriptions": "A skill to tell user about capital of any country.",
      "image": "images/capital.png",
      "author": "chashmeet singh",
      "author_url": "https://github.com/chashmeetsingh",
      "skill_name": "Capital",
      "terms_of_use": null,
      "dynamic_content": true,
      "examples": ["What is the capital of India?"],
      "skill_rating": {
        "negative": "0",
        "positive": "4",
        "feedback_count" : 0,
        "stars": {
          "one_star": 0,
          "four_star": 1,
          "five_star": 0,
          "total_star": 1,
          "three_star": 0,
          "avg_star": 4,
          "two_star": 0
        }
      },
      "creationTime": "2018-03-17T17:11:59Z",
      "lastAccessTime": "2018-06-06T00:46:22Z",
      "lastModifiedTime": "2018-03-17T17:11:59Z"
    },


It consists of all details about the skill called ‘Capital’:

  1. Model (model)
  2. Group (group)
  3. Language (language)
  4. Developer Privacy Policy (developer_privacy_policy)
  5. Description (descriptions)
  6. Image (image)
  7. Author (author)
  8. Author URL (author_url)
  9. Skill name (skill_name)
  10. Terms of Use (terms_of_use)
  11. Content Type (dynamic_content)
  12. Examples (examples)
  13. Skill Rating (skill_rating)
  14. Creation Time (creationTime)
  15. Last Access Time (lastAccessTime)
  16. Last Modified Time (lastModifiedTime)

Among all this information, what is of interest for this blog is the skill rating. This blog mainly deals with showing how to parse the JSON response to get the skill rating star values, so as to display the actual data in the skill rating graph.

A request to the getSkillList API is made for each group using the GET method.

@GET("/cms/getSkillList.json")
Call<ListSkillsResponse> fetchListSkills(@Query("group") String groups);
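
A hedged sketch of how this call might be used for a single group follows; the SusiService interface instance and the callback wiring are assumptions, since the app's actual networking setup is not shown in this post:

// Sketch only: susiService is assumed to be a Retrofit implementation of the interface above
fun fetchSkillsForGroup(susiService: SusiService) {
    susiService.fetchListSkills("Knowledge").enqueue(object : retrofit2.Callback<ListSkillsResponse> {
        override fun onResponse(call: retrofit2.Call<ListSkillsResponse>,
                                response: retrofit2.Response<ListSkillsResponse>) {
            // skillMap maps each skill name to its SkillData (classes described below)
            response.body()?.skillMap?.forEach { (name, skillData) ->
                android.util.Log.d("SkillList", "$name -> avg rating ${skillData.skillRating?.stars?.averageStar}")
            }
        }

        override fun onFailure(call: retrofit2.Call<ListSkillsResponse>, t: Throwable) {
            android.util.Log.e("SkillList", "Failed to fetch skills", t)
        }
    })
}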

It returns a JSON response consisting of all the aforementioned information. Now, to parse the JSON response, do the following:

  1. Add a model class for the response received as a result of the API call. ListSkillsResponse contains two objects – group and skills.
    This blog is about getting the skill rating, so let us proceed with parsing the required response. The skills object contains the skill data that we need. Hence, a SkillData class is created next.

    class ListSkillsResponse {
       val group: String = "Knowledge"
       val skillMap: Map<String, SkillData> = HashMap()
    }
  2. Now, add the SkillData class. This class defines the response that we saw for ‘Capital’ skill above. It contains skill name, author, skill rating and so on.

    class SkillData : Serializable {
       var image: String = ""
       @SerializedName("author_url")
       @Expose
       var authorUrl: String = ""
       var examples: List<String> = ArrayList()
       @SerializedName("developer_privacy_policy")
       @Expose
       var developerPrivacyPolicy: String = ""
       var author: String = ""
       @SerializedName("skill_name")
       @Expose
       var skillName: String = ""
       @SerializedName("dynamic_content")
       @Expose
       var dynamicContent: Boolean? = null
       @SerializedName("terms_of_use")
       @Expose
       var termsOfUse: String = ""
       var descriptions: String = ""
       @SerializedName("skill_rating")
       @Expose
       var skillRating: SkillRating? = null
    }
    
  3. Now, add the SkillRating class. Since what is required is the skill rating, we narrow down to the skill_rating object, which contains the actual rating for each skill, i.e. the stars values. So, this file defines the response for the skill_rating object.

    class SkillRating : Serializable {
       var stars: Stars? = null
    }
    
  4. Further, add a Stars class. Ultimately, the values that are needed are the number of users who rated a skill at five stars, four stars and so on and also the total number of users and the average rating. Thus, this file contains the values inside the ‘stars’ object.

    class Stars : Serializable {
       @SerializedName("one_star")
       @Expose
       var oneStar: String? = null
       @SerializedName("two_star")
       @Expose
       var twoStar: String? = null
       @SerializedName("three_star")
       @Expose
       var threeStar: String? = null
       @SerializedName("four_star")
       @Expose
       var fourStar: String? = null
       @SerializedName("five_star")
       @Expose
       var fiveStar: String? = null
       @SerializedName("total_star")
       @Expose
       var totalStar: String? = null
       @SerializedName("avg_star")
       @Expose
       var averageStar: String? = null
    }
    

Now, the parsing is all done. It is time to use these values to plot the skill rating graph and complete the section displaying the five star skill rating.

To plot these values on the skill rating graph refer to the blog on plotting horizontal bar graph using MPAndroid Chart library. In step 5 of the linked blog, replace the second parameter to the BarEntry constructor by the actual values obtained by parsing.

Here is how we do it.

  • To get the total number of ratings
val totalNumberOfRatings: Int? = skillData.skillRating?.stars?.totalStar?.toInt()

  • To get the average rating
val averageRating: Float? = skillData.skillRating?.stars?.averageStar?.toFloat()

  • To get number of users who rated the skill at five stars
val fiveStarUsers: Int? = skillData.skillRating?.stars?.fiveStar?.toInt()

Similarly, get the number of users for fourStar, threeStar, twoStar and oneStar.

Note: If totalNumberOfRatings equals zero, then the skill is unrated. In this case, display a message informing the user that the skill is unrated instead of plotting the graph.

Now, as the graph shows the percentage of users who rated the skill at a particular number of stars, calculate the percentage of users corresponding to each rating, parse the result to Float and place it as the second parameter to the BarEntry constructor  as follows :

entries.add(BarEntry(4f, (fiveStarUsers!!.toFloat() / totalNumberOfRatings!!) * 100f))

Similarly, replace the values for all five entries. Finally, add the total ratings and average rating section and display the detailed skill rating statistics for each skill, as in the following figure.

Resources


Upgrading Open Event to Use Sendgrid API v3

Sendgrid recently upgraded their web API to send emails, and support for previous versions was deprecated. As a result, Open Event Server’s mail sending tasks were rendered unsuccessful, because the requests they were sending to Sendgrid were not being processed. On top of that, it was also found out later that the existing Sendgrid API key on the development server was expired. This had to be fixed at the earliest because emails are a core part of Open Event functionality.

The existing way for emails to be sent via Sendgrid used to hit the endpoint “https://api.sendgrid.com/api/mail.send.json” to send emails. Also, the payload structure was as follows:

payload = {
    'to': to,
    'from': email_from,
    'subject': subject,
    'html': html
}

Also, a header "Authorization": "Bearer " accompanied the above payload. However, Sendgrid changed the payload structure to be of the following format:

{
  "personalizations": [
    {
      "to": [
        { "email": "example@example.com" }
      ]
    }
  ],
  "from": {
    "email": "example@example.com"
  },
  "subject": "Hello, World!",
  "content": [
    {
      "type": "text/plain",
      "value": "Heya!"
    }
  ]
}

Furthermore, the endpoint was changed to "https://api.sendgrid.com/v3/mail/send". To incorporate all these changes with the minimum number of modified lines in the codebase, the structure change itself had to happen at a fairly low level. This was because there are lots of features in the server that perform a wide variety of email actions, so it was clear that changing all of them would not be the most efficient thing to do. The perfect place to implement the API changes was the function send_email() in mail.py, because all other higher-level email functions are built on top of this function. But this was not the only change, because this function itself used another function, send_email_task() in tasks.py, specifically for sending email via Sendgrid. So, in conclusion, the header modifications were made in send_email(), and the payload structure as well as endpoint modifications were made within send_email_task(). This brought the server codebase back on track to send emails successfully. Finally, the key for the development server was also renewed and added to its settings in the Heroku Postgres database.
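
For reference, a minimal sketch of the request produced after these changes is shown below; the helper function and the key argument are illustrative only, since the real code is split between send_email() and the Celery task send_email_task():

import json
import requests

SENDGRID_V3_URL = "https://api.sendgrid.com/v3/mail/send"

def send_via_sendgrid(key, email_from, to, subject, html):
    # Header built in send_email(); payload shape required by the v3 API
    headers = {
        "Authorization": "Bearer " + key,
        "Content-Type": "application/json"
    }
    payload = {
        "personalizations": [{"to": [{"email": to}]}],
        "from": {"email": email_from},
        "subject": subject,
        "content": [{"type": "text/html", "value": html}]
    }
    requests.post(SENDGRID_V3_URL, data=json.dumps(payload), headers=headers)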


Resources


Implementing Checkout Times for Attendees on Open Event Server

As of this writing, Open Event Server did not have the functionality to add, manipulate and delete checkout times of attendees. Event organizers should have access to log and update attendee checkout times. So it was decided to implement this functionality in the server. This boiled down to having an additional attribute checkout_times in the ticket holder model of the server.

So the first step was to add a string column named checkout_times in the ticket holder database model, since this was going to be a place for comma-separated values (CSV) of attendee checkout times. An additional boolean attribute named is_checked_out was also added to convey whether an attendee has checked out or not. After the addition of these attributes in the model, we saved the file and performed the required database migration:
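
A standalone sketch of the two added attributes, using plain SQLAlchemy for illustration (the actual ticket holder model lives in the project's Flask-SQLAlchemy setup and has many more columns):

from sqlalchemy import Boolean, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class TicketHolder(Base):
    __tablename__ = 'ticket_holders'
    id = Column(Integer, primary_key=True)
    # Attributes added by this change
    checkout_times = Column(String)    # CSV of attendee checkout timestamps
    is_checked_out = Column(Boolean)   # whether the attendee has checked out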

To create the migration file for the above changes:

$ python manage.py db migrate

To upgrade the database instance:

$ python manage.py db upgrade

Once the migration was done, the API schema file was modified accordingly:

class AttendeeSchemaPublic(SoftDeletionSchema):
    """
    Api schema for Ticket Holder Model
    """
    
    checkout_times = fields.Str(allow_none=True)  # ←
    is_checked_out = fields.Boolean()  # ←
    

After the schema change, the attendees API file had to have code to incorporate these new fields. The way it works is that when we receive an update request on the server, we add the current time in the checkout times CSV to indicate a checkout time, so the checkout times field is essentially read-only:

from datetime import datetime
...
class AttendeeDetail(ResourceDetail):
    def before_update_object(self, obj, data, kwargs):

        if 'is_checked_out' in data and data['is_checked_out']:
            ...
        else:
            # Append the current UTC time to the CSV of checkout times
            if obj.checkout_times and data['checkout_times'] not in \
                    obj.checkout_times.split(","):
                data['checkout_times'] = '{},{},{}'.format(
                    obj.checkout_times,
                    data['checkout_times'],
                    datetime.utcnow())

This completes the implementation of checkout times, so now organizers can process attendee checkouts on the server with ease.

Resources


Ember Controller for Badge Generation In Badgeyay

Badgeyay is an open source project developed by the FOSSASIA Community. It aims to provide a platform for badge generation with several customization options. The current structure of the project has two parts to maintain modularity: a backend, developed in Flask, and a frontend, developed in Ember.

After refactoring the frontend and backend API, we need to create a controller for badge generation in the frontend. The controller helps components send and receive data and prepares the logic for sending requests to the API, so that badges can be generated and the result received as a response from the server. In particular, we need to create the controller for the badge generation route, create-badges.

As there are many customization options presented to the user, we need to chain the requests so that they stay in sync with each other and the badge generation logic does not break.

Procedure

  1. Creating the controller from the ember-cli
ember g controller create-badge

  2. After generating the controller, we need to create actions that can be passed to components. Let's build an action to submit the form and then chain the different actions together for badge generation.
submitForm() {
  const _this = this;
  const user = _this.get('store').peekAll('user');
  let uid;
  user.forEach(user_ => {
    uid = user_.get('id');
  });
  if (uid !== undefined && uid !== '') {
    _this.set('uid', uid);
  }

  let badgeData = {
    uid        : _this.uid,
    badge_size : 'A3'
  };

  if (_this.csvEnable) {
    badgeData.csv = _this.csvFile;
  }
  if (_this.defFontColor !== '' && _this.defFontColor !== undefined) {
    badgeData.font_color = '#' + _this.defFontColor;
  }
  if (_this.defFontSize !== '' && _this.defFontSize !== undefined) {
    badgeData.font_size = _this.defFontSize.toString();
  }
  if (_this.defFont !== '' && _this.defFont !== undefined) {
    badgeData.font_type = _this.defFont;
  }

  _this.send('sendManualData', badgeData);
},

  3. As we can see in the above code snippet, _this.send(action_name, arguments) calls another action, sendManualData. This action sends a network request to the backend if manual data is selected as the input source; otherwise it goes with the CSV upload. If no option is chosen, an error is shown on the user's screen, asking them to select one input source.
sendManualData(badgeData) {
    const _this = this;
    if (_this.manualEnable) {
      let textEntry = _this.get('store').createRecord('text-data', {
        uid         : _this.uid,
        manual_data : _this.get('textData'),
        time        : new Date()
      });
      textEntry.save().then(record => {
        _this.set('csvFile', record.filename);
        badgeData.csv = _this.csvFile;
        _this.send('sendDefaultImg', badgeData);
        _this.get('notify').success('Text saved Successfully');
      }).catch(err => {
        let userErrors = textEntry.get('errors.user');
        if (userErrors !== undefined) {
          _this.set('userError', userErrors);
        }
      });
    } else if (_this.csvEnable) {
      if (_this.csvFile !== undefined && _this.csvFile !== '') {
        badgeData.csv = _this.csvFile;
        _this.send('sendDefaultImg', badgeData);
      }
    } else {
      // No Input Source specified Error
    }
  },

The above code chooses the manual data if the manual data boolean flag is set, then makes a network request and waits for the promise to resolve. As soon as the promise resolves, it calls another action for the default image.

  4. After selecting the input source, the background for the badge has to be selected. The action looks at the boolean flags defaultImage, backgroundColorImage and customImage and makes the network request accordingly.
sendDefaultImg(badgeData) {
    const _this = this;
    if (_this.defImage) {
      let imageRecord = _this.get('store').createRecord('def-image-upload', {
        uid          : _this.uid,
        defaultImage : _this.defImageName
      });
      imageRecord.save()
        .then(record => {
          _this.set('custImgFile', record.filename);
          badgeData.image = _this.custImgFile;
          _this.send('sendBadge', badgeData);
        })
        .catch(error => {
          let userErrors = imageRecord.get('errors.user');
          if (userErrors !== undefined) {
            _this.set('userError', userErrors);
          }
        });
    } else if (_this.custImage) {
      if (_this.custImgFile !== undefined && _this.custImgFile !== '') {
        badgeData.image = _this.custImgFile;
        _this.send('sendBadge', badgeData);
      }
    } else if (_this.colorImage && _this.defColor !== undefined && _this.defColor !== '') {
      console.log(_this.defColor);
      let imageRecord = _this.get('store').createRecord('bg-color', {
        uid      : _this.uid,
        bg_color : _this.defColor
      });
      imageRecord.save()
        .then(record => {
          badgeData.image = record.filename;
          _this.send('sendBadge', badgeData);
        })
        .catch(error => {
          let userErrors = imageRecord.get('errors.user');
          if (userErrors !== undefined) {
            _this.set('userError', userErrors);
          }
        });
    } else {
      // Inflate error for No Image source.
    }
  },

After the promise resolves, the final action is called to send the badge data payload to the backend API for badge generation.

  5. After the complete preparation of the payload, it's time to send it to the backend API for badge generation and, once the promise resolves, show the corresponding downloadable link in the frontend.
sendBadge(badgeData) {
    const _this = this;
    let badgeRecord = _this.get('store').createRecord('badge', badgeData);
    badgeRecord.save()
      .then(record => {
        _this.set('badgeGenerated', true);
        _this.set('genBadge', record.id);
        _this.get('notify').success('Badge generated Successfully');
      })
      .catch(err => {
        console.error(err.message);
      });
  },

Now, after the promise resolves, the local variable badgeGenerated is set to true so that the success message can be shown in the frontend for successful badge generation, along with the link.

Link to respective PR – Link

Topics Involved

  • Chaining of actions and requests
  • Manipulating DOM on the conditional statements
  • Component bindings
  • Ember data
  • Promise resolution

Resources

  • Link to ember data for the API requests and promise resolvement – Link
  • Implementing Controllers in Ember – Link
  • Chaining actions together in ember – Link

Integrating Firebase Cloud Functions In Badgeyay

Badgeyay is an open source project developed by the FOSSASIA Community for generating badges for conferences and events. The project is divided into two parts: the frontend, which is in Ember, and the backend, which is in Flask. The backend uses the Firebase Admin SDK (Python) and the frontend uses the Firebase JavaScript client with the EmberFire wrapper for Ember. Whenever a user signs up on the website, the database listener attached to the model gets triggered and uses Flask-Mail to send a welcome mail to the user and, in case of email and password signup, a verification mail as well.

The problem is that sending mail using such libraries is a synchronous process and takes a lot of processing on the server. We could use messaging queues like RabbitMQ and Redis, but that would be a burden as server cost would increase. The workaround is to remove the code from the server and create a Firebase cloud function for the same task.

Firebase cloud functions let you run backend code on the cloud; they can be triggered by HTTP events or can listen for events on the cloud, like user registration.

Procedure

  1. Firebase uses our Gmail ID for login, so make sure to have a Gmail ID. On opening the Firebase console we are greeted with our created or imported Firebase apps.

  2. Create the app by clicking on the Add Project icon, write the name of the application (e.g. Test Application) and select the region, in my case India. Firebase will automatically generate an application ID for the app. Click on Create Project to complete the creation of the project.

  3. After completion, click on the project to enter it. You will be greeted with an overview prompting you to integrate Firebase with your project. Click on Add Firebase to web App and save the config as JSON in a file named clientKey.json for later use.

  4. Now we need to install the firebase tools on our local machine so for that execute
    npm i -g firebase-tools
    1. Now login from the CLI so that firebase gets token for the Gmail ID of the user and can access the firebase account of that Gmail ID.
    firebase login
    1. After giving permissions to the firebase CLI from your Gmail account in the new tab opened in browser, create a folder named cloud_functions in the project directory and in that execute
    firebase init
    1. Select only functions from the list of options by pressing space.

    2. After this select the project from the list where you want to use the cloud function. You can skip the step if you later want to add the cloud function to project by selecting don’t setup a default project and can later be used by command
      firebase use --add

    3. Choose the language of choice

    4. If you want, you can enforce eslint on the project and after this the cloud function is set up and the directory structure looks as follows.

    5. We will write our cloud function in index.js. So let’s take a look at index.js
      const functions = require('firebase-functions');

      // // Create and Deploy Your First Cloud Functions
      // // https://firebase.google.com/docs/functions/write-firebase-functions
      //
      // exports.helloWorld = functions.https.onRequest((request, response) => {
      //  response.send("Hello from Firebase!");
      // });

      As we can see, there is a sample function already given. We don't need that sample function, so we will remove it and write the logic for sending mail. Before that we need to acquire the service account key so that admin functionality can be accessed in the cloud function. For that, go to Project Settings, then Service Accounts, click on Generate New Private Key and save it as serviceKey.json.

    6. Now the directory structure will look like this after adding the clientKey.json and serviceKey.json

    7. We will use Nodemailer for sending mails in cloud functions. As there is a limitation on a Gmail account of sending only 500 mails a day, we can use third-party services like SendGrid and others for sending mails with Firebase. Configure Nodemailer for sending mails as
      const nodemailer = require('nodemailer');

      const gmailEmail = functions.config().gmail.email;
      const gmailPassword = functions.config().gmail.password;
      const mailTransport = nodemailer.createTransport({
        service: 'gmail',
        auth: {
          user: gmailEmail,
          pass: gmailPassword
        }
      });

      Also set the environment variables for the cloud functions like email and password:

      firebase functions:config:set gmail.email="Email ID" gmail.password="Password"
      1. Logic for sending Greeting Mail on user registration
      exports.greetingMail = functions.auth.user().onCreate((user) => {
        const email = user.email;
        const displayName = user.displayName;

        return sendGreetingMail(email, displayName);
      });

      function sendGreetingMail(email, displayName) {
        const mailOptions = {
          from: `${APP_NAME}<noreply@firebase.com>`,
          to: email,
        };

        mailOptions.subject = `Welcome to Badgeyay`;
        mailOptions.text = `Hey ${displayName || ''}! Welcome to Badgeyay. We welcome you onboard and pleased to offer you service.`;
        return mailTransport.sendMail(mailOptions).then(() => {
          return console.log('Welcome mail sent to: ', email);
        }).catch((err) => {
          console.error(err.message);
        });
      }

      The function gets triggered on creation of a user in Firebase and calls the greeting mail function with the email ID of the registered user and the display name as parameters. A default template is then used to send the mail to the recipient, and success is logged on submission.

      1. Currently the Firebase Admin SDK doesn't support sending a verification mail, but the client SDK does. So the approach followed in Badgeyay is that the Admin SDK creates a custom token, and the client SDK uses that custom token to sign in and then sends the verification mail to the user.
      exports.sendVerificationMail = functions.auth.user().onCreate((user) => {
        const uid = user.uid;
        if (user.emailVerified) {
          console.log('User has email already verified: ', user.email);
          return 0;
        } else {
          return admin.auth().createCustomToken(uid)
            .then((customToken) => {
              return firebase.auth().signInWithCustomToken(customToken)
            })
            .then((curUser) => {
              return firebase.auth().onAuthStateChanged((user_) => {
                if (!user.emailVerified) {
                  user_.sendEmailVerification();
                  return console.log('Verification mail sent: ', user_.email);
                } else {
                  return console.log('Email is already verified: ', user_.email);
                }
              })
            })
            .catch((err) => {
              console.error(err.message);
            })
        }
      });
      1. Now we need to deploy the functions to firebase.
      firebase deploy --only functions

      Link to the respective PR  : Link

      Topics Involved

      • Firebase Admin SDK
      • Configuring Gmail for third party apps
      • Token Verification and verification mail by client SDK
      • Nodemailer and Express.js

      Resources

      • Firebase Cloud functions – Link
      • Extending authentication with cloud function – Link
      • Custom Token Verification – Link
      • Nodemailer message configuration – Link
      • Issue discussion on sending verification mail with admin SDK – Link

Extending the News Feature to Show results from multiple organisations

The News tab in Susper was earlier implemented to show results only from a single organisation, using the site: modifier for the query facet. In this blog I will discuss how I have modified the current News tab to show results from various news organisations like BBC, Al Jazeera, The Guardian etc.

Implementation:

Step 1:

Creating a JSON file to store organisations detail:

We need to decide which organisations we will fetch results from using the YaCy server and then display them in Susper. We have provided a JSON file where the user can easily add or delete news sources. The results will be limited to the organisations present in the JSON file.
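
A hedged example of what newsFile.json could contain; only the newsOrgs array and the provider field are implied by the code that consumes the file later in this post, so the remaining field names and domains are illustrative:

{
  "newsOrgs": [
    { "name": "BBC", "provider": "bbc.com" },
    { "name": "Al Jazeera", "provider": "aljazeera.com" },
    { "name": "The Guardian", "provider": "theguardian.com" }
  ]
}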

Step 2:

Creating a service to fetch details from JSON file:

Now, after creating the JSON file, we need a service which will fetch results from the JSON file according to our need. This service will be a simple Angular service having a class GetJsonService; it accesses newsFile.json and maps the results into JSON format. A sketch of such a service is shown below.
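
The following is an assumed shape rather than the project's actual file; the JSON file path and the use of the Http module are guesses:

import { Injectable } from '@angular/core';
import { Http } from '@angular/http';
import 'rxjs/add/operator/map';

@Injectable()
export class GetJsonService {
  constructor(private http: Http) { }

  // Read the list of news organisations from the local JSON file
  getJSON() {
    return this.http.get('assets/newsFile.json')
      .map(res => res.json());
  }
}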

Step 3:

Creating a service to fetch news accordingly:

Now, after fetching the JSON result, we need a service to fetch the news results from the YaCy server. I have created a separate service for this task, which fetches results from each organisation using the site: modifier and returns them. The code for news.service.ts is below.

export class NewsService {
  constructor(private jsonp: Jsonp) { }

  getSearchResults(searchquery, org) {
    let searchURL = 'https://yacy.searchlab.eu/solr/select?query=';
    searchURL += searchquery.query + ' site:' + org;
    let params = new URLSearchParams();
    for (let key in searchquery) {
      if (searchquery.hasOwnProperty(key)) {
        params.set(key, searchquery[key]);
      }
    }
    // Set other parameters
    return this.jsonp
      .get(searchURL, { search: params })
      .map(res => res.json()[0])
      .catch(this.handleError);
  }
}

Step 4:

Updating the results section:

Now we have a service that gives results from a single organisation and a JSON list of organisations. In results.component.ts we can simply subscribe to getJsonService and, in a loop, call getNewsService, changing the organisation in every iteration. We then check whether we are getting valid results or not (undefined); results which are not valid can cause errors when we try to read any field of an undefined variable. We then simply append the two result items from each organisation to an empty array and later use this array to show the results.

this.getJsonService.getJSON().subscribe(res => {
  this.newsResponse = [];
  for (let i = 0; i < res.newsOrgs.length; i++) {
    this.getNewsService.getSearchResults(querydata, res.newsOrgs[i].provider).subscribe(response => {
      if (response.channels[0].items[0] !== undefined) {
        this.newsResponse.push(response.channels[0].items[0]);
      }
      if (response.channels[0].items[1] !== undefined) {
        this.newsResponse.push(response.channels[0].items[1]);
      }
    });
  }
});

The newsClick() function is activated on clicking the News tab and it updates the query and its details in the store.

Step 5

Displaying the results:

Now we will modify results.component.html to show results from the new newsResponse array, which has two results each from five organisations.

For this we will iterate over each item of newsResponse using *ngFor and display its title, link and description in the HTML template. We will also use the [style.color] property of the element and set the color according to the theme.

<div *ngFor="let item of newsResponse" class="result">
  <div class="title">
    <a class="title-pointer" href="{{item.link}}" [style.color]="themeService.titleColor">{{item.title}}</a>
  </div>
  <div class="link">
    <p [style.color]="themeService.linkColor">{{item.link}}</p>
  </div>
</div>

Here is the view of Susper’s News Tab where we are getting results from 5 different organisations.

Resources

  1. YaCy Modifiers: http://www.yacy-websuche.de/wiki/index.php/En:SearchParameters
  2. Angular Services: https://angular.io/tutorial/toh-pt4
  3. Reading JSON data in Angular: https://stackoverflow.com/questions/43275995/angular4-how-do-access-local-json


Implementing Endpoint to Resend Email Verification

Earlier, when a user registered via Open Event Frontend, they received a verification link via email to confirm their account. However, this was not enough in the long term. If the confirmation link expired, or for some reason the verification mail got deleted on the user's side, there was no functionality to resend the verification email, which prevented the user from getting fully registered. Although the front-end already showed the option to resend the verification link, there was no support from the server to do that yet.

So it was decided that a separate endpoint should be implemented to allow re-sending the verification link to a user. /resend-verification-email was an endpoint that fit this action, so we decided to go with it and create a route in the `auth.py` file, which was the appropriate place for this feature to reside. The first step was to do the necessary imports and then the definition:

from app.api.helpers.mail import send_email_confirmation
from app.models.mail import USER_REGISTER_WITH_PASSWORD
...
...
@auth_routes.route('/resend-verification-email', methods=['POST'])
def resend_verification_email():
...

Now we safely fetch the email mentioned in the request and then search the database for the user corresponding to that email:

def resend_verification_email():
    try:
        email = request.json['data']['email']
    except TypeError:
        return BadRequestError({'source': ''}, 'Bad Request Error').respond()

    try:
        user = User.query.filter_by(email=email).one()
    except NoResultFound:
        return UnprocessableEntityError(
{'source': ''}, 'User with email: ' + email + ' not found.').respond()
    else:

    ...

Once a user has been identified in the database, we proceed further and create an essentially unique hash for the user verification. This hash is in turn used to generate a verification link that is then ready to be sent via email to the user:

else:
    serializer = get_serializer()
    hash_ = str(base64.b64encode(str(serializer.dumps(
        [user.email, str_generator()])).encode()), 'utf-8')
    link = make_frontend_url(
        '/email/verify'.format(id=user.id), {'token': hash_})

Finally, the email is sent:

send_email_with_action(
    user, USER_REGISTER_WITH_PASSWORD,
    app_name=get_settings()['app_name'], email=user.email)
if not send_email_confirmation(user.email, link):
    return make_response(jsonify(message="Some error occured"), 500)
return make_response(jsonify(message="Verification email resent"), 200)

But this was not enough. When the endpoint was tested, it was found that actual emails were not being delivered, even after correctly configuring the email settings locally. So, after a bit of debugging, it was found that the settings, which were using Sendgrid to send emails, were using a deprecated Sendgrid API endpoint. A separate email function is used to send emails via Sendgrid and it contained an old endpoint that was no longer recommended by Sendgrid:

@celery.task(name='send.email.post')
def send_email_task(payload, headers):
   requests.post(
       "https://api.sendgrid.com/api/mail.send.json",
       data=payload,
       headers=headers
   )

The new endpoint, as per Sendgrid’s documentation, is:

https://api.sendgrid.com/v3/mail/send

But this was not the only change required. Sendgrid had also modified the structure of requests they accepted, and the new structure was different from the existing one that was used in the server. Following is the new structure:

'{"personalizations": [{"to": [{"email": "example@example.com"}]}],"from": {"email": "example@example.com"},"subject": "Hello, World!","content": [{"type": "text/plain", "value": "Heya!"}]}'

The header structure was also changed, so the structure in the server was also updated to

headers = {
"Authorization": ("Bearer " + key),
"Content-Type": "application/json"
}

The Sendgrid function (which is executed as a Celery task) was modified as follows, to incorporate the changes in the API endpoint and structure:

import json
...
@celery.task(name='send.email.post')
def send_email_task(payload, headers):
    data = {"personalizations": [{"to": []}]}
    data["personalizations"][0]["to"].append({"email": payload["to"]})
    data["from"] = {"email": payload["from"]}
    data["subject"] = payload["subject"]
    data["content"] = [{"type": "text/html", "value": payload["html"]}]
    requests.post(
        "https://api.sendgrid.com/v3/mail/send",
        data=json.dumps(data),
        headers=headers,
        verify=False  # doesn't work with verification in celery context
    )

As can be seen, there is a bug that doesn't allow SSL verification within the celery context. However, the verification is successful when the functionality is executed independently of the celery context. But now email sending via Sendgrid actually works, which makes our verification resend endpoint functional. The email is received successfully by the recipient.


Thus, a working email verification endpoint is implemented, which can be easily integrated in the frontend.


Resources:


Add RSS feed and JSON output based on type specified with query param

The idea behind this blog post is to discuss how RSS feed and JSON output sources have been included in loklak to provide the respective data sources based on the type specified with the query as a parameter.

Accessing Current Query in Info-box

Accessing the link to the RSS feed and JSON output of loklak requires the query to be passed as a value with the parameter 'q' (e.g. api/search.json?q=FOSSASIA or api/search.rss?q=FOSSASIA). In order to represent the links as buttons in the Info-box in the sidebar of loklak.org, the current query needs to be accessed/stored inside the Info-box from the ngrx store.

public stringQuery;
...
this.store.select(fromRoot.getQuery).subscribe(
    query => this.stringQuery = query.displayString);

 

First, the stringQuery variable is created to store the current query from the store. As the query can change in the store (the user might search for several queries), storing the current query needs to be done inside ngOnChanges().
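
Putting the two together, the subscription shown above would sit inside ngOnChanges(), roughly like this (a sketch; the actual component may manage the subscription differently):

ngOnChanges() {
  // Re-read the current query from the ngrx store whenever inputs change
  this.store.select(fromRoot.getQuery).subscribe(
    query => this.stringQuery = query.displayString
  );
}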

Checking type associated with Query

There are various types associated with a query to get different types of results. For example, 'from:FOSSASIA' gives results specifically from 'FOSSASIA' and has a different query parameter from other types like '@FOSSASIA' or '#FOSSASIA'. To assign the appropriate query parameter to each of these types, we need to check the query pattern and apply the query param based on the type.

import { hashtagRegExp, fromRegExp, mentionRegExp }
      from '../../utils/reg-exp';
...

if ( hashtagRegExp.exec(this.stringQuery) !== null ) {
    // Check for hashtag this.stringQuery
    this.queryString = '%23' + hashtagRegExp.exec(
        this.stringQuery)[1] + '' + hashtagRegExp.exec(
        this.stringQuery)[0];
} else if ( fromRegExp.exec(this.stringQuery) !== null ) {
    // Check for from user this.stringQuery
    this.queryString = 'from%3A' + fromRegExp.exec(
        this.stringQuery)[1];
} else if ( mentionRegExp.exec(this.stringQuery) !== null ) {
    // Check for mention this.stringQuery
    this.queryString = '%40' + mentionRegExp.exec(
        this.stringQuery)[1];
} else {
    // for other queries
    this.queryString = this.stringQuery;
}

Note: hashtagRegExp, fromRegExp and mentionRegExp are utility functions created to match the pattern of a given string (the query) in order to classify the type associated with it. They are provided here as a reference and can be extended to support more types.
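
As a rough idea only, the helpers could be shaped like the expressions below; the real patterns in utils/reg-exp may differ, and what matters is that match index [1] captures the text after the #, from: or @ prefix:

// Hypothetical shapes of the pattern helpers referenced above
export const hashtagRegExp = /#(\w+)/;    // match[1] -> text after '#'
export const fromRegExp = /from:(\w+)/;   // match[1] -> user after 'from:'
export const mentionRegExp = /@(\w+)/;    // match[1] -> user after '@'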

Passing current queryString in RSS and JSON link

The general link for both RSS and JSON data remains the same for each query passed; only the associated type changes in the query value. Representing the links in anchor tags in the UI would look like this:

<a class="data rss"
   href="http://api.loklak.org/api/search.rss?timezoneOffset=-330&q={{stringQuery}}"
   target="_blank">
</a>
<a class="data json"
   href="http://api.loklak.org/api/search.json?timezoneOffset=-330&q={{stringQuery}}"
   target="_blank">
</a>

{{stringQuery}} is the actual query parameter to be passed to get the required results.

Testing RSS feed and JSON data

Search for a query on loklak, click on the RSS or JSON button below the sidebar on the results page and compare with the results.

RSS and JSON button should be similar to –

Resources
