Stripe Authorization in Open Event Server

Stripe is a popular software platform for online payments. Since Open Event allows event organizers to sell tickets, an option to accept payments through Stripe is extremely beneficial to the organizer. Stripe allows accepting payments on someone else's behalf using Connect. Connect is Stripe's full-stack solution for platforms that need to process payments and pay out to multiple parties. This blog post goes over how event organizers are able to link their Stripe accounts in order to accept payments later.

Registering the platform

The Admin of the Open Event Server will create an account on Stripe and register the platform. Upon creating the account, he/she will get a secret_key and a publishable_key. Moreover, on registering the platform, a client_id will be provided. These keys are saved in the application settings and only the Admin is authorized to view or change them.

Connecting the Organiser’s account

The Open Event Frontend has a wizard for creating an Event. It provides the organiser an option to connect his/her Stripe account in order to accept payments.

Upon clicking the connect button, the organizer is directed to Stripe where he/she can fill in the required details.

The button directs the organizer to the following URL:

https://connect.stripe.com/oauth/authorize?response_type=code&client_id=client_id&scope=read_write&redirect_uri=redirect_uri 

The above URL has the following parameters:

  • client_id – The client ID acquired when registering your platform. Required.
  • response_type – The response type. The value is always code. Required.
  • redirect_uri – The URL to redirect the customer to after authorization.
  • scope – We need it to be read_write in order to be able to charge on behalf of the customer later.

After successfully entering the required details, the organizer is redirected to the redirect_uri specified in the above URL with a query parameter containing the authorization code. The Frontend sends this code to the Server using the Stripe Authorization endpoint, which is discussed in detail below.

Fetching Tokens from Stripe

The Server accepts the authorization code by exposing the Stripe Authorization endpoint. It then uses the code to fetch the organizer's details and tokens from Stripe and stores them for future use.

The schema for Stripe Authorization is extremely simple. We require the client to send the authorization code (stripe_auth_code), which will be used to fetch the details. The stripe_publishable_key of the event organizer is exposed via the endpoint and will be used by the Frontend later.

class StripeAuthorizationSchema(Schema):
    """
        Stripe Authorization Schema
    """

    class Meta:
        """
        Meta class for StripeAuthorization Api Schema
        """
        type_ = 'stripe-authorization'
        self_view = 'v1.stripe_authorization_detail'
        self_view_kwargs = {'id': '<id>'}
        inflect = dasherize

    id = fields.Str(dump_only=True)
    stripe_publishable_key = fields.Str(dump_only=True)
    stripe_auth_code = fields.Str(load_only=True, required=True)

    event = Relationship(attribute='event',
                         self_view='v1.stripe_authorization_event',
                         self_view_kwargs={'id': '<id>'},
                         related_view='v1.event_detail',
                         related_view_kwargs={'stripe_authorization_id': '<id>'},
                         schema="EventSchema",
                         type_='event')

We use the Requests library to fetch the results. First we fetch the credentials that we had stored in the application settings using a helper method called get_credentials. We then use the secret key along with the authorization code to make a POST request to the Stripe Connect API. The full method is given below for reference.

@staticmethod
def get_event_organizer_credentials_from_stripe(stripe_auth_code):
    """
    Uses the stripe_auth_code to get the other credentials for the event organizer's stripe account
    :param stripe_auth_code: stripe authorization code
    :return: response from stripe
    """
    credentials = StripePaymentsManager.get_credentials()

    if not credentials:
        raise Exception('Stripe is incorrectly configured')

    data = {
        'client_secret': credentials['SECRET_KEY'],
        'code': stripe_auth_code,
        'grant_type': 'authorization_code'
    }

    response = requests.post('https://connect.stripe.com/oauth/token', data=data)
    return json.loads(response.text)

We call the above method before creating the object, inside the before_create_object method of the flask-rest-jsonapi data layer, which allows us to do data preprocessing and validation.

If the request is a success, the response from the Stripe Connect API includes all the details necessary to accept payments on the organizer's behalf. We add these fields to the data and save it in the database; a sketch of this mapping is shown after the sample response below.

{
  "token_type": "bearer",
  "stripe_publishable_key": PUBLISHABLE_KEY,
  "scope": "read_write",
  "livemode": false,
  "stripe_user_id": USER_ID,
  "refresh_token": REFRESH_TOKEN,
  "access_token": ACCESS_TOKEN
}
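
A minimal sketch of how these fields could be mapped onto the incoming data inside the before_create_object hook is given below. It is illustrative only and assumes the helper shown above and the field names of the model shown later in this post; the exception type is also an assumption.

def before_create_object(self, data, view_kwargs):
    # Exchange the auth code sent by the frontend for the organizer's credentials.
    response = StripePaymentsManager.get_event_organizer_credentials_from_stripe(
        data['stripe_auth_code'])

    if 'error' in response:
        # Surface Stripe's error_description to the client (exception type assumed).
        raise ValueError(response['error_description'])

    # Copy the Stripe response onto the fields of the StripeAuthorization model.
    data['stripe_secret_key'] = response['access_token']
    data['stripe_refresh_token'] = response['refresh_token']
    data['stripe_publishable_key'] = response['stripe_publishable_key']
    data['stripe_user_id'] = response['stripe_user_id']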

In case there was an error, an error_description would be returned. This error_description is sent back to the frontend and shown to the event organizer.

{
  "error": "invalid_grant",
  "error_description": "Authorization code already used: AUTHORIZATION_CODE"
}

After successfully fetching the results, we save them in the database and return the stripe_publishable_key, which will be used by the Frontend when charging the ticket buyers later.

Lastly we can go over the Stripe Authorization model as well. The stripe_secret_key will be used to charge the customers later.

id = db.Column(db.Integer, primary_key=True)
stripe_secret_key = db.Column(db.String)
stripe_refresh_token = db.Column(db.String)
stripe_publishable_key = db.Column(db.String)
stripe_user_id = db.Column(db.String)
stripe_auth_code = db.Column(db.String)


SUSI AI 5 Star Skill Rating System

For making a system more reliable and robust, continuous evaluation is quite important, and the same holds for SUSI AI. User feedback is important to improve SUSI skills and create new ones. Previously we had only thumbs up / thumbs down as a feedback method from the SUSI chat client, but now a 5 star rating system has been added to the SUSI Skill CMS so that users can rate a skill there. Before the implementation of the API, let's look at how the data is stored in SUSI AI: susi_server uses a DAO in which the skill rating is stored as a JSONTray.

The server side implementation

A new java class has been created for the API, FiveStarRateSkillService.java.

public class FiveStarRateSkillService extends AbstractAPIHandler implements APIHandler {
    private static final long serialVersionUID =7947060716231250102L;
    @Override
    public BaseUserRole getMinimalBaseUserRole() {
        return BaseUserRole.USER;
    }
    @Override
    public JSONObject getDefaultPermissions(BaseUserRole baseUserRole) {
        return null;
    }
    @Override
    public String getAPIPath() {
        return "/cms/rateSkill.json";
    }
...
}

The getMinimalBaseUserRole method tells the minimum user role required to access this servlet; it can be ADMIN, USER or ANONYMOUS. In our case it is USER: a user needs to be logged in to rate a skill on a scale of 1-5 stars. The API runs at the "/cms/fiveStarRateSkill.json" endpoint.

Next, we create the serviceImpl method in the above class to handle the request from the client and respond to it.

1. Fetch the required query parameters and store them in variables. They include the skill model, group, language, skill name and the stars that the user has given in the rating.

String skill_name = call.get("skill", null);
String skill_stars = call.get("stars", null);

2. Then check if the skill exists. If not, throw an exception. Otherwise, increment the count of the corresponding rating. The rating object has the keys one_star, two_star, three_star, four_star and five_star, which hold the count of each star rating.

if (skill_stars.equals("1")) {
    skillName.put("one_star", skillName.getInt("one_star") + 1 + "");
}
else if (skill_stars.equals("2")) {
    skillName.put("two_star", skillName.getInt("two_star") + 1 + "");
}
else if (skill_stars.equals("3")) {
    skillName.put("three_star", skillName.getInt("three_star") + 1 + "");
}
else if (skill_stars.equals("4")) {
    skillName.put("four_star", skillName.getInt("four_star") + 1 + "");
}
else if (skill_stars.equals("5")) {
    skillName.put("five_star", skillName.getInt("five_star") + 1 + "");
}

3. Re-calculate the total number of ratings for that skill and its average rating, and update the object. If the skill has not been rated before, create a new rating object and initialize its star counts to 0.

public JSONObject createRatingObject(String skill_stars) {
        JSONObject skillName = new JSONObject();
        JSONObject skillStars = new JSONObject();

        skillStars.put("one_star", 0);
        skillStars.put("two_star", 0);
        skillStars.put("three_star", 0);
        skillStars.put("four_star", 0);
        skillStars.put("five_star", 0);
        skillStars.put("avg_star", 0);
        skillStars.put("total_star", 0);

        skillName.put("stars", skillStars);
        return skillName;
}

The complete FiveStarRateSkillService.java is available here:

https://github.com/fossasia/susi_server/blob/development/src/ai/susi/server/api/cms/FiveStarRateSkillService.java

Rating a skill

Sample endpoint

https://api.susi.ai/cms/fiveStarRateSkill.json?model=general&group=Knowledge&language=en&skill=aboutsusi&stars=3&callback=p&_=1526813916145

This gives 3 star rating to the “aboutsusi” skill.

Parameters

  • Model
  • Group
  • Language
  • Skill
  • Stars

Response

{
  "ratings": {
    "one_star": 0,
    "four_star": 0,
    "five_star": 1,
    "total_star": 1,
    "three_star": 0,
    "avg_star": 5,
    "two_star": 0
  },
  "session": {"identity": {
    "type": "email",
    "name": "1anuppanwar@gmail.com",
    "anonymous": false
  }},
  "accepted": true,
  "message": "Skill ratings updated"
}

Getting the stats of Skill Ratings

Sample endpoint

http://127.0.0.1:4000/cms/getSkillRating.json?model=general&group=Knowledge&language=en&skill=aboutsusi&callback=p&_=1526813916145

This fetches the current ratings of the “aboutsusi” skill.

Parameters

  • Model
  • Group
  • Language
  • Skill

Response

{
    "session": {
        "identity": {
            "type": "host",
            "name": "172.68.144.159_81c88a10",
            "anonymous": true
        }
    },
    "skill_name": "aboutsusi",
    "accepted": true,
    "message": "Skill ratings fetched",
    "skill_rating": {
        "negative": "0",
        "positive": "0",
        "stars": {
            "one_star": 0,
            "four_star": 2,
            "five_star": 1,
            "total_star": 4,
            "three_star": 1,
            "avg_star": 4,
            "two_star": 0
        },
        "feedback_count": 3
    }
}

Conclusion

So this 5 star rating system will help in improving the SUSI skills. Also, it will help in making better decisions when we have multiple similar skills and we have to choose one to respond to the user query.


Adding System Image for Event Categories

The Open Event Server uses the JSON:API 1.0 specification and is built on top of Flask-REST-JSONAPI (for building REST APIs) and Marshmallow (for Schema). In this blog, we will talk about how to add the feature of a System Image for Event Categories on the Open Event Server. The focus is on Model updation, Schema updation and migrating the Database.

Model Updation

For adding System Image, we’ll update our Model EventTopic.

In this feature, we are providing rights to the Admin to add a system image for each Event Category, so that if no image is given by an organizer on event creation, the system image of that Event Category is used as the event image by default.

Here we are adding a column named system_image_url, which is of type String. This column cannot be nullable and has a default value (a sketch of the column is shown below).
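
A sketch of the added column; the exact default URL is project-specific and shown here only as a placeholder:

system_image_url = db.Column(
    db.String,
    nullable=False,
    default="https://example.com/event-topic-default.png")  # placeholder default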

Migrating the Database

For migrating the database we use a few simple Flask-Migrate commands (the typical sequence is shown below). Generating and running the migration can fail with a "Multiple Migration Heads" error; this happens when two developers push migration files without merging the two heads into one. In that case we first list the ids of the two migration heads, merge them into a single head, and then upgrade the database to apply the migrations.
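
Assuming the standard Flask-Migrate setup driven through the project's manage.py script, the typical sequence looks roughly like this:

# generate a migration for the model change and apply it
python manage.py db migrate
python manage.py db upgrade

# if a "Multiple Migration Heads" error is reported, list the head ids,
# merge them into one head and upgrade again
python manage.py db heads
python manage.py db merge heads
python manage.py db upgrade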

Schema Updation

For the system image, we'll update the EventTopicSchema as described below.

In this feature, to provide a system image for each Event Category, we'll add a field named system_image_url to the Schema.

Here we are adding a field named system_image_url, which is of the marshmallow field type URL. This value cannot be none.
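
The added field could look something like this (a sketch, not necessarily the exact project code):

system_image_url = fields.Url(allow_none=False)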

Validating the Event Image and using System Image by default

In this step, we'll check whether an event image is provided by the organizer. If it is not provided, we'll use the system image of the Event Category as the event image.

Here, we first take the event topic of the event as added by the organizer. Then we fetch the database row in the EventTopic model which has id == event_topic_id. Then we assign the system image URL of that event topic to the event image (a sketch is shown below).
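
A hedged sketch of that fallback, with field and attribute names assumed for illustration:

if data.get('original_image_url') is None and data.get('event_topic_id'):
    # Fetch the EventTopic row whose id equals event_topic_id
    event_topic = EventTopic.query.filter_by(id=data['event_topic_id']).first()
    if event_topic:
        # Fall back to the system image of that Event Category
        data['original_image_url'] = event_topic.system_image_url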

So we saw how we could provide a default image for any event.


Adding Push endpoint to send data from Loklak Search to Loklak Server

To provide an enriched and sufficient amount of data to Loklak, the Loklak Server should have multiple sources of data. The api/push.json endpoint of Loklak Server is used in Loklak Search to post the search result object to the server. It will increase the amount and quality of data on the server once the Twitter API is supported by Loklak (work is in progress to add support for the Twitter API in Loklak).

Creating Push Service

The idea is to create a separate service for making a Post request to server. First step would be to create a new ‘PushService’ under ‘services/’ using:

ng g service services/push

Creating model for Push Api Response

Before starting to write code for the push service, create a new model for the type of response data obtained from the POST request to 'api/push.json'. For this, create a new file push.ts under 'models/' with the code given below and export the new interface from the models index file.

export interface PushApiResponse {
   status: string;
   records: number;
   mps: number;
   message: string;
}

Writing Post request in Push Service

The next step is to create a POST request to api/push.json using the HttpClient module. Import the necessary dependencies, create an instance of HttpClient in the constructor and write a postData() method which takes the data to be sent, makes a POST request and returns an Observable of the PushApiResponse created above.

import { Injectable } from '@angular/core';
import {
   HttpClient,
   HttpHeaders,
   HttpParams
} from '@angular/common/http';
import { Observable } from 'rxjs';
import {
   ApiResponse,
   PushApiResponse
} from '../models';

@Injectable({
   providedIn: 'root'
})
export class PushService {

   constructor( private http: HttpClient ) { }

   public postData(data: ApiResponse): Observable<PushApiResponse> {

      const httpUrl = 'https://api.loklak.org/api/push.json';
      const headers = new HttpHeaders({
         'Content-Type': 'application/x-www-form-urlencoded',
         'Accept': 'application/json',
         'cache-control': 'no-cache'
      });
      const {search_metadata, statuses} = data;

      // Converting the object to a JSON string.
      const dataToSend = JSON.stringify({
         search_metadata: search_metadata,
         statuses
      });

      // Setting the data to send in HttpParams() with key as 'data'.
      const body = new HttpParams()
         .set('data', dataToSend);

      // Making a Post request to the api/push.json endpoint.
      // The response object is converted to PushApiResponse type.
      return this.http.post<PushApiResponse>(httpUrl, body, {
         headers: headers
      });
   }
}

 

Note: The data (dataToSend) sent to the backend should be in exactly the same format as obtained from the server.

Pushing data into service dynamically

Now the main part is to provide the data to be sent to the service. To make it dynamic, import the PushService in the 'api-search.effects.ts' file under effects and create an object of PushService in its constructor.

import { PushService } from '../services';
constructor(

   private pushService: PushService
) { }

 

Now, call the pushService object's postData() method inside the 'relocateAfterSearchSuccess$' effect and pass it the search response data (the payload value of the search success action).

@Effect()
relocateAfterSearchSuccess$: Observable<Action>
   = this.actions$
       .pipe(
           ofType(
               apiAction.ActionTypes.SEARCH_COMPLETE_SUCCESS,
               apiAction.ActionTypes.SEARCH_COMPLETE_FAIL
           ),
           withLatestFrom(this.store$),
           map(([action, state]) => {
               this.pushService
                   .postData(action['payload']);
           })
       );

Testing Successful Push to Backend

To test the success of the POST request, subscribe to the response data and log it to the console. Each logged PushApiResponse object corresponds to one successful POST request.


Adding Event Roles concerning a User on Open Event Server

The Open Event Server enables organizers to manage events from concerts to conferences and meetups. It offers features for events with several tracks and venues. Event managers can create invitation forms for speakers and build schedules in a drag and drop interface. The event information is stored in a database. The system provides API endpoints to fetch the data, and to modify and update it. The Open Event Server is based on the JSON:API 1.0 specification and hence built on top of Flask-REST-JSONAPI (for building REST APIs) and Marshmallow (for Schema).

In this blog, we will talk about how to add the different event roles concerning a user on the Open Event Server. The focus is on the Model and Schema updation.

Model Updation

For the User table, we'll update our User model by adding a hybrid property for each event role (a sketch is given after the explanation below).

Now, let's try to understand these hybrid properties.

In this feature, we are providing the Admin the right to see whether a user is acting as an organizer, co-organizer, track_organizer, moderator, attendee or registrar of any event. Here, the _is_role method is used to check whether a user plays one of these event roles. This is done by querying the UsersEventsRoles model: if a matching record is present, the returned value is True, otherwise False.
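
A minimal sketch of one such hybrid property is shown below. The model, relationship and role names are assumptions based on this post, not the exact project code.

from sqlalchemy.ext.hybrid import hybrid_property


class User(db.Model):
    # ... existing columns ...

    def _is_role(self, role_name, event_id=None):
        """Check whether the user holds the given role for any event (or one specific event)."""
        role = Role.query.filter_by(name=role_name).first()
        query = UsersEventsRoles.query.filter_by(user=self, role=role)
        if event_id:
            query = query.filter_by(event_id=event_id)
        return query.first() is not None

    @hybrid_property
    def is_organizer(self):
        return self._is_role('organizer')

    @hybrid_property
    def is_registrar(self):
        return self._is_role('registrar')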

Schema Updation

For the User model, we'll update our Schema by adding a field for each of these properties (a sketch is given after the explanation below).

Now, let's try to understand this Schema.

Since all these properties return either True or False, they are all set as Boolean fields in the Schema. Here dump_only means that the field is only returned in responses and cannot be set by the client.
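
These properties translate into a set of dump-only Boolean fields in the schema, roughly like the following sketch (field names are illustrative):

is_organizer = fields.Boolean(dump_only=True)
is_coorganizer = fields.Boolean(dump_only=True)
is_track_organizer = fields.Boolean(dump_only=True)
is_moderator = fields.Boolean(dump_only=True)
is_attendee = fields.Boolean(dump_only=True)
is_registrar = fields.Boolean(dump_only=True)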

So, we saw how the User model and Schema are updated to show the event roles concerning a user on the Open Event Server.


Adding Defaults Prior to Schema Validation Elegantly

The Open Event Server offers features for events with several tracks and venues. When we were designing the models for the API, we wanted to add default values for some fields in case they aren't provided by the client. This blog discusses how we have implemented this in the project using Python decorators, in a way that complies with the DRY principle and is easy to test and maintain.

Problem

Let’s first discuss the problem at hand in detail. We use Marshmallow extensively in our project. Marshmallow is an ORM/ODM/framework-agnostic library for converting complex data types, such as objects, to and from native python data types. We use it for Validating the input data, Deserializing the input data to app-level objects and Serializing app-level objects to primitive Python types.

We can define Schema’s very easily using Marshmallow. It also provides an easy way to declare default values to the fields. Below is a sample schema declaration:

class SampleSchema(Schema):
    """
    Sample Schema declaration
    """

    class Meta:
        """
        Meta class for the Sample Schema
        """
        type_ = 'sample-schema'

    id = fields.Str(dump_only=True)
    field_without_default = fields.Str(required=True)
    field_with_default = fields.Boolean(required=True, default=False)

We have defined an id field for storing the unique ID of the Model. We have also defined two other fields. One of them, named "field_with_default", is a Boolean field with its default value specified as False.

When a client makes a POST/PATCH request to the server, we first validate the input data sent to us by the clients against the defined schema. Marshmallow also supports schema validation, but it doesn't support using default values during deserialization of the input data. This meant that whenever the input data had a missing field, the server would throw a validation error even for a field for which a default value is defined. That is clearly wrong, since if a default value is defined, we would want that value to be used for the field. This defeats the entire purpose of declaring default values in the first place.

So, we would ideally like the following behaviour from the Server:

  1. If the values are defined in the input data, use it during validation.
  2. If the value for a required field is not defined but default value has been defined in the Schema, then use that value.
  3. If no value has been defined for a required field and it doesn’t have any default value specified, then throw an error.

Solution

Marshmallow provides decorators like @pre_load and @post_load for adding pre-processing and post-processing methods. We can use them to add a method in each of the Schema classes which takes in the input data and the schema and adds default values to fields before we validate the input.

The first approach which we took was to add the following method to each of the schema classes defined in the project.

@pre_load
def patch_defaults(schema, in_data):
    data = in_data.get('data')
    if data is None or data.get('attributes') is None:
        return in_data
    attributes = data.get('attributes')
    for name, field in schema.fields.items():
        dasherized_name = dasherize(name)
        attribute = attributes.get(dasherized_name)
        if attribute is None:
            attributes[dasherized_name] = field.default
    return in_data

The method loops over all the fields defined in the schema class using schema.fields.items(). dasherize is a helper function defined in the utils class which converts underscores (_) in the variable name to dashes (-). After replacing the underscores with dashes, we check if the value for the attribute is None. If it is None, then we assign it the specified default value.

Enhancing the solution

The above solution works but there is a problem. We have around 50 schemas defined in the project. Copy pasting this method 50 times would definitely violate the DRY principle. Moreover if we need to change this method in the future, we would have to do it 50 times.

One way to avoid it would be to add the patch_defaults method in a separate file and add a helper method make_object in each of the schema classes which just calls it.

@pre_load
def make_object(self, in_data):
    return patch_defaults(self, in_data)

We would still be repeating the helper method in 50 different files, but since its sole purpose is to call the patch_defaults method, we won't have to make changes in 50 files.

It certainly works well but we can go a step further and make it even easier. We can define a class decorator which would add the above make_object method to the class.

def use_defaults():
    """
    Decorator added to model classes which have default values specified for one of its fields
    Adds the make_object method defined above to the class.
    :return: wrapper
    """
    def wrapper(k, *args, **kwargs):
        setattr(k, "make_object", eval("make_object", *args, **kwargs))
        return k
    return wrapper

Now we can simply add the use_defaults() decorator on the schema class and it would work.
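
For example, applying it to the SampleSchema from the beginning of this post looks like this:

@use_defaults()
class SampleSchema(Schema):
    class Meta:
        type_ = 'sample-schema'

    id = fields.Str(dump_only=True)
    field_without_default = fields.Str(required=True)
    field_with_default = fields.Boolean(required=True, default=False)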


Adding Statistics of Event-Type on Open Event Server

The Open Event Server enables organizers to manage events from concerts to conferences and meet-ups. It offers features for events with several tracks and venues. In this blog, we will talk about how to add statistics of event types on the Open Event Server. The focus is on getting the number of events of a specific event type.

Number of events of a specific event type

Now, let's try to understand this API (a sketch of the endpoint is given after the list below). Here, we are using Flask Blueprints to add this API to the API index.

  1. First of all, we use the route decorator of the event_statistics Blueprint, which prefixes this API route with the prefix mentioned in the Blueprint event_statistics.
  2. We only allow logged-in users to access this API, using JWT (JSON Web Token).
  3. Returning a response containing all the event types along with the number of events of each would require a lot of queries if done through the SQLAlchemy ORM. So instead of using the ORM we query with a raw SQL command, fetching the counts for all event types in a single query, which reduces the time the server takes to return the response.
  4. In the function event_types_count we use db.engine.execute to run the SQL command that gets the statistics of events per event type.
  5. The response includes the id and name of each event_type and the count of events of the corresponding event_type.
  6. Finally, we jsonify the list of these statistics objects.
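
A hedged sketch of what such an endpoint could look like is given below. The module paths, blueprint prefix, table and column names are illustrative assumptions, not the exact project code.

from flask import Blueprint, jsonify
from flask_jwt import jwt_required

from app.models import db

event_statistics = Blueprint('event_statistics', __name__, url_prefix='/v1/admin/statistics')


@event_statistics.route('/events/count-by-type')
@jwt_required()
def event_types_count():
    # One raw SQL query instead of one ORM query per event type
    result = db.engine.execute(
        "SELECT event_types.id, event_types.name, COUNT(events.id) AS count "
        "FROM event_types LEFT JOIN events ON events.event_type_id = event_types.id "
        "GROUP BY event_types.id, event_types.name")
    return jsonify([{'id': row[0], 'name': row[1], 'count': row[2]} for row in result])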

Similarly, event topics statistics can be implemented to return the number of events of all the event topics.


Adding JSON-API to Badgeyay Backend

Badgeyay has two main components, the Python-Flask backend server and the EmberJS frontend.

EmberJS frontend uses ember data to save the data from the backend server api into the store of EmberJS frontend. To make the ember data frontend comply with backend api we need the backend server to send responses that comply with the standards of the JSON-API.

What is JSON-API?

As stated by JSONAPI.ORG

"If you've ever argued with your team about the way your JSON responses should be formatted, JSON API can be your anti-bikeshedding tool."

To put it up simply, JSON-API is a way of representing the JSON data that is being generated by the server backend. In this way we represent the JSON data in a particular way that follows the JSON-API convention. An example of data that follows json-api standards is given below:

{
  "data": {
    "id": "1",
    "type": "posts",
    "attributes": {
      "title": "This is a JSON API data"
    },
    "relationships": {
      "author": {
        "links": {
          "related": "/example/number"
        }
      },
      "comments": {
        "links": {
          "related": "/example/number/article/"
        },
        "data": [
          {"id": "5", "type": "example"},
          {"id": "12", "type": "example"}
        ]
      }
    }
  }
}

Adding JSON-API using Marshmallow-JSONAPI

We then proceeded to add JSON-API to the Python-Flask backend. Before adding JSON-API, we first need to install marshmallow_jsonapi.

To install marshmallow_jsonapi

$ ~ pip install marshmallow-jsonapi

After installing marshmallow_jsonapi, we proceed onto making our first schema.

A schema is a layer of abstraction provided over a database model that can be used to dump data from or load data into an object. This object can then either be stored in the database or dumped to the EmberJS frontend. Let us create a schema for File.

from marshmallow_jsonapi.flask import Schema
from marshmallow_jsonapi import fields


class FileSchema(Schema):
    class Meta:
        type_ = 'File'
        self_view = 'fileUploader.get_file'
        self_view_kwargs = {'id': '<id>'}

    id = fields.Str(required=True, dump_only=True)
    filename = fields.Str(required=True)
    filetype = fields.Str(required=True)
    user_id = fields.Relationship(
        self_url='/api/upload/get_file',
        self_url_kwargs={'file_id': '<id>'},
        related_url='/user/register',
        related_url_kwargs={'id': '<id>'},
        include_resource_linkage=True,
        type_='User'
    )

So we have successfully created a Schema for getting files. This schema has an id, filename and filetype. It also has a relationship with the User.

Let us now create a route for this Schema. The below snippet of code is used to find a given file using this schema.

@router.route('/get_file', methods=['GET'])
def get_file():
    input_data = request.args
    file = File().query.filter_by(filename=input_data.get('filename')).first()
    return jsonify(FileSchema().dump(file).data)

 

Now to get details of a file using our newly created route and schema all we need to do is use the following cURL command:

$ ~ curl -X GET "http://localhost:5000/api/upload/get_file?filename={your_file_name}"

You will get something like this as a response:

{
"data": {
"attributes": {
"filename": "13376967-8846-4c66-bcab-4a6b7d58aca7.csv",
"filetype": "csv"
},
"id": "967dc51b-289a-43a1-94c1-5cfce04b0fbf",
"links": {
"self": "/api/upload/get_file"
},
"relationships": {
"user_id": {
"data": {
"id": "J9v2LBIai1MOc8LijeLx7zWsP4I2",
"type": "User"
},
"links": {
"related": "/user/register",
"self": "/api/upload/get_file"
}
}
},
"type": "File"
},
"links": {
"self": "/api/upload/get_file"
}
}

Further Improvements

After adding JSON-API standards to the backend API we can easily integrate it with the EmberJS frontend. Now we can work on adding more schemas as additional layers of abstraction so that the backend can serve more functionality and the data can be consumed by the frontend as well.


Open Event Server – Pages API

This article illustrates how the Pages API has been designed and implemented on the server side, i.e., FOSSASIA's Open Event Server. The Pages endpoint is used to create static pages such as an "About" page or any other page that doesn't need to be updated frequently and only needs to show specific content.

Parameters

  1. name – This stores the name of the page.
      1. Type – String
      2. Required – Yes
  2. title – This stores the title of the page.
      1. Type – String
      2. Required – No
  3. url – This stores the url of the page.
      1. Type – String
      2. Required – Yes
  4. description – This stores the description of the page.
      1. Type – String
      2. Required – Yes
  5. language – This stores the language of the page.
      1. Type – String
      2. Required – No
  6. index – This stores the position of the page.
      1. Type – Integer
      2. Required – No
      3. Default – 0
  7. place – Location where the page will be placed.
      1. Type – String
      2. Required – No
      3. Accepted Values – ‘footer’ and ‘event’

These are the allowed parameters for the endpoint.

Model

Let's see how we model this API. The ORM model looks like this:

__tablename__ = 'pages'
id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String, nullable=False)
title = db.Column(db.String)
url = db.Column(db.String, nullable=False)
description = db.Column(db.String)
place = db.Column(db.String)
language = db.Column(db.String)
index = db.Column(db.Integer, default=0)

As you can see, we created a table called “pages”. This table has 8 columns, 7 of which are the parameters that I have mentioned above. The column “id” is an Integer column and is the primary key column. This will help to differentiate between the various entries in the table.


API

We support the following operations:

  1. GET all the pages in the database
  2. POST create a new page
  3. GET details of a single page as per id
  4. PATCH a single page by id
  5. DELETE a single page by id

To implement this we first add the routes in our python file as follows :

api.route(PageList, 'page_list', '/pages')
api.route(PageDetail, 'page_detail', '/pages/<int:id>')

Then we define these classes to handle the requests. The first route looks as follows:

class PageList(ResourceList):
   """
   List and create page
   """
   decorators = (api.has_permission('is_admin', methods="POST"),)
   schema = PageSchema
   data_layer = {'session': db.session,
                 'model': Page}

As can be seen above, this route requires the user to be an admin for the POST method. It uses the Page model described above and handles listing all pages and creating a new page.

The second route is:

class PageDetail(ResourceDetail):
   """
   Page detail by id
   """
   schema = PageSchema
   decorators = (api.has_permission('is_admin', methods="PATCH,DELETE"),)
   data_layer = {'session': db.session,
                 'model': Page}

This route requires the user to be an admin for the PATCH and DELETE methods. It uses the Page model and handles GET, PATCH and DELETE requests for a single page.
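
Both of these routes reference a PageSchema. A minimal sketch of what such a schema could look like, mirroring the parameters listed above, is given below; it is illustrative and the actual project schema may differ.

class PageSchema(Schema):
    """
    API Schema for the Page model (illustrative sketch)
    """
    class Meta:
        type_ = 'page'
        self_view = 'v1.page_detail'
        self_view_kwargs = {'id': '<id>'}
        inflect = dasherize

    id = fields.Str(dump_only=True)
    name = fields.Str(required=True)
    title = fields.Str(allow_none=True)
    url = fields.Str(required=True)
    description = fields.Str(required=True)
    place = fields.Str(allow_none=True)  # accepted values: 'footer' and 'event'
    language = fields.Str(allow_none=True)
    index = fields.Integer(default=0)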

To summarise our APIs are:

  • GET /v1/pages{?sort,filter}
  • POST /v1/pages{?sort,filter}
  • GET /v1/pages/{page_id}
  • PATCH /v1/pages/{page_id}
  • DELETE /v1/pages/{page_id}


Generating Badges from Badgeyay API

Badgeyay is a badge generator and its main functionality is generating badges. Since the beginning of the GSoC 2018 period, Badgeyay has been under a refactoring and remodeling process. We have introduced many APIs to make sure that Badgeyay works. Now the badge generator has an endpoint to generate badges for your events/meetups.

How to create badges?

Creating badges using the newly formed API is simpler than before. All you need to do is pass some basic details (the image you want, the data you want, the size and color of the font, etc.) to the API and whoosh! Within the blink of an eye the badges are generated.

Backend requires some data fields to generate badges

{
  "csv": "a731h-jk12n-bbau2-saj2-nxg31.csv",
  "image": "p2ja7-gna398-c23ba-naj31a.png",
  "text-color": "#ffffff"
}

“csv” is the filename of the CSV that the user uploads and gets back as a result, “image” is the image name that the user gets after a successful upload to the respective API, and “text-color” is the color of the text that the user wants on the badges.

Output of the API

{
  "output": "path-to-the-pdf-of-the-badge-generated",
  .
  .
}

What is happening behind the scene?

Once the user sends the data to the API, the required route is triggered and the data is checked. If the data is not present, an error response is sent back to the user to inform them about missing or improperly formatted data.

import os
from flask import Blueprint, jsonify, request
from flask import current_app as app
# from api.helpers.verifyToken import loginRequired
from api.utils.response import Response
from api.utils.svg_to_png import SVG2PNG
from api.utils.merge_badges import MergeBadges


router = Blueprint('generateBadges', __name__)


@router.route('/generate_badges', methods=['POST'])
def generateBadges():
    try:
        data = request.get_json()
    except Exception as e:
        return jsonify(
            Response(401).exceptWithMessage(str(e), 'Could not find any JSON'))

    if not data.get('csv'):
        return jsonify(
            Response(401).generateMessage('No CSV filename found'))
    if not data.get('image'):
        return jsonify(
            Response(401).generateMessage('No Image filename found'))

    csv_name = data.get('csv')
    image_name = data.get('image')
    text_color = data.get('text-color') or '#ffffff'

    svg2png = SVG2PNG()
    svg2png.do_text_fill('static/badges/8BadgesOnA3.svg', text_color)
    merge_badges = MergeBadges(image_name, csv_name)
    merge_badges.merge_pdfs()

    output = os.path.join(app.config.get('BASE_DIR'), 'static', 'temporary', image_name)
    return jsonify(
        Response(200).generateMessage(str(output)))

 

After the data is received, we send it to MergeBadges which internally calls the GenerateBadges class which creates the badges.

Brief explanation of the Badge Generation Process:
- Gather data from the user
- Fill the SVG for badges with the text color

- Load the image from uploads directory
- Generate badges for every individual
- Create PDFs for individual Badges
- Merge those PDFs to provide an all-badges pdf to the user

 

And this is how we generated badges for the user using the Badgeyay Backend API.

How is this effective?

We make sure that the user chooses only the image and CSV that he/she has uploaded.

In this way we maintain a proper workflow. We also manage these badges in the database, and hence using the filenames helps a lot. It does not involve sending huge files and a lot of data like we had in the previous API.

Earlier, we used to send the image and the CSV altogether, which caused serious mismanagement in the project. Now we accept the CSVs and the images on different API routes and then use the specific image and CSV to make the badges. We can more easily relate the files associated with each and every badge and hence manage them easily in the database.

Further Improvements

We will work on adding security to the route so that not just anyone can create badges. We also need to integrate the database into the badge generation service so that we can keep track of the badges that the user has generated.
