A guide to use Permission Manager in Open Event API Server

This article provides a simple guide to using the permission manager in the Open Event API Server. The permission manager is constantly being improved and new features are being added to it. To ensure that all co-developers get to know about these features and make use of them, this blog post describes every part of the permission manager.

Bootstrapping

The permission manager, as a part of flask-rest-jsonapi, works as a decorator for different resources of the API. There are two ways to provide the permission decorator to any view:

  • The first way is to provide it in the list of decorators:
decorators = (api.has_permission('is_coorganizer', fetch="event_id",
                                fetch_as="event_id", model=StripeAuthorization),)
  • The second way is to apply it explicitly as a decorator to any view:
    @api.has_permission('custom_arg', custom_kwargs='custom_kwargs')
    def get(*args, **kwargs):
        return 'Hello world !'

To understand the bootstrapping, we first need to understand the flow of resources in the API. All resources call the decorators even before doing any schema check, so you will not get any request data in the permission methods. All you will receive is a dict of the URL parameters, and even that will not include the filter parameters.
The permission manager receives five parameters:

def permission_manager(view, view_args, view_kwargs, *args, **kwargs):

The first three are provided to it implicitly by the flask-rest-jsonapi module:

  • view: This is the resource’s view method which is called through the API. For example, if I go to /events then the get method of ResourceList will be called.
  • view_args: These are args associated with that view.
  • view_kwargs: These are kwargs associated with that resource view. It includes all your URL parameters as well.
  • args: These are the custom args which are provided when calling the permission manager. The permission manager expects the first index of args to be the name of the permission to check for.
  • kwargs: This is the custom dict which is provided when calling the permission manager. It is the main pillar of the permission manager and is described below in the usage section.

Using Permission Manager

Using the permission manager is basically about understanding the different options you can send through the kwargs, so here is the list of things you can send to the permission manager.
These are all described in their order of priority in the permission manager; a combined usage sketch follows the list.

  • method (string): A string containing the HTTP methods for which the permission needs to be checked, given as comma-separated values.
    For example: method="GET,POST"
  • leave_if (lambda): This receives a lambda function which should return a boolean value. If the returned value is true, the permission check is skipped. The provided lambda function receives only one parameter, view_kwargs.
    An example use case is a situation where you want to skip the permission check for an endpoint related to some specific resource and do a manual check in the method itself.
  • check (lambda): The opposite of leave_if. It receives a lambda function that returns a boolean value. Only if the returned value is true will the request be checked further for permissions; otherwise a Forbidden error is thrown.
  • fetch (string): A string containing the name of the key which has to be fetched for the fetch_as key (described below). The permission manager will first look for this value in the view_kwargs dict; if it is not there, it will make a query to get one (described below under model).
  • fetch_as (string): A string containing the name of a key. The value of the fetch key will be sent to the permission functions under this name.
  • model (class): This is the most interesting concept here. To get the value of the fetch key, the permission manager first looks into view_kwargs, and if there is no such value, you can still get one through the model. The model attribute receives the class of the database model which will be used to get the value of the fetch key.
    The permission manager queries this model for a single resource, looks up the value of the fetch key on it, and then passes it to the permission functions/methods.
    The interesting part is that by default it uses the id from view_kwargs to get the resource from the model, but if there is no key named id in view_kwargs, you can use these two options:
  • fetch_key_url (string): The name of the key whose value will be fetched from view_kwargs and used to match records in the database model to get the resource.
  • fetch_key_model (string): The name of the matching column in the database model for fetch_key_url; the value fetched from view_kwargs will be matched against the column named by fetch_key_model.
    In case no record is found in the model, the permission manager will throw a NotFound 404 error.
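Putting several of these options together, a resource could be protected roughly like this (a hedged sketch that simply combines the options above; the permission name, the StripeAuthorization model and the key names mirror the earlier examples and may differ in the actual code base):

# Check 'is_coorganizer' only for PATCH and DELETE requests.
# event_id is taken from view_kwargs if present; otherwise a single
# StripeAuthorization record is fetched by matching view_kwargs['stripe_id']
# against its 'id' column, and its event_id is passed on as 'event_id'.
decorators = (api.has_permission(
    'is_coorganizer',
    method="PATCH,DELETE",
    fetch="event_id",
    fetch_as="event_id",
    model=StripeAuthorization,
    fetch_key_url="stripe_id",
    fetch_key_model="id"),)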

A helper for permissions

The next big thing in the permission manager is the addition of the new helper function has_access:

def has_access(access_level, **kwargs):
    if access_level in permissions:
        auth = permissions[access_level](lambda *a, **b: True, (), {}, (), **kwargs)
        if type(auth) is bool and auth is True:
            return True
    return False

This method allows you to check a permission in the middle of any method of any view of any resource. Just provide the name of the permission as the first parameter and the additional options needed by the permission function as the kwargs values.
It does not throw any exception; it just returns a boolean value, so take care of throwing any exception yourself.
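For instance, inside a resource method it could be used roughly like this (a sketch; the method, the field names and the ForbiddenException class are illustrative placeholders, not necessarily the real ones):

# Manually guarding a code path in the middle of a view method
def before_update_object(self, order, data, view_kwargs):
    if not has_access('is_coorganizer', event_id=order.event_id):
        # has_access only returns a boolean, so raise the error ourselves
        raise ForbiddenException({'source': ''}, 'Co-organizer access is required.')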

Anything to improve on?

I will not call this exactly an improvement, but I would really like to make adding permissions more meaningful and interesting. Maybe something like this:

permission = "Must be co_organizer OR track_organizer, fetch event_id as event_id, use model Event"

This clearly needs time to build, but I see it as an interesting way to add permissions: just provide meaningful text and leave the rest to the permission manager.


Download SUSI.AI Setting Files from Data Folder

In this blog, I will discuss how the DownloadDataSettings servlet hosted on the SUSI server functions. This post also covers a step-by-step demonstration of how to use this feature if you have hosted your own custom SUSI server and have admin rights to it. Given below is the endpoint where the request to download a particular file has to be made.

/data/settings

For systematic functionality and workflow, users with an admin login are given special access. This allows them to download the settings files and go through them easily when needed. There are various files which hold the email ids of registered users (accounting.json), the user roles associated with them (authorization.json), the groups they are a part of (groups.json), etc. To list all the files in the folder, use the endpoint given below:

/aaa/listSettings.json

How does the above servlet work? Before that, let us see how to get admin rights on your custom SUSI.AI server.
For admin login, it is required that you have access to the files and folders on the server. Sign up with an account and browse to

/data/settings/authorization.json

Find the email id with which you signed up for admin login and change its userRole to "admin". For example,

{
	"email:test@test.com": {
		"permissions": {},
		"userRole": "user"
	}
}

If you have signed up with an email id "test@test.com" and want to give admin access to it, modify the userRole to "admin". See below.

{
	"email:test@test.com": {
		"permissions": {},
		"userRole": "admin"
	}
}

Until now, the server did not have any email id with an admin login or a user role equal to admin; hence, this exercise is required only for the first admin. Later admins can use the changeUserRole application to give/change/modify user roles for any of the registered users. By now you should have an admin login session. Let's now see how the download and file listing servlets work.
First, the server creates a path by locally referencing the settings folder with the help of DAO.data_dir.getPath(). This gives a string path to the data directory containing all the data/settings files. Now the server just has to make a JSONArray by passing a String array to the JSONArray constructor, which will eventually contain the names of all the data/settings files. If the process is not successful, then "accepted" = false will be sent as an error to the user. The base user role to access the servlet is ADMIN, as only admins are allowed to download data/settings files.
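A rough sketch of that listing logic (illustrative, not the exact server code) could look like this:

// Build a JSONArray of the file names inside data/settings
File settingsDir = new File(DAO.data_dir.getPath() + "/settings");
String[] fileNames = settingsDir.list();
JSONObject result = new JSONObject();
if (fileNames != null) {
    result.put("accepted", true);
    result.put("files", new JSONArray(fileNames));
} else {
    result.put("accepted", false);
    result.put("message", "Could not list the settings files");
}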
The name of the file which you have to download has to be sent in an HTTP request as a GET parameter. For example, if an admin has to download accounting.json to get the list of all the registered users, the request is made in the following way:

BASE_URL+/data/settings?file=file_name

*BASE_URL is the URL where the server is hosted. For the standard server, use BASE_URL = http://api.susi.ai.

In the initial steps, the server generates a path to the data/settings folder and finds the file whose name it receives in the request. If no filename is specified in the API call, the server sends the accounting.json file by default.

File settings = new File(DAO.data_dir.getPath()+"/settings");
String filePath = settings.getPath(); 
String fileName = post.get("file","accounting"); 
filePath += "/"+fileName+".json";

Next, the server extracts the file and, using the ServletOutputStream class, generates the response properties and sets the appropriate context for it. This context, in turn, fetches the mime type for the file. If the mime type is returned as null, the mime type for the file is set to application/octet-stream by default. For more information on mime types, please look at the following link. A complete list of mime types is compiled and documented here.
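The mime-type lookup itself can be sketched like this (assuming the code runs inside the servlet, so getServletContext() is available):

// Resolve the mime type through the servlet context, falling back to a generic binary type
String mimetype = getServletContext().getMimeType(filePath);
if (mimetype == null) {
    mimetype = "application/octet-stream";
}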

response.setContentType(mimetype);
response.setContentLength((int)file.length());

In the above code snippet, the mime type and length of the file being downloaded are set. Next, we set the headers for the download response using the filename.

response.setHeader("Content-Disposition", "attachment; filename=" + fileName +".json");

All the manual work is done by now. The only thing left is to open a buffered stream, the size of which has been defined as a class variable.
Here we use a byte array of 4096 elements and write the file to the client's default download location.

private static final int BUFSIZE = 4096;

byte[] byteBuffer = new byte[BUFSIZE];
int length;
DataInputStream in = new DataInputStream(new FileInputStream(file));
while ((in != null) && ((length = in.read(byteBuffer)) != -1)) {
    outStream.write(byteBuffer, 0, length);
}

in.close();
outStream.close();

All the above-mentioned steps are enclosed in a try-catch block, which catches an exception, if any, and logs it in the log file. This message is also sent to the client for appropriate user information, along with a success or failure indication through a boolean flag. Do not forget to close the input and output buffers, as leaving them open may lead to memory leaks, and someone with proper knowledge of network and buffer streams could be able to steal essential or secured data.
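As a side note, the same copy loop can be written with try-with-resources (Java 7+), which closes both streams automatically even if an exception is thrown; a hedged sketch:

// Stream the file to the client and let try-with-resources close everything
try (DataInputStream in = new DataInputStream(new FileInputStream(file));
     ServletOutputStream outStream = response.getOutputStream()) {
    byte[] byteBuffer = new byte[BUFSIZE];
    int length;
    while ((length = in.read(byteBuffer)) != -1) {
        outStream.write(byteBuffer, 0, length);
    }
}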

Additional Resources


Generating the Mozilla All Hands Open Event Android App

The main aim of the FOSSASIA Open Event Android App is to give an event organiser the ability to generate an app through a single click by providing the necessary JSON and binary files. The app was tested with the Mozilla All Hands 2017 event, an annual conference held for Mozilla employees to showcase their experience and also interact with each other. The sample files for the event can be found here (https://github.com/fossasia/open-event/tree/master/sample/MozillaAllHands17). The data for the event was taken from here (https://sanfranciscoallhandsjune2017.sched.com/).

What was required for building the sample for Mozilla All Hands 2017 event?

  • images folder containing the necessary speaker images (with support for local images too), the logo of the event and the background image of the event.
  • event json file which has all the event specific information like the name of the event, the schedule of the event, the description of the event etc.
  • forms json file having session and speaker form data.
  • meta json file having the root url of the event.
  • microlocations json file having all the locations where the sessions are going to happen.
  • session_types json file consisting of data on all the types of session which will occur in the event (the length of the session, the name of the type and the id).
  • sessions json file consisting of session-specific data like the title of the session, the start and end times of the session, which track the session belongs to, etc.
  • speakers json file consisting of speaker-specific data like the name of the speaker, the image of the speaker, social links of the speaker, etc.
  • sponsors json file consisting of the list of all sponsors of the event.
  • tracks json file consisting of track-specific data.
  • config.json file which consists of the API url and the app name.

After generating the required zip containing the above JSON files, the zip is uploaded to this site (http://droidgen.eventyay.com/) and we get an apk after filling in the required information. This means any event organiser is able to generate their app simply by providing the necessary information about the event.

The sample can be found here:

Folder Link: https://github.com/fossasia/open-event/tree/master/sample/MozillaAllHands17

Zip Link: https://github.com/fossasia/open-event/tree/master/sample/MozillaAllHands17.zip

What did the Mozilla All Hands 2017 sample app look like?

The screenshots of the sample apk can be seen below:

 

 

What were the issues found in the sample app?

Compared to previous samples like the Google IO Open Event Android App, some issues such as support for local speaker images and the background issue of the logo were solved. However, there were certain issues which we observed on testing the app with the Mozilla All Hands 2017 event:

  1. The theme of the app remains the same no matter which event it is. It is important to give the event organiser the ability to customise the theme of the app.
  2. Certain information in the app like the event information is hard-coded and needs to be taken from the assets folder instead of strings.xml.

Related Links


Using Jackson Library in Open Event Android App for JSON Parsing

Jackson is a popular library for mapping Java objects to JSON and vice versa, and has better parsing speed compared to other popular parsing libraries like GSON, JSON-P, etc. This blog post gives a basic explanation of how we utilised the Jackson library in the Open Event Android App.

To use the Jackson library we need to add the dependency for it in the app/build.gradle file.

compile 'com.squareup.retrofit2:converter-jackson:2.2.0'

After updating the app/build.gradle file with the line above, we were able to use the Jackson library in our project.
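That artifact is the Jackson converter for Retrofit, so the mapping is wired in when the Retrofit instance is built, roughly like this (a sketch; the base URL is illustrative):

// Plug the Jackson converter into Retrofit so JSON responses are mapped to POJOs automatically
Retrofit retrofit = new Retrofit.Builder()
        .baseUrl("https://example.com/api/v1/")
        .addConverterFactory(JacksonConverterFactory.create())
        .build();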

Example of Jackson Library JSON Parsing

To explain how the mapping is done, let us take an example. The snippet given below is from the sponsors.json file of the Android app.

[
    {
        "description": "",
        "id": 1,
        "level": 3,
        "name": "KI Group",
        "type": "Gold",
        "url": "",
        "logo-url": ""
    }
]

Each sponsor entry in sponsors.json consists of 7 attributes, namely description, id, level, name, type, url and logo-url, describing one sponsor of the event. The Sponsors.java class for converting the JSON response to a Java POJO was written as follows, utilizing the Jackson library:

public class Sponsors extends RealmObject {
    @JsonProperty("id")
    private int id;

    @JsonProperty("type")
    private String type;

    @JsonProperty("description")
    private String description;

    @JsonProperty("level")
    private String level;

    @JsonProperty("name")
    private String name;

    @JsonProperty("url")
    private String url;

    @JsonProperty("logo-url")
    private String logoUrl;
}

As we can see from the above code snippet, the JSON response is converted to Java POJO objects simply by using the @JsonProperty("...") annotation, which does the work for us.

Another feature which makes this library great is the setter and getter annotations, which let us map a single variable to two different JSON attributes if need be. We faced this situation when we were moving from the old API to the new JSON API, where we wanted support for both the old and the new JSON attributes. We simply used the following code snippet, which made our transition to the new API easier.

public class Sponsors extends RealmObject {
    private int id;
    private String type;
    private String description;
    private String level;
    private String name;
    private String url;
    private String logoUrl;

    // Attribute name used by the old API
    @JsonSetter("logo")
    public void setLogo(String logoUrl) {
        this.logoUrl = logoUrl;
    }

    // Attribute name used by the new API (the method names must differ for the class to compile)
    @JsonSetter("logo-url")
    public void setLogoUrl(String logoUrl) {
        this.logoUrl = logoUrl;
    }
}

As we can see, the setter annotations allow mapping multiple JSON attributes to a single variable if need be, thus making the code easily adaptable with less overhead.

Related Links:


Using Flask SocketIO Library in the Apk Generator of the Open Event Android App

Recently, the Flask-SocketIO library was used in the apk generator of the Open Event Android App, as it gives access to low-latency bi-directional communication between the client and the server side. The client side of the apk generator is written in JavaScript, which let us use the official Socket.IO client library to establish a permanent connection to the server.

The main purpose of using the library was to display logs to the user while the app generation process goes on. This gives the user additional help to check what the mistake in the uploaded file is in case an error occurs. The library establishes a connection between the server and the client so that, during the app generation process, the server can send real-time logs to the client, where they are displayed to the user in the frontend.

To use this library, we first need to install it using the pip command:

pip install flask-socketio

This library depends on an asynchronous service, which can be selected from the options listed below:

  1. Eventlet
  2. Gevent
  3. Flask development server based on Werkzeug

Amongst the three listed, eventlet is the best-performing option, as it has support for both long-polling and WebSocket transports.

The next thing was importing this library into the Flask application, i.e. the apk generator, as follows:

from flask_socketio import SocketIO

current_app = create_app()
socketio = SocketIO(current_app)

if __name__ == '__main__':
    socketio.run(current_app)

The main use of the above statement (socketio.run(current_app)) is that it encapsulates the startup of the web server. When the application is run in debug mode, it is preferable to use the Werkzeug development server. However, in production mode the use of the eventlet web server or the gevent web server is recommended.
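By default Flask-SocketIO picks the best asynchronous mode that is installed, but it can also be selected explicitly; a small sketch assuming eventlet has been installed via pip:

# Force the eventlet server instead of relying on auto-detection
socketio = SocketIO(current_app, async_mode='eventlet')
socketio.run(current_app)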

We wanted to show the status messages currently shown to the user in the form of logs. For this, we first looked at the generator.py file, which had the status messages, and sent these to the client side of the generator over the connection established using this library. The code written on the server side to send messages to the client side was as follows:

def handle_message(self, message):
    if message is not None:
        namespace = "/" + self.identifier
        send(message, namespace=namespace)

As we can see from the above code, the message has to be sent to a particular namespace. The main idea behind using a namespace is that if more than one user is using the generator at the same time, the send() method would otherwise send the messages to all connected clients, which would lead to the status messages getting mixed up; hence it was important to use a namespace so that a message is sent only to that particular client.

Once the server sent the messages to the client, we needed to add functionality to our client side to receive them and update the logs area with the content received. For that, the socket connection is established once the generate button is clicked, which generates the identifier that is used as the namespace for that process.

socket = io.connect(location.protocol + "//" + location.host + "/" + identifier, {reconnection: false});

This piece of code establishes the socket connection and helps the client side receive and send messages from and to the server end.

The next thing was receiving messages from the server. For this, the following code snippet was added:

socket.on('message', function(message) {
    $buildLog.append(message);
});

This piece of code receives messages from the server for unnamed events. Once the connection is established and messages are received, the logs are updated by appending each message, showing the user real-time information about the build status of their app.

This is how showing logs in the apk generator was implemented, by making the required changes in the server- and client-side code using the Flask-SocketIO library.

Related Links:


Writing Tests for ISO8601Date.java class of the Open Event Android App

The ISO8601Date.java class of the Open Event Android App is a util class written to perform the date manipulation functions and make the code base simpler and more deterministic. However, it was equally important to test the results from this util class to ensure it returned what we wanted. A test class named "DateTest.java" was written to ensure all the edge cases of converting date strings from one timezone to another were handled properly.

For writing unit tests, we first needed to add the JUnit dependency in the app's build.gradle file, as shown below:

dependencies {
  testCompile 'junit:junit:4.12'
}

Then a JUnit 4 test class, a Java class containing the required test methods, was created. The structure of the class looked like this:

public class DateTest {
   @Test
   public void methodName() {
   }
}

The next step was including all the required methods to ensure the util class returned the correct results according to our needs. Various edge cases were taken into account by including functions such as converting a date string from the local timezone to a specified timezone, from an international timezone to the local timezone, from the local timezone to an international timezone, and many more. Some of the methods which were added to the class are shown below:

  • Test of conversion from local timezone to specified timezone:

This function ensures the util class works well with date conversion from the local timezone to a specified timezone. In the example shown below, the conversion of a date string is tested from the UTC timezone to the Singapore timezone.

@Test
public void shouldConvertLocalTimeZoneDateStringToSpecifiedTimeZoneDateString() {
    ISO8601Date.setTimeZone("UTC");
    ISO8601Date.setEventTimeZone("Asia/Singapore");

    String dateString1 = "2017-06-02T07:59:10Z";
    String actualString = ISO8601Date.getTimeZoneDateStringFromString(dateString1);
    String expectedString = "Thu, 01 Jun 2017, 23:59, UTC";
    Assert.assertEquals(expectedString, actualString);
}
  • Test of conversion from local timezone to international timezone:

This function ensures the util class works well with date conversion from the local timezone to an international timezone. In the example shown below, the conversion of a date string is tested from the Amsterdam timezone to the Singapore timezone.

@Test
public void shouldConvertLocalTimeZoneDateStringToInternationalTimeZoneDateString() {
    ISO8601Date.setTimeZone("Europe/Amsterdam");
    ISO8601Date.setEventTimeZone("Asia/Singapore");

    String dateString = "2017-06-02T02:29:10Z";
    String actualString = ISO8601Date.getTimeZoneDateStringFromString(dateString);
    String expectedString = "Thu, 01 Jun 2017, 20:29, GMT+02:00";
    Assert.assertEquals(expectedString, actualString);
}


Above are some of the functions added to ensure the conversion of a date string from one timezone to another is correct, and thus that the util class works properly and returns the results as required.

The last thing left was running the test to check the results the util class returned. For this we had to do two things:

  1. Sync the project with Gradle.
  2. Run the test by right clicking on the class and selecting “Run” option.

Through this, we were able to run the tests and check the output of the util class on different cases through the results shown in Android Studio.
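The same unit tests can also be run from the command line using Gradle's standard test task:

./gradlew test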

Related Links:

  1. This link is about building effective unit tests in android. (https://developer.android.com/training/testing/unit-testing/index.html)
  2. This link is about the unit testing on date processing. (https://stackoverflow.com/questions/565289/unit-testing-code-that-does-date-processing-based-on-todays-date)

Adding Sentry Integration in Open Event Orga Android App

Sentry is a service that allows you to track events, issues and crashes in your apps and provide deep insights with context about them. This blog post will discuss how we implemented it in Open Event Orga App (Github Repo).

Configuration

First, we need to include the gradle dependency in build.gradle:

compile 'io.sentry:sentry-android:1.3.0'

Now, our project uses proguard for release builds, which obfuscates the code and removes unnecessary classes to shrink the app. For the crash events to make sense in the Sentry dashboard, the proguard mappings need to be uploaded every time a release build is generated. Thankfully, this is handled automatically by Sentry through its gradle plugin, so to include it, we add this in our project-level build.gradle in the dependencies block:

classpath 'io.sentry:sentry-android-gradle-plugin:1.3.0'

 

And then apply the plugin by writing this at the top of our app/build.gradle:

apply plugin: 'io.sentry.android.gradle'

 

And then configure the options for automatic proguard configuration and mappings upload

sentry {
   // Disables or enables the automatic configuration of proguard
   // for Sentry.  This injects a default config for proguard so
   // you don't need to do it manually.
   autoProguardConfig true

   // Enables or disables the automatic upload of mapping files
   // during a build.  If you disable this you'll need to manually
   // upload the mapping files with sentry-cli when you do a release.
   autoUpload false
}

 

We have set autoUpload to false, as we want Sentry to be an optional dependency of the project. If we turn it on, the build will fail if Sentry can't find the configuration, which we don't want to happen.

Now, as we want Sentry to be configurable, we need to set the Sentry DSN as one of the configuration options. The easiest way to externalize configuration is to use environment variables. There are other methods to do it, given in the official configuration documentation: https://docs.sentry.io/clients/java/config/
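For example, the Java client can pick the DSN up from a SENTRY_DSN environment variable (a system property or a sentry.properties entry works as well), so one hedged way to supply it is:

# The exact DSN value comes from the project settings on your Sentry instance
export SENTRY_DSN=https://publicKey:secretKey@your-sentry-host/project-id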

Lastly, for proguard configuration, we also need 3 other config options, namely:

defaults.project=your-project
defaults.org=your-organisation
auth.token=your-auth-token

 

For getting the auth token, you need to go to https://sentry.io/api/

Now the configuration is complete, and we'll move on to the code.

Implementation

First, we need to initialise the Sentry instance for all further actions to be valid. This is to be done when the app starts, so we add it in the onCreate method of the Application class of our project by calling this method:

// Sentry DSN must be defined as environment variable
// https://docs.sentry.io/clients/java/config/#setting-the-dsn-data-source-name
Sentry.init(new AndroidSentryClientFactory(getApplicationContext()));

 

Now, we’re all set to send crash reports and other events to our Sentry server. This would have required a lot of refactoring if we didn’t use Timber for logging. We are using default debug tree for debug build and a custom Timber tree for release builds.

if (BuildConfig.DEBUG)
   Timber.plant(new Timber.DebugTree());
else
   Timber.plant(new ReleaseLogTree());

 

The ReleaseLogTree extends Timber.Tree which is an abstract class requiring you to override this function:

@Override
protected void log(int priority, String tag, String message, Throwable throwable) {

 }

 

This function is called whenever there is a log event through Timber and this is where we send reports through Sentry. First, we return from the function if the event priority is debug or verbose

if(priority == Log.DEBUG || priority == Log.VERBOSE)
   return;

 

If the event is of info priority, we attach it to the Sentry breadcrumbs:

if (priority == Log.INFO) {
    Sentry.getContext().recordBreadcrumb(new BreadcrumbBuilder()
          .setMessage(message)
          .build());
}

 

Breadcrumbs are stored and only sent along with an event. An event, for us, is a crash or something we want logged to the dashboard whenever the user does it. Since info events are just user interactions throughout the app, we don't want to crowd the issue dashboard with them. However, we do want to understand what the user was doing before the crash happened, and that is why we use breadcrumbs to store these events and only send them attached to a crash event. Also, only the last 100 breadcrumbs are stored, making it easier to parse through them.

Now, if there is an error event, we want to capture and send it to the server

if (priority == Log.ERROR) {
   if (throwable == null)
       Sentry.capture(message);
   else
       Sentry.capture(throwable);
}

 

Lastly, we want to set Sentry context to be user specific so that we can easily track and filter through issues based on the user. For that, we create a new class ContextManager with two methods:

  • setOrganiser: to be called at login
  • clearOrganiser: to be called at logout

public void setOrganiser(User user) {
   Map<String, Object> userData = new HashMap<>();
   userData.put("details", user.getUserDetail());
   userData.put("last_access_time", user.getLastAccessTime());
   userData.put("sign_up_time", user.getSignupTime());

   Timber.i("User logged in - %s", user);
   Sentry.getContext().setUser(
       new UserBuilder()
       .setEmail(user.getEmail())
       .setId(String.valueOf(user.getId()))
       .setData(userData)
       .build()
   );
}

 

In this method, we put all the information about the user in the context so that every action from here on is attached to this user.

public void clearOrganiser() {
   Sentry.clearContext();
}

 

And here, we just clear the sentry context.

This concludes the implementation of our Sentry client. Now all Timber log events will go through Sentry, and the appropriate events will appear on the Sentry dashboard. To read more about Sentry features and Timber, visit these links:

Sentry Java Documentation (check Android section)

https://docs.sentry.io/clients/java/

Timber Library

https://github.com/JakeWharton/timber


Implementing Attendee Detail BottomSheet UI in Open Event Orga App

In the Open Event Orga App (Github Repo), we allow the option to check the attendee details before checking him/her in or out. Originally, a dialog was shown with the attendee details, which did not contain much information about the attendee, ticket or order. The disadvantage of such a design was also that it was tied to only one view. We couldn't show the check-in dialog elsewhere in the app, like during QR scanning, so we had to switch back to the attendee view to show the check-in dialog. We decided to create a reusable detached component in the form of a bottom sheet containing all the required information. This blog will outline the procedure we employed to design the bottom sheet UI.

The attendee check in dialog looked like this:

So, first we decide what we need to show on the check in bottom sheet:

  • Attendee Name
  • Attendee Email
  • Attendee Check In Status
  • Order Status ( Completed, Pending, etc )
  • Ticket Type ( Free, Paid, Donation )
  • Ticket Price
  • Order Date
  • Invoice Number
  • Order ‘Paid Via’

As we are using Android Data Binding in our layout, we'll start by including the required variables in the layout. Besides the obvious attendee variable, we need a presenter instance to handle the check-in and check-out of the attendee and the DateUtils class to parse the order date. Additionally, to handle the visibility of views, we need to import the View class too.

<data>
   <import type="org.fossasia.openevent.app.utils.DateUtils" />
   <import type="android.view.View" />

   <variable
       name="presenter"
       type="org.fossasia.openevent.app.event.checkin.contract.IAttendeeCheckInPresenter" />

   <variable
       name="checkinAttendee"
       type="org.fossasia.openevent.app.data.models.Attendee" />
</data>

 

Then, we make the root layout a CoordinatorLayout and add a NestedScrollView inside it, which contains a vertical LinearLayout. This vertical LinearLayout will contain our fields.

Note: For brevity, I’ll skip most of the layout attributes from the blog and only show the ones that correspond to the text

Firstly, we show the attendee name:

<TextView
   style="@style/TextAppearance.AppCompat.Headline"
   android:text='@{ checkinAttendee.firstName + " " + checkinAttendee.lastName }'
   tools:text="Name" />

 

The perks of using data binding can be seen here, as we are using string concatenation in the layout itself. Furthermore, data binding can also handle null checks for us if we add a question mark at the end of the variable name ( checkinAttendee.firstName? ).

But our server ensures that both these fields are not null, so we skip that part.

Next up, we display the attendee email

<TextView
   android:text="@{ checkinAttendee.email }"
   tools:text="xyz@example.com" />

 

And then the check in status of the attendee

<TextView
   android:text="@{ checkinAttendee.checkedIn ? @string/checked_in : @string/checked_out }"
   android:textColor="@{ checkinAttendee.checkedIn ? @color/light_green_500 : @color/red_500 }"
   tools:text="CHECKED IN" />

 

Notice that we dynamically change the color and text based on the check in status of the attendee

Now we begin showing the fields with icons to their left. You could use a compound drawable to achieve this effect, but we use vector drawables, which are incompatible with compound drawables on older versions of Android, so we use a horizontal LinearLayout instead.

The first field is the order status, denoting whether the order is completed or in a transient state:

<LinearLayout android:orientation="horizontal">

   <ImageView app:srcCompat="@drawable/ic_transfer" />
   <TextView android:text="@{ checkinAttendee.order.status }" />
</LinearLayout>

 

Now, again for keeping the snippets relevant, I’ll skip the icon portion and only show the text binding from now on.

Next, we include the type of ticket the attendee has. There are 3 types of ticket supported in the Open Event API: free, paid and donation.

<TextView
   android:text="@{ checkinAttendee.ticket.type }"  />

 

Next, we want to show the price of the ticket, but only when the ticket is of paid type.

I’ll include the previously omitted LinearLayout part in this snippet because it is the view we control to hide or show the field

<LinearLayout
   android:visibility='@{ checkinAttendee.ticket.type.equalsIgnoreCase("paid") ? View.VISIBLE : View.GONE }'>

   <ImageView app:srcCompat="@drawable/ic_coin" />
   <TextView
       android:text='@{ "$" + checkinAttendee.ticket.price }'
       tools:text="3.78" />
</LinearLayout>

 

As you can see, we are showing this layout only if the ticket type equals paid

The next part is about showing the date on which the order took place

<TextView
   android:text="@{ DateUtils.formatDateWithDefault(DateUtils.FORMAT_DAY_COMPLETE, checkinAttendee.order.completedAt) }" />

 

Here we are using our internal DateUtils method to format the ISO 8601 date present in the order object into a complete date-time string.

Now, we show the invoice number of the order

<TextView
   android:text="@{ checkinAttendee.order.invoiceNumber }" />

 

Lastly, we want to show what the ticket was paid via:

<LinearLayout
   android:visibility='@{ checkinAttendee.order.paidVia.equalsIgnoreCase("free") ? View.GONE : View.VISIBLE }'>

   <ImageView app:srcCompat="@drawable/ic_ray" />
   <TextView  android:text="@{ checkinAttendee.order.paidVia }" />
</LinearLayout>

 

Notice that here too we are controlling the visibility of the layout container and only showing it if the ticket type is paid

This ends our vertical LinearLayout showing the attendee detail fields. Now, we add a floating action button to toggle the check-in status of the attendee:

<FrameLayout
   android:layout_gravity="top|end">

   <android.support.design.widget.FloatingActionButton
       android:layout_gravity="center"
       android:onClick="@{() -> presenter.toggleCheckIn() }"
       app:backgroundTint="@{ checkinAttendee.checkedIn ? @color/red_500 : @color/light_green_500 }"
       app:srcCompat="@{ checkinAttendee.checkedIn ? @drawable/ic_checkout : @drawable/ic_checkin }"
       app:tint="@android:color/white" />

   <ProgressBar
       android:layout_gravity="center" />

</FrameLayout>

 

We have used a FrameLayout to wrap the FAB and progress bar together at the top end of the bottom sheet. The progress bar shows the indeterminate progress of toggling the attendee status. You can see the click binding on the FAB triggering the presenter method toggleCheckIn(), and how the background color and icon change according to the check-in status of the attendee.

This wraps up our layout design. Now we just have to create a BottomSheetDialogFragment, inflate this layout in it, bind the attendee variable, and we are all set.
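Here is a minimal sketch of such a fragment (hedged: the class name, layout file and generated binding class are illustrative placeholders, not the app's actual ones):

import android.databinding.DataBindingUtil;
import android.os.Bundle;
import android.support.design.widget.BottomSheetDialogFragment;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;

public class AttendeeCheckInFragment extends BottomSheetDialogFragment {

    private Attendee checkinAttendee;

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        // Inflate the layout described above through the data binding utility
        BottomsheetAttendeeCheckInBinding binding = DataBindingUtil.inflate(
                inflater, R.layout.bottomsheet_attendee_check_in, container, false);

        // Bind the variables declared in the <data> block of the layout
        binding.setCheckinAttendee(checkinAttendee);
        return binding.getRoot();
    }
}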

To learn more about bottom sheets and Android data binding, please refer to these links:


Invalidating user login using JWT in Open Event Orga App

User authentication is an essential part of the Open Event Orga App (Github Repo), which allows an organizer to log in and perform actions on the events he/she organizes. The backend for the application, Open Event Orga Server, sends an authentication token on successful login, and all subsequent privileged API requests must include this token. The token is a JWT (JSON Web Token), which includes certain information about the user, such as an identifier, information about when the token becomes valid and when it expires, and a signature to verify if it was tampered with.

Parsing the Token

Our job was to parse the token to find two fields:

  • Identifier of user
  • Expiry time of the token

We stored the token in our shared preference file and loaded it from there for any subsequent requests. But the token expires after 24 hours, and we needed our login model to clear it once it has expired and show the login activity instead.

To do this, we needed to parse the JWT, compare the timestamp stored in the exp field with the current timestamp, and determine if the token has expired. The first step in the process was to parse the token, which is essentially a set of Base64-encoded JSON strings in sections separated by periods. The sections are as follows:

  • Header ( Contains information about algorithm used to encode JWT, etc )
  • Payload ( The data in the JWT – exp, iat, nbf, identity, etc )
  • Signature ( Verification signature of JWT )

We were interested in the payload. For getting the JSON string from the token, we could have used Android's Base64 class to decode it, but we wanted to unit test all the util functions, and that is why we opted for a custom Base64 class used only for decoding our token.

So, first we split the token at the periods, decoded each part and stored it in a SparseArrayCompat:

public static SparseArrayCompat<String> decode(String token) {
   SparseArrayCompat<String> decoded = new SparseArrayCompat<>(2);

   String[] split = token.split("\\.");
   decoded.append(0, getJson(split[0]));
   decoded.append(1, getJson(split[1]));

   return decoded;
}

 

The getJson function is primarily decoding the Base64 string

private static String getJson(String strEncoded) {
   byte[] decodedBytes = Base64Utils.decode(strEncoded);
   return new String(decodedBytes);
}

The decoded information was stored in this way

0={"alg":"HS256","typ":"JWT"},  1={"nbf":1495745400,"iat":1495745400,"exp":1495745800,"identity":344}

Extracting Information

Next, we create a function to get the expiry timestamp from the token. We could have used GSON or Jackson for the task, but we did not want to map the fields into any object, so we simply used the JSONObject class which Android provides. It took 5 ms on average to parse the JSON, instead of the 150 ms taken by GSON.

public static long getExpiry(String token) throws JSONException {
   SparseArrayCompat<String> decoded = decode(token);

   // We are using JSONObject instead of GSON as it takes about 5 ms instead of 150 ms taken by GSON
   return Long.parseLong(new JSONObject(decoded.get(1)).get("exp").toString());
}

 

Next, we wanted to get the ID of the user from the token to determine whether a new user or an old one is logging in, so that we can clear the database for a new user.

public static int getIdentity(String token) throws JSONException {
   SparseArrayCompat<String> decoded = decode(token);

   return Integer.parseInt(new JSONObject(decoded.get(1)).get("identity").toString());
}

Validating the token

After this, we needed to create a function that tells whether a stored token has expired. With all the right functions in place, it was just a matter of comparing the current time with the stored timestamp:

public static boolean isExpired(String token) {
   long expiry;

   try {
       expiry = getExpiry(token);
   } catch (JSONException jse) {
       return true;
   }

   return System.currentTimeMillis()/1000 >= expiry;
}

 

Since the token provides the timestamp from the epoch in seconds, we needed to divide the current time in milliseconds by 1000; the function returns true if the current timestamp is greater than or equal to the expiry time of the token.

After writing a few unit tests for both functions, we just needed to plug them into our login model at the time of authentication.

When the application starts, we use this function to check if a user is logged in or not:

public boolean isLoggedIn() {
   String token = utilModel.getToken();

   return token != null && !JWTUtils.isExpired(token);
}

 

So, if there is no token or the token has expired, we do not automatically log the user in, and show the login screen instead.

Implementing login

The next tasks were:

  • Request the server to log in
  • Store the acquired token
  • Delete the database if it is a new user

Before implementing the above logic, we needed a function to determine if the person logging in is the previous user or a new one. To do so, we first load the saved user from our database; if the query is empty, it is surely a new user logging in, so we return false. If there is a user in the database, we match its ID with the logged-in user's ID:

public Single<Boolean> isPreviousUser(String token) {
   return databaseRepository.getAllItems(User.class)
       .first(EMPTY)
       .map(user -> !user.equals(EMPTY) && user.getId() == JWTUtils.getIdentity(token));
}

 

We have added a default user EMPTY in the first operator so that RxJava returns it if there are no users in the database, and then we simply map the user to a boolean denoting whether they are the same or different, using the EMPTY user and the getIdentity method from JWTUtils.

Finally, we use all this information to implement our self-contained login request:

eventService
   .login(new Login(username, password))
   .flatMapSingle(loginResponse -> {
       String token = loginResponse.getAccessToken();
       utilModel.saveToken(token);

       return isPreviousUser(token);
   })
   .flatMapCompletable(isPrevious -> {
       if (!isPrevious)
           return utilModel.deleteDatabase();

       return Completable.complete();
   });

 

Let's see what is happening here. A request using the username and password is made to the server, which returns a login response containing a JWT that we store for future use. Next, we use flatMapSingle with the Single returned by the isPreviousUser method, and we finally clear the database if it is not a previous user.

Creating these self-contained models helps reduce complexity in the presenter and view layers, and all data is handled in one layer, making the presenter layer model-agnostic.

To learn more about JWT and some of the Rx operators I mentioned here, please visit these links:


The Pocket Science Lab: Who Needs it, and Why

Science and technology share a symbiotic relationship. The degree of success of experimentation is largely dependent on the accuracy and flexibility of instrumentation tools at the disposal of the scientist, and the subsequent findings in fundamental sciences drive innovation in technology itself. In addition to this, knowledge must be free as in freedom. That is, all information towards constructing such tools and using them must be freely accessible for the next generation of citizen scientists. A common platform towards sharing results can also be considered in the path to building a better open knowledge network.

But before we get to scientists, we need to consider the talent pool in the student community that gave rise to successful scientists, and the potential talent pool that lost out on the opportunity to better contribute to society because of an inadequate support system. And this brings us to the Pocket Science Lab.

How can PSLab help electronics engineers & students?

This device packs a variety of fundamental instruments into one handy package, with a bill of materials that costs orders of magnitude less than a distributed set of traditional instruments.

It does not claim to be as good as a gigasamples-per-second oscilloscope or a 22-bit multimeter, but it has the potential to offer a greater learning experience. Here's how:

  • A fresh perspective to characterize the real world. The visualization tools that can be coded on an Android device/Desktop (3D surface plots, waterfall charts, thermal distributions etc ), are far more advanced than what one can expect from a reasonably priced oscilloscope. If the same needs to be achieved with an ordinary scope, a certain level of technical expertise is expected from the user who must interface the oscilloscope with a computer, and write their own acquisition & visualization app.
  • Reduce the entry barrier for advanced experiments.: All the tools are tightly integrated in a cost-effective package, and even the average undergrad student that has been instructed to walk on eggshells around a conventional scope, can now perform elaborate data acquisition tasks such as plotting the resonant frequency of a tuning fork as a function of the relative humidity/temperature. The companion app is being designed to offer varying levels of flexibility as demanded by the target audience.

  • Is there a doctor in the house? With the feature set available in the PSLab, most common electronic components can be easily studied, saving hours while prototyping new designs. Components such as resistors, capacitors, diodes, transistors, op-amps, LEDs, buffers, etc. can be tested.

How can PSLab help science enthusiasts?

Physicists, chemists and biologists in the applied fields are mostly dependent on instrument vendors for their measurement gear. The lack of an electronics/technical background hinders their ability to improve the gear at their disposal, and this is why a gauss meter, which is basically a magnetometer coupled with a crude display in an oversized box with an unnecessarily huge transformer, can easily cost upwards of $150. The PSLab does not ask the user to be an electronics/robotics expert, but helps them get straight to the acquisition part. It takes care of the communication protocols and calibration requirements, and also handles visualization via attractive plots.

A physicist might not know what I2C is, but is more than qualified to interpret the data acquired from a physical sensor and characterize its accuracy.

  • The magnetometer (HMC5883L) can be used to demonstrate the dependence of the axial magnetic field on distance from the center of a solenoid
  • The pressure and temperature sensor (BMP280) can be used to verify the gas laws and to check thermodynamic phenomena against prevalent theories.

Similarly, a chemist can use an RGB sensor (TCS3200) to put the colour of a solution into numbers, and develop a colorimeter in the process. Colorimeters are quite handy for determining the molality of coloured solutions, and commercial ones are rather expensive. What it also needs is a set of LEDs with known wavelengths, and most manufacturers offer proper characterisation information.

What does it mean for the hobbyist?

It is capable of greatly speeding up the troubleshooting process. It can also instantly characterize the expected data from various sensors so that the hobbyist can code accordingly. For example, 'beyond what tilt threshold & velocity should my humanoid robot swing its arms forward in order to prevent a broken nose?' That's not a question that can be easily answered by said hobbyist, who is currently in the process of developing his/her own acquisition system.

How can we involve the community?

The PSLab features an experiment designer that speeds up acquisition by providing spreadsheets, analytical tools, and visualisation options all in one place. An option for users to upload their new experiments/utilities to the cloud and subject them to a peer-review process has been planned. Following this, the new experiments can be pumped back into the ecosystem, which will find more uses for them, improve them, and so on.

For example, a user can combine the waveform generator with an analog multiplier IC and develop a spectrum analyzer.

The case for self-reliance

The average undergraduate laboratory currently employs dedicated instruments for each experiment as prescribed by the curriculum. These instruments often only include the measurement tools essential to the experiment, and students merely repeat the procedure verbatim. That's not experimentation; it's just verification. The PSLab offers a wide array of additional instruments that can be employed by the student to enhance the experiment with their own inputs.

For example, a commonly used diode IV curve-tracer kit usually has a couple of power supplies, a voltmeter, and an ammeter. But if a student wishes to study the impact of temperature on the band gap, he/she will be hard-pressed to find the additional tools and software to combine into the acquisition process. With the PSLab, however, he/she can pick from a variety of temperature sensors (LM35, BMP180, Si7021, etc.) depending on the requirement, and explore beyond the book. Students are thus better prepared to enter research labs.

In conclusion, this project has immense potential to help create the next generation of scientists, engineers and creators.

Resources

 
