Implementing QR Code Detector in Open Event Organizer App

One of the main features of the Open Event Organizer App is scanning the QR code on an attendee's ticket to validate their entry to an event. The app uses Google's Vision API library, com.google.android.gms.vision.barcode, for QR code detection. In this blog, I talk about how to use this library to implement QR code detection with dynamic frame support in an Android app. The library uses the term barcode for all the supported formats, including QR code, so in this blog I use the term barcode for the QR code format as well.

We use Google's Dagger for dependency injection in the app, so all the barcode related dependencies are injected into the activity using Dagger. There are two key classes: BarcodeDetector and CameraSource. The basic workflow is to create a BarcodeDetector object, which handles QR code detection, add a SurfaceView in the layout which the CameraSource uses to show the preview to the user, and pass both of these to the CameraSource. Enough talk, let's look into the code while moving forward from here on. If you are not familiar with Dagger dependency injection, I strongly suggest you have a look at a tutorial introducing it.

So we have a barcode module class which takes care of creating the BarcodeDetector and the CameraSource.

@Provides
BarcodeDetector providesBarCodeDetector(Context context) {
   BarcodeDetector barcodeDetector = new BarcodeDetector.Builder(context)
       .setBarcodeFormats(Barcode.QR_CODE)
       .build();
   return barcodeDetector;
}

@Provides
CameraSource providesCameraSource(Context context, BarcodeDetector barcodeDetector) {
   return new CameraSource
       .Builder(context, barcodeDetector)
       ...
       .build();
}

 

You can see in the code that the BarcodeDetector is passed to the CameraSource builder. Now comes the preview part. The user of the app should be able to see what is actually being detected. Google has provided samples showing how to do that, including some classes that you can just add to your project: BarcodeGraphic, CameraSourcePreview, GraphicOverlay and BarcodeGraphicTracker.

CameraSourcePreview is the custom view used in the QR detecting layout for the preview. It handles all the SurfaceView related work. BarcodeGraphic, which extends GraphicOverlay, is used to draw dynamic info based on the detected QR code; we use this class to draw a frame around the detected QR code. BarcodeGraphicTracker is used to receive newly detected items, add a graphical representation to an overlay, update the graphics as the item changes, and remove the graphics when the item goes away.

Override the draw method of BarcodeGraphic according to how you want to show results on the screen once a barcode is detected. The method in the Organizer App looks like:

@Override
public void draw(Canvas canvas) {
   if (barcode == null) {
       return;
   }
   // Draws the bounding box around the barcode.
   RectF rect = new RectF(barcode.getBoundingBox());
   ...
   int width = (int) ((rect.right - rect.left)/3);
   int height = (int) ((rect.top - rect.bottom)/3);

   canvas.drawBitmap(Bitmap.createScaledBitmap(frameBottomLeft, width, height, false), rect.left, rect.top, null);
   ...
   canvas.drawRect(rect, rectPaint);
}

 

The class has a Barcode field which gets updated on barcode detection. In the above method, the local variable rect holds the dimensions of the bounding box of the detected QR code, and frames are accordingly drawn at the corners of the rect. Include CameraSourcePreview enclosing GraphicOverlay in the activity's layout.

<...CameraSourcePreview
   android:id="@+id/preview"
   android:layout_width="match_parent"
   android:layout_height="match_parent">

   <...GraphicOverlay />

</...CameraSourcePreview>

 

CameraSourcePreview and GraphicOverlay are obtained from the layout and saved in the activity. Pass the CameraSource and GraphicOverlay to the CameraSourcePreview using its start method. The last part left is setting a processor on the BarcodeDetector to connect it to the GraphicOverlay. Use BarcodeGraphicTracker, which connects the GraphicOverlay to the BarcodeDetector. This is done by passing a BarcodeTrackerFactory, which has a create method for BarcodeGraphicTracker, to a MultiProcessor. The code looks like:

barcodeDetector.setProcessor(
   new MultiProcessor.Builder<>(
       new BarcodeTrackerFactory(graphicOverlay)).build());
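
For completeness, here is a minimal sketch of starting the preview once the processor is set, assuming the CameraSourcePreview and GraphicOverlay obtained from the layout are stored in fields named preview and graphicOverlay (the sample CameraSourcePreview class exposes a start method that takes both and throws IOException):

try {
   // Starts the camera and draws the detection overlay on top of the preview
   preview.start(cameraSource, graphicOverlay);
} catch (IOException e) {
   // Could not start the camera source; release it so the camera is freed
   cameraSource.release();
}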

 

Now the BarcodeDetector is connected to the layout. On each barcode detection, the preview is updated as defined in the overridden draw method of BarcodeGraphic.

Links:
Google’s Vision API – link
Google Dagger github repo link – https://github.com/google/dagger


Resource Injection Using ButterKnife in Loklak Wok Android

Loklak Wok Android, being a sophisticated Android app, uses a lot of views, most of which are manipulated at runtime. In Android, working with a View or ViewGroup defined in XML at runtime requires developers to add the following line:

(TypeOfView) parentView.findViewById(R.id.id_of_view);

 

This leads to lengthy code. And very often, more than one View responds to a particular event. For example, hiding Views if a network request fails and showing a "Try Again!" message to the user. Let's say you have to hide 4 Views. Are you going to do the following:

view1.setVisibility(View.GONE);
view2.setVisibility(View.GONE);
view3.setVisibility(View.GONE);
view4.setVisibility(View.GONE);
textView.setVisibility(View.VISIBLE); // has "Try Again!" message.

// more 5 lines of code when hiding textView and displaying 4 other Views

 

Surely not! And here is the old-fashioned way to get a string value defined as a resource in strings.xml:

String appName = getActivity().getResources().getString(R.string.app_name);

 

Surely, all this works, but as a developer working on a sophisticated app you would like to focus on the logic of the app, rather than scratching your head to debug whether you did the findViewById properly, whether you typecast it to the right View, or where you forgot to change the visibility of a view in response to an event.

Well, all of this can be easily handled by using a library that injects the dependencies (here, the resources) for you. All you need to do is declare your resources, and that's it: the library provides them to you, without any findViewById calls. So let's dive in and see how ButterKnife is used in Loklak Wok Android to handle these issues.

Adding ButterKnife to Android Project

In the app/build.gradle:

dependencies {
    compile 'com.jakewharton:butterknife:8.6.0'
    annotationProcessor 'com.jakewharton:butterknife-compiler:8.6.0'
    ...
}

Dealing with Views in Fragments

When views are declared, the @BindView annotation is used with the ID of the view as its parameter, for example, the views in TweetHarvestingFragment:

@BindView(R.id.toolbar)
Toolbar toolbar;
@BindView(R.id.harvested_tweets_count)
TextView harvestedTweetsCountTextView;
@BindView(R.id.harvested_tweets_container)
RecyclerView recyclerView;
@BindView(R.id.network_error)
TextView networkErrorTextView;

NOTE: Views declared can’t be private.

Once the Views are declared, they need to be injected, which is done using ButterKnife.bind(Object target, View source). Here in TweetHarvestingFragment the target will be the fragment itself, and the source, i.e. the parent view, will be rootView (obtained by inflating the layout file of the fragment). All of this needs to be done in the onCreateView method:

View rootView = inflater.inflate(R.layout.fragment_tweet_harvesting, container, false);
ButterKnife.bind(this, rootView);

That’s it, we are done!
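
One more thing worth noting (not shown in the snippet above, so treat the field name as an assumption): in ButterKnife 8.x, ButterKnife.bind() in a Fragment returns an Unbinder, which should be used to release the bound views when the fragment's view is destroyed. A minimal sketch:

private Unbinder unbinder;

@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
   View rootView = inflater.inflate(R.layout.fragment_tweet_harvesting, container, false);
   unbinder = ButterKnife.bind(this, rootView);
   return rootView;
}

@Override
public void onDestroyView() {
   super.onDestroyView();
   unbinder.unbind(); // avoids leaking references to the destroyed view hierarchy
}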

The same paradigm can be used to bind views to a ViewHolder of a RecyclerView, as implemented in HarvestedTweetViewHolder:

@BindView(R.id.user_fullname)
TextView userFullname;
@BindView(R.id.username)
TextView username;
@BindView(R.id.tweet_date)
TextView tweetDate;
@BindView(R.id.harvested_tweet_text)
TextView harvestedTweetTextView;

public HarvestedTweetViewHolder(View itemView) {
   super(itemView);
   ButterKnife.bind(this, itemView);
}

 

Injecting resources like strings, dimensions, colors, drawables etc. is even easier; only the related annotation and ID need to be provided. For example, the string app_name is used in TweetHarvestingFragment to display the app name, i.e. "Loklak Wok", in the toolbar:

@BindString(R.string.app_name)
String appName;

// directly used inside onCreateView to set the title in toolbar
toolbar.setTitle(appName);

 

Using ButterKnife, onClickListeners can be implemented in separate methods, which is a clean way to define click events rather than polluting onCreate (in an Activity) or onCreateView (in a Fragment), as implemented in SuggestFragment:

@OnClick(R.id.clear_image_button)
public void onClickedClearImageButton(View view) {
   tweetSearchEditText.setText("");
}

 

Multiple Views responding to a single Event

Using the @BindViews annotation, a list of multiple views can be created, and then a common ButterKnife.Action can be defined to act on the whole list. In TweetHarvestingFragment the visibility of some views is changed when a network request fails or succeeds using this feature of ButterKnife, as shown below:

// declaring List of Views
@BindViews({
       R.id.harvested_tweets_count,
       R.id.harvested__tweet_count_message,
       R.id.harvested_tweets_container})
List<View> networkViews;

// defining action for views
private final ButterKnife.Action<View> VISIBLE = (view, index) -> view.setVisibility(View.VISIBLE);
private final ButterKnife.Action<View> GONE = (view, index) -> view.setVisibility(View.GONE);

// applying action on the views
ButterKnife.apply(networkViews, VISIBLE);
ButterKnife.apply(networkViews, GONE);
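
As a usage sketch (the method name onNetworkError here is hypothetical), the actions can be applied from whatever callback reacts to the network result:

private void onNetworkError() {
   ButterKnife.apply(networkViews, GONE);            // hide the content views
   networkErrorTextView.setVisibility(View.VISIBLE); // show the "Try Again!" message
}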

Resources


Add Autocomplete SearchView in Open Event Android App

The Open Event Android App has a map showing all session locations, each marked with a marker. It is difficult to find a particular location on the map because the user has to tap a marker to see its name. Adding an autocomplete SearchView improves the user experience by providing the ability to search a location by name, with suggestions offered according to the search query. In this post I explain how to add an autocomplete SearchView to a fragment or activity.

Add search icon in actionbar

The first step is to create a menu XML file and add a search menu item to it. Then inflate this menu XML file in the Fragment's onCreateOptionsMenu() method.

1. Create menu.xml file

In this file add a menu element, and inside it add the search menu item. Define the id, title, and icon of the search menu item. Add "android.support.v7.widget.SearchView" as actionViewClass, which will be used as the action view when the user clicks on the icon.

<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools">
    
   <item
        android:id="@+id/action_search"
        android:icon="@drawable/ic_search_white_24dp"
        android:title="@string/search"
        app:actionViewClass="android.support.v7.widget.SearchView"
        app:showAsAction="ifRoom | collapseActionView"/>
</menu>

2. Inflate menu.xml file in Fragment

In the fragment's onCreateOptionsMenu() method, inflate the menu XML file using MenuInflater's inflate() method. Then find the search menu item using the menu's findItem() method, passing the id of the search menu item as the parameter.

public void onCreateOptionsMenu(Menu menu, MenuInflater inflater) {
        super.onCreateOptionsMenu(menu, inflater);
        inflater.inflate(R.menu.menu_map, menu);
        MenuItem item = menu.findItem(R.id.action_search);
}
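
Note that a fragment's onCreateOptionsMenu() is only called if the fragment has opted into the options menu, which is typically done in its onCreate():

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Tell the system that this fragment wants to populate the options menu
    setHasOptionsMenu(true);
}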

Add and initialize SearchView  

Now, after adding the search icon, we need to add SearchView and SearchAutoComplete fields in the fragment.

private SearchView searchView;
private SearchView.SearchAutoComplete mSearchAutoComplete;

Initialize the SearchView in the onCreateOptionsMenu() method by passing the search menu item to the getActionView() method of MenuItemCompat.

Here SearchAutoComplete is a child view of the SearchView, so initialize it using the SearchView's findViewById() method, passing the id as the parameter.

searchView = (SearchView) MenuItemCompat.getActionView(item);
mSearchAutoComplete = (SearchView.SearchAutoComplete) searchView.findViewById(android.support.v7.appcompat.R.id.search_src_text);

Define properties of SearchAutoCompleteView

By default the background of the drop-down menu in SearchAutoComplete is black. You can change the background using the setDropDownBackgroundResource() method. Here I'm making it white by providing a white drawable resource.

mSearchAutoComplete.setDropDownBackgroundResource(R.drawable.background_white);
mSearchAutoComplete.setDropDownAnchor(R.id.action_search);
mSearchAutoComplete.setThreshold(0);

The setDropDownAnchor() method sets the view to which the auto-complete drop down list should anchor. The setThreshold() method specifies the minimum number of characters the user has to type in the edit box before the drop down list is shown.

Create array adapter

Now it's time to create the ArrayAdapter object which provides the data set (strings) used to run search queries.

ArrayAdapter<String> adapter = new ArrayAdapter<>(getActivity(), android.R.layout.simple_list_item_1, searchItems);

Here searchItems is a List of strings. Now set this adapter on the mSearchAutoComplete object using the setAdapter() method.

mSearchAutoComplete.setAdapter(adapter);
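
To react when the user picks a suggestion, a click listener can be attached to mSearchAutoComplete (it is a subclass of AutoCompleteTextView). A minimal sketch, where moveToLocation() is a hypothetical helper that pans the map to the selected marker:

mSearchAutoComplete.setOnItemClickListener((parent, view, position, id) -> {
    String locationName = (String) parent.getItemAtPosition(position);
    mSearchAutoComplete.setText(locationName);
    moveToLocation(locationName); // hypothetical: center the map camera on this location
});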

Now we are all set to run the app on a device or emulator. Here's a demo of how it will look.

Conclusion

A SearchView with the ability to give suggestions provides a great user experience in the application.

Additional resources:


Using Vector Images in SUSI Android

SUSI is an artificial intelligence for interactive chat bots. To make it more user friendly and interactive, we add a lot of images in the form of drawable resources in the SUSI Android App (https://github.com/fossasia/susi_android). Most of these drawables are PNGs, and there are certain problems associated with the use of PNG images.

  1. PNGs cannot be scaled without losing quality, so for the same PNG image we have to include separate copies of varied quality; otherwise the image becomes blurred.
  2. PNGs tend to take up a lot of disk space, which can be easily reduced with the use of vector images.
  3. PNGs have fixed color and dimensions which cannot be changed.

Due to the above shortcomings of PNG images we decided to use vector drawable images instead of them.

Advantages associated with Vector images

  1. They can be scaled to any size without loss of quality. Thus we need to include only a single image in the app rather than several of varied quality.
  2. They are very small in size as compared to PNGs.
  3. Unlike PNGs, they can be easily modified programmatically in the XML file.

Using Vector Images in Android Studio

Android Studio provides a tool with which we can directly import vector drawables into the project. To import vector images, go to File > New > Vector Asset in the studio.

From here we can choose the icon we want to include in our project and click OK. The icon will appear in the drawables directory and can be used anywhere in the project.
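
If a vector drawable ever needs to be loaded programmatically (for example to set it on an ImageView at runtime), a minimal sketch using the support library might look like the following; the resource name ic_susi_logo is hypothetical:

// Loads a vector drawable in a backwards-compatible way
Drawable logo = AppCompatResources.getDrawable(context, R.drawable.ic_susi_logo);
imageView.setImageDrawable(logo);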

Implementation in SUSI Android

In SUSI Android we have used various vector images such as arrows, pointers and even the logo of the app. Below is the logo of SUSI.

This is actually a vector image; below we will see the code required to get this logo as the output.

<vector android:height="50dp" android:viewportHeight="279.37604"

  android:viewportWidth="1365.2" android:width="220dp" xmlns:android="http://schemas.android.com/apk/res/android">

<path android:fillColor="#ffffff"

      android:pathData="M127.5,7.7c-26.8,3.3 -54.2,16.8 -75.9,37.4 -11.8,11.1 -20.4,22.9 -28.1,38.4 -8.9,17.8 -12.8,32.1 -13.7,51l-0.3,6 39,0 39,0 0.3,-4c0.7,-12.1 6.8,-24.1 17.2,-34.5 8.5,-8.4 16.2,-13.4 25.9,-16.7l6.6,-2.2 81.3,-0.1 81.2,0 0,-38 0,-38 -84.7,0.1c-46.7,0.1 -86.1,0.4 -87.8,0.6z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"

      android:pathData="M319.2,11.3l-4.3,4.3 0.3,103c0.4,113.2 0,105.9 6.4,118.6 10.8,21.3 35.1,41.9 56.2,47.3 8.5,2.3 99.1,2.2 107.7,0 18.7,-4.9 39.2,-20.7 51.5,-39.7 3.4,-5.1 7.1,-12.2 8.3,-15.8l2.2,-6.5 0.5,-103.3 0.5,-103.3 -4.5,-4.4 -4.6,-4.5 -31.5,0 -31.5,0 -4.7,4.8 -4.7,4.8 0,93 0,93 -3.3,3.2 -3.3,3.2 -29,0 -29,0 -2.6,-2.7 -2.7,-2.8 -0.7,-94.2 -0.7,-94.2 -4.3,-4 -4.2,-4.1 -31.9,0 -31.9,0 -4.2,4.3z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"

      android:pathData="M680,7.6c-31.6,4.8 -56.1,17.3 -79,40.3 -23.2,23.3 -36.3,50.5 -38.9,80.9 -0.5,5.9 -0.7,11 -0.4,11.4 0.2,0.5 17.7,0.8 38.8,0.8l38.4,0 0.6,-4.8c3.2,-23.2 21.3,-44.1 44.7,-51.3 5.6,-1.8 10.6,-1.9 86.6,-1.9l80.7,0 -0.3,-38 -0.2,-38 -84.3,0.1c-46.3,0.1 -85.3,0.3 -86.7,0.5z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"

      android:pathData="M869.1,13.4l-4.1,6.4 0,126.4 0,126.3 4.8,6.7 4.7,6.8 31.6,0 31.6,0 4.7,-7 4.6,-7 0,-125.7 0,-125.8 -4.7,-6.7 -4.8,-6.8 -32.1,0 -32.1,0 -4.2,6.4z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"

      android:pathData="M222.5,152.2c-0.2,0.7 -0.9,4.2 -1.5,7.7 -3.4,19.5 -19.4,38 -40,46.4l-5.5,2.2 -83,0.5 -83,0.5 -0.3,37.8 -0.2,37.8 89.2,-0.3 89.3,-0.3 9.6,-2.7c57.7,-16.3 100.1,-67.4 102.1,-123.3l0.3,-7 -38.3,-0.3c-30.1,-0.2 -38.3,0 -38.7,1z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"

      android:pathData="M774.5,152.2c-0.2,0.7 -0.9,4.1 -1.5,7.5 -3.3,19.2 -18.8,37.3 -39.4,46.2l-6.1,2.6 -83,0.5 -83,0.5 -0.3,37.7 -0.2,37.8 85.9,0c93.7,0 91.4,0.1 110.1,-5.9 26.4,-8.5 53.3,-28.4 69.8,-51.7 15.2,-21.3 25.1,-50.1 24,-69.9l-0.3,-6 -37.8,-0.3c-29.7,-0.2 -37.8,0 -38.2,1z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff" android:pathData="m1146.99,0 l-1.38,1.19c-0.76,0.66 -1.85,1.61 -2.43,2.13 -0.58,0.51 -1.75,1.54 -2.61,2.28 -1.52,1.31 -1.58,1.41 -2.4,3.53 -0.46,1.2 -0.92,2.37 -1.01,2.59 -30.55,82.93 -61.62,165.72 -96.03,259.63 0,0.08 1.61,1.88 3.57,3.98l3.57,3.84 33.47,-0.04 33.47,-0.04c12.28,-35.6 25.13,-72.47 37.4,-107.27 0.06,-0.25 0.28,-0.64 0.5,-0.88 0.37,-0.41 0.61,-0.43 4.2,-0.43 3.63,0 3.83,0.02"/>

  <path android:fillColor="#ffffff" android:pathData="m967.09,279.18c-2.48,-3.74 -4.97,-7.04 -8.09,-11.76l0.09,-43.92c3.34,-5.26 5.31,-6.73 8.42,-11.51 17.91,0.02 34.3,0.26 50.88,0.26 3.21,4.88 4.09,6.72 7.81,12.66 -0.05,13.98 0.1,27.96 -0.12,41.94 -2.9,4.2 -4.27,7.42 -7.78,12.18 -18.81,-0.04 -35.43,0.2 -51.21,0.15z"/>

  <path android:fillColor="#ffffff"

      android:pathData="m1287.3,6.59 l-4.1,6.4 0,126.4 0,126.3 4.8,6.7 4.7,6.8 31.6,0 31.6,0 4.7,-7 4.6,-7 0,-125.7 0,-125.8 -4.7,-6.7 -4.8,-6.8 -32.1,0 -32.1,0 -4.2,6.4z" android:strokeColor="#00000000"/>


</vector>

In this code we can easily change the color and minor details of the logo, which would not have been possible if the logo were in PNG format. Also, we don't need multiple logo images of varied quality, as it can be scaled without losing quality.

Resources


Applying Filters on Images using Native Functions in Phimpme Android

In the Phimpme application, the user can apply multiple colorful filters to images captured from the application's camera or to images already available on the device. Applying filters to images is performed using native image processing functions. We implemented many filters for enhancing the image; the implementation of a few of the filter functions is shown below.

Filters are applied to an image by modifying the color values of pixels in the Phimpme application. This is similar to the implementation of image enhancing functions in the editor of Phimpme. My post on that is available here.

Black and White filter:

The black and white filter is essentially grayscaling the image. In a grayscale image there is only a single color channel; if multiple channels are present, the corresponding pixel values in all channels are the same. Here in Phimpme we have an RGB image with 3 color channels, so every pixel has three values. The black and white filter can be implemented by replacing those three values with their average. The implementation of the function and the resultant image comparison are shown below.

void applyBlackAndWhiteFilter(Bitmap* bitmap) {
  register unsigned int i;
  unsigned int length = (*bitmap).width * (*bitmap).height;
  register unsigned char grey;
  unsigned char* red = (*bitmap).red;
  unsigned char* green = (*bitmap).green;
  unsigned char* blue = (*bitmap).blue;
  for (i = length; i--;) {
     grey = (red[i] + green[i] + blue[i]) / 3;
     red[i] = truncate((int) grey);
     green[i] = truncate((int) grey);
     blue[i] = truncate((int) grey);
  }
}

    

Ansel Filter

The Ansel filter is a monotone filter in Phimpme which is similar to black and white. In this filter the contrast is a little higher, which gives the image an artistic look. This is achieved in Phimpme by hard-light blending the gray pixel components of the image; the rest is the same as the black and white filter. The implementation of the hard-light blending and the Ansel function is shown below with the resultant images.

static unsigned char hardLightLayerPixelComponents(unsigned char maskComponent, unsigned char imageComponent) {

  return (maskComponent > 128) ? 255 - (( (255 - (2 * (maskComponent-128)) ) * (255-imageComponent) )/256) : (2*maskComponent*imageComponent)/256;
}

void applyAnselFilter(Bitmap* bitmap) {
/*initializations*/
  unsigned char br,bg,bb;
  for (i = length; i--; ) {
       grey = (red[i] + green[i] + blue[i]) / 3;
       int eff = hardLightLayerPixelComponents(grey, grey);
       red[i] = truncate(eff);
       green[i] = truncate(eff);
       blue[i] = truncate(eff);
  }
}

    

Sepia Filter

The sepia filter in Phimpme results in a monotone image with an orangish-yellow tone. Its implementation uses pre-defined look-up tables (LUTs) for all three channels. The luminosity of a particular pixel is computed, and then the red, green and blue values are read from the look-up tables corresponding to that luminosity. The look-up table arrays we used for the sepia effect in Phimpme are given below, along with the implementation.

const unsigned char sepiaRedLut[256] = {24, 24, 25, 26, 27, 28, 29, 30, 30, 30, 31, 32, 33, 34, 35, 36, 37, 37, 38, 38, 39, 40, 41, 42, 43, 43, 44, 45, 46, 47, 47, 48, 49, 50, 50, 51, 52, 53, 54, 55, 56, 57, 57, 58, 58, 59, 60, 61, 62, 63, 64, 64, 65, 66, 67, 68, 69, 70, 71, 71, 72, 72, 73, 74, 75, 76, 77, 78, 78, 79, 80, 81, 82, 83, 84, 85, 85, 86, 87, 88, 89, 89, 90, 91, 92, 93, 93, 94, 95, 96, 97, 97, 98, 99, 100, 101, 102, 102, 103, 104, 105, 106, 107, 108, 109, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 146, 147, 148, 149, 150, 151, 152, 153, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 178, 180, 181, 182, 183, 184, 185, 186, 186, 187, 188, 189, 190, 191, 193, 194, 195, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 255};

const unsigned char sepiaGreenLut[256] = {16, 16, 16, 17, 18, 18, 19, 20, 20, 20, 21, 22, 22, 23, 24, 24, 25, 25, 26, 26, 27, 28, 28, 29, 30, 30, 31, 31, 32, 33, 33, 34, 35, 36, 36, 36, 37, 38, 39, 39, 40, 41, 42, 43, 43, 44, 45, 46, 47, 47, 48, 48, 49, 50, 51, 51, 52, 53, 54, 54, 55, 55, 56, 57, 58, 59, 60, 61, 61, 61, 62, 63, 64, 65, 66, 67, 67, 68, 68, 69, 70, 72, 73, 74, 75, 75, 76, 77, 78, 78, 79, 80, 81, 81, 82, 83, 84, 85, 86, 87, 88, 90, 90, 91, 92, 93, 94, 95, 96, 97, 97, 98, 99, 100, 101, 103, 104, 105, 106, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 122, 123, 123, 124, 125, 127, 128, 129, 130, 131, 132, 132, 134, 135, 136, 137, 138, 139, 141, 141, 142, 144, 145, 146, 147, 148, 149, 150, 151, 152, 154, 155, 156, 157, 158, 160, 160, 161, 162, 163, 165, 166, 167, 168, 169, 170, 171, 173, 174, 175, 176, 177, 178, 179, 180, 182, 183, 184, 185, 187, 188, 189, 189, 191, 192, 193, 194, 196, 197, 198, 198, 200, 201, 202, 203, 205, 206, 207, 208, 209, 210, 211, 212, 213, 215, 216, 217, 218, 219, 220, 221, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 255};

const unsigned char sepiaBlueLut[256] = {5, 5, 5, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9, 9, 10, 10, 11, 11, 11, 11, 12, 12, 13, 13, 14, 14, 14, 14, 15, 15, 16, 16, 17, 17, 17, 18, 18, 19, 20, 20, 21, 21, 21, 22, 22, 23, 23, 24, 25, 25, 26, 27, 28, 28, 29, 29, 30, 31, 31, 31, 32, 33, 33, 34, 35, 36, 37, 38, 38, 39, 39, 40, 41, 42, 43, 43, 44, 45, 46, 47, 47, 48, 49, 50, 51, 52, 53, 53, 54, 55, 56, 57, 58, 59, 60, 60, 61, 62, 63, 65, 66, 67, 67, 68, 69, 70, 72, 73, 74, 75, 75, 76, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 90, 91, 92, 93, 93, 95, 97, 98, 99, 100, 101, 102, 104, 104, 106, 107, 108, 109, 111, 112, 114, 115, 115, 117, 118, 120, 121, 122, 123, 124, 125, 127, 128, 129, 131, 132, 133, 135, 136, 137, 138, 139, 141, 142, 144, 145, 147, 147, 149, 150, 151, 153, 154, 156, 157, 159, 159, 161, 162, 164, 165, 167, 168, 169, 170, 172, 173, 174, 176, 177, 178, 180, 181, 182, 184, 185, 186, 188, 189, 191, 192, 193, 194, 196, 197, 198, 200, 201, 203, 204, 205, 206, 207, 209, 210, 211, 213, 214, 215, 216, 218, 219, 220, 221, 223, 224, 225, 226, 227, 229, 230, 231, 232, 234, 235, 236, 237, 238, 239, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 255};
void applySepia(Bitmap* bitmap){
/*bitmap initializations*/
for (i = length; i--; ) {
  register float r = (float) red[i] / 255;
  register float g = (float) green[i] / 255;
  register float b = (float) blue[i] / 255;
  register float luminosity =  (0.21f * r + 0.72f * g + 0.07 * b) * 255;
      red[i] = truncate((int)( sepiaRedLut[(int)luminosity]));
      green[i] = truncate((int)(sepiaGreenLut[(int)luminosity]));
      blue[i] = truncate((int)(sepiaBlueLut[(int)luminosity]));
  }
}

     

Cyano Filter

As the name suggests, this filter adds a cyan tone to the image. To implement the cyano filter, we first find the black and white (gray) value of the pixel and then the componentCeiling value for each of the three color channels. The componentCeiling values and the gray value are then overlayed to give the resultant image. The componentCeiling macro, the overlay function and the filter implementation are shown below.

#define componentCeiling(x) ((x > 255) ? 255 : x)

static unsigned char overlayPixelComponents(unsigned int overlayComponent, unsigned char underlayComponent, float alpha) {
  float underlay = underlayComponent * alpha;
  return (unsigned char)((underlay / 255) * (underlay + ((2.0f * overlayComponent) / 255) * (255 - underlay)));

}

void applyCyano(Bitmap* bitmap) {
  //Cache to local variables
  //Bitmap initialization
  register unsigned int i;
  register unsigned char grey, r, g, b;
  for (i = length; i--;) {
     grey = ((red[i] * 0.222f) + (green[i] * 0.222f) + (blue[i] * 0.222f));
     r = componentCeiling(61.0f + grey);
     g = componentCeiling(87.0f + grey);
     b = componentCeiling(136.0f + grey);
     grey = (red[i] + green[i] + blue[i]) / 3;
     red[i] = truncate((int)(overlayPixelComponents(grey, r, 0.9f)));
     green[i] = truncate((int)(overlayPixelComponents(grey, g, 0.9f)));
     blue[i] = truncate((int)(overlayPixelComponents(grey, b, 0.9f)));
  }
}

      

Grain Filter

As the name makes clear, this filter adds grain to the image, giving an artistic effect. It can be implemented in a very simple manner by assigning random gray values to a fraction of the pixels of the image. The condition inside the main for loop of the implementation below controls the proportion of added grain with respect to the whole image. To generate random values, the random number generator has to be seeded (here with the current time) first. The whole implementation of the function is shown below.

void applyGrain(Bitmap* bitmap) {
/*initializations*/
   time_t t;
   srand((unsigned) time(&t));
  for (i = length; i--;) {
       int rval = rand()%255;
        if (rand()%100 < 15) {
           int grey = (red[i] + green[i] + blue[i]) / 3;
           red[i] = truncate(rval);
           green[i] = truncate(rval);
           blue[i] = truncate(rval);
       }
  }
}

      

Threshold Filter

Thresholding an image gives a binary image, i.e. the pixels of the image have only two values: one for values less than the threshold and another for values greater than the threshold. The threshold value is adjusted by a seek bar in Phimpme, and the image looks quite artistic at a particular value on the seek bar. The implementation is shown below.

void applyThreshold(Bitmap* bitmap, int val) {
/*bitmap initializations*/
  unsigned char grey, color;
  int thres = 220 - (int)((val/100.0) * 190);
  for (i = length; i--;) {
       grey = (red[i] + green[i] + blue[i]) / 3;
       if (grey < thres) color = 0;
       else color = 255;
       red[i] = truncate((int)(color));
       green[i] = truncate((int)(color));
       blue[i] = truncate((int)(color));
  }
}

     

Resources:


Handling graph plots using MPAndroid chart in PSLab Android App

In the PSLab Android App, we expose the Oscilloscope and Logic Analyzer functionality of the PSLab hardware device. After reading the data points to plot, we need to show them on graphs for better understanding and visualisation, and sometimes we need to save the graphs to show the output/findings of an experiment. Hence we use the MPAndroidChart library to plot and save graphs, as it provides nice and clean methods to do so.

First add MPAndroidChart as a dependency in your app's build.gradle to include the library:

dependencies {
 compile 'com.github.PhilJay:MPAndroidChart:v3.0.2'
}

For the chart view in your layout file, there are many available options like Line Chart, Bar Chart, Pie Chart, etc. For this post I am going to use a Line Chart.

Add LineChart in your layout file

<com.github.mikephil.charting.charts.LineChart
        android:id="@+id/lineChart"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

Get a reference to the LineChart of the layout file in your Activity/Fragment:

LineChart lineChart = (LineChart) findViewById(R.id.lineChart); // Activity
LineChart lineChart = (LineChart) view.findViewById(R.id.lineChart); // Fragment

Now we add a dataset to the LineChart of the layout file. I am going to add data for two curves, the sine and cosine functions, to plot a sine and a cosine wave on the LineChart. We create two different LineDataSet objects, one for the sine curve entries and the other for the cosine curve entries.

List <Entry> sinEntries = new ArrayList<>(); // List to store data-points of sine curve 
List <Entry> cosEntries = new ArrayList<>(); // List to store data-points of cosine curve

// Obtaining data points by using Math.sin and Math.cos functions
for (float i = 0; i < 7f; i += 0.02f) {
    sinEntries.add(new Entry(i, (float) Math.sin(i)));
    cosEntries.add(new Entry(i, (float) Math.cos(i)));
}

List<ILineDataSet> dataSets = new ArrayList<>(); // for adding multiple plots

LineDataSet sinSet = new LineDataSet(sinEntries,"sin curve");
LineDataSet cosSet = new LineDataSet(cosEntries,"cos curve");

// Adding colors to different plots 
cosSet.setColor(Color.GREEN);
cosSet.setCircleColor(Color.GREEN);
sinSet.setColor(Color.BLUE);
sinSet.setCircleColor(Color.BLUE);

// Adding each plot data to a List
dataSets.add(sinSet);
dataSets.add(cosSet);

// Setting datapoints and invalidating chart to update with data points
lineChart.setData(new LineData(dataSets));
lineChart.invalidate();

After adding the datasets to the chart and invalidating it, the chart is refreshed with the data points that were added to the dataset.

After plotting, the graph output would look like the image below:

You can change the dataset and invalidate the chart to update it with the latest data, as shown below.
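
If new data points arrive continuously (as they do for an oscilloscope trace), entries can also be appended to the existing data instead of rebuilding the whole dataset. A minimal sketch, assuming x and y hold the latest sample:

LineData lineData = lineChart.getData();
lineData.addEntry(new Entry(x, y), 0); // append to the first dataset (index 0)
lineData.notifyDataChanged();          // let the data object know its content changed
lineChart.notifyDataSetChanged();      // let the chart know its data changed
lineChart.invalidate();                // redraw the chart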

To save the graph plot, make sure you have permission to write to external storage; if not, add it to your manifest file:

<manifest ...>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    ...
</manifest>

To save a photo of the chart to the Gallery:

lineChart.saveToGallery("title");

To save to some specific location:

lineChart.saveToPath("title", "Location on SD Card");

If you want to do some resizing of the chart or save two or three charts in a single image, you can do so by taking out the Bitmaps and processing them to meet your requirements:

lineChart.getChartBitmap();

Resources


Enhancing Images using Native functions in Phimpme Android

Image enhancement can be performed by adjusting the brightness, contrast, saturation etc. of the image. In the Phimpme Android image application, we implemented many enhancement operations, all of which are performed by native image processing functions.

An image is made up of color channels. A grayscale image has a single channel, a colored opaque image has three channels, and a colored image with transparency has four channels. Each color channel of an image is a two-dimensional matrix of integer values: an image of resolution 1920×1080 has 1920 elements in each row and 1080 such rows. The integer values in the matrices range from 0 to 255. For a grayscale image, which has a single channel, 0 corresponds to black and 255 corresponds to white. By changing the values in the matrices, the image can be modified.

The implementation of the enhancement functions in Phimpme Application are given below.

Brightness

Brightness adjustment is the easiest of the image processing functions in Phimpme. Brightness can be adjusted by increasing or decreasing the values of all elements in all color channel matrices. Its implementation is given below.

void tuneBrightness(Bitmap* bitmap, int val) {
  register unsigned int i;
  unsigned int length = (*bitmap).width * (*bitmap).height;
  unsigned char* red = (*bitmap).red;
  unsigned char* green = (*bitmap).green;
  unsigned char* blue = (*bitmap).blue;
  signed char bright = (signed char)(((float)(val-50)/100)*127);
  for (i = length; i--; ) {
       red[i] =  truncate(red[i]+bright);
       green[i] = truncate(green[i]+bright);
       blue[i] = truncate(blue[i]+bright);
  }
}

  

low brightness, normal, high brightness(in the order) images are shown above

For the above function, the argument val is given by the seek bar implemented in the Java activity. Its value ranges from 0 to 100, so a new variable is introduced to change the range of the input argument in the function. You can see that in the for loop there is a function named truncate; as the name suggests, it truncates the input argument's value to the accepted range. It is defined at the top of the C file as below:

#define truncate(x) ((x > 255) ? 255 : (x < 0) ? 0 : x)

Contrast

The contrast of an image is adjusted in the Phimpme application by increasing the brightness of the brighter pixels and decreasing the value of the darker pixels. This is achieved by using the following formula for contrast adjustment in the editor of the Phimpme application:

pixel[i] = {(259 x (C + 255))/(255 x (259 - C))} x (pixel[i] - 128) + 128

In the above formula, C is the contrast value and pixel[i] is the value of the element in the image matrix that we are modifying for changing the contrast.

 

low contrast, normal, high contrast(in the order) images are shown above

So, after applying this formula to every pixel value, the function looks like below:

void tuneContrast(Bitmap* bitmap, int val) {
  register unsigned int i;
  unsigned int length = (*bitmap).width * (*bitmap).height;
  unsigned char* red = (*bitmap).red;
  unsigned char* green = (*bitmap).green;
  unsigned char* blue = (*bitmap).blue;
  int contrast = (int)(((float)(val-50)/100)*255);
  float factor = (float)(259*(contrast + 255))/(255*(259-contrast));

  for (i = length; i--; ) {
       red[i] = truncate((int)(factor*(red[i]-128))+128);
       green[i] = truncate((int)(factor*(green[i]-128))+128);
       blue[i] = truncate((int)(factor*(blue[i]-128))+128);
  }
}

Hue

The below image explains hue shift by showing what happens when the hue is shifted gradually. An image with a hue shift of 0 looks identical to one with a shift of 360, since the hue shift is cyclic. The definition and formulae corresponding to hue can be found on the Wikipedia page here. Using those formulae and inverting them, we get the RGB values back from the hue in the Phimpme application. The implementation is shown below.

[img source:wikipedia]

void tuneHue(Bitmap* bitmap, int val) {
  register unsigned int i;
  unsigned int length = (*bitmap).width * (*bitmap).height;
  unsigned char* red = (*bitmap).red;
  unsigned char* green = (*bitmap).green;
  unsigned char* blue = (*bitmap).blue;
  double H = 3.6*val;
  double h_cos = cos(H*PI/180);
  double h_sin = sin(H*PI/180);
  double r,g,b;

  for (i = length; i--; ) {
       r = (double)red[i]/255;
       g = (double)green[i]/255;
       b = (double)blue[i]/255;
       red[i] = truncate((int)(255*((.299+.701*h_cos+.168*h_sin)*r +  (.587-.587*h_cos+.330*h_sin)*g + (.114-.114*h_cos-.497*h_sin)*b)));

       green[i] = truncate((int)(255*((.299-.299*h_cos-.328*h_sin)*r + (.587+.413*h_cos+.035*h_sin)*g + (.114-.114*h_cos+.292*h_sin)*b)));

       blue[i] = truncate((int)(255*((.299-.3*h_cos+1.25*h_sin)*r +  (.587-.588*h_cos-1.05*h_sin)*g + (.114+.886*h_cos-.203*h_sin)*b)));
  }
}

Saturation

Saturation is the colorfulness of the image. You can see below the zero-saturation, unmodified and highly saturated images, in that order. The technical definition and formulae for getting the saturation value from the RGB values are given on the Wikipedia page here. In the Phimpme application we used those formulae to get the RGB values back from the saturation value.

Its implementation is given below.

  

low saturation, normal, high saturation(in the order) images are shown above

void tuneSaturation(Bitmap* bitmap, int val) {
  register unsigned int i;
  unsigned int length = (*bitmap).width * (*bitmap).height;
  unsigned char* red = (*bitmap).red;
  unsigned char* green = (*bitmap).green;
  unsigned char* blue = (*bitmap).blue;
  double sat = 2*((double)val/100);
  double temp;
  double r_val = 0.299, g_val = 0.587, b_val = 0.114;
  double r,g,b;
  for (i = length; i--; ) {
      r = (double)red[i]/255;
      g = (double)green[i]/255;
      b = (double)blue[i]/255;
      temp = sqrt( r * r * r_val +
                     g * g * g_val +
                       b * b * b_val );
      red[i] = truncate((int)(255*(temp + (r - temp) * sat)));
      green[i] = truncate((int)(255*(temp + (g - temp) * sat)));
      blue[i] = truncate((int)(255*(temp + (b - temp) * sat)));
  }
}

Temperature

If the color temperature of the image is high, i.e. the image is warm, it will have more reds and fewer blues; for a cool image, reds are fewer and blues are more prominent. So in the Phimpme application we implemented this simply by adjusting the brightness of the red channel matrix and the blue channel matrix, as we did in the brightness adjustment, and leaving the green channel unmodified.

  

low temperature, normal, high temperature(in the order) images are shown above

void tuneTemperature(Bitmap* bitmap, int val) {
  register unsigned int i;
  unsigned int length = (*bitmap).width * (*bitmap).height;
  unsigned char* red = (*bitmap).red;
  unsigned char* green = (*bitmap).green;
  unsigned char* blue = (*bitmap).blue;
  int temperature = (int)(1.5*(val-50));
  for (i = length; i--; ) {
       red[i] = truncate(red[i] + temperature);
       blue[i] = truncate(blue[i] - temperature);
  }
}

Tint

In the Phimpme application, we adjusted the tint of an image in the same way as the temperature, but instead of modifying the red and blue channels, we modified the green channel of the image. An image with more tint has a magenta tone, and if the tint is decreased the image has a greenish tone. The code below shows how we implemented this function in the image editor of the Phimpme application.

  

low tint, normal, high tint(in the order) images are shown above

void tuneTint(Bitmap* bitmap, int val) {
  register unsigned int i;
  unsigned int length = (*bitmap).width * (*bitmap).height;
  unsigned char* red = (*bitmap).red;
  unsigned char* green = (*bitmap).green;
  unsigned char* blue = (*bitmap).blue;
  int tint = (int)(1.5*(val-50));

  for (i = length; i--; ) {
       green[i] = truncate(green[i] - tint);
  }
}

Vignette

Vignetting is the reduction in the brightness of the image towards the edges compared to the center. It is applied to draw the attention of the viewer to the center of the image.

 

normal and vignetted images are shown above

To implement the vignette in the Phimpme application, we reduce the brightness of each pixel according to a radial gradient value, which is generated based on the pixel's distance from the center relative to the corner. The function in Phimpme is shown below.

double dist(int ax, int ay,int bx, int by){
   return sqrt(pow((double) (ax - bx), 2) + pow((double) (ay - by), 2));
}

void tuneVignette(Bitmap* bitmap, int val) {
  register unsigned int i,x,y;
  unsigned int width = (*bitmap).width, height = (*bitmap).height;
  unsigned int length = width * height;
  unsigned char* red = (*bitmap).red;
  unsigned char* green = (*bitmap).green;
  unsigned char* blue = (*bitmap).blue;
  double radius = 1.5-((double)val/100), power = 0.8;
  double cx = (double)width/2, cy = (double)height/2;
  double maxDis = radius * dist(0,0,cx,cy);
  double temp,temp_s;
   for (y = 0; y < height; y++){
       for (x = 0; x < width; x++ ) {
           temp = dist(cx, cy, x, y) / maxDis;
           temp = temp * power;
           temp_s = pow(cos(temp), 4);
           red[x+y*width] = truncate((int)(red[x+y*width]*temp_s));
           green[x+y*width] = truncate((int)(green[x+y*width]*temp_s));
           blue[x+y*width] = truncate((int)(blue[x+y*width]*temp_s));
       }
   }
}

All the above-mentioned functions are called from the main.c file by creating a JNI function corresponding to each of them. These JNI functions are then declared with proper names in Java, and the arguments are passed to them. If you are not clear about JNI, refer to my previous posts.
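
On the Java side this roughly looks like the sketch below; the library and method names here are hypothetical and only illustrate the pattern of declaring the native counterparts:

public class ImageProcessor {

   static {
       // Name of the native library built from main.c (hypothetical)
       System.loadLibrary("nativeimageprocessor");
   }

   // Each declaration matches a JNI wrapper in main.c; 'val' is the seek bar value (0-100)
   public static native void tuneBrightness(int val);
   public static native void tuneContrast(int val);
}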

Resources


Sorting Events in Open Event Organizer Android App

While working on the Open Event Organizer project, we had to display events in a single list in a custom order with proper subheadings. Initially, we were thinking of using a tabbed activity and showing events in respective tabs. But the thing with tabs is that they require you to nest fragments, and then each of them needs its own adapter. Also, we have used the Model View Presenter pattern in the project, which is another reason we did not use a view pager, as it would increase the number of presenter and view classes for the same feature. So we decided to display the events in a single list instead. The custom order decided was that events would be divided into three categories: live, upcoming and past, and in each category a more recent event would appear above an older one.

Adding SubHeadings support to the Recycler View

So the first thing was adding subheading support to the recycler view. We have used timehop’s sticky header decorators library for subheadings implementation. First, your adapter should implement the interface StickyRecyclerHeadersAdapter provided by the library. In our case the implemented methods look like:

@Override
public long getHeaderId(int position) {
  Event event = events.get(position);
  return DateService.getEventStatus(event).hashCode();
}

@Override
public EventsHeaderViewHolder onCreateHeaderViewHolder(ViewGroup viewGroup) {
  return new EventsHeaderViewHolder(EventSubheaderLayoutBinding.inflate(LayoutInflater.from(viewGroup.getContext()), viewGroup, false));
}

@Override
public void onBindHeaderViewHolder(EventsHeaderViewHolder holder, int position) {
  Event event = events.get(position);
  holder.bindHeader(DateService.getEventStatus(event));
}

@Override
public int getItemCount() {
  return events.size();
}

 

The first one is getHeaderId, which returns a unique id for a group of items that should appear under a single subheading. In this case, DateService.getEventStatus returns the status of an event (either live, past or upcoming), so its hashcode is returned as the unique id for that header. onCreateHeaderViewHolder is analogous to onCreateViewHolder of your adapter; return your header view here. Similarly, in onBindHeaderViewHolder, bind the data to the header. getItemCount returns the total number of items.
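
Finally, for the headers to actually be drawn, the decoration provided by the library has to be attached to the RecyclerView, roughly as below, where eventsAdapter and recyclerView are assumed field names:

StickyRecyclerHeadersDecoration headersDecoration = new StickyRecyclerHeadersDecoration(eventsAdapter);
recyclerView.addItemDecoration(headersDecoration);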

Sorting Events

The important thing to do was sorting the events in the decided order. We had to implement the Comparable interface in the Event model, which compares any two events using our custom rules such that after sorting we get events in the order Live, Upcoming and Past, with the more recent one at the top in each category. The compareTo method of the Event model looks like:

public int compareTo(@NonNull Event otherEvent) {
  Date now = new Date();
  try {
     Date startDate = DateUtils.getDate(getStartTime());
     Date endDate = DateUtils.getDate(getEndTime());
     Date otherStartDate = DateUtils.getDate(otherEvent.getStartTime());
     Date otherEndDate = DateUtils.getDate(otherEvent.getEndTime());
     if (endDate.before(now) || otherEndDate.before(now)) {
         // one of them is past and other can be past or live or upcoming
         return endDate.after(otherEndDate) ? -1 : 1;
     } else {
         if (startDate.after(now) || otherStartDate.after(now)) {
             // one of them is upcoming other can be upcoming or live
             return startDate.before(otherStartDate) ? -1 : 1;
         } else {
             // both are live
             return startDate.after(otherStartDate) ? -1 : 1;
         }
     }
  } catch (ParseException e) {
  e.printStackTrace();
  }
  return 1;
}

 

The compareTo method returns a positive integer for greater than, a negative integer for less than, and 0 if equal, and we implemented it accordingly for our needs. In the first case, we check whether one of the events is past by comparing the end dates with now; the other event can then be past, live or upcoming. In all of these cases the event whose end date is later (the more recently ended or still running one) should come on top, so comparing the end dates is enough. Only pairs of live and upcoming events reach the next case. There we check whether one of them is upcoming, so that the other can be either upcoming or live; in both cases the event whose start date is earlier should be on top, hence comparing the start dates does the trick. In the last case we are left with two live events. Here the event whose start date is later (the more recently started one) should be on top, hence we check whether one start date is after the other.

Using this method, the events are sorted and supplied to the adapter which implements StickyRecyclerHeadersAdapter. Hence the list displays events in the Live, Upcoming and Past categories, as expected, with the respective section headers, and in each category a more recent event comes on top of an older one.
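
Since Event implements Comparable, the actual sort before handing the list to the adapter is a one-liner (eventsAdapter here is an assumed field name):

Collections.sort(events);             // orders events as Live, Upcoming, Past, most recent first in each
eventsAdapter.notifyDataSetChanged(); // refresh the list with the new order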

Links:
Sticky headers decorator library- https://github.com/timehop/sticky-headers-recyclerview


How to use Digital Ocean and Docker to setup Test CMS for Phimpme

One of the core features of the Phimpme app is sharing images to different accounts, including various open source CMSs such as WordPress, Drupal etc. and open source data storage accounts such as ownCloud, Nextcloud etc.

One cannot have every account set up locally, but for development and testing purposes they are required on our end. So the problem I was facing was getting this done in the most optimized way. Setting things up on a hosted server seemed like a good idea, because it saves a lot of time compared to setting everything up locally with all the dependencies, and a local environment cannot be shared with others.

DigitalOcean caught my attention as a hosting service. It is very easy to use with its droplet creation: select what matches your system and service requirements, the droplet will be ready in a few moments, and you can use it from anywhere.

Note: DigitalOcean is a paid service. Students can use the GitHub Education Pack for free credits on DigitalOcean, which is what I used.

I recently worked on the Nextcloud integration, so in this blog I will show how to quickly create a Nextcloud server using DigitalOcean and Docker.

Step 1: Creating Droplet

DigitalOcean works entirely with droplets, and one can create and destroy droplets associated with their account at any time.

Choose an Image

There are three options for choosing the image of the droplet:

Distributions: the operating system you want to use.

One-click apps: a very good feature, as it sets everything up for use in just one click. But it doesn't provide everything (for example there is no Nextcloud), which is why I used Docker to pull its image.

Snapshots: if you have already saved a droplet, this picks it up and creates a new droplet similar to the saved image. Here I selected Docker from the one-click apps section.

Selecting the size

This is for selecting the size of the server we are creating. For small development purposes the $5 plan is good. A nice thing about DigitalOcean is that it doesn't charge on a monthly basis; it works on an hourly basis and charges accordingly. It also provides SSD disks for fast IO operations.

Choose a datacenter Region

Add SSH

It is very important to add an SSH key; otherwise you have to set up a root password or use the shell they provide. To work from your own terminal, it's good to have an SSH key set up already and added to the account.

How to generate an SSH key on your system: https://help.github.com/

Select the number of droplets and the name of the droplet, and create it.

It will now show up in your Droplets section.

Step 2: Access the Server

As we have already added the SSH key to our droplet, we can now access it from our terminal. Open the terminal and type this:

➜  ~ ssh root@<your IP> 

It will log you in:

root@docker-512mb-blr1-01:~# 

Our objective is setting up a Nextcloud account.

Now I will use Docker. First, what is Docker?

Go here to read: https://www.docker.com/what-docker

Let me explain Docker in other words. Say I have set up everything I need. Now, what if I have to destroy it all and want to use it again after a few days, or my friends want to use the same setup? What is the option here?

Recreate everything every time? No.

Just create a Docker image, save it, pull the image whenever you want, and run it to serve on the server. If your friends need it, provide them with the Docker image.

Isn't it cool and time saving?

Browse the Docker Hub

On the hub we can find Docker images for various platforms, officially maintained by their authors.

Nextcloud has an official account on Docker Hub to provide the latest images to developers.

Here is the link : https://hub.docker.com/_/nextcloud/

Pull the image on your server:

root@docker-512mb-blr1-01:~# docker pull nextcloud
Using default tag: latest
latest: Pulling from library/nextcloud
9f0706ba7422: Pull complete
4c407763908f: Pull complete
82e2bc3a45c1: Pull complete
c84e1013aed1: Pull complete
a3b5e03d7e24: Pull complete
917f836a88be: Pull complete
b2dc54431819: Pull complete
a60b574790b8: Pull complete
49ef0f1aff88: Pull complete
7773a865ee49: Pull complete
9e0e5cc56a9d: Pull complete
bfade1c7421e: Pull complete
ece8ceb33bed: Pull complete
c691d2747a3e: Pull complete
4b5e96bf54c9: Pull complete
6fbe30ae456b: Pull complete
e0c534b35a6b: Pull complete
4d2687f4b6f3: Pull complete
00197422846a: Pull complete
6ab57168c49c: Pull complete
9e1260db005f: Pull complete
Digest: sha256:1bb5c256f19dcec60d8468c00bc7dc74efdf93390666cb82e20bcacbbbd9746c
Status: Downloaded newer image for nextcloud:latest
root@docker-512mb-blr1-01:~#

Following the documentation

I need to run this command: $ docker run -d -p 8080:80 nextcloud

It maps port 80 of the Nextcloud container to port 8080 on the server, so Nextcloud is served there.

Check it at http://<IP>:8080

In this way I easily set up different accounts for testing and integration purposes in the Phimpme Android app. It really saves a lot of time and speeds up the process.

You can easily destroy the droplet after work is done.

Students can use the free credits from the GitHub Education Pack.

Source:

 


Shrinking Model Classes Boilerplate in Open Event Android Projects Using Jackson and Lombok

JSON is the de facto standard format used for REST API communication, and to consume any such API in Android apps like the Open Event Android Client and Organizer App, we need Plain Old Java Objects, or POJOs, to map the JSON attributes to class properties. These are called models, for they model the API response or request. The basic structure of these models contains:

  • Private properties representing JSON attributes
  • Getters and Setters for these properties used to change the object or access its data
  • A toString() method which converts object state to a string, useful for logging and debugging purposes
  • An equals and hashcode method if we want to compare two objects

These can be easily and automatically generated by any modern IDE, but they add unnecessarily to the code base for a relatively simple model class and are also difficult to maintain. If you add, remove, or rename a property, you have to change its getters/setters, toString and the other standard data class methods.

There are a couple of ways to handle it:

  • Google's Auto Value: Creates immutable class builders and creators, with all the standard methods. But, as it generates a new class for the model, you need to add a library to make it work with JSON parsers and Retrofit. Secondly, there is no way to change the object attributes, as they are immutable. It is generally good practice to make your models immutable, but if you are manipulating their data in your application and saving it in your database, that won't be possible except by creating a copy of the object. Also, not all database storage libraries support it
  • Kotlin's Data Classes: Kotlin has a nice way to create models using data classes, but it has certain limitations too. Only parameters in the primary constructor are included in the generated data methods, and to create a no-argument constructor (required for certain database libraries and annotation processors), you either need to assign a default value to each property or call the primary constructor filling in all the default values, which is a boilerplate of its own. Secondly, 3rd party libraries are sometimes needed to make data classes work correctly with some JSON parsing frameworks, and you probably don't want to include Kotlin in your project just for data classes
  • Lombok: We’ll be talking about it later in this blog
  • Immutables, Xtend Lang, etc

This is one kind of boilerplate; the other kind is JSON property names. As we know, Java uses camel case notation for properties, while JSON attributes are mostly:

  • Snake Cased: property_name
  • Kebab Cased: property-name

Whether you are using GSON, Jackson or any other JSON parsing framework, it works great for unambiguous property names that are the same in both JSON and Java, but it requires special naming annotations for translating JSON attributes to camel case and vice versa. Wouldn't you want your parser to intelligently convert property_name to propertyName without having to write the tedious mapping, which is not only boilerplate but also error prone in case your API changes and you forget to update the annotations, or you make a spelling mistake, since they are just non-type-safe strings?

This boilerplate causes serious problems during development for what should be a simple Java model of a simple API response. Both kinds of boilerplate are also related to each other, as all JSON parsers look for getters and setters of private fields, so there are two layers of potential errors in mapping JSON to Java models. This should be a lot simpler than it is. So, in this blog, we'll see how we can configure our project to tolerate zero boilerplate and be error free. We reduced approximately 70% of the boilerplate using this configuration in our projects. For example, our biggest model class, Event, went from 590 lines to just 74!

We will use a simpler class for our example here:

import com.fasterxml.jackson.annotation.JsonProperty;

public class CallForPapers {

    private String announcement;
    @JsonProperty("starts-at")
    private String startsAt;
    private String privacy;
    @JsonProperty("ends-at")
    private String endsAt;

    // Getters and Setters

    @Override
    public String toString() {
        return "CallForPapers{" +
                "announcement='" + announcement + '\'' +
                ", startsAt='" + startsAt + '\'' +
                ", privacy='" + privacy + '\'' +
                ", endsAt='" + endsAt + '\'' +
                '}';
    }
}

 

Note that getters and setters have been omitted for brevity; the actual class is 57 lines long.

As you can see, we use the @JsonProperty annotation to map the starts-at attribute to the startsAt property, and similarly for endsAt. First, we’ll remove this boilerplate from our code. Removing it may seem overkill for two attributes, but imagine the time you’ll save by not having to maintain hundreds of such mappings across the whole project. A quick sketch of how such a model is consumed is shown below.
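For context, here is a minimal sketch of consuming this model with Jackson’s ObjectMapper (the class name CallForPapersDemo and the JSON payload are made up for illustration):

import com.fasterxml.jackson.databind.ObjectMapper;

public class CallForPapersDemo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // Hypothetical payload; @JsonProperty maps "starts-at" to the startsAt field
        String json = "{\"announcement\":\"CFP open\",\"starts-at\":\"2017-09-20T10:00:00\"}";

        CallForPapers callForPapers = mapper.readValue(json, CallForPapers.class);
        System.out.println(callForPapers);                             // uses the hand-written toString()
        System.out.println(mapper.writeValueAsString(callForPapers));  // writes "starts-at" back out
    }
}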

Jackson is smart enough to map different naming styles to one another when both serializing and deserializing. This is done with Jackson’s PropertyNamingStrategy. There is an option to configure it globally, but I found that it did not work in our case, so we applied it to each model. That is as simple as adding one annotation on top of your class declaration and removing the @JsonProperty annotations from your fields:

import com.fasterxml.jackson.databind.PropertyNamingStrategy;
import com.fasterxml.jackson.databind.annotation.JsonNaming;

@JsonNaming(PropertyNamingStrategy.KebabCaseStrategy.class)
public class CallForPapers {

    private String announcement;
    private String startsAt;
    private String privacy;
    private String endsAt;

    // Getters and Setters

    @Override
    public String toString() {
        return "CallForPapers{" +
                "announcement='" + announcement + '\'' +
                ", startsAt='" + startsAt + '\'' +
                ", privacy='" + privacy + '\'' +
                ", endsAt='" + endsAt + '\'' +
                '}';
    }
}

 

Our class now looks like this. But be careful to name your getters and setters properly, because Jackson now maps attributes by method name: if you name the setter for startsAt setStartsAt, it automatically understands that the attribute to map is “starts-at”. If the method name is something else, it won’t be able to map the field correctly. If your properties are not private, Jackson may use the field names themselves for mapping, so make sure any public properties are named correctly too. A sketch of the expected accessor pair follows below.
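For example, inside CallForPapers the accessors Jackson relies on for startsAt would follow the standard bean convention, roughly like this:

// Jackson derives the logical property name "startsAt" from getStartsAt()/setStartsAt(),
// and the kebab-case strategy then turns it into "starts-at".
public String getStartsAt() {
    return startsAt;
}

public void setStartsAt(String startsAt) {
    this.startsAt = startsAt;
}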

Note: If your API does not use kebab case, Jackson provides plenty of other naming strategies; one example is

  • PropertyNamingStrategy.SnakeCaseStrategy for attributes like “starts_at”

Needless to say, this will only work if your API uses a uniform naming strategy.
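For reference, the global option mentioned earlier would look roughly like this on an ObjectMapper (MapperFactory is a hypothetical helper; this did not cover our case, but it can work if a single mapper instance handles all your models):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.PropertyNamingStrategy;

public class MapperFactory {

    public static ObjectMapper kebabCaseMapper() {
        ObjectMapper mapper = new ObjectMapper();
        // One global setting instead of a @JsonNaming annotation on every model
        mapper.setPropertyNamingStrategy(PropertyNamingStrategy.KEBAB_CASE);
        return mapper;
    }
}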

Now we have removed quite a burden from the development lifecycle, but about 70% of the class is still getters, setters, toString() and other data methods. Next, we’ll configure Lombok to generate these for us automatically. First, add Lombok to the project as a provided dependency in build.gradle and sync the project:

provided 'org.projectlombok:lombok:1.16.18'
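Note: on newer Gradle and Android Gradle plugin versions the provided configuration is deprecated, so the equivalent declaration would look roughly like this (same version as above):

dependencies {
    compileOnly 'org.projectlombok:lombok:1.16.18'
    annotationProcessor 'org.projectlombok:lombok:1.16.18'
}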

 

Next, install the Lombok plugin in Android Studio by going to File > Settings > Plugins, searching for “Lombok Plugin”, installing it and restarting the IDE.

 

After the IDE restarts, navigate to your model, add the @Data annotation at the top of the class, and remove all the getters/setters, toString(), equals() and hashCode() methods. If the plugin was installed correctly and Lombok was picked up from the Gradle dependencies, these methods will be generated for you automatically at build time. One way to see the generated methods is the Structure view of the Project window.

 

Lombok offers many more tools and fine-grained control options. Our class now looks like this:

import com.fasterxml.jackson.databind.PropertyNamingStrategy;
import com.fasterxml.jackson.databind.annotation.JsonNaming;

import lombok.Data;

@Data
@JsonNaming(PropertyNamingStrategy.KebabCaseStrategy.class)
public class CallForPapers {
    private String announcement;
    private String startsAt;
    private String privacy;
    private String endsAt;
}
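
A quick usage sketch of the generated API (LombokDemo and the values are made up; the method names follow Lombok’s conventions):

public class LombokDemo {
    public static void main(String[] args) {
        CallForPapers callForPapers = new CallForPapers();
        callForPapers.setAnnouncement("CFP is open");         // setter generated by @Data
        callForPapers.setStartsAt("2017-09-20T10:00:00");     // setter generated by @Data
        System.out.println(callForPapers.getAnnouncement());  // getter generated by @Data
        System.out.println(callForPapers);                    // toString() generated by @Data
    }
}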

 

That’s down to 16 lines (including imports and the package declaration). Now, there are some corner cases to iron out for the integration between Lombok and Jackson to work correctly.

Lombok uses property names to generate its getters and setters, but it follows a different convention for booleans. For simplicity, we’ll only talk about the primitive boolean here; you can check out the links below to learn more about the Boolean wrapper type. A primitive boolean property in the standard Java style, for example hasSessions, gets the getter isHasSessions() and the setter setHasSessions(). Jackson is smart, but it expects a getter named getHasSessions(), which creates problems during serialization. Similarly, for a property named isComplete, the generated getter and setter will be isComplete() and setComplete(), creating a problem during deserialization too. There are ways to make Jackson map boolean values correctly with these getters/setters, but they require renaming the property itself, which changes the getters and setters Lombok generates. Fortunately, there is a way to tell Lombok not to generate this format of getter/setter: create a file named lombok.config in your app/ module directory and write this in it

lombok.anyConstructor.suppressConstructorProperties = true
lombok.addGeneratedAnnotation = false
lombok.getter.noIsPrefix = true

 

The other settings in it configure Lombok for an Android-specific project. A sketch of the boolean accessor naming before and after this change follows below.
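To make the boolean corner case concrete, here is a hedged sketch (hasSessions is a made-up property) of the accessors Lombok generates by default versus with lombok.getter.noIsPrefix = true, written out by hand:

public class BooleanNamingSketch {

    private boolean hasSessions;

    // Default Lombok getter for a primitive boolean field:
    public boolean isHasSessions() {
        return hasSessions;
    }

    // With lombok.getter.noIsPrefix = true, the getter uses the plain "get" prefix,
    // matching the bean-style name that Jackson and other libraries expect:
    public boolean getHasSessions() {
        return hasSessions;
    }

    // The setter is the same in both cases:
    public void setHasSessions(boolean hasSessions) {
        this.hasSessions = hasSessions;
    }
}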

There are some known issues with Lombok on Android. Since Lombok is itself an annotation processor, and there is no guaranteed order in which annotation processors run, it can clash with other annotation processors. Dagger had issues with it until they were fixed in later versions. So check whether any of your libraries depend on Lombok-generated code such as getters and setters; certain database libraries do, and so does Android Data Binding. Currently, there is no real solution: such libraries will throw an error about a missing getter/setter because they ran before Lombok. A possible workaround is to make the properties public so that these libraries access the fields directly instead of going through getters and setters. This is not good practice, but since this is a data class and you are already generating getters and setters for all fields, it is not a security concern.
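A hedged sketch of that workaround, keeping @Data so the remaining data methods are still generated:

import com.fasterxml.jackson.databind.PropertyNamingStrategy;
import com.fasterxml.jackson.databind.annotation.JsonNaming;

import lombok.Data;

@Data
@JsonNaming(PropertyNamingStrategy.KebabCaseStrategy.class)
public class CallForPapers {
    // Public fields so that libraries whose annotation processors may run
    // before Lombok can still see the properties directly
    public String announcement;
    public String startsAt;
    public String privacy;
    public String endsAt;
}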

There are tons of options for both Jackson and Lombok, with many features to help the development process, so be sure to check out these links:
