Add Autocomplete SearchView in Open Event Android App

The Open Event Android App has a map that shows the locations of all sessions, each marked with a marker. Finding a particular location on the map is difficult because the user has to tap a marker just to see its name. Adding an autocomplete SearchView improves the user experience by letting the user search locations by name and by suggesting names that match the search query. In this post I explain how to add an autocomplete SearchView to a fragment or activity.

Add search icon in actionbar

The first step is to create a menu XML file and add a search menu item to it. Then inflate this menu XML file in the fragment's onCreateOptionsMenu() method.

1. Create menu.xml file

In this file add a menu element and, inside it, a search menu item. Define the id, title, and icon of the search menu item. Set "android.support.v7.widget.SearchView" as the actionViewClass, which will be used as the action view when the user clicks the icon.

<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools">
    
   <item
        android:id="@+id/action_search"
        android:icon="@drawable/ic_search_white_24dp"
        android:title="@string/search"
        app:actionViewClass="android.support.v7.widget.SearchView"
        app:showAsAction="ifRoom | collapseActionView"/>
</menu>

2. Inflate menu.xml file in Fragment

In the fragment's onCreateOptionsMenu() method, inflate the menu XML file using MenuInflater's inflate() method. Then find the search menu item using the menu's findItem() method, passing the id of the search menu item as a parameter.

public void onCreateOptionsMenu(Menu menu, MenuInflater inflater) {
        super.onCreateOptionsMenu(menu, inflater);
        inflater.inflate(R.menu.menu_map, menu);
        MenuItem item = menu.findItem(R.id.action_search);
}

Add and initialize SearchView  

Now, after adding the search icon, we need to add SearchView and SearchAutoComplete fields to the fragment.

private SearchView searchView;
private SearchView.SearchAutoComplete   mSearchAutoComplete;

Initialize the SearchView in the onCreateOptionsMenu() method by passing the search menu item to MenuItemCompat's getActionView() method.

SearchAutoComplete is a child view of the SearchView, so initialize it using the SearchView's findViewById() method, passing its id as a parameter.

searchView = (SearchView) MenuItemCompat.getActionView(item);
mSearchAutoComplete = (SearchView.SearchAutoComplete) searchView.findViewById(android.support.v7.appcompat.R.id.search_src_text);

Define properties of SearchAutoComplete

By default, the background of the drop down menu in SearchAutoComplete is black. You can change the background using the setDropDownBackgroundResource() method. Here I'm making it white by providing a white drawable resource.

mSearchAutoComplete.setDropDownBackgroundResource(R.drawable.background_white);
mSearchAutoComplete.setDropDownAnchor(R.id.action_search);
mSearchAutoComplete.setThreshold(0);

The setDropDownAnchor() method sets the view to which the auto-complete drop down list should anchor. The setThreshold() method specifies the minimum number of characters the user has to type in the edit box before the drop down list is shown.

Create array adapter

Now it's time to create the ArrayAdapter object, which provides the data set (strings) against which the search queries are run.

ArrayAdapter<String> adapter = new ArrayAdapter<>(getActivity(), android.R.layout.simple_list_item_1, searchItems);

Here searchItems is a List of strings. Now set this adapter on the mSearchAutoComplete object using the setAdapter() method.

mSearchAutoComplete.setAdapter(adapter);
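
Putting the snippets above together, the fragment's onCreateOptionsMenu() ends up looking roughly like the sketch below (menu_map, background_white and searchItems are the names used earlier in this post; adjust them to your project).

@Override
public void onCreateOptionsMenu(Menu menu, MenuInflater inflater) {
    super.onCreateOptionsMenu(menu, inflater);
    inflater.inflate(R.menu.menu_map, menu);
    MenuItem item = menu.findItem(R.id.action_search);

    // SearchView declared as the action view of the search menu item
    searchView = (SearchView) MenuItemCompat.getActionView(item);
    mSearchAutoComplete = (SearchView.SearchAutoComplete) searchView.findViewById(android.support.v7.appcompat.R.id.search_src_text);

    // Style and configure the suggestion drop down
    mSearchAutoComplete.setDropDownBackgroundResource(R.drawable.background_white);
    mSearchAutoComplete.setDropDownAnchor(R.id.action_search);
    mSearchAutoComplete.setThreshold(0);

    // Back the suggestions with the list of location names
    ArrayAdapter<String> adapter = new ArrayAdapter<>(getActivity(), android.R.layout.simple_list_item_1, searchItems);
    mSearchAutoComplete.setAdapter(adapter);
}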

Now we are all set to run the app on a device or emulator. Here's a demo of how it will look.

Conclusion

A SearchView that offers suggestions makes for a much better user experience in the application.


Create an AutocompleteTextView dropdown for the email input in the Open Event Orga Android App

In the first version of the Open Event Organizer App, the event organizer was required to enter their full email each time they logged out of their account, which hindered the user experience. An AutoCompleteTextView backed by shared preferences is a solution to this problem. This widget provides an editable text view that shows completion suggestions automatically while the user is typing. The list of suggestions is displayed in a drop down menu, from which the user can choose an item to replace the content of the edit box. It is extremely useful in enhancing the user experience.

The solution we implemented was to create an autocomplete text view for the email input, store the user's email address on a successful login in the shared preferences as a set of strings to prevent duplicates, and display it in the dropdown on subsequent login attempts.

Implementation

Change your TextInputLayout structure to accommodate the AutoCompleteTextView. Remember to create a separate AutoCompleteTextView object bound to the specific id of the view.

<android.support.v7.widget.AppCompatAutoCompleteTextView
       android:id="@+id/email_dropdown"
       android:layout_width="match_parent"
       android:layout_height="wrap_content"
       android:hint="@string/email"
       android:inputType="textEmailAddress" />

 

Create utility methods to get/store the emails in the shared preferences. The set data structure has been used here so that there are no duplicates when storing the emails in the shared preferences.

public Set<String> getStringSet(String key, Set<String> defaultValue) {
   return sharedPreferences.getStringSet(key, defaultValue);
}

public void saveStringSet(String key, Set<String> value) {
   SharedPreferences.Editor editor = sharedPreferences.edit();
   editor.putStringSet(key, value);
   editor.apply();
}

public void addStringSetElement(String key, String value) {
   Set<String> set = getStringSet(key, new HashSet<>());
   set.add(value);
   saveStringSet(key, set);
}

 

Create helper methods to add an email and retrieve the list of emails from the shared preferences to provide it to the views.

private void saveEmail(String email) {
   utilModel.addStringSetElement(Constants.SHARED_PREFS_SAVED_EMAIL, email);
}

private Set<String> getEmailList() {
   return utilModel.getStringSet(Constants.SHARED_PREFS_SAVED_EMAIL, null);
}

 

Create an AutoCompleteTextView object in your activity using the R id of the view from the layout, and set its adapter with the set of strings retrieved from the shared preferences. You could create a custom adapter for this case too, but for the Open Event Orga App using an ArrayAdapter made sense.

autoCompleteEmail.setAdapter(new ArrayAdapter<>(this, android.R.layout.simple_list_item_1,
   new ArrayList<String>(emails)));
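
To complete the flow, the email has to be saved once login succeeds so that it shows up as a suggestion next time. A minimal sketch of that wiring is below; onLoginSuccess and refreshEmailSuggestions are hypothetical helper names, while saveEmail and getEmailList are the helpers defined above.

// Hypothetical hook invoked after a successful login: persist the email so it
// is suggested in the dropdown on subsequent login attempts
private void onLoginSuccess(String email) {
    saveEmail(email);
}

// Rebuild the suggestion list, e.g. when the login screen is shown again
private void refreshEmailSuggestions() {
    Set<String> emails = getEmailList();
    if (emails != null) {
        autoCompleteEmail.setAdapter(new ArrayAdapter<>(this, android.R.layout.simple_list_item_1, new ArrayList<>(emails)));
    }
}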

 

Conclusion

It is important that the user gets the best possible experience of the application, and the autocomplete text view for the email input serves exactly that.


Using Vector Images in SUSI Android

SUSI is an artificial intelligence for interactive chat bots. To make it more user friendly and interactive, we add a lot of images as drawable resources in the SUSI Android App (https://github.com/fossasia/susi_android). Most of these drawables are PNGs, and there are certain problems associated with the use of PNG images.

  1. PNGs cannot be scaled without losing quality, so for the same image we have to include separate PNGs at different resolutions; otherwise the image becomes blurry.
  2. PNGs tend to take up a lot of disk space, which can be reduced considerably by using vector images.
  3. PNGs have fixed color and dimensions which cannot be changed.

Due to the above shortcomings of PNG images, we decided to use vector drawables instead.

Advantages associated with Vector images

  1. They can be scaled to any size without loss of quality, so we only need to include a single image in the app rather than several at different resolutions.
  2. They are very small in size as compared to PNGs.
  3. Unlike PNGs, they can be easily modified directly in their XML file.

Using Vector Images in Android Studio

Android Studio provides tools to import vector drawables directly into the project. To import vector images go to File > New > Vector Asset in Android Studio.

From here we can choose the icon we want to include in our project and click OK. The icon will appear in the drawables directory and can be used anywhere in the project.
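
To use an imported vector asset from code on older devices as well, it is usually loaded through the support library (this typically also needs vectorDrawables.useSupportLibrary = true in the module's build.gradle). A minimal sketch, where the asset name ic_search_black_24dp and the view id image_search are assumptions for illustration:

// Load the vector asset via the support library so it also works below API 21
ImageView imageView = (ImageView) findViewById(R.id.image_search);
Drawable icon = AppCompatResources.getDrawable(this, R.drawable.ic_search_black_24dp);
imageView.setImageDrawable(icon);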

Implementation in SUSI Android

In SUSI Android we have used various vector images such as arrows, pointers and even the logo of the app. Below is the logo of SUSI.

This is actually a vector image; below is the code required to produce this logo as the output.

<vector android:height="50dp" android:viewportHeight="279.37604"
  android:viewportWidth="1365.2" android:width="220dp" xmlns:android="http://schemas.android.com/apk/res/android">

<path android:fillColor="#ffffff"
      android:pathData="M127.5,7.7c-26.8,3.3 -54.2,16.8 -75.9,37.4 -11.8,11.1 -20.4,22.9 -28.1,38.4 -8.9,17.8 -12.8,32.1 -13.7,51l-0.3,6 39,0 39,0 0.3,-4c0.7,-12.1 6.8,-24.1 17.2,-34.5 8.5,-8.4 16.2,-13.4 25.9,-16.7l6.6,-2.2 81.3,-0.1 81.2,0 0,-38 0,-38 -84.7,0.1c-46.7,0.1 -86.1,0.4 -87.8,0.6z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"
      android:pathData="M319.2,11.3l-4.3,4.3 0.3,103c0.4,113.2 0,105.9 6.4,118.6 10.8,21.3 35.1,41.9 56.2,47.3 8.5,2.3 99.1,2.2 107.7,0 18.7,-4.9 39.2,-20.7 51.5,-39.7 3.4,-5.1 7.1,-12.2 8.3,-15.8l2.2,-6.5 0.5,-103.3 0.5,-103.3 -4.5,-4.4 -4.6,-4.5 -31.5,0 -31.5,0 -4.7,4.8 -4.7,4.8 0,93 0,93 -3.3,3.2 -3.3,3.2 -29,0 -29,0 -2.6,-2.7 -2.7,-2.8 -0.7,-94.2 -0.7,-94.2 -4.3,-4 -4.2,-4.1 -31.9,0 -31.9,0 -4.2,4.3z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"
      android:pathData="M680,7.6c-31.6,4.8 -56.1,17.3 -79,40.3 -23.2,23.3 -36.3,50.5 -38.9,80.9 -0.5,5.9 -0.7,11 -0.4,11.4 0.2,0.5 17.7,0.8 38.8,0.8l38.4,0 0.6,-4.8c3.2,-23.2 21.3,-44.1 44.7,-51.3 5.6,-1.8 10.6,-1.9 86.6,-1.9l80.7,0 -0.3,-38 -0.2,-38 -84.3,0.1c-46.3,0.1 -85.3,0.3 -86.7,0.5z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"
      android:pathData="M869.1,13.4l-4.1,6.4 0,126.4 0,126.3 4.8,6.7 4.7,6.8 31.6,0 31.6,0 4.7,-7 4.6,-7 0,-125.7 0,-125.8 -4.7,-6.7 -4.8,-6.8 -32.1,0 -32.1,0 -4.2,6.4z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"
      android:pathData="M222.5,152.2c-0.2,0.7 -0.9,4.2 -1.5,7.7 -3.4,19.5 -19.4,38 -40,46.4l-5.5,2.2 -83,0.5 -83,0.5 -0.3,37.8 -0.2,37.8 89.2,-0.3 89.3,-0.3 9.6,-2.7c57.7,-16.3 100.1,-67.4 102.1,-123.3l0.3,-7 -38.3,-0.3c-30.1,-0.2 -38.3,0 -38.7,1z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff"
      android:pathData="M774.5,152.2c-0.2,0.7 -0.9,4.1 -1.5,7.5 -3.3,19.2 -18.8,37.3 -39.4,46.2l-6.1,2.6 -83,0.5 -83,0.5 -0.3,37.7 -0.2,37.8 85.9,0c93.7,0 91.4,0.1 110.1,-5.9 26.4,-8.5 53.3,-28.4 69.8,-51.7 15.2,-21.3 25.1,-50.1 24,-69.9l-0.3,-6 -37.8,-0.3c-29.7,-0.2 -37.8,0 -38.2,1z" android:strokeColor="#00000000"/>

  <path android:fillColor="#ffffff" android:pathData="m1146.99,0 l-1.38,1.19c-0.76,0.66 -1.85,1.61 -2.43,2.13 -0.58,0.51 -1.75,1.54 -2.61,2.28 -1.52,1.31 -1.58,1.41 -2.4,3.53 -0.46,1.2 -0.92,2.37 -1.01,2.59 -30.55,82.93 -61.62,165.72 -96.03,259.63 0,0.08 1.61,1.88 3.57,3.98l3.57,3.84 33.47,-0.04 33.47,-0.04c12.28,-35.6 25.13,-72.47 37.4,-107.27 0.06,-0.25 0.28,-0.64 0.5,-0.88 0.37,-0.41 0.61,-0.43 4.2,-0.43 3.63,0 3.83,0.02"/>

  <path android:fillColor="#ffffff" android:pathData="m967.09,279.18c-2.48,-3.74 -4.97,-7.04 -8.09,-11.76l0.09,-43.92c3.34,-5.26 5.31,-6.73 8.42,-11.51 17.91,0.02 34.3,0.26 50.88,0.26 3.21,4.88 4.09,6.72 7.81,12.66 -0.05,13.98 0.1,27.96 -0.12,41.94 -2.9,4.2 -4.27,7.42 -7.78,12.18 -18.81,-0.04 -35.43,0.2 -51.21,0.15z"/>

  <path android:fillColor="#ffffff"
      android:pathData="m1287.3,6.59 l-4.1,6.4 0,126.4 0,126.3 4.8,6.7 4.7,6.8 31.6,0 31.6,0 4.7,-7 4.6,-7 0,-125.7 0,-125.8 -4.7,-6.7 -4.8,-6.8 -32.1,0 -32.1,0 -4.2,6.4z" android:strokeColor="#00000000"/>


</vector>

In this code we can easily change the color and minor details of the logo, which would not have been possible if the logo were in PNG format. Also, we don't need multiple logo images of varied qualities, as the vector can be scaled without any loss of quality.


Applying Filters on Images using Native Functions in Phimpme Android

In the Phimpme application, the user can apply multiple colorful filters on images captured from the application's camera or on images already available on the device. These filters are applied using native image processing functions. We implemented many filters for enhancing the image; the implementation of a few of the filter functions is shown below.

Filters are applied to an image by modifying the color values of pixels in the Phimpme application. This is similar to the implementation of image enhancing functions in the editor of Phimpme. My post on that is available here.

Black and White filter:

The black and white filter is essentially gray scaling the image. A gray scale image has only a single color channel; if multiple channels are present, the corresponding pixel values in all channels are the same. Here in Phimpme we have an RGB image with 3 color channels, so every pixel has three values. The black and white filter can be implemented by replacing those three values with their average. The implementation of the function and the resultant image comparison are shown below.

void applyBlackAndWhiteFilter(Bitmap* bitmap) {
  register unsigned int i;
  unsigned int length = (*bitmap).width * (*bitmap).height;
  register unsigned char grey;
  unsigned char* red = (*bitmap).red;
  unsigned char* green = (*bitmap).green;
  unsigned char* blue = (*bitmap).blue;
  for (i = length; i--;) {
     grey = (red[i] + green[i] + blue[i]) / 3;
     red[i] = truncate((int) grey);
     green[i] = truncate((int) grey);
     blue[i] = truncate((int) grey);
  }
}


Ansel Filter

The Ansel filter is a monotone filter in Phimpme similar to black and white, but with slightly higher contrast, which gives the image an artistic look. This is achieved in Phimpme by hard-light blending the gray pixel components of the image; the rest is the same as the black and white filter. The implementation of the hard light blending and the Ansel function is shown below with the resultant images.

static unsigned char hardLightLayerPixelComponents(unsigned char maskComponent, unsigned char imageComponent) {

  return (maskComponent > 128) ? 255 - (( (255 - (2 * (maskComponent-128)) ) * (255-imageComponent) )/256) : (2*maskComponent*imageComponent)/256;
}

void applyAnselFilter(Bitmap* bitmap) {
/*initializations*/
  unsigned char br,bg,bb;
  for (i = length; i--; ) {
       grey = (red[i] + green[i] + blue[i]) / 3;
       int eff = hardLightLayerPixelComponents(grey, grey);
       red[i] = truncate(eff);
       green[i] = truncate(eff);
       blue[i] = truncate(eff);
  }
}


Sepia Filter

The sepia filter in Phimpme results in a monotone image with an orangish-yellow tone. Its implementation uses pre-defined look-up tables (LUTs) for all three channels. The luminosity of a particular pixel is computed, and then the red, green and blue values are read from the look-up tables corresponding to that luminosity. The look-up table arrays we used for the sepia effect in Phimpme and the implementation are given below.

const unsigned char sepiaRedLut[256] = {24, 24, 25, 26, 27, 28, 29, 30, 30, 30, 31, 32, 33, 34, 35, 36, 37, 37, 38, 38, 39, 40, 41, 42, 43, 43, 44, 45, 46, 47, 47, 48, 49, 50, 50, 51, 52, 53, 54, 55, 56, 57, 57, 58, 58, 59, 60, 61, 62, 63, 64, 64, 65, 66, 67, 68, 69, 70, 71, 71, 72, 72, 73, 74, 75, 76, 77, 78, 78, 79, 80, 81, 82, 83, 84, 85, 85, 86, 87, 88, 89, 89, 90, 91, 92, 93, 93, 94, 95, 96, 97, 97, 98, 99, 100, 101, 102, 102, 103, 104, 105, 106, 107, 108, 109, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 146, 147, 148, 149, 150, 151, 152, 153, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 178, 180, 181, 182, 183, 184, 185, 186, 186, 187, 188, 189, 190, 191, 193, 194, 195, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 255};

const unsigned char sepiaGreenLut[256] = {16, 16, 16, 17, 18, 18, 19, 20, 20, 20, 21, 22, 22, 23, 24, 24, 25, 25, 26, 26, 27, 28, 28, 29, 30, 30, 31, 31, 32, 33, 33, 34, 35, 36, 36, 36, 37, 38, 39, 39, 40, 41, 42, 43, 43, 44, 45, 46, 47, 47, 48, 48, 49, 50, 51, 51, 52, 53, 54, 54, 55, 55, 56, 57, 58, 59, 60, 61, 61, 61, 62, 63, 64, 65, 66, 67, 67, 68, 68, 69, 70, 72, 73, 74, 75, 75, 76, 77, 78, 78, 79, 80, 81, 81, 82, 83, 84, 85, 86, 87, 88, 90, 90, 91, 92, 93, 94, 95, 96, 97, 97, 98, 99, 100, 101, 103, 104, 105, 106, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 122, 123, 123, 124, 125, 127, 128, 129, 130, 131, 132, 132, 134, 135, 136, 137, 138, 139, 141, 141, 142, 144, 145, 146, 147, 148, 149, 150, 151, 152, 154, 155, 156, 157, 158, 160, 160, 161, 162, 163, 165, 166, 167, 168, 169, 170, 171, 173, 174, 175, 176, 177, 178, 179, 180, 182, 183, 184, 185, 187, 188, 189, 189, 191, 192, 193, 194, 196, 197, 198, 198, 200, 201, 202, 203, 205, 206, 207, 208, 209, 210, 211, 212, 213, 215, 216, 217, 218, 219, 220, 221, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 255};

const unsigned char sepiaBlueLut[256] = {5, 5, 5, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9, 9, 10, 10, 11, 11, 11, 11, 12, 12, 13, 13, 14, 14, 14, 14, 15, 15, 16, 16, 17, 17, 17, 18, 18, 19, 20, 20, 21, 21, 21, 22, 22, 23, 23, 24, 25, 25, 26, 27, 28, 28, 29, 29, 30, 31, 31, 31, 32, 33, 33, 34, 35, 36, 37, 38, 38, 39, 39, 40, 41, 42, 43, 43, 44, 45, 46, 47, 47, 48, 49, 50, 51, 52, 53, 53, 54, 55, 56, 57, 58, 59, 60, 60, 61, 62, 63, 65, 66, 67, 67, 68, 69, 70, 72, 73, 74, 75, 75, 76, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 90, 91, 92, 93, 93, 95, 97, 98, 99, 100, 101, 102, 104, 104, 106, 107, 108, 109, 111, 112, 114, 115, 115, 117, 118, 120, 121, 122, 123, 124, 125, 127, 128, 129, 131, 132, 133, 135, 136, 137, 138, 139, 141, 142, 144, 145, 147, 147, 149, 150, 151, 153, 154, 156, 157, 159, 159, 161, 162, 164, 165, 167, 168, 169, 170, 172, 173, 174, 176, 177, 178, 180, 181, 182, 184, 185, 186, 188, 189, 191, 192, 193, 194, 196, 197, 198, 200, 201, 203, 204, 205, 206, 207, 209, 210, 211, 213, 214, 215, 216, 218, 219, 220, 221, 223, 224, 225, 226, 227, 229, 230, 231, 232, 234, 235, 236, 237, 238, 239, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 255};
void applySepia(Bitmap* bitmap){
/*bitmap initializations*/
for (i = length; i--; ) {
  register float r = (float) red[i] / 255;
  register float g = (float) green[i] / 255;
  register float b = (float) blue[i] / 255;
  register float luminosity =  (0.21f * r + 0.72f * g + 0.07 * b) * 255;
      red[i] = truncate((int)( sepiaRedLut[(int)luminosity]));
      green[i] = truncate((int)(sepiaGreenLut[(int)luminosity]));
      blue[i] = truncate((int)(sepiaBlueLut[(int)luminosity]));
  }
}


Cyano Filter

As the name suggests, this filter adds a cyan tone to the image. To implement the cyano filter, we first find the gray value of the pixel and then a ceiling-clamped value for each of the three color channels. The clamped channel values and the gray values are then overlaid to give the resultant image. The componentCeiling macro, the overlay blending and the filter implementation are shown below.

#define componentCeiling(x) ((x > 255) ? 255 : x)

static unsigned char overlayPixelComponents(unsigned int overlayComponent, unsigned char underlayComponent, float alpha) {
  float underlay = underlayComponent * alpha;
  return (unsigned char)((underlay / 255) * (underlay + ((2.0f * overlayComponent) / 255) * (255 - underlay)));

}

void applyCyano(Bitmap* bitmapl) {
  //Cache to local variables
//Bitamp initialization
  register unsigned int i;
  register unsigned char grey, r, g, b;
  for (i = length; i--;) {
     grey = ((red[i] * 0.222f) + (green[i] * 0.222f) + (blue[i] * 0.222f));
     r = componentCeiling(61.0f + grey);
     g = componentCeiling(87.0f + grey);
     b = componentCeiling(136.0f + grey);
     grey = (red[i] + green[i] + blue[i]) / 3;
     red[i] = truncate((int)(overlayPixelComponents(grey, r, 0.9f)));
     green[i] = truncate((int)(overlayPixelComponents(grey, g, 0.9f)));
     blue[i] = truncate((int)(overlayPixelComponents(grey, b, 0.9f)));
  }
}


Grain Filter

It is clear from the name that this filter adds grain to the image, giving an artistic effect. It can be implemented very simply by assigning gray values to random pixels of the image. The condition inside the main for loop of the implementation below controls the proportion of added grain with respect to the whole image. To generate random values, the random number generator has to be seeded first, here using the current time. The whole implementation of the function is shown below.

void applyGrain(Bitmap* bitmap) {
/*initializations*/
   time_t t;
   srand((unsigned) time(&t));
  for (i = length; i--;) {
       int rval = rand()%255;
       if (rand() % 100 < 15) {
           int grey = (red[i] + green[i] + blue[i]) / 3;
           red[i] = truncate(rval);
           green[i] = truncate(rval);
           blue[i] = truncate(rval);
       }
  }
}


Threshold Filter

Thresholding an image gives a binary image, i.e. the pixels of the image have only two values: one for values less than the threshold and another for values greater than the threshold. The threshold value is adjusted by a seek bar in Phimpme, and the image looks very artistic at a particular position of the seek bar. The implementation is shown below.

void applyThreshold(Bitmap* bitmap, int val) {
/*bitmap initializations*/
  unsigned char grey, color;
  int thres = 220 - (int)((val/100.0) * 190);
  for (i = length; i--;) {
       grey = (red[i] + green[i] + blue[i]) / 3;
       if (grey < thres) color = 0;
       else color = 255;
       red[i] = truncate((int)(color));
       green[i] = truncate((int)(color));
       blue[i] = truncate((int)(color));
  }
}



Addition of Bookmarks to the Homescreen in the Open Event Android App

In the Open Event Android app we had already built the new homescreen, but users only had access to bookmarks on a separate page reached from the navbar. Incorporating the bookmarks section into the homescreen itself would make bookmarks much easier to access. In this blog post, I'll be talking about how this was done in the app.

These 2 images show the homescreen and the bookmarks section respectively.

No Bookmark View
Bookmark View

This was the proposed homescreen page for the app. It provides easy access to important information such as the event venue, date and description. The same homescreen also shows the bookmarks at the top if there are any.

The list of bookmarks in the first iteration of design was modeled to be a horizontal list of cards.

Bookmarks Merging Process

These are some variables for reference.

private SessionsListAdapter sessionsListAdapter;
 private RealmResults<Session> bookmarksResult;
 private List<Session> mSessions = new ArrayList<>();

The code snippet below highlights the initial setup of the bookmarks recycler view for the horizontal list of cards. All of this is done in the onCreateView callback of AboutFragment.java, which is the fragment for the homescreen.

bookmarksRecyclerView.setVisibility(View.VISIBLE);
 sessionsListAdapter = new SessionsListAdapter(getContext(), mSessions, bookmarkedSessionList);
 sessionsListAdapter.setBookmarkView(true);
 bookmarksRecyclerView.setAdapter(sessionsListAdapter);
 bookmarksRecyclerView.setLayoutManager(new LinearLayoutManager(getContext(),LinearLayoutManager.HORIZONTAL,false));

The SessionsListAdapter is an adapter that was built to handle multiple types of display with the same viewholder, i.e. SessionViewHolder. The adapter is passed a flag through setBookmarkView(true), which simply tells it to switch to bookmarks mode.

private void loadData() {
    bookmarksResult = realmRepo.getBookMarkedSessions();
    bookmarksResult.removeAllChangeListeners();
    bookmarksResult.addChangeListener((bookmarked, orderedCollectionChangeSet) -> {
        mSessions.clear();
        mSessions.addAll(bookmarked);
 
        sessionsListAdapter.notifyDataSetChanged();
 
        handleVisibility();
    });
 }

The loadData() function is responsible for extracting the bookmarked sessions from the local Realm database. We then update the bookmarks adapter on the homescreen with the list of bookmarks obtained. Note that a ChangeListener is attached to the RealmResults: this is done so that the adapter is only notified after the bookmarked session data has been processed on a background thread.
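
The handleVisibility() call at the end of the listener is a small helper; a plausible sketch of it (illustrative, not the exact project code) simply hides the bookmarks section when there is nothing to show.

private void handleVisibility() {
    // Hide the bookmarks section of the homescreen when no session is bookmarked
    if (mSessions.isEmpty()) {
        bookmarksRecyclerView.setVisibility(View.GONE);
    } else {
        bookmarksRecyclerView.setVisibility(View.VISIBLE);
    }
}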

if(bookmarksResult != null)
    bookmarksResult.removeAllChangeListeners();

And it is good practice to remove any ChangeListeners that we attach during the fragment life cycle in the onStop() method to avoid memory leaks.
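
In the fragment this typically ends up as a minimal onStop() override, roughly like the sketch below.

@Override
public void onStop() {
    super.onStop();
    // Detach the Realm change listener so the fragment is not leaked
    if (bookmarksResult != null)
        bookmarksResult.removeAllChangeListeners();
}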

So now we have successfully added bookmarks to the homescreen.


Handling graph plots using MPAndroid chart in PSLab Android App

In the PSLab Android App, we expose the Oscilloscope and Logic Analyzer functionality of the PSLab hardware device. After reading the data points to plot, we need to show them on graphs for better understanding and visualisation, and sometimes we need to save graphs to record the output of an experiment. Hence we use the MPAndroidChart library to plot and save graphs, as it provides nice and clean methods to do so.

First, add MPAndroidChart as a dependency in your app's build.gradle to include the library:

dependencies {
 compile 'com.github.PhilJay:MPAndroidChart:v3.0.2'
}

For the chart view in your layout file there are many options available, such as Line Chart, Bar Chart, Pie Chart, etc. For this post I am going to use a Line Chart.

Add LineChart in your layout file

<com.github.mikephil.charting.charts.LineChart
        android:id="@+id/lineChart"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

Get a reference to the LineChart of the layout file in your Activity/Fragment:

LineChart lineChart = (LineChart) findViewById(R.id.lineChart);      // Activity
LineChart lineChart = (LineChart) view.findViewById(R.id.lineChart); // Fragment

Now we add a dataset to the LineChart of the layout file. I am going to add data for two curves, the sine and cosine functions, to plot sine and cosine waves on the LineChart. We create two different LineDataSet objects, one for the sine curve entries and the other for the cosine curve entries.

List <Entry> sinEntries = new ArrayList<>(); // List to store data-points of sine curve 
List <Entry> cosEntries = new ArrayList<>(); // List to store data-points of cosine curve

// Obtaining data points by using Math.sin and Math.cos functions
for (float i = 0; i < 7f; i += 0.02f) {
    sinEntries.add(new Entry(i, (float) Math.sin(i)));
    cosEntries.add(new Entry(i, (float) Math.cos(i)));
}

List<ILineDataSet> dataSets = new ArrayList<>(); // for adding multiple plots

LineDataSet sinSet = new LineDataSet(sinEntries,"sin curve");
LineDataSet cosSet = new LineDataSet(cosEntries,"cos curve");

// Adding colors to different plots 
cosSet.setColor(Color.GREEN);
cosSet.setCircleColor(Color.GREEN);
sinSet.setColor(Color.BLUE);
sinSet.setCircleColor(Color.BLUE);

// Adding each plot data to a List
dataSets.add(sinSet);
dataSets.add(cosSet);

// Setting datapoints and invalidating chart to update with data points
lineChart.setData(new LineData(dataSets));
lineChart.invalidate();

After adding the datasets to the chart and invalidating it, the chart is refreshed with the data points that were added to the datasets.

After plotting, the graph output would look like the image below:

You can change the dataset and invalidate chart to update it with latest dataset.

To save the graph plot, make sure you have permission to write to external storage; if not, add it to your manifest file:

<manifest ...>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    ...
</manifest>

To save an image of the chart to the Gallery:

lineChart.saveToGallery("title");

To save to some specific location:

lineChart.saveToPath("title", "Location on SD Card");

If you want to do some resizing of the chart, or save two or three charts in a single image, you can do so by taking out the Bitmaps and processing them to meet your requirements:

lineChart.getChartBitmap();
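
For example, two charts can be stitched into a single image by drawing their bitmaps onto a Canvas and writing the result out manually. A rough sketch (chart and file names here are illustrative, and error handling is kept minimal):

// Stitch two chart bitmaps vertically into one image
Bitmap first = lineChart1.getChartBitmap();
Bitmap second = lineChart2.getChartBitmap();
Bitmap combined = Bitmap.createBitmap(Math.max(first.getWidth(), second.getWidth()),
        first.getHeight() + second.getHeight(), Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(combined);
canvas.drawBitmap(first, 0, 0, null);
canvas.drawBitmap(second, 0, first.getHeight(), null);

// Save the combined image (WRITE_EXTERNAL_STORAGE permission is required)
File file = new File(Environment.getExternalStorageDirectory(), "combined_charts.png");
try (FileOutputStream out = new FileOutputStream(file)) {
    combined.compress(Bitmap.CompressFormat.PNG, 100, out);
} catch (IOException e) {
    e.printStackTrace();
}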


Packing and Unpacking Data in PSLab Android App

In PSLab we communicate with the PSLab hardware device to exchange data, i.e. we give a command or instruction and it responds accordingly. This giving and receiving happens in terms of packed byte strings, so we need solid knowledge of how to pack and unpack data. In the Python communication library the struct module is available for this; in Java we can use NIO's ByteBuffer or implement our own functions. In this blog post I discuss both methods.

In Python we have the struct module for packing data into byte strings. Different languages interpret data types differently; for example, Java uses 4 bytes for an int, while in C++ the size of an int depends on the platform. To send and receive data properly, we pack the data into a byte string and unpack it on the other side according to its data type properties. In PSLab, we have to communicate with the device for various purposes, such as getting calibration data at power-up, since raw data doesn't make much sense until calibration is applied to it.

You also need to take care of the byte order: there are generally two orders in which a sequence of bytes can be stored in memory:

  • Big-Endian: the MSB (most significant byte) is stored first.
  • Little-Endian: the LSB (least significant byte) is stored first.

In Python

The standard sizes and format characters for each data type are listed in the table below.

Format   C Type               Python Type           Standard size
x        pad byte             no value
c        char                 string of length 1    1
b        signed char          integer               1
B        unsigned char        integer               1
?        _Bool                bool                  1
h        short                integer               2
H        unsigned short       integer               2
i        int                  integer               4
I        unsigned int         integer               4
l        long                 integer               4
L        unsigned long        integer               4
q        long long            integer               8
Q        unsigned long long   integer               8
f        float                float                 4
d        double               float                 8
s        char[]               string
p        char[]               string
P        void*                integer

Source: Python Docs

For Packing data

import struct
struct.Struct("B").pack(254)   # Output -> b'\xfe'
a = struct.Struct("I").pack(2544)   # Output -> b'\xf0\t\x00\x00'

Now a is the byte string holding the packed value 2544. It can be sent to the device byte by byte and reconstructed on the receiving side, provided we know how many bytes the received data type contains.

For Unpacking data

import struct
struct.unpack("I", a)  # Output -> (2544,)

In JAVA

For Packing data

Suppose you have to pack an integer; in Java an int takes 32 bits (4 bytes).

Using Java NIO's ByteBuffer:

byte[] bytes = ByteBuffer.allocate(4).putInt(2544).array();

If you want a more hands-on method that shows exactly what is happening, use:

byte[] intToByteArray(int value){
 return new byte[]{
     (byte) (value >>> 24),   // shift first, then truncate to the low byte
     (byte) (value >>> 16),
     (byte) (value >>> 8),
     (byte) value
  };
}

">>>" is used for unsigned shifting; use whichever shift suits your requirements.

After you have your byte array, you can easily create a string out of it and transmit.

For Unpacking data

Using Java NIO's ByteBuffer:

int fromByteArray(byte[] bytes){
int a = ByteBuffer.wrap(bytes).getInt();
return a;
}

This assumes that the byte array is stored as big-endian; if the bytes in the byte array are stored as little-endian, add order() after wrap():

int fromByteArray(byte[] bytes){
int a = ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).getInt();
return a;
}

Note: make sure the byte array you provide has the same number of bytes as the data type you are trying to unpack. For example, if you want an int, the byte array should have 4 bytes, since the int type in Java has 4 bytes; if you want a short, the byte array should have 2 bytes, since the short type in Java has 2 bytes.

To visualise the underlying implementation, see:

int fromByteArray(byte[] bytes){
// mask each byte with 0xFF to avoid sign extension before shifting into place
return (bytes[0] & 0xFF) << 24 | (bytes[1] & 0xFF) << 16 | (bytes[2] & 0xFF) << 8 | (bytes[3] & 0xFF);
}

In all the above implementations big-endian order was assumed; you can modify the functions if you are using little-endian or some other sequence.
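
For example, a quick round-trip sanity check with ByteBuffer and an explicit little-endian order looks like the standalone sketch below (not PSLab project code); the same byte order must be used on both the packing and unpacking side.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PackingDemo {
    public static void main(String[] args) {
        // Pack an int into 4 bytes using little-endian order
        byte[] packed = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN).putInt(2544).array();

        // Unpack it again with the same byte order
        int unpacked = ByteBuffer.wrap(packed).order(ByteOrder.LITTLE_ENDIAN).getInt();
        System.out.println(unpacked);   // prints 2544
    }
}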


Better Bookmark Display Viewholder in Home Screen of the Open Event Android App

Earlier in the Open Event Android app we had built the homescreen with bookmarks showing up at the top as a horizontal list of cards, but it wasn't very user-friendly in terms of UI. Imagine that a user bookmarks 20-30 sessions: to reach a particular one they might have to scroll horizontally quite a lot, so this kind of UI was deemed counter-intuitive. A new UI was proposed using the viewholder from the schedule page, i.e. DayScheduleViewHolder, where the list is vertical instead of horizontal. An added bonus is that this viewholder conveys the same amount of information in less white space than the earlier viewholder, i.e. SessionViewHolder.

Old Design
New Design

Above are two images, the first showing the initial design and the second the new design. In the earlier design at most 1 or 2 bookmarks were visible to the user at a time, but with the UI upgrade a user can easily see 5-6 bookmarks at once. Additionally, more relevant content is visible to the user at the same time.

This form of design also adheres to Google's Material Design guidelines.

Code Comparison of the two Iterations

Initial Design
sessionsListAdapter = new SessionsListAdapter(getContext(), mSessions, bookmarkedSessionList);
 sessionsListAdapter.setBookmarkView(true);
 bookmarksRecyclerView.setAdapter(sessionsListAdapter);
 bookmarksRecyclerView.setLayoutManager(new LinearLayoutManager(getContext(),LinearLayoutManager.HORIZONTAL,false));

Here we are using the SessionsListAdapter for the bookmarks. It was previously used to display the list of sessions inside the track and location pages, and it is reused here to display the horizontal list of bookmarks. To switch it into this mode we call setBookmarkView(). Here mSessions contains the list of bookmarks that appear on the homescreen.

Current Design
bookmarksRecyclerView.setVisibility(View.VISIBLE);
 bookMarksListAdapter = new DayScheduleAdapter(mSessions,getContext());
 bookmarksRecyclerView.setAdapter(bookMarksListAdapter);
 bookmarksRecyclerView.setLayoutManager(new LinearLayoutManager(getContext()));

Now we are using the DayScheduleAdapter, which is the same adapter used in the schedule page of the app, with a vertical layout instead of a horizontal one and a new viewholder design. The scrolling behaviour of this RecyclerView is discussed a little further below.

ViewHolder Design (Current Design)

<RelativeLayout android:id="@+id/content_frame">
    <LinearLayout android:id="@+id/ll_sessionDetails">
        <TextView android:id="@+id/slot_title"/>
        <RelativeLayout>
            <TextView android:id="@+id/slot_start_time"/>
            <TextView android:id="@+id/slot_underscore"/>
            <TextView android:id="@+id/slot_end_time"/>
            <TextView android:id="@+id/slot_comma"/>
            <TextView android:id="@+id/slot_location"/>
            <Button android:id="@+id/slot_track"/>
            <ImageButton android:id="@+id/slot_bookmark"/>
        </RelativeLayout>
    </LinearLayout>
    <View android:id="@+id/divider"/>
</RelativeLayout>

This layout file is descriptive enough to highlight each element’s location.

From this viewholder we can also jump to the track page via the track tag and can remove bookmarks instantly.

The official widget for making a scroll layout in Android is ScrollView, but adding a RecyclerView inside a ScrollView can be difficult: the scrolling becomes laggy and weird. Fortunately, with the appearance of Material Design, NestedScrollView was released, and this makes things much easier.

bookmarksRecyclerView.setNestedScrollingEnabled(false);

With this small snippet of code we are able to insert a RecyclerView inside the NestedScrollView without any scroll lag.
So now we have successfully updated the UI/UX of the homescreen to meet the requirements given by the Material Design guidelines.


Reducing UI lags with AsyncTask in PSLab Android

In the Oscilloscope Activity, communication with the PSLab device runs in parallel with updates to the graph, which would make the UI inoperable if both were performed on the main thread and severely degrade the user experience. To avoid this, we simply used AsyncTask. AsyncTasks perform the communication with the device on a background thread and update the UI when the background work completes. AsyncTask thus avoids a laggy UI while performing time-consuming work, and the UI remains responsive throughout.

More about AsyncTask

AsyncTask is an abstract Android class which helps applications handle the main UI thread more efficiently. The AsyncTask class allows performing long-lasting background operations and publishing the results on the UI thread without blocking it.

Implementing AsyncTask in Android applications

  • Create a new class inside Activity class and extend AsyncTask:

private class Task extends AsyncTask<Void, Void, Void> {
    @Override
    protected Void doInBackground(Void... params) {
        // long-running work goes here (background thread)
        return null;
    }
    @Override
    protected void onProgressUpdate(Void... values) {
        // optional progress updates (UI thread)
    }
    @Override
    protected void onPostExecute(Void aVoid) {
        // update the UI with the result (UI thread)
    }
}
  • Execute the task:

new Task().execute();

How they are used in Oscilloscope Activity?

The following diagram explains how AsyncTasks are used in Oscilloscope Activity. 

AsyncTask in PSLab Android App

A public class extending AsyncTask is defined; this task is executed from another thread.

public class Task extends AsyncTask<String, Void, Void> {
   ArrayList<Entry> entries;
   String analogInput;

 

The doInBackground method performs the part related to communication with the PSLab device.

Here we capture the data from the hardware using the captureTraces and fetchTrace methods.

 @Override
   protected Void doInBackground(String... params) {
       try {
           analogInput = params[0];
           //no. of samples and timegap still need to be determined
           scienceLab.captureTraces(1, 800, 10, analogInput, false, null);
           Log.v("Sleep Time", "" + (800 * 10 * 1e-3));
           Thread.sleep((long) (800 * 10 * 1e-3));
           HashMap<String, double[]> data = scienceLab.fetchTrace(1); //fetching data
           double[] xData = data.get("x");
           double[] yData = data.get("y");
           entries = new ArrayList<Entry>();
           for (int i = 0; i < xData.length; i++) {
               entries.add(new Entry((float) xData[i], (float) yData[i]));
           }
       }
       catch (NullPointerException e){
           cancel(true);
       } catch (InterruptedException e) {
           e.printStackTrace();
       }
       return null;
   }

 

After the background work has completely executed, onPostExecute is called to update the UI/graph. This method runs on the main thread.

 @Override
   protected void onPostExecute(Void aVoid) {
       super.onPostExecute(aVoid);
       LineDataSet dataset = new LineDataSet(entries, analogInput);
       LineData lineData = new LineData(dataset);
       dataset.setDrawCircles(false);
       mChart.setData(lineData);
       mChart.invalidate();    //refresh the chart
       synchronized (lock){
           lock.notify();
       }
   }
}
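
The task itself is kicked off from a separate thread that waits on the same lock object notified in onPostExecute(), so a new capture only starts after the previous plot has been drawn. A simplified sketch of such a loop is below; the isRunning flag and the channel name "CH1" are assumptions for illustration.

new Thread(new Runnable() {
    @Override
    public void run() {
        while (isRunning) {
            synchronized (lock) {
                // Start a capture; onPostExecute() will call lock.notify() when done
                new Task().execute("CH1");
                try {
                    lock.wait();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        }
    }
}).start();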

 

This simply solves the problem of lags and the Oscilloscope works like a charm.


Sorting Events in Open Event Organizer Android App

While working on the Open Event Organizer project, we had to display events in a single list in a custom order with proper subheadings. Initially we were thinking of using a tabbed activity and showing events in their respective tabs, but tabs require nesting fragments, each with its own adapter. Also, we use the Model View Presenter pattern in the project, which is another reason we did not use a view pager: it would increase the number of presenter and view classes for the same feature. So we decided to display events in a single list instead. The custom order decided was that events would be divided into three categories: live, upcoming and past. Within each category, a more recent event appears on top.

Adding SubHeadings support to the Recycler View

So the first thing was adding subheading support to the recycler view. We have used timehop's sticky headers decorator library for the subheadings implementation. First, your adapter should implement the StickyRecyclerHeadersAdapter interface provided by the library. In our case the implemented methods look like this:

@Override
public long getHeaderId(int position) {
  Event event = events.get(position);
  return DateService.getEventStatus(event).hashCode();
}

@Override
public EventsHeaderViewHolder onCreateHeaderViewHolder(ViewGroup viewGroup) {
  return new EventsHeaderViewHolder(EventSubheaderLayoutBinding.inflate(LayoutInflater.from(viewGroup.getContext()), viewGroup, false));
}

@Override
public void onBindHeaderViewHolder(EventsHeaderViewHolder holder, int position) {
  Event event = events.get(position);
  holder.bindHeader(DateService.getEventStatus(event));
}

@Override
public int getItemCount() {
  return events.size();
}

 

The first one is getHeaderId, which returns a unique id for a group of items that should appear under a single subheading. In this case, DateService.getEventStatus returns the status of an event (live, past or upcoming), and its hash code is returned as the unique id for that header. onCreateHeaderViewHolder is analogous to onCreateViewHolder of your adapter: return your header view here. Similarly, in onBindHeaderViewHolder, bind data to the header. getItemCount returns the total number of items.
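
With the adapter implementing StickyRecyclerHeadersAdapter, the header decoration provided by the library is then attached to the RecyclerView, roughly as below (the variable names are illustrative).

// Attach the sticky headers decoration from the timehop library
StickyRecyclerHeadersDecoration headersDecoration = new StickyRecyclerHeadersDecoration(eventsAdapter);
recyclerView.setAdapter(eventsAdapter);
recyclerView.addItemDecoration(headersDecoration);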

Sorting Events

The important thing was sorting events in the decided order. We implemented the Comparable interface on the Event model, which compares any two events using our custom rules so that after sorting we get events in the order live, upcoming and past, with the most recent one at the top of each category. The compareTo method of the Event model looks like this:

public int compareTo(@NonNull Event otherEvent) {
  Date now = new Date();
  try {
     Date startDate = DateUtils.getDate(getStartTime());
     Date endDate = DateUtils.getDate(getEndTime());
     Date otherStartDate = DateUtils.getDate(otherEvent.getStartTime());
     Date otherEndDate = DateUtils.getDate(otherEvent.getEndTime());
     if (endDate.before(now) || otherEndDate.before(now)) {
         // one of them is past and other can be past or live or upcoming
         return endDate.after(otherEndDate) ? -1 : 1;
     } else {
         if (startDate.after(now) || otherStartDate.after(now)) {
             // one of them is upcoming other can be upcoming or live
             return startDate.before(otherStartDate) ? -1 : 1;
         } else {
             // both are live
             return startDate.after(otherStartDate) ? -1 : 1;
         }
     }
   } catch (ParseException e) {
      e.printStackTrace();
   }
  return 1;
}

 

The compareTo method returns a positive integer for greater than, a negative integer for less than and 0 if equal, and we implemented it accordingly. In the first case, we check whether one of the events is past by comparing end dates with now; the other event can then be past, live or upcoming. In all of these situations an event should come on top if its end date is after the other's end date. Only pairs of live and upcoming events reach the next case. There we check whether one of them is upcoming, so the other can be either upcoming or live; in both situations the event whose start date is earlier should be on top, so comparing start dates does the trick. In the last case we are left with two live events, and here the event with the later start date should be on top, so we simply check whether its start date is after the other's.

Using this method, events are sorted and supplied to the adapter, which implements StickyRecyclerHeadersAdapter. Hence the list displays events in the Live, Upcoming and Past categories as expected, with the respective section headers, and within each category a more recent event comes on top.
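
Since Event implements Comparable, the sorting itself is a one-liner before the list is handed to the adapter.

// Sort using Event.compareTo(): live, then upcoming, then past,
// with the more recent event on top within each category
Collections.sort(events);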

Links:
Sticky headers decorator library- https://github.com/timehop/sticky-headers-recyclerview
