Using Sensors with PSLab Android App

The PSLab Android App currently supports quite a few sensors. Sensors are an essential part of many science experiments, so PSLab has a feature to support plug & play sensors. The sensors supported by PSLab are listed below.

  • AD7718 – 24-bit 10-channel Low voltage Low power Sigma Delta ADC
  • AD9833 – Low Power Programmable Waveform generator
  • ADS1115 – Low Power 16 bit ADC
  • BH1750 – Light Intensity sensor
  • BMP180 – Digital Pressure Sensor
  • HMC5883L – 3-axis digital magnetometer
  • MF522 – RFID Reader
  • MLX90614 – Infrared thermometer
  • MPU6050 – Accelerometer & gyroscope
  • MPU925x – Accelerometer & gyroscope
  • SHT21 – Humidity sensor
  • SSD1306 – OLED display controller
  • Sx1276 – Low Power Long range Transceiver
  • TSL2561 – Digital Luminosity Sensor

All the sensors except the Sx1276 communicate using the I2C protocol, whereas the Sx1276 uses the SPI protocol. There is a dedicated set of ports on the PSLab board for this communication under the label I2C, with the ports named 3.3V, GND, SCL & SDA.

Fig: PSLab board sketch

Any I2C sensor has at least the ports 3.3V/VCC, GND, SCL and SDA; some sensors have additional ports. The connections are as follows:

  1. 3.3V on PSLab – 3.3V/VCC on sensor
  2. GND on PSLab – GND on sensor
  3. SCL on PSLab – SCL on sensor
  4. SDA on PSLab – SDA on sensor


For using the sensors with the Android App, there is a dedicated I2C library written in Java for the communication. Each sensor has its own specific set of functionalities and therefore its own library file. However, all these sensors share some common features: for example, each one has a getRaw method which fetches the raw sensor data. To get data from a sensor, the sensor is first connected to the PSLab board.

The following piece of code is responsible for detecting any devices connected to the PSLab board through the I2C bus. Each sensor has its own unique address and can be identified using it. The scan function returns the addresses of all connected sensors, and the sensors can be uniquely identified using those addresses.

public ArrayList<Integer> scan(Integer frequency) throws IOException {
	if (frequency == null) frequency = 100000;  // default to a 100 kHz bus clock
	config(frequency);
	ArrayList<Integer> addresses = new ArrayList<>();
	for (int i = 0; i < 128; i++) {             // probe the whole 7-bit I2C address space
		int x = start(i, 0);
		if ((x & 1) == 0) {                 // LSB cleared: a device acknowledged this address
			addresses.add(i);
		}
		stop();
	}
	return addresses;
}
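As a usage sketch (hedged: the i2c handle follows the ADS1115 example later in this post, and 0x48 is the ADS1115's common default I2C address):

// Minimal usage sketch: detect connected sensors and match a known address.
ArrayList<Integer> detected = i2c.scan(null);   // null -> default 100 kHz clock
if (detected.contains(0x48)) {                  // 0x48: common default ADS1115 address
    // the matching sensor library can now be instantiated, e.g. new ADS1115(i2c)
}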

 

Based on the addresses fetched, the library corresponding to each detected sensor can be imported and its getRaw method called to return the raw sensor data. For example, here is the getRaw method of the ADS1115.

public int[] getRaw() throws IOException, InterruptedException {
	String chan = typeSelection.get(channel);
	if (channel.contains("UNI"))
		return new int[]{(int) readADCSingleEnded(Integer.parseInt(chan))};
	else if (channel.contains("DIF"))
		return new int[]{readADCDifferential(chan)};
	return new int[0];
}

Here the raw data is returned in the form of voltages in mV.

Similarly, the other sensors return their own kinds of values: the luminosity sensor TSL2561 returns luminosity in lux, and the accelerometer & gyroscope MPU6050 returns readings for its three axes.
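For instance, a hedged sketch of reading the TSL2561 the same way (the constructor is an assumption mirroring the ADS1115 example below, and the exact return type of getRaw may differ per sensor):

TSL2561 tsl2561 = new TSL2561(i2c);  // assumed constructor, analogous to new ADS1115(i2c) below
int[] lux = tsl2561.getRaw();        // raw luminosity reading, reported in lux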

In order to initiate the process of getting raw data from the sensor in Sensor Activity, an object for the sensor is created and its getRaw method is called. The following is the implementation for the ADS1115; the rest of the sensors have similar implementations. There are try-catch statements in the code to handle exceptions thrown during the method calls.

ADS1115 ADS1115 = null;
try {
	ADS1115 = new ADS1115(i2c);
} catch (IOException | InterruptedException e) {
	e.printStackTrace();
}

int[] dataADS1115 = null;
String datadispADS1115 = "";
try {
	if (ADS1115 != null) {
		dataADS1115 = ADS1115.getRaw();
	}
} catch (IOException | InterruptedException e) {
	e.printStackTrace();
}

if (dataADS1115 != null) {
	for (int i = 0; i < dataADS1115.length; i++) {
		datadispADS1115 += String.valueOf(dataADS1115[i]);
	}
}

tvSensorGetRaw.setText(datadispADS1115);

 

Additional Resources

  1. Sensor implementation in PSLab Python repository – https://github.com/fossasia/pslab-python/tree/development/PSL/SENSORS
  2. Using the sensors with Arduino, in case you have worked with Arduino before – the basic connections are the same as for PSLab: http://www.instructables.com/id/Arduino-MPU-6050-Getting-It-to-Work/

Creating Custom Components in the PSLab Android App

PSLab Android App supports a lot of features, and each of these features needs components & views for its implementation. A typical PSLab UI packs in many views & components, and implementing each view & component separately would lead to a huge volume of repetitive and inefficient code. Since an EditText with two buttons beside it repeats a lot in the UI, it is wiser to create a single custom component consisting of an EditText and two buttons. This not only leads to cleaner code but also drastically reduces its volume.

Android has a feature which allows creating custom components. In almost all cases, the pre-defined views in Android serve the purpose of creating UIs. However, sometimes there is a need to create custom components to reduce code volume and improve quality. Custom components are used when the particular set of components we need is not present in the Android view collection, when a pattern of components is frequently repeated, or when we need to reduce code complexity.

The above set can be replaced by defining a custom component which includes an EditText and two buttons, and then treating it just like any other component. To get started with creating a custom component, the steps are the following:

Create a layout for the custom component to be designed

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
   android:orientation="horizontal" android:layout_width="match_parent"
   android:layout_height="match_parent">

   <Button
       android:id="@+id/button_control_plus"
       android:layout_width="0dp"
       android:layout_weight="0.5"
       android:layout_height="20dp"
       android:background="@drawable/button_minus" />

   <EditText
       android:id="@+id/edittext_control"
       android:layout_width="0dp"
       android:layout_weight="2"
       android:layout_height="24dp"
       android:layout_marginTop="@dimen/control_margin_small"
       android:inputType="numberDecimal"
       android:padding="@dimen/control_edittext_padding"
       android:background="@drawable/control_edittext" />

   <Button
       android:id="@+id/button_control_minus"
       android:layout_width="0dp"
       android:layout_weight="0.5"
       android:layout_height="20dp"
       android:background="@drawable/button_plus" />
</LinearLayout>

The layout file edittext_control.xml is created with three views and each one of them has been assigned an ID along with all the other relevant parameters.

Incorporate the newly created custom layout in the Activity/Fragment layout file

<org.fossasia.pslab.others.Edittextwidget
       android:id="@+id/etwidget_control_advanced1"
       android:layout_height="wrap_content"
       android:layout_width="0dp"
       android:layout_weight="2"
       android:layout_marginLeft="@dimen/control_margin_small"
       android:layout_marginStart="@dimen/control_margin_small"
/>

The custom layout can be added to the activity/fragment layout just like any other view, and can be assigned properties similarly.

Create the class file for the custom layout

public class Edittextwidget extends LinearLayout{

   private EditText editText;
   private Button button1;
   private Button button2;
   private double leastCount;
   private double maxima;
   private double minima;

 
   public Edittextwidget(Context context, AttributeSet attrs, int defStyle) {
       super(context, attrs, defStyle);
       applyAttrs(attrs);
   }

   public Edittextwidget(Context context, AttributeSet attrs) {
       super(context, attrs);
       applyAttrs(attrs);
   }

   public Edittextwidget(Context context) {
       super(context);
   }

  public void init(Context context, final double leastCount, final double minima, final double maxima) {
       View.inflate(context, R.layout.edittext_control, this);
       editText = (EditText) findViewById(R.id.edittext_control);
       button1 = (Button) findViewById(R.id.button_control_plus);
       button2 = (Button) findViewById(R.id.button_control_minus);

       button1.setOnClickListener(new OnClickListener() {
           @Override
           public void onClick(View v) {
               Double data = Double.valueOf(editText.getText().toString());
               data = data + leastCount;
               data = data > maxima ? maxima : data;
               data = data < minima ? minima : data;
               editText.setText(String.valueOf(data));
           }
       });

       button2.setOnClickListener(new OnClickListener() {
           @Override
           public void onClick(View v) {
               Double data = Double.valueOf(editText.getText().toString());
               data = data - leastCount;
               data = data > maxima ? maxima : data;
               data = data < minima ? minima : data;
               editText.setText(String.valueOf(data));
           }
       });
   }

   private void applyAttrs(AttributeSet attrs) {
       TypedArray a = getContext().obtainStyledAttributes(attrs, R.styleable.Edittextwidget);
       final int N = a.getIndexCount();
       for (int i = 0; i < N; ++i) {
           int attr = a.getIndex(i);
           switch (attr) {
               case R.styleable.Edittextwidget_leastcount:
                   this.leastCount = a.getFloat(attr, 1.0f);
                   break;
               case R.styleable.Edittextwidget_maxima:
                   this.maxima = a.getFloat(attr, 1.0f);
                   break;
               case R.styleable.Edittextwidget_minima:
                   this.minima = a.getFloat(attr, 1.0f);
           }
       }
       a.recycle();
   }
}

In Edittextwidget.java, the views of the custom layout are bound and functionalities are assigned to them. Here there are two buttons which work as increment/decrement buttons and an EditText which takes numeric input. The buttons are initialized with an OnClickListener just like in any other activity/fragment.

Define the attributes for the custom layout

<declare-styleable name="Edittextwidget">
     <attr name="leastcount" format="float" />
     <attr name="maxima" format="float" />
     <attr name="minima" format="float" />
</declare-styleable>

The attributes for the custom layout are defined in the attrs.xml file. Each attribute is assigned a name and a format, which can be integer, float, string, boolean, etc.

Finally call the methods of the custom layout from the desired activity/fragment

Edittextwidget etwidgetControlAdvanced1 = (Edittextwidget)view.findViewById(R.id.etwidget_control_advanced1);

etwidgetControlAdvanced1.init(getContext(), 1.0, 10.0, 5000.0);

The init method of Edittextwidget.java is called with the relevant parameters: context, least count, minima and maxima.

Additional Resources on Custom Components

  1. Official Android Guide on Custom components – https://developer.android.com/guide/topics/ui/custom-components.html
  2. Simple example of creating a custom component to get started – https://www.tutorialspoint.com/android/android_custom_components.htm

Trigger Controls in Oscilloscope in PSLab

The PSLab Desktop App includes an oscilloscope. Modern laboratory oscilloscopes support a lot of advanced features, and adding trigger controls was one attempt at bringing such an advanced feature to the PSLab oscilloscope. As the current implementation of the trigger is not robust enough, this feature helps in better stabilisation of waveforms.

Captured waveforms often face the problem of distortion, and the trigger helps to solve this. The trigger is an essential oscilloscope feature for signal characterisation, as it synchronises the horizontal sweep of the oscilloscope to the proper point of the signal. The trigger control enables users to stabilise repetitive waveforms as well as capture single-shot waveforms. By repeatedly displaying a similar portion of the input signal, the trigger makes a repetitive waveform look static. To visualise how an oscilloscope display looks with and without a trigger, see the figures below.

Fig 1: (a) Without trigger  (b) With trigger

Fig 1(a) is the actual waveform received by the oscilloscope, and interpreting it is confusing due to the overlapping of multiple waveforms. In Fig 1(b) the trigger control stabilises the waveform and captures just one waveform.

In general, the commonly used trigger modes in laboratory oscilloscopes are:

  • Auto – This trigger mode allows the oscilloscope to acquire a waveform even when it does not detect a trigger condition. If no trigger condition occurs while the oscilloscope waits for a specific period (as determined by the time-base setting), it will force itself to trigger.
  • Normal – The Normal mode allows the oscilloscope to acquire a waveform only when it is triggered. If no trigger occurs, the oscilloscope will not acquire a new waveform, and the previous waveform, if any, will remain on the display.
  • Single – The Single mode allows the oscilloscope to acquire one waveform each time you press the RUN button and a trigger condition is detected.
  • Scan – The Scan mode continuously sweeps waveform from left to right.

Implementing Trigger function in PSLab

PSLab has a built-in basic trigger control in the configure_trigger method in sciencelab.py. The method gets called when the trigger is enabled in the GUI. The trigger is activated when the incoming wave reaches a certain voltage threshold, and PSLab also provides the option of selecting either the rising or the falling edge for the trigger. The trigger is especially useful in experiments handling waves like sine waves, square waves etc., where it helps to get a clear picture.

In order to initiate the trigger in the PSLab desktop app, the configure_trigger method in sciencelab.py is called. The method takes some optional parameters; if values are not specified, defaults are assumed.

def configure_trigger(self, chan, name, voltage, resolution=10, **kwargs):
    prescaler = kwargs.get('prescaler', 0)
    try:
        self.H.__sendByte__(CP.ADC)
        self.H.__sendByte__(CP.CONFIGURE_TRIGGER)
        self.H.__sendByte__((prescaler << 4) | (1 << chan))
        if resolution == 12:
            level = self.analogInputSources[name].voltToCode12(voltage)
            level = np.clip(level, 0, 4095)
        else:
            level = self.analogInputSources[name].voltToCode10(voltage)
            level = np.clip(level, 0, 1023)

        if level > (2 ** resolution - 1):
            level = (2 ** resolution - 1)
        elif level < 0:
            level = 0

        self.H.__sendInt__(int(level))  # Trigger level
        self.H.__get_ack__()

    except Exception as ex:
        self.raiseException(ex, "Communication Error , Function : " + inspect.currentframe().f_code.co_name)

The method takes the following parameters in the method call

  • chan – Channel: 0, 1, 2 or 3, corresponding to the channels being recorded by the capture routine (not the analog inputs).
  • name – The name of the channel: ‘CH1’ … ‘V+’.
  • voltage – The voltage level that should trigger the capture sequence (in volts).

A similar feature will also be used in the oscilloscope of the Android app, with the code corresponding to this method written in Java in ScienceLab.
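As a rough sketch of what that Java counterpart might look like (the packet-handler helpers and method names here are assumptions modeled on the Python code above, not the actual ScienceLab API):

public void configureTrigger(int channel, String name, double voltage, int resolution) throws IOException {
    int prescaler = 0;
    packetHandler.sendByte(commandsProto.ADC);                   // assumed packet helpers
    packetHandler.sendByte(commandsProto.CONFIGURE_TRIGGER);
    packetHandler.sendByte((prescaler << 4) | (1 << channel));

    // assumed conversion helper mirroring voltToCode10/voltToCode12
    int level = analogInputSources.get(name).voltToCode(voltage, resolution);
    level = Math.max(0, Math.min(level, (1 << resolution) - 1)); // clamp to the ADC range

    packetHandler.sendInt(level);                                // trigger level
    packetHandler.getAcknowledgement();
}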

Additional Resources

  1. Read more about Trigger here – http://www.radio-electronics.com/info/t_and_m/oscilloscope/oscilloscope-trigger.php
  2. Learn more about trigger modes in oscilloscopes – https://www.picotech.com/library/oscilloscopes/advanced-digital-triggers
  3. PSLab Python repository to know the underlying code – https://github.com/fossasia/pslab-python

 


Native Functions for Performing Image Processing in Phimpme Android

In Android, image processing can be performed using Java, RenderScript or native (C/C++) code. The performance of native code for image processing is much better than Java and RenderScript, so we used native code in the photo editor of the Phimpme image application. In this blog, I explain how image processing is performed in Phimpme Android.

Setting up the build script for native code

The NDK helps us develop Android applications using native languages like C and C++ so that heavy tasks can be performed in relatively less time. We can also use libraries built with C/C++ in an Android application through the NDK. The NDK can be downloaded using the SDK manager of Android Studio and set up following the instructions on the Android developers’ site.

After setting up the NDK, we will create a simple application involving native code and understand the flow of functions from Java to native.

The Java files are present in the app/src/main/java directory. Similarly, all the native files are present in the app/src/main/jni directory.

So now let’s create the necessary files in the jni directory:

  • main.c – native functions are added here
  • Android.mk and Application.mk – make files for building the native code using ndk-build.

Android.mk

LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_SRC_FILES := main.c       # include the files that should be
                                # built (.c, .cpp, .h)
LOCAL_LDLIBS += -llog
LOCAL_MODULE := modulename      # name of the module (custom)
include $(BUILD_SHARED_LIBRARY)

Application.mk

APP_OPTIM := release
APP_ABI := all                  # architectures for which the native lib
                                # has to be built (can be one or many,
                                # separated by commas)
APP_PLATFORM := android-25

Add these lines to app’s build.gradle for building native code along with the Gradle build.

externalNativeBuild {
   ndkBuild {
       path 'src/main/jni/Android.mk'
   }
}

The above lines in build.gradle will run Android.mk during the Gradle build.

When the Android.mk runs, it compiles all native code and generates modulename.so files in .externalNativeBuild/ndkBuild directory for all the mentioned architectures. This .so file for a particular architecture is a library containing all the native code compatible with that architecture.

So when this library(.so file) is statically imported into Java code, native code gets linked to Java and enables calling native functions directly from Java.

Importing native Library into Java

Static import of this library can be done by writing the below lines in your Java class.

static {
    System.loadLibrary("modulename");
}

Creating a native function and calling it from Java

Unlike normal Java code, where you call a function by its name, in native Android development the name of the native function differs from what you call it in Java. To understand this clearly, let’s see the example of a simple hello world application. Declare the native function in Java and call it normally like any other function.

package org.fossasia.phimpme;

import android.app.Activity;
import android.os.Bundle;
import android.widget.TextView;

public class MainActivity extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        TextView textView = (TextView) findViewById(R.id.textview);
        textView.setText(helloworld());   // call the native function like any other method
    }

    static {
        System.loadLibrary("modulename");
    }

    public native String helloworld();
}

Here the native function is declared in the org.fossasia.phimpme package and the MainActivity class, so the name of the function in the native file should be “Java_org_fossasia_phimpme_MainActivity_helloworld”.

It follows a general structure: concatenate the strings Java, the package name, the class name, and the function name declared in Java, replacing all full stops (.) with underscores (_).

The first two arguments of a native function are JNIEnv* and jobject; these are always present. For a zero-argument declaration in Java, the native function has just these two arguments. If two arguments are declared in Java, the native function has two arguments in addition to these. They help in passing data in and out of the native part.
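For instance, a hypothetical two-argument native method in the same MainActivity would map like this (the method is illustrative, not part of the app):

// Declared in Java:
public native int add(int a, int b);

// The matching C function would then be (shown as a comment for reference):
// jint Java_org_fossasia_phimpme_MainActivity_add(JNIEnv *env, jobject obj, jint a, jint b)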

Here’s an example of a native function to output a string to Java

main.c :

#include <jni.h>
jstring Java_org_fossasia_phimpme_MainActivity_helloworld  (JNIEnv * env, jobject obj){
    return (*env)->NewStringUTF(env, "Hello World from Native");
}

Now when you run this application, you see “Hello World from Native” displayed on the screen. I hope this post clarifies the flow of native functions and how to link them with Java.

References

https://developer.android.com/ndk/guides/index.html


Implementing UNDO and REDO in Image Editor of Phimpme Android

The main feature in any image editor application like Phimpme Android, other than editing the image, is the ability to revert the changes that were made (UNDO) and the ability to revert the reversion, i.e. re-apply the changes (REDO).

In the Phimpme Application, we implemented this using an ArrayList of Bitmaps. We store a copy of the image bitmap whenever a modification is made, which lets us get back to a previous image when required. But there is a problem with this method: storing bitmaps in the ArrayList may produce an OutOfMemoryError when memory gets full. To handle this in the Phimpme application, we added a try-catch block; when the out-of-memory error is caught, we recycle and remove the earliest modified image from the list, i.e. the image at index 1 (index 0 is the original image we are working on). Recycling that image makes space for another one, so the recent image is then added at the end of the list.

addToUndoList() function is shown below.

private void addToUndoList() {
   try{
       recycleBitmapList(++currentShowingIndex);
       bitmapsForUndo.add(mainBitmap.copy(mainBitmap.getConfig(),true));
   }catch (OutOfMemoryError error){
       bitmapsForUndo.get(1).recycle();
       bitmapsForUndo.remove(1);
       bitmapsForUndo.add(mainBitmap.copy(mainBitmap.getConfig(),true));
   }
}

In the above function, bitmapsForUndo is the ArrayList to which modified bitmaps are added in the editor of the Phimpme application. mainBitmap is the image bitmap on which all modifications are done in the editor. As its name suggests, the integer variable currentShowingIndex points to the index of the image that is currently being shown.

E.g., consider a case where you perform 5 edits on an image using the Phimpme editor: 6 image bitmaps get stored in the ArrayList, including the original image, and currentShowingIndex will be 5. Now if you undo twice, currentShowingIndex becomes 3. The bitmaps at indices 4 and 5 have not been removed from the ArrayList yet, so they remain available if you want to redo the changes.

  

When you then make another edit, an image bitmap gets added at index 4, and it should become the last element of the ArrayList. But the old bitmap at index 5 would make that the last element instead of the newly added one. So the elements in the ArrayList whose index is greater than currentShowingIndex have to be recycled and removed before adding a newly modified image bitmap. The first line in the try block of the above function refers to the method implementing this, shown below.

private void recycleBitmapList(int fromIndex){
   while (fromIndex < bitmapsForUndo.size()){
       bitmapsForUndo.get(fromIndex).recycle();
       bitmapsForUndo.remove(fromIndex);
   }
}

Removing a bitmap from the ArrayList doesn’t free its memory; the bitmap has to be recycled before being removed, which is what the above function does. recycleBitmapList recycles and removes the bitmaps whose index is greater than or equal to the index passed as its argument.

This function should also be called in the onDestroy method of the Android activity as

recycleBitmapList(0);

This recycles and removes the whole ArrayList.
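In an Activity, that call would sit in onDestroy, roughly like this:

@Override
protected void onDestroy() {
    recycleBitmapList(0);  // recycle and remove every cached bitmap
    super.onDestroy();
}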

Now that the creation and recycling of the ArrayList are done, we can use it to create getter functions for the undo and redo bitmaps. When getUndoBitmap() is called, currentShowingIndex is decremented by one if it is greater than zero. When getRedoBitmap() is called, currentShowingIndex is incremented by one unless it already equals the index of the last element in the list.

These methods are shown below.

private Bitmap getUndoBitmap(){
   if (currentShowingIndex - 1 >= 0)
       currentShowingIndex -= 1;
   else currentShowingIndex = 0;

   return bitmapsForUndo
           .get(currentShowingIndex)
           .copy(bitmapsForUndo.get(currentShowingIndex).getConfig(), true);
}

private Bitmap getRedoBitmap(){
   if (currentShowingIndex + 1 < bitmapsForUndo.size())
       currentShowingIndex += 1;
   else currentShowingIndex = bitmapsForUndo.size() - 1;

   return bitmapsForUndo
           .get(currentShowingIndex)
           .copy(bitmapsForUndo.get(currentShowingIndex).getConfig(), true);
}

That completes the logic. We integrated these getter functions into the button click handlers of the Phimpme Image Editor. setButtonsVisibility() is called whenever the undo or redo button is pressed. This function sets the enabled state and appearance of the buttons, i.e. the undo button is visible and enabled only if undo is possible, and likewise for the redo button.

setButtonVisibility() is shown below.

private void setButtonsVisibility() {
   if (currentShowingIndex > 0) {
       undo.setColorFilter(Color.BLACK);
       undo.setEnabled(true);
   }else {
       undo.setColorFilter(Color.GRAY);
       undo.setEnabled(false);
   }

   if (currentShowingIndex + 1 < bitmapsForUndo.size()) {
       redo.setColorFilter(Color.BLACK);
       redo.setEnabled(true);
   }else {
       redo.setColorFilter(Color.GRAY);
       redo.setEnabled(false);
   }
}

The above function grays out a button in the disabled state; the button is black when the enabled-state conditions are satisfied.

Finally, the onClick() method of the editor of Phimpme is shown below.

@Override
public void onClick(View v) {
   switch (v.getId()){
       case R.id.edit_undo:
           onUndoPressed();
           break;
       case R.id.edit_redo:
           onRedoPressed();
           break;
   }
}

private void onUndoPressed() {
   if (mainBitmap != null) {
       if (!mainBitmap.isRecycled()) {
           mainBitmap.recycle();
       }
   }
   mainBitmap = getUndoBitmap();
   mainImage.setImageBitmap(mainBitmap);
   setButtonsVisibility();
}

private void onRedoPressed() {
   if (mainBitmap != null) {
       if (!mainBitmap.isRecycled()) {
           mainBitmap.recycle();
       }
   }
   mainBitmap = getRedoBitmap();
   mainImage.setImageBitmap(mainBitmap);
   setButtonsVisibility();
}

This shows how we implemented undo and redo in Image Editor of Phimpme Image Application.


How to add Markers in Map Fragment of Open Event Android App

The Open Event Android project helps event organizers to generate Apps (apk format) for their events/conferences by providing an API endpoint or a zip generated using the Open Event server. In the Open Event Android App, we have a map showing all locations of sessions. On this map, users should be able to see the different locations with multiple markers. In this post I explain how to add multiple markers to the Google map fragment and set bounds on the map so that all markers are visible on one screen with specified padding.

Create Map Fragment

The first step is to create a simple xml file and add a fragment element to it. Then create the MapsFragment.java file and find the fragment element added in the xml file using the findFragmentById() method.

SupportMapFragment supportMapFragment = ((SupportMapFragment)
           getChildFragmentManager().findFragmentById(R.id.map));

1. Create fragment_map.xml file

In this file add FrameLayout as a top level element and add fragment element inside FrameLayout with the name “com.google.android.gms.maps.SupportMapFragment”.

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
   android:layout_width="match_parent"
   android:layout_height="match_parent"
   android:orientation="vertical">

       <fragment
           android:id="@+id/map"
           android:name="com.google.android.gms.maps.SupportMapFragment"
           android:layout_width="match_parent"
           android:layout_height="match_parent" />

</FrameLayout>

2. Create MapsFragment.java

MapsFragment.java extends SupportMapFragment and implements LocationListener and OnMapReadyCallback. Create an instance field for the GoogleMap object. In the onCreateView method, inflate the fragment_map.xml file using the inflater, then find the fragment element added in the xml file using findFragmentById() and assign it to the SupportMapFragment instance.

public class MapsFragment extends SupportMapFragment implements LocationListener, OnMapReadyCallback {

    private GoogleMap mMap;

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container, @Nullable Bundle savedInstanceState) {
        View view = inflater.inflate(R.layout.fragment_map, container, false);
        SupportMapFragment supportMapFragment = ((SupportMapFragment)
                getChildFragmentManager().findFragmentById(R.id.map));
        supportMapFragment.getMapAsync(this);

        return view;
    }

    @Override
    public void onMapReady(GoogleMap map) {  }
    ...
}

Create list of locations

Location object has three variables name, latitude & longitude. Create a list of locations for which we want to add markers to the map.

public class Location {
    private String name;
    private float latitude;
    private float longitude;
}

List<Location> mLocations = new ArrayList<>();

 

Add Location objects to mLocations in the onCreateView() method:

mLocations.add(location);

 

You can add multiple locations using a for loop or by fetching them from the database, as sketched below.
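For example, a minimal sketch with hard-coded sample data (the constructor is assumed to match the fields of the Location class above; real data would come from the event database):

// Hypothetical sample data for illustration only
mLocations.add(new Location("Main Hall", 1.3329f, 103.7360f));
mLocations.add(new Location("Workshop Room", 1.3335f, 103.7365f));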

Add markers

Add the following code in the onMapReady(GoogleMap map) method, which is called when the map is ready to be used. setMapToolbarEnabled(true) is used to show the toolbar for a marker. If the toolbar is enabled, users will see a bar with various context-dependent actions, including ‘open this map in the Google Maps app’ and ‘find directions to the highlighted marker in the Google Maps app’.

if(map != null){
    mMap = map;
    mMap.getUiSettings().setMapToolbarEnabled(true);
}
showLocationsOnMap();

 

Create the showLocationsOnMap() method and add the following code:

private void showLocationsOnMap(){

   float latitude;
   float longitude;
   LatLng latlang;
   Marker marker;

   //Add markers for all locations
   for (Location location : mLocations) {

       latitude = location.getLatitude();
       longitude = location.getLongitude();
       latlang = new LatLng(latitude, longitude);

       marker = mMap.addMarker(new MarkerOptions()
                .position(latlang)
                .title(location.getName()));
   }
}

 

So what are we doing here?

For each Location object in the mLocations list, we create a LatLng(latitude, longitude) object and add it to the map using

marker = mMap.addMarker(new MarkerOptions().position(latlang).title(location.getName()));

Setting bound

We want to set the zoom level so that all markers are visible. To do this, we can use LatLngBounds.Builder: add each marker’s position to the builder object using the include method while adding the marker to the map.

LatLngBounds.Builder builder = new LatLngBounds.Builder();
builder.include(marker.getPosition());

 

Then make the LatLngBounds object from builder.

LatLngBounds bounds = builder.build();
CameraUpdate cameraUpdate = CameraUpdateFactory.newLatLngBounds(bounds, dpToPx(40));
mMap.moveCamera(cameraUpdate);

private int dpToPx(int dp) {
    return (int) (dp * Resources.getSystem().getDisplayMetrics().density);
}

 

dpToPx converts dp to px so that the padding is the same on all devices.

Conclusion

Markers in map give very nice user experience for an event or a conference because they show the relative position of places and we now offer this feature in Open Event Android.


Implementing Barcode Scanning in Open Event Android Orga App using RxJava

One of the principal goals of the Open Event Orga App (Github Repo) is to let the event organizer scan the barcode identifier of an attendee at the time of entry, in order to quickly check that attendee in. Although there are several scanning APIs available for Android projects, we chose the Google Vision API as it handles a multitude of formats and orientations automatically with little to no configuration, integrates seamlessly with Android, and is provided by Google itself, ensuring great future support. Currently, the use case of our application is:

  • Scan barcode from the camera feed
  • Detect and show barcode information on screen
  • Iterate through the attendee list to match the detected code with unique attendee identifier
  • Successfully return to the caller with scanned attendee’s information so that a particular action can be taken

There are several layers of optimisation done in the Orga Application to make the user interaction fluid and concise. I’ll share a few in this blog post, regarding the configuration and the efficient handling of bursty data from the camera source. To see the full implementation, you can visit the Github repository of the project using the link provided above.

Configuration

The configuration of our project is done through Dagger 2, following the dependency injection pattern. That is not the focus of this post, but it is always recommended that you follow separation of concerns in your project and create a separate module for handling independent work like barcode scanner configuration. Normally, people would create factories with the scanner, camera source and processors encapsulated. This enables the caller to control when things are initialized.

Our configuration provides us two initialised objects, namely CameraSource and BarcodeDetector:

@Provides
BarcodeDetector providesBarCodeDetector(Context context, Detector.Processor<Barcode> processor) {
    BarcodeDetector barcodeDetector = new BarcodeDetector.Builder(context)
        .setBarcodeFormats(Barcode.QR_CODE)
        .build();

    barcodeDetector.setProcessor(processor);

    return barcodeDetector;
}

@Provides
CameraSource providesCameraSource(Context context, BarcodeDetector barcodeDetector) {
    return new CameraSource
        .Builder(context, barcodeDetector)
        .setRequestedPreviewSize(640, 480)
        .setRequestedFps(15.0f)
        .setAutoFocusEnabled(true)
        .build();
}

The fields needed to create the objects are provided as arguments to the provider functions, as are all the dependencies required to construct them. Now, the Detector.Processor requirement of the BarcodeDetector is a classic example of non-injectable code, because the processor is to be supplied by the activity or any other object which wants to receive the callback with the detected barcode data. This means we would have to inject it at the time of creation of the Activity or Presenter itself. We could easily overcome this by adding a constructor argument containing the Detector.Processor to this Dagger module, but that would violate our existing zero-configuration model where we just get the required component from the Application class and inject it. So, we wrapped the processor into a PublishSubject:

@Provides
@Singleton
@Named("barcodeEmitter")
PublishSubject<Notification<Barcode>> providesBarcodeEmitter() {
    return PublishSubject.create();
}

@Provides
@Singleton
Detector.Processor<Barcode> providesProcessor(@Named("barcodeEmitter") PublishSubject<Notification<Barcode>> emitter) {
    return new Detector.Processor<Barcode>() {
        @Override
        public void release() {
            // No action to be taken
        }

        @Override
        public void receiveDetections(Detector.Detections<Barcode> detections) {
            SparseArray<Barcode> barcodeSparseArray = detections.getDetectedItems();
            if (barcodeSparseArray.size() == 0)
                emitter.onNext(Notification.createOnError(new Throwable()));
            else
                emitter.onNext(Notification.createOnNext(barcodeSparseArray.valueAt(0)));
        }
    };
}

This solves two of our problems: not only are all these dependencies now injectable with zero configuration, but our stream of barcodes is also reactive.

Note that not everyone is in favour of using Singleton, but you can decrease the scope using your own annotation. We prefer not creating lifecycle-bound objects; those are hard to manage and can cause potential memory leaks, and creating an anonymous inner class object every time the listener activates is not good for memory either.

Also, note that Singleton classes will cause memory leaks too if you don’t release their references when the lifecycle-bound objects are destroyed.

Notice how the type of the PublishSubject is not just Barcode, but Notification wrapping the barcode. That’s because we want to send both the value and error streams down uninterrupted to the caller; otherwise, the data stream would stop on the first onError call. Here, we check the barcodeSparseArray size and accordingly send an error or the first value to the PublishSubject, which is subscribed by the activity or presenter.

Handling Bursty Data

barcodeEmitter.subscribe(barcodeNotification -> {
    if (barcodeNotification.isOnError()) {
        presenter.onBarcodeDetected(null);
    } else {
        presenter.onBarcodeDetected(barcodeNotification.getValue());
    }
});

Here is how we are subscribing the notification emitter and passing the appropriate value to the presenter to handle, null if it is an error and the value if it is the next emission.

Note that you must dispose of the Disposable returned by the subscribe method on the subject when the Activity is destroyed, or else it will keep a reference to the anonymous inner class created by the lambda for barcodeNotification and cause a memory leak, as sketched below.
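A minimal sketch of that cleanup, assuming the subscription is kept in a field:

private Disposable barcodeDisposable;   // holds the subscription made above

// at subscription time:
barcodeDisposable = barcodeEmitter.subscribe(barcodeNotification -> { /* as above */ });

@Override
protected void onDestroy() {
    super.onDestroy();
    if (barcodeDisposable != null && !barcodeDisposable.isDisposed())
        barcodeDisposable.dispose();    // release the lambda's reference to this Activity
}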

Now, let’s see how the presenter handles this data for:

  • Hiding and showing the barcode panel when barcode is on the screen accordingly
  • Showing the data extracted from the barcode scanner

These things can be implemented in a very standard way with a few conditionals, but most developers forget that the data emission rate is enormous when dealing with a live feed. In the Open Event Orga app, we have reduced it to 15 FPS, as that is more than enough to scan barcodes for our use case, but it is still huge. The continuous stream of nulls and barcode data is useless to us unless it changes.

A little explanation about nulls and values here (you may have noticed the conditions above where we pass null or a value): a value is passed if a barcode is detected on screen, and null is passed if no barcode is detected. The Google Vision API will keep sending the same barcode value at 15 FPS, so we get a redundant stream of nulls and values which we should not bother processing, as that would load the CPU unnecessarily.

There are only 2 cases where we need to process it:

  • Null changes to Value -> Show barcode panel
    Value changes to null -> Hide barcode panel
  • Value changes irrespective of nulls -> Show barcode data on UI and search through the attendee identifiers

So, here too we’ll create 2 PublishSubject objects

private PublishSubject<Boolean> detect = PublishSubject.create();
private PublishSubject<String> data = PublishSubject.create();

And we’ll configure them both in this way to receive data on each barcode emission in the presenter:

public void onBarcodeDetected(Barcode barcode) {
   detect.onNext(barcode == null);
   if (barcode != null)
       data.onNext(barcode.displayValue);
}

This makes data receive only non-null changes, while detect receives a boolean indicating whether the currently detected barcode was null.

Now, we see how each of these subjects is configured to pass the emissions downstream:

data.distinctUntilChanged()
    .subscribeOn(Schedulers.io())
    .observeOn(AndroidSchedulers.mainThread())
    .subscribe(this::processBarcode);

This one is pretty straightforward: it only sends data downstream if its value has changed from the previous emission, disregarding nulls. So, for a stream like this:

A A A A A null null null null A A A A null B B B B B B null null null B B A A null A

It will only emit:

A                                                                         B                                                  A

This is actually what we want, as we only need to process and show distinct barcode data and match it against our attendee list. With thousands of attendees, the naive approach would have triggered unnecessary, time-consuming computations over and over again on the same identifiers with small gaps of time, giving mediocre results even in a multi-threaded environment. This approach saves us from repetitive calculations and also gives us large gaps between emissions, which is optimal for our use case.
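A minimal sketch of what the downstream handler could do with each distinct value (the Attendee model, its getIdentifier() accessor, and the view callback are assumptions for illustration):

private void processBarcode(String barcode) {
    // attendees is assumed to be the pre-loaded attendee list
    for (Attendee attendee : attendees) {
        if (barcode.equals(attendee.getIdentifier())) {
            scanQRView.onScannedAttendee(attendee);  // hand the match back to the caller
            return;
        }
    }
}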

The second case is not so obvious, because we can’t ignore nulls here: we have to show and hide the UI based on them. This means that, unlike our previous stream, if we just use distinctUntilChanged, it will look like this:

A A A A A null null null null A A A A null B B B B B B null null null B B A A null A

f                 t                               f               t      f                    t                        f             t       f

This is because, if you remember, we send barcode == null down on each emission for this Subject. So, in this case, as you may see, some of the values are so close together that the change will not be discernable in the UI, and it will also annoy users, who will see the panel pop up for milliseconds before vanishing, or vice versa. The perfect operator for this case is debounce:

detect.distinctUntilChanged()
    .debounce(150, TimeUnit.MILLISECONDS)
    .subscribeOn(Schedulers.io())
    .observeOn(AndroidSchedulers.mainThread())
    .subscribe(receiving -> scanQRView.showBarcodePanel(!receiving));

This operator drops any emission within a 150 ms window of the previous one and only picks up emissions that are 150 ms apart from each other. Now, 150 ms is not a magic number; it was picked through trial and error based on what works best for this case. Lower the value and you pick up more changes downstream; increase it and you might miss the required events.

This makes our stream somewhat like this, cleaning out the cluttered events

f                 t                               f                                           t                        f


That is all for this blog; you may use many other operators from the arsenal of RxJava, whatever fits your use case. As I have presumed knowledge of Subjects, MVP and a little bit of Dagger in this post, here are some resources with more information about these:

http://reactivex.io/documentation/subject.html

https://github.com/ReactiveX/RxJava/wiki/Subject

http://reactivex.io/documentation/operators/distinct.html

http://www.vogella.com/tutorials/Dagger/article.html

https://antonioleiva.com/mvp-android/


Implementing a zoomable ImageView by Extending the Default ViewPager in Phimpme Android

When I was trying to give a default-gallery-like experience to the gallery of the Phimpme Image Application, where you can zoom an image with pan and pinch controls and navigate to another photo with swipe gestures, I faced a problem: when the zoomed image was swiped, expecting it to pan, the ViewPager instead switched to another page.

This implementation of a ViewPager with a zoomable image in it might seem straightforward at first. But once you implement it in the most common way, i.e. using the default ViewPager for navigation between images and a zooming library like TouchImageView, subsampling-scale-image-view or PhotoView for pinch and pan controls, you will notice that swiping left or right on the zoomed image navigates the pager to other images instead of panning the zoomed image. The ViewPager consumes the swipe event and causes a page change, and it doesn’t let the zoomable view respond to that event.

For example, when the front image is zoomed and we swipe left, instead of the image getting panned, the pager switches to the next page.

How to solve this?

As the problem is caused by the default ViewPager utilizing the swipe event without transferring it to child views, a custom viewpager can be created by extending default viewpager and having a touch intercept method which transfers the event to its child views. A sample implementation of this custom view pager which I used in the app is shown below.


public class CustomViewPager extends ViewPager {
    public CustomViewPager(Context context) {
        super(context);
    }

    public CustomViewPager(Context context, AttributeSet attrs)
    {
        super(context,attrs);
    }

    @Override
    public boolean onInterceptTouchEvent(MotionEvent ev) {
        try {
            return super.onInterceptTouchEvent(ev);
        } catch (IllegalArgumentException e) {
            // Swallow the occasional multi-touch exception thrown by zoomable child views
            return false;
        }
    }
}

Now we can replace the normal ViewPager with this custom ViewPager in the image-viewing activity and in the layout resource for that activity. The normal PagerAdapter used with the default ViewPager can be used with this custom ViewPager as well. The implementation below makes what I described clearer.

activity_imageview.xml

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout
   xmlns:android="http://schemas.android.com/apk/res/android"
   android:layout_width="match_parent"
   android:layout_height="match_parent">

   <package.name.path.to.CustomViewPager
       android:id="@+id/cviewpager"
       android:layout_width="match_parent"
       android:layout_height="match_parent"/>
</RelativeLayout>

ImageViewActivity.java

public class ImageViewActivity extends Activity {
    CustomViewPager cViewPager;
    ArrayList<String> imageList;
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_imageview);
        imageList = createList(); //some method for creating a list
        cViewPager = (CustomViewPager) findViewById(R.id.cviewpager);
        cViewPager.setAdapter(new ImagePagerAdapter(imageList));
    }

    class ImagePagerAdapter extends PagerAdapter {
        ArrayList<String> imageList; 
        
        ImagePagerAdapter(ArrayList<String> imageList){
            this.imageList = imageList;
        }

        @Override
        public int getCount() {
            return (null != imageList) ? imageList.size() : 0;
        }

        @Override
        public boolean isViewFromObject(View view, Object object) {
            return view == object;
        }

        @Override
        public View instantiateItem(ViewGroup container, int position) {
            PhotoView photoView = new PhotoView(container.getContext());
            Glide.with(container.getContext())
                .load(UriFromFile(new File(imageList.get(position))))
                .asBitmap()
                .thumbnail(0.2f)
                .into(photoView);
            photoView.setMaximumScale(5.0F);
            photoView.setMediumScale(3.0F);
            container.addView(photoView, LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT);
            return photoView;
        }

        @Override
         public void destroyItem(ViewGroup container, int position, Object object) {
             container.removeView((View) object);
         }
     }
}

In the above PagerAdapter, the PhotoView library is used for the zoomable view. The image is loaded into the PhotoView using the image-caching library Glide. The PagerAdapter takes a list of paths to images on the device as its input argument. A simple method for creating such a list was discussed in my first post; a sketch is shown below.
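For reference, a minimal sketch of such a list-building method (the folder is a placeholder; the original post builds the list differently):

private ArrayList<String> createList() {
    ArrayList<String> paths = new ArrayList<>();
    // Placeholder folder; any directory containing images works for this sketch
    File dir = new File(Environment.getExternalStorageDirectory(), "DCIM/Camera");
    File[] files = dir.listFiles();
    if (files != null) {
        for (File file : files) {
            paths.add(file.getAbsolutePath());
        }
    }
    return paths;
}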

With this in place, swiping left on a zoomed image pans it instead of changing the page.

This method of implementing a zoomable view in a ViewPager is used in many gallery applications. One of those applications is LeafPic, which we are now integrating into the Phimpme Image Application.

References:

https://developer.android.com/reference/android/support/v4/view/ViewPager.html

https://developer.android.com/training/gestures/viewgroup.html

https://github.com/MikeOrtiz/TouchImageView

https://github.com/chrisbanes/PhotoView

 


Adding Multiple Themes in Phimpme Android

In Phimpme-Android we decided to add a new feature: providing multiple themes to the users. We have three themes in Phimpme: Dark, Light and AMOLED. In this post, I explain how I added multiple-theme support in Phimpme Android.

 Choose Theme in Phimpme Dialog

You need a helper class that stores the data about the selected theme and all the colors related to it. In Phimpme, I created a helper class named ThemeHelper:

public class ThemeHelper {

 public static final int DARK_THEME = 2;
 public static final int LIGHT_THEME = 1;
 public static final int AMOLED_THEME = 3;

 private PreferenceUtil SP;
 private Context context;

 private int baseTheme;
 private int primaryColor;
 private int accentColor;

 public ThemeHelper(Context context) {
  this.SP = PreferenceUtil.getInstance(context);
  this.context = context;
  updateTheme();
 }
}

This class contains all the basic methods to get colors for TextView, icon, toolbar, switch, ImageView, background and the app’s primary color.

Now you have to let the user select a theme, which can be done using a dialog box. Once the user has selected a theme, we update it with the following code:

 public void updateTheme(){
  this.primaryColor = SP.getInt(context.getString(R.string.preference_primary_color),
        getColor(R.color.md_light_blue_300));
  this.accentColor = SP.getInt(context.getString(R.string.preference_accent_color),
        getColor(R.color.md_light_blue_500));
  baseTheme = SP.getInt(context.getString(R.string.preference_base_theme), LIGHT_THEME);
 }

Now that we have updated the theme in Phimpme, we have to set the colors according to the theme.

To get the colors of all components, we add some functions to our helper class which return the colors according to the theme.

As I said, we have three themes in Phimpme, so I used a switch with three cases to check which theme the user has selected.

I added the following functions to get colors for the background, text and subtext:

public int getBackgroundColor(){
  int color;
  switch (baseTheme){
    case DARK_THEME:color = getColor(R.color.md_dark_background);break;
    case AMOLED_THEME:color = getColor(R.color.md_black_1000);break;
    case LIGHT_THEME:
    default:color = getColor(R.color.md_light_background);
  }
  return color;
 }



 public int getTextColor(){
  int color;
  switch (baseTheme){
    case DARK_THEME:color = getColor(R.color.md_grey_200);break;
    case AMOLED_THEME:color = getColor(R.color.md_grey_200);break;
    case LIGHT_THEME:
    default:color = getColor(R.color.md_grey_800);
  }
  return color;
 }

 public int getSubTextColor(){
  int color;
  switch (baseTheme){
    case DARK_THEME:color = getColor(R.color.md_grey_400);break;
    case AMOLED_THEME:color = getColor(R.color.md_grey_400);break;
    case LIGHT_THEME:
    default:color = getColor(R.color.md_grey_600);
  }
  return color;
 }

In the above functions, I check which theme the user has selected and return the color according to the theme.

Now set the colors of your views using the above functions. You don’t need to care which theme the user has selected, because these functions check and return the right color for the theme.

So it can be done simply,

textview.setTextColor(getTextColor());
editText.setTextColor(getTextColor());
editText.setHintTextColor(getSubTextColor());
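The same helper styles other widgets too; for instance (getPrimaryColor() is an assumed getter for the primaryColor field shown above, while getBackgroundColor() is defined earlier):

toolbar.setBackgroundColor(themeHelper.getPrimaryColor());       // assumed getter
rootLayout.setBackgroundColor(themeHelper.getBackgroundColor());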

Fig: Light Theme & Dark Theme (Phimpme)

Resources:

https://github.com/fossasia/phimpme-android/blob/development/app/src/main/java/org/fossasia/phimpme/leafpic/util/ThemeHelper.java

http://www.hidroh.com/2015/02/16/support-multiple-themes-android-app/


Preparing a release for Phimpme Android

Most of the essential features are now in a stable state in our Phimpme Android app, so we decided to release a beta version of the app. In FOSSASIA we follow a branch policy where all current development takes place in the development branch and the stable code resides in the master branch.

Releasing an app is not just building an apk and submitting it to the distribution platform; certain guidelines should be followed.

How I prepared a release apk for Phimpme

List down the features

We discussed on our public channel which features are now in a stable state and can be released. Features such as the account manager and Share Activity are excluded because they are incomplete and under active development; we don’t want to ship an under-development feature. We then made a list of the available features in the categories Camera, Gallery and Share.

Follow the branch policy.

The releasable, stable codebase should be on the master branch. It is good to follow the branch policy because it helps if we encounter any problem with the released apk: we can go directly to the master branch and check there. The development branch is very volatile because of the active development going on.

Every Contributor’s contribution is important

When we browse old branches such as master, we generally see they are behind development by hundreds of commits. When we create a PR to bring such a branch up to date, it therefore contains all those old commits.

In this case, while opening and merging the PR, do not squash the commits.

Testing from Developer’s end

Testing is a very essential part of development. Before releasing, it is good practice for developers to test the app from their end. We tested the app’s features on different devices with varying Android OS versions and screen sizes.

  • If there is any compatibility issue, report it right away; there are several tools in Android to fix such issues.
  • Support a variety of devices and screen sizes

Changing package name, application ID

The package name and application ID are the vitals of an app; they uniquely identify it. For example, I changed the package name of the Phimpme app to org.fossasia.phimpme. Also check all the permissions the app requires.

Create Release build type

Build types are a great way to categorize builds; Debug and Release are the two standard ones. There are various things in a codebase which we want only in debug mode, so when we create the release build type, that part of the code is left out.
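For example, debug-only code is typically guarded with the generated BuildConfig flag, so release builds skip it:

if (BuildConfig.DEBUG) {
    // extra diagnostics, only active in debug builds
    Log.d("Phimpme", "debug build: verbose logging enabled");
}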

Add build types in your application’s build.gradle:

buildTypes {
   release {
       minifyEnabled false
   }
}

Rebuild the app and verify the build type from the Build Variants panel.

Generate Signed apk and Create keystore (.jks) file

Navigate to Build → Generate Signed APK

Fill in all the details and proceed to generate the signed apk in your home directory.

Adding Signing configurations in build.gradle

Copy the keystore (.jks) file to the root of the project and add signing configurations

signingConfigs {
       config {
           keyAlias 'phimpme'
           keyPassword 'XXXXXXX'
           storeFile file('../org.fossasia.phimpme.jks')
           storePassword 'XXXXXXX'
       }
   }

InstallRelease Gradle task

Navigate to the right sidebar of Android Studio and click on Gradle.


Click on installRelease to install the release apk. It takes all the credentials from the signing configurations.
