Adding Unit Test For Local JSON Parsing in Open Event Android App

The Open Event project uses the JSON format for transferring event information like tracks, sessions, microlocations and others. The event exported in zip format from the Open Event server also contains the data in JSON format. The Open Event Android application uses this JSON data. Before we use this data in the app, we have to parse it to get Java objects that can be used for populating views. There is a chance that the model or the JSON format changes in the future. It is necessary that the models are able to parse the JSON data and that a change in the model or JSON format doesn't break the parsing. In this post I explain how to unit test local JSON parsing so that we can ensure that the models are able to parse the local JSON sample data successfully.

Firstly, we need to access the assets of the main source set from the unit test. There is no way to directly access assets of the main source set, so we first add assets to the test/resources directory. If assets are present in the test/resources directory, we can read them using a ClassLoader in the unit test. But we can't just copy assets from the main source set to the resources directory: if there is any change in the sample JSON, we would need to maintain both copies, which could make the samples inconsistent. Instead, we need to share the assets.

Add the following code in the app level build.gradle file.

android {
    ...
    sourceSets.test.resources.srcDirs += ["src/main/assets"]
}

It will add src/main/assets as a source directory for the test/resources directory, so after building the project the tests will have access to the assets.

Create readFile() method

Now create a method readFile(String name) which takes a filename as a parameter and returns the data of the file as a string.

private String readFile(String name) throws IOException {
        String json = "";
        try {
            InputStream inputStream = this.getClass().getClassLoader().getResourceAsStream(name);
            int size = inputStream.available();
            byte[] buffer = new byte[size];
            inputStream.read(buffer);
            inputStream.close();
            json = new String(buffer, "UTF-8");
        } catch (IOException e) {
            e.printStackTrace();
        }
        return json;
}

Here the getResourceAsStream() function is used to open the file as an InputStream. Then we create a byte array of the same size as the inputStream data. Using the read function, we store the data of the file in the byte array. After this, we create a String object from the byte array.

Create ObjectMapper object

Create and initialize an ObjectMapper object in the Test class.

private ObjectMapper objectMapper;

@Before
public void setUp() {
    objectMapper = OpenEventApp.getObjectMapper();
    objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, true);
}

Here setting DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES to true is very important. It will fail the test for any unrecognized fields in the sample.
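
For instance, assuming for illustration a Track model that only declares an id field, a stray field in the sample JSON makes the parsing (and hence the test) fail:

// Hypothetical example: Track declares only "id".
// With FAIL_ON_UNKNOWN_PROPERTIES enabled, the unknown "colour" field
// makes readValue() throw UnrecognizedPropertyException instead of
// silently ignoring it, so the test fails.
String json = "{\"id\": 1, \"colour\": \"#47BF0D\"}";
Track track = objectMapper.readValue(json, Track.class);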

Create doModelDeserialization() method

In the Open Event Android App we are using Jackson for JSON serialization and deserialization. Converting JSON data to Java objects is called deserialization or parsing.

Now create a doModelDeserialization() method which takes three parameters:

  • Class<T> type: Model Class type of the data
  • String name: Name of the JSON data file
  • boolean isList: true if the JSON string contains a list of objects, false otherwise

This method returns true if parsing is successful and false if there is any error in parsing.

private <T> boolean doModelDeserialization(Class<T> type, String name, boolean isList) throws IOException {
        if (isList) {
            List<T> items = objectMapper.readValue(readFile(name), objectMapper.getTypeFactory().constructCollectionType(List.class, type));
            if (items == null)
                return false;
        } else {
            T item = objectMapper.readValue(readFile(name), type);
            if (item == null)
                return false;
        }
        return true;
}

Here ObjectMapper does the main work of parsing the data and returns the parsed object via the readValue() method.

Add Test

Now that all the setup is done, we just need to assert the value returned by the doModelDeserialization() method by passing appropriate parameters.

@Test
public void testLocalJsonDeserialization() throws IOException {
        assertTrue(doModelDeserialization(Event.class, "event", false));
        assertTrue(doModelDeserialization(Microlocation.class, "microlocations", true));
        assertTrue(doModelDeserialization(Sponsor.class, "sponsors", true));
        assertTrue(doModelDeserialization(Track.class, "tracks", true));
        assertTrue(doModelDeserialization(SessionType.class, "session_types", true));
        assertTrue(doModelDeserialization(Session.class, "sessions", true));
        assertTrue(doModelDeserialization(Speaker.class, "speakers", true));
}

Here only the event JSON file doesn't contain a list of objects, so we pass false as the isList parameter; for the others we pass true because their data contains a list of objects.

Conclusion:

Running unit tests after every build helps you quickly catch and fix software regressions introduced by code changes to your app.


Generic Social Links Implementation in Open Event Android App

The Open Event Android App has an About Fragment which displays all the info about the event like name, time, location etc. It also shows social media buttons for the event. The problem was that the implementation of showing these buttons was not generic. It worked fine for the current FOSSASIA sample, but if we generated the app for other events it created problems: it showed static buttons without a proper mapping from social media button to social media link, so for example clicking the GitHub button could open the Facebook link (issue #1792).

One solution to this problem is to implement a RecyclerView with the social media buttons. In this post I explain how I made the social links implementation generic using RecyclerView.

Add RecyclerView in layout

The first step is to add a RecyclerView in the layout xml file and to create a list item for the RecyclerView which holds the image for the social link button. Then in the About Fragment, find the RecyclerView element added in the xml file using the findViewById() method (or bind it with ButterKnife, as done below).

1. Add recyclerview in xml file

In the layout xml file of the About Fragment, add a RecyclerView element. Define the id, width, height, and gravity of the RecyclerView. Then specify the list item for the RecyclerView using the tools:listitem attribute.

<android.support.v7.widget.RecyclerView
            android:id="@+id/list_social_links"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_gravity="center"
            android:overScrollMode="never"
            android:clipToPadding="false"
            tools:listitem="@layout/item_social_link" />

2. Create item_social_link.xml

Now create an item_social_link.xml file and add a FrameLayout element. Inside the FrameLayout add an ImageView with appropriate id, width, height and padding.

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/layout_social_link_parent"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content">

    <ImageView
        android:id="@+id/img_social_link"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:background="?attr/selectableItemBackgroundBorderless"
        android:contentDescription="@string/social_link"
        android:padding="@dimen/padding_large"
        android:tint="@color/white" />
</FrameLayout>

Add and initialize RecyclerView

After adding the RecyclerView in the xml file, we need to add a RecyclerView field in the About Fragment java file. Now add and initialize the SocialLinksListAdapter, which extends RecyclerView.Adapter and will be used for populating the RecyclerView with the social links.

@BindView(R.id.list_social_links)
protected RecyclerView socialLinksRecyclerView;

private SocialLinksListAdapter socialLinksListAdapter;
private List<SocialLink> mSocialLinks = new ArrayList<>();

Here mSocialLinks is the list of social links which is fetched from the Event object.
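
The adapter itself is standard RecyclerView boilerplate. A minimal sketch (assuming the SocialLinkViewHolder described in the next section) could look like this:

public class SocialLinksListAdapter extends RecyclerView.Adapter<SocialLinkViewHolder> {

    private final List<SocialLink> socialLinks;

    public SocialLinksListAdapter(List<SocialLink> socialLinks) {
        this.socialLinks = socialLinks;
    }

    @Override
    public SocialLinkViewHolder onCreateViewHolder(ViewGroup parent, int viewType) {
        // Inflate the list item created in item_social_link.xml
        View view = LayoutInflater.from(parent.getContext())
                .inflate(R.layout.item_social_link, parent, false);
        return new SocialLinkViewHolder(view, parent.getContext());
    }

    @Override
    public void onBindViewHolder(SocialLinkViewHolder holder, int position) {
        holder.bindSocialLink(socialLinks.get(position));
    }

    @Override
    public int getItemCount() {
        return socialLinks.size();
    }
}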

Create ViewHolder for social link button

Now create SocialLinkViewHolder, which extends RecyclerView.ViewHolder and holds one social link item defined in the item_social_link.xml file. This class is where the magic happens. Add ImageView, FrameLayout, SocialLink and Context fields in it.

public class SocialLinkViewHolder extends RecyclerView.ViewHolder {

    @BindView(R.id.img_social_link)
    protected ImageView imageView;

    @BindView(R.id.layout_social_link_parent)
    protected FrameLayout layout;

    private SocialLink socialLink;
    private Context context;

    public SocialLinkViewHolder(View itemView, Context context) {
        super(itemView);
        ButterKnife.bind(this, itemView);
        this.context = context;
    }

    public void bindSocialLink(@NonNull SocialLink socialLink) {...}

    private void setImageDrawable(@NonNull String name, @NonNull String link) {...}

    private Drawable getDrawable(@DrawableRes int id) {...}

    private void showView(boolean show) {...}
}

The bindSocialLink(SocialLink socialLink) method is called in the onBindViewHolder() method of the SocialLinksListAdapter. In this method, initialize the socialLink field and set the drawable for the ImageView according to the SocialLink name using the setImageDrawable() method.

public void bindSocialLink(@NonNull SocialLink socialLink) {
        this.socialLink = socialLink;
        setImageDrawable(socialLink.getName(), socialLink.getLink());
}

The setImageDrawable() method finds and sets the appropriate image for the social link button using the name of the social link. If the image for the social link is not found, it sets the visibility of the image view to GONE using showView(boolean show).

private void setImageDrawable(@NonNull String name, @NonNull String link) {

        if (Utils.isEmpty(name) || Utils.isEmpty(link) || Utils.getSocialLinkDrawableId(name) == 1) {
            showView(false);
            return;
        }
        imageView.setImageDrawable(getDrawable(Utils.getSocialLinkDrawableId(name)));
}

The showView(boolean show) method handles visibility of ImageView and ImageView’s parent FrameLayout using setVisibility() method.

private void showView(boolean show) {
        if (show) {
            imageView.setVisibility(View.VISIBLE);
            layout.setVisibility(View.VISIBLE);
        } else {
            imageView.setVisibility(View.GONE);
            layout.setVisibility(View.GONE);
        }
}

Set onClickListener to ImageView

Now it’s time to set the onClickListener in the constructor of the SocialLinkViewHolder to define what to do when user clicks on the ImageView.

imageView.setOnClickListener(view -> {
    if (socialLink != null && !Utils.isEmpty(socialLink.getLink())) {
        Utils.setUpCustomTab(context, socialLink.getLink());
    }
});

Here we are setting up a custom tab for the social link using the setUpCustomTab() util method, which will open the link.
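
The setUpCustomTab() util itself is not shown here. A minimal sketch of what such a method might do, assuming the support library's Custom Tabs dependency (com.android.support:customtabs), is:

public static void setUpCustomTab(Context context, String link) {
    CustomTabsIntent customTabsIntent = new CustomTabsIntent.Builder().build();
    // context is expected to be the hosting Activity here
    customTabsIntent.launchUrl((Activity) context, Uri.parse(link));
}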

Now run the app on a device or emulator to see the social link buttons in action.

Conclusion

The SocialLink implementation using the RecyclerView gives a great user experience in the application.


Presenters via Loaders in Open Event Organizer Android App

The Open Event Organizer App design follows the Model View Presenter (MVP) architecture, which facilitates heavy unit testing of the app. In this design pattern, each fragment/activity implements a view interface which uses a presenter interface to interact with a model interface. The presenter contains most of the data of the view, so it is very important to restore presenters after configuration changes like rotation. On rotation, the complete activity is re-created, hence all the fields are destroyed and re-generated, resulting in an unexpected state loss on configuration change. The Open Event Organizer App uses a loader to store and provide presenters to the activity/fragment, since a loader survives configuration changes. The idea of using the loader to provide the presenter is taken from Antonio Gutierrez's blog on "Presenters surviving orientation changes with loaders".

The first thing to do is make a PresenterLoader<T> class extending Loader<T> where T is your presenter’s base interface. The PresenterLoader class in the app looks like:

public class PresenterLoader<T extends IBasePresenter> extends Loader<T> {

   private T presenter;

   ...

   @Override
   protected void onStartLoading() {
       super.onStartLoading();
       deliverResult(presenter);
   }

   @Override
   protected void onReset() {
       super.onReset();
       presenter.detach();
       presenter = null;
   }

   public T getPresenter() {
       return presenter;
   }
}


The methods are pretty clear from the names themselves. Once this is done, you are ready to use this loader in your fragment/activity. Creating a BaseFragment or BaseActivity is a good idea, as then you don't have to add the same logic everywhere. We will take the use case of an activity. A loader has a unique id by which it is saved in the app; use a unique id for each fragment/activity. Using the id, the loader is obtained in the app.

Loader<P> loader = getSupportLoaderManager().getLoader(getLoaderId());


When created for the first time, the loader is set up with the loader callbacks where we actually set the presenter logic. In the Organizer App, we are using dagger dependency injection for injecting the presenter into the app for the first time. If you are not using dagger, you should create a PresenterFactory class containing a create method for the presenter, and pass the PresenterFactory object to the PresenterLoader in onCreateLoader (a sketch follows the snippet below). In this case, we are using dagger, so it simplifies to this:

getSupportLoaderManager().initLoader(getLoaderId(), null, new LoaderManager.LoaderCallbacks<P>() {
   @Override
   public Loader<P> onCreateLoader(int id, Bundle args) {
       return new PresenterLoader<>(BaseActivity.this, getPresenterProvider().get());
   }

   @Override
   public void onLoadFinished(Loader<P> loader, P presenter) {
       BaseActivity.this.presenter = presenter;
   }

   @Override
   public void onLoaderReset(Loader<P> loader) {
       BaseActivity.this.presenter = null;
   }
});
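
For reference, without dagger a minimal PresenterFactory (the interface and its use here are illustrative) could look like:

public interface PresenterFactory<T extends IBasePresenter> {
    // Creates a fresh presenter instance for the loader
    T create();
}

// and in onCreateLoader:
// return new PresenterLoader<>(BaseActivity.this, presenterFactory.create());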


The getPresenterProvider method returns a Lazy<Presenter> provider to ensure a single presenter creation in the activity/fragment. The lifecycle method to set up the PresenterLoader in an activity is onCreate, and in a fragment it is onActivityCreated. Use the presenter field only from the next lifecycle stage, start, onwards; if the presenter is used before start, it creates a NullPointerException. For example, when implementing this in a BaseFragment, set up the loader in the onActivityCreated method.

@Override
   protected void onCreate(@Nullable Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       Loader<P> loader = getSupportLoaderManager().getLoader(getLoaderId());
       if (loader == null) {
           initLoader();
       } else {
           presenter = ((PresenterLoader<P>) loader).getPresenter();
       }
   }


Make sure that your base presenter interface declares some of the basic methods, for example onDetach, onAttach etc. The getLoaderId method must be implemented in each fragment/activity using loaders. The method returns a unique id for each fragment/activity. In the Organizer App, the method returns the layout id of the fragment/activity as the unique id.

Using the loader approach to store/restore presenters helps them survive configuration changes in the app, and hence improves performance.

Links:
Antonio Gutierrez’s blog post about Presenter surviving orientation changes with Loaders in Android
Android Documentation for Loaders


Real time Sensor Data Analysis on PSLab Android

The PSLab device has the capacity to connect plug and play sensors through the I2C bus. The sensors are capable of providing data in real time. So, the PSLab Android app and the desktop app need the feature to fetch real time sensor values and display them in the user interface, along with plotting the values on a simple graph.

The UI was made following the guidelines of Google's Material Design, incorporating some ideas from the Science Journal app. Cards are used for making each section of the UI. There are segregated sections for real time updates and plotting, where the real time data can be visualised. Methods for fetching the data run continuously in the background; they receive the data from the sensor and then update the screen.

The following section shows a small portion of the UI responsible for displaying the data on the screen continuously, and it is quite simple. There are a number of TextViews which are being constantly updated on the screen. Their number depends on the type and volume of data sent by the sensor.

<TextView
       android:layout_width="wrap_content"
       android:layout_height="30dp"
       android:layout_gravity="start"
       android:text="@string/ax"
       android:textAlignment="textStart"
       android:textColor="@color/black"
       android:textSize="@dimen/textsize_edittext"
       android:textStyle="bold" />

<TextView
       android:id="@+id/tv_sensor_mpu6050_ax"
       android:layout_width="wrap_content"
       android:layout_height="30dp"
       android:layout_gravity="start"
       android:textAlignment="textStart"
       android:textColor="@color/black"
       android:textSize="@dimen/textsize_edittext"
       android:textStyle="bold" />


The section here represents the portion of the UI responsible for displaying the graph. Like all other parts of the UI of PSLab Android, MPAndroidChart is used here for plotting the graph.

<LinearLayout
       android:layout_width="match_parent"
       android:layout_height="160dp"
       android:layout_marginTop="40dp">

       <com.github.mikephil.charting.charts.LineChart
               android:id="@+id/chart_sensor_mpu6050"
               android:layout_width="match_parent"
               android:layout_height="match_parent"
               android:background="#000" />
</LinearLayout>


Since the updates need to be continuous, a process should run continuously to update the display of the data and the graph. There are a variety of options available in Android in this regard, like using a Timer on the UI thread and updating the data continuously, or using AsyncTask to run a process in the background.

The issue with the former is that since all the processes, i.e. fetching the data and updating the TextViews and graph, would run on the UI thread, the UI would become laggy. So, the developer team chose to use AsyncTask and make all the processes run in the background so that the UI thread functions smoothly.

A new class SensorDataFetch, which extends AsyncTask, is defined and its object is created in a Runnable. The use of a Runnable ensures that the thread runs continuously as long as the fragment is used by the user.

scienceLab = ScienceLabCommon.scienceLab;
i2c = scienceLab.i2c;
try {
    MPU6050 = new MPU6050(i2c);
} catch (IOException e) {
    e.printStackTrace();
}
Runnable runnable = new Runnable() {
    @Override
    public void run() {
        while (true) {
            if (scienceLab.isConnected()) {
                try {
                    sensorDataFetch = new SensorDataFetch();
                } catch (IOException e) {
                    e.printStackTrace();
                }
                sensorDataFetch.execute();
            }
        }
    }
};
new Thread(runnable).start();


The following is the code for the AsyncTask created. There are two methods defined here, doInBackground and onPostExecute, which are responsible for fetching the data and updating the display respectively.

The raw data is fetched using the getRaw method of the MPU6050 object and stored in an ArrayList. The data type responsible for storing the data depends on the return type of the getRaw method of each sensor class and might be different for other sensors. The data returned by getRaw is semi-processed and just needs to be split into sections before presenting it for display.

The PSLab Android app’s sensor files can be viewed here and they can give a better idea about how the sensors are calibrated, how the intrinsic nonlinearity is taken care of, how the communication actually works etc.

After the data is stored, the control moves to the onPostExecute method; here the TextViews on the display and the chart are updated. The update is slowed down a bit so that the user can visualize the data received.

private class SensorDataFetch extends AsyncTask<Void, Void, Void> {
   // Uses the fragment's MPU6050 instance created above when the device connects
   ArrayList<Double> dataMPU6050 = new ArrayList<Double>();

   private SensorDataFetch() throws IOException {
   }

   @Override
   protected Void doInBackground(Void... params) {
       try {
           if (MPU6050 != null) {
               dataMPU6050 = MPU6050.getRaw();
           }
       } catch (IOException e) {
           e.printStackTrace();
       }
           return null;
   }

   @Override
   protected void onPostExecute(Void aVoid) {
       super.onPostExecute(aVoid);
       tvSensorMPU6050ax.setText(String.valueOf(dataMPU6050.get(0)));
       tvSensorMPU6050ay.setText(String.valueOf(dataMPU6050.get(1)));
       tvSensorMPU6050az.setText(String.valueOf(dataMPU6050.get(2)));
       tvSensorMPU6050gx.setText(String.valueOf(dataMPU6050.get(3)));
       tvSensorMPU6050gy.setText(String.valueOf(dataMPU6050.get(4)));
       tvSensorMPU6050gz.setText(String.valueOf(dataMPU6050.get(5)));
       tvSensorMPU6050temp.setText(String.valueOf(dataMPU6050.get(6)));
   }
}

The detailed implementation of the same can be found here.

Additional Resources

  1. Learn more about how real time sensor data analysis can be used in various fields like IOT http://ieeexplore.ieee.org/document/7248401/
  2. Google Fit guide on how to use native built-in sensors on phones, smart watches etc. https://developers.google.com/fit/android/sensors
  3. A simple starter guide to build an app capable of real time sensor data analysis http://developer.telerik.com/products/building-an-android-app-that-displays-live-accelerometer-data/
  4. Learn more about using AsyncTask https://developer.android.com/reference/android/os/AsyncTask.html

Creating a Four Quadrant Graph for PSLab Android App

While working on the Oscilloscope in PSLab Android, we had to implement XY mode. XY plotting is part of a regular oscilloscope; in XY plotting, the two signals are plotted against each other. For XY plotting we require a graph with all four quadrants, but none of the graph view libraries in Android support a four quadrant graph, so we needed to find a solution. We used the Canvas class to draw a four quadrant graph. The Canvas class defines methods for drawing text, lines, bitmaps, and many other graphics primitives. Let's discuss how a four quadrant graph is implemented using Canvas.

Initially, a class Plot2D extending View is created, along with a constructor to which the context, the values for the X axis and Y axis, and the axis flag are passed.

public class Plot2D extends View {
public Plot2D(Context context, float[] xValues, float[] yValues, int axis) {
   super(context);
   this.xValues = xValues;
   this.yValues = yValues;
   this.axis = axis;
   vectorLength = xValues.length;
   paint = new Paint();
   getAxis(xValues, yValues);
}


So, now we need to convert a particular float value to a pixel. This is the most important part, and for this we create a function to which we send the size in pixels, the minimum and the maximum value of the axis, and an array of float values; we get an array of converted pixel values in return. p[i] = .1 * pixels + ((value[i] - min) / (max - min)) * .8 * pixels; is the way to transform a value to its respective pixel value.

private int[] toPixel(float pixels, float min, float max, float[] value) {
   double[] p = new double[value.length];
   int[] pInt = new int[value.length];

   for (int i = 0; i < value.length; i++) {
       p[i] = .1 * pixels + ((value[i] - min) / (max - min)) * .8 * pixels;
       pInt[i] = (int) p[i];
   }
   return (pInt);
}
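
The 0.1 and 0.8 factors reserve a 10% margin on each side of the canvas. As a quick sanity check, on a canvas 1000 pixels wide with an x axis ranging from -5 to 5, the value 0 (the centre of the range) maps to the centre of the canvas:

// p = 0.1 * 1000 + ((0 - (-5)) / (5 - (-5))) * 0.8 * 1000
//   = 100 + 0.5 * 800
//   = 500, the horizontal centre of the canvas
int[] centre = toPixel(1000, -5f, 5f, new float[]{0f}); // {500}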


For constructing the graph we need to draw the axes, add markings/labels and plot the data. To achieve this we override the onDraw method. The parameter to onDraw() is a Canvas object that the view uses to draw itself. First, we need to get various parameters like the data to plot, the canvas height and width, and the location of the x and y axes.

@Override
protected void onDraw(Canvas canvas) {

   float canvasHeight = getHeight();
   float canvasWidth = getWidth();
   int[] xValuesInPixels = toPixel(canvasWidth, minX, maxX, xValues);
   int[] yValuesInPixels = toPixel(canvasHeight, minY, maxY, yValues);
   int locxAxisInPixels = toPixelInt(canvasHeight, minY, maxY, locxAxis);
   int locyAxisInPixels = toPixelInt(canvasWidth, minX, maxX, locyAxis);


Drawing the axis

First, we will draw the axes in white. To draw the white axis lines we use the following code.

paint.setColor(Color.WHITE);
paint.setStrokeWidth(5f);
canvas.drawLine(0, canvasHeight - locxAxisInPixels, canvasWidth,
       canvasHeight - locxAxisInPixels, paint);
canvas.drawLine(locyAxisInPixels, 0, locyAxisInPixels, canvasHeight,
       paint);


Adding the labels

After drawing the axis lines, we now need to mark labels for both the x and y axes. For this, we use the following code in the onDraw method. With it, the axis labels are automatically marked at fixed intervals; the number of labels depends on the value of n. The code ensures that the markings are apt for each quadrant; for example, in the first quadrant the markings of the x axis are below the axis, whereas the markings of the y axis are to the left.

float temp = 0.0f;
int n = 8;
for (int i = 1; i <= n; i++) {
    if (i <= n / 2) {
        // dividing by 10f (not 10) keeps one decimal place of the label
        temp = Math.round(10 * (minX + (i - 1) * (maxX - minX) / n)) / 10f;
        canvas.drawText("" + temp,
                (float) toPixelInt(canvasWidth, minX, maxX, temp),
                canvasHeight - locxAxisInPixels - 10, paint);
        temp = Math.round(10 * (minY + (i - 1) * (maxY - minY) / n)) / 10f;
        canvas.drawText("" + temp, locyAxisInPixels + 10, canvasHeight
                - (float) toPixelInt(canvasHeight, minY, maxY, temp),
                paint);
    } else {
        temp = Math.round(10 * (minX + (i - 1) * (maxX - minX) / n)) / 10f;
        canvas.drawText("" + temp,
                (float) toPixelInt(canvasWidth, minX, maxX, temp),
                canvasHeight - locxAxisInPixels + 30, paint);
        temp = Math.round(10 * (minY + (i - 1) * (maxY - minY) / n)) / 10f;
        canvas.drawText("" + temp, locyAxisInPixels - 65, canvasHeight
                - (float) toPixelInt(canvasHeight, minY, maxY, temp),
                paint);
    }
}

By using this code, the labels get drawn in all four quadrants with the appropriate offsets.

Plotting the data

The last step is to plot the data. To achieve this, we first convert the float values of the x and y axis data points to pixels using the toPixel method and simply draw them on the graph. In addition to this, we set a red color for the line.

paint.setStrokeWidth(2);
canvas.drawARGB(255, 0, 0, 0);
for (int i = 0; i < vectorLength - 1; i++) {
   paint.setColor(Color.RED);
   canvas.drawLine(xValuesInPixels[i], canvasHeight
           - yValuesInPixels[i], xValuesInPixels[i + 1], canvasHeight
           - yValuesInPixels[i + 1], paint);
}


This implements a four quadrant graph in the PSLab Android app for XY plotting in the Oscilloscope Activity. The entire code for the same is available here.

Resources

  1. A simple 2D Plot class for Android
  2. Android.com reference of Custom Drawing

Using Map View to Display Location of Clicked Image in Phimpme

Previously in the Phimpme Android app we used to display all the images on a map, based on the location where they were taken. In the upcoming version, we are introducing a map view to show the location when the user checks out the details of an image. In this post, I explain how I have implemented Google's map view in Phimpme. Note: the prerequisite for displaying an image's location in the map view is that the image should have geolocation in its metadata.

Let’s get started

Step 1: First, enable the setting to 'Show mapview in Image description' and save it somewhere.

Although, it’s your choice you want to give users choice to view map or not.

First, we need to let the user choose a map provider in the settings. Right now we are adding two providers to our list: Google Maps and OpenStreetMap.

Choose Map Provider from the list in Phimpme

Once you choose your preference, it will get stored in SharedPreferences. As we are displaying the map in the image details, we need to add an ImageView for the map.
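
Storing the choice is plain SharedPreferences work. A minimal sketch, assuming PreferenceUtil (Phimpme's SharedPreferences wrapper, used again below) has a putInt counterpart to the getInt call shown later:

PreferenceUtil SP = PreferenceUtil.getInstance(getApplicationContext());
// Persist the provider picked in the settings screen
SP.putInt(getString(R.string.preference_map_provider),
        StaticMapProvider.GOOGLE_MAPS.getValue());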

Step 2: Add an ImageView in your XML, in which we will display the map.

<ImageView
   android:id="@+id/photo_map"
   android:layout_width="match_parent"
   android:layout_height="150dp"
   android:visibility="gone"
/>

Keep the default visibility of the map view GONE; if the user enables map view in settings, then we make this ImageView visible.

Step 3: Load the map image into the ImageView:

The good thing is that StaticMapProvider gives you the URL of a map image corresponding to a particular GeoLocation, which is actually a view of the map.

Now we need to display the map view when the user taps on the details of an image.

ImageView imgMap = (ImageView) dialogLayout.findViewById(R.id.photo_map);
final GeoLocation location;
if ((location = f.getGeoLocation()) != null) {
   PreferenceUtil SP = PreferenceUtil.getInstance(activity.getApplicationContext());

   StaticMapProvider staticMapProvider = StaticMapProvider.fromValue(
           SP.getInt(activity.getString(R.string.preference_map_provider), StaticMapProvider.GOOGLE_MAPS.getValue()));

   Glide.with(activity.getApplicationContext())
           .load(staticMapProvider.getUrl(location))
           .asBitmap()
           .centerCrop()
           .animate(R.anim.fade_in)
           .into(imgMap);
}

To load the image, we used the Glide image library; you can use any library of your choice.

MapView on the top of dialog box in Phimpme

Step 4: Open navigation on the tap of mapView or Image.

Attach a click listener to the ImageView; on click, create a Uri corresponding to the geolocation and fire an ACTION_VIEW Intent for that link. It will open Google Maps with navigation.

String uri = String.format(Locale.ENGLISH, "geo:%f,%f?z=%d",
        location.getLatitude(), location.getLongitude(), 17);
activity.startActivity(new Intent(Intent.ACTION_VIEW, Uri.parse(uri)));


Implementing QR Code Detector in Open Event Organizer App

One of the main features of Open Event Organizer App is to scan a QR code from an attendee's ticket to validate his/her entry to an event. The app uses Google's Vision API library, com.google.android.gms.vision.barcode, for QR code detection. In this blog, I talk about how to use this library to implement QR code detection with dynamic frame support in an Android app. The library uses the term barcode for all the supported formats, including QR code. Hence in this blog, I use the term barcode for the QR code format.

We use Google’s dagger for dependency injections in the app. So all the barcode related dependencies are injected in the activity using the dagger. Basically, there are these two classes – BarcodeDetector and CameraSource. The basic workflow is to create BarcodeDetector object which handles QR code detection. Add a SurfaceView in the layout which is used by the CameraSource to show preview to the user. Pass both of these to CameraSource. Enough talk, let’s look into the code while moving forward from here on. If you are not familiar with dagger dependency injection, I strictly suggest you have a look at some tutorial introducing dagger dependency injection.

So we have a barcode module class which takes care of creating the BarcodeDetector and CameraSource.

@Provides
BarcodeDetector providesBarCodeDetector(Context context) {
   BarcodeDetector barcodeDetector = new BarcodeDetector.Builder(context)
       .setBarcodeFormats(Barcode.QR_CODE)
       .build();
   return barcodeDetector;
}

@Provides
CameraSource providesCameraSource(Context context, BarcodeDetector barcodeDetector) {
   return new CameraSource
       .Builder(context, barcodeDetector)
       ...
       .build();
}


You can see in the code that the BarcodeDetector is passed to the CameraSource builder. Now comes the preview part. The user of the app should be able to see what is actually detected. Google has provided samples showing how to do that, with some classes that you can just add to your project – BarcodeGraphic, CameraSourcePreview, GraphicOverlay and BarcodeGraphicTracker.

CameraSourcePreview is the custom view which is used in the QR detecting layout for the preview. It handles all the SurfaceView related stuff, together with the BarcodeGraphic view, which extends GraphicOverlay and is used to draw dynamic info based on the QR code detected. We use this class to draw a frame around the detected QR code. BarcodeGraphicTracker is used to receive newly detected items, add a graphical representation to an overlay, update the graphics as the item changes, and remove the graphics when the item goes away.

Override draw method of BarcodeGraphic according to your need of how you want to show results on the screen once barcode is detected. The method in the Organizer app looks like:

@Override
public void draw(Canvas canvas) {
   if (barcode == null) {
       return;
   }
   // Draws the bounding box around the barcode.
   RectF rect = new RectF(barcode.getBoundingBox());
   ...
   int width = (int) ((rect.right - rect.left)/3);
   int height = (int) ((rect.top - rect.bottom)/3);

   canvas.drawBitmap(Bitmap.createScaledBitmap(frameBottomLeft, width, height, false), rect.left, rect.top, null);
   ...
   canvas.drawRect(rect, rectPaint);
}


The class has a Barcode field which gets updated on barcode detection. In the above method, the field rect gets the dimensions of the bounding box of the detected QR code, and accordingly, frames are drawn at the vertices of the rect. Include CameraSourcePreview enclosing GraphicOverlay in the activity's layout.

<...CameraSourcePreview
   android:id="@+id/preview"
   android:layout_width="match_parent"
   android:layout_height="match_parent">

   <...GraphicOverlay />

</...CameraSourcePreview>


CameraSourcePreview and GraphicOverlay are saved in the activity from the layout. Pass the CameraSource and GraphicOverlay to the CameraSourcePreview using the start method. Now the last part left is setting the processor on the BarcodeDetector to connect it to the GraphicOverlay. Use BarcodeGraphicTracker, which connects GraphicOverlay to BarcodeDetector; this is done by passing a BarcodeTrackerFactory, which has a create method for BarcodeGraphicTracker, to the MultiProcessor (a sketch of the factory follows the snippet). The code looks like:

barcodeDetector.setProcessor(
   new MultiProcessor.Builder<>(
       new BarcodeTrackerFactory(graphicOverlay)).build());
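
BarcodeTrackerFactory is not listed above; following Google's vision samples, it is a thin MultiProcessor.Factory along these lines:

class BarcodeTrackerFactory implements MultiProcessor.Factory<Barcode> {

    private final GraphicOverlay graphicOverlay;

    BarcodeTrackerFactory(GraphicOverlay graphicOverlay) {
        this.graphicOverlay = graphicOverlay;
    }

    @Override
    public Tracker<Barcode> create(Barcode barcode) {
        // One tracker (and one graphic) per detected barcode
        BarcodeGraphic graphic = new BarcodeGraphic(graphicOverlay);
        return new BarcodeGraphicTracker(graphicOverlay, graphic);
    }
}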


Now BarcodeDetector is connected to the layout. This will update the preview on the layout as overridden in the draw method of BarcodeGraphic on each barcode detection.

Links:
Google’s Vision API – link
Google Dagger github repo link – https://github.com/google/dagger


Implement Marker Clustering in the Open Event Android App

Markers are an integral part of any map based service. In the Open Event Android App, for samples like Mozilla All Hands 2017, there are a lot of microlocations that the organizers want to integrate into the app's map fragment. Due to the presence of a large number of markers, the map fragment gets cluttered, thereby harming the user experience. As an example, imagine yourself as the user, staring at a map buried under dozens of overlapping markers!

Therefore, to tackle problems like this, the markers are grouped into clusters. On clicking a cluster, the markers get declustered and fall into their respective locations with the map zoomed in.

Implementation

First and foremost, define the libraries to be used by the utilities in the build.gradle of your app module. Make sure to import the latest versions.

// Googleplay Variant
googleplayCompile 'com.google.android.gms:play-services-maps:10.2.6'
googleplayCompile 'com.google.android.gms:play-services-location:10.2.6'
googleplayCompile 'com.google.maps.android:android-maps-utils:0.4'


Implement the ClusterItem interface in your location POJO, which will house a marker's location. The POJO will therefore override the getPosition() method of the ClusterItem interface, where you will return the LatLng.

public class MicrolocationClusterWrapper implements ClusterItem {

@Override
public LatLng getPosition() {
   return latLng;
}

}


Create a custom cluster renderer class that extends the default cluster renderer with your location POJO as the type parameter. Implement the ClusterManager's OnClusterItemClickListener to listen to marker clicks and add custom colors to the markers. Set the custom marker properties on the markerOptions inside onBeforeClusterItemRendered(), before the marker items are rendered.

@Override
   protected void onBeforeClusterItemRendered(MicrolocationClusterWrapper item, MarkerOptions markerOptions) {
       super.onBeforeClusterItemRendered(item, markerOptions);

       markerOptions.title(item.getMicrolocation().getName());
       if (microlocationClusterWrapper != null && item.equals(microlocationClusterWrapper)) {
           markerOptions.icon(ImageUtils.vectorToBitmap(context, R.drawable.map_marker, R.color.color_primary));
       } else {
           markerOptions.icon(ImageUtils.vectorToBitmap(context, R.drawable.map_marker, R.color.dark_grey));
       }
   }

   @Override
   protected void onClusterItemRendered(final MicrolocationClusterWrapper clusterItem, Marker marker) {
       super.onClusterItemRendered(clusterItem, marker);
       clusterItem.setMarker(marker);
  }

   @Override
   public boolean onClusterItemClick(MicrolocationClusterWrapper item) {
       if (microlocationClusterWrapper != null) {
           getMarker(microlocationClusterWrapper).setIcon(ImageUtils.vectorToBitmap(context, R.drawable.map_marker, R.color.dark_grey));
       }
       microlocationClusterWrapper = item;
       getMarker(item).setIcon(ImageUtils.vectorToBitmap(context, R.drawable.map_marker, R.color.color_primary));
       return false;
   }
}


Finally, in your map fragment, initialize your map, the cluster manager class and the custom cluster renderer you just created. Implement OnMapReadyCallback so that the GoogleMap object is not null (see the sketch after the snippet below). Remember to pass the cluster renderer as a listener for the cluster manager's cluster item click listener. Use setOnClusterClickListener to zoom the map on the click of a cluster.

private void handleClusterEvents() {
   clusterManager.setOnClusterItemClickListener(clusterRenderer);

   clusterManager.setOnClusterClickListener(cluster -> {
               mMap.animateCamera(CameraUpdateFactory.newLatLngZoom(
                       cluster.getPosition(), (float) Math.floor(mMap
                               .getCameraPosition().zoom + 2)), 300,
                       null);

               return true;
           });

   mMap.setOnMapClickListener(clusterRenderer);
}
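
For completeness, a minimal sketch of that initialization inside onMapReady (the field and renderer names here are assumed):

@Override
public void onMapReady(GoogleMap googleMap) {
    mMap = googleMap;
    clusterManager = new ClusterManager<>(getContext(), mMap);
    clusterRenderer = new MicrolocationClusterRenderer(getContext(), mMap, clusterManager);
    clusterManager.setRenderer(clusterRenderer);

    // Let the cluster manager handle marker clicks and camera changes
    mMap.setOnMarkerClickListener(clusterManager);
    mMap.setOnCameraIdleListener(clusterManager); // setOnCameraChangeListener on older versions of the utils library

    for (Microlocation microlocation : microlocations) {
        clusterManager.addItem(new MicrolocationClusterWrapper(microlocation));
    }
    handleClusterEvents();
}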


Conclusion

Maps are an integral part of any event based app, and marker clustering undoubtedly enhances the user experience in maps.

Resources

  • Marker Clustering Android documentation

https://developers.google.com/maps/documentation/android-api/utility/marker-clustering

  • Complete Code Reference

https://github.com/fossasia/open-event-android/pull/1777

  • Marker Customization in the case of Clustering

https://github.com/googlemaps/google-maps-ios-utils/issues/21


Resource Injection Using ButterKnife in Loklak Wok Android

Loklak Wok Android, being a sophisticated Android app, uses a lot of views, and most of them are manipulated at runtime. In Android, to play with a View or ViewGroup defined in XML at runtime, developers have to add the following line:

(TypeOfView) parentView.findViewById(R.id.id_of_view);
 

This leads to lengthy code. And very often, more than one View responds to a particular event. For example, hiding Views if a network request fails and showing a message to the user to "Try Again!". Let's say you have to hide 4 Views; are you going to do the following:

view1.setVisibility(View.GONE);
view2.setVisibility(View.GONE);
view3.setVisibility(View.GONE);
view4.setVisibility(View.GONE);
textView.setVisibility(View.VISIBLE); // has "Try Again!" message.

// 5 more lines of code for hiding textView and displaying the 4 other Views


Surely not! And the old fashioned way to get a string value defined as a resource in strings.xml:

String appName = getActivity().getResources().getString(R.string.app_name);


Surely, all this works well, but being a developer working on a sophisticated app, you would like to focus on the logic of the app, rather than scratching your head to debug whether you did findViewById properly, whether you typecast it to the proper View, or where you missed changing the visibility of a view in response to an event.

Well, all of this can be easily handled by using a library which provides you the dependency, here resources. All you need to do is declare your resources, and that's it; the library provides the resources to you, without any findViewById calls. So let's dive in and see how ButterKnife is used in Loklak Wok Android to handle these issues.

Adding ButterKnife to Android Project

In the app/build.gradle:

dependencies {
    compile 'com.jakewharton:butterknife:8.6.0'
    annotationProcessor 'com.jakewharton:butterknife-compiler:8.6.0'
    ...
}

Dealing with Views in Fragments

When views are declared, the BindView annotation is used with the ID of the view as its parameter, for example, views in TweetHarvestingFragment:

@BindView(R.id.toolbar)
Toolbar toolbar;
@BindView(R.id.harvested_tweets_count)
TextView harvestedTweetsCountTextView;
@BindView(R.id.harvested_tweets_container)
RecyclerView recyclerView;
@BindView(R.id.network_error)
TextView networkErrorTextView;

NOTE: Views declared can’t be private.

Once Views are declared, they need to be injected, which is done using ButterKnife.bind(Object target, View source). Here in TweetHarvestingFragment the target will be the fragment itself, and the source, i.e. the parent view, will be rootView (obtained by inflating the layout file of the fragment). All this needs to be done in the onCreateView method:

View rootView = inflater.inflate(R.layout.fragment_tweet_harvesting, container, false);
ButterKnife.bind(this, rootView);

That’s it, we are done!

The same paradigm can be used to bind views to a ViewHolder of a RecyclerView, as implemented in HarvestedTweetViewHolder:

@BindView(R.id.user_fullname)
TextView userFullname;
@BindView(R.id.username)
TextView username;
@BindView(R.id.tweet_date)
TextView tweetDate;
@BindView(R.id.harvested_tweet_text)
TextView harvestedTweetTextView;

public HarvestedTweetViewHolder(View itemView) {
   super(itemView);
   ButterKnife.bind(this, itemView);
}


Injecting resources like strings, dimensions, colors, drawables etc. is even easier; only the related annotation and ID need to be provided. For example, the string app_name is used in TweetHarvestingFragment to display the app name, i.e. "Loklak Wok", in the toolbar:

@BindString(R.string.app_name)
String appName;

// directly used inside onCreateView to set the title in toolbar
toolbar.setTitle(appName);


Using ButterKnife, onClickListeners can be implemented in separate methods, a clean way to define click events rather than polluting onCreate (in an Activity) or onCreateView (in a Fragment), as implemented in SuggestFragment:

@OnClick(R.id.clear_image_button)
public void onClickedClearImageButton(View view) {
   tweetSearchEditText.setText("");
}


Multiple Views responding to a single Event

Using the @BindViews annotation, a list of multiple views can be created, and then a common ButterKnife.Action can be defined to act on the list of views. In TweetHarvestingFragment, the visibility of some views is changed if a network request fails or succeeds using this feature of ButterKnife:

// declaring List of Views
@BindViews({
       R.id.harvested_tweets_count,
       R.id.harvested__tweet_count_message,
       R.id.harvested_tweets_container})
List<View> networkViews;

// defining action for views
private final ButterKnife.Action<View> VISIBLE = (view, index) -> view.setVisibility(View.VISIBLE);
private final ButterKnife.Action<View> GONE = (view, index) -> view.setVisibility(View.GONE);

// applying action on the views
ButterKnife.apply(networkViews, VISIBLE);
ButterKnife.apply(networkViews, GONE);


Implement Caching in the Live Feed of Open Event Android App

In the Open Event Android App, a live feed from the event's Facebook page was recently implemented. Since it was a live feed, it was decided that it was futile to store it in the Realm database of the app. The data of the live feed didn't persist anywhere, hence the feed used to be empty when the app ran without an internet connection.

To solve the problem of data persistence, it was decided to store the feed in the cache. Now, there were two paths before us – use Retrofit's OkHttp cache management or use Volley. Since Retrofit is used to make the API requests in the app, we chose the former. To implement caching with Retrofit, the API response should include the Cache-Control header. Since the response was not generated by a server we control, interceptors were needed to rewrite the requests and responses.

Interceptors

Interceptors are a powerful mechanism that can monitor, rewrite, and retry calls. The solution was to use interceptors to rewrite the calls to force use of cache. Two interceptors were added, application interceptor for the request and the network interceptor for the response.

Implementation

Create a cache file to store the response.

private static Cache provideCache() {
   Cache cache = null;
   try {
       cache = new Cache(new File(OpenEventApp.getAppContext().getCacheDir(), "facebook-feed-cache"),
               10 * 1024 * 1024); // 10 MB
   } catch (Exception e) {
       Timber.e(e, "Could not create Cache!");
   }
   return cache;
}


Create a network interceptor that rewrites the response with a Cache-Control header and removes the Pragma header to force use of the cache.

private static Interceptor provideCacheInterceptor() {
   return chain -> {
       Response response = chain.proceed(chain.request());

       // re-write response header to force use of cache
       CacheControl cacheControl = new CacheControl.Builder()
               .maxAge(2, TimeUnit.MINUTES)
               .build();

       return response.newBuilder()
               .removeHeader("Pragma")
               .header(CACHE_CONTROL, cacheControl.toString())
               .build();
   };
}


Create an application interceptor that, when the device is offline, rewrites the request with a max-stale Cache-Control header and removes the Pragma header, making the feed available for offline usage.

private static Interceptor provideOfflineCacheInterceptor() {
   return chain -> {
       Request request = chain.request();

       if (!NetworkUtils.haveNetworkConnection(OpenEventApp.getAppContext())) {
           CacheControl cacheControl = new CacheControl.Builder()
                   .maxStale(7, TimeUnit.DAYS)
                   .build();

           request = request.newBuilder()
                   .removeHeader("Pragma")
                   .cacheControl(cacheControl)
                   .build();
       }

       return chain.proceed(request);
   };
}


Finally, add the cache and the two interceptors while building the OkHttp client.

OkHttpClient okHttpClient = okHttpClientBuilder.addInterceptor(new HttpLoggingInterceptor()
       .setLevel(HttpLoggingInterceptor.Level.BASIC))
       .addInterceptor(provideOfflineCacheInterceptor())
       .addNetworkInterceptor(provideCacheInterceptor())
       .cache(provideCache())
       .build();
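
The client is then handed to the Retrofit builder of the feed API. A minimal sketch, assuming a Facebook Graph API base URL and the Jackson converter:

Retrofit retrofit = new Retrofit.Builder()
        .baseUrl("https://graph.facebook.com/") // illustrative base URL
        .client(okHttpClient)                   // the client built above
        .addConverterFactory(JacksonConverterFactory.create())
        .build();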


Conclusion

Apps working without an internet connection make for important corner cases while testing. It is therefore critical to persist data, however small, to avoid crashes and a bad user experience.
