Creating Custom Widgets in Badge Magic Android

In this blog, we are going to have a look at how I created the badge preview in fossasia/badge-magic-android.


What is Canvas?

Canvas is a class in Android that performs 2D drawing of different objects onto the screen. The saying “a blank canvas” is very similar to what a Canvas object is on Android: it is basically an empty space to draw onto.

Canvas Coordinate System

The coordinate system of the Android canvas starts at the top left corner, where [0,0] represents that point. The y axis is positive downwards, and the x axis is positive towards the right.


With these basics of Canvas covered, let’s see how we drew this preview badge.

The Badge consists of only 2 components:

  1. Rounded Rectangle ( Background )
  2. Normal Rectangles ( LED Lights )

Let’s see how we create rounded rectangles in Android.

// Draw Background
canvas.drawRoundRect(bgBounds, 25f, 25f, bgPaint)

Using drawRoundRect() we can easily create the badge background. The two 25f values specify the x and y corner radii of the rounded rectangle, and bgPaint is the paint used to fill it.
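For context, here is a minimal sketch of how such a background could be set up and drawn inside a custom view. Only bgPaint, bgBounds and the drawRoundRect() call appear in the snippet above; the paint colour and the rest of the setup are assumptions for illustration.

// Minimal sketch (assumed setup) of the background paint and the draw call
private val bgPaint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
    color = Color.BLACK          // illustrative colour, not necessarily the one used in the app
    style = Paint.Style.FILL
}
private var bgBounds = RectF()   // computed later in onLayout()

override fun onDraw(canvas: Canvas) {
    super.onDraw(canvas)
    // Draw the rounded-rectangle background with a 25 px corner radius
    canvas.drawRoundRect(bgBounds, 25f, 25f, bgPaint)
}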

The LED Lights are just drawable resources which are used according to the current state of the LED.

private fun drawLED(condition: Boolean, canvas: Canvas, xValue: Int, yValue: Int) {
   if (condition) {
       ledEnabled.bounds = cells[xValue].list[yValue]
       ledEnabled.draw(canvas)
   } else {
       ledDisabled.bounds = cells[xValue].list[yValue]
       ledDisabled.draw(canvas)
   }
}

This function draws a single LED: if the condition is satisfied it draws the enabled drawable, otherwise the disabled one, using the bounds stored for that cell.
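The draw loop itself is not shown above, but a sketch of how drawLED() could be invoked from onDraw() might look like this (ledStates is a hypothetical 2D boolean array backing the LED states, not a name from the actual code):

// Inside onDraw(), after drawing the background:
for (i in 0 until badgeHeight) {
    for (j in 0 until badgeWidth) {
        drawLED(ledStates[i][j], canvas, i, j)   // ledStates is a hypothetical backing array
    }
}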

When we build a custom view, we also need to handle layout changes, since the view can be laid out at different sizes on different devices. These layout changes have to be controlled and maintained accordingly. Let’s see how we manage the positioning of the LED lights for every Android device. Spoiler: simple 10th-grade maths xD

override fun onLayout(changed: Boolean, left: Int, top: Int, right: Int, bottom: Int) {
   super.onLayout(changed, left, top, right, bottom)
   val offset = 30
   val singleCell = (right - left - (offset * 3)) / badgeWidth
   val offsetXToAdd: Int = ((((right - offset).toFloat() - (left + offset).toFloat()) - (singleCell * badgeWidth)) / 2).toInt() + 1

   cells = mutableListOf()
   for (i in 0 until badgeHeight) {
       cells.add(Cell())
       for (j in 0 until badgeWidth) {
           cells[i].list.add(Rect(
               (offsetXToAdd * 2) + j * singleCell,
               (offsetXToAdd * 2) + i * singleCell,
               (offsetXToAdd * 2) + j * singleCell + singleCell,
               (offsetXToAdd * 2) + i * singleCell + singleCell
           ))
       }
   }
   bgBounds = RectF((offsetXToAdd).toFloat(), (offsetXToAdd).toFloat(), ((singleCell * badgeWidth) + (offsetXToAdd * 3)).toFloat(), ((singleCell * badgeHeight) + (offsetXToAdd * 3)).toFloat())
}

We create an offset, which is nothing but the gap from the screen edge to the badge itself. We need this gap on both sides of the badge, and we also leave half an offset inside the badge on each side, which is the distance between the badge background and the point where the LEDs start. Hence we calculate the size of a single cell by:

val singleCell = (right - left - (offset * 3)) / badgeWidth

We subtract the left coordinate from the right one to get the width of the actual drawing area. Then we subtract the padding, which is offset * 3. Finally, we divide by the number of cells we want across the badge, which is badgeWidth.
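As a quick worked example (the numbers are assumptions chosen purely for illustration, not values taken from the app): with a view 1080 px wide, left = 0, offset = 30 and a badge 44 LEDs wide, the calculation becomes:

// Hypothetical numbers, only to illustrate the arithmetic
val right = 1080
val left = 0
val offset = 30
val badgeWidth = 44

val singleCell = (right - left - (offset * 3)) / badgeWidth
// (1080 - 0 - 90) / 44 = 990 / 44 = 22 px per LED cell (integer division)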

Once we have the size of a single cell, we want to calculate the left, right, top and bottom positions of every LED. We loop over the number of LEDs and multiply the single-cell width by the current row and column index to get the exact number of pixels to skip from the top and left.

cells = mutableListOf()
   for (i in 0 until badgeHeight) {
       cells.add(Cell())
       for (j in 0 until badgeWidth) {
           cells[i].list.add(Rect(
               (offsetXToAdd * 2) + j * singleCell,
               (offsetXToAdd * 2) + i * singleCell,
               (offsetXToAdd * 2) + j * singleCell + singleCell,
               (offsetXToAdd * 2) + i * singleCell + singleCell
           ))
       }
   }

Now the fun part: we save all of these rectangles in a 2D list (cells) so that we can draw them later on.

Conclusion

Working on custom views is a unique experience, and drawing things with basic maths is fun in the first place. Simple equations let me create a preview that simulates the complete badge in software.

Resources


Implementation of Planned Actions in SUSI.AI Android Client

What do you understand by “Planned Actions”? Is it something required to do something according to a plan? Is it something that needs to be done at a particular time? Yes, it is. “Planned Actions” is a feature in the SUSI.AI Android app that helps to schedule tasks. Yes, you heard it right! It helps to schedule tasks. We can schedule events like “setting an alarm after one minute”, “playing music after one minute”, etc. Such events require scheduling different actions at different points in time.

SUSI.AI is a smart assistant. It answers the queries that the user asks, plays music, etc. Additionally, the project also has a smart speaker, which is very similar to Google Home. A smart speaker often has the ability to perform certain tasks, other than answering normal queries, at a scheduled time. During my GSoC period, I got the opportunity to implement such a feature of scheduling tasks in the SUSI.AI Android app.

What is the scheduling of task or planned action?

Scheduling of tasks, or planned actions, is nothing but performing some kind of task at a scheduled time. Let’s consider “set alarm in one minute”. This means an alarm should be set that rings after one minute.

Let’s see how it has been implemented in SUSI.

First of all, we need a skill that responds to the query asked. Consider the skill “rocket launch”. The response for this skill is:

{
  "query": "rocket launch",
  "client_id": "aG9zdF8xNzIuNjguMTQ2LjE5MF80YTk0OGY4OA==",
  "query_date": "2019-08-19T17:31:53.670Z",
  "answers": [{
    "data": [
      {
        "0": "rocket launch",
        "token_original": "rocket",
        "token_canonical": "rocket",
        "token_categorized": "rocket",
        "timezoneOffset": "-330",
        "language": "en"
      },
      {
        "0": "rocket launch",
        "token_original": "rocket",
        "token_canonical": "rocket",
        "token_categorized": "rocket",
        "timezoneOffset": "-330",
        "language": "en"
      }
    ],
    "metadata": {"count": 2},
    "actions": [
      {
        "language": "en",
        "type": "answer",
        "expression": "starting countdown"
      },
      {
        "expression": "twelve",
        "language": "en",
        "type": "answer",
        "plan_delay": 1000,
        "plan_date": "2019-08-19T17:31:54.679Z"
      },
      {
        "expression": "eleven",
        "language": "en",
        "type": "answer",
        "plan_delay": 2000,
        "plan_date": "2019-08-19T17:31:55.681Z"
      },
      {
        "expression": "ten",
        "language": "en",
        "type": "answer",
        "plan_delay": 3000,
        "plan_date": "2019-08-19T17:31:56.681Z"
      },
      {
        "expression": "nine",
        "language": "en",
        "type": "answer",
        "plan_delay": 4000,
        "plan_date": "2019-08-19T17:31:57.682Z"
      },
      {
        "expression": "ignition sequence starts",
        "language": "en",
        "type": "answer",
        "plan_delay": 5000,
        "plan_date": "2019-08-19T17:31:58.682Z"
      },
      {
        "expression": "six",
        "language": "en",
        "type": "answer",
        "plan_delay": 7000,
        "plan_date": "2019-08-19T17:32:00.682Z"
      },
      {
        "expression": "five",
        "language": "en",
        "type": "answer",
        "plan_delay": 8000,
        "plan_date": "2019-08-19T17:32:01.683Z"
      },
      {
        "expression": "four",
        "language": "en",
        "type": "answer",
        "plan_delay": 9000,
        "plan_date": "2019-08-19T17:32:02.683Z"
      },
      {
        "expression": "three",
        "language": "en",
        "type": "answer",
        "plan_delay": 10000,
        "plan_date": "2019-08-19T17:32:03.683Z"
      },
      {
        "expression": "two",
        "language": "en",
        "type": "answer",
        "plan_delay": 11000,
        "plan_date": "2019-08-19T17:32:04.684Z"
      },
      {
        "expression": "one",
        "language": "en",
        "type": "answer",
        "plan_delay": 12000,
        "plan_date": "2019-08-19T17:32:05.684Z"
      },
      {
        "expression": "zero",
        "language": "en",
        "type": "answer",
        "plan_delay": 13000,
        "plan_date": "2019-08-19T17:32:06.684Z"
      },
      {
        "expression": "all engines running",
        "language": "en",
        "type": "answer",
        "plan_delay": 14000,
        "plan_date": "2019-08-19T17:32:07.685Z"
      },
      {
        "expression": "liftoff, we have a liftoff!",
        "language": "en",
        "type": "answer",
        "plan_delay": 16000,
        "plan_date": "2019-08-19T17:32:09.685Z"
      }
    ],
    "skills": ["/susi_skill_data/models/general/Novelty and Humour/en/Apollo_Countdown.txt"],
    "persona": {}
  }],
  "answer_date": "2019-08-19T17:31:53.685Z",
  "answer_time": 15,
  "language": "en",
  "session": {"identity": {
    "type": "host",
    "name": "172.68.146.190_4a948f88",
    "anonymous": true
  }}
}

The response for the above skill can be received by querying “rocket launch” at https://api.susi.ai/, or you can get it directly from the following link: https://api.susi.ai/susi/chat.json?timezoneOffset=-330&q=Rocket+Launch

The skill acts as a countdown timer for a rocket launch. Now we have a skill, but how are we going to implement it in the Android app so that the countdown works properly and each output appears at its scheduled time? Also, we need to make sure that the other processes of the app don’t stop or hang while events are scheduled. Going through all these problems and circumstances, I came up with a solution: I made use of Handler.

A Handler in Android lets us schedule tasks to run at a later time. The work is posted to the message queue of the thread the Handler is attached to and executed asynchronously, so it doesn’t block the app while waiting, and the other processes in the app keep running smoothly.
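As a minimal sketch of the idea (the log message and the two-second delay are made up for illustration; the actual code from the app follows below):

val handler = Handler()
// Run this lambda roughly two seconds from now, without blocking the current thread
handler.postDelayed({
    Log.d("PlannedAction", "two seconds have passed")
}, 2000L)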

Let’s have a look at the code:

for (i in 0 until actionSize) {

  if (susiResponse.answers[0].actions[i].plan_delay.toString().isNullOrEmpty()) {
      planDelay = 0L
  } else {
      planDelay = susiResponse.answers[0].actions[i].plan_delay
  }
  executeTask(planDelay, susiResponse, i, date)
  chatView?.hideWaitingDots()
}

The code above loops through all the actions mentioned in the skill one by one. After parsing each one, the data is passed to the function executeTask(), which schedules the event (planned action).

handler = Handler()
try {
  handler.postDelayed({
      val parseSusiHelper = ParseSusiResponseHelper()
      parseSusiHelper.parseSusiResponse(susiResponse, i, utilModel.getString(R.string.error_occurred_try_again))
      var setMessage = parseSusiHelper.answer

      if (parseSusiHelper.actionType == Constant.TABLE) {
          tableItem = parseSusiHelper.tableData
      } else if (parseSusiHelper.actionType == Constant.VIDEOPLAY || parseSusiHelper.actionType == Constant.AUDIOPLAY) {
          // Play youtube video
          identifier = parseSusiHelper.identifier
          youtubeVid.playYoutubeVid(identifier)
      } else if (parseSusiHelper.actionType == Constant.ANSWER && (PrefManager.checkSpeechOutputPref() && check || PrefManager.checkSpeechAlwaysPref())) {
          setMessage = parseSusiHelper.answer
          try {
              var speechReply = setMessage
              Handler().post {
                  textToSpeech = TextToSpeech(context, TextToSpeech.OnInitListener { status ->
                      if (status != TextToSpeech.ERROR) {
                          val locale = textToSpeech?.language
                          textToSpeech?.language = locale
                          textToSpeech?.speak(speechReply, TextToSpeech.QUEUE_FLUSH, null)
                          PrefManager.putBoolean(R.string.used_voice, true)
                      }
                  })
              }
          } catch (e: Exception) {
               Timber.e("Error occurred while trying to start text to speech engine - " + e)
          }
      } else if (parseSusiHelper.actionType == Constant.STOP) {
          setMessage = parseSusiHelper.stop
          removeCallBacks()
          chatView?.stopMic()
      }

      if (parseSusiHelper.answer == ALARM) {
          playRingTone()
      }

      try {
          databaseRepository.updateDatabase(ChatArgs(
                  prevId = id,
                  message = setMessage,
                  date = DateTimeHelper.getDate(date),
                  timeStamp = DateTimeHelper.getTime(date),
                  actionType = parseSusiHelper.actionType,
                  mapData = parseSusiHelper.mapData,
                  isHavingLink = parseSusiHelper.isHavingLink,
                  datumList = parseSusiHelper.datumList,
                  webSearch = parseSusiHelper.webSearch,
                  tableItem = tableItem,
                  identifier = identifier,
                  skillLocation = susiResponse.answers[0].skills[0]
          ), this)
      } catch (e: Exception) {
           Timber.e("Error occurred while updating the database - " + e)
          databaseRepository.updateDatabase(ChatArgs(
                  prevId = id,
                  message = utilModel.getString(R.string.error_internet_connectivity),
                  date = DateTimeHelper.date,
                  timeStamp = DateTimeHelper.currentTime,
                  actionType = Constant.ANSWER
          ), this)
      }
  }, planDelay)
} catch (e: java.lang.Exception) {
   Timber.e("Error while showing data - " + e)
}

The variable planDelay determines the time after which the task needs to be executed. By default its value is 0, i.e. the action gets executed immediately, unless a delay time is supplied by the skill.

In this way, planned actions have been implemented in SUSI.AI Android app.

Resources: 

Blogs: Handler, Handler Implementation

SUSI Skill: Rocket Launch

SUSI.AI Android App: PlayStore GitHub

Tags: SUSI.AI Android App, Kotlin, SUSI.AI, FOSSASIA, GSoC, Android, Handler, Planned Actions


Displaying of Images using Picasso Library in SUSI.AI Android App

Displaying images is not so easy in an Android app without using a third-party library. Picasso is one of the most popular third-party libraries for Android used to display images. It is a very simple and powerful library for image downloading and caching. SUSI.AI is a smart assistant app, and we use Picasso a lot while answering the queries put up by users. Suppose the user asks to show some pictures; we need to display them, and in such cases Picasso comes in very handy. So let’s see how to implement it by going through the code base of the SUSI.AI Android app.

Why use Picasso or another third-party library?

You might be wondering why we should use a third-party library at all. You can achieve the task without a third-party API as well, but using only the core methods takes a larger amount of code. With a third-party library like Picasso, we achieve the goal in fewer lines of code. Also, we can easily cache images using this library, and caching is an important feature for speeding up an application.

Adding Picasso Library to the Gradle file

Adding the Picasso Android library to your project is very easy. You just need to add the following line to the dependency block of your build.gradle file and replace ${rootConfiguration.picassoVersion} with the latest version of Picasso (for example 2.71828). Now sync your Gradle file. I am assuming that you have already added the INTERNET permission in your project.
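The dependency line referred to above is not reproduced in this post; for reference, it is the standard Picasso artifact, using the version variable mentioned in the text:

// app-level build.gradle
implementation "com.squareup.picasso:picasso:${rootConfiguration.picassoVersion}"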

The simplest way of loading image is:

Picasso.get()
   .load("YOUR IMAGE URL HERE")
   .into(imageView)

Here, imageView is the reference to the imageView where you want to display the image.

This is how it is implemented in the SUSI app. Here imageUrl is the URL of the image to be loaded and imageView is the view where the loaded image is displayed. Now there is a chance that imageUrl is actually not the URL of an image, or that Picasso fails to load the image due to some error. In all such cases, a dummy image is shown in the imageView. This dummy image is added by calling the error() function of Picasso and passing the reference of the image to it. The placeholder() function displays a static image present inside the app until the actual image is loaded.
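The snippet itself is not reproduced in this post; based on the description above, it would look roughly like this (the drawable names are placeholders I have chosen, not necessarily the ones used in the app):

Picasso.get()
    .load(imageUrl)
    .placeholder(R.drawable.ic_placeholder)   // static image shown while loading (hypothetical drawable)
    .error(R.drawable.ic_error)               // dummy image shown if loading fails or the URL is invalid (hypothetical drawable)
    .into(imageView)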

Re-sizing and Rotating

We can also resize and rotate the image very easily.
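The original snippet is missing here as well; as a sketch, Picasso’s RequestCreator exposes resize() and rotate() for this (the dimensions and angle below are arbitrary):

Picasso.get()
    .load(imageUrl)
    .resize(400, 400)   // scale the image to 400x400 px
    .rotate(90f)        // rotate the bitmap by 90 degrees
    .into(imageView)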

Resources: 

Documentation: Picasso, Glide vs Picasso

Tutorial: Picasso Youtube Video

Library source code: Picasso Library source code

SUSI.AI Android App: PlayStore GitHub

Tags: SUSI.AI Android App, Kotlin, SUSI.AI, FOSSASIA, GSoC, Android, Picasso, Image loader


Store data using Realm Database in SUSI.AI Android App

A large application almost always uses some kind of database to store the large chunks of data it receives from the server. One such database is the Realm database.

While working on the SUSI.AI app (a smart assistant open-source app), I found it necessary to store room names in a database. These room names are used while configuring the SUSI Smart Speaker with the app. Let me share how I implemented this in the app.

Before we proceed with the implementation, let’s go through the prerequisites for using Realm with Android:

  • Android Studio version 1.5.1 or higher
  • JDK version 7.0 or higher
  • A recent version of the Android SDK
  • Android API Level 9 or higher (Android 2.3 and above)

First of all, you need to add the Realm Gradle plugin to your project-level and app-level Gradle files.

Since we are going to do it in Kotlin, we need to add the following:
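The snippet is not included in this post; the usual Realm setup for a Kotlin project looks like this (the plugin version is just an example):

// project-level build.gradle
buildscript {
    dependencies {
        classpath "io.realm:realm-gradle-plugin:5.11.0"   // example version
    }
}

// app-level build.gradle (kapt is needed because the project is written in Kotlin)
apply plugin: 'kotlin-kapt'
apply plugin: 'realm-android'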

Since this is a database that is used throughout the application, we have to initialize the realm in the class that extends the Application class.
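A minimal sketch of that initialization (the application class name here is hypothetical):

class SusiApplication : Application() {
    override fun onCreate() {
        super.onCreate()
        // Initialize Realm once for the whole application
        Realm.init(this)
    }
}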

Once Realm is initialized, we set up a data model class. This model class defines the format of the data to be stored. The model class should extend RealmObject, and it must use the keyword open, which makes it non-final so that Realm can inherit from it. If we remove the open keyword here, we get an error stating: cannot inherit from the final model class.

When the Realm objects are initialized, the default values of the variables are declared null (using the concept of default function arguments in Kotlin).
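A sketch of such a model class, assuming we only need to store the room name (the class and field names are my own placeholders, not necessarily those used in the app):

open class RoomModel(
    var roomName: String? = null   // default value is null, via Kotlin default arguments
) : RealmObject()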

Now, to manipulate the database from the required files, we need to create a reference to Realm by getting its default instance.
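That reference is obtained in one line:

val realm: Realm = Realm.getDefaultInstance()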


Any operation that modifies the database must be written within beginTransaction() and commitTransaction() calls. Below is an example where a room name is added to the database using the model class mentioned above.
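The example is missing from this post; a hedged sketch using the hypothetical RoomModel class above could look like this:

fun addRoom(roomName: String) {
    val realm = Realm.getDefaultInstance()
    realm.beginTransaction()
    // Create a managed RoomModel object and set its name
    val room = realm.createObject(RoomModel::class.java)
    room.roomName = roomName
    realm.commitTransaction()
    realm.close()
}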

We use the above method to add data to the database. Now, if we can add data, we should be able to retrieve it too. Let’s go through a code snippet that retrieves data from the database.

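The original snippet is not included here; a sketch of what such a query could look like, again using the hypothetical RoomModel:

fun getAllRooms(): ArrayList<String> {
    val realm = Realm.getDefaultInstance()
    val roomList = ArrayList<String>()
    // Query every stored RoomModel and copy the names into a plain list for the RecyclerView
    val results: RealmResults<RoomModel> = realm.where(RoomModel::class.java).findAll()
    for (room in results) {
        room.roomName?.let { roomList.add(it) }
    }
    return roomList
}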
The above snippet shows a function which extracts data from the database and stores it in an array list for displaying in a recycler view.

Now, we have gone through adding and displaying data in the Realm database. Is that sufficient? No; if we can add data to the database, there should also be a way to delete it when needed. Clearing/removing all data from the database at once is very easy and can be done simply by:
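The snippet is missing from the post; the usual way to wipe everything inside a transaction is:

val realm = Realm.getDefaultInstance()
realm.executeTransaction {
    // Remove every object of every model class from the Realm file
    it.deleteAll()
}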

Resources: 

Documentation: Realm

Reference: Realm Database

SUSI.AI Android App: PlayStore GitHub

Tags: SUSI.AI Android App, Kotlin, SUSI.AI, FOSSASIA, GSoC, Android, Realm, Database


Implementation of Shimmer Effect in Layouts in SUSI.AI Android App

The shimmer effect was created by Facebook to indicate the loading of data on pages where data is fetched from the internet. It was created as an alternative to the existing ProgressBar and the usual loaders, to give a better user experience with the UI.

Let’s get started and see how we can implement it. Here I am going to use SUSI.AI (a smart assistant app) as a reference app to show a code demonstration. I worked on this project during my GSoC period and found the need to implement this feature in many places, so I am writing this blog to share how I implemented it in the app.

First of all, we need to add the shimmer dependency in the app level Gradle file.
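The dependency line is not shown in this post; Facebook publishes the library on Maven Central, so it is the standard artifact (the version below is an example):

// app-level build.gradle
implementation 'com.facebook.shimmer:shimmer:0.5.0'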

Now, we need to create a placeholder layout simply by using plain views. This placeholder should resemble the actual layout. Usually a grey colour is preferred for the placeholder background, and a placeholder should not contain any text; it consists of views only. Let’s consider the placeholder used in SUSI.

Now let’s have a glance at the actual items whose placeholders we have made.

Now, after the creation of the placeholder, we need to add this placeholder in the main layout file. It is done in the following way:

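The layout snippet itself is not included in this post. The general pattern with Facebook’s library is to wrap the repeated placeholder inside a ShimmerFrameLayout; the layout file name item_shimmer_placeholder below is my own placeholder, not necessarily the name used in the app:

<com.facebook.shimmer.ShimmerFrameLayout
    android:id="@+id/shimmer_view_container"
    android:layout_width="match_parent"
    android:layout_height="wrap_content">

    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:orientation="vertical">

        <!-- Repeat the placeholder as many times as needed to fill the screen -->
        <include layout="@layout/item_shimmer_placeholder" />
        <include layout="@layout/item_shimmer_placeholder" />
        <include layout="@layout/item_shimmer_placeholder" />
    </LinearLayout>
</com.facebook.shimmer.ShimmerFrameLayout>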
Here, I have added the placeholders 6 times so that the entire screen gets covered up. You can add it as many times as you want.

The next and final task is to start and stop the shimmer effect according to the logic of the code. Here, the shimmer starts as soon as the fragment is created and stops when the data has been successfully loaded from the server. Let’s have a look at how to create the reference.

First of all, we need a reference to the shimmer layout; we then use this reference to start/stop the shimmer effect. In Kotlin we can directly use the id from the layout (via synthetic view binding) without explicitly creating a reference.

We start the shimmer effect simply by using startShimmer() function in the shimmer reference.

Similarly, we can stop it using stopShimmer() function in the reference.
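Putting it together, a sketch of the start/stop calls (shimmer_view_container is the id from the layout sketch above; the SUSI code may use a different id):

// Start shimmering while the network call is in flight
shimmer_view_container.startShimmer()

// ... once data has arrived and the real views are populated:
shimmer_view_container.stopShimmer()
shimmer_view_container.visibility = View.GONE   // hide the placeholder layout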

Resources: 

Framework: Shimmer in Android

Documentation: ShimmerAndroid Design

SUSI.AI Android App: PlayStore GitHub

Tags:

SUSI.AI Android App, Kotlin, SUSI.AI, FOSSASIA, GSoC, Android, Shimmer


Gestures in SUSI.AI Android

Gestures have become one of the most widely used features of an app. Users usually expect the app to perform certain tasks when they execute gestures on the screen.

A “touch gesture” occurs when a user places one or more fingers on the touch screen, and your application interprets that pattern of touches as a particular gesture. There are correspondingly two phases to gesture detection:

  1. Gather data about touch events.
  2. Interpret the data to see if it meets the criteria for any of the gestures your app supports.

There are various kinds of gestures supported by android. Some of them are:

  • Tap
  • Double Tap
  • 2-finger Tap
  • 2-finger-double tap
  • 3-finger tap
  • Pinch

In this post, we will go through the SUSI.AI Android app (a smart assistant app), which uses a “right to left swipe” gesture detector. When such a gesture is detected inside the chat activity, it opens the Skills activity, which makes the app very user-friendly. Before we start implementing the code, let’s go through the two steps mentioned above in detail.

1st Step “Gather Data”: 

When a user places one or more fingers on the screen, this triggers the callback onTouchEvent() on the View that received the touch events. For each sequence of touch events (position, pressure, size, the addition of another finger, etc.) that is ultimately identified as a gesture, onTouchEvent() is fired several times.

The gesture starts when the user first touches the screen, continues as the system tracks the position of the user’s finger(s), and ends by capturing the final event of the user’s fingers leaving the screen. Throughout this interaction, the MotionEvent delivered to onTouchEvent() provides the details of every interaction. Your app can use the data provided by the MotionEvent to determine if a gesture it cares about happened.

2nd Step “Data Interpretation”:

The data received needs to be properly interpreted. The gestures should be properly recognized and processed to perform further actions. An app might have different gestures integrated into the same page, like “swipe to refresh”, “double tap”, “single tap”, etc. Upon successfully differentiating these kinds of gestures, the corresponding functions/tasks are executed.

Let’s go through the code present in SUSI now.

First of all, a new class “CustomGestureListener” is created. This class extends “SimpleOnGestureListener”, which is part of Android’s “GestureDetector” API. The class contains a function onFling(), which detects fling gestures along the horizontal axis. event1.getX() and event2.getX() give the positions of the gesture along the horizontal axis of the device. When the difference between them is greater than 0, it indicates that the user has swiped from right to left. However, this would trigger even on very minor movements, which users might have made accidentally or by just touching the screen. To avoid such minor impulses, we only execute our task when the difference lies between 100 and 1000.
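The class itself is not reproduced in this post; a hedged sketch of what such a listener could look like, based on the description above (the threshold values come from the text, while the onSwipeRightToLeft callback is my own illustrative addition):

class CustomGestureListener(private val onSwipeRightToLeft: () -> Unit) :
    GestureDetector.SimpleOnGestureListener() {

    override fun onFling(
        event1: MotionEvent,
        event2: MotionEvent,
        velocityX: Float,
        velocityY: Float
    ): Boolean {
        // A positive difference means the finger moved from right to left
        val deltaX = event1.x - event2.x
        if (deltaX in 100f..1000f) {
            onSwipeRightToLeft()   // e.g. open the Skills activity
            return true
        }
        return false
    }
}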

Inside the onCreate() method, a new GestureDetector instance is created, passing through a reference to the enclosing activity and an instance of our new CustomGestureListener class as arguments. Finally, an onTouchEvent() callback method is implemented for the activity, which simply calls the corresponding onTouchEvent() method of the GestureDetector object, passing through the MotionEvent object as an argument.
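A sketch of that wiring inside the activity (openSkillsActivity() is a hypothetical helper, not a name from the actual code):

private lateinit var gestureDetector: GestureDetector

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    // Hook our listener up to a GestureDetector for this activity
    gestureDetector = GestureDetector(this, CustomGestureListener {
        openSkillsActivity()
    })
}

override fun onTouchEvent(event: MotionEvent): Boolean {
    // Forward every touch event to the detector so it can recognise flings
    return gestureDetector.onTouchEvent(event) || super.onTouchEvent(event)
}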

Summary:

Gestures are usually implemented to enhance the user experience of an application. Though there are some predefined gestures in Android, we can also create our own gestures and use them in our application.

Resources: 

Documentation: Gestures

Reference: Gesture

SUSI.AI Android App: PlayStore GitHub

Tags:

SUSI.AI Android App, Kotlin, SUSI.AI, FOSSASIA, GSoC, Android, Gestures


Implementing pagination with Retrofit in Eventyay Attendee

Pagination (paging) is a common and powerful technique in Android development when making HTTP requests or fetching data from a database. Eventyay Attendee has found many situations where pagination comes in as a great solution for our network calls with Retrofit. Let’s take a look at this technique.

  • Problems without Pagination in Android Development
  • Implementing Pagination with Kotlin with Retrofit
  • Results and GIF
  • Conclusions

PROBLEMS WITHOUT PAGINATION IN ANDROID DEVELOPMENT

Making HTTP requests to fetch data from an API is basic work in any kind of application. In a mobile application, network data usage management is an important factor that affects the loading performance of the app. Without paging, all of the data is fetched even though most of it is not displayed on the screen. Pagination is a technique to load the data in pages of limited items, which is much more efficient.

IMPLEMENTING PAGINATION WITH KOTLIN AND RETROFIT

Step 1:  Set up dependency in build.gradle

// Paging
implementation "androidx.paging:paging-runtime:$paging_version"
implementation "androidx.paging:paging-rxjava2:$paging_version"

Step 2:  Set up retrofit to fetch events from the API

@GET("events?include=event-sub-topic,event-topic,event-type")
fun searchEventsPaged(
   @Query("sort") sort: String,
   @Query("filter") eventName: String,
   @Query("page[number]") page: Int,
   @Query("page[size]") pageSize: Int = 5
): Single<List<Event>>

Step 3: Set up the DataSource

DataSource is a base class for loading data in pages from Android’s Paging library. In Eventyay, we use PageKeyedDataSource. It fetches the data based on the page number and items per page with our default parameters. With PageKeyedDataSource, three main functions, loadInitial(), loadBefore() and loadAfter(), are used to load each chunk of data.

class EventsDataSource(
   private val eventService: EventService,
   private val compositeDisposable: CompositeDisposable,
   private val query: String?,
   private val mutableProgress: MutableLiveData<Boolean>

) : PageKeyedDataSource<Int, Event>() {
   override fun loadInitial(
       params: LoadInitialParams<Int>,
       callback: LoadInitialCallback<Int, Event>
   ) {
       createObservable(1, 2, callback, null)
   }

   override fun loadAfter(params: LoadParams<Int>, callback: LoadCallback<Int, Event>) {
       val page = params.key
       createObservable(page, page + 1, null, callback)
   }

   override fun loadBefore(params: LoadParams<Int>, callback: LoadCallback<Int, Event>) {
       val page = params.key
       createObservable(page, page - 1, null, callback)
   }

   private fun createObservable(
       requestedPage: Int,
       adjacentPage: Int,
       initialCallback: LoadInitialCallback<Int, Event>?,
       callback: LoadCallback<Int, Event>?
   ) {
       compositeDisposable +=
           eventService.getEventsByLocationPaged(query, requestedPage)
               .withDefaultSchedulers()
               .subscribe({ response ->
                   if (response.isEmpty()) mutableProgress.value = false
                   initialCallback?.onResult(response, null, adjacentPage)
                   callback?.onResult(response, adjacentPage)
               }, { error ->
                   Timber.e(error, "Fail on fetching page of events")
               }
           )
   }
}

Step 4: Set up the Data Source Factory

DataSourceFactory is the class responsible for creating the DataSource object so that we can create a PagedList (a type of list used for paging) of events.

class EventsDataSourceFactory(
   private val compositeDisposable: CompositeDisposable,
   private val eventService: EventService,
   private val query: String?,
   private val mutableProgress: MutableLiveData<Boolean>
) : DataSource.Factory<Int, Event>() {
   override fun create(): DataSource<Int, Event> {
       return EventsDataSource(eventService, compositeDisposable, query, mutableProgress)
   }
}

Step 5: Adapt the current changes to the ViewModel.

Events that were previously fetched into a List<Event> object should now be turned into a PagedList<Event>.

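One detail worth noting: the config object passed to RxPagedListBuilder below is a PagedList.Config. It is not shown in the original snippet; a sketch of how it might be built, mirroring the page size of 5 used in the Retrofit interface above:

val config = PagedList.Config.Builder()
    .setPageSize(5)                // items fetched per page, matching the API default
    .setInitialLoadSizeHint(5)
    .setEnablePlaceholders(false)  // no null placeholders for items that are not loaded yet
    .build()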
sourceFactory = EventsDataSourceFactory(
   compositeDisposable,
   eventService,
   mutableSavedLocation.value,
   mutableProgress
)
val eventPagedList = RxPagedListBuilder(sourceFactory, config)
   .setFetchScheduler(Schedulers.io())
   .buildObservable()
   .cache()

compositeDisposable += eventPagedList
   .subscribeOn(Schedulers.io())
   .observeOn(AndroidSchedulers.mainThread())
   .distinctUntilChanged()
   .doOnSubscribe {
       mutableProgress.value = true
   }.subscribe({
       val currentPagedEvents = mutablePagedEvents.value
       if (currentPagedEvents == null) {
           mutablePagedEvents.value = it
       } else {
           currentPagedEvents.addAll(it)
           mutablePagedEvents.value = currentPagedEvents
       }
   }, {
       Timber.e(it, "Error fetching events")
       mutableMessage.value = resource.getString(R.string.error_fetching_events_message)
   })

Step 6: Turn ListAdapter into PagedListAdapter

PagedListAdapter is basically the same as ListAdapter, used to update the UI of the event items, but built specifically for pagination. Here, list items can also be null.

class EventsListAdapter : PagedListAdapter<Event, EventViewHolder>(EventsDiffCallback()) {

   var onEventClick: EventClickListener? = null
   var onFavFabClick: FavoriteFabClickListener? = null
   var onHashtagClick: EventHashTagClickListener? = null

   override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): EventViewHolder {
       val binding = ItemCardEventsBinding.inflate(LayoutInflater.from(parent.context), parent, false)
       return EventViewHolder(binding)
   }

   override fun onBindViewHolder(holder: EventViewHolder, position: Int) {
       val event = getItem(position)
       if (event != null)
           holder.apply {
               bind(event, position)
               eventClickListener = onEventClick
               favFabClickListener = onFavFabClick
               hashTagClickListAdapter = onHashtagClick
           }
   }

   /**
    * The function to call when the adapter has to be cleared of items
    */
    fun clear() {
        this.submitList(null)
    }
}

AND HERE ARE THE RESULTS…

CONCLUSION


Pagination is the way to go for fetching items from the API and making infinite scrolling. This helps reduce network usage and improve the performance of Android applications. And that’s it. I hope you can make your application more powerful with pagination. 

RESOURCES

Open Event Codebase: https://github.com/fossasia/open-event-attendee-android/pull/2012

Documentation: https://developer.android.com/topic/libraries/architecture/paging/ 

Google Codelab: https://codelabs.developers.google.com/codelabs/android-paging/#0


Handle app links and apply unit tests in Open Event Attendee Application

Open Event Attendee is an Android app which allows users to discover events happening around the world using the Open Event Platform. It consumes the APIs of the Open Event Server to get a list of available events and detailed information about them. Users following links on devices have one goal in mind: to get to the content they want to see. As a developer, you can set up Android App Links to take users to a link’s specific content directly in your app, bypassing the app-selection dialog, also known as the disambiguation dialog. Because Android App Links leverage HTTP URLs and association with a website, users who don’t have your app installed go directly to content on your site.

A unit test generally exercises the functionality of the smallest possible unit of code (which could be a method, class, or component) in a repeatable way. You should build unit tests when you need to verify the logic of specific code in your app.

  • Why unit test cases?
  • Setup app link intent in the app
  • Apply unit test cases
  • Conclusion
  • Resources

Let’s analyze every step in detail.

Why unit test cases?

As already discussed, app link intents exist to provide a better user experience in the application. Here are some reasons why unit test cases should be used –

  1. Rapid feedback on failures.
  2. Early failure detection in the development cycle.
  3. Safer code refactoring, letting you optimize code without worrying about regressions.
  4. Stable development velocity, helping you minimize technical debt.

JUnit4 library is used for unit tests.

Setup app link intent in the app

Declare the frontend host according to the build type in the app-level Gradle file:

buildTypes {
        release {
            resValue "string",  "FRONTEND_HOST", "eventyay.com"
        }
        debug {
            resValue "string", "FRONTEND_HOST", "open-event-fe.netlify.com"
        }
    }

Handle the app link intent in the Manifest file by adding an intent filter under the main activity declaration:

<intent-filter>
    <action android:name="android.intent.action.VIEW" />
    <category android:name="android.intent.category.DEFAULT" />
    <category android:name="android.intent.category.BROWSABLE" />

    <data
        android:scheme="https"
        android:host="@string/FRONTEND_HOST"/>
</intent-filter>

The manifest will then deliver such intents to the main activity.

Now handle the intent in main activity:

override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        handleAppLinkIntent(intent)
    }

    override fun onNewIntent(intent: Intent?) {
        super.onNewIntent(intent)
        handleAppLinkIntent(intent)
    }

    private fun handleAppLinkIntent(intent: Intent?) {
        val uri = intent?.data ?: return
        val bundle = AppLinkUtils.getArguments(uri)
        val destinationId = AppLinkUtils.getDestinationId(uri)
        if (destinationId != null) {
            navController.navigate(destinationId, bundle)
        }
    }

Here a new class/object AppLinkUtils is defined, which returns the destination fragment id and the arguments/data according to the intent URI.
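The helper itself is not shown in this post. Purely as a hypothetical sketch of what such an object might look like (the real implementation in the app may differ; R.id.eventDetailsFragment and EVENT_IDENTIFIER are taken from the test further below):

object AppLinkUtils {

    fun getDestinationId(uri: Uri): Int? {
        // e.g. https://eventyay.com/e/5f6d3feb -> event details screen
        return when (uri.pathSegments.firstOrNull()) {
            "e" -> R.id.eventDetailsFragment
            else -> null   // reset-password, verify, etc. omitted in this sketch
        }
    }

    fun getArguments(uri: Uri): Bundle {
        val bundle = Bundle()
        // For an /e/<identifier> link, the identifier is the second path segment
        uri.pathSegments.getOrNull(1)?.let { bundle.putString(EVENT_IDENTIFIER, it) }
        return bundle
    }
}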

Apply unit test cases:

First, implement the libraries in the gradle file –  1. JUnit for unit tests, 2. Robolectric for using android classes in the test class:

testImplementation 'junit:junit:4.12'
testImplementation 'org.robolectric:robolectric:3.4.2'

Create a test class for testing the app link functions and run it with RoboLectricTestRunner:

private const val EVENT = "event"
private const val RESET_PASSWORD = "resetPassword"
private const val VERIFY_EMAIL = "verifyEmail"

@RunWith(RobolectricTestRunner::class)
class AppLinkUtilsTest {

    private fun getAppLink(type: String): Uri {
        return when (type) {
            EVENT -> Uri.parse("https://eventyay.com/e/5f6d3feb")
            RESET_PASSWORD -> Uri.parse("https://eventyay.com/reset-password?token=822980340478781748445098077144")
            VERIFY_EMAIL -> Uri.parse("https://eventyay.com/verify?token=WyJsaXZlLmhhcnNoaXRAaG")
            else -> Uri.parse("")
        }
    }

    @Test
    fun `should get event link`() {
        val uri = getAppLink(EVENT)
        assertEquals(R.id.eventDetailsFragment, AppLinkUtils.getDestinationId(uri))
        assertEquals("""
            5f6d3feb
        """.trimIndent(), AppLinkUtils.getArguments(uri).getString(EVENT_IDENTIFIER))
    } // Find more test cases in the GitHub repo.
}

Testing response:

GIF

In a Nutshell

So, essentially, Eventyay Attendee should handle all of these links, i.e. reset password, verify user email and open event details, in the app itself. That way we provide a better user experience in-app instead of redirecting users to the frontend.

Resources

  1. Android app links: https://developer.android.com/studio/write/app-link-indexing
  2. Developing android unit testing: https://www.vogella.com/tutorials/AndroidTesting/article.html

Tags

Eventyay, open-event, JUnit, AndroidUnitTest, AppLinks, Fossasia, GSoC, Android, Kotlin


Configure location feature using MVVM in Open Event Attendee Application

Open Event Attendee is an Android app which allows users to discover events happening around the world using the Open Event Platform. It consumes the APIs of the Open Event Server to get a list of available events and detailed information about them. The app deals with events based on location, but we have to take the location as an input from the user, while in many cases we just want to search for events at our current location. To make this work, I have added a current location option, where the app gets our location and searches for nearby events; earlier we had to enter our current location manually to search nearby events.

Model–view–viewmodel is a software architectural pattern. MVVM facilitates separation of development of the graphical user interface – be it via a markup language or GUI code – from the development of the business logic or back-end logic (the data model).

  • Why Model-view-ViewModel?
  • Setup Geo location View Model
  • Configure location feature with MVVM
  • Conclusion
  • Resources

Let’s analyze every step in detail.

Advantages of using Model-view-ViewModel

  1. A clean separation of different kinds of code should make it easier to go into one or several of those more granular and focused parts and make changes without worrying.
  2. External and internal dependencies are in separate pieces of code from the parts with the core logic that you would like to test.
  3. Observation of mutable live data whenever it is changed.

Setup the Geolocation view model

Create a new Kotlin class named GeoLocationViewModel, which contains a configure function for the current location:

package org.fossasia.openevent.general.search

class GeoLocationViewModel : ViewModel() {
    fun configure(activity: Activity) { }                                     
}

The GeoLocationViewModel class extends ViewModel().

Now register the class in the view model module inside Modules.kt:

val viewModelModule = module {
    viewModel { GeoLocationViewModel() }
}

Configure location feature with GeoLocationViewModel:

First, add the Play Services location dependency inside the dependencies block of build.gradle:

// Location Play Service
playStoreImplementation 'com.google.android.gms:play-services-location:16.0.0'

Now we need location permissions to implement this feature. Adding user permissions in the manifest file:

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />

Now ask user for the location permission:

private fun checkLocationPermission() {
        val permission = context?.let {
            ContextCompat.checkSelfPermission(it, Manifest.permission.ACCESS_COARSE_LOCATION) }
        if (permission != PackageManager.PERMISSION_GRANTED) {
            requestPermissions(arrayOf(Manifest.permission.ACCESS_COARSE_LOCATION,
                Manifest.permission.ACCESS_FINE_LOCATION), LOCATION_PERMISSION_REQUEST)
        }
    }

Check whether device location is enabled; if not, send an intent to open the location settings. This code is written inside the configure function:

val service = activity.getSystemService(Context.LOCATION_SERVICE)
        var enabled = false
        if (service is LocationManager) enabled = service.isProviderEnabled(LocationManager.NETWORK_PROVIDER)
        if (!enabled) {
            val intent = Intent(Settings.ACTION_LOCATION_SOURCE_SETTINGS)
            activity.startActivity(intent)
            return
        }

Now create Mutable live data for current location inside the view model class :

private val mutableLocation = MutableLiveData<String>()
val location: LiveData<String> = mutableLocation

Now implement location request and location callback inside configure method:

val locationRequest: LocationRequest = LocationRequest.create()
locationRequest.priority = LocationRequest.PRIORITY_LOW_POWER
val locationCallback = object : LocationCallback() {
      override fun onLocationResult(locationResult: LocationResult?) {
           if (locationResult == null) {
               return
            }
            for (location in locationResult.locations) {
                if (location != null) {
                    val latitude = location.latitude
                    val longitude = location.longitude
                    try {
                        val geocoder = Geocoder(activity, Locale.getDefault())
                        val addresses: List<Address> = geocoder.getFromLocation(latitude, longitude, maxResults)
                        for (address: Address in addresses) {
                            if (address.adminArea != null) {
                                mutableLocation.value = address.adminArea
                            }
                        }
                    } catch (exception: IOException) {
                        Timber.e(exception, "Error Fetching Location")
                    }
                }
             }
        }
    }

Now call location service inside configure method:

LocationServices
                .getFusedLocationProviderClient(activity)
                .requestLocationUpdates(locationRequest, locationCallback, null)

Now observe data for current location inside the fragment from the view model. Save the current user location and go to the main activity:

private val geoLocationViewModel by viewModel<GeoLocationViewModel>()
geoLocationViewModel.location.observe(this, Observer { location ->
                searchLocationViewModel.saveSearch(location)
                redirectToMain()
            })

GIF

In a Nutshell

So, essentially, Eventyay Attendee should have this feature to show all the events near me or in my city. Although the app was already doing the job, we had to manually select the city or locality we wished to search; now, with the addition of a dedicated current location option, the app is more user-friendly and automated.

Resources

  1. Google location services API: https://www.toptal.com/android/android-developers-guide-to-google-location-services-api
  2. MVVM architecture in Android: https://proandroiddev.com/mvvm-architecture-viewmodel-and-livedata-part-1-604f50cda1

Tags

Eventyay, open-event, PlayServices, Location, MVVM, Fossasia, GSoC, Android, Kotlin


Data Binding with Kotlin in Eventyay Attendee

Data binding is a common and powerful technique in Android development. Eventyay Attendee has found many situations where data binding comes in as a great solution for our complex UI. Let’s take a look at this technique.

  • Problems without data binding in Android Development
  • Implementing Databinding with Kotlin inside Fragment
  • Implementing Databinding with Kotlin inside RecyclerView/Adapter
  • Results and GIF
  • Conclusions

PROBLEMS WITHOUT DATABINDING IN ANDROID DEVELOPMENT

Getting data and displaying it in the UI is basic work in any kind of application. In Android development, the most common way to do this is to call functions like setText() or set isVisible = true/false in your fragment. This can create a lot of long boilerplate code inside Android classes. Data binding removes it and moves the logic to the UI layer (XML).

IMPLEMENTING DATABINDING IN FRAGMENT VIEW

Step 1: Enabling data binding in the project build.gradle

android {
   dataBinding {
       enabled = true
   }
}

Step 2: Wrap the current layout with a <layout></layout> tag. Inside that, put a <data></data> block to declare any variables needed for data binding. For example, the code here declares an event variable for our fragment about event details:

<layout xmlns:android="http://schemas.android.com/apk/res/android"
   xmlns:app="http://schemas.android.com/apk/res-auto"
   xmlns:bind="http://schemas.android.com/tools">

   <data>

       <variable
           name="event"
           type="org.fossasia.openevent.general.event.Event" />
   </data>

   <androidx.coordinatorlayout.widget.CoordinatorLayout
       android:id="@+id/eventCoordinatorLayout"
       android:layout_width="match_parent"
       android:layout_height="match_parent"
       android:background="@android:color/white">

Step 3: Bind your data in the XML file and create a Binding Adapter class for better usage

With the setup above, you can start binding your data with “@{<data code here>}”

<TextView
   android:id="@+id/eventName"
   android:layout_width="0dp"
   android:layout_height="wrap_content"
   android:layout_marginLeft="@dimen/layout_margin_large"
   android:layout_marginTop="@dimen/layout_margin_large"
   android:layout_marginRight="@dimen/layout_margin_large"
   android:text="@{event.name}"
   android:fontFamily="sans-serif-light"
   android:textColor="@color/dark_grey"
   android:textSize="@dimen/text_size_extra_large"
   app:layout_constraintEnd_toEndOf="@+id/eventImage"
   app:layout_constraintStart_toStartOf="@+id/eventImage"
   app:layout_constraintTop_toBottomOf="@+id/eventImage"
   tools:text="Open Source Meetup" />

Sometimes, to bind our data we need a more complex function; in such cases creating a BindingAdapter really helps. For example, Eventyay Attendee heavily uses Picasso to fetch images into an ImageView:

@BindingAdapter("eventImage")
fun setEventImage(imageView: ImageView, url: String?) {
   Picasso.get()
       .load(url)
       .placeholder(R.drawable.header)
       .into(imageView)
}
<ImageView
   android:id="@+id/eventImage"
   android:layout_width="@dimen/layout_margin_none"
   android:layout_height="@dimen/layout_margin_none"
   android:scaleType="centerCrop"
   android:transitionName="eventDetailImage"
   app:layout_constraintDimensionRatio="2"
   app:layout_constraintEnd_toEndOf="parent"
   app:layout_constraintHorizontal_bias="0.5"
   app:layout_constraintStart_toStartOf="parent"
   app:eventImage="@{event.originalImageUrl}"
   app:layout_constraintTop_toBottomOf="@id/alreadyRegisteredLayout" />

Step 4: Finalize the data binding setup in the Android classes. We create a binding variable; its root will serve as the root node of the layout. Whenever data needs to be bound, set the data variable declared in the layout on that binding variable and call executePendingBindings():

private lateinit var rootView: View
private lateinit var binding: FragmentEventBinding
binding = DataBindingUtil.inflate(inflater, R.layout.fragment_event, container, false)
rootView = binding.root
binding.event = event
binding.executePendingBindings()

SOME NOTES

  • In the example mentioned above, the name of the generated binding class is based on the name of the XML file + “Binding”. For example, the XML file name was fragment_event, so the generated data binding class is FragmentEventBinding.
  • The data binding class is only generated after compiling the project.
  • Sometimes compiling the project fails because of data binding problems without any clear log messages; that is usually due to an error in how the data is bound in the XML layout. For example, we encountered a problem when changing a field in the Attendee data class from firstname to firstName while the XML did not follow the update. So make sure you bind your data correctly:
<TextView
   android:id="@+id/name"
   android:layout_width="wrap_content"
   android:layout_height="wrap_content"
   android:layout_marginBottom="@dimen/layout_margin_large"
   android:textColor="@color/black"
   android:textSize="@dimen/text_size_expanded_title_large"
   android:text="@{attendee.firstname + ` ` + attendee.lastname}"
   tools:text="@string/name_preview" />

CONCLUSION

Data binding is the way to go when working with a complex UI in Android development. It helps reduce boilerplate code and increases the readability of the code and the performance of the UI. One problem with data binding is that sometimes it is pretty hard to debug with unhelpful log messages. Hopefully, you can now empower the UI in your project with data binding.

RESOURCES

Eventyay Attendee Android Codebase: https://github.com/fossasia/open-event-android

Eventyay Attendee Android PR: #1961 – feat: Set up data binding for Recycler/Adapter

Documentation: https://developer.android.com/topic/libraries/data-binding

Google Codelab: https://codelabs.developers.google.com/codelabs/android-databinding/#0
