Gestures in SUSI.AI Android
Gestures have become one of the most widely used ways of interacting with an app. Users generally expect the app to perform certain tasks when they execute particular gestures on the screen.
A “touch gesture” occurs when a user places one or more fingers on the touch screen, and your application interprets that pattern of touches as a particular gesture. There are correspondingly two phases to gesture detection:
- Gather data about touch events.
- Interpret the data to see if it meets the criteria for any of the gestures your app supports.
Android supports various kinds of gestures. Some of them are:
- Tap
- Double tap
- 2-finger tap
- 2-finger double tap
- 3-finger tap
- Pinch
In this post, we will go through the SUSI.AI Android app (a smart assistant app), which uses a right-to-left swipe gesture detector. When this gesture is detected inside the Chat Activity, the app opens the Skills Activity. This makes the app very user-friendly. Before we start implementing the code, let's go through the two steps mentioned above in detail.
1st Step “Gather Data”:
When a user places one or more fingers on the screen, the system triggers the onTouchEvent() callback on the View that received the touch events. onTouchEvent() is fired multiple times for each sequence of touch events (position, pressure, size, the addition of another finger, and so on) that is ultimately identified as a gesture.
The gesture starts when the user first touches the screen, continues as the system tracks the position of the user's finger(s), and ends when the system captures the final event of the user's fingers leaving the screen. Throughout this interaction, the MotionEvent delivered to onTouchEvent() provides the details of each step. Your app can use the data provided by the MotionEvent to determine whether a gesture it cares about occurred.
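As a minimal sketch of this first phase (the activity name and log tag below are placeholders, not code from SUSI), an activity can inspect the MotionEvents it receives like this:

```kotlin
import android.app.Activity
import android.util.Log
import android.view.MotionEvent

class TouchLoggingActivity : Activity() {

    override fun onTouchEvent(event: MotionEvent): Boolean {
        // Each call delivers one MotionEvent describing the current state
        // of the gesture: coordinates, pressure, pointer count, and so on.
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> Log.d("Gesture", "started at (${event.x}, ${event.y})")
            MotionEvent.ACTION_MOVE -> Log.d("Gesture", "moved to (${event.x}, ${event.y})")
            MotionEvent.ACTION_UP -> Log.d("Gesture", "ended at (${event.x}, ${event.y})")
        }
        return super.onTouchEvent(event)
    }
}
```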
2nd Step “Data Interpretation”:
The data received needs to be properly interpreted. The gestures should be correctly recognized and processed to perform further actions. For example, an app might have several different gestures integrated into the same screen, like swipe-to-refresh, double tap, and single tap. Once these gestures are successfully differentiated, the corresponding functions/tasks can be executed, as in the sketch below.
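For instance, here is a hypothetical listener (not from the SUSI codebase) attached to a GestureDetector that tells a single tap apart from a double tap, so each can trigger a different task:

```kotlin
import android.content.Context
import android.util.Log
import android.view.GestureDetector
import android.view.MotionEvent

// Builds a detector that distinguishes single taps from double taps.
fun buildTapDetector(context: Context): GestureDetector =
    GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
            Log.d("Gesture", "single tap")  // run the single-tap task here
            return true
        }

        override fun onDoubleTap(e: MotionEvent): Boolean {
            Log.d("Gesture", "double tap")  // run the double-tap task here
            return true
        }
    })
```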
Let’s go through the code present in SUSI now.
First of all, a new class, CustomGestureListener, is created. This class extends SimpleOnGestureListener, which is part of Android's GestureDetector API, and overrides the onFling() function. onFling() detects gestures along the horizontal axis: event1.getX() and event2.getX() return the horizontal positions of the touch events that start and end the fling. When the difference between these two values is greater than 0, the user has swiped from right to left. However, even a tiny movement would satisfy this condition, for example when the user presses the screen accidentally or merely touches it. To avoid reacting to such minor impulses, the task is executed only when the difference lies between 100 and 1000.
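Based on that description, the listener might look roughly like this sketch (not the exact SUSI source; the SkillsActivity target and the threshold values follow the explanation above):

```kotlin
import android.content.Context
import android.content.Intent
import android.view.GestureDetector
import android.view.MotionEvent

class CustomGestureListener(private val context: Context) :
    GestureDetector.SimpleOnGestureListener() {

    override fun onFling(
        event1: MotionEvent,
        event2: MotionEvent,
        velocityX: Float,
        velocityY: Float
    ): Boolean {
        // A positive difference means the fling started further to the
        // right than it ended, i.e. the user swiped from right to left.
        val deltaX = event1.x - event2.x
        // Ignore tiny, accidental movements: only react when the
        // horizontal distance lies between 100 and 1000 pixels.
        if (deltaX in 100f..1000f) {
            context.startActivity(Intent(context, SkillsActivity::class.java))
            return true
        }
        return false
    }
}
```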
Inside the onCreate() method, a new GestureDetector instance is created, passing a reference to the enclosing activity and an instance of our new CustomGestureListener class as arguments. Finally, an onTouchEvent() callback method is implemented for the activity, which simply calls the corresponding onTouchEvent() method of the GestureDetector object, passing through the MotionEvent object as an argument.
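Putting it together, the wiring inside the activity might look like the following sketch (the layout resource and property names are assumptions):

```kotlin
import android.os.Bundle
import android.view.GestureDetector
import android.view.MotionEvent
import androidx.appcompat.app.AppCompatActivity

class ChatActivity : AppCompatActivity() {

    private lateinit var gestureDetector: GestureDetector

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_chat)
        // The detector feeds raw MotionEvents to our custom listener.
        gestureDetector = GestureDetector(this, CustomGestureListener(this))
    }

    override fun onTouchEvent(event: MotionEvent): Boolean {
        // Delegate every touch event to the detector; it invokes
        // onFling() on the listener once it recognizes a fling.
        gestureDetector.onTouchEvent(event)
        return super.onTouchEvent(event)
    }
}
```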
Summary:
Gestures are implemented to enhance the user experience while using an application. Although Android provides some predefined gestures, we can also create gestures of our own and use them in our application.
Resources:
Documentation: Gestures
Reference: Gesture
SUSI.AI Android App: PlayStore, GitHub
Tags:
SUSI.AI Android App, Kotlin, SUSI.AI, FOSSASIA, GSoC, Android, Gestures