Making loklak Server’s Kaizen Harvester Extendable

Harvesting strategies in loklak are something that the end users can’t see, but they play a vital role in deciding the way in which messages are collected with loklak. One of the strategies in loklak is defined by the Kaizen Harvester, which generates queries from collected messages.

The original strategy used a simple hash queue which drops queries once it is full. This behaviour is not desirable, because important queries that come up late during harvesting get lost. To overcome this without losing important search queries, we needed to come up with new harvesting strategies that provide a better approach for harvesting. In this blog post, I discuss the changes made in the Kaizen harvester so it can be extended to create different flavors of harvesters.

What can be different in extended harvesters?

To make the Kaizen harvester extendable, we first needed to decide which parts of the original Kaizen harvester could be changed to make a strategy different (and probably better).

Since the way the Kaizen harvester stores queries to be processed is one of its most crucial parts, it was the most obvious thing to change. Another thing that should be configurable across strategies is the decision of whether to harvest the queries from the query list.

Query storage with KaizenQueries

To allow different methods of storing the queries, the KaizenQueries class was introduced in loklak. It provides the basic methods that any query storing technique needs to implement. A query storing technique can be any data structure that we can use to store search queries for the Kaizen harvester.

public abstract class KaizenQueries {

     public abstract boolean addQuery(String query);

     public abstract String getQuery();

     public abstract int getSize();

     public abstract int getMaxSize();

     public boolean isEmpty() {
         return this.getSize() == 0;
     }
}

[SOURCE]

Also, a default type of KaizenQueries was introduced for use in the original Kaizen harvester. It exposes the same interface as the queue that was originally used in the harvester.
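To give an idea of what such a default implementation could look like, here is a minimal sketch of a queue-backed KaizenQueries. The class name and the choice of LinkedBlockingQueue are assumptions for illustration, not the actual loklak code.

import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical queue-backed KaizenQueries; drops new queries once the capacity is reached
public class BasicKaizenQueries extends KaizenQueries {

    private final LinkedBlockingQueue<String> queue;
    private final int maxSize;

    public BasicKaizenQueries(int maxSize) {
        this.maxSize = maxSize;
        this.queue = new LinkedBlockingQueue<>(maxSize);
    }

    @Override
    public boolean addQuery(String query) {
        // offer() returns false instead of blocking when the queue is full
        return this.queue.offer(query);
    }

    @Override
    public String getQuery() {
        // poll() returns null when no query is queued
        return this.queue.poll();
    }

    @Override
    public int getSize() {
        return this.queue.size();
    }

    @Override
    public int getMaxSize() {
        return this.maxSize;
    }
}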

Another constructor was introduced in the Kaizen harvester which allows setting the KaizenQueries for an instance of its derived classes. It solves the problem of providing a KaizenQueries interface inside the Kaizen harvester which can be used by any inherited strategy –

private KaizenQueries queries = null;

public KaizenHarvester(KaizenQueries queries) {
    ...
    this.queries = queries;
    ...
}

public void someMethod() {
    ...
    this.queries.addQuery(someQuery);
    ...
}

[SOURCE]

With this in place, getting or adding queries became simple. We just need to use the getQuery() and addQuery() methods without worrying about the internal implementation.

Configurable decision for harvesting

As mentioned earlier, the decision taken for harvesting should also be configurable. For this, a protected method was implemented and used in harvest() method –

protected boolean shallHarvest() {
    float targetProb = random.nextFloat();
    float prob = 0.5F;
    if (this.queries.getMaxSize() > 0) {
        prob = queries.getSize() / (float)queries.getMaxSize();
    }
    return !this.queries.isEmpty() && targetProb < prob;
}

@Override
public int harvest() {
    if (this.shallHarvest()) {
        return harvestMessages();
    }

    grabSuggestions();

    return 0;
}

[SOURCE]

The shallHarvest() method can be overridden by derived classes to implement any type of harvesting decision they want. For example, we can configure it in such a way that it harvests whenever there are any queries in the queue, or use a different probability distribution instead of a linear one (maybe Gaussian around 1).
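As an illustration, a derived strategy that harvests whenever the queue is non-empty could be sketched as below. The class name is hypothetical, and the subclass keeps its own reference to the KaizenQueries since the base class field shown above is private.

// Hypothetical derived harvester that always harvests while queries are queued
public class EagerKaizenHarvester extends KaizenHarvester {

    private final KaizenQueries queries;

    public EagerKaizenHarvester(KaizenQueries queries) {
        super(queries);
        this.queries = queries; // own reference, as the base class field is private
    }

    @Override
    protected boolean shallHarvest() {
        // harvest as long as at least one query is waiting, skipping the probabilistic decision
        return !this.queries.isEmpty();
    }
}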

Conclusion

This blog post explained the changes made in loklak’s Kaizen harvester which allow other harvesting strategies to be built on top of it. It discussed the two components that were changed and how they make it easier to inherit from the original Kaizen harvester. These changes were proposed in PR loklak/loklak_server#1203 by @singhpratyush (me).


Accessing Child Component’s API in Loklak Search

Loklak search, being an Angular application, is composed of components. Components give us a way to organize the application more consistently, along with the ability to reuse code. Each component has two types of API: public and private. The public API is what the component exposes to the outer world for manipulating its behaviour, while the private API is local to the component and cannot be directly accessed by the outside world. Now that this distinction is clear, it is important to state why these APIs are needed in loklak search.

Components never live in isolation, i.e. they have to communicate with their parent to function properly. The same is the case with the components of loklak search: they have to interact with each other to make the application work. So what does this interaction look like?

The rule of thumb here is, data flows down, events flow up. This is the core idea of all the SPA frameworks of modern times, unidirectional data flow, and these interactions can be seen everywhere in loklak search.

<feed-header
   [query]="query"
   (searchEvent)="doSearch($event)"></feed-header>

This is how a simple component’s API looks in loklak search. Here our component is FeedHeader and it exposes some of its API as inputs and outputs.

export class FeedHeaderComponent {

 @Input() query: string;

 @Output() searchEvent: EventEmitter<string> = new EventEmitter<string>();

  // Other methods and properties of the component
}

The FeedHeaderComponent class defines the inputs it takes. These inputs are the data given to the component. Here the input is a simple query property, and the parent, at the time of instantiating the component, passes the value to its child as [query]="query". This enables one direction of the API, from parent to child. We also need a way for the parent to react to events generated by the child on interaction with the user. For example, here we need a way to tell the parent to perform a search whenever the user presses the search button. For this the Output property searchEvent is used. The search event can be emitted by the child component independently, while the parent, if it wants to listen to the child, simply binds to the event and runs a corresponding function whenever the event is emitted: (searchEvent)="doSearch($event)". Here the event which the parent listens to is searchEvent, and whenever such an event is emitted by the child, the function doSearch is run by the parent. This completes the event flow, from child to parent.

It is worth noticing that all these inputs for data and outputs for events are provided by the child component itself. They are the API of the child, and the parent’s job is just to bind to these inputs and outputs to pass data and listen to events. This allows component interaction in both directions.

@ViewChild and triggering child’s methods

The inputs are important to carry data from the parent to the child declaratively, but sometimes it is necessary for the parent to access the public API of its child more directly, especially the API methods that trigger an action. These methods require a way for the parent to access its child component, which is done with the @ViewChild decorator. The parent declares the child component it wants access to as one of its attributes. In our example, the FeedHeaderComponent needs access to its child component SuggestBoxComponent, to show/hide the suggest box as and when required. So the feed header component gets access to its child using the @ViewChild decorator.

export class FeedHeaderComponent {

  @ViewChild(SuggestBoxComponent) suggestBox: SuggestBoxComponent;

  // Other properties and methods

  toggleSuggestBox() {

     this.suggestBox.toggle();
  }
}

The SuggestBoxComponent here has a public method toggle() which toggles the visibility state of the suggest box. This method is available as a component’s public API method. The parent of this component calls this method using the @ViewChild reference which it grabbed at the time of view instantiation.

export class SuggestBoxComponent {

  private suggestBoxVisible = true;

  public toggle() {

     this.suggestBoxVisible = !this.suggestBoxVisible;
  }
}

Resources and Links

  • Angular document pages
  • Basic Usage in Angular tour of heroes tutorial
  • In depth usage blog for Inputs and Outputs SitePoint Tutorial
  • Loklak Search Repo

How to Store and Retrieve User Settings from SUSI Server in SUSI iOS

Any user using the SUSI iOS client can set preferences like enabling or disabling the hot word recognition or enabling input from the microphone. These settings need to be stored, in order to be used across all platforms such as web, Android or iOS. Now, in order to store these settings and maintain a synchronization between all the clients, we make use of the SUSI server. The server provides an endpoint to retrieve these settings when the user logs in.

First, we will focus on storing settings on the server followed by retrieving settings from the server. The endpoint to store settings is as follows:

http://api.susi.ai/aaa/changeUserSettings.json?key=key&value=value&access_token=ACCESS_TOKEN

This takes the key-value pair for the setting to be stored and an access token to identify the user, as parameters in the GET request. Let’s start by creating the method that takes the params as input, calls the API to store the settings and returns a status specifying whether the request executed successfully or not.

let url = getApiUrl(UserDefaults.standard.object(forKey: ControllerConstants.UserDefaultsKeys.ipAddress) as! String, Methods.UserSettings)

_ = makeRequest(url, .get, [:], parameters: params, completion: { (results, message) in
    if let _ = message {
        completion(false, ResponseMessages.ServerError)
    } else if let results = results {
        guard let response = results as? [String : AnyObject] else {
            completion(false, ResponseMessages.ServerError)
            return
        }
        if let accepted = response[ControllerConstants.accepted] as? Bool, let message = response[Client.UserKeys.Message] as? String {
            if accepted {
                completion(true, message)
                return
            }
            completion(false, message)
            return
        }
    }
})

Let’s understand this function line by line. First we generate the URL by supplying the server address and the method. Then we pass the URL and the params to the `makeRequest` method, which has a completion handler returning a results object and an error object. Inside the completion handler we check for any error; if one exists we mark the request as completed with an error, otherwise we check that the results object is a dictionary containing the key `accepted`. If this key is `true`, the request executed successfully and we mark it as such before returning. After writing this method, it needs to be called in the view controller, which we do with the following code.

Client.sharedInstance.changeUserSettings(params) { (_, message) in
  DispatchQueue.global().async {
    self.view.makeToast(message)
  }
}

The code above takes input params containing the user token and key-value pair for the setting that needs to be stored. This request runs on a background thread and displays a toast message with the result of the request.

Now that the settings have been stored on the server, we need to retrieve these settings every time the user logs in the app. Below is the endpoint for the same:

http://api.susi.ai/aaa/listUserSettings.json?access_token=ACCESS_TOKEN

This endpoint accepts the access token, which is generated when the user logs in and uniquely identifies the user, and returns his/her settings. Let’s create the method that calls this endpoint, parses the response and saves the settings data in the iOS app’s User Defaults.

if let _ = message {
  completion(false, ResponseMessages.ServerError)
} else if let results = results {
  guard let response = results as? [String : AnyObject] else {
    completion(false, ResponseMessages.ServerError)
    return
  }
  guard let settings = response[ControllerConstants.Settings.settings.lowercased()] as? [String:String] else {
    completion(false, ResponseMessages.ServerError)
    return
  }
  for (key, value) in settings {
    if value.toBool() != nil {
      UserDefaults.standard.set(value.toBool()!, forKey: key)
    } else {
      UserDefaults.standard.set(value, forKey: key)
    }
  }
  completion(true, response[Client.UserKeys.Message] as? String ?? "error")
}

Here, the creation of the URL is the same as above, the only difference being the method passed. We parse the settings key-value pairs into a dictionary, followed by a loop which loops through all the keys and stores the value in User Defaults for each key. We simply call this method just after user login as follows:

Client.sharedInstance.fetchUserSettings(params as [String : AnyObject]) { (success, message) in
  DispatchQueue.global().async {
    print("User settings fetch status: \(success) : \(message)")
  }
}

That’s all for this tutorial where we learned how to store and retrieve settings on the SUSI Server.


Adding Tumblr Upload Feature in Phimpme

The Phimpme Android application, along with various other cloud storage and social media upload features, provides an option to upload images to Tumblr without having to download any other application. In this post, I will explain how I integrated Tumblr in Phimpme, as there is no proper guide on the web for integrating Tumblr in Android. Tumblr provides an Android SDK, but there is no proper documentation for it and it is not enough on its own to authenticate and upload images. After a lot of research I arrived at a solution, so read this article to learn how to integrate Tumblr in Android.

Step 1:

First, add two dependencies to your project: one for the Tumblr Android SDK (Jumblr) and one for Loglr, which helps you log in to Tumblr.

dependencies {
    compile 'com.daksh:loglr:1.2.1'
    compile 'com.tumblr:jumblr:0.0.11'
}

Step 2:

  1. Register your app on Tumblr to obtain developer keys.
  2. Enter a callback URL; it is required to obtain the keys.
  3. Generate CONSUMER_KEY & CONSUMER_SECRET from the official developer console of Tumblr.

Register your application

Step 3:

Now we use the Loglr library to log in to Tumblr, since Tumblr doesn’t provide any library for login. After a successful login, Loglr returns an OAuth token and secret, which we will use later to upload the image. The consumer keys from the previous step are saved as constants:

public final static String TUMBLR_CONSUMER_KEY = "ENTER-CONSUMER-KEY";

public final static String TUMBLR_CONSUMER_SECRET = "ENTER-CONSUMER-SECRET";

Step 4:

To authenticate the Tumblr use loglr login instance and it can be done as follows.

Loglr.getInstance()
     .setConsumerKey(Constants.TUMBLR_CONSUMER_KEY)
     .setConsumerSecretKey(Constants.TUMBLR_CONSUMER_SECRET)
     .setLoginListener(loginListener)
     .setExceptionHandler(exceptionHandler)
     .enable2FA(true)
     .setUrlCallBack(Constants.CALL_BACK_TUMBLR)
     .initiateInActivity(AccountActivity.this);

After that you will be prompted to enter your Tumblr credentials to authorize the Phimpme Android app. Once done, it returns the OAuth token and secret. Now save these in the database.

account.setToken(loginResult.getOAuthToken());

account.setSecret(loginResult.getOAuthTokenSecret());
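With the consumer keys from step 2 and the OAuth token and secret returned by Loglr, the Jumblr client can be constructed before uploading. Below is a minimal sketch; the getToken() and getSecret() getters on the account object are assumed to mirror the setters shown above.

// Sketch: build the Jumblr client from the consumer keys and the saved OAuth token/secret
JumblrClient client = new JumblrClient(
        Constants.TUMBLR_CONSUMER_KEY,
        Constants.TUMBLR_CONSUMER_SECRET);
client.setToken(account.getToken(), account.getSecret()); // getters assumed for illustration

// the authenticated user gives access to the blog list used in the upload code below
User user = client.user();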

Step 5:

Once the authentication is done, we can upload an image directly to Tumblr from the Share activity in the Phimpme Android application. To upload an image, create an AsyncTask so that the uploading process runs in a background thread and does not block the main UI thread. Keep in mind that Tumblr requires four values to create the Tumblr client: the consumer key, the consumer secret, and the user’s OAuth token and secret. Once the client is created, we are ready to get data from Tumblr and upload an image to it. Before uploading an image we need the blog name, because the user can have multiple blogs on Tumblr, so we need to ask the user to choose a blog name from a list or provide a dialog to enter the blog name manually. Now enter the following code in the doInBackground() method of the AsyncTask.

PhotoPost post = null;
try {
    post = client.newPost(user.getBlogs().get(0).getName(), PhotoPost.class);
    if (caption != null && !caption.isEmpty())
        post.setCaption(caption);
    post.setData(new File(imagePath));
    post.save();
} catch (IllegalAccessException | InstantiationException e) {
    success = false;
}

If the success variable is true, that means our image was uploaded successfully. This is how I implemented the upload feature for Tumblr using two different libraries. To get the full source code, please refer to the Phimpme Android repository.


Uploading Images to Box Storage from the Phimpme Application

The Phimpme Android application, which already integrates many other cloud storage services like Dropbox, Imgur and Pinterest, has the option to upload images to Box storage without having to install any other application on the device. From the Phimpme app, the user can click a photo, edit it, view any image from the gallery and then upload multiple images to many storage services or social media without any hassle. In this tutorial, I will discuss how we achieved the functionality to upload images to Box storage.

Step 1:

To integrate Box storage to the application, the first thing we need to do is create an application from the Box developers console and get the CLIENT_ID and the CLIENT_SECRET. To do this:

  1. Go to the Box developer page and log in.
  2. Create a new application from the app console.
  3. It will now give options to select what kind of application you are building. Select the option partner integration to get the image upload functionality for all the users.
  4. Give a name to your application and finally click the create application button.
  5. After this, you will be taken to the application’s configuration screen, from where you can copy the CLIENT_ID and CLIENT_SECRET to be used later on.

Step 2:

Now coming to the Android part: to get the image upload functionality for Box storage, we make use of the Android SDK provided by Box, which makes the uploading functionality easy to achieve. To add the SDK to the Android project from Gradle, copy the following lines of Gradle script, which will download and install the SDK from the maven repository.

maven{ url "https://mvnrepository.com/artifact/com.box/box-android-sdk" }
compile group: 'com.box', name: 'box-android-sdk', version: '4.0.8'

Now after adding the SDK, rebuild the project.

Step 3:

After this, to upload a file to Box storage, we need to log in to the Box account to get the access token and the user name, which are stored in the Realm database. For this, first we have to configure the Box client. This can be done using the following lines of code.

BoxConfig.CLIENT_ID = BOX_CLIENT_ID;
BoxConfig.CLIENT_SECRET = BOX_CLIENT_SECRET;

Replace the BOX_CLIENT_ID and BOX_CLIENT_SECRET in the above code with the key received by following the step 1.

After configuring the Box API client, we can authenticate the user using the sessionBox object by following lines of code.

sessionBox = new BoxSession(AccountActivity.this);
sessionBox.authenticate();

After the authentication, we have to get the user name and the access token of the user using the authenticated session box.

account.setUsername(sessionBox.getUser().getName());
account.setToken(String.valueOf(accessToken));

After authenticating the user, we can upload photos directly to Box storage from the Share Activity in the Phimpme Android application. For this, we have to create a new asynchronous task in Android which will do the network operations in the background to avoid the NetworkOnMainThreadException. After this, we can make use of the Box file API to upload photos to Box storage by calling the function getUploadRequest, which takes three parameters: the file input stream, the upload name of the file and the destination folder ID. This can be done with the following lines of code.

mFileApi = new BoxApiFile(sessionBox);
BoxRequestsFile.UploadFile request = mFileApi.getUploadRequest(inputStream, uploadName, destinationFolderId);

This upload request throws a BoxException, so we have to catch that exception using a try/catch block to avoid an application crash in case the upload request fails.
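Putting it together, the upload could be wrapped in an AsyncTask roughly like the sketch below. The class name and the fields inputStream, uploadName and destinationFolderId are illustrative and not the exact Phimpme implementation.

// Hypothetical background task wrapping the Box upload and catching BoxException
private class UploadToBoxTask extends AsyncTask<Void, Void, Boolean> {

    @Override
    protected Boolean doInBackground(Void... params) {
        try {
            BoxApiFile fileApi = new BoxApiFile(sessionBox);
            BoxRequestsFile.UploadFile request =
                    fileApi.getUploadRequest(inputStream, uploadName, destinationFolderId);
            request.send(); // runs the upload synchronously on this background thread
            return true;
        } catch (BoxException e) {
            // upload failed; report the failure back to the UI in onPostExecute()
            return false;
        }
    }
}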

This is how we have implemented the upload to Box storage functionality in the Phimpme Android application. To get the full source code, please refer to the Phimpme Android repository.

Resources

  1. Box Developers app console : https://app.box.com/developers/console/
  2. Box Android SDK GitHub page : https://github.com/box/box-android-sdk
  3. Android’s Developer page on NetworkOnMainThreadException : https://developer.android.com/reference/android/os/NetworkOnMainThreadException.html

Posting Scraped Tweets to Loklak server from Loklak Wok Android

Loklak Wok Android is a peer harvester that posts collected messages to the Loklak Server. The suggestions for search queries are fetched using the suggest API endpoint. Using the suggested queries, tweets are scraped. The scraped tweets are shown in a RecyclerView and simultaneously posted to the loklak server using the push API endpoint. Let’s see how this is implemented.

Adding Dependencies to the project

This feature heavily uses Retrofit2, Reactive extensions(RxJava2, RxAndroid and Retrofit RxJava adapter) and RetroLambda (for Java lambda support in Android).

In app/build.gradle:

apply plugin: 'com.android.application'
apply plugin: 'me.tatarka.retrolambda'

android {
   ...
   packagingOptions {
       exclude 'META-INF/rxjava.properties'
   }
}

dependencies {
   ...
   compile 'com.google.code.gson:gson:2.8.1'

   compile 'com.squareup.retrofit2:retrofit:2.3.0'
   compile 'com.squareup.retrofit2:converter-gson:2.3.0'
   compile 'com.squareup.retrofit2:adapter-rxjava2:2.3.0'

   compile 'io.reactivex.rxjava2:rxjava:2.0.5'
   compile 'io.reactivex.rxjava2:rxandroid:2.0.1'
}

 

In build.gradle project level:

dependencies {
   classpath 'com.android.tools.build:gradle:2.3.3'
   classpath 'me.tatarka:gradle-retrolambda:3.2.0'
}

 

Implementation

The suggest and push API endpoint is defined in LoklakApi interface

public interface LoklakApi {

   @GET("/api/suggest.json")
   Observable<SuggestData> getSuggestions(@Query("q") String query, @Query("count") int count);

   @POST("/api/push.json")
   @FormUrlEncoded
   Observable<Push> pushTweetsToLoklak(@Field("data") String data);
}

 

The POJOs (Plain Old Java Objects) for suggestions and posting tweets are obtained using jsonschema2pojo; Gson uses the POJOs to convert JSON to Java objects.
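For example, the POJO for the push response might look roughly like the sketch below. The field name records is inferred from the push.getRecords() call used later in this post; the rest is illustrative.

// Minimal Gson POJO sketch for the push API response (field name assumed from usage below)
public class Push {

    @SerializedName("records")
    private int records;

    public int getRecords() {
        return records;
    }
}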

The REST client is created with Retrofit2 and is implemented in the RestClient class. The Gson converter and the RxJava adapter for Retrofit are added in the Retrofit builder. The create method is called to generate the API methods (Retrofit implements the LoklakApi interface).

public class RestClient {

   private RestClient() {
   }

   private static void createRestClient() {
       sRetrofit = new Retrofit.Builder()
               .baseUrl(BASE_URL)
               // gson converter
               .addConverterFactory(GsonConverterFactory.create(gson))
               // retrofit adapter for rxjava
               .addCallAdapterFactory(RxJava2CallAdapterFactory.create())
               .build();
   }

   private static Retrofit getRetrofitInstance() {
       if (sRetrofit == null) {
           createRestClient();
       }
       return sRetrofit;
   }

   public static <T> T createApi(Class<T> apiInterface) {
       // create method to generate API methods
       return getRetrofitInstance().create(apiInterface);
   }

}

 

The suggestions are fetched by calling getSuggestions after the LoklakApi interface is implemented. getSuggestions returns an Observable of type SuggestData, which contains the suggestions in a List. For scraping tweets only a single query needs to be passed to LiquidCore, so flatMap is used to transform the observable and then the fromIterable operator is used to emit single queries as strings to LiquidCore, which then scrapes tweets, as implemented in fetchSuggestions

private Observable<String> fetchSuggestions() {
   LoklakApi loklakApi = RestClient.createApi(LoklakApi.class);
   Observable<SuggestData> observable = loklakApi.getSuggestions("", 2);
   return observable.flatMap(suggestData -> {
       List<Query> queryList = suggestData.getQueries();
       List<String> queries = new ArrayList<>();
       for (Query query : queryList) {
           queries.add(query.getQuery());
       }
       return Observable.fromIterable(queries);
   });
}

 

As LiquidCore uses callbacks to create a connection between the NodeJS instance and Android, a custom observable is created using the create operator, which encapsulates the callbacks inside it in order to maintain a flow of observables. For a detailed understanding of how LiquidCore event handling works, please go through the example. This is the way it is implemented in getScrapedTweets:

private Observable<ScrapedData> getScrapedTweets(final String query) {
   final String LC_TWITTER_URI = "android.resource://org.loklak.android.wok/raw/twitter";
   URI uri = URI.create(LC_TWITTER_URI);

   return Observable.create(emitter -> { // custom observable creation
       EventListener startEventListener = (service, event, payload) -> {
            service.emit(LC_QUERY_EVENT, query);
           service.emit(LC_FETCH_TWEETS_EVENT);
       };

       EventListener getTweetsEventListener = (service, event, payload) -> {
           ScrapedData scrapedData = mGson.fromJson(payload.toString(), ScrapedData.class);
           emitter.onNext(scrapedData); // data emitted using ObservableEmitter
       };

       MicroService.ServiceStartListener serviceStartListener = (service -> {
           service.addEventListener(LC_START_EVENT, startEventListener);
           service.addEventListener(LC_GET_TWEETS_EVENT, getTweetsEventListener);
       });

       MicroService microService = new MicroService(getActivity(), uri, serviceStartListener);
       microService.start();
   });
}

 

Now that we are getting suggestions and using them to get scraped tweets, this needs to be done periodically so that tweets are pushed continuously to the loklak server. For this the interval operator is used. A List is maintained which contains the suggestion queries for which tweets are to be scraped. Once the scraping is done, the suggestion query is removed from the list when the corresponding tweets are displayed in the RecyclerView. A new set of suggestions is fetched only if the list is empty.

Observable.interval(4, TimeUnit.SECONDS)
       .flatMap(this::getSuggestionsPeriodically)
       .flatMap(query -> {
           mSuggestionQuerries.add(query); // query added to list
           return getScrapedTweets(query);
       });

 

A method reference is used to maintain modularity, so the logic of periodically fetching suggestions is implemented in getSuggestionsPeriodically

private Observable<String> getSuggestionsPeriodically(Long time) {
   if (mSuggestionQuerries.isEmpty()) { // checks if list is empty
       mInnerCounter = 0;
       return fetchSuggestions(); // new suggestions
   } else { // wait for a previous request to complete
       mInnerCounter++;
       if (mInnerCounter > 3) { // if some strange error occurs
           mSuggestionQuerries.clear();
       }
        return Observable.never(); // no observable is passed to subscriber, subscriber waits
   }
}

 

Now, it’s time to display the fetched tweets and then push them to the loklak server. When periodic fetching of suggestions was implemented, we used the interval operator and then flatMap to transform observables, i.e. chaining network requests.

Till this point the observables we were creating were cold observables. Cold observables only emit values when a subscription is made. As we need to display the scraped tweets and then push them, we have one source of observables and two (multiple) subscribers. Intuitively, the observable should be subscribed to two times, for example:

Observable observable = Observable.interval(4, TimeUnit.SECONDS)
       .flatMap(this::getSuggestionsPeriodically)
       .flatMap(query -> {
           mSuggestionQuerries.add(query);
           return getScrapedTweets(query);
       });

// first time subscription
observable
       .subscribeOn(Schedulers.io())
       .observeOn(AndroidSchedulers.mainThread())
       .subscribe(
	// display in RecyclerVIew
       );


// second time subscription
observable
        .flatMap(this::pushScrapedData) // transformations to push data to server
       .subscribeOn(Schedulers.io())
       .observeOn(AndroidSchedulers.mainThread())
       .subscribe(
	// display in number of tweets pushed
       );

 

But the source observable is a cold observable, i.e. it emits objects only when it is subscribed to, due to which there will be two different network calls, one for the first subscription and one for the second. So the two subscriptions will have different data, which is not what is desired. The expected result is a single network call, with the data obtained from that call both displayed and pushed to the loklak server.

For this, hot Observables are used. Hot observables start emitting objects the moment they are created, irrespective of whether they are subscribed or not.

A cold observable can be converted to a hot observable by using publish operator and it starts emitting objects when connect operator is used. This is implemented in displayAndPostScrapedData:

private void displayAndPostScrapedData() {
    ConnectableObservable<ScrapedData> observable = Observable.interval(4, TimeUnit.SECONDS)
            .flatMap(this::getSuggestionsPeriodically)
            .flatMap(query -> {
                mSuggestionQuerries.add(query);
                return getScrapedTweets(query);
            })
            .retry(2)
            .publish();

   // first time subscription to display scraped data
   Disposable viewDisposable = observable
           .subscribeOn(Schedulers.io())
           .observeOn(AndroidSchedulers.mainThread())
           .subscribe(
                   this::displayScrapedData,
                   this::setNetworkErrorView
           );
   mCompositeDisposable.add(viewDisposable);
  
   // second time subscription for pushing data to loklak
   Disposable pushDisposable = observable
           .flatMap(this::pushScrapedData) // scraped data transformed for pushing
           .subscribeOn(Schedulers.io())
           .observeOn(AndroidSchedulers.mainThread())
           .subscribe(
                   push -> {
                       mHarvestedTweets += push.getRecords();
                       harvestedTweetsCountTextView.setText(String.valueOf(mHarvestedTweets));
                   },
                   throwable -> {}
           );
   mCompositeDisposable.add(pushDisposable);

   Disposable publishDisposable = observable.connect(); // hot observable starts emitting
   mCompositeDisposable.add(publishDisposable);
}

 

The two subscriptions are made before the connect operator is invoked because the hot observable emits objects as a result of successful network calls, and network calls can’t be done on the MainThread (UI thread). So, doing the subscriptions beforehand channels the network calls to a background thread.

The scraped data is converted from objects to JSON using Gson, the JSON is converted to a string, and then it is posted to the loklak server using the push API endpoint. This is implemented in the pushScrapedData method, which is used in the second subscription via method referencing.

private Observable<Push> pushScrapedData(ScrapedData scrapedData) throws Exception {
    LoklakApi loklakApi = RestClient.createApi(LoklakApi.class);
    List<Status> statuses = scrapedData.getStatuses();
    String data = mGson.toJson(statuses);
    JSONArray jsonArray = new JSONArray(data);
    JSONObject jsonObject = new JSONObject();
    jsonObject.put("statuses", jsonArray);
    return loklakApi.pushTweetsToLoklak(jsonObject.toString());
}

 

Method references to the displayScrapedData and setNetworkErrorView methods are used to display the scraped data and handle unsuccessful network requests.

Only 80 tweets are preserved in RecyclerView. If number of tweets exceeds 80, then old tweets are removed.

private void displayScrapedData(ScrapedData scrapedData) {
   String query = scrapedData.getQuery();
   List<Status> statuses = scrapedData.getStatuses();
   mSuggestionQuerries.remove(query);
   if (mHarvestedTweetAdapter.getItemCount() > 80) {
       mHarvestedTweetAdapter.clearAdapter(); // old tweets removed
   }
   mHarvestedTweetAdapter.addHarvestedTweets(statuses);
   int count = mHarvestedTweetAdapter.getItemCount() - 1;
   recyclerView.scrollToPosition(count);
}

 

In case of a network error, the visibility of the RecyclerView and the TextView (which shows the number of tweets pushed) is changed to GONE and a message is displayed saying that there is a network error.

private void setNetworkErrorView(Throwable throwable) {
   Log.e(LOG_TAG, throwable.toString());
   // recyclerView and TextView showing count of harvested tweets are hidden
   ButterKnife.apply(networkViews, GONE);
   // network error message displayed
   networkErrorTextView.setVisibility(View.VISIBLE);
}


Realm database in Loklak Wok Android for Persistent view

Loklak Wok Android provides suggestions for tweet searches. The suggestions are stored in a local database to provide a persistent view, resulting in a better user experience. The local database used here is the Realm database instead of sqlite3, which is supported by the Android SDK. The proper way to use an sqlite3 database is to first create a contract where the schema of the database is defined, then a database helper class which extends SQLiteOpenHelper where the schema is created, i.e. tables are created, and finally write a ContentProvider so that you don’t have to write long SQL queries every time a database operation needs to be performed. This is a lot of hard work, as it involves many steps, and debugging is also difficult. A solution can be to use an ORM that provides a simple API over sqlite3, but the currently available ORMs lack in terms of performance; they are too slow. A reliable solution to this problem is the Realm database, which is faster than raw sqlite3 and has a really simple API for database operations. This blog explains the use of the Realm database for storing tweet search suggestions.

Adding Realm database to Android project

In project level build.gradle

buildscript {
   repositories {
       jcenter()
   }
   dependencies {
       classpath 'com.android.tools.build:gradle:2.3.3'
       classpath "io.realm:realm-gradle-plugin:3.3.1"

       // NOTE: Do not place your application dependencies here; they belong
       // in the individual module build.gradle files
   }
}

 

And at the top of app/build.gradle, "apply plugin: 'realm-android'" is added.

Using Realm Database

Let’s start with a simple example. We have a Student class that has only two attributes, name and age. To create the model for the database, the Student class simply extends RealmObject.

public class Student extends RealmObject {

   private String name;
   private int age;

   // A constructor needs to be explicitly defined, be it an empty constructor
   public Student(String name, int age) {
       this.name = name;
       this.age = age;
   }

   // getters and setters
}

 

To push data to the database, Java objects are created, a transaction is initialized, then copyToRealm method is used to push the data and finally the transaction is committed. But before all this, the database is initialized and a Realm instance is obtained.

Realm.init(context); // Database initialized
Realm realm = Realm.getDefaultInstance(); // realm instance obtained
      
Student student = new Student("Rahul Dravid", 22); // Simple java object created
realm.beginTransaction(); // initialization of transaction
realm.copyToRealm(student); // pushed to database
realm.commitTransaction(); // transaction committed

 

copyToRealm takes only a single parameter; the parameter can be an object or an Iterable. Of course, the passed parameter’s class should extend RealmObject. A List of Student objects can be passed to copyToRealm to push multiple entries into the database, as shown in the sketch below.
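For example, inserting several students in one synchronous transaction could look like this small illustrative sketch:

List<Student> students = new ArrayList<>();
students.add(new Student("Rahul Dravid", 22));
students.add(new Student("Anil Kumble", 23));

realm.beginTransaction();
realm.copyToRealm(students); // the whole list is persisted in a single transaction
realm.commitTransaction();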

The above way of inserting data is synchronous. Realm also supports asynchronous transactions; you guessed it right, you don’t have to depend on AsyncTaskLoader. The same operation can be performed asynchronously as

realm.executeTransactionAsync(new Realm.Transaction() {
    @Override public void execute(Realm realm) {
        Student student = new Student("Rahul Dravid", 22);
        realm.copyToRealm(student);
    }
});

 

Now, querying the database is as easy as inserting.

RealmResults<Student> studentList = realm.where(Student.class).findAll();

 

No transaction is required, as we are not manipulating the database; data is just read from it. RealmResults extends Java’s List, so List methods which don’t manipulate the List can be used on studentList, e.g. get(int index) to obtain the object at that index.

The result can also be a filtered one, for example filtering students who are 22 years old.

RealmResults<Student> studentList = realm.where(Student.class).equalTo(age, 22).findAll();

 

Now, removing data from the database: the deleteAllFromRealm method can be executed on the obtained RealmResults, or, to completely remove the data of a model class, the delete(Model.class) method is invoked on the realm instance. The operations should be enclosed between beginTransaction and commitTransaction if synchronous behaviour is required; for asynchronous behaviour the operation is done in the execute method of an anonymous Realm.Transaction object.

studentList.deleteAllFromRealm(); // removes the filtered result
realm.delete(Student.class); // removes all data of model class

 

Storing Tweet Search Suggestions in Loklak Wok Android for Persistent view

Loklak Wok Android uses Retrofit2 for sending network requests, for which POJO classes are already created so that it becomes easy to parse the JSON obtained from a network request. Because of this, using the Realm database becomes even easier, as the defined POJOs can simply extend RealmObject to create the model class of the data, e.g. the Query class extends RealmObject and one of its attributes is the suggestion query, i.e. mQuery.

The database is initialized in LoklakWokApplication, the application class. This way the database is initialized only once and persists throughout the app lifecycle.

@Override
public void onCreate() {
   super.onCreate();
   Realm.init(this);
   RealmConfiguration realmConfiguration = new RealmConfiguration.Builder()
           .name(Realm.DEFAULT_REALM_NAME)
           .deleteRealmIfMigrationNeeded()
           .build();
   Realm.setDefaultConfiguration(realmConfiguration);
}

 

deleteRealmIfMigrationNeeded removes the old realm database and creates a new one if any of the model classes changes, i.e. an attribute is removed, added or simply renamed. This is done because we are not storing user generated data; we are just using the database to provide a persistent view, so the previously kept data is not important.

The database is closed in onTerminate callback of the application

@Override
public void onTerminate() {
   Realm.getDefaultInstance().close();
   super.onTerminate();
}

 

Now that the database is initialized, we fetch the previously stored data and display it in the RecyclerView. If the network request is successful, the queries from the database are replaced by the queries fetched in the network request; otherwise the queries from the database are displayed, providing a persistent view. This is the way it is implemented in onCreateView of SuggestFragment

mRealm = Realm.getDefaultInstance();
...
// old queries obtained from database
RealmResults<Query> queryRealmResults = mRealm.where(Query.class).findAll();
List<Query> queries = mRealm.copyFromRealm(queryRealmResults);
// RecyclerView adapter created with old queries
mSuggestAdapter = new SuggestAdapter(queries, this);
tweetSearchSuggestions.setLayoutManager(new LinearLayoutManager(getActivity()));
tweetSearchSuggestions.setAdapter(mSuggestAdapter);

 

The old queries are replaced in onSuccessfulRequest

private void onSuccessfulRequest(SuggestData suggestData) {
   // suggestData contains suggestion queries
   if (suggestData != null) {
       // old queries replaced with new ones
       mSuggestAdapter.setQueries(suggestData.getQueries());
   }
   setAfterRefreshingState();
}

 

Now the suggestion queries need to be inserted into the database. Only the latest suggestions are inserted, i.e. the queries present when the onStop lifecycle method of the fragment is called, and as the previous queries are not needed anymore, they are deleted. The operation is performed in a synchronous way.

@Override
public void onStop() {
   ...
   mRealm.beginTransaction();
   // old queries deleted
   mRealm.delete(Query.class);
   // new queries inserted
   mRealm.copyToRealm(mSuggestAdapter.getQueries());
   mRealm.commitTransaction();
   // fragment lifecycle called i.e. a new fragment/activity opens
   super.onStop();
}

 

Conclusion: Sqlite3 and Realm comparison

  • Table creation: sqlite3 uses CREATE TABLE …, while in Realm the model class simply extends RealmObject
  • Inserting data: INSERT INTO … vs. copyToRealm
  • Searching data: SELECT … vs. realm.where(Model.class)
  • Deleting data: DELETE FROM … vs. realmResults.deleteAllFromRealm() or realm.delete(Model.class)


Implementing Tweet Search Suggestions in Loklak Wok Android

Loklak Wok Android is not only a peer harvester for the Loklak Server, it also lets users search tweets using Loklak’s API endpoints. To provide a better tweet search experience to the users, the app provides search suggestions using the suggest API endpoint. This blog describes how “Search Suggestions” is implemented.

Third Party Libraries used to Implement Suggestion Feature

  • Retrofit2: Used for sending network request
  • Gson: Used for serialization, JSON to POJOs (Plain old java objects).
  • RxJava and RxAndroid: Used to implement a clean asynchronous workflow.
  • Retrolambda: Provides support for lambdas in Android.

These libraries can be installed by adding the following dependencies in app/build.gradle

android {
   .
   // removes rxjava file repetitions
   packagingOptions {
      exclude 'META-INF/rxjava.properties'
   }
}

dependencies {
   // gson and retrofit2
    compile 'com.google.code.gson:gson:2.8.1'
    compile 'com.squareup.retrofit2:retrofit:2.3.0'
    compile 'com.squareup.retrofit2:converter-gson:2.3.0'
    compile 'com.squareup.retrofit2:adapter-rxjava2:2.3.0'

   // rxjava and rxandroid
    compile 'io.reactivex.rxjava2:rxjava:2.0.5'
    compile 'io.reactivex.rxjava2:rxandroid:2.0.1'
    compile 'com.jakewharton.rxbinding2:rxbinding:2.0.0'
}

 

To add retrolambda

// in project's build.gradle
dependencies {
    
    classpath 'me.tatarka:gradle-retrolambda:3.2.0'
}

// in app level build.gradle at the top
apply plugin: 'me.tatarka.retrolambda'

 

Fetching Suggestions

Retrofit2 sends a GET request to the suggest API endpoint, and the returned JSON response is serialized into Java objects using the models defined in the models.suggest package. The models can be easily generated using jsonschema2pojo. The benefit of using Gson is that the hard work of parsing JSON is handled by it. The static method createRestClient creates the retrofit instance to be used for network calls

private static void createRestClient() {
   sRetrofit = new Retrofit.Builder()
           .baseUrl(BASE_URL) // base url : https://api.loklak.org/api/
           .addConverterFactory(GsonConverterFactory.create(gson))
           .addCallAdapterFactory(RxJava2CallAdapterFactory.create())
           .build();
}

 

The suggest endpoint is defined in LoklakApi interface

public interface LoklakApi {

   @GET("/api/suggest.json")
   Observable<SuggestData> getSuggestions(@Query("q") String query);

   @GET("/api/suggest.json")
   Observable<SuggestData> getSuggestions(@Query("q") String query, @Query("count") int count);

   .
}

 

Now, the suggestions are obtained using the fetchSuggestion method. First, it creates the rest client to send network requests using the createApi method (which internally calls createRestClient, implemented above). The suggestion query is obtained from the EditText. Then the RxJava Observable is subscribed on a separate thread which is specially meant for doing IO operations, and finally the obtained data is observed, i.e. views are inflated, on the MainUI thread.

private void fetchSuggestion() {
   LoklakApi loklakApi = RestClient.createApi(LoklakApi.class); // rest client created
   String query = tweetSearchEditText.getText().toString(); // suggestion query from EditText
   Observable<SuggestData> suggestionObservable = loklakApi.getSuggestions(query); // observable created
   Disposable disposable = suggestionObservable
           .subscribeOn(Schedulers.io()) // subscribed on IO thread
           .observeOn(AndroidSchedulers.mainThread()) // observed on MainUI thread
           .subscribe(this::onSuccessfulRequest, this::onFailedRequest); // views are manipulated accordingly
   mCompositeDisposable.add(disposable);
}

 

If the network request is successful onSuccessfulRequest method is called which updates the data in the RecyclerView.

private void onSuccessfulRequest(SuggestData suggestData) {
   if (suggestData != null) {
       mSuggestAdapter.setQueries(suggestData.getQueries()); // data updated.
   }
   setAfterRefreshingState();
}

 

If the network request fails then onFailedRequest is called which displays a toast saying “Cannot fetch suggestions, Try Again!”. If requests are sent simultaneously and they fail, the previous message i.e. the previous toast is removed.

private void onFailedRequest(Throwable throwable) {
   Log.e(LOG_TAG, throwable.toString());
   if (mToast != null) { // checks if a previous toast is present
       mToast.cancel(); // removes the previous toast.
   }
   setAfterRefreshingState();
   // value of networkRequestError: "Cannot fetch suggestions, Try Again!"
    mToast = Toast.makeText(getActivity(), networkRequestError, Toast.LENGTH_SHORT); // toast is created
   mToast.show(); // toast is displayed
}

 

Lively Updating suggestions

One way to update suggestions as the user types is to send a GET request with a query parameter to the suggest API endpoint and, if a previous request is still incomplete, cancel it. This involves a lot of IO work and seems unnecessary, because we would be sending requests even if the user hasn’t finished typing what he/she wants to search. One probable solution is to use a Timer.

private TextWatcher searchTextWatcher = new TextWatcher() {  
    @Override
    public void afterTextChanged(Editable arg0) {
        // user typed: start the timer
        timer = new Timer();
        timer.schedule(new TimerTask() {
            @Override
            public void run() {
                String query = editText.getText().toString();
                // send network request to get suggestions
            }
        }, 600); // 600ms delay before the timer executes the "run" method from TimerTask
    }

    @Override
    public void beforeTextChanged(CharSequence s, int start, int count, int after) {}

    @Override
    public void onTextChanged(CharSequence s, int start, int before, int count) {
        // user is typing: reset already started timer (if existing)
        if (timer != null) {
            timer.cancel();
        }
    }
};

 

This approach looks good and eventually gets the job done. But don’t you think that for simple stuff like this we are writing too much code, which is scary at first sight?

Let’s try a second approach, using RxBinding for Android views and RxJava operators. RxBinding simply converts every event on the view into an Observable which can be subscribed to and worked upon.

In place of the TextWatcher, here we use the RxTextView.textChanges method that takes an EditText as a parameter. textChanges creates an Observable of character sequences for text changes on the view, here the EditText. Then the debounce operator of RxJava is used, which drops items that are followed by another emission within the timeout, a clean alternative to the Timer. The second approach is implemented in updateSuggestions.

private void updateSuggestions() {
   Disposable disposable = RxTextView.textChanges(tweetSearchEditText) // generating observables for text changes
            .debounce(400, TimeUnit.MILLISECONDS) // dropping text changes that occur within 400 ms of another change
           .subscribe(charSequence -> {
               if (charSequence.length() > 0) {
                   fetchSuggestion(); // suggestions obtained
               }
           });
   mCompositeDisposable.add(disposable);
}

 

CompositeDisposable is a bucket that contains Disposable objects, which are returned each time an Observable is subscribed to. So all the disposable objects are collected in the CompositeDisposable and unsubscribed when onStop of the fragment is called, to avoid memory leaks.
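For instance, the collected disposables can be released when the fragment stops; a minimal sketch, assuming mCompositeDisposable is the bucket described above:

@Override
public void onStop() {
    // unsubscribe every pending subscription so the fragment is not leaked
    mCompositeDisposable.dispose();
    super.onStop();
}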

A SwipeRefreshLayout is used so that the user can retry if the request fails or refresh to get new suggestions. While refreshing is not complete, a circular ProgressDialog is shown and the RecyclerView showing old suggestions is made invisible, as executed by the setBeforeRefreshingState method

private void setBeforeRefreshingState() {
   refreshSuggestions.setRefreshing(true);
   tweetSearchSuggestions.setVisibility(View.INVISIBLE);
}

 

Similarly, once refreshing is done, the ProgressDialog is stopped and the visibility of the RecyclerView, which now contains the updated suggestions, is changed to VISIBLE. This is executed in the setAfterRefreshingState method

private void setAfterRefreshingState() {
   refreshSuggestions.setRefreshing(false);
   tweetSearchSuggestions.setVisibility(View.VISIBLE);
}

 


Testing Errors and Exceptions Using Unittest in Open Event Server

Like all other helper functions in FOSSASIA‘s Open Event Server, we also need to test the exception and error helper functions and classes. The error helper classes are mainly used to create error handler responses for known errors. For example, we know error 403 is Access Forbidden, but we want to send a proper source message along with a proper error message to help identify and handle the error, hence we use the error classes. To ensure that future commits do not change the status codes or messages of these errors, we implemented unit tests for them.

There are mainly two kind of error classes, one are HTTP status errors and the other are the exceptions. Depending on the type of error we get in the try-except block for a particular API, we raise that particular exception or error.

Unit Test for Exception

Exceptions are written in this form:

@validates_schema
    def validate_quantity(self, data):
        if 'max_order' in data and 'min_order' in data:
            if data['max_order'] < data['min_order']:
                raise UnprocessableEntity({'pointer': '/data/attributes/max-order'},
                                          "max-order should be greater than min-order")

 

This error is raised whenever the data sent as POST or PATCH is unprocessable. For example, this is how we raise this error:

raise UnprocessableEntity({'pointer': '/data/attributes/min-quantity'},

           "min-quantity should be less than max-quantity")

This exception is raised due to error in validation of data where maximum quantity should be more than minimum quantity.

To test that the above line indeed raises an exception of UnprocessableEntity with status 422, we use the assertRaises() function. Following is the code:

 def test_exceptions(self):
        # Unprocessable Entity Exception
        with self.assertRaises(UnprocessableEntity):
            raise UnprocessableEntity({'pointer': '/data/attributes/min-quantity'},
                                      "min-quantity should be less than max-quantity")


In the above code, with self.assertRaises() creates a context for the expected exception type, so that when the next line raises an exception, the test asserts that the raised exception matches the expected one, ensuring that the correct exception is being raised.

Unit Test for Error

In error helper classes, what we do is, for known HTTP status codes we return a response that is user readable and understandable. So this is how we raise an error:

ForbiddenError({'source': ''}, 'Super admin access is required')

This is basically the 403: Access Denied error. But with the "Super admin access is required" message it becomes far clearer. However, we need to ensure that the status code returned when this error message is shown still stays 403 and isn’t unintentionally modified in the future.

Here, errors and exceptions work a little differently. When we declare a custom error class, we don’t really raise that error; instead we return that error as a response. So we can’t use the assertRaises() function. However, what we can do is compare the status code and ensure that the error raised is the same as the expected one. So we do this:

def test_errors(self):
        with app.test_request_context():
            # Forbidden Error
            forbidden_error = ForbiddenError({'source': ''}, 'Super admin access is required')
            self.assertEqual(forbidden_error.status, 403)

            # Not Found Error
            not_found_error = NotFoundError({'source': ''}, 'Object not found.')
            self.assertEqual(not_found_error.status, 404)


Here we first create an object of the error class ForbiddenError with a sample source and message. We then assert, using the assertEqual() function, that the status attribute of this object is 403, which ensures that this error is of the Access Denied type, as expected.
This helps us make sure that no one in the future unknowingly or by mistake changes the error messages and status codes, so that the HTTP status codes in the responses are maintained.



How RSS Action Type is Implemented in SUSI Android

Important skills of SUSI.AI include displaying web search results, showing a map of any location and providing a list of relevant information on a topic. The RSS action type is similar to the websearch action type: when the web search is to be performed on the client side, it is denoted by the websearch action type, and when the web search is performed by the server itself, it is denoted by the rss action type. In this blog, I will show you how the rss action type is implemented in SUSI Android.

In case of the RSS action type, the server searches the internet and, using RSS feeds, returns an array of objects containing:

  • Title
  • Description
  • Link
{
  "title": "dog-doh: Definitions Index",
  "description": "dog-doh: Definitions Index. dog dog and pony show dog biscuit dog collar dog days …",
  "link": "http://websters.yourdictionary.com/index/dog-doh/"
}

title: Title related to user query

description: Description of user query.

link: If the user wants more information, they can follow the link to find it.

How rss action type is parsed in SUSI Android

SUSI replies in JSON format, which should be parsed properly to show it in the Android app. We used the Retrofit library developed by Square to parse the JSON data. Retrofit parses data according to model classes, and we write the model classes according to the expected JSON reply. For example, each SUSI response contains an 'answers' JSON array. There are two JSON arrays, 'data' and 'action', inside the 'answers' array. We made a different model class for each JSON array.

The first model class is SusiResponse. We used this model class to parse the 'answers' JSON array.

@SerializedName("answers")
private List<Answer> answers = new ArrayList<>();

Here we used List<> because the 'answers' JSON array contains a list of JSON arrays and JSON objects. Answer is the second model class; we used it to parse the two important JSON arrays 'data' and 'action'. The 'action' attribute has information about the action type.

public class Answer {
   @SerializedName("data")
   private RealmList<Datum> data = new RealmList<>();
}

Here also we used a list because the 'data' JSON array contains a list of JSON objects, but instead of a simple List we used RealmList<> because after parsing we save the data using Realm. The 'data' array contains multiple JSON objects, and each object contains three important pieces of information: 'title', 'description' and 'link'.

Datum is the main model class, which is used to save and retrieve 'title', 'description' and 'link'. The setTitle, setLink and setDescription methods of the Datum class are used to save 'title', 'description' and 'link', and the getTitle, getDescription and getLink methods are used to retrieve them.
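The Datum class therefore looks roughly like the sketch below. Since the blog later notes that Datum is the model class for both Retrofit and Realm, the sketch extends RealmObject and annotates its fields with @SerializedName; the exact field names are assumed from the title/description/link JSON shown above and may differ slightly from the actual source.

// Sketch of the Datum model class (field names assumed for illustration)
public class Datum extends RealmObject {

    @SerializedName("title")
    private String title;

    @SerializedName("description")
    private String description;

    @SerializedName("link")
    private String link;

    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }

    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }

    public String getLink() { return link; }
    public void setLink(String link) { this.link = link; }
}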

How rss action type data is retrieved and saved

As already mentioned, we used Retrofit to retrieve data and Realm to save it. susiResponse is the response we received from the SUSI server. We use susiResponse to retrieve a list of Datum objects.

List<Datum> datumList = susiResponse.getAnswers().get(0).getData();

We then loop through datumList, and from each element we extract 'title', 'description' and 'link' using the getTitle(), getDescription() and getLink() methods respectively. Datum is the model class for both Retrofit and Realm. realmDatum is an instance of the Datum class and datumRealmList is a RealmList of Datum type. After extracting the data, we save it using setTitle(), setDescription() and setLink().

for (Datum datum : datumList) {
          Datum realmDatum = bgRealm.createObject(Datum.class);
          realmDatum.setDescription(datum.getDescription());
          realmDatum.setLink(datum.getLink());
          realmDatum.setTitle(datum.getTitle());
         datumRealmList.add(realmDatum);
       }      

Layout design to show rss action type reply

There are three TextViews, with ids 'title', 'description' and 'link', to show the 'title', 'description' and 'link' retrieved from SUSI's reply. We used a RecyclerView to show the list of results.

Datum datum = datumList.get(position);

holder.titleTextView.setText(Html.fromHtml(datum.getTitle()));

holder.descriptionTextView.setText(Html.fromHtml(datum.getDescription()));

holder.linkTextView.setText(datum.getLink()); // assuming a linkTextView for the 'link' field
