Query Model Structure of Loklak Search

Need to restructure

Earlier versions of the loklak search application suffered from breaking changes whenever a new feature was added. The main reason for these unintended bugs was identified as the existing query structure: the query used in earlier versions of the application consisted of a single entity, a string named queryString.

export interface Query {
 queryString: string;
}

This simple query string property was good enough for the plain string-based searches that were the goal of the application in its initial phases, but as the application progressed we realized that a string-only implementation would not be feasible in the long term. There is only a limited amount we can do with strings; it becomes extremely difficult to set and reset portions of the string according to the requirements. The main vision behind the alternative architecture was therefore the ability to enable and disable features on the fly.

Application Structure

Therefore, to overcome the difficulties faced with the simple string-based structure, we introduced the concept of an attribute-based structure for queries. The attribute-based structure is simpler to understand and thus makes it easier to maintain the state of the query in the application.

export interface Query {
 displayString: string;
 queryString: string;
 routerString: string;
 filter: FilterList;
 location: string;
 timeBound: TimeBound;
 from: boolean;
}

The reason this is called an attribute-based structure is that each property of the interface is independent of the others. Each one can be thought of as a separate little key placed on the query, and these keys are mutually exclusive. In other words, to modify one attribute of the query it does not matter which other attributes are already present. The query is eventually processed and sent to the server, and the corresponding results, if any exist, are shown to the user.
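The FilterList and TimeBound types referenced above are not reproduced in this post. As a rough sketch of their intent (the since/until fields are implied by the string-conversion function shown later; the filter flags are purely illustrative):

// Hypothetical shapes for the attribute types used by Query.
export interface TimeBound {
  since: Date | null;   // lower time bound of the search window
  until: Date | null;   // upper time bound of the search window
}

export interface FilterList {
  image: boolean;   // illustrative flag only
  video: boolean;   // illustrative flag only
}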

Now the question arises: how do we modify the values of these attributes? This interface is actually instantiated in the Redux state, so the question almost answers itself: modifications to the part of the Redux state corresponding to the query structure are done by specific reducers, one for each attribute. These reducers are in turn triggered by corresponding actions.

export const ActionTypes = {
 VALUE_CHANGE: '[Query] Value Change',
 FILTER_CHANGE: '[Query] Filter Change',
 LOCATION_CHANGE: '[Query] Location Change',
 TIME_BOUND_CHANGE: '[Query] Time Bound Change',
};

This ActionTypes object contains the corresponding actions which are used to trigger the reducers. These actions can be dispatched in response to any user interaction by any of the components, thus modifying a particular state attribute via the reducer.
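As an illustration, a location change could be expressed as an action class and handled by a reducer branch roughly like this (a minimal sketch, not the exact loklak search source):

// Hedged sketch: an action carrying the new location attribute, and the reducer
// branch that writes only that one key, leaving every other attribute untouched.
import { Action } from '@ngrx/store';

export class LocationChangeAction implements Action {
  type = ActionTypes.LOCATION_CHANGE;
  constructor(public payload: string) { }
}

export function queryReducer(state: Query, action: LocationChangeAction): Query {
  switch (action.type) {
    case ActionTypes.LOCATION_CHANGE:
      return { ...state, location: action.payload };
    default:
      return state;
  }
}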

Converting from object to string

Now, for our API endpoint to understand the query, we need to send a string in the format the API accepts. This requires converting the query state to a query string dynamically, so we need a simple function which takes the query state as input and returns the query string as output.

export function parseQueryToQueryString(query: Query): string {
  let qs: string;
  qs = query.displayString;
  if (query.location) {
    qs += ` near:${query.location}`;
  }
  if (query.timeBound.since) {
    qs += ` since:${parseDateToApiAcceptedFormat(query.timeBound.since)}`;
  }
  if (query.timeBound.until) {
    qs += ` until:${parseDateToApiAcceptedFormat(query.timeBound.until)}`;
  }
  return qs;
}

In this function we simply check the various attributes set in the structure, append to the query string accordingly, and return it. So if we eventually have to convert to a string anyway, what is the advantage of this approach? The main advantage is that we know the query structure beforehand and use that structure to build the string, instead of randomly selecting and removing pieces of information from a string. Whenever we update any attribute of the query state, the query string is generated fresh rather than by modifying the old string.
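For example, a query state with a location and a lower time bound would be converted roughly as follows (the values, and the date format in the comment, are illustrative only):

// Illustrative usage: the string is rebuilt from the current attributes on every call.
const query: Query = {
  displayString: 'fossasia',
  queryString: '',
  routerString: '',
  filter: {} as FilterList,   // contents omitted for this example
  location: 'Singapore',
  timeBound: { since: new Date('2017-06-01'), until: null },
  from: false
};

// Would yield something like "fossasia near:Singapore since:2017-06-01";
// the exact date part depends on parseDateToApiAcceptedFormat.
const apiQuery = parseQueryToQueryString(query);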

Conclusion

This approach allows the application to modify the search queries sent to the server in a streamlined and logical way, using just a simple data structure. This query model has provided us with many advantages, visible in both application stability and performance. It also cuts out dirty regex matching of typed queries and thus again helps us keep queries simple.

Resources and Links


Adding tip to drop downs in Susper using CSS in Angular

Creating simple drop-downs with Twitter Bootstrap is fairly easy for developers. The issue faced in Susper, however, was adding a tip at the top of such a drop-down, similar to Google:

This is how it looks finally, in Susper, with a tip over the standard rectangular drop-down:

This is how it was done:

  1. First, make sure you have designed your drop-down according to your requirements and added the desired height, width and padding. These were the specifications used in Susper’s drop-down.

.dropdown-menu {
  height: 500px;
  width: 327px;
  padding: 28px;
}
  2. Next, add the following code to your drop-down class CSS:

.dropdown-menu:before {
  position: absolute;
  top: -7px;
  right: 19px;
  display: inline-block;
  border-right: 7px solid transparent;
  border-bottom: 7px solid #ccc;
  border-left: 7px solid transparent;
  border-bottom-color: rgba(0, 0, 0, 0.2);
  content: '';
}
.dropdown-menu:after {
  position: absolute;
  top: -5px;
  right: 20px;
  display: inline-block;
  border-right: 6px solid transparent;
  border-bottom: 6px solid #ffffff;
  border-left: 6px solid transparent;
  content: '';
}

In CSS, :before inserts generated content before an element's own content, whereas :after inserts it after. Some of the parameters are explained here:

  • Top: can be used to change the position of the menu tip vertically, according to the position of your button and menu.
  • Right: can be used to change the position of the menu tip horizontally, so that it sits directly below the menu icon.
  • Position: absolute is used so that the tip is positioned with respect to the drop-down itself and not relative to a div higher in the hierarchy.
  • Border: the border attributes specify the thickness, color and transparency of the borders drawn in :before and :after, which together give the effect of a tip for the drop-down.
  • Content: this value is set to a blank string '', because otherwise none of our changes would be visible, since the pseudo-elements would have no space allocated to them.

Resources


Using JS Profiler to Solve App Slowness in Loklak Search

The loklak search web application had an issue of application slowdown in the development environment. There was a highly prominent lag and slowdown in the application while typing a search query in the search box. The issue had been present for some time, but the reason for the lag and slowdown was unknown. This blog explains how the issue was identified, what the reasons behind it were, and how it was solved.

The most important aspect of fixing any bug in the application is to first identify it. Here in our case what helped us to identify the issue was the proper use of JS profiler to find the hidden issue underneath.

Reporting of the issue

The issue was observed and reported in issue #365. On discussion it emerged that the issue was not seen by all users but only by some of them. This made it really interesting: the application, on the whole, was shipped uniformly to all users, yet only a fraction of them were facing this issue, and all the people experiencing it were developers of the application.

Detection

For the detection of the issue, it was important to have a starting point to search from. Here the only thing we knew was that the issue occurred only when typing in the application search box, so this was our starting ground. The most important tool while solving such issues is the browser itself. The dev tools in most modern browsers have a built-in JavaScript profiler. A profiler is a tool which records, dynamically at run time, the activities happening in the various threads of the application, most importantly the main thread. The JS profiler can be found in the browser dev tools; in Chrome 59 it is in the Performance tab.

Every profiler has a record button (marked above); pressing it starts monitoring all the activities on the various JS threads and other pipelines. This profiler, and an understanding of its logs, is the key to solving any performance-related issue in a web application.

So we started by hitting the record button and then typing in the search box, and this is the output we saw for the profiling run. The output contains a lot of numbers, and to make sense of them it is important to understand the key parts, as the output on the whole can be quite overwhelming.

Observations

We will go piece by piece and will try to make sense of the output and if possible identify the reason for our original issue.

 

  • This is the timeline strip which represents the overall view of the main thread of JS. The yellow portions show the percentage of CPU utilization with respect to time; here we can clearly see the consistent, blocking CPU utilization by the main thread over a span of 10 seconds (during which we typed). The high CPU utilization starts with our first keystroke.
  • The red rectangles at the top represent the dropped frames, the frames which are never rendered due to heavy load, and the green line at the bottom represents the frames per second, which you can see is practically zero. This implies that the CPU is entirely hogged by the main thread and there is no room left for rendering frames.
  • Next, this is the Interactions panel; it indicates all the user-generated events which are triggered. It clearly shows that the key up and key character events are getting triggered but are taking a lot of processing time. The first key press, key character and key up events run normally, but then each subsequent event hogs the CPU.
  • The above observation indicates that whatever the issue in the application is, it lies between the registering of an event and its final processing.
  • So the natural question arises: what do we do during this processing? Looking at the code, we dispatch a Redux action saying that the query string has updated, and according to that action the reducer updates the state. This would imply that our reducer is taking the time to update the state. But that does not make sense, as reducers are pure functions, i.e. they don’t make any blocking calls, so they can’t hog all the CPU cycles for 10 seconds.
  • So we need to investigate further to identify what happens after the action is dispatched to the store. This is the function call stack trace on the main thread; what we see here is each nested function call as we move down the tree.
  • The above trace shows everything that happens when a key is pressed. A key point to observe is that the time to process the events increases per keystroke: it starts at 112ms for the first keystroke and jumps to about 400ms for the subsequent key presses, which causes the slowdown.
  • From the above stack trace we can now find the rate-determining step for each event processing. The RDS of a process is the slowest step, the one which causes the whole process to execute slowly.
  • As we move down the tree we find that all the steps near the top take very nominal time, but some function further down in the tree is causing the issue, so we keep moving down the tree to identify the rate-determining function call.
  • Moving down the tree we first find our action dispatch call, as expected; the processing of this call does not take much time itself, but some of the functions within it are causing our issue, so we continue to move down the tree.
  • Here we find the culprit: the slowest step is a method which makes a call to our StoreDevtools module and notifies it about the latest store update. This notifying action takes the most time.
  • So the issue was caused by the import of dev-tools module in our app module
    StoreDevtoolsModule.instrumentOnlyWithExtension();
  • Another important observation here is why only a few people were facing this issue.
    • The only people who faced it were the developers working on the project because, as the import statement “instrumentOnlyWithExtension” indicates, this module runs only when a Devtools extension is installed.
    • And this is the case for the application developers only.

Fixing the issue

  • For fixing the issue it was decided that, for best performance, the StoreDevTools module will not be included in the package by default; a sketch of how this can look in the app module follows this list.
  • The module can be selectively imported for testing purposes as and when required by the developers.
  • It will always be removed while merging into the main branch.
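A minimal sketch of the app module after the fix is shown below. This is only one possible way to keep the instrumentation available for local debugging, not the exact loklak search code; the reducer and component names are assumed.

// Hedged sketch: the devtools instrumentation is left out of the default imports
// and only re-enabled locally when a developer needs it.
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { StoreModule } from '@ngrx/store';
// import { StoreDevtoolsModule } from '@ngrx/store-devtools';

import { AppComponent } from './app.component';
import { reducer } from './reducers';   // reducer name assumed

@NgModule({
  imports: [
    BrowserModule,
    StoreModule.provideStore(reducer),
    // StoreDevtoolsModule.instrumentOnlyWithExtension()   // uncomment locally for debugging only
  ],
  declarations: [AppComponent],
  bootstrap: [AppComponent]
})
export class AppModule { }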

Results

Now after all these efforts and fixing it is worth mentioning the speed gains which were achieved after removing store dev-tools module.

  • This is the profiler output after profiling similar queries as before. Without even going into detail we can see it is much cleaner now.
  • These are the main thread and frame rates achieved after the removal of the module. We can clearly see the main thread is completely free and we thus achieve high frame rates.
  • The last important observation is that the action dispatching which earlier took about 350ms now takes only 2.5ms. Such a performance boost is huge, about 140x on each key press.

Conclusion

When a web application responds slowly or lags, users leave it altogether, and this is a huge setback to the effort we developers put into increasing user interaction. Thus, any issue on the client side must be solved as quickly as possible. The most important tool we use to test and find out what is going wrong with the application is the browser itself; the browser and its dev tools provide the tools required to optimize the application and improve its performance.

Resources and links


Using SUSI AI Server to Store User Feedback for a Skill

User feedback is valuable information that plays an important role in improving the quality of service. In the SUSI AI server we are planning to build a feedback mechanism to see whether the user liked an answer or not. The result of that user input (which can be given using a vote button) will then be learned from to enhance future use of the rule. So, as a first step towards a skill rating system with guided learning, we need to store the user rating of a skill. In this blog post we will learn how to make an endpoint for getting a skill rating from the user. This API endpoint will be used by its web and mobile clients.
Before implementing the API, let's look at how data is stored in SUSI AI. The SUSI server uses a DAO in which the skill rating is stored as a JsonTray.

 public JsonTray(File file_persistent, File file_volatile, int cachesize) throws IOException {
        this.per = new JsonFile(file_persistent);
        this.vol = new CacheMap<String, JSONObject>(cachesize);
        this.file_volatile = file_volatile;
        if (file_volatile != null && file_volatile.exists()) try {
            JSONObject j = JsonFile.readJson(file_volatile);
            for (String key: j.keySet()) this.vol.put(key, j.getJSONObject(key));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

JsonTray takes three parameters (the persistent file, the volatile file and the cache size) and stores the data as a cache map of String and JSONObject pairs. The HttpServlet class provides methods, such as doGet and doPost, for handling HTTP-specific services. Susi Server provides an abstract class AbstractAPIHandler which extends HttpServlet and implements the APIHandler interface. Next we will inherit our RateSkillService class from AbstractAPIHandler and implement the APIHandler interface.

public class RateSkillService extends AbstractAPIHandler implements APIHandler {
    private static final long serialVersionUID =7947060716231250102L;
    @Override
    public BaseUserRole getMinimalBaseUserRole() {
        return BaseUserRole.ANONYMOUS;
    }

    @Override
    public JSONObject getDefaultPermissions(BaseUserRole baseUserRole) {
        return null;
    }

    @Override
    public String getAPIPath() {
        return "/cms/rateSkill.json";
    }

}

The getMinimalBaseUserRole method tells the minimum user role required to access this servlet; it can also be ADMIN or USER. In our case it is ANONYMOUS: a user need not log in to access this endpoint. The getAPIPath() method sets the API endpoint path; it gets appended to the base path, which on localhost gives 127.0.0.1:4000/cms/rateSkill.json.

Next we will implement the serviceImpl method:

  @Override
    public ServiceResponse serviceImpl(Query call, HttpServletResponse response, Authorization rights, final JsonObjectWithDefault permissions) {

        String model_name = call.get("model", "general");
        File model = new File(DAO.model_watch_dir, model_name);
        String group_name = call.get("group", "knowledge");
        File group = new File(model, group_name);
        String language_name = call.get("language", "en");
        File language = new File(group, language_name);
        String skill_name = call.get("skill", null);
        File skill = new File(language, skill_name + ".txt");
        String skill_rate = call.get("rating", null);

        JSONObject result = new JSONObject();
        result.put("accepted", false);
        if (!skill.exists()) {
            result.put("message", "skill does not exist");
            return new ServiceResponse(result);

        }
        JsonTray skillRating = DAO.skillRating;
        JSONObject modelName = new JSONObject();
        JSONObject groupName = new JSONObject();
        JSONObject languageName = new JSONObject();
        if (skillRating.has(model_name)) {
            modelName = skillRating.getJSONObject(model_name);
            if (modelName.has(group_name)) {
                groupName = modelName.getJSONObject(group_name);
                if (groupName.has(language_name)) {
                    languageName = groupName.getJSONObject(language_name);
                    if (languageName.has(skill_name)) {
                        JSONObject skillName = languageName.getJSONObject(skill_name);
                        skillName.put(skill_rate, skillName.getInt(skill_rate) + 1 + "");
                        languageName.put(skill_name, skillName);
                        groupName.put(language_name, languageName);
                        modelName.put(group_name, groupName);
                        skillRating.put(model_name, modelName, true);
                        result.put("accepted", true);
                        result.put("message", "Skill ratings updated");
                        return new ServiceResponse(result);
                    }
                }
            }
        }
        languageName.put(skill_name, createRatingObject(skill_rate));
        groupName.put(language_name, languageName);
        modelName.put(group_name, groupName);
        skillRating.put(model_name, modelName, true);
        result.put("accepted", true);
        result.put("message", "Skill ratings added");
        return new ServiceResponse(result);

    }

    /* Utility function*/
    public JSONObject createRatingObject(String skill_rate) {
        JSONObject skillName = new JSONObject();
        skillName.put("positive", "0");
        skillName.put("negative", "0");
        skillName.put(skill_rate, skillName.getInt(skill_rate) + 1 + "");
        return skillName;
    }

 

One can access any skill based on the four-tuple of parameters: model, group, language, skill. Before rating a skill we must ensure that it exists. We can get the required parameters through the call.get() method, where the first parameter is the key whose value we want and the second parameter is the default value. If the skill.exists() method returns false, we return an error message stating that the skill does not exist. Otherwise we check whether the skill already exists in our skillRating.json file; if so, we update the current ratings, otherwise we create a new JSON object and add it to the rating file based on model, group and language. After a successful implementation, go ahead and test your endpoint on http://localhost:4000/cms/rateSkill.json?model=general&group=knowledge&skill=who&rating=positive
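Since the endpoint is meant to be consumed by the web and mobile clients, a client-side call could look roughly like this (a hedged sketch in TypeScript; the base URL and parameter values are just examples):

// Hedged sketch of a web client rating a skill via the endpoint above.
const params = new URLSearchParams({
  model: 'general',
  group: 'knowledge',
  language: 'en',
  skill: 'who',
  rating: 'positive'   // or 'negative'
});

fetch(`http://localhost:4000/cms/rateSkill.json?${params.toString()}`)
  .then(response => response.json())
  .then(result => {
    // result.accepted is true when the rating was stored;
    // result.message explains the outcome either way.
    console.log(result.accepted, result.message);
  });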

You can also check the updated JSON file in susi_server/data/skill_rating/skillRating.json:

{"general": {
 "assistants": {"en": {
   "language_translation": {
     "negative": "1",
     "positive": "0"
  }}},
 "smalltalk": {"en": {
   "aboutsusi": {
   "negative": "0",
   "positive": "1"
 }}},
 "knowledge": {"en": {
   "who": {
     "negative": "2",
     "positive": "4"
   }}}
}}

And if the skill is not present, it will generate an error message.

We have successfully implemented the API endpoint for storing the user's skill feedback. For more information take a look at the SUSI server repository and join the Gitter chat channel for discussions.

Resources 


Auto deployment of SUSI Skill CMS on gh pages

SUSI Skill CMS is a web application framework to edit SUSI skills. It is currently in the development stage, hosted on http://skills.susi.ai, and is built using ReactJS. In this blog post we will see how to automatically deploy the repository on gh-pages.
Setting up the project
Fork the susi_skill_cms repository and clone it to your desktop. Make sure you have node and npm versions greater than 6 and 3 respectively. Next, go to the cloned folder and install all the dependencies by running:

:$ npm install

Next, run the app on http://localhost:3000 with the command:

:$ npm run start

To auto-deploy changes to the gh-pages branch, we need to set up Travis for the project. Register yourself on https://travis-ci.org/ and turn on Travis for this repository. Next, add a .travis.yml file in the root directory of the source folder.

sudo: required
dist: trusty
language: node_js
node_js:
  - 6

before_install:
  - export CHROME_BIN=chromium-browser
  - export DISPLAY=:99.0
  - sh -e /etc/init.d/xvfb start

before_script:
  - npm run build

script:
  - npm run test

after_success:
  - bash ./deploy.sh

cache:
  directories: node_modules

# safelist
branches:
  only:
  - master 

Source: https://github.com/fossasia/susi_skill_cms/blob/master/.travis.yml

The Travis configuration file ensures that the project builds for every change made, using the npm run test command. In our case it only considers changes made on the master branch; if you want to watch other branches, add the respective branch names to the Travis configuration. After checking that the build passes, we need to automatically push the changes, for which we will use a bash script.

#!/bin/bash

SOURCE_BRANCH="master"
TARGET_BRANCH="gh-pages"

# Pull requests and commits to other branches shouldn't try to deploy.
if [ "$TRAVIS_PULL_REQUEST" != "false" -o "$TRAVIS_BRANCH" != "$SOURCE_BRANCH" ]; then
    echo "Skipping deploy; The request or commit is not on master"
    exit 0
fi

# Save some useful information
REPO=`git config remote.origin.url`
SSH_REPO=${REPO/https:\/\/github.com\//git@github.com:}
SHA=`git rev-parse --verify HEAD`

ENCRYPTED_KEY_VAR="encrypted_${ENCRYPTION_LABEL}_key"
ENCRYPTED_IV_VAR="encrypted_${ENCRYPTION_LABEL}_iv"
ENCRYPTED_KEY=${!ENCRYPTED_KEY_VAR}
ENCRYPTED_IV=${!ENCRYPTED_IV_VAR}
openssl aes-256-cbc -K $encrypted_2662bc12c918_key -iv $encrypted_2662bc12c918_iv -in deploy_key.enc -out ../deploy_key -d
chmod 600 ../deploy_key
eval `ssh-agent -s`
ssh-add ../deploy_key

# Cloning the repository to repo/ directory,
# Creating gh-pages branch if it doesn't exists else moving to that branch
git clone $REPO repo
cd repo
git checkout $TARGET_BRANCH || git checkout --orphan $TARGET_BRANCH
cd ..

# Setting up the username and email.
git config user.name "Travis CI"
git config user.email "$COMMIT_AUTHOR_EMAIL"

# Cleaning up the old repo's gh-pages branch except CNAME file and 404.html
find repo/* ! -name "CNAME" ! -name "404.html" -maxdepth 1  -exec rm -rf {} \; 2> /dev/null
cd repo

git add --all
git commit -m "Travis CI Clean Deploy : ${SHA}"

git checkout $SOURCE_BRANCH

# Actual building and setup of current push or PR.
npm install
npm run build
mv build ../build/

git checkout $TARGET_BRANCH
rm -rf node_modules/
mv ../build/* .
cp index.html 404.html

# Staging the new build for commit; and then committing the latest build
git add -A
git commit --amend --no-edit --allow-empty

# Deploying only if the build has changed
if [ -z `git diff --name-only HEAD HEAD~1` ]; then

  echo "No Changes in the Build; exiting"
  exit 0

else
  # There are changes in the Build; push the changes to gh-pages
  echo "There are changes in the Build; pushing the changes to gh-pages"

  # Actual push to gh-pages branch via Travis
  git push --force $SSH_REPO $TARGET_BRANCH
fi

Source : Bash script for automatic deployment

This bash script enables the Travis CI user to push changes to gh-pages; for this we need to store the credentials of the repository in encrypted form. To generate the public/private RSA key pair, use the following command:

ssh-keygen -t rsa -b 4096 -C "your_email@example.com"

It will generate the keys under .ssh/id_rsa in your home directory.

Make sure you do not enter any passphrase while generating the credentials, otherwise Travis will get stuck while decrypting the keys. Copy the public key and add it as a deploy key to the repository by visiting https://github.com/<your name>/<your repo>/settings/keys


Next, install the Travis CLI for encrypting the keys.

sudo apt install ruby ruby-dev
sudo gem install travis

Encrypt your private deploy_key and add it to the root of your repository using the command:

travis encrypt-file deploy_key

After successful encryption, you will see a message

Please add the following to your build script (before_install stage in your .travis.yml, for instance):

openssl aes-256-cbc -K $encrypted_2662bc12c918_key -iv $encrypted_2662bc12c918_iv -in deploy_key.enc -out ../deploy_key -d

Add the above generated line to your Travis configuration and push the changes on your master branch. Do not push the deploy_key, only the encrypted file deploy_key.enc.
Finally, add the deploy link of gh-pages in the package.json of your project using the key “homepage”.

 "homepage": "http://skills.susi.ai/"

And in the scripts section of package.json add:

"deploy": "gh-pages -d build",

Commit and push your changes; from now on all your changes will be automatically pushed to the gh-pages branch. For contributions visit Susi_Skill_CMS.

Resources


How SUSI Analyzes A Given Response

Ever wondered where SUSI’s answers come from? Now SUSI has the ability to do an answer analysis. To get that analysis, just ask SUSI “analysis”. This sets SUSI into an analysis mode, where it tells you where the latest answer came from and gives you the link for improving the skill.

Let’s check out how SUSI’s analysis works. The skill for analysis is defined in en_0001_foundation.txt as follows:

analysis|analyse|analyze|* analysis|* analyse|* analyze|analysis *|analyse *|analyze *
My previous answer is defined in the skill $skill$. You can help to improve this skill and <a href="$skill_link$" target="_blank"> edit it in the code repository here.</a>

$skill$ and $skill_link$ are variables compiled using

public static final Pattern variable_pattern = Pattern.compile("\\$.*?\\$");

These variables are memorized in the SUSI cognition. A cognition is the combination of a query of a user with the response of SUSI.

SusiThought dispute = new SusiThought();
List<String> skills = clonedThought.getSkills();
if (skills.size() > 0) {
    dispute.addObservation("skill", skills.get(0));
    dispute.addObservation("skill_link", getSkillLink(skills.get(0)));
}

A SusiThought is a piece of data that can be remembered. The structure of the thought is modeled as a table in which the information it contains is organized in rows and columns.

 public SusiThought addObservation(String featureName, String observation) ;

One can memorize data using the addObservation() method. It takes two parameters: featureName, the object key, and observation, the object value. The thought is a table of information pieces, a set of rows which all have the same column names. New data is always inserted in front of existing similar data rather than overwriting it.

public String getSkillLink(String skillPath) {
    String link = skillPath;
    if (skillPath.startsWith("/susi_server")) {
        link = "https://github.com/fossasia/susi_server/blob/development" + skillPath.substring("/susi_server".length());
    } else if (skillPath.startsWith("/susi_skill_data")) {
        link = "https://github.com/fossasia/susi_skill_data/blob/master" + skillPath.substring("/susi_skill_data".length());
    }
    return link;
}

getSkillLink is a utility method which returns the link to the skill's source in the GitHub repository based on the skillPath.

private String skill;
SusiThought recall;
final SusiArgument flow = new SusiArgument().think(recall);
this.skill = origin.getAbsolutePath();
 if (this.skill != null && this.skill.length() > 0) flow.addSkill(this.skill);

The source of the skill gets added in SusiIntent.java using the getAbsolutePath() method, which resolves the skill path in the filesystem. The intent considers the key from the user query, matches the intent tokens to get the optimum result, and produces JSON like:

 "data": [
      {
        "object": "If you spend too much time thinking about a thing, you'll never get it done.",
        "0": "tell me a quote",
        "token_original": "quote",
        "token_canonical": "quote",
        "token_categorized": "quote",
        "timezoneOffset": "-330",
        "answer": "When you discover your mission, you will feel its demand. It will fill you with enthusiasm and a burning desire to get to work on it. ",
        "skill_link": "https://github.com/fossasia/susi_skill_data/blob/master/models/general/entertainment/en/quotes.txt",
        "query": "tell me a quote",
        "skill": "/susi_skill_data/models/general/entertainment/en/quotes.txt"
      },

The getSkills() method returns the list of skills from the JSON, which are later added for memorization.

    public List<String> getSkills() {
        List<String> skills = new ArrayList<>();
        getSkillsJSON().forEach(skill -> skills.add((String) skill));
        return skills;
    }

This is how SUSI is able to fetch where the answer came from. Next time you have a chat with SUSI, do check the skill analysis and add your ideas to improve the skill. Take a look at Susi_skill_data for more skills and read this tutorial for creating skills for SUSI.

Resources


Adding unit tests for effects in Loklak Search

Loklak search uses @ngrx/effects to listen to actions dispatched by the user and send API requests to the loklak server. Loklak search currently has seven effects, such as Search Effects and Suggest Effects, which run to make the application reactive. It is important to test these effects to ensure that they make API calls at the right time and then map the responses to send them back to the reducer.

I will explain here how I added unit tests for the effects. Surprisingly, the test coverage increased from 43% to 55% after adding these tests.

Effects to test

We are going to test the effect for user search. This effect listens for events of type USER_SEARCH and makes a call to the user search service with the query as a parameter. After a response is received, it maps the response and passes it on to the UserSearchCompleteSuccessAction, which performs the later operations. If the service fails to get a response, it dispatches the UserSearchCompleteFailAction.

Code

ApiUserSearchEffects is the effect which detects when the USER_SEARCH action is dispatched from some component of the application; consequently, it makes a call to the UserSearchService and handles the JSON response received from the server. The effect then dispatches a new UserSearchCompleteSuccessAction if a response is received from the server, or a new UserSearchCompleteFailAction if no response is received. The debounce time is set to 400 so that the request is flushed if a new USER_SEARCH is dispatched within the next 400ms.

For this effect, we need to test that the effect actually runs when a USER_SEARCH action is dispatched. Further, we need to test that the correct parameters are supplied to the service and that the response is handled carefully. We also need to check that the response is really flushed out within the debounce time limit.

@Injectable()
export class ApiUserSearchEffects {
  @Effect()
  search$: Observable<Action> = this.actions$
    .ofType(userApiAction.ActionTypes.USER_SEARCH)
    .debounceTime(400)
    .map((action: userApiAction.UserSearchAction) => action.payload)
    .switchMap(query => {
      const nextSearch$ = this.actions$.ofType(userApiAction.ActionTypes.USER_SEARCH).skip(1);
      const follow_count = 10;

      return this.apiUserService.fetchQuery(query.screen_name, follow_count)
        .takeUntil(nextSearch$)
        .map(response => new userApiAction.UserSearchCompleteSuccessAction(response))
        .catch(() => of(new userApiAction.UserSearchCompleteFailAction()));
    });

  constructor(
    private actions$: Actions,
    private apiUserService: UserService
  ) { }
}
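For the effect to respond to actions in the running application, it also has to be registered with the effects module. With the @ngrx/effects version in use at the time, the registration looks roughly like this (a sketch; the reducer and file names are assumed):

// Hedged sketch: registering the effect class (ngrx v2-style API).
import { NgModule } from '@angular/core';
import { StoreModule } from '@ngrx/store';
import { EffectsModule } from '@ngrx/effects';
import { ApiUserSearchEffects } from './effects/api-user-search.effects';   // path assumed
import { reducer } from './reducers';                                        // name assumed

@NgModule({
  imports: [
    StoreModule.provideStore(reducer),
    EffectsModule.run(ApiUserSearchEffects)
  ]
})
export class AppModule { }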

Unit test for effects

  • Configure the TestBed class before starting the unit tests and add all the necessary imports (most importantly the EffectsTestingModule) and providers. This step helps to isolate the effects completely from all other components and test them independently. We also need to create a spy object which spies on the method userService.fetchQuery, with the provider being UserService.

beforeEach(() => TestBed.configureTestingModule({
  imports: [
    EffectsTestingModule,
    RouterTestingModule
  ],
  providers: [
    ApiUserSearchEffects,
    {
      provide: UserService,
      useValue: jasmine.createSpyObj('userService', ['fetchQuery'])
    }
  ]
}));
  • Now we need a setup function which takes params, the data to be returned by the mock user service. Here we configure the response to be returned by the service. Moreover, this function initializes the EffectsRunner and returns ApiUserSearchEffects so that it can be used for unit testing.

function setup(params?: {userApiReturnValue: any}) {
  const userService = TestBed.get(UserService);
  if (params) {
    userService.fetchQuery.and.returnValue(params.userApiReturnValue);
  }
  return {
    runner: TestBed.get(EffectsRunner),
    apiUserSearchEffects: TestBed.get(ApiUserSearchEffects)
  };
}

 

  • Now we will add the unit tests for the effects. In these tests, we check whether the effect recognises the action and returns a new action based on the response we want, and whether it maps the response only after a certain debounce time. We have used fakeAsync(), which gives us access to the tick() function. Next, we call the setup function and pass in the mock response so that whenever the UserService is called it returns the mock response and never actually runs the service. We then queue the action UserSearchAction in the runner and subscribe to the value returned by the effects class. We can now test the value returned using the expect() block, and verify that the value is returned only after a certain debounce time using the tick() function.

it('should return a new userApiAction.UserSearchCompleteSuccessAction, ' +
  'with the response, on success, after the de-bounce', fakeAsync(() => {
  const response = MockUserResponse;
  const {runner, apiUserSearchEffects} = setup({userApiReturnValue: Observable.of(response)});

  const expectedResult = new userApiAction.UserSearchCompleteSuccessAction(response);

  runner.queue(new userApiAction.UserSearchAction(MockUserQuery));

  let result = null;
  apiUserSearchEffects.search$.subscribe(_result => result = _result);
  tick(399); // test debounce
  expect(result).toBe(null);
  tick(401);
  expect(result).toEqual(expectedResult);
}));

it('should return a new userApiAction.UserSearchCompleteFailAction, ' +
  'if the SearchService throws', fakeAsync(() => {
  const { runner, apiUserSearchEffects } = setup({ userApiReturnValue: Observable.throw(new Error()) });

  const expectedResult = new userApiAction.UserSearchCompleteFailAction();

  runner.queue(new userApiAction.UserSearchAction(MockUserQuery));

  let result = null;
  apiUserSearchEffects.search$.subscribe(_result => result = _result);

  tick(399); // Test debounce
  expect(result).toBe(null);
  tick(401);
  expect(result).toEqual(expectedResult);
}));
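The MockUserQuery and MockUserResponse fixtures referenced above come from the project's test helpers. As a rough, hypothetical idea of their shape (only the screen_name field is grounded in the effect code above; the rest is assumed for illustration):

// Hypothetical fixtures; the real mocks live in the loklak search test helpers.
export const MockUserQuery = { screen_name: 'fossasia' };

export const MockUserResponse = {
  // Shape assumed for illustration; not verified against the loklak user API.
  user: { screen_name: 'fossasia', name: 'FOSSASIA' },
  followers: [],
  following: []
};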

Reference


Introducing Customization in Loklak Media Wall

My GSoC project includes implementing a media wall in loklak search. One part of the issue is to include customization options in the media wall. I looked around for the most important features that can be implemented in a media wall to give the user a more appealing and personalized view. One such feature is enabling a Full Screen Mode. This feature helps the user display the media wall on a projector or any big display screen without compromising the available space. In the first part, I will explain how I implemented Full Screen Mode in the loklak media wall using the screenfull.js library.

Secondly, it is important to include a very reactive and user-friendly settings box. The settings box should be a central container in which all the customization options are included. In the loklak media wall, the settings box is implemented as a dialog box with various sections organised as tabs. I will also explain how I designed the customization menu using Angular Material.

Implementation

Full Screen Mode

Since loklak search is an Angular 2 application and all the code is written in TypeScript, we can't simply drop in the screenfull.js library. We have to import the library into our application and create a directive that can be applied to an element in order to use it in the application.

  • Install the screenfull.js library in the application using the Node Package Manager.

npm install --save screenfull

import { Directive, HostListener, Output, EventEmitter } from '@angular/core';
import * as screenfull from 'screenfull';

@Directive({
  selector: '[toggleFullscreen]'
})
export class ToggleFullscreenDirective {
  constructor() {}

  @HostListener('click') onClick() {
    if (screenfull.enabled) {
      screenfull.toggle();
    }
  }
}
  • Import the directive into the module and add it to the declarations. This allows the directive to be used anywhere in the templates.

import { ToggleFullscreenDirective } from '../shared/full-screen.directive';
...

@NgModule({
  ...
  declarations: [
    ...
    ToggleFullscreenDirective,
    ...
  ]
})
export class MediaWallModule { }
  • Now, the directive is ready to use on the template. We just have to add this attribute directive to an element.

<i toggleFullscreen mdTooltip="Full Screen" mdTooltipPosition="below" class="material-icons md-36">fullscreen</i>

Customization Menu

The customization menu is built around the idea of a central container for customization. It is created using two components of Angular Material: Dialog and Tabs. We will now look at how the customization menu is implemented using these two components.

  • Create a component with the pre-configured position, height and width of the dialog box. This can be done simply using the updatePosition and updateSize methods of the MdDialogRef class.

export class MediaWallCustomizationComponent implements OnInit {
  public query: string;

  constructor(
    private dialogRef: MdDialogRef<MediaWallCustomizationComponent>,
    private store: Store<fromRoot.State>,
    private location: Location) { }

  ngOnInit() {
    this.dialogRef.updatePosition('10px');
    this.dialogRef.updateSize('80%', '80%');
  }

  public searchAction() {
    if (this.query) {
      this.store.dispatch(new mediaWallAction.WallInputValueChangeAction(this.query));
      this.location.go('/wall', `query=${this.query}`);
    }
  }
}
  • Create a template for the customization menu. We will be using md-tab and md-dialog to create a dialog box with options displayed using tabs. dynamicHeight should be set to true so that the dialog box adjusts according to the tabs. We can simply add the attribute md-dialog-close to a button which should close the dialog box. All the content should be added in an element with the attribute md-dialog-content attached to it. Moreover, to make the options look more user-friendly and adjustable on smaller screens, icons should be added next to the tab titles.

<h1 md-dialog-title>Customization Menu</h1>
<button class="form-close" md-dialog-close>x</button>
<span md-dialog-content>
  <md-tab-group color="accent" dynamicHeight="true">
    <md-tab>
      <ng-template md-tab-label>
        <md-icon>search</md-icon>
        Search For
      </ng-template>
      <h3> Search Customization </h3>
      <md-input-container class="example-full-width" color="accent">
        <input placeholder="Search Term" mdInput type="text" class="input" name="search-term" [(ngModel)]="query">
      </md-input-container>
      <span class="apply-button">
        <button md-raised-button color="accent" md-dialog-close (click)="searchAction()">Display</button>
      </span>
    </md-tab>
  </md-tab-group>
</span>

The code currently shows the search customization. It records the input using [(ngModel)] for two-way binding and calls the search action whenever the user clicks on the Display button.

  • Add a button which opens the dialog box using the open method of the MdDialog class. This method creates an instance of MediaWallCustomizationComponent and the component shows up dynamically.

<i class="material-icons md-36" (click)="dialog.open(MediaWallCustomizationComponent)">settings</i>
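For the template above to work, the dialog service needs to be injected into the component that hosts this button, roughly as follows (a minimal sketch; the component and selector names are assumed):

// Hedged sketch: the host component injects MdDialog so the template can call
// dialog.open(MediaWallCustomizationComponent).
import { Component } from '@angular/core';
import { MdDialog } from '@angular/material';
import { MediaWallCustomizationComponent } from './media-wall-customization/media-wall-customization.component';

@Component({
  selector: 'media-wall-controls',                       // assumed name
  templateUrl: './media-wall-controls.component.html'    // assumed name
})
export class MediaWallControlsComponent {
  constructor(public dialog: MdDialog) { }   // public so the template can use it
}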
  • It is important to add MediaWallCustomizationComponent as an entry component in the module so that AOT compiler can create a ComponentFactory for it during initialization.

import { MediaWallCustomizationComponent } from './media-wall-customization/media-wall-customization.component';

@NgModule({
  entryComponents: [
    MediaWallCustomizationComponent
  ]
})
export class MediaWallModule { }

 

This creates an appealing and user-friendly customization menu which acts as a central container for customization options.

References


Implementation of Customizable Instant Search on Susper using Local Storage

Results on Susper can be displayed instantly as the user types in a query. This used to be a fixed behaviour, with no option for the user to customize it, but now the feature can be turned on and off.

To turn this feature on or off, visit ‘Search Settings’ on Susper at http://susper.com/preferences, where you will find different options to choose from.

How did we implement this feature?

Searchsettings.component.html:

<div>
 <h4><strong>Susper Instant Predictions</strong></h4>
 <p>When should we show you results as you type?</p>
 <input name="options" [(ngModel)]="instantresults" disabled value="#" type="radio" id="op1"><label for="op1">Only when my computer is fast enough</label><br>
 <input name="options" [(ngModel)]="instantresults" [value]="true" type="radio" id="op2"><label for="op2">Always show instant results</label><br>
 <input name="options" [(ngModel)]="instantresults" [value]="false" type="radio" id="op3"><label for="op3">Never show instant results</label><br>
</div>

The user is presented with options regarding instant search. When the user selects a new option, the selection is stored in the instantresults variable of the search settings component using ngModel.

Searchsettings.component.ts:

Later, when the user clicks on the save button, the instantresults value is stored in the browser's localStorage:

onSave() {
 if (this.instantresults) {
   localStorage.setItem('instantsearch', JSON.stringify({value: true}));
 } else {
   localStorage.setItem('instantsearch', JSON.stringify({ value: false }));
   localStorage.setItem('resultscount', JSON.stringify({ value: this.resultCount }));
 }
 this.router.navigate(['/']);
}

 

This value is later retrieved from localStorage whenever the user enters a query in the search bar component, and the search is made according to the user's preference.

Searchbar.component.ts


onquery(event: any) {
 this.store.dispatch(new query.QueryAction(event));
 let instantsearch = JSON.parse(localStorage.getItem('instantsearch'));

 if (instantsearch && instantsearch.value) {
   this.store.dispatch(new queryactions.QueryServerAction({'query': event, start: this.searchdata.start, rows: this.searchdata.rows}));
   this.displayStatus = 'showbox';
   this.hidebox(event);
 } else {
   if (event.which === 13) {
     this.store.dispatch(new queryactions.QueryServerAction({'query': event, start: this.searchdata.start, rows: this.searchdata.rows}));
     this.displayStatus = 'showbox';
     this.hidebox(event);
   }
 }
}

 

Interaction of different components here:

  1. First, we set the instantresults value in localStorage from the search settings component.
  2. Later, this value is retrieved and used by the search bar component via the localStorage.getItem() method to decide whether to display results instantly or not; a small helper like the sketch below could wrap this lookup.
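As an illustration, a small helper that reads the stored preference with a safe default could look like this (a sketch, not part of the actual Susper code):

// Illustrative sketch: read the instant-search preference saved by the settings
// page, defaulting to false when the user has never saved the setting.
function isInstantSearchEnabled(): boolean {
  const stored = localStorage.getItem('instantsearch');
  if (!stored) {
    return false;
  }
  const parsed = JSON.parse(stored);
  return parsed && parsed.value === true;
}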

The GIF below shows how you can use this feature in Susper to customize instant results in your browser.

References:

 


Adding Masonry Grid Layout to loklak Media Wall

While working on loklak media walls, I wanted to add a responsive, self-adjusting grid layout for media walls. Going through the most trending media walls, I concluded that the most commonly used view is the masonry view, a layout much like the Pinterest grid. In fact, David DeSandro, the creator of Masonry, states that the masonry view can be one of the most responsive and prettiest ways to present cards. It is also beneficial because it avoids unnecessary gaps and makes full use of the display screen.

In this blog, we are going to see how I added a masonry view to the media walls with no dependency other than CSS, using column-count and column-gap. We will also look at how to adjust the font size using rem units to keep text readable on all screen sizes, depending on the number of columns.

HTML File

<span class="masonry">
  <span class="masonry-panel" *ngFor="let item of (apiResponseResults$ | async)">
    <span class="masonry-content">
      <media-wall-card [feedItem]="item"></media-wall-card>
    </span>
  </span>
</span>
  • The first span with class masonry is like a container in which all the cards will be embedded. This div will provide a container to adjust the number of columns in which cards will be adjusted.
  • The second span with class masonry-panel will be the column division. These panels are like the elements of a matrix. These panels are responsive and will adjust according to the screen size.
  • The third span with class masonry-content are like the content box in which all the content will be embedded. This div will create a space in the panel for the card to be adjusted.
  • The fourth element, media-wall-card, is the card in which each feed item is placed.

CSS File

  • Adjusting columns: The column-count and column-gap properties were introduced in CSS3 to divide an element into a specified number of columns and to keep a specified gap between those columns. The column counts below are kept for the different screen sizes. We need to adjust the number of columns according to the screen size so that cards look neither too stretched nor too bleak. The media wall is now responsive enough to adjust on smaller screens like mobiles with one column, and on larger screens with up to five columns.

@media only screen and (max-width: 600px) {
  .masonry {
    column-count: 1;
    column-gap: 0;
  }
}
@media only screen and (min-width: 600px) and (max-width: 900px) {
  .masonry {
    column-count: 2;
    column-gap: 0;
  }
}
@media only screen and (min-width: 1280px) and (max-width: 1500px) {
  .masonry {
    column-count: 3;
    column-gap: 0;
  }
}
@media only screen and (min-width: 1500px) and (max-width: 1920px) {
  .masonry {
    column-count: 4;
    column-gap: 0;
  }
}
@media only screen and (min-width: 1920px) {
  .masonry {
    column-count: 5;
    column-gap: 0;
  }
}
  • Adjusting font size: To keep a fixed aspect ratio between the various divisions of the media wall card, we need a central unit that can be adjusted while keeping this ratio fixed. Using px would make the sizes fixed, and adjusting them individually for every screen size would be hectic and would spoil the ratio. Instead, we use rem as the font unit and adjust the font size for different screen sizes. First, we assign a font size in rem to all the elements in the media wall card. Then we configure the central root font size for each screen size using @media queries.

One thing to keep in mind is that the root font size should always be kept above 14px.

@media only screen and (max-width: 600px) {
  :root {
    font-size: 14px;
  }
}
@media only screen and (min-width: 600px) and (max-width: 800px) {
  :root {
    font-size: 16px;
  }
}
@media only screen and (min-width: 800px) and (max-width: 1200px) {
  :root {
    font-size: 17px;
  }
}
@media only screen and (min-width: 1200px) and (max-width: 1500px) {
  :root {
    font-size: 18px;
  }
}
@media only screen and (min-width: 1500px) {
  :root {
    font-size: 20px;
  }
}

 

This creates a masonry layout; all the cards can now be placed in this self-adjusting grid and remain readable on all types of screens.

Reference
