Adding Image Editor in Phimpme Android app

Image editing is a core feature of the Phimpme Android application. We use the already existing Image Editor component. Currently the app flow is Gallery → Open image. We now add a menu option named “Edit” at the top which redirects to the edit image activity with that image path. The user performs the edit operations there, applies the changes, and finally the album is updated with the new file.

The task is to first pass the image path to the EditImageActivity and then save the edited image in a different folder once the user has finished editing.

Below is the high-level architecture of how this is achieved with the help of the Image Editor.

Steps to integrate Editor:

  • Get the selected Image URI

First, add the permission to the manifest file:

<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

Get the local images on internal storage or the SD card using the MediaStore content provider.

Uri uri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
String[] projection = { MediaStore.Images.Media._ID, MediaStore.Images.Media.BUCKET_ID, MediaStore.Images.Media.BUCKET_DISPLAY_NAME };
Cursor cursor = getContentResolver().query(uri, projection, null, null, null);

After that you can iterate over the cursor to the end and read the details of each image.
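A minimal sketch of such an iteration, using the projection above (the exact handling in Phimpme may differ; requires the android.content.ContentUris and android.provider.MediaStore imports):

if (cursor != null) {
    try {
        int idColumn = cursor.getColumnIndexOrThrow(MediaStore.Images.Media._ID);
        int bucketColumn = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.BUCKET_DISPLAY_NAME);
        while (cursor.moveToNext()) {
            long id = cursor.getLong(idColumn);
            String bucketName = cursor.getString(bucketColumn);
            // Build a content URI for this image from its id
            Uri imageUri = ContentUris.withAppendedId(
                    MediaStore.Images.Media.EXTERNAL_CONTENT_URI, id);
            // use imageUri and bucketName, e.g. add them to the album list
        }
    } finally {
        cursor.close();
    }
}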

  • Create a separate folder where edited images will be saved.

Now after editing we have to place the edited image into a separate folder with a new name. You can create the folder like this:

File aviaryFolder = new File(baseDir, FOLDER_NAME);
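If the folder does not exist on disk yet, it also has to be created; a minimal sketch using the aviaryFolder object above:

// Create the folder on disk if it does not already exist
if (!aviaryFolder.exists()) {
    boolean created = aviaryFolder.mkdirs();
    if (!created) {
        // the folder could not be created, handle the error (e.g. log it)
    }
}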

Generate a new file in which the edited image will be saved:

public static File getEmptyFile(String name) {
    File folder = FileUtils.createFolders();
    if (folder != null) {
        if (folder.exists()) {
            File file = new File(folder, name);
            return file;
        }
    }
    return null;
}

FileUtils.getEmptyFile("edited" + System.currentTimeMillis() + ".png");

 

It will save the new image as edited<timeinmillis>.png, so the file name is always unique.

  • startActivityForResult with the image URI and the edit activity

So first, when the user opens an image and clicks Edit, we create an empty file using the above method, get the image URI and pass it to EditImageActivity. Besides the URI, we also pass the output file path.

Then we call startActivityForResult with the intent and a request code.
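A minimal sketch of launching the editor; the extra keys and the REQUEST_EDIT_IMAGE constant are illustrative placeholder names, not necessarily the ones used in Phimpme:

// Illustrative: extra keys and the request code are placeholder names
private static final int REQUEST_EDIT_IMAGE = 1;

private void openEditor(Uri imageUri) {
    File outputFile = FileUtils.getEmptyFile("edited" + System.currentTimeMillis() + ".png");
    Intent intent = new Intent(this, EditImageActivity.class);
    intent.putExtra("extra_input_uri", imageUri.toString());            // source image
    intent.putExtra("extra_output_path", outputFile.getAbsolutePath()); // where the edited image is saved
    startActivityForResult(intent, REQUEST_EDIT_IMAGE);
}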

You can track the edits done by the user and only save if something was actually changed; in Phimpme we keep a counter which is incremented on every change of the bitmap.

Update the album by inserting the image into the MediaStore:

ContentValues values = new ContentValues(2);
String extensionName = getExtensionName(dstPath);
values.put(MediaStore.Images.Media.MIME_TYPE, "image/" + (TextUtils.isEmpty(extensionName) ? "jpeg" : extensionName));
values.put(MediaStore.Images.Media.DATA, dstPath);
context.getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);

Please find the whole code here: github.com/fossasia/phimpme-android



Editing files and piped data with “sed” in Loklak server

What is sed?

“sed” is used in “Loklak Server”, one of the most popular projects of FOSSASIA. “sed” is an acronym for “Stream Editor”, used for filtering and transforming text, as mentioned in the manual page of sed. The stream can be a file, input from a pipeline, or even standard input. Regular expressions are used to filter the text and transformations are carried out with sed commands, either inline or from a file. So, most of the time, a single line does the work of substituting text, removing it, or obtaining a value from a text file.

Basic Syntax of “sed”

$sed [options]... {inline commands or file having sed commands} [input_file]...

Loklak Server uses a config.properties file – a set of key-value pairs – which is the basis of the server's configuration; its values are used by the server at runtime for various operations.

Let’s go through a simple sed example that prints the lines of config.properties beginning with the word “https”.

$sed -n '/^https/p' config.properties

Here the “-n” option suppresses automatic printing of the pattern space (the pattern space is where each line to be processed by sed is placed). Without the “-n” option sed would print every line of the file.

Now the regular expression part: “^https” matches all lines that have “https” at the start of the line, and “p” is the print command, which prints the matching lines to the console. Finally we provide the filename, i.e. config.properties. If no filename is provided, sed reads from standard input.

Use cases of “sed” in Loklak Server

  • Displaying the proper port number in the message while starting or installing Loklak Server

The default port of Loklak Server is 9000, but it can be started on any unoccupied port by using the “-p” flag with bin/start.sh or bin/installation.sh, for example

$ bin/installation.sh -p 8888

starts the installation of Loklak Server on port 8888. To display the proper localhost address so that the user can open it in a browser, the port number in the shortlink.urlstub parameter of config.properties needs to be changed. This is carried out by the function change_shortlink_urlstub in bin/utility.sh.
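The function body is not reproduced here; based on the pieces discussed below (the -i flag, the two regex groups and the \1'"$1"' replacement), a hedged reconstruction of its core looks like this – the exact quoting in bin/utility.sh may differ:

# Hedged reconstruction of change_shortlink_urlstub; $1 is the port to use
function change_shortlink_urlstub {
    sed -i 's/\(shortlink\.urlstub=http:.*:\)\(.*\)/\1'"$1"'/' conf/config.properties
}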

Now let’s try to understand what the sed command is doing.

The “-i” option is used for in-place editing of the specified file, i.e. config.properties in the conf directory.

“s” is the substitute command of sed. The expression between the “/” delimiters can be divided into two parts:

  1. \(shortlink\.urlstub=http:.*:\)\(.*\) is used to find the match in a line.
  2. \1'"$1"' is used to substitute the string matched by part 1.

A regular expression can be split into groups so that operations can be performed on each group separately. A group is enclosed between “\(“ and “\)”. The 1st part of our regular expression contains two groups.

Dissecting the first group i.e. \(shortlink\.urlstub=http:.*:\):

  • “shortlink\.urlstub=http:” will match the expression “shortlink.urlstub=http:”; here “\” is used as an escape character, since “.” in regex matches any character.
  • In “.*:”, “.” represents any character and “*” represents zero or more occurrences of the previous character, so “.*” matches almost any expression. Since the group ends with a “:”, it matches any expression ending with “:”; here that is “//localhost:”.

So, the first group collectively matches the expression “shortlink.urlstub=http://localhost:”.

As described above, the second group, i.e. \(.*\), will match any expression; here it matches “9000”.

Now coming to the 2nd part of the regular expression, i.e. \1'"$1"':

  • “\1” represents the match of the first group in 1st part i.e. “shortlink.urlstub=http://localhost:”.
  • “$1” is the value of the first parameter provided to our function “change_shortlink_urlstub” which is an unused port where we want to run Loklak Server.

So the 2nd part picks up the match of the first group and concatenates it with the first parameter of the function. Assuming the first parameter is “8888”, the replacement becomes “shortlink.urlstub=http://localhost:8888”, which replaces “shortlink.urlstub=http://localhost:9000”.

So a correct localhost address is displayed in the console while starting or installing Loklak Server.

  • Extracting the value of a key from config.properties

“grep” and “sed” are used in the bash scripts to extract the values of keys from config.properties, e.g. the default port (the value of “port.http”) in bin/start.sh, and the value of “shortlink.urlstub” in bin/start.sh and bin/installation.sh.

$ grep -iw 'port.http' conf/config.properties | sed 's/^[^=]*=//'

Here grep is used to filter out the matching line and pass it to sed through a pipe. The “-i” flag ignores case and the “-w” flag matches whole words only. The output of

$ grep -iw 'port.http' conf/config.properties
port.http=9000

The aim is to get “9000”, the value of “port.http”. The approach used here is to substitute “port.http=” in the output of the grep command above with an empty string, so that only “9000” remains.

Let’s deconstruct the “sed” part, s/^[^=]*=//:

The “s” command of sed is used for substitution. Here the 1st part is “^[^=]*=” and the 2nd part is empty, as no characters are enclosed within the final “//”.

  • “^” means to match at the start.
  • “[^characters]” matches any character that is not in the given set (to match a set of characters, “^” is simply omitted inside the square brackets: “[characters]”). Here “[^=]” means any character other than the “=” symbol, and the “*” after it matches a run of such characters.

So “^[^=]*=” matches, from the start of the line, a sequence of characters that contains no “=”, followed by a “=”. The matched expression in this case is “port.http=”, which is substituted by an empty string, leaving “9000”. Finally, “9000” is assigned to the required variable in the bash script.
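In a bash script the value can then be captured in a variable; a small illustrative example (the variable name here is not necessarily the one used in the scripts):

# Illustrative: store the configured HTTP port in a shell variable
CUSTOM_PORT=$(grep -iw 'port.http' conf/config.properties | sed 's/^[^=]*=//')
echo "Loklak Server will run on port $CUSTOM_PORT"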


Auto Deploying accounts.susi.ai on gh-pages

While migrating the web apps from the susi server repository to the accounts.susi.ai repository, our team auto-deployed accounts.susi.ai on the gh-pages branch, i.e. each time changes are made to the code, they are automatically deployed to the gh-pages branch and are immediately live and visible on the site.

To do this we have to set up Travis in our repository. Travis can easily be enabled for a GitHub repository just by adding a .travis.yml file to the root directory of the repo. The configuration of Travis changes depending on the type of repository you are dealing with. accounts.susi.ai is written on top of the ReactJS framework, so we used the following configuration.

sudo: required
dist: trusty
language: node_js
node_js:
  - 6
script:
  - npm test
deploy:
  provider: script
  script: "./deploy.sh"
env:
  global:
  - ENCRYPTION_LABEL: "<.... encryption label from previous step ....>"
  - COMMIT_AUTHOR_EMAIL: "you@example.com"
cache:
  directories:
    - node_modules
branches:
  only:
    - master

 

The above Travis configuration checks after every commit whether the build is passing by running the npm test command. After checking the commit, the deploy.sh script is run for it. The last lines tell Travis to only do this for commits on the master branch. deploy.sh can be placed anywhere; we just need to modify its path in the .travis.yml file. Here is the code of deploy.sh:

#!/bin/bash
set -e 

SOURCE_BRANCH="master"
TARGET_BRANCH="gh-pages"

if [ "$TRAVIS_PULL_REQUEST" != "false" -o "$TRAVIS_BRANCH" != "$SOURCE_BRANCH" ]; then
    echo "Skipping deploy; just doing a build."
    doCompile
    exit 0
fi
# Save some useful information
REPO=`git config remote.origin.url`
SSH_REPO=${REPO}
SHA=`git rev-parse --verify HEAD`

git clone $REPO out
cd out
git checkout $TARGET_BRANCH || git checkout --orphan $TARGET_BRANCH
cd ..
git config user.name "Travis CI"
git config user.email "travis-ci@github.com"

if git diff --quiet; then
    echo "No changes to the output on this push; exiting."
    exit 0
fi

# Commit the "changes", i.e. the new version.
# The delta will show diffs between new and old versions.
git add -A .
git commit -m "Deploy to GitHub Pages: ${SHA}"

# Get the deploy key by using Travis's stored variables to decrypt deploy_key.enc
ENCRYPTED_KEY_VAR="encrypted_${ENCRYPTION_LABEL}_key"
ENCRYPTED_IV_VAR="encrypted_${ENCRYPTION_LABEL}_iv"
ENCRYPTED_KEY=${!ENCRYPTED_KEY_VAR}
ENCRYPTED_IV=${!ENCRYPTED_IV_VAR}
openssl aes-256-cbc -K $ENCRYPTED_KEY -iv $ENCRYPTED_IV -in ../deploy_key.enc -out ../deploy_key -d
chmod 600 ../deploy_key
eval `ssh-agent -s`
ssh-add deploy_key

# Now that we're all set up, we can push.
git push $SSH_REPO $TARGET_BRANCH

 

deploy.sh automatically deploys master to the gh-pages branch. To perform this task it needs authorisation for the GitHub repo, and we handle this authentication with encrypted keys.
To create the encrypted keys we first need to generate a new SSH key, which can be done by running the following command in a terminal.

ssh-keygen -t rsa -b 4096 -C "your_email@example.com"

 

Then add this key as a deploy key in your project's GitHub repository settings:

https://github.com/<your name>/<your repo>/settings/keys

Now we can generate the encrypted key by running the following command (using the Travis CLI).

travis encrypt-file deploy_key

 

This command generates a deploy_key.enc file, which should be placed in the repo. Its location needs to be updated in the placeholders of .travis.yml and deploy.sh.

Hence we achieved auto deployment to gh-pages for accounts.susi.ai. Please ensure that the account with which the encrypted keys were created always keeps admin access to the repository, so that deployment continues to work.


Scraping in JavaScript using Cheerio in Loklak

FOSSASIA recently started a new project, loklak_scraper_js. The objective of the project is to develop a single web-scraping library that can be used easily on most platforms, as maintaining the same scraping logic in different programming languages and projects is a headache and a waste of time. An obvious solution to this is writing the scrapers in JavaScript: JS is lightweight, fast, and its functions and classes can easily be used from many other programming languages, e.g. via Nashorn in Java.

Cheerio is a library that is used to parse HTML. Let’s look at the youtube scraper.

Parsing HTML

Steps involved in web-scraping:

  1. The HTML source of the webpage is obtained.
  2. The HTML source is parsed, and
  3. the parsed HTML is traversed to extract the required data.

Cheerio is used for the 2nd and 3rd steps.

Obtaining the HTML source of a webpage is a piece of cake; it is done by the getHtml function, which uses the sync-request library to send the “GET” request.
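A minimal sketch of such a helper, assuming the sync-request package (the actual getHtml in loklak_scraper_js may differ):

// Sketch of a getHtml-style helper built on sync-request; illustrative only
var request = require("sync-request");

function getHtml(url) {
    // blocking GET request; return the response body as a string
    var response = request("GET", url);
    return response.getBody().toString();
}

var html = getHtml("https://www.youtube.com/results?search_query=fossasia");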

Parsing of HTML can be done using the load method by passing the obtained HTML source of the webpage, as in getSearchMatchVideos function.

var $ = cheerio.load(htmlSourceOfWebpage);

 

Since the API of cheerio is similar to that of jQuery, by convention the variable referencing the cheerio object holding the parsed HTML is named “$”.

Sometimes the requirement may be to extract data from a particular HTML tag (a tag containing a large number of nested child tags) rather than from the whole parsed HTML. In that case the load method can be used again, as in the getVideoDetails function, to obtain only the head tag.

var head = cheerio.load($("head").html());

The “html” method provides the HTML content of the selected tag, i.e. the <head> tag. If a parameter is passed to the html method, then the content of the selected tag (here <head>) is replaced by the HTML of that parameter.
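For illustration, the getter and setter behaviour of the html method in a standalone example (not code from the scraper):

var cheerio = require("cheerio");

var $ = cheerio.load("<head><title>Old title</title></head><body></body>");
var headHtml = $("head").html();   // getter: returns '<title>Old title</title>'
$("title").html("New title");      // setter: replaces the inner HTML of <title>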

Extracting data from parsed HTML

Some of the content we see in the webpage is dynamic; it is not part of the static HTML. When a “GET” request is sent, only the static HTML of the webpage is obtained. Using Inspect element it can be seen that the class attribute has a different value in the rendered webpage than in the static HTML we obtain from the “GET” request using the getHtml function. For example, inspecting the link of one of the suggested videos shows the different values of the class attribute:

 

  • In the website:

  • In the static HTML, obtained from the “GET” request using the getHtml function:

So it is recommended to first check whether the attributes have the same values in both, and then proceed accordingly.

Now, let’s dive into the actual scraping stuff.

Most of the required data is available in meta tags inside the head tag. The extractMetaAttribute function extracts the value of the content attribute of a meta tag selected by another attribute and its value.

function extractMetaAttribute(cheerioObject, metaAttribute, metaAttributeValue) {
	var selector = 'meta[' + metaAttribute + '="' + metaAttributeValue + '"]';
	return cheerioObject(selector).attr("content");
}

The “cheerioObject” here will be the “head” object created above.

For example, our final JSONObject contains an og_url key-value pair; to get its value we need to obtain the following HTML element:

<meta property="og:url" content="https://www.youtube.com/watch?v=KVGRN7Z7T1A">

 

This can be obtained by:

  1. Writing a selector for property attribute of meta. The selector would be ‘meta[property=”og:url”]’.
  2. The selector is passed to cheerioObject.
  3. Then attr method is used to obtain the value of content attribute.
  4. Finally, we set the obtained value of content attribute as the value of JSONObject’s key.

Similarly og:site_name, og:url and other values can be extracted, which in the final JSONObject become the values of the keys og_site_name, og_url and so on. Since a lot of data needs to be extracted this way, the extractMetaAttribute function generalizes it, where metaAttribute is “property” and metaAttributeValue is “og:url” in the above example.
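For example, a hedged sketch of how extractMetaAttribute could be used with the head object created above (the result object and its key names follow the description here, not necessarily the scraper's exact code):

// Illustrative usage of extractMetaAttribute with the parsed <head>
var videoJson = {};
videoJson.og_url = extractMetaAttribute(head, "property", "og:url");
videoJson.og_site_name = extractMetaAttribute(head, "property", "og:site_name");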

If one parameter is provided to the attr method, it acts as a getter: the value of that attribute is returned. If two parameters are provided, the first is the name of the attribute and the second is the value to set, so in this case it acts as a setter.
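A tiny standalone illustration of the two forms of attr (values here are placeholders):

var og = head('meta[property="og:url"]');
var value = og.attr("content");                   // getter: returns the attribute value
og.attr("content", "https://example.org/video");  // setter: sets the attribute value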

Now, what if the provided selector matches more than one HTML element and we need to extract data from, or perform some operation on, all of them? The answer is the each method of the cheerio object: it iterates over the matched elements and executes the function passed as a parameter on each of them. The passed function has two parameters, the index of the matched element and the matched element itself. To break out of the loop early, return false.

One of the use cases of the each method in the YouTube scraper is extracting the related “tags” of the video.

The selector for this would be 'meta[property="og:video:tag"]', and as these tags are inside the head tag, we can use the already created head object. Applying the each method, it becomes:

head('meta[property="og:video:tag"]').each(function(i, element) {
    // the logic goes here
});

 

Here for the first iteration the value of “i” will be “0” and “element” will be

<meta property="og:video:tag" content="Iggy">

 

and so on. We need to obtain the value of the content attribute, so we can use the attr method as above. Finally all the values are pushed to an array. Hence, the final code snippet with the logic:

var ary = [];
head('meta[property="og:video:tag"]').each(function(i, element) {
    ary.push(head(element).attr("content"));
});

 

The same functionality is implemented in extractMetaProperties method.

function extractMetaProperties(cheerioObj, metaProperty) {
	var properties = [];
	var selector = 'meta[property="' + metaProperty + '"]';
	cheerioObj(selector).each(function(i, element) {
		properties.push(cheerioObj(element).attr("content"));
	});
	return properties;
}

Adding Custom Scrollbar to SUSI AI Web Chat

A scrollbar represents the depth of the content on the current screen. It appears when the content overflows the screen and cannot fit anymore. We see scrollbars everywhere. By default, the scrollbar provided by the browser is not very attractive, but it is efficient at doing its job.

We decided that as our SUSI.AI Web App is improving in both UI and functionality, let’s add a custom scrollbar to it.

Earlier we had a standard scrollbar in our SUSI.AI web chat:

For adding a custom scrollbar to our web chat we decided to use react-custom-scrollbars npm-package.

Our reasons to choose this package were:

  • Auto Hide feature in the scrollbar, after a specific period of time, which we can modify too.
  • No requirement for extra CSS styles to style our scrollbar.
  • It is well tested and trusted by many open source developers.

To install this npm package:

npm install -S react-custom-scrollbars 

Now comes the usage part, we need to import this into our JavaScript file:

 import { Scrollbars } from 'react-custom-scrollbars';

After importing it, wrap it around the content where you want to show a custom scrollbar. In our case that was messageListItems, and the code snippet looked like:

<Scrollbars>
 {messageListItems}
</Scrollbars>

This made our scrollbar look much better than the default one:

Now to add Auto Hide functionality to our scrollbar we need to add some attributes to our <Scrollbars>  tag.

    1. autoHide: enables the auto-hide feature for the scrollbar.
    2. autoHideTimeout: sets the hiding delay of the scrollbar in milliseconds.
    3. autoHideDuration: sets the duration of the hiding animation in milliseconds.

After adding the above-mentioned attributes our code changes to:

<Scrollbars
 autoHide
 autoHideTimeout={1000}
 autoHideDuration={200}>
 {messageListItems}
</Scrollbars>

Resources:

A lot more custom attributes can be found in Malte Wessel's documentation here.

Testing Link:

Now we had a much better scrollbar for our web chat which can be tested here.


Refactor of Dropdown Menu in Susper

The first version of the Susper top menu provided links to resources and tutorials. For the next version we were looking for a menu with more colorful icons, a cleaner UI design and a menu that appears on the homepage as well. In this blog I will discuss refactoring the dropdown menu. This is how the earlier dropdown of Susper looked:

We decided to create a separate component for the menu, DropdownComponent.

At first I created a dropdown menu with dimensions similar to those Google uses. Then I gave it padding: 28px to create a UI similar to the market leader's, which results in a dropdown menu with a clean UI design. I replaced the old icons with colorful icons. In the dropdown we have:

  • Added more FOSSASIA projects like eventyay, loklak, susi and the main FOSSASIA website. Here is how it looks now:

The main problem I faced was aligning the content inside the dropdown so that it does not get disturbed when the screen size changes.
I kept each icon's dimensions at 48 x 48 inside the dropdown menu and arranged the icons in rows. It was easier to use div elements to create the rows rather than the ul and li tags which were implemented earlier.

To create the horizontal grey line effect I used the hr element, and made sure the padding remained the same above and below the line.

For the end of the dropdown menu, @mariobehling suggested that instead of writing ‘more’, the link should redirect to the FOSSASIA projects page.

This is how I refactored the dropdown menu and added it to the homepage as well.



How SUSI AI Searches the Web For You

SUSI is now capable of performing web searches to answer your queries. When SUSI doesn't know how to answer a query, it performs a web search on the client side and displays all the results as horizontally swipeable tiles, with each tile giving a brief description and a link to the relevant source.

Let's visit SUSI WebChat and try it out.

Query : Search for Google
Response : <Web Search Results>

How does SUSI know when to perform a websearch?

It uses action types to identify whether a web search is to be performed. The API response is parsed to check the action types, and if a websearch action type is present, an API call is made to the DuckDuckGo API with the relevant query and the results are displayed as tiles with:

  • Category : The Topic related to the given query
  • Text : The result from the websearch
  • Image : Image related to the query if present
  • Link : A url redirecting to the relevant source

Parsing the actions :

Let us look at the API response for a query.

Sample Query: search for google

Response: <API Response>

"actions": [
  {
    "type": "answer",
    "expression": "Here is a web search result:"
  },
  {
    "type": "websearch",
    "query": "google"
  }
]

Note: The API Response has been trimmed to show only the relevant content

We find a websearch type action and the query to be searched is google. So we now make an API call to the DuckDuckGo API to get our web search results.

API Call Format : api.duckduckgo.com/?q={query}&format=json

API Call for query google : http://api.duckduckgo.com/?q=google&format=json

And from the duckduckgo API response we generate our websearch tiles showing relevant data using the fields present in each object.

This is a sample object from the DuckDuckGo API response under RelatedTopics, which we use to create our web search result tiles.

{
  "Result": "<a href=\"https:\/\/duckduckgo.com\/Google\">Google<\/a> An American multinational technology company specializing in Internet-related services and...",
  "Icon": {
    "URL": "https:\/\/duckduckgo.com\/i\/8f85c93f.png",
    "Height": "",
    "Width": ""
  },
  "FirstURL": "https:\/\/duckduckgo.com\/Google",
  "Text": "Google An American multinational technology company specializing in Internet-related services and..."
},

Let us look at the code for querying data from the API

if (actions.indexOf('websearch') >= 0) {

  let actionIndex = actions.indexOf('websearch');
  let query = response.answers[0].actions[actionIndex].query;

  $.ajax({
    url: 'http://api.duckduckgo.com/?format=json&q=' + query,
    dataType: 'jsonp',
    crossDomain: true,
    timeout: 3000,
    async: false,

    success: function (data) {
      receivedMessage.websearchresults = data.RelatedTopics;

      if (data.AbstractText) {
        let abstractTile = {
          Text: '',
          FirstURL: '',
          Icon: { URL: '' },
        };
        abstractTile.Text = data.AbstractText;
        abstractTile.FirstURL = data.AbstractURL;
        abstractTile.Icon.URL = data.Image;
        receivedMessage.websearchresults.unshift(abstractTile);
      }

      let message = ChatMessageUtils.getSUSIMessageData(
        receivedMessage, currentThreadID);

      ChatAppDispatcher.dispatch({
        type: ActionTypes.CREATE_SUSI_MESSAGE,
        message
      });
    },

    error: function (errorThrown) {
      console.log(errorThrown);
      receivedMessage.text = 'Please check your internet connection';
    }

  });

}

Here, from the actions object, we get the query needed to search the web. We then make an AJAX call with that query to the DuckDuckGo API. If the API call succeeds, we collect the required data to create the tiles as an array of objects, store it as websearchresults and dispatch the message with the websearchresults. This gets reflected in the message store, and when it is passed to the components we use it to create the result tiles.

<MuiThemeProvider>
  <Paper zDepth={0} className='tile'>
    <a rel='noopener noreferrer'
      href={tile.link} target='_blank'
      className='tile-anchor'>
      {tile.icon &&
        (<Paper className='tile-img-container'>
          <img src={tile.icon}
            className='tile-img' alt=''/>
        </Paper>
        )}
      <Paper className='tile-text'>
        <p className='tile-title'>
          <strong>
            {processText(tile.title,'websearch-rss')}
          </strong>
        </p>
        {processText(tile.description,'websearch-rss')}
      </Paper>
    </a>
  </Paper>
</MuiThemeProvider>

We then display the tiles as a horizontally swipeable carousel, ensuring a good and interactive UX.

React-Slick module was used to implement the horizontal swiping feature.

function renderTiles(tiles) {

  if (tiles.length === 0) {
    let noResultFound = 'NO Results Found';
    return (<center>{noResultFound}</center>);
  }

  let resultTiles = drawTiles(tiles);

  var settings = {
    speed: 500,
    slidesToShow: 3,
    slidesToScroll: 1,
    swipeToSlide: true,
    swipe: true,
    arrows: false
  };

  return (
    <Slider {...settings}>
      {resultTiles}
    </Slider>
  );

}

Here we handle the corner case when there are no results to display by rendering `NO Results Found`. We then have our web search results displayed as swipeable tiles with an image, a title, a description and a link to the source.

This is how SUSI performs web search to respond to user queries ensuring that no query goes unanswered! Don’t forget to swipe left and go through all the results displayed!



Generate Sine Waves with PSLab Device

A sine wave is a type of waveform widely used in frequency-related studies in laboratories, as well as in power electronics to control the level of input to devices. The PSLab device is capable of generating sine waves with very high accuracy using the PSLab firmware and a set of filters implemented in the PSLab hardware.

How Sine Wave is generated in PSLab Device

The PSLab device uses a PIC microcontroller as its main processor. It has several pins which can generate square pulses at different duty cycles; these are known as PWM pins. A PWM wave is a waveform shaped like a train of square pulses. It has an attribute called ‘duty cycle’ which varies between 0% and 100%. A PWM wave with 0% duty cycle is simply a zero-amplitude block of square pulses repeating every period; when the duty cycle is set to 100%, it is a set of square pulses at the highest amplitude throughout every period. The following figure illustrates how the PWM wave changes according to its duty cycle (image extracted from http://static.righto.com/images/pwm1.gif). The PSLab device is capable of generating this type of pulses with arbitrary duty cycles as per user requirements.

In this context, where sine waves are generated, these PWM pins are used to generate a Sinusoidal Pulse Width Modulated (SPWM) waveform as the first step towards outputting a sine wave with high frequency accuracy. The name SPWM is derived from the fact that the duty cycle of the waveform follows an alternately increasing and decreasing pattern, as illustrated in the figure below.

Deriving a set of duty cycles that follows a sinusoidal pattern at runtime would be a redundant task. Instead of deriving them mathematically, the PSLab firmware has four hard-coded sine_tables which store the duty-cycle values of an SPWM waveform. These sine_tables correspond to the different resolutions that can be set by the PSLab user. The following code block, extracted from the PSLab firmware, is one of these sine_tables. It is used to generate an SPWM wave with 512 data points; each data point represents a square pulse with a different pulse width. The duty ratio is calculated by dividing an entry by 512 and converting it to a percentage.

sineTable1[] = {256, 252, 249, 246, 243, 240, 237, 234, 230, 227, 224, 221, 218, 215, 212, 209, 206, 203, 200, 196, 193, 190, 187, 184, 181, 178, 175, 172, 169, 166, 164, 161, 158, 155, 152, 149, 146, 143, 141, 138, 135, 132, 130, 127, 124, 121, 119, 116, 114, 111, 108, 106, 103, 101, 98, 96, 93, 91, 89, 86, 84, 82, 79, 77, 75, 73, 70, 68, 66, 64, 62, 60, 58, 56, 54, 52, 50, 48, 47, 45, 43, 41, 40, 38, 36, 35, 33, 32, 30, 29, 27, 26, 25, 23, 22, 21, 19, 18, 17, 16, 15, 14, 13, 12, 11, 10, 9, 8, 8, 7, 6, 6, 5, 4, 4, 3, 3, 2, 2, 2, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 6, 6, 7, 8, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 21, 22, 23, 25, 26, 27, 29, 30, 32, 33, 35, 36, 38, 40, 41, 43, 45, 47, 48, 50, 52, 54, 56, 58, 60, 62, 64, 66, 68, 70, 73, 75, 77, 79, 82, 84, 86, 89, 91, 93, 96, 98, 101, 103, 106, 108, 111, 114, 116, 119, 121, 124, 127, 130, 132, 135, 138, 141, 143, 146, 149, 152, 155, 158, 161, 164, 166, 169, 172, 175, 178, 181, 184, 187, 190, 193, 196, 200, 203, 206, 209, 212, 215, 218, 221, 224, 227, 230, 234, 237, 240, 243, 246, 249, 252, 256, 259, 262, 265, 268, 271, 274, 277, 281, 284, 287, 290, 293, 296, 299, 302, 305, 308, 311, 315, 318, 321, 324, 327, 330, 333, 336, 339, 342, 345, 347, 350, 353, 356, 359, 362, 365, 368, 370, 373, 376, 379, 381, 384, 387, 390, 392, 395, 397, 400, 403, 405, 408, 410, 413, 415, 418, 420, 422, 425, 427, 429, 432, 434, 436, 438, 441, 443, 445, 447, 449, 451, 453, 455, 457, 459, 461, 463, 464, 466, 468, 470, 471, 473, 475, 476, 478, 479, 481, 482, 484, 485, 486, 488, 489, 490, 492, 493, 494, 495, 496, 497, 498, 499, 500, 501, 502, 503, 503, 504, 505, 505, 506, 507, 507, 508, 508, 509, 509, 509, 510, 510, 510, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 510, 510, 510, 509, 509, 509, 508, 508, 507, 507, 506, 505, 505, 504, 503, 503, 502, 501, 500, 499, 498, 497, 496, 495, 494, 493, 492, 490, 489, 488, 486, 485, 484, 482, 481, 479, 478, 476, 475, 473, 471, 470, 468, 466, 464, 463, 461, 459, 457, 455, 453, 451, 449, 447, 445, 443, 441, 438, 436, 434, 432, 429, 427, 425, 422, 420, 418, 415, 413, 410, 408, 405, 403, 400, 397, 395, 392, 390, 387, 384, 381, 379, 376, 373, 370, 368, 365, 362, 359, 356, 353, 350, 347, 345, 342, 339, 336, 333, 330, 327, 324, 321, 318, 315, 311, 308, 305, 302, 299, 296, 293, 290, 287, 284, 281, 277, 274, 271, 268, 265, 262, 259};

 

The frequency of the sine wave is achieved using interrupts generated at time intervals that depend on the frequency set by the user. The accuracy of the frequency depends on the number of elements in the sine table array, which is known as the resolution: as the number of points in the table increases, the accuracy, or resolution, increases. PSLab uses the Timer3 and Timer4 counters available in the PIC microcontroller to generate the interrupt time intervals. If the required frequency is f, the interrupt time interval can be derived as

Interrupt interval = 1 / (512 × f)

that is, the period of the sine wave divided by the number of points in the table. For example, for f = 1 kHz the duty cycle is updated roughly every 1 / (512 × 1000) s ≈ 1.95 µs.

The derived SPWM waveform is then passed through a cascaded setup of an op-amp and an RC filter, as in Figure 1, to cut off the high-frequency components and combine the square pulses in such a manner that a smoother waveform resembling a sine wave is derived.

Filter Circuit in PSLab

Figure 1

This is an inverting filter circuit designed using the op-amps available in the PSLab hardware. The SPWM waveform is connected to the circuit through the resistor R1. The capacitor C1 creates a short-circuit path to ground for the high-frequency components, which lets only the low-frequency components pass through. This smooths the square pulses, reducing the sharp edges, and forms a simple RC filter circuit.

The capacitor C2 plays an important role in generating the sine wave. It compensates for any voltage drops and absorbs excess voltage levels that may occur during transitions, letting the output waveform follow a path close to a smooth sine wave.

The output waveform can be observed from the SINE1 pin of the PSLab device. As in this schematic, it is the ‘Sine Wave’ pin to the right starting from the Op Amp output.



React Routes to Deploy 404 page on gh-pages and surge

Web applications need a 404 page to handle broken URLs. A well-designed 404 page helps keep users within the application. This is how we added a 404 page to the SUSI.AI web chat application.
React routes?
We use React routes to navigate throughout the application; we have to define which page to go to for each and every route. If the user tries to go to a broken link, the application should show the 404 page.
In the SUSI Web Chat application we have set up the routes in the index.js file, which is at the root of the application.

<Router history={hashHistory}>
        <MuiThemeProvider>
            <Switch>
                <Route exact path="/" component={ChatApp} />
                <Route exact path="/signup" component={SignUp} />
                <Route exact path="/logout" component={Logout} />
                <Route exact path="/forgotpwd" component={ForgotPassword} />
                <Route exact path="*" component={NotFound} />
            </Switch>
        </MuiThemeProvider>
    </Router>

 

The line “<Route exact path=”*” component={NotFound} />” defines the 404 page route. It should be defined after all the other routes, because the application should first search the defined routes; if the requested route is not among them, the application shows the 404 route.
To use these JSX elements we have to import the dependency at the top of index.js:

import {
    BrowserRouter as Router,
    Route,
    Switch,
    hashHistory
} from 'react-router-dom';

 

After defining the routes like this, it works correctly on localhost, but after deploying to gh-pages or surge it does not work as we wish.
When we try to access a URL directly, or try to access a wrong URL, it redirects to the default GitHub 404 page.

After we build our application we get a static index.html file and a set of other files. When we try to access chat.susi.ai it loads the index.html file; if we access another URL it automatically loads the default GitHub Pages 404 page.
So we can use a little hack to support direct URLs.
We add a copy of index.html as 404.html. When the user goes to a URL other than chat.susi.ai, GitHub Pages loads our 404.html file; since it contains all the routes, it redirects to the correct route, and if there is no matching route it shows our 404 page instead of the default GitHub Pages one.
Since all our deployment tasks are handled by Travis, we have to add this action to the deploy.sh file like this.

rm -rf node_modules/	 
mv ../build/* .
cp index.html 404.html   <--- this part

 

Then a 404.html file with the content of index.html is created after each and every commit.
If you need to do the same thing on surge.sh (we use surge to show previews of every PR), we have to add a copy of index.html as 200.html.
You can do this after you run

npm run deploy

 

Just before giving the deployment URL, you need to create a copy of the index.html file in the build folder, rename it to 200.html and continue; now it will work as we wish. This is all for today.
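The equivalent non-interactive commands could look roughly like this (a sketch; the build folder and the preview domain are illustrative):

# Sketch: create the 200.html copy for client-side routing, then deploy the build folder
cp build/index.html build/200.html
surge build/ susi-webchat-preview.surge.sh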

Resources

  • Read More – Adding a 200 page for client-side routing : https://surge.sh/help/adding-a-200-page-for-client-side-routing

How SUSI AI Tabulates Answers For You

SUSI is an artificial intelligence chat bot that responds to all kinds of user queries. It isn't a regular chat bot replying in just plain text; it supports various response types, which we refer to as ‘actions’. One such action is the “table” type. When the response to a user query contains a list of answers which can be grouped, it is better visualised as a table rather than plain text.

Let's visit SUSI WebChat and try it out. In our example we ask SUSI for the 2009 race statistics of British Formula 1 racing driver Lewis Hamilton.

Query: race stats of hamilton in f1 season 2009

Response: <table> (API response)

 

 

How does SUSI do that? Let us look at the skill teaching SUSI to give table responses.

# Returns race stats as a table

race summary of  * in f1 season *|race stats of  * in f1 season *
!console:
{
  "url":"http://ergast.com/api/f1/$2$/drivers/$1$/status.json",
  "path":"$.MRData.StatusTable.Status",
  "actions":[{
     "type":"table",
     "columns":{"status":"Race Status","count":"Number Of Races"}
   }]
}
eol

Here we are telling SUSI, through the type attribute in actions, that the response is a table, and we also define the column names and, via their respective keys, under which column each value must be put. Using this information SUSI generates a response with the table schema and data points.

How do we know when to render a table?

We know it through the type attribute in the actions from the API response.

"actions": [{
  "type": "table",
  "columns": {
    "status": "Race Status",
    "count": "Number Of Races"
  },
  "count": -1
  }]
}],

We can see that the type is table so we now know that we have to render a table.

But what is the table schema? What do we fill it with?

There is a columns key under actions, and its value is an object whose key-value pairs give us the column names and which data to put under each column.

Here, we have two columns – Race Status and Number Of Races

And the data to put under each column is found in answers[0].data, with the same keys as those used for the column names, i.e. ‘status’ and ‘count’.

Sample data object from the API response:

{
  "statusId": "2",
  "count": "1",
  "status": "Disqualified"
}

The value under the ‘status’ key is ‘Disqualified’ and the column name for the ‘status’ key is ‘Race Status’, so Disqualified is entered under the Race Status column in the table. Similarly, 1 is entered under the Number Of Races column. We thus have a row of our table. We populate the table for each object in the data array using the same procedure.

let coloumns = data.answers[0].actions[index].columns;
let count = data.answers[0].actions[index].count;
let table = drawTable(coloumns,data.answers[0].data,count);

We also have a ‘count’ attribute in the API response. It denotes how many rows to populate in the table. If count = -1, it means infinite, i.e. display all the results.

function drawTable(coloumns, tableData, count) {

  let parseKeys;
  let showColName = true;

  if (coloumns.constructor === Array) {
    parseKeys = coloumns;
    showColName = false;
  }
  else {
    parseKeys = Object.keys(coloumns);
  }

  let tableheader = parseKeys.map((key, i) => {
    return (<TableHeaderColumn key={i}>{coloumns[key]}</TableHeaderColumn>);
  });

  let rowCount = tableData.length;

  if (count > -1) {
    rowCount = Math.min(count, tableData.length);
  }

  let rows = [];

  for (var j = 0; j < rowCount; j++) {

    let eachrow = tableData[j];

    let rowcols = parseKeys.map((key, i) => {
      return (
        <TableRowColumn key={i}>
          <Linkify properties={{target: '_blank'}}>
            {eachrow[key]}
          </Linkify>
        </TableRowColumn>
      );
    });

    rows.push(
      <TableRow key={j}>{rowcols}</TableRow>
    );

  }

  const table =
    <MuiThemeProvider>
      <Table selectable={false}>
        <TableHeader displaySelectAll={false} adjustForCheckbox={false}>
          {showColName && <TableRow>{tableheader}</TableRow>}
        </TableHeader>
        <TableBody displayRowCheckbox={false}>{rows}</TableBody>
      </Table>
    </MuiThemeProvider>;

  return table;

}

Here we first determine how many rows to populate using the count attribute and then parse the columns to get the column names and keys. We then loop through the data and populate each row.

This is how SUSI responds with tabulated data!

You can create your own table skill and SUSI will give the tabulated response you need. Check out this tutorial to know more about SUSI and the various other action types it supports.

