Adding Event Overview Route in Open Event Frontend

In Open Event Frontend we have an event overview route, which acts as a mini dashboard for an event, showing information about sponsors, general info, roles, tickets, event setup and so on. Each piece of information lives in its own component, and the dashboard is composed of these components. To create this dashboard we will first create its components.

To create a component we will use the following Ember CLI command:

ember g component <component-name>

This command will give us three files: a template, a component and a test file corresponding to that component. We will use this command to generate all our components.

Now let's discuss each component separately and see how they are combined to form this route.

The event-setup-checklist component uses Semantic UI's steps to maintain a checklist of basic details, sponsors, sessions & microlocations, call for speakers, and session and speaker form customization, so that it becomes easy to identify which steps are complete and which are not.

Next is the general-info component, which shows basic information about an event such as start time, end time, location, number of speakers, number of sponsors etc. It also shows whether the event is live or not.

In the manage-roles component we manage the role of a given person: we can add people, assign different roles to them, and edit the roles of existing people. We can also see who has been invited for a given role and who has accepted the invitation.

In the event-sponsors component we manage the sponsors of the event: we can edit an existing sponsor, add a new sponsor with its logo, name, type and level, and delete an existing sponsor.

Next is the ticket component, which displays the number of orders, the number of tickets sold, and the total sales. It also shows how many tickets of each type have been sold.

Next is our app component, which offers two choices: generating an Android app for the event and generating a web app for the event.

Finally, in our view.index template we add these components using a UI stackable grid layout. Whenever we want to conditionally show or hide a component, we can do that in the same template, which makes it easy to manage a large amount of content on a single page.

Resources


Displaying Upcoming Sessions at a Microlocation in Open Event Android

"When I am attending a session in a room, I don't get information on what is coming up."

The issue the user expressed was that they wanted to know the upcoming sessions at a microlocation. When I took up this issue in Open Event Android a few days back, I was thinking about how this could be implemented. The app should be easy to use even for non-developers, and thus any new feature shouldn't be too complex in its implementation. We decided upon doing the following:

  • Adding an “upcoming” option in the options menu of the Location activity.
  • This option’s purpose was to trigger the app to show information about the upcoming session in that microlocation.

Initial changes in LocationActivity.java

First of all, we added a new icon to the options menu of LocationActivity.java. One of the things we learnt here was to use the ifRoom|collapseActionView value for the app:showAsAction attribute as often as possible. This value ensures that the title in the options menu remains visible at all times, irrespective of whether the option icons are shown.

So if the title is too big and there is too little room for the options to appear individually, then instead of squeezing the title, "ifRoom" collapses the option icons into a 3-dot drop-down list, with all the options appearing in the drop-down.

Something like this:

The icon’s XML element and UI looked something like this:

<item
    android:id="@+id/upcoming_sessions"
    android:icon="@drawable/ic_timeline_white_24dp"
    android:title="@string/upcoming"
    app:showAsAction="ifRoom|collapseActionView"
    app:actionViewClass="android.support.v7.widget.Button"/>
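
For context, a menu item like this lives in a menu resource that the activity inflates in onCreateOptionsMenu. The sketch below shows the usual pattern; the resource name R.menu.menu_location and the stored menu field are assumptions for illustration, not necessarily the exact names used in the app.

@Override
public boolean onCreateOptionsMenu(Menu menu) {
    // Inflate the options menu that contains the "upcoming" item
    // (the menu resource name here is assumed for illustration)
    getMenuInflater().inflate(R.menu.menu_location, menu);
    this.menu = menu; // keep a reference so option groups can be toggled later
    return true;
}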

The drawable icon that you see in the screenshot above was a tough find. Before I talk about how I came across it, I will talk about adding an icon in Android Studio.

How to add an icon in Android Studio?

At a basic level, adding an icon in Android Studio means adding a drawable. You can find all drawables under the app/src/main/res/drawable folder.

To add a new drawable, right-click on the drawable folder and go to New -> Vector Asset. A window similar to the one shown below will appear.

Now, on selecting the "icon" option you will be taken to a huge list of icons that you can add to your app and use subsequently. The problem is that it can be tough to find an icon that fits your purpose; in my case, there was no direct icon for "upcoming". This is when we had to do something more: we browsed this amazing site by Google, https://material.io/icons/. It shows all the available icons in a much more interactive way, and it was a lot easier to find the icon we wanted there.

The vector drawable file for the icon we chose looks like this:

<vector xmlns:android="http://schemas.android.com/apk/res/android"
       android:width="24dp"
       android:height="24dp"
       android:viewportWidth="24.0"
       android:viewportHeight="24.0">
   <path
       android:fillColor="#FFFFFF"
       android:pathData="M23,8c0,1.1 -0.9,2 -2,2 -0.18,0 -0.35,-0.02 -0.51,-0.07l-3.56,3.55c0.05,0.16 0.07,0.34 0.07,0.52 0,1.1 -0.9,2 -2,2s-2,-0.9 -2,-2c0,-0.18 0.02,-0.36 0.07,-0.52l-2.55,-2.55c-0.16,0.05 -0.34,0.07 -0.52,0.07s-0.36,-0.02 -0.52,-0.07l-4.55,4.56c0.05,0.16 0.07,0.33 0.07,0.51 0,1.1 -0.9,2 -2,2s-2,-0.9 -2,-2 0.9,-2 2,-2c0.18,0 0.35,0.02 0.51,0.07l4.56,-4.55C8.02,9.36 8,9.18 8,9c0,-1.1 0.9,-2 2,-2s2,0.9 2,2c0,0.18 -0.02,0.36 -0.07,0.52l2.55,2.55c0.16,-0.05 0.34,-0.07 0.52,-0.07s0.36,0.02 0.52,0.07l3.55,-3.56C19.02,8.35 19,8.18 19,8c0,-1.1 0.9,-2 2,-2s2,0.9 2,2z"/>
</vector>

What would the upcoming icon do?

Keeping in mind the necessity for the feature to be less complex, I decided that the upcoming icon will lead the user to a dialog box that shows the status of upcoming sessions in that microlocation. The implementation of this feature involved two main things:

  1. Finding out the upcoming session from the list of sessions in the microlocation.
  2. Generating a dialog box that shows information about that session.

Finding position of upcoming session in Recycler View

The upcoming session is the session whose start time comes after the current time, so the approach was simple.

  1. Run a loop on a sorted list of all sessions in a microlocation.
  2. Find out every session’s start time.
  3. Compare the start time of every session with the current time.
  4. Find the first session whose start time comes after the current time.
  5. Store that session’s position, name, ID and other stuff like track name and track color.
  6. Break out of the loop.

This was the basic logic or algorithm, so to say. Here’s the implementation in the upcomingSession() function:

public void upcomingSession(){
   String upcomingTitle = "";
   String track = "";
   String color = null;
   Date current = new Date();
   for (Session sess:sortedSessions){
       try {
           Date start = DateUtils.getDate(sess.getStartsAt());
           if (start.after(current)){
               upcomingTitle = sess.getTitle();
               track = sess.getTrack().getName();
               color = sess.getTrack().getColor();
               break;
           }
       } catch (ParseException e) {
           e.printStackTrace();
       }
   }
   // upcomingTitle, track and color are then used to populate the upcoming-session dialog
}

Now, displaying a dialog box with all the necessary information is easy once you have the required information, so I'll just provide the code for it here without explaining much about it.

The initialisations:

public void upcomingSessionsInitial(){
   upcomingDialogBox = new Dialog(this);
   upcomingDialogBox.setContentView(R.layout.upcoming_dialogbox);
   trackImageIcon = (ImageView) upcomingDialogBox.findViewById(R.id.track_image_drawable);
   upcomingSessionText = (TextView) upcomingDialogBox.findViewById(R.id.upcoming_session_textview);
   upcomingSessionTitle = (TextView) upcomingDialogBox.findViewById(R.id.upcoming_Session_title);
   Button dialogButton = (Button) upcomingDialogBox.findViewById(R.id.upcoming_button);
   dialogButton.setOnClickListener(new View.OnClickListener() {
       @Override
       public void onClick(View view) {
           upcomingDialogBox.dismiss();
       }
   });
}

The calling:

switch (item.getItemId()){
       case R.id.action_map_location:
           FragmentManager fragmentManager = getSupportFragmentManager();
           FragmentTransaction fragmentTransaction = fragmentManager.beginTransaction();

           Bundle bundle = new Bundle();
           bundle.putBoolean(ConstantStrings.IS_MAP_FRAGMENT_FROM_MAIN_ACTIVITY, false);
           bundle.putString(ConstantStrings.LOCATION_NAME, location);

           Fragment mapFragment = ((OpenEventApp)getApplication())
                   .getMapModuleFactory()
                   .provideMapModule()
                   .provideMapFragment();
           mapFragment.setArguments(bundle);
           fragmentTransaction.replace(R.id.content_frame_location, mapFragment, FRAGMENT_TAG_LOCATION).addToBackStack(null).commit();

           sessionRecyclerView.setVisibility(View.GONE);
           noSessionsView.setVisibility(View.GONE);
           menu.setGroupVisible(R.id.menu_group_location_activity, false);
           return true;
       case android.R.id.home:
           onBackPressed();
           getSupportFragmentManager().popBackStack();
           sessionRecyclerView.setVisibility(View.VISIBLE);
           return true;
       case R.id.upcoming_sessions:
           upcomingDialogBox.show();
           return true;
       default:
           return true;
   }
}

Final result:

This is the final result or solution that we generated for the issue that was addressed by one of the users:


File Upload Validations on Open Event Frontend

In Open Event Frontend we have used Semantic UI's form validation to validate different fields of a form. There are certain instances in our app where the user has to upload a file, and it must be validated against the suggested formats before being uploaded to the server. Here we will discuss how to perform this validation.

Semantic UI lets us validate a field by passing an object along with rules for its validation. For fields like email and contact number we can pass the type as email or number respectively, but to validate a file we have to pass a regular expression with the allowed extensions. The following walks through the process.

fields : {
  file: {
    identifier : 'file',
    rules      : [
      {
        type   : 'empty',
        prompt : this.l10n.t('Please upload a file')
      },
      {
        type   : 'regExp',
        value  : '/^(.*.((zip|xml|ical|ics|xcal)$))?[^.]*$/i',
        prompt : this.l10n.t('Please upload a file in suggested format')
      }
    ]
  }
}

Here we have passed the file element (which is to be validated) inside our fields object. The identifier, which for this field is 'file', can match the id, name or data-validate property of the field element. After that we pass an array of rules against which the field element is validated. The first rule shows the error message given in the prompt field if the field is empty.

The next rule checks the file against the allowed file extensions. The type of this rule is regExp, as we are passing a regular expression, which is as follows:

/^(.*.((zip|xml|ical|ics|xcal)$))?[^.]*$/i

It is a little complex to explain from the beginning, so let us break it down from the end:

  • $ matches the end of the string.
  • [^.]* is a negated set: it matches any character that is not a dot, and * means zero or more of the preceding token.
  • ( … )? makes the whole group before the ? optional, i.e. it may or may not be present.
  • .*.((zip|xml|ical|ics|xcal)$) is the first capturing group; its tokens are combined with the inner group (zip|xml|ical|ics|xcal) to match one of the allowed extensions at the end of the file name.
  • ^ matches the beginning of the string.

The above regular expression accepts all files with zip/xml/ical/ics/xcal extensions, which are the allowed formats for the event source file.

References

  • Ivaylo Gerchev's blog on form validation in Semantic UI
  • Drmsite blog on Semantic UI form validation
  • Semantic UI form validation docs
  • Stack Overflow: regex for file extension

Custom SeekBar in PSLab Android

By default, a SeekBar in Android only returns non-negative integer values. But there are situations where we need the SeekBar to return float and negative values. To implement the trigger functionality in the Oscilloscope activity of the PSLab Android app, we require a SeekBar that sets the voltage level which should trigger the capture sequence. Since this voltage ranges from -16.5 V to 16.5 V, the default SeekBar doesn't serve the purpose.

The solution is to create a custom SeekBar that returns float values. Let’s understand how to implement it.

Create a FloatSeekBar class which extends AppCompatSeekBar, and create a constructor for the class with Context, AttributeSet and defStyle as parameters. AttributeSet is the set of properties specified in an XML resource file, whereas defStyle is the default style to apply to this view.

public class FloatSeekBar extends android.support.v7.widget.AppCompatSeekBar {
   private double max = 0.0;
   private double min = 0.0;

   public FloatSeekBar(Context context, AttributeSet attrs, int defStyle) {
       super(context, attrs, defStyle);
       applyAttrs(attrs);
   }

 

Then define the setters method, which sets the max and min values of the SeekBar. This method basically sets the range of the SeekBar.

public void setters(double a, double b)
{
   min = a;
   max = b;
}

 

getValue is a method that maps the current progress of the SeekBar onto the configured range and returns the value. The equation used to determine the value is (max - min) * (getProgress() / getMax()) + min.

public double getValue() {
   DecimalFormat df = new DecimalFormat("#.##");
   Double value = (max - min) * ((float) getProgress() / (float) getMax()) + min;
   value = Double.valueOf(df.format(value));
   return value;
}

 

The setValue method takes a double value and sets the progress of the SeekBar accordingly.

public void setValue(double value) {
   setProgress((int) ((value - min) / (max - min) * getMax()));
}

This completes the custom SeekBar; it can be used just like a normal SeekBar.

Now, set the range of custom SeekBar between -16.5 and 16.5.

seekBarTrigger = (FloatSeekBar) v.findViewById(R.id.seekBar_trigger);
seekBarTrigger.setters(-16.5, 16.5);

 

To get the value of the custom SeekBar, call the getValue method.

seekBarTrigger.getValue();
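
To react to slider movement, one could attach a standard OnSeekBarChangeListener and read the mapped value inside it. The snippet below is only a sketch under that assumption, not the exact PSLab code:

seekBarTrigger.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
    @Override
    public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
        // getValue() converts the integer progress into the -16.5 V to 16.5 V range
        double triggerVoltage = seekBarTrigger.getValue();
        // use triggerVoltage, e.g. show it in a TextView or pass it to the capture routine
    }

    @Override
    public void onStartTrackingTouch(SeekBar seekBar) {}

    @Override
    public void onStopTrackingTouch(SeekBar seekBar) {}
});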

To follow the entire code for the custom SeekBar implementation in PSLab, refer to FloatSeekBar.java, Attr.xml and TimebaseTrigger.java.

A glimpse of custom SeekBar in PSLab Android.

Resources

 


Comparing Different Graph View Libraries and Integrating Them in PSLab Android Application

Graphs play a significant role in PSLab. We need to implement real-time graphs that stream data from the PSLab device efficiently, so it is necessary to analyze each Graph View library, compare them, and integrate the best one into the PSLab Android app.

Available Graph Libraries

The available Graph View libraries of Android are:

  1. MPAndroidChart
  2. Graph-View
  3. SciChart

Which one is the best with respect to the PSLab project?

MPAndroidChart

Line Graph plotted using MPAndroidChart (image source)

It is an open source graph view library by Philipp Jahoda. The following are the features of MPAndroidChart

  • There are 8 different chart types
  • Scaling on both axes. Scaling can be done using pinch zoom gesture.
  • Dual axes: we can have two Y axes.
  • Real time support
  • Customizable axes, i.e. we can define different labels for the axes
  • Save chart to SD-Card
  • Predefined color templates
  • Legends which are used to define which line depicts what.
  • Animations
  • Fully customizable, from background color to color of the lines and grids.

On trying MPAndroidChart, I found it slightly difficult to implement.

Graph-View

Line Graph plotted using GraphView Library (image source)

It is also an open source graph view library by Jonas Gehring. The following are features of the Graph-View

  • Supports Line Chart, Bar Chart and Points.
  • Scrolling vertical and horizontal
  • Scaling on both axes.
  • Realtime Graph support
  • Draw multiple series of data, letting the diagram show more than one series in a graph. You can set a color and a description for every series.
  • Legends (as discussed in MPAndroidChart)
  • Custom labels
  • Manual Y axis limits can be set.

SciChart

It has rich APIs for axis ranging, label formatting, chart modifiers (interaction) and renderable series. It is packed with features, but unfortunately it is not open source.

The Verdict

Both MPAndroidChart and Graph-View are good libraries, packed with a lot of features. GraphView is easier to implement than MPAndroidChart (which is not that difficult either). Both offer features like pinch zoom; MPAndroidChart also allows scale adjustment even while the graph is being plotted. The rate of plotting was comparable in both, but slightly faster in MPAndroidChart. So, while GraphView is easier to implement, MPAndroidChart has slightly better performance, and we therefore integrated MPAndroidChart into the PSLab Android application.

Integrating MPAndroidChart in PSLab Android App

To integrate MPAndroidChart into the Android project, add the following line to the build.gradle of your project.

compile 'com.github.PhilJay:MPAndroidChart:v3.0.1'

Creating Oscilloscope like graph

If we observe an oscilloscope, it has a black/blue screen with grid lines. An oscilloscope plots voltage versus time, so the x axis represents the time elapsed and the y axis the voltage of the signal at that instant of time. There are left and right y axes for different channels.

An Oscilloscope

In order to implement a graph similar to that of Oscilloscope in PSLab Android App using MPAndroidChart library, the graph needed to be customized.

The following steps were taken to customize the graph in the Oscilloscope activity.

Background Color

mChart.setBackgroundColor(Color.BLACK);

This sets the background color of the graph as black. mChart is an object of the Line graph.

Legend

Legend l = mChart.getLegend();
l.setForm(Legend.LegendForm.LINE);
l.setTextColor(Color.WHITE);

Here we are setting the Legend form. There are many options available for the same like SQUARE, CIRCLE, and LINE. We are using LINE Legend form.  Also, we set the white color for the legend text.

X Axis Customization

x = mChart.getXAxis();
x.setTextColor(Color.WHITE);

First, we create an object of XAxis and set the textcolor as white.

x.setDrawGridLines(true);

The above method draws the grid lines along the x axis.

x.setAxisMinimum(0f);
x.setAxisMaximum(875f);

Now we set the range of the x axis, with 0 as the minimum value and 875 as the maximum.

Y Axis  Customization

y1 = mChart.getAxisLeft();
y1.setTextColor(Color.WHITE);
y1.setAxisMaximum(16f);
y1.setAxisMinimum(-16f);
y1.setDrawGridLines(true);

This is similar to what we did in x axis formatting.
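
The right y axis, used for the second channel, can be customised in the same way. The snippet below is a sketch along those lines; the variable name y2 and the exact limits are assumptions rather than the exact code from the app:

// Configure the right y axis for the second channel
YAxis y2 = mChart.getAxisRight();
y2.setTextColor(Color.WHITE);   // white labels, matching the left axis
y2.setAxisMaximum(16f);         // assumed +/-16 V range, like the left axis
y2.setAxisMinimum(-16f);
y2.setDrawGridLines(true);      // grid lines for the second channel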

After performing the above steps we got the following results.

To follow the entire code for graph customization, refer to the chartinit method in the Oscilloscope activity of the PSLab Android repository.

Resources


Plotting Digital Logic Lines In PSLab Android App

The PSLab device offers Logic Analyzer functionality. A logic analyzer is a laboratory instrument that can capture and display digital signals from a digital system or circuit. It is to digital signals what an oscilloscope is to analog signals, and it is used to study the timing relationship between different logic lines. It plots the logic lines/timing diagram, which tells us the state of the digital system at any instant of time. For example, in the image below we can study the states of the digital signals from channels ID1, ID2 and ID3 at different times and find parameters like the propagation delay. It is also used to find errors in integrated circuits (ICs) and to debug logic circuits.

How did I plot ideal logic lines using the MPAndroidChart library?

The conventional method of adding data points results in the plot illustrated in the image below. By conventional method I mean simply adding Y-axis (logic state) values corresponding to X-axis values (timestamps).

Result with normal adding and plotting data-points

In the above plot, the logic lines show non-ideal behaviour, i.e. they appear to take some time to change their state from high to low. This non-ideal behaviour becomes more visible when the user zooms into the graph to analyse timestamps.

Solution to how we can achieve ideal behaviour of logic lines:

A better solution is to make use of timestamps for generating the logic lines, i.e. the time instants at which the logic made a transition from HIGH -> LOW or LOW -> HIGH. Let's work through an example:

Timestamps = { 1, 3, 5, 8, 12 } and the initial state is HIGH (i.e. at t = 0 it's HIGH). This implies that at t = 1 a transition from HIGH to LOW took place, so at t = 0 it's HIGH, at t = 1 it's both HIGH and LOW, and at t = 2 it's LOW.
Now at t = 0 and t = 2 you can simply put y = 1 and y = 0 respectively. But how do you add a data point for t = 1? The trick is to look at how the transition takes place: if it's HIGH to LOW, then first add 1 for t = 1 and then 0 for t = 1.
So the set of points looks something like this:

( Y, X ) ( LOGIC , TIME ) -> ( 1, 0 ) ( 1, 1 ) ( 0, 1) ( 0, 2 ) ( 0, 3 ) ( 1, 3 )  ( 1, 4 ) …

Code snippet for adding coordinates in this fashion:

int[] time = timeStamps.get(j);
for (int i = 0; i < time.length; i++) {
   if (initialState) {
       // Transition from HIGH -> LOW
       tempInput.add(new Entry(time[i], 1));
       tempInput.add(new Entry(time[i], 0));
   } else {
       // Transition from LOW -> HIGH
       tempInput.add(new Entry(time[i], 0));
       tempInput.add(new Entry(time[i], 1));
   }

   // changing state variable
   initialState = !initialState;
}
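
Once the entries are collected, they still have to be handed to the chart as a line data set. The snippet below is a rough sketch of that step, assuming mChart is the LineChart instance; the data set label and styling calls are assumptions rather than the exact PSLab code:

// One data set per digital channel, e.g. ID1
LineDataSet channelSet = new LineDataSet(tempInput, "ID1");
channelSet.setDrawCircles(false);   // plain lines, no point markers
channelSet.setLineWidth(2f);

LineData lineData = new LineData(channelSet);
mChart.setData(lineData);
mChart.invalidate();                // refresh the chart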

After adding data points in the above-mentioned way, we successfully obtained ideal logic lines, as illustrated in the image below.

Resources


Adding JSONAPI Support in Open Event Android App

The Open Event API Server exposes a well-documented JSONAPI compliant REST API that can be used in the Open Event App Generator and Frontend to access and manipulate data. So support for JSONAPI also needs to be added in external services like the Open Event App Generator and Frontend. In this post I explain how to add JSONAPI support in Android.

There are many client libraries to implement JSONAPI support in Android or Java, like moshi-jsonapi, morpheus etc.; you can find the list here. The main problem is that most of these libraries require models to inherit attributes from a Resource model, but in the Open Event Android App our models already inherit from the RealmObject class, and in Java we can't inherit from more than one class. So we will be using the jsonapi-converter library, which uses annotations to add JSONAPI support.

1. Add dependency

To use jsonapi-converter in your app, add the following dependency to your app module's build.gradle file.

dependencies {
	compile 'com.github.jasminb:jsonapi-converter:0.7'
}

2.  Write model class

Models will be used to represent requests and responses. To support JSONAPI we need to take care of the following when writing the models:

  • Each model class must be annotated with com.github.jasminb.jsonapi.annotations.Type annotation
  • Each class must contain a String attribute annotated with com.github.jasminb.jsonapi.annotations.Id annotation
  • All relationships must be annotated with com.github.jasminb.jsonapi.annotations.Relationship annotation

In Open Event Android we have many models, like event, session, track, microlocation, speaker etc. Here I am only defining the track model because of its simplicity.

@Type("track")
public class Track extends RealmObject {

    @Id(IntegerIdHandler.class)
    private int id;
    private String name;
    private String description;
    private String color;
    private String fontColor;
    @Relationship("sessions")
    private RealmList<Session> sessions;

    //getters and setters
}

Jsonapi-converter uses Jackson for data parsing. To know how to use Jackson for parsing follow my previous blog.

The Type annotation is used to instruct the serialization/deserialization library on how to process the given model class. Each resource must have an id attribute, and the Id annotation is used to flag an attribute of a class as the id. In the above class the id attribute is an int, so we need to specify the IntegerIdHandler class (a resource id handler) in the annotation. The Relationship annotation is used to designate other resource types as a relationship; its value should match the JSONAPI specification of the server. In the Open Event Project each track has sessions, so we need to add a Relationship annotation for them.

3.  Setup API service and retrofit

After defining the models, define the API service interface as you usually would with standard JSON APIs.

public interface OpenEventAPI {
    @GET("tracks?include=sessions&fields[session]=title")
    Call<List<Track>> getTracks();
}

Now create an ObjectMapper & a retrofit object and initialize them.

ObjectMapper objectMapper = OpenEventApp.getObjectMapper();
Class[] classes = {Track.class, Session.class};

OpenEventAPI openEventAPI = new Retrofit.Builder()
                    .client(okHttpClient)
                    .baseUrl(Urls.BASE_URL)
                    .addConverterFactory(new JSONAPIConverterFactory(objectMapper, classes))
                    .build()
                    .create(OpenEventAPI.class);

 

The classes array contains all the model classes that will be supported by this Retrofit builder and API service. The main task here is to add a JSONAPIConverterFactory, which will be used to serialize and deserialize data according to the JSONAPI specification. The JSONAPIConverterFactory constructor takes two parameters: the ObjectMapper and the list of classes.

4.  Use API service  

After setting everything up according to the above steps, you can use the openEventAPI instance to fetch data from the server.

openEventAPI.getTracks();
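
The returned Call can then be executed asynchronously in the usual Retrofit way. The following is a minimal sketch of such a call, not the exact code used in the app:

openEventAPI.getTracks().enqueue(new Callback<List<Track>>() {
    @Override
    public void onResponse(Call<List<Track>> call, Response<List<Track>> response) {
        if (response.isSuccessful()) {
            // Tracks are already deserialized from the JSONAPI payload,
            // including the related sessions declared via @Relationship
            List<Track> tracks = response.body();
        }
    }

    @Override
    public void onFailure(Call<List<Track>> call, Throwable t) {
        t.printStackTrace();
    }
});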

Conclusion

JSON API is designed to minimize both the number of requests and the amount of data transmitted between clients and servers.


Generating Real-Time Graphs in PSLab Android App

In the PSLab Android app, we need to log data from the sensors and correspondingly generate real-time graphs. A real-time graph is a data-streaming chart that automatically updates itself every n seconds. This is different from what we did for the Oscilloscope graph; here we need to determine the relative time at which the data is recorded from the sensor by the PSLab.

Another thing we needed to take care of was the range of the x axis. Since the streamed data is ever growing, setting a large x-axis range would only make reading sensor data tedious for the user. The solution was to make a real-time rolling-window graph: when the data exceeds the maximum range of the x axis, the graph stops showing the initial plots. For example, if the graph is set to show only a 10-second window, then when the 11th-second data is plotted, the 1st-second data is no longer shown, maintaining the difference between the maximum and minimum of the visible range. The graph library we are going to use is MPAndroidChart. Let's break down the implementation step by step.

First, we create a long variable, startTime, which records the time at which the entire process starts. This is the reference time; a flag makes sure it is set only once and controls when it is reset.

if (flag == 0) {
   startTime = System.currentTimeMillis();
   flag = 1;
}

 

We used the AsyncTask approach, in which the data from the sensors is acquired in a background thread and the graph is updated on the UI thread. Here we consider the example of the HMC5883L sensor, which is a magnetometer. We calculate the time elapsed by subtracting startTime from the current time, and the result is taken as the x coordinate.

private class SensorDataFetch extends AsyncTask<Void, Void, Void> {
   ArrayList<Double> dataHMC5883L = new ArrayList<Double>();
   long timeElapsed;

   @Override
   protected Void doInBackground(Void... params) {

      // dataHMC5883L is populated here with the values read from the sensor (omitted)

      timeElapsed = (System.currentTimeMillis() - startTime) / 1000;

      entriesbx.add(new Entry((float) timeElapsed, dataHMC5883L.get(0).floatValue()));
      entriesby.add(new Entry((float) timeElapsed, dataHMC5883L.get(1).floatValue()));
      entriesbz.add(new Entry((float) timeElapsed, dataHMC5883L.get(2).floatValue()));

      return null;
   }

 

As we need to create a rolling-window graph, we have to add a few lines of code to the standard MPAndroidChart implementation. This entire code is placed in the onPostExecute method of the AsyncTask. The following code sets the data for the line chart and tells the chart that new data has been acquired. It is very important to call notifyDataSetChanged; without it things won't work.

mChart.setData(data);
mChart.notifyDataSetChanged();

 

Now we set the visible range of the x axis. This means that the graph window won't start rolling until the range set by this method is reached. Here we set it to 10, as we need a 10-second window.

mChart.setVisibleXRangeMaximum(10);

Then we call the moveViewToX method to move the view to the latest entry of the graph. Here we pass data.getEntryCount(), which returns the number of data points in the data set.

mChart.moveViewToX(data.getEntryCount());
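
Putting these pieces together, the onPostExecute callback might look roughly like the sketch below; the data set construction and label are assumptions for illustration, not the exact app code:

@Override
protected void onPostExecute(Void aVoid) {
    super.onPostExecute(aVoid);

    // Wrap the accumulated entries into a data set (shown here for the x component only)
    LineDataSet dataSetBx = new LineDataSet(entriesbx, "Bx");
    LineData data = new LineData(dataSetBx);

    mChart.setData(data);
    mChart.notifyDataSetChanged();        // tell the chart new data has arrived
    mChart.setVisibleXRangeMaximum(10);   // keep a 10-second rolling window
    mChart.moveViewToX(data.getEntryCount());
}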

 

We will get the following results.

To see the entire code visit this link.

Resources


Sending Data between components of SUSI MagicMirror Module

The SUSI MagicMirror module adds the SUSI assistant right to your MagicMirror. The MagicMirror software consists of an Electron app to which modules can be added easily. Since there are many modules, some functionality needs interaction between modules by transferring information. MagicMirror also provides a node_helper script that lets a module perform background tasks, so a mechanism to transfer information from node_helper to the various components of a module is also needed.

MagicMirror provides an inbuilt module notification system that can be used to send notifications across modules, and a socket notification system to send information between node_helper and various components of the system.

Our codebase for SUSI MagicMirror is divided mainly into two parts: a Main module that handles hotword detection, speech recognition, calling the SUSI API and saving audio after Text to Speech, and a Renderer module that manages the display of content on the mirror screen and plays back the file obtained from speech synthesis. Plainly put, the Main module handles the backend logic of the application and the Renderer handles the frontend. The Main and Renderer modules work on different layers of the application, and to facilitate communication between them we need a mechanism. A schematic of the flow that needs to be maintained is shown below:

As you can see in the above diagram, we need to transfer a lot of information between the components. We display animation and text based on the current recognition state in the module, so we need to transfer this information frequently. This task is accomplished using the inbuilt socket notification system of MagicMirror. For every event, like when the system enters the listening, busy or recognized-speech state, we need to pass a message to the renderer. To achieve this, we made a rendererSend function to send notifications to the renderer.

const rendererSend =  (event: NotificationType , payload: any) => {
   this.sendSocketNotification(event, payload);
}

This function takes an event and a payload as arguments. Event tells which event occurred and payload is any data that we wish to send. This method in turn calls the method provided by MagicMirror module to send socket notifications within the module.

When certain events occur, such as the system entering the busy or listening state, we trigger a rendererSend call to send a socket notification to the module. The rendererSend method is supplied in the State Machine components available to every state. Notifications can then be sent using a code snippet like the following:

// system enters busy state
this.components.rendererSend("busy", {});
// send speech recognition hypothesis text to renderer
this.components.rendererSend("recognized", {text: recognizedText});
// send susi api output json to renderer to display interactive results while Speech Output is performed
this.components.rendererSend("speak", {data: susiResponse});

The socket notification sent via the above method is received in the SUSI module via a callback called socketNotificationReceived. We need to implement this callback while registering the module with MagicMirror. So, we register the MMM-SUSI-AI module and add the definition of the socketNotificationReceived method.

Module.register("MMM-SUSI-AI", {
//other function definitions
***
   // define socketNotificationReceived function
   socketNotificationReceived: function (notification, payload) {
       susiMirror.receivedNotification(notification, payload);
   },
***
});

In this way, we forward every notification received to the susiMirror object in the renderer module by calling its receivedNotification method.

We can now receive all the notifications in susiMirror and update the UI. To handle them, we define the receivedNotification method as follows:

public receivedNotification(type: NotificationType, payload: any): void {

   this.visualizer.setMode(type);
   switch (type) {
       case "idle":
            // handle idle state
           break;
       case "listening":
           // handle listening state
           break;
       case "busy":
           // handle busy state
         break;
       case "recognized":
           // handle recognized state. This notification also contains a payload about the hypothesis text           
           break;
       case "speak":
           // handle speaking state. We need to play back audio file and display text on screen for SUSI Output. Notification Payload contains SUSI Response
           break;
   }
}

In this way, we utilize the Socket Notification System provided by the MagicMirror Electron Application to send data across the components of Magic Mirror module for SUSI AI.

Resources


Implementing Hiding App Bar of SUSI Web Chat Application

In the SUSI Web Chat application we got a requirement to build a responsive app bar for static pages, and another requirement to show and hide the app bar when the user scrolls. Basically, this is how it should work: the app bar should be hidden after the user scrolls down to a certain extent, and it should appear again when the user scrolls up.

First we tried ready-made node packages for this task, but these packages are hard to customize, so we decided to build this feature from scratch using jQuery. This is how we built it.

First we installed the jQuery package using this command:

npm install jquery

Next we imported it on top of the application like this.

import $ from 'jquery'

We have discussed this app bar and how we made it in a previous blog post. Our app bar looks like this:

             <header className="nav-down" id="headerSection">
             <AppBar
               className="topAppBar"
               title={<img src="susi-white.svg" alt="susi-logo"
               className="siteTitle"/>}
               style={{backgroundColor:'#0084ff'}}
               onLeftIconButtonTouchTap={this.handleDrawer}
               iconElementRight={<TopMenu />}
             />
             </header>

We have to use these HTML elements to write the jQuery code, but we can't refer to HTML elements before they render, so we have to run our code soon after the render method executes. We can do that using a React lifecycle method: we add our code to the "componentDidMount()" method.
This is how we used jQuery inside the "componentDidMount()" lifecycle method. Here we get the height of the app bar using "$('header').outerHeight();".

componentDidMount(){
     var didScroll;
     var lastScrollTop = 0;
     var delta = 5;
     var navbarHeight = $('header').outerHeight();

Here we assigned the height of the app bar to “navbarHeight” variable.

     $(window).scroll(function(event){
         didScroll = true;
     });

In this part we check whether the user has scrolled or not. If the user has scrolled, we set the value of "didScroll" to "true".
Now we have to define what to do when the user has scrolled.

     function hasScrolled() {
         var st = $(window).scrollTop();
         if(Math.abs(lastScrollTop - st) <= delta){

             return;
         }

Here we get the absolute difference between the last scroll position and the current one. If it is not greater than the delta value we defined, the function does nothing and simply returns.

         if (st > lastScrollTop && st > navbarHeight){
             $('header').removeClass('nav-down').addClass('nav-up');
         } else if(st + $(window).height() < $(document).height()) {
             $('header').removeClass('nav-up').addClass('nav-down');
         }
         lastScrollTop = st;
     }

Here we hide the app bar after the user has scrolled down more than the height of the app bar. If we need to change the height at which the app bar disappears, we just add a value to the condition like this:

if (st > lastScrollTop && st > navbarHeight + 200){

If the user has scrolled down more than that value, we change the element's class name from "nav-down" to "nav-up", and we change "nav-up" back to "nav-down" when the user scrolls up.
We defined CSS classes in the stylesheet to do these things and to animate the transitions.

header {
   background: #f5b335;
   height: 40px;
   position: fixed;
   top: 0;
   transition: top 0.5s ease-in-out;
   width: 100%;
}

.nav-up {
   top: -100px;
}

We have defined what needs to happen when the user scrolls.
Now we have to call this function when the user has scrolled.

     setInterval(function() {
         if (didScroll) {
             hasScrolled();
             didScroll = false;
         }
     }, 2500);
   }

If "didScroll" is "true", we execute the "hasScrolled()" function. We set a 2500-millisecond interval, so the app bar does not hide immediately after the user scrolls; the function is triggered up to 2.5 seconds later.
This is how we built the app bar hiding feature using React JS and jQuery.

Resources:

  • Learn more about React LifeCycle Methods http://busypeoples.github.io/post/react-component-lifecycle/
  • Use jQuery in React component: https://medium.com/@shuvohabib/using-jquery-in-react-component-the-refs-way-969de9aa651f