Open Event Server – Export Attendees as CSV File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Once an organizer makes an event public, others can view it and buy tickets if interested.

The organizer can see all the attendees in a detailed view in the event management dashboard, along with their statuses. The possible statuses are completed, placed, pending, expired, canceled, checked in and not checked in. The organizer can also take actions such as checking in an attendee.

If the organizer wants to download the list of all the attendees as a CSV file, they can do so by simply clicking on Export As and then on CSV.

Let us see how this is done on the server.

Server side – generating the Attendees CSV file

Here we will be using the csv module provided by Python for writing the CSV file.

import csv
  • We define a method export_attendees_csv which takes the list of attendees to be exported as its argument.
  • Next, we define the headers of the CSV file; these form its first row.
def export_attendees_csv(attendees):
   headers = ['Order#', 'Order Date', 'Status', 'First Name', 'Last Name', 'Email',
              'Country', 'Payment Type', 'Ticket Name', 'Ticket Price', 'Ticket Type']
  • A list called rows is defined. It holds the rows of the CSV file; as mentioned earlier, headers is the first row.
rows = [headers]
  • We iterate over each attendee in attendees and build a row for that attendee as a list of column values; every row corresponds to one attendee.
  • The newly formed row is added to the rows list.
for attendee in attendees:
   column = [str(attendee.order.get_invoice_number()) if attendee.order else '-',
             str(attendee.order.created_at) if attendee.order and attendee.order.created_at else '-',
             str(attendee.order.status) if attendee.order and attendee.order.status else '-',
             str(attendee.firstname) if attendee.firstname else '',
             str(attendee.lastname) if attendee.lastname else '',
             str(attendee.email) if attendee.email else '',
             str(attendee.country) if attendee.country else '',
             str(attendee.order.payment_mode) if attendee.order and attendee.order.payment_mode else '',
             str(attendee.ticket.name) if attendee.ticket and attendee.ticket.name else '',
             str(attendee.ticket.price) if attendee.ticket and attendee.ticket.price else '0',
             str(attendee.ticket.type) if attendee.ticket and attendee.ticket.type else '']

   rows.append(column)
  • rows contains the contents of the CSV file and hence it is returned.
return rows
  • We iterate over each item of rows and write it to the CSV file using the writer provided by the csv module.
from app.api.helpers.csv_jobs_util import export_attendees_csv

content = export_attendees_csv(attendees)
writer = csv.writer(temp_file)
for row in content:
    writer.writerow(row)
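
For a fuller picture, here is a minimal sketch of how these pieces could be tied together, assuming the export task writes into its own temporary file (the helper name write_attendees_csv and the use of tempfile are illustrative, not the actual server code):

import csv
import tempfile

from app.api.helpers.csv_jobs_util import export_attendees_csv


def write_attendees_csv(attendees):
    # Illustrative helper: dump the rows returned by export_attendees_csv
    # into a named temporary CSV file and hand back its path.
    rows = export_attendees_csv(attendees)
    with tempfile.NamedTemporaryFile('w', suffix='.csv', delete=False,
                                     newline='') as temp_file:
        writer = csv.writer(temp_file)
        for row in rows:
            writer.writerow(row)
        return temp_file.name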

Obtaining the Attendees CSV file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/attendees/csv

Here, event_identifier is the unique ID of the event. This endpoint starts a celery task on the server to export the attendees of the event as a CSV file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of the Celery task. If the task has completed successfully, the response will contain the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.
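
To illustrate the complete flow from a client's perspective, here is a small sketch using the requests library (the base URL and authorization header are placeholders, not part of the actual API documentation):

import requests

BASE_URL = 'https://api.example.com'               # placeholder server address
HEADERS = {'Authorization': 'JWT <access token>'}  # placeholder auth header

# Start the attendees CSV export task for event 1
start = requests.get(BASE_URL + '/v1/events/1/export/attendees/csv', headers=HEADERS)
task_url = start.json()['task_url']

# Poll the returned task URL until the export finishes
status = requests.get(BASE_URL + task_url, headers=HEADERS).json()
if status['state'] == 'SUCCESS':
    download_url = status['result']['download_url']
    print('CSV ready at', download_url)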


Open Event Server – Export Orders as CSV File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Once an organizer makes an event public, others can view it and buy tickets if interested.

The organizer can see all the orders in a detailed view in the event management dashboard, along with their statuses. The possible statuses are completed, placed, pending, expired and canceled.

If the organizer wants to download the list of all the orders as a CSV file, they can do so by simply clicking on Export As and then on CSV.

Let us see how this is done on the server.

Server side – generating the Orders CSV file

Here we will be using the csv module provided by Python for writing the CSV file.

import csv
  • We define a method export_orders_csv which takes the list of orders to be exported as its argument.
  • Next, we define the headers of the CSV file; these form its first row.
def export_orders_csv(orders):
   headers = ['Order#', 'Order Date', 'Status', 'Payment Type', 'Total Amount', 'Quantity',
              'Discount Code', 'First Name', 'Last Name', 'Email']
  • A list called rows is defined. It holds the rows of the CSV file; as mentioned earlier, headers is the first row.
rows = [headers]
  • We iterate over each order in orders and, for every order that is not deleted, build a row as a list of column values; every row corresponds to one order.
  • The newly formed row is added to the rows list.
for order in orders:
   if order.status != "deleted":
       column = [str(order.get_invoice_number()), str(order.created_at) if order.created_at else '',
                 str(order.status) if order.status else '', str(order.paid_via) if order.paid_via else '',
                 str(order.amount) if order.amount else '', str(order.get_tickets_count()),
                 str(order.discount_code.code) if order.discount_code else '',
                 str(order.user.first_name)
                 if order.user and order.user.first_name else '',
                 str(order.user.last_name)
                 if order.user and order.user.last_name else '',
                 str(order.user.email) if order.user and order.user.email else '']
       rows.append(column)
  • rows contains the contents of the CSV file and hence it is returned.
return rows
  • We iterate over each item of rows and write it to the CSV file using the writer provided by the csv module.
from app.api.helpers.csv_jobs_util import export_orders_csv

content = export_orders_csv(orders)
writer = csv.writer(temp_file)
for row in content:
    writer.writerow(row)
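
As a quick sanity check of the helper's behaviour, one could feed it a stand-in order object (FakeOrder below is purely illustrative; the real Order model lives in the application):

from app.api.helpers.csv_jobs_util import export_orders_csv


class FakeOrder:
    """Purely illustrative stand-in for the Order model."""
    def __init__(self):
        self.status = 'completed'
        self.created_at = '2018-06-01 10:00:00'
        self.paid_via = 'stripe'
        self.amount = 100
        self.discount_code = None
        self.user = None

    def get_invoice_number(self):
        return 'O#000001'

    def get_tickets_count(self):
        return 2


rows = export_orders_csv([FakeOrder()])
assert rows[0][0] == 'Order#'      # the header row comes first
assert rows[1][0] == 'O#000001'    # one row per non-deleted order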

Obtaining the Orders CSV file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/orders/csv

Here, event_identifier is the unique ID of the event. This endpoint starts a celery task on the server to export the orders of the event as a CSV file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of the Celery task. If the task has completed successfully, the response will contain the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.


Open Event Server – Export Event as a Pentabarf XML File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Once an organizer makes an event public, others can view it and buy tickets if interested.

To make event promotion easier, we also allow the event organizer to export the event as a Pentabarf XML file. Pentabarf XML stores events/conferences in a format that most scheduling applications can read, so that the particular event/conference can be added to the user's schedule.

Server side – generating the Pentabarf XML file

Here we will be using the pentabarf package for Python for parsing and creating the file.

from pentabarf.Conference import Conference
from pentabarf.Day import Day
from pentabarf.Event import Event
from pentabarf.Person import Person
from pentabarf.Room import Room
  • We define a class PentabarfExporter which has a static method export(event_id).
  • Query the event using the event_id passed and start forming the event in the required format:
event = EventModel.query.get(event_id)
diff = (event.ends_at - event.starts_at)

conference = Conference(title=event.name, start=event.starts_at, end=event.ends_at,
                       days=diff.days if diff.days > 0 else 1,
                       day_change="00:00", timeslot_duration="00:15",
                       venue=event.location_name)
dates = (db.session.query(cast(Session.starts_at, DATE))
        .filter_by(event_id=event_id)
        .filter_by(state='accepted')
        .filter(Session.deleted_at.is_(None))
        .order_by(asc(Session.starts_at)).distinct().all())
  • We have queried for the dates of the event and saved them in dates.
  • We will now iterate over each date and query the microlocations which have a session on that particular date.
for date in dates:
   date = date[0]
   day = Day(date=date)
   microlocation_ids = list(db.session.query(Session.microlocation_id)
                            .filter(func.date(Session.starts_at) == date)
                            .filter_by(state='accepted')
                            .filter(Session.deleted_at.is_(None))
                            .order_by(asc(Session.microlocation_id)).distinct())
  • For each microlocation thus obtained, we will query for accepted sessions to be held at those microlocations.
  • We will also initialize a Room for each microlocation.
for microlocation_id in microlocation_ids:
   microlocation_id = microlocation_id[0]
   microlocation = Microlocation.query.get(microlocation_id)
   sessions = Session.query.filter_by(microlocation_id=microlocation_id) \
       .filter(func.date(Session.starts_at) == date) \
       .filter_by(state='accepted') \
       .filter(Session.deleted_at.is_(None)) \
       .order_by(asc(Session.starts_at)).all()

   room = Room(name=microlocation.name)
  • We will now iterate over the above-obtained sessions and instantiate an Event for each session.
  • Then we will iterate over all the speakers of that session and instantiate a Person for each speaker.
  • Finally, we will add that Event to the Room we created earlier.
for session in sessions:

   session_event = Event(id=session.id,
                         date=session.starts_at,
                         start=session.starts_at,
                         duration=str(session.ends_at - session.starts_at) + "00:00",
                         track=session.track.name,
                         abstract=session.short_abstract,
                         title=session.title,
                         type='Talk',
                         description=session.long_abstract,
                         conf_url=url_for('event_detail.display_event_detail_home',
                                          identifier=event.identifier),
                         full_conf_url=url_for('event_detail.display_event_detail_home',
                                               identifier=event.identifier, _external=True),
                         released="True" if event.schedule_published_on else "False")

   for speaker in session.speakers:
       person = Person(id=speaker.id, name=speaker.name)
       session_event.add_person(person)

   room.add_event(session_event)
  • Then we will add the room to the day and then add each day to the conference.
day.add_room(room)
conference.add_day(day)
  • Finally, we will call the generate method of the conference to generate the XML file. This can be directly written to the file.
return conference.generate("Generated by " + get_settings()['app_name'])
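
For illustration, the generated XML string could then be written straight to disk, for example from the Celery export task (the call site and file path below are assumptions):

# Hypothetical call site for the exporter
xml_string = PentabarfExporter.export(event_id)
with open('/tmp/event-pentabarf.xml', 'w') as export_file:
    export_file.write(xml_string)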

Obtaining the Pentabarf XML file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/pentabarf

Here, event_identifier is the unique ID of the event. This endpoint starts a Celery task on the server to export the event as a Pentabarf XML file. It returns the URL of the task, which can be used to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of the Celery task. If the task has completed successfully, the response will contain the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.

Hence, now the event can be added to any scheduling app which recognizes the Pentabarf XML format.


Open Event Server – Export Event as xCalendar File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Once an organizer makes an event public, others can view it and buy tickets if interested.

To make event promotion easier, we also allow the event organizer to export the event as an xCalendar file. xCal is an XML representation of the iCalendar standard. It is neither an alternative to nor the next generation of iCalendar; it simply represents iCalendar components, properties, and parameters as defined in iCalendar. This format was selected to ease its translation back to the iCalendar format using an XSLT transform.

Server side – generating the xCal file

Here we will be using the xml.etree.ElementTree module from Python's standard library for parsing and creating XML data.

from xml.etree.ElementTree import Element, SubElement, tostring
  • We define a class XCalExporter which has a static method export(event_id).
  • Query the event using the event_id passed and start forming the calendar:
event = Event.query.get(event_id)

tz = event.timezone or 'UTC'
tz = pytz.timezone(tz)

i_calendar_node = Element('iCalendar')
i_calendar_node.set('xmlns:xCal', 'urn:ietf:params:xml:ns:xcal')
v_calendar_node = SubElement(i_calendar_node, 'vcalendar')
version_node = SubElement(v_calendar_node, 'version')
version_node.text = '2.0'
prod_id_node = SubElement(v_calendar_node, 'prodid')
prod_id_node.text = '-//fossasia//open-event//EN'
cal_desc_node = SubElement(v_calendar_node, 'x-wr-caldesc')
cal_desc_node.text = "Schedule for sessions at " + event.name
cal_name_node = SubElement(v_calendar_node, 'x-wr-calname')
cal_name_node.text = event.name
  • We query for the accepted sessions of the event and store them in sessions.
sessions = Session.query \
   .filter_by(event_id=event_id) \
   .filter_by(state='accepted') \
   .filter(Session.deleted_at.is_(None)) \
   .order_by(asc(Session.starts_at)).all()
  • We then iterate through all the sessions in sessions.
  • If it is a valid session, we instantiate a SubElement and store the required details:
v_event_node = SubElement(v_calendar_node, 'vevent')

method_node = SubElement(v_event_node, 'method')
method_node.text = 'PUBLISH'

uid_node = SubElement(v_event_node, 'uid')
uid_node.text = str(session.id) + "-" + event.identifier

dtstart_node = SubElement(v_event_node, 'dtstart')
dtstart_node.text = tz.localize(session.starts_at).isoformat()

… and so on for the remaining properties.
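
As a sketch of how a few more of those child nodes could be added with the same SubElement pattern (the property names come from the xCal/iCalendar specification; the exact set used by the exporter may differ):

dtend_node = SubElement(v_event_node, 'dtend')
dtend_node.text = tz.localize(session.ends_at).isoformat()

summary_node = SubElement(v_event_node, 'summary')
summary_node.text = session.title

description_node = SubElement(v_event_node, 'description')
description_node.text = session.short_abstract or ''
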
  • We then loop through all the speakers in that particular session and add them to the xCal event node as well.
for speaker in session.speakers:
   attendee_node = SubElement(v_event_node, 'attendee')
   attendee_node.text = speaker.name
  • And finally, the string of the calendar node is returned. This is the xCalendar file contents. This can be directly written to a file.
return tostring(i_calendar_node)

Obtaining the xCal file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/xcal

Here, event_identifier is the unique ID of the event. This endpoint starts a celery task on the server to export the event as an xCal file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of the Celery task. If the task has completed successfully, the response will contain the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.

Hence, now the event can be added to any scheduling app which recognizes the xCal (.xcs) format.


Open Event Server – Export Event as an iCalendar File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Once an organizer makes an event public, others can view it and buy tickets if interested.

To make event promotion easier, we also allow the event organizer to export the event as an iCalendar file. Going by the Wikipedia definition, iCalendar is a computer file format which allows Internet users to send meeting requests and tasks to other Internet users by sharing or sending files in this format through various methods. The files usually have an extension of .ics. With supporting software, such as an email reader or calendar application, recipients of an iCalendar data file can respond to the sender easily or counter-propose another meeting date/time. The file format is specified in a proposed internet standard (RFC 5545) for calendar data exchange.

Server side – generating the iCal file

Here we will be using the icalendar package for Python as the file writer.

from icalendar import Calendar, vCalAddress, vText
  • We define a class ICalExporter which has a static method export(event_id).
  • Query the event using the event_id passed and start forming the calendar:
event = EventModel.query.get(event_id)

cal = Calendar()
cal.add('prodid', '-//fossasia//open-event//EN')
cal.add('version', '2.0')
cal.add('x-wr-calname', event.name)
cal.add('x-wr-caldesc', "Schedule for sessions at " + event.name)
  • We query for the accepted sessions of the event and store them in sessions.
sessions = Session.query \
   .filter_by(event_id=event_id) \
   .filter_by(state='accepted') \
   .filter(Session.deleted_at.is_(None)) \
   .order_by(asc(Session.starts_at)).all()
  • We then iterate through all the sessions in sessions.
  • If it is a valid session, we instantiate an icalendar Event and store the required details:
event_component = icalendar.Event()
event_component.add('summary', session.title)
event_component.add('uid', str(session.id) + "-" + event.identifier)
event_component.add('geo', (event.latitude, event.longitude))
event_component.add('location', session.microlocation.name or '' + " " + event.location_name)
event_component.add('dtstart', tz.localize(session.starts_at))
event_component.add('dtend', tz.localize(session.ends_at))
event_component.add('email', event.email)
event_component.add('description', session.short_abstract)
event_component.add('url', url_for('event_detail.display_event_detail_home',
                                  identifier=event.identifier, _external=True))
  • We then loop through all the speakers in that particular session and add them to the iCal Event object as well.
for speaker in session.speakers:
   # Ref: http://icalendar.readthedocs.io/en/latest/usage.html#file-structure
   # can use speaker.email below but privacy reasons
   attendee = vCalAddress('MAILTO:' + event.email if event.email else 'undefined@email.com')
   attendee.params['cn'] = vText(speaker.name)
   event_component.add('attendee', attendee)
  • This event_component is then added to the cal object that we created in the beginning.
cal.add_component(event_component)
  • And finally, the cal.to_ical() is returned. This is the iCalendar file contents. This can be directly written to a file.
return cal.to_ical()
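
As a quick illustration, the exporter's output (bytes) could be written to an .ics file like this (the call site and path are assumptions):

# Hypothetical call site for the exporter
ics_bytes = ICalExporter.export(event_id)
with open('/tmp/event.ics', 'wb') as ics_file:
    ics_file.write(ics_bytes)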

Obtaining the iCal file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/ical

Here, event_identifier is the unique ID of the event. This endpoint starts a Celery task on the server to export the event as an iCal file. It returns the URL of the task, which can be used to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of the Celery task. If the task has completed successfully, the response will contain the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.

Hence, now the event can be added to any scheduling app which recognizes the ics format.


Building PSLab Android app with Fdroid

Fdroid is a place for open source enthusiasts and developers to host their Free and Open Source Software (FOSS) for free and get more people on board into their community. Hosting an app on Fdroid is not as straightforward as hosting one on Google Play: we need to perform a set of build checks prior to making a merge request (which is similar to a pull request on GitHub) in the fdroiddata GitLab repository. The PSLab Android app by FOSSASIA has gone through all these checks and tests and is now ready to be published.

Setting up the fdroid server tools and the fdroiddata repository is one thing; building our app using the tools provided by Fdroid is another, and it involves quite a few steps to get started. Fdroid requires all apps to be built using:

$ fdroid build -v -l org.fossasia.pslab

 

This will output a set of logs which tell us what went wrong in the build. The usual issue with a first-time app is that the build does not take place at all; the reason is that our metadata file needs to be changed to initiate a build.

Build:<versioncode>,<versionname>
    commit=<commit which has the build mentioned in versioncode>
    subdir=app
    gradle=yes

 

When a metadata file is initially created, this build is disabled by default and commit is set to "?". We need to fill in those blanks; once completed, it will look like the snippet above. Many "Build" blocks can be added to the end of the metadata file as we advance and upgrade the app. As an example, the latest PSLab Android app has the following metadata "Build" block:

Build:1.1.5,7
    commit=0a50834ccf9264615d275a26feaf555db42eb4eb
    subdir=app
    gradle=yes

 

In case of an update, add another "Build" block and mention the version you want to appear on the Fdroid repository using the update-check fields, as follows:

Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:1.1.5
Current Version Code:7

 

Once it is all filled in, run the build command once again. If you have properly set up the environment on your local PC, the build will end successfully, assuming there were no Java or other syntax errors.

It is worth mentioning a few other facts which are common to Android software projects. Usually the source code is packed in a folder named "app" inside the repository, which is the common scenario if Android Studio builds up the project from scratch. If this "app" folder is one level below the root, that is "android/app", the build instructions shown above will throw an error as they cannot find the project files.

The reason behind this is that we have mentioned "subdir=app" in the metadata file. Change this to "subdir=android/app" and run the build again. The idea is to point the build to where the project files are.

Apart from that, the commit can be represented by a tag instead of a long commit hash. As an example, if we had merge commits in PSLab labeled as “v.<versioncode>”, we can simply use “commit=v.1.1.5” instead of the hash code. It is just a matter of readability.

Happy Coding!

Reference:

  1. Metadata : https://f-droid.org/docs/Build_Metadata_Reference/#Build
  2. PSLab Android app Fdroid : https://gitlab.com/fdroid/fdroiddata/merge_requests/3271/diffs

Publish an Open Source app on Fdroid

Fdroid is a well-known software repository hosting numerous free and open source Android apps. They have a main repository where they allow developers to host free and ad-free software after a thorough check of the app. This blog will tell you how to get your project hosted in their repository, using the steps I followed to publish the PSLab Android app.

Before you get started, make sure you have the consent of your developer community to publish their app on Fdroid. Fdroid requires your app to use only open resources to implement features. If there are any closed-source libraries in your app and you still want to publish it on Fdroid, you may have to reimplement those features by some other means without using closed-source resources. They also do not allow Google's proprietary "play-services" in your app, nor proprietary ad services. You can find the complete inclusion policy document on their official page.

When your app is fully ready, you can get started with the inclusion procedure. Unlike publishing apps on Google Play, publishing an app on Fdroid is as simple as sending a pull request to their main repository. That's exactly what we have to do. In simple terms, all we have to do is:

  1. Fork the Fdroid main data repository
  2. Make changes to their files to include our app
  3. Do a pull request

First of all, you need a GitLab account as the Fdroid repository is hosted on GitLab. Once you are ready with a GitLab account, fork and clone the fdroiddata repository. The next step is to install fdroidserver. This can be done simply using apt:

$ sudo apt install fdroidserver

Once that is done, go into the directory where you cloned the repository and run the following command to read the current metadata, which holds all the information related to existing apps on Fdroid:

$ fdroid readmeta

This will list various details about the current metadata files. The next step is to add our app's details to this metadata. This can be done either by manually creating the folders and files or, more safely, by using the following command:

$ fdroid import --url https://github.com/fossasia/pslab-android --subdir app

Replace the repository link passed to the --url flag in the above command. For instance, the following would be the command for fossasia/phimpme-android:

$ fdroid import --url https://github.com/fossasia/phimpme-android --subdir app

This will create a file named "org.fossasia.pslab" in the metadata directory. Open up this text file; we have to fill in the following details:

  1. Categories
  2. License
  3. Web Site
  4. Summary
  5. Description

Description needs to be terminated with a newline and a dot to avoid build failures.

Once the file is filled up, run the following command to make sure that the metadata file is complete.

$ fdroid readmeta

Then run the following command to clean up the file

$ fdroid rewritemeta org.fossasia.pslab

We can automatically add version details using the following command:

$ fdroid checkupdates org.fossasia.pslab

Now run the lint check to make sure the metadata is in order.

$ fdroid lint org.fossasia.pslab

If there are any errors thrown, fix them to get to the next step where we actually build the app:

$ fdroid build -v -l org.fossasia.pslab

Now you are ready to make the pull request, which will then be reviewed by developers in the Fdroid community and merged into their main branch. Make a commit and push it to your fork; from there it is pretty straightforward to make a pull request to the main repository. Once that is done, they will test the app for any issues. If all the checks pass, the app will be available on Fdroid!

Reference:

  1. Quick Start: https://gitlab.com/fdroid/fdroiddata/blob/master/README.md#quickstart
  2. Making merge requests: https://gitlab.com/fdroid/fdroiddata/blob/master/CONTRIBUTING.md#merge-requests

Implementing Clickable Images

The PSLab Android application is a feature-rich, compact app that provides the user interface for the PSLab hardware device. Similarly, the PSLab device itself is a compact device with plenty of features to replace almost all the analytical instruments in a school science lab. When a first-time user takes the device and connects it with the Android app, there are many pins labeled with abbreviations. This creates a lot of confusion unless the user checks the pinout diagram separately.

As a workaround, a UI is proposed that integrates a layout containing the PSLab PCB image, where the user can click on each pin to get a dialog box explaining what that specific pin is and what it does. This can be implemented in two ways:

  • Using an Image map
  • Using (x,y) coordinates

The first implementation is more practical and can be applied to any device with any screen dimensions. The latter requires some transformation to capture the correct position when the user has clicked on a pin. So the first method will be implemented.

The idea behind using an image map is to have two images of exactly the same dimensions on top of each other. The topmost image will be the color map, which we create ourselves using unique colors at unique hotspots. This image will have its visibility set to invisible, as the main idea is to let the user see a meaningful image while we capture the positions using the secondary image in the back end.

To make things clearer, let's have a look at a color map image I am suggesting here for a general case.

If we overlap the color map with the PSLab layout, we will be able to detect where the user has clicked using Android's onTouchEvent.

@Override
public boolean onTouchEvent(MotionEvent ev) {
   final int action = ev.getAction();
   final int evX = (int) ev.getX();
   final int evY = (int) ev.getY();
   switch (action) {
       case MotionEvent.ACTION_UP :
         int touchColor = getHotspotColor (R.id.backgroundMap, evX, evY);
         /* Display the relevant pin description dialog box here */
         break;
   }
   return true;
}

 
The color of the clicked position can be captured using the following code:

public int getHotspotColor (int hotspotId, int x, int y) {
   ImageView img = (ImageView) findViewById (hotspotId);
   img.setDrawingCacheEnabled(true);
   Bitmap hotspots = Bitmap.createBitmap(img.getDrawingCache());
   img.setDrawingCacheEnabled(false);
   return hotspots.getPixel(x, y);
}

 
Going into the details: from onTouchEvent we capture the (x, y) coordinates of the user's click. This location is then looked up for a unique color by creating a temporary bitmap and reading the pixel value at the captured coordinates.

There is an issue with this method, as the height value always has an offset. This offset is introduced by the status bar and the action bar of the application. If we use this method directly, an exception will be thrown saying that the image height is less than the height defined by y.

Solving this issue involves calculating the status bar and action bar heights separately and then subtracting them from the y coordinate.

The action bar and status bar heights can be calculated as follows:

Rect rectangle = new Rect();
Window window = getWindow();
window.getDecorView().getWindowVisibleDisplayFrame(rectangle);
int statusBarHeight = rectangle.top;
int contentViewTop = window.findViewById(Window.ID_ANDROID_CONTENT).getTop();
int titleBarHeight= contentViewTop - statusBarHeight;

 
Using them, we can modify the captured coordinates as follows:

int touchColor = getHotspotColor (R.id.imageArea, evX, evY - statusBarHeight);

 
This way the exception is avoided by adjusting the touch position. Once this is done, it is all about displaying the correct pin description dialog box.

Reference:

Calculate status bar height: https://stackoverflow.com/questions/3407256/height-of-status-bar-in-android


Database Listener for User Centric Events

Badgeyay is an open-source utility developed by FOSSASIA to generate badges for conferences and events. The project is separated into two components to ease maintainability: the frontend, which is in Ember, and the backend, which is in Flask. The database chosen to support the backend is PostgreSQL.

Now comes the problem: whenever a user is registered in the database, they should receive a verification mail confirming that they are successfully registered on the platform. For this, we have to listen to database events on the User model. This goes beyond only sending a greeting or verification mail to the user; we can extend it to trigger services that depend on user registration, like subscribing the user to a set of services based on the plan chosen during registration, and more.

These types of issues cannot be handled by normal relationships between tables and other entities; there has to be logic in place to support such functionality. So the challenges for tackling the problem are as follows:

  • Listen to the insert_action in User model
  • Extracting the details necessary for the logic
  • Execute particular logic

Procedure

  1. Attach an insert listener (SQLAlchemy's after_insert event) to the User model. The decorated function will get triggered whenever an entity is saved in the User model.


@db.event.listens_for(User, "after_insert")
def logic(mapper, connection, target):
    ...
  2. When the function gets triggered, extract the details of the saved user that are necessary for the logic. As we are currently sending a greeting mail to the user, we only need the user's email. target is the actual saved user, passed as an argument to the listening function by the library.


msg = {}
msg['subject'] = "Welcome to Badgeyay"
msg['receipent'] = target.email
msg['body'] = "It's good to have you onboard with Badgeyay. Welcome to " \
"FOSSASIA Family."
sendMail(msg)
  3. Now the details are passed to the sendMail() function, which uses the flask-mail library to send the mail to the recipient.
def sendMail(message):
    if message and message.receipent:
        try:
            msg = Message(
                subject=message.subject,
                sender=app.config['MAIL_USERNAME'],
                recipients=[message.receipent],
                body=message.body)
            Mail(app).send(msg)
        except Exception as e:
            return jsonify(
                Response(500).exceptWithMessage(
                    str(e),
                    'Unable to send the mail'))
        return jsonify(
            Response(200).generateMessage(
                'Mail Sent'))
    else:
        return jsonify(
            Response(403).generateMessage(
                'No data received'))
    
  4. This will send a mail to the user who has been registered on the application.

Similarly, we can attach separate logic according to the needs of the application.
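
For instance, the same pattern could be used to react to other mapper events; the snippet below is a purely illustrative sketch using SQLAlchemy's standard after_update event:

@db.event.listens_for(User, "after_update")
def log_profile_update(mapper, connection, target):
    # Illustrative only: react whenever a user row is updated
    print("User {} was updated".format(target.email))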

 

The Pull Request for the above functionality is at this Link

Topics Involved

Working on the issue involve following topics:

  • Configuring mail service to allow insecure apps access.
  • Sending mail from the flask-mail to end user
  • Attaching listener to listen for database change
  • Extraction of data from the saved object in SQLAlchemy.

Resources

  • Sending Mails Programmatically –  Link
  • Flask Mail Documentation – Link
  • Listening to database events – Link
  • Enabling access to GMAIL to send mails to recipient – Link

Refactoring and Remodeling Badgeyay API

When we build a full-scale production application, we make sure that everything is modeled correctly and according to the needs of the code. The code must be properly maintained as well as designed in such a way that it is less prone to errors and bugs.

Badgeyay also aims to be a full production application, and in order to achieve that we first need to refactor the code and model it using a strong yet maintainable structure.

What is the current state of Badgeyay?

Currently, Badgeyay is divided into two subfolders.

\badgeyay
    \frontend
    \backend
    .
    .

It consists of two folders, viz. backend and frontend. The 'backend' folder holds the API that the service currently runs. The 'frontend' folder houses the Ember-based frontend logic of the application.

Improvements to Badgeyay Backend

We have worked on improving the backend for Badgeyay. Instead of the traditional method of API development used so far, we employ a far better approach: using Flask Blueprints to refactor the API (a sketch of the Blueprint wiring follows the folder overview below).

The new backend API resides inside the following structure.

\badgeyay
    \backend
        \blueprint
            \api

The API folder currently holds the new API, being built from scratch using:

  • Flask Blueprint
  • Flask utilities like jsonify, Response, etc.

The new Badgeyay backend will follow this structure:

api
    \config
    \controllers
    \helpers
    \models
    \utils
    db.py
    run.py

The folders and their use cases are given below:

  • \config
    • Contain all the configuration files
    • Configurations about URLs, PostgreSQL etc
  • \controllers
    • This will contain the controllers for our API
    • Controllers will house the routes for our APIs
  • \helpers
    • Helpers folder will contain the files directly related to API
  • \models
    • Models folder contains the Schemas for PostgreSQL
    • Classes like User etc will be stored in here
  • \utils
    • Utils will contain the helper functions or classes
    • These classes or functions are not directly connected to the APIs
  • db.py
    • Main python file for Flask SQLAlchemy
  • run.py
    • This is the main entry point.
    • Running this file will run the entire Flask Blueprint API
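
As a rough illustration of how the Blueprint wiring might look, here is a small sketch (the module layout, blueprint name and route are assumptions, not the actual Badgeyay code):

from flask import Blueprint, Flask, jsonify

# controllers/ping_controller.py (hypothetical module)
ping_blueprint = Blueprint('ping', __name__, url_prefix='/api')


@ping_blueprint.route('/ping', methods=['GET'])
def ping():
    # Minimal route showing the Blueprint pattern
    return jsonify({'message': 'pong'})


# run.py (hypothetical entry point)
def create_app():
    app = Flask(__name__)
    # Register each controller's blueprint on the main application
    app.register_blueprint(ping_blueprint)
    return app


if __name__ == '__main__':
    create_app().run()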

How does it help?

  • It helps in making the backend more solid.
  • It helps in easy understanding of the application with a well-maintained workflow.
  • Since we will be adding a variety of features during Google Summer of Code 2018, we need a well-structured API with well-defined paths for every file used inside it.
  • It will make the project easier to maintain for any maintainer.
  • Development of the API will be faster this way; since everything is divided into sub-parts, many people can work on different parts at the same time.

Further Improvements

Since this structure has now been set up correctly in Badgeyay, we can work on adding separate routes, and different functionalities can be added simultaneously.

It ensures faster development of the project.
