Implementing Session and Speaker Creation From Event Panel In Open Event Frontend

Open Event Front-end uses Ember Data for handling the Open Event Orga API, which abides by the JSON API spec. It allows the user to manage an event using that event's panel. This panel lets us create or update sessions & speakers. Each speaker must be associated with a session, therefore we save the session before saving the speaker.
In this blog we will see how to implement session & speaker creation via the event panel. Let's see how we implemented it.

Passing the session & speaker models to the form
On the session & speaker creation page we need to render the forms using the custom form API and create new speaker and session entities. We create a speaker object here and pass in the relationships for the event and the user; likewise, we create the session object and pass the event relationship to it.

These objects are passed in the model along with form, which contains all the fields of the custom form, tracks, which is a list of all the tracks, & sessionTypes, which is a list of all the session types of the event.

return RSVP.hash({
  event : eventDetails,
  form  : eventDetails.query('customForms', {
    'page[size]' : 50,
    sort         : 'id'
  }),
  session: this.get('store').createRecord('session', {
    event: eventDetails
  }),
  speaker: this.get('store').createRecord('speaker', {
    event : eventDetails,
    user  : this.get('authManager.currentUser')
  }),
  tracks       : eventDetails.query('tracks', {}),
  sessionTypes : eventDetails.query('sessionTypes', {})
});

We bind the speaker & session objects to the template, which contains the session-speaker component for form validation. The request is only made if the form is validated.

Saving the data

In the Open Event API each speaker must be associated with a session, i.e. we must define a session relationship for the speaker. To accomplish this we first save the session to the server, and once it has been saved successfully we set the session as a relationship on the speaker object.

this.get('model.session').save()
  .then(session => {
    let speaker = this.get('model.speaker');
    speaker.set('session', session);
    speaker.save()
      .then(() => {
        this.get('notify').success(this.l10n.t('Your session has been saved'));
        this.transitionToRoute('events.view.sessions', this.get('model.event.id'));
      });
  })

We save the objects using the save method. After the speaker and session are saved successfully, we notify the user by showing a success message via the notify service.

Thank you for reading the blog. You can check the source code for the example here.


Creating Dynamic Forms Using Custom-Form API in Open Event Front-end

Open Event Front-end allows event creators to customise the session & speaker forms, which are implemented on the Orga server using the custom-form API. During event creation the organiser can select the form fields which will be placed in the speaker & session forms.

In this blog we will see how we created custom forms for sessions & speakers using the custom-form API. Let's see how we did it.

Retrieving all the form fields

Each event has custom form fields which can be enabled on the sessions-speakers page, where the organiser can include or exclude fields of the speaker & session forms.

return this.modelFor('events.view').query('customForms', {});

We return the result of the query to the new session route, where we will create a form using the fields included in the event.

Creating form using custom form API

The model returns an array of all the fields related to the event; however, we need to group them according to the form the field belongs to, i.e. session or speaker. We use lodash's groupBy method:

allFields: computed('fields', function() {
  return groupBy(this.get('fields').toArray(), field => field.get('form'));
})

For the session form we loop over allFields.session, which is an array of all the fields related to the session form. We check if the field is included and render it.

{{#each allFields.session as |field|}}
  {{#if field.isIncluded}}
    <div class="field">
      <label class="{{if field.isRequired 'required'}}" for="name">{{field.name}}</label>
      {{#if (or (eq field.type 'text') (eq field.type 'email'))}}
        {{#if field.isLongText}}
          {{widgets/forms/rich-text-editor textareaId=(if field.isRequired (concat 'session_' field.fieldIdentifier '_required'))}}
        {{else}}
          {{input type=field.type id=(if field.isRequired (concat 'session_' field.fieldIdentifier '_required'))}}
        {{/if}}
      {{/if}}
    </div>
  {{/if}}
{{/each}}

We also use a unique id on each of the fields for form validation. If the field is required we create a unique id of the form `session_fieldName_required`, for which we add a validation in the session-speaker-form component. We also use different components for different types of fields, e.g. for a long text field we make use of the rich-text-editor component.

Thank you for reading the blog. You can check the source code for the example here.


Implementing Notifications in Open Event Server

In FOSSASIA's Open Event Server project, along with emails, almost all actions have necessary user notifications as well. So, when a new session is created or a session is accepted by the event organisers, along with the email, a user notification is also sent. Though displaying the notification is mainly implemented in the frontend, the content to be shown and the actions which trigger it are strictly decided by the server project.

A notification essentially helps a user get the necessary information while staying on the platform itself, without needing to check his/her email for every action performed. So unlike email, which acts as a backup for the information, a notification is more of an instant thing.

The API

The Notifications API is mostly like all the other JSON API endpoints in the Open Event project. However, in the Notifications API we do not allow anyone to send a POST request. The admin of the server can send a GET request to view all the notifications in the system, while a user can only view his/her own notifications. As for PATCH, we only allow a user to edit his/her notification to mark it as read or not read. Following is the schema for the API:

class NotificationSchema(Schema):
    """
    API Schema for Notification Model
    """

    class Meta:
        """
        Meta class for Notification API schema
        """
        type_ = 'notification'
        self_view = 'v1.notification_detail'
        self_view_kwargs = {'id': '<id>'}
        self_view_many = 'v1.notification_list'
        inflect = dasherize

    id = fields.Str(dump_only=True)
    title = fields.Str(allow_none=True, dump_only=True)
    message = fields.Str(allow_none=True, dump_only=True)
    received_at = fields.DateTime(dump_only=True)
    accept = fields.Str(allow_none=True, dump_only=True)
    is_read = fields.Boolean()
    user = Relationship(attribute='user',
                        self_view='v1.notification_user',
                        self_view_kwargs={'id': '<id>'},
                        related_view='v1.user_detail',
                        related_view_kwargs={'notification_id': '<id>'},
                        schema='UserSchema',
                        type_='user'
                        )


The main things shown in the notification on the frontend are the title and message. The title is the text shown without expanding the entire notification, giving an overview of the message in case you don't want to read it all. The message, however, provides the entire detail associated with the action performed. The user relationship stores which user the particular notification belongs to: one notification can only belong to one user, though one user can have multiple notifications. Another important attribute is the is_read attribute. This is the only attribute that is allowed to be changed. By default, when we make an entry in the database, is_read is set to FALSE. Once a user has read the notification, a request is sent from the frontend to change is_read to TRUE.
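
For instance, marking a notification as read from a client boils down to a JSON API PATCH request on the is-read attribute. Below is a minimal sketch using python-requests; the host, notification id and token are placeholders, and the /v1/notifications/<id> path is assumed from the schema's self_view.

import requests

# Mark notification 1 as read; JSON API requires the type and id in the body.
payload = {
    'data': {
        'type': 'notification',
        'id': '1',
        'attributes': {
            'is-read': True  # dasherized attribute name, per inflect = dasherize
        }
    }
}
headers = {
    'Authorization': 'JWT <access token>',
    'Content-Type': 'application/vnd.api+json'
}
requests.patch('http://localhost:5000/v1/notifications/1',
               json=payload, headers=headers)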

The different actions for which we send notifications are stored in the models/notification.py file as global variables.

USER_CHANGE_EMAIL = "User email"'
NEW_SESSION = 'New Session Proposal'
PASSWORD_CHANGE = 'Change Password'
EVENT_ROLE = 'Event Role Invitation'
TICKET_PURCHASED = 'Ticket(s) Purchased'
TICKET_PURCHASED_ATTENDEE = 'Ticket(s) purchased to Attendee'
EVENT_EXPORTED = 'Event Exported'
EVENT_EXPORT_FAIL = 'Event Export Failed'
EVENT_IMPORTED = 'Event Imported'

HTML Templates

The notification title and message that are stored in the database and later served via the Notification API are created using string-formatted HTML templates. We first import all the global variables that represent the various actions from the notification model. Then we declare a global dict variable named NOTIFS which stores all the titles and messages to be stored in the notification table.

NEW_SESSION: {
        'title': u'New session proposal for {event_name}',
        'message': u"""The event <strong>{event_name}</strong> has received
             a new session proposal.<br><br>
            <a href='{link}' class='btn btn-info btn-sm'>View Session</a>""",
        'recipient': 'Organizer',
    },


This is an example of the contents stored inside the dict. For every action, there is a dict with the attributes title, message and recipient. Title contains a brief overview of the entire notification, whereas message contains a more vivid description with proper links. Recipient contains the one who receives the notification. So, for example, the above code snippet is the notification for a newly created session, which goes to the organizer. The title and message contain named placeholders which are later replaced by particular values using python's .format() function.

Notification Helper

The notification helper module contains two main parts:

  1. A parent function which saves the notification to the table related to the user to whom the notification belongs.
  2. Individual notification helper functions that are used by the APIs to save notification after various actions are performed.

Parent Function

def send_notification(user, action, title, message):
    if not current_app.config['TESTING']:
        notification = Notification(user_id=user.id,
                                    title=title,
                                    message=message,
                                    action=action
                                    )
        save_to_db(notification, msg="Notification saved")
        record_activity('notification_event', user=user, action=action, title=title)


send_notification() is the parent function which takes as parameters user, action, title and message and stores them in the notification table in the database. The user is the one to whom the notification belongs, and action represents the particular action in an API which triggered the notification. Title and message are the contents shown in the frontend in the form of a notification. The frontend can implement it as a dropdown notification like Facebook or a desktop notification like Gitter or WhatsApp. After the notification is saved, we also update the activity table, recording that a notification with this action and title has been saved for the user. Later, the Notification API mentioned at the very beginning of the blog uses this data and serves it as a JSON response.

Individual Functions

Apart from this, we have individual functions that use the parent function to store notifications specific to particular actions. For example, we have a send_notif_new_session_organizer() function which is used to save a notification for all the organizers of an event that a new session has been added to their particular event. This function is called when a POST request is made in the Sessions API and the data is saved successfully. The function is executed for every organizer of the event for which the session has been created.

def send_notif_new_session_organizer(user, event_name, link):
    message_settings = MessageSettings.query.filter_by(action=NEW_SESSION).first()
    if not message_settings or message_settings.notification_status == 1:
        notif = NOTIFS[NEW_SESSION]
        action = NEW_SESSION
        title = notif['title'].format(event_name=event_name)
        message = notif['message'].format(event_name=event_name, link=link)

        send_notification(user, action, title, message)


In the above function, we take in 3 parameters: user, event_name and link. The value of the user parameter is used to link the notification to that particular user. The event_name and link are used in the title and message of the notification that is saved in the database. First, we check if there is a message setting which says that the user doesn't want to receive notifications related to new sessions being created. If not, we proceed. We get the title and message strings from the NOTIFS dict in the system_notification.py file.

After that, using string formatting we get the actual message. For example,

u'New session proposal for {event_name}'.format(event_name='FOSSASIA')

would give us a resulting string of the form:

u'New session proposal for FOSSASIA'

After this, we pass these variables as parameters to the send_notification() parent function to save the notification properly.
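
For illustration, the Sessions API might invoke the helper for every organizer of the event like the sketch below; the organizer collection and the link value here are hypothetical.

# Hypothetical usage: notify each organizer of the event about the new session.
for organizer in event_organizers:
    send_notif_new_session_organizer(
        user=organizer,
        event_name='FOSSASIA 2016',
        link='https://eventyay.com/events/ab3de6'  # illustrative session link
    )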

 


Implement Email in Open Event Server

In FOSSASIA's Open Event Server project, we send out emails when various actions are performed using the API. For example, when a new user is created, he/she receives an email welcoming him/her to the server, as well as an email verification email. Users get role invites from event organisers in the form of emails, and when someone buys a ticket he/she gets a PDF link to the ticket via email. So, as you can understand, all the important information the user needs to be notified about is sent as an email to the user, and sometimes to the organizer as well.

In FOSSASIA, we use Sendgrid's API or an SMTP server, depending on the admin settings, for sending emails. You can read more about how we use Sendgrid's API to send emails in FOSSASIA here. Now let's dive into the modules that we have for sending the emails. The three main parts in the entire email sending are:

  1. Model – Storing the Various Actions
  2. Templates – Storing the HTML templates for the emails
  3. Email Functions – Individual functions for various different actions

Let’s go through each of these modules one by one.

Model

USER_REGISTER = 'User Registration'
USER_CONFIRM = 'User Confirmation'
USER_CHANGE_EMAIL = "User email"
INVITE_PAPERS = 'Invitation For Papers'
NEXT_EVENT = 'Next Event'
NEW_SESSION = 'New Session Proposal'
PASSWORD_RESET = 'Reset Password'
PASSWORD_CHANGE = 'Change Password'
EVENT_ROLE = 'Event Role Invitation'
SESSION_ACCEPT_REJECT = 'Session Accept or Reject'
SESSION_SCHEDULE = 'Session Schedule Change'
EVENT_PUBLISH = 'Event Published'
AFTER_EVENT = 'After Event'
USER_REGISTER_WITH_PASSWORD = 'User Registration during Payment'
TICKET_PURCHASED = 'Ticket(s) Purchased'


In the Model file, named mail.py, we first declare the various actions for which we send out emails. These actions are globally used as keys in the other modules of the email sending service. Here we define global variables containing the name of each action as a string. These are all constants, which means that their value remains the same throughout and never changes. For example, USER_REGISTER has the value 'User Registration', which essentially means that anything related to the USER_REGISTER key is executed when the User Registration action occurs. Or, in other words, whenever a user registers into the system by signing up or creating a new user through the API, he/she receives the corresponding emails.
Apart from this, we have the model class which defines a table in the database. We use this model class to record the emails sent, storing the action, the time at which the email was sent, the recipient and the sender. That way we have a record of all the emails that were sent out via our server.

class Mail(db.Model):
    __tablename__ = 'mails'
    id = db.Column(db.Integer, primary_key=True)
    recipient = db.Column(db.String)
    time = db.Column(db.DateTime(timezone=True))
    action = db.Column(db.String)
    subject = db.Column(db.String)
    message = db.Column(db.String)

    def __init__(self, recipient=None, time=None, action=None, subject=None,
                 message=None):
        self.recipient = recipient
        self.time = time
        if self.time is None:
            self.time = datetime.now(pytz.utc)
        self.action = action
        self.subject = subject
        self.message = message

    def __repr__(self):
        return '<Mail %r to %r>' % (self.id, self.recipient)

    def __str__(self):
        return unicode(self).encode('utf-8')

    def __unicode__(self):
        return 'Mail %r by %r' % (self.id, self.recipient,)


The table in which all this information is stored is named mails. It stores the recipient, the time at which the email is sent (timezone aware), the action which initiated the email, the subject of the email and the entire HTML body of the email. In case a datetime value is passed, we use that; else we use the current time for the time field.
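
As a hedged sketch of how such a record might be saved, reusing the save_to_db helper seen elsewhere in the project (the field values here are illustrative):

# Illustrative: record an email that was just sent out.
mail = Mail(recipient='medomag20@gmail.com',
            action=USER_REGISTER,
            subject='Welcome to Open Event',
            message='<p>Thanks for signing up!</p>')
save_to_db(mail, msg="Mail record saved")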

HTML Templates

We store the HTML templates in the form of key-value pairs in a file called system_mails.py inside the helpers module of the API. Inside system_mails, we have a global dict variable named MAILS as shown below.

MAILS = {
    EVENT_PUBLISH: {
        'recipient': 'Organizer, Speaker',
        'subject': u'{event_name} is Live',
        'message': (
            u"Hi {email}<br/>" +
            u"Event, {event_name}, is up and running and ready for action. Go ahead and check it out." +
            u"<br/> Visit this link to view it: {link}"
        )
    },
    INVITE_PAPERS: {
        'recipient': 'Speaker',
        'subject': u'Invitation to Submit Papers for {event_name}',
        'message': (
            u"Hi {email}<br/>" +
            u"You are invited to submit papers for event: {event_name}" +
            u"<br/> Visit this link to fill up details: {link}"
        )
    },
    SESSION_ACCEPT_REJECT: {
        'recipient': 'Speaker',
        'subject': u'Session {session_name} has been {acceptance}',
        'message': (
            u"Hi {email},<br/>" +
            u"The session <strong>{session_name}</strong> has been <strong>{acceptance}</strong> by the organizer. " +
            u"<br/> Visit this link to view the session: {link}"
        )
    },
    SESSION_SCHEDULE: {
        'recipient': 'Organizer, Speaker',
        'subject': u'Schedule for Session {session_name} has been changed',
        'message': (
            u"Hi {email},<br/>" +
            u"The schedule for session <strong>{session_name}</strong> has been changed. " +
            u"<br/> Visit this link to view the session: {link}"
        )
    },
}


Inside the MAILS dict, we have key-value pairs where the keys are the global variables from the Model that define the action related to the email template. In the value, we again have 3 key-value pairs: recipient, subject and message. The recipient defines the group who should receive this email, the subject goes into the subject part of the email, while message forms the body of the email. For the subject and message we use unicode strings with named placeholders that are used later for formatting with python's .format() function.

Email Functions

This is the most important part of the entire email sending system, since this is where the email sending functionality is implemented using the above two modules. We have all these functions inside a single file, namely mail.py, inside the helpers module of the API. Firstly, we import two things in this file: the global dict variable MAILS defined in the template file above, and the various global action variables defined in the model. There is one main function which is used by all the other individual functions for sending the emails, defined as send_email(to, action, subject, html). This function takes as parameters the email address to which the email is to be sent, the subject string and the html body string, along with the action to store it in the database.

Firstly we ensure that the email address for the recipient is present and isn't an empty string. After we have ensured this, we retrieve the email service as set in the admin settings. It can either be "smtp" or "sendgrid". The email address for the sender has different formatting depending on the email service we are using. While sendgrid uses just the email, say for example "medomag20@gmail.com", smtp uses a slightly different format like this: Medozonuo Suohu<medomag20@gmail.com>. So we set that as well in the email_from variable.

def send_email(to, action, subject, html):
    """
    Sends email and records it in DB
    """
    if not string_empty(to):
        email_service = get_settings()['email_service']
        email_from_name = get_settings()['email_from_name']
        if email_service == 'smtp':
            email_from = email_from_name + '<' + get_settings()['email_from'] + '>'
        else:
            email_from = get_settings()['email_from']
        payload = {
            'to': to,
            'from': email_from,
            'subject': subject,
            'html': html
        }

        if not current_app.config['TESTING']:
            if email_service == 'smtp':
                smtp_encryption = get_settings()['smtp_encryption']
                if smtp_encryption == 'tls':
                    smtp_encryption = 'required'
                elif smtp_encryption == 'ssl':
                    smtp_encryption = 'ssl'
                elif smtp_encryption == 'tls_optional':
                    smtp_encryption = 'optional'
                else:
                    smtp_encryption = 'none'

                config = {
                    'host': get_settings()['smtp_host'],
                    'username': get_settings()['smtp_username'],
                    'password': get_settings()['smtp_password'],
                    'encryption': smtp_encryption,
                    'port': get_settings()['smtp_port'],
                }

                from tasks import send_mail_via_smtp_task
                send_mail_via_smtp_task.delay(config, payload)


After this we create the payload containing the email address of the recipient, the email address of the sender, the subject of the email and the html body of the email.
For unit testing and any other testing we avoid email sending, since that is really not required in the flow. So we check that the current app is not configured to run in a testing environment. After that we have two different implementations depending on the email service used.

SMTP

There are three possible kinds of encryption that can be used for email sent via the smtp server: tls, ssl and optional tls. We determine this based on the admin settings again. Also from the admin settings we collect the host, username, password and port for the smtp server.

After this we start a celery task for sending the email. Since sending email to a number of clients can be time consuming, we do it using the celery queueing service without disturbing the main workflow of the entire system.

@celery.task(name='send.email.post.smtp')
def send_mail_via_smtp_task(config, payload):
    mailer_config = {
        'transport': {
            'use': 'smtp',
            'host': config['host'],
            'username': config['username'],
            'password': config['password'],
            'tls': config['encryption'],
            'port': config['port']
        }
    }

    mailer = Mailer(mailer_config)
    mailer.start()
    message = Message(author=payload['from'], to=payload['to'])
    message.subject = payload['subject']
    message.plain = strip_tags(payload['html'])
    message.rich = payload['html']
    mailer.send(message)
    mailer.stop()

Inside the celery task, we use the Mailer and Message classes from Python's marrow.mailer package. We configure the Mailer according to the various settings received from the admin and then use the payload to send the email.

Sendgrid

For sending email using the Sendgrid API, we need to set the Bearer key which is used for authenticating the email service. This key is defined in the admin settings. After we have set the Bearer key as the authorization header, we initiate the celery task corresponding to the Sendgrid email sending service.

@celery.task(name='send.email.post')
def send_email_task(payload, headers):
    requests.post(
        "https://api.sendgrid.com/api/mail.send.json",
        data=payload,
        headers=headers
    )


For sending the email, all we need to do is make a POST request to the API endpoint "https://api.sendgrid.com/api/mail.send.json" with the headers containing the Bearer key and the data containing the payload with all the information related to the recipient, sender, subject of the email and the body of the email.
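
Putting the Sendgrid branch together, the header construction inside send_email() might look like the sketch below; the exact settings key that holds the Sendgrid token is an assumption for illustration.

# Build the Bearer authorization header from the admin settings and queue the
# celery task; 'sendgrid_key' is an assumed settings name.
key = get_settings()['sendgrid_key']
headers = {
    'Authorization': 'Bearer ' + key
}
from tasks import send_email_task
send_email_task.delay(payload, headers)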

Apart from these, this module implements all the individual functions that are called when the corresponding actions occur. For example, let's look into the email sending function for when a new session is created.

def send_email_new_session(email, event_name, link):
    """email for new session"""
    send_email(
        to=email,
        action=NEW_SESSION,
        subject=MAILS[NEW_SESSION]['subject'].format(
            event_name=event_name
        ),
        html=MAILS[NEW_SESSION]['message'].format(
            email=email,
            event_name=event_name,
            link=link
        )
    )


This function is called inside the Sessions API, for every speaker of the session as well as for every organizer of the event to which the session is submitted. Inside this function, we use send_email(). But first we need to create the subject and the message body of the email using the templates, replacing the placeholders with actual values using python formatting. MAILS[NEW_SESSION]['subject'] is the unicode string u'New session proposal for {event_name}'. So what we do is use the .format() function to replace {event_name} with the actual event name received as a parameter. It is equivalent to doing something like:

u'New session proposal for {event_name}'.format(event_name='FOSSASIA')

which would give us a resulting string of the form:

u'New session proposal for FOSSASIA'

Similarly, we create the html message body using the templates and the parameters received. After this is done, we make a function call to send_email(), which then sends the final email.


Implementing Search Functionality In Calendar Mode On Schedule Page In Open Event Webapp

[Screenshots: the schedule page in calendar mode]

The list mode of the page already supported the search feature. We needed to implement it in the calendar mode. The corresponding issue for this feature is here. The whole work can be seen here.

First, we see the basic structure of the page in the calendar mode.

<div class="{{slug}} calendar">
 <!-- slug represents the currently selected date -->
 <!-- This div contains all the sessions scheduled on the selected date -->
 <div class="rooms">
   <!-- This div contains all the rooms of an event -->
   <!-- Each particular room has a set of sessions associated with it on that particular date -->
   <div class="room">
     <!-- This div contains the list of session happening in a particular room -->
     <div class="session"> <!-- This div contains all the information about a session -->
       <div class="session-name"> {{title}} </div> <!-- Title of the session -->
       <h4 class="text"> {{{description}}} </h4> <!-- Description of the session -->
       <!-- This div contains the info of the speakers presenting the session -->
       <div class="session-speakers-list">
         <div class="speaker-name"><strong>{{{title}}}</div> <!-- Name of the speaker -->
           <div class="session-speakers-more"> {{position}} {{organisation}} </div> <!-- Position and organization of speaker-->
         </div>
       </div>
     </div>
   </div>
 </div>
</div>

The user will type the query in the search bar near the top of the page. The search bar has the class fossasia-filter.


We set up a keyup event listener on that element so that whenever the user presses and releases a key, we invoke the event handler function, which displays only those elements that match the query currently entered in the search bar. This way we are able to change the search results dynamically on user input: every time a key is pressed and lifted off, the event fires, the handler is invoked and the session elements are filtered accordingly.

Now the important part is how we actually display and hide the session elements. We compare a few session attributes to the text entered in the search box. The attributes that we look at are the title of the session and the name, position and organization of the speaker(s) presenting the session. We check whether the text entered by the user in the search bar appears contiguously in any of the above-mentioned attributes. If it appears, the session element is shown; otherwise its display is set to hidden. The check is case insensitive. We also count the number of visible sessions on the page and, if it is equal to zero, display a message saying that no results were found.

For example, suppose the user enters the string 'wel' in the search bar. We will then iterate over all the sessions, and only those which have 'wel' in their title or in the name/position/organization of their speakers will be visible. All the other sessions will be hidden.

Here is the excerpt from the code. The whole file can be seen here.

$('.fossasia-filter').change(function() {
  var filterVal = $(this).val(); // Search query entered by the user
  $('.session').each(function() {
    // Check the title of the session and the name, position and organization of its speakers
    if ($(this).find('.session-name').text().toUpperCase().indexOf(filterVal.toUpperCase()) >= 0 ||
        $(this).find('.session-speakers-list a p span').text().toUpperCase().indexOf(filterVal.toUpperCase()) >= 0 ||
        $(this).find('.speaker-name').text().toUpperCase().indexOf(filterVal.toUpperCase()) >= 0) {
      $(this).show(); // Matched, so display the session
    } else {
      $(this).hide(); // Hide the element
    }
  });
  var calFilterLength = $('.calendar:visible').length;
  if (isCalendarView && calFilterLength == 0) { // No session elements found
    $('.search-filter:first').after('<p id="no-results">No matching results found.</p>');
  }
}).keyup(function() {
  $(this).change();
});

Below is the default view of the calendar mode on the schedule page

[Screenshot: the default calendar view]

On entering ‘wel’ in the search bar, sessions get filtered

[Screenshot: sessions filtered for 'wel']


Customising URL Using Custom Adapters in Open Event Front-end

Open Event Front-end uses Ember Data for handling the Open Event Orga API, which abides by the JSON API spec. The API has relationships which represent models in the database; however, there are some API endpoints for which the URL is not direct. We make use of a custom adapter to build a custom URL for such requests.
In this blog we will see how to implement endpoints which do not map directly to a model on the API server. Let's see how we implemented the admin-statistics-event API using a custom adapter.

Creating the admin-statistics-event model
To create a new model we use the ember-cli command:

ember g model admin-statistics-event

The generated model:

// imports assume the project's usual module layout
import attr from 'ember-data/attr';
import ModelBase from 'open-event-frontend/models/base';

export default ModelBase.extend({
  draft     : attr('number'),
  published : attr('number'),
  past      : attr('number')
});

The API returns 3 attributes, namely draft, published & past, which represent the total number of drafted, live and past events in the system. The admin-statistics-event model is an admin related model.
Creating custom adapter
To create a new adapter we use ember-cli command:

ember g adapter admin-statistics-event

If we try to do a GET request, the URL for the request will be 'v1/admin-statistics-event', which is an incorrect endpoint. We create a custom adapter to override the buildURL method.

buildURL(modelName, id, snapshot, requestType, query) {
  let url = this._super(modelName, id, snapshot, requestType, query);
  url = url.replace('admin-statistics-event', 'admin/statistics/event');
  return url;
}

We create a new variable url which holds the url generated by the buildURL method of the super adapter, called via this._super. We then replace 'admin-statistics-event' with 'admin/statistics/event' in the url variable and return it. This results in the generation of the correct URL for the request.
Thank you for reading the blog. You can check the source code for the example here.

Create Event by Importing JSON files in Open Event Server

Apart from the usual way of creating an event in FOSSASIA's Orga Server project by using POST requests in the Events API, another way of creating events is importing a zip file which is an archive of multiple JSON files. This way you can create a large event like FOSSASIA, with lots of data related to sessions, speakers, microlocations and sponsors, just by uploading JSON files to the system. A sample JSON file can be found in the open-event project of FOSSASIA. The basic workflow of importing an event and how it works is as follows:

  • The first step is similar to uploading files to the server. We need to send a POST request with multipart form data containing the zipped archive of the JSON files.
  • The POST request starts a celery task to import the data from the JSON files and store it in the database.
  • The celery task URL is returned as a response to the POST request. You can use this URL for polling purposes to get the status. If the status is FAILURE, we get the error text along with it; if the status is SUCCESS, we get the resulting event data.
  • In the celery task, each JSON file is read separately and the data is stored in the db with the proper relations.
  • Sending a GET request to the above mentioned celery task URL, after the task has been completed, returns the event id along with the event URL.

Let's see how each of these points works in the background.

Uploading ZIP containing JSON Files

For uploading a zip archive, instead of sending JSON data in the POST request we send multipart form data. The multipart/form-data format allows an entire file to be sent as data in the POST request along with the relevant file information. One can read about the various form content types here.

An example cURL request looks something like this:

curl -H "Authorization: JWT <access token>" -X POST -F 'file=@event1.zip' http://localhost:5000/v1/events/import/json

The above cURL request uploads the file event1.zip from your current directory, with the key 'file', to the endpoint /v1/events/import/json. The user uploading the file needs to have a JWT authentication key, or in other words be logged in to the system, as it is necessary to create an event.

@import_routes.route('/events/import/<string:source_type>', methods=['POST'])
@jwt_required()
def import_event(source_type):
    if source_type == 'json':
        file_path = get_file_from_request(['zip'])
    else:
        file_path = None
        abort(404)
    from helpers.tasks import import_event_task
    task = import_event_task.delay(email=current_identity.email, file=file_path,
                                   source_type=source_type, creator_id=current_identity.id)
    # create import job
    create_import_job(task.id)

    # if testing
    if current_app.config.get('CELERY_ALWAYS_EAGER'):
        TASK_RESULTS[task.id] = {
            'result': task.get(),
            'state': task.state
        }
    return jsonify(
        task_url=url_for('tasks.celery_task', task_id=task.id)
    )


After the request is received we check if a file exists under the key 'file' of the form-data. If it is there, we save the file and get the path to the saved file. Then we send this path over to the celery task and run the task with the .delay() function of celery. After the celery task is started, the corresponding data about the import job is stored in the database for future debugging and logging purposes. After this we return the task url for the celery task that we started.

Celery Task to Import Data

Just like exporting an event, importing is also a time consuming task and we don't want other application requests to be paused because of it. Hence, we use a celery queue to execute this task. Whenever an import task is started, it is added to the celery queue, and when it comes to the front of the queue it is executed.

For importing, we have created a celery task, import.event, which calls the import_event_task_base() function. That function uses the import helper functions to get the data from the JSON files imported and saved in the DB. After the task is completed, we update the import job entry in the table with the status as either SUCCESS or FAILURE, depending on the outcome of the celery task.

As a result of the celery task, the newly created event's id and the frontend link from where we can visit it are returned. This, along with the status of the celery task, is returned as the response for a GET request on the celery task. If the celery task fails, the state is changed to FAILURE and the error which celery faced is returned as the error message in the result key. We also print an error traceback in the celery worker.

@celery.task(base=RequestContextTask, name='import.event', bind=True, throws=(BaseError,))
def import_event_task(self, email, file, source_type, creator_id):
    """Import Event Task"""
    task_id = self.request.id.__str__()  # str(async result)
    try:
        result = import_event_task_base(self, file, source_type, creator_id)
        update_import_job(task_id, result['id'], 'SUCCESS')
        # return item
    except BaseError as e:
        print(traceback.format_exc())
        update_import_job(task_id, e.message, e.status if hasattr(e, 'status') else 'failure')
        result = {'__error': True, 'result': e.to_dict()}
    except Exception as e:
        print(traceback.format_exc())
        update_import_job(task_id, e.message, e.status if hasattr(e, 'status') else 'failure')
        result = {'__error': True, 'result': ServerError().to_dict()}
    # send email
    send_import_mail(task_id, result)
    # return result
    return result

 

Save Data from JSON

In the import helpers, we have the functions which perform the main task of reading the JSON files, creating sqlalchemy model objects from them and saving them in the database. There are a few global dictionaries which help maintain the order in which the files are to be imported and saved, as well as the file-to-model mapping. The first JSON file to be imported is the event JSON file, since all the other tables to be imported are related to the event table. After that, the order in which the files are read is as follows:

  1. SocialLink
  2. CustomForms
  3. Microlocation
  4. Sponsor
  5. Speaker
  6. Track
  7. SessionType
  8. Session

This order helps maintain the foreign key constraints. For importing data from these files we use the function create_service_from_json(). It sorts the elements in the data list based on the key "id" and then loops over all the elements (dictionaries) contained in the data list. In each iteration it deletes the unnecessary key-value pairs from the dictionary and sets the event_id for that element to the id of the newly created event, instead of the old id present in the data. After all this is done, it creates a model object, based on the filename-to-model mapping, with the dict data, and saves that model object into the database.

def create_service_from_json(task_handle, data, srv, event_id, service_ids=None):
    """
    Given :data as json, create the service on server
    :service_ids are the mapping of ids of already created services.
        Used for mapping old ids to new
    """
    if service_ids is None:
        service_ids = {}
    global CUR_ID
    # sort by id
    data.sort(key=lambda k: k['id'])
    ids = {}
    ct = 0
    total = len(data)
    # start creating
    for obj in data:
        # update status
        ct += 1
        update_state(task_handle, 'Importing %s (%d/%d)' % (srv[0], ct, total))
        # trim id field
        old_id, obj = _trim_id(obj)
        CUR_ID = old_id
        # delete not needed fields
        obj = _delete_fields(srv, obj)
        # related
        obj = _fix_related_fields(srv, obj, service_ids)
        obj['event_id'] = event_id
        # create object
        new_obj = srv[1](**obj)
        db.session.add(new_obj)
        db.session.commit()
        ids[old_id] = new_obj.id
        # add uploads to queue
        _upload_media_queue(srv, new_obj)

    return ids


After the data has been saved, the next thing to do is upload all the media files to the server. This we do using the _upload_media_queue() function. It takes the paths to upload the files to from the storage.py helper file for the APIs. Then it uploads the files, using the various helper functions, to the static data storage services like AWS S3, Google storage, etc.

Other than this, the import helpers also contain the function to create an import job that keeps a record of all the imports, along with the task url and the user id of the user who started the import task. It also stores the status of the task. Then there is the get_file_from_request() function, which saves the file that is uploaded through the POST request and returns the path to that file.
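
A minimal sketch of what get_file_from_request() might do is given below; the actual helper may differ, e.g. in how it validates the allowed extensions passed to it.

from flask import request, abort
from werkzeug.utils import secure_filename

def get_file_from_request(ext=None):
    # Reject the request if no file was sent under the 'file' key
    if 'file' not in request.files:
        abort(400)
    uploaded = request.files['file']
    # Save the file to a temporary location and return its path
    file_path = '/tmp/' + secure_filename(uploaded.filename)
    uploaded.save(file_path)
    return file_path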

Get Response about Event Imported

The POST request returns a task url of the form /v1/tasks/ebe07632-392b-4ae9-8501-87ac27258ce5. To get the final result, you need to keep polling this URL. To know more about polling, read my previous blog about exporting an event or visit this link. When the task is completed, you get a "result" key along with the status. The state can either be SUCCESS or FAILURE. If it is FAILURE, you will get the corresponding error message due to which the celery task failed. If it is a success, you get data related to the event that was created by the import. The data returned includes the event id, the event name and the event url, which you can use to visit the event from the frontend. This data is also sent to the user as an email and a notification.

An example response looks something like this:

{
    "result": {
        "event_name": "FOSSASIA 2016",
        "id": "24",
        "url": "https://eventyay.com/events/ab3de6"
    },
    "state": "SUCCESS"
}

The corresponding event name and url are also sent to the user who started the import task. From the frontend, one can use the values of the result object to show the name of the imported event along with the event url. Since the id is present in the result, one can also make use of it to send GET, PATCH and other API requests to the events/ endpoint and get the corresponding relationship urls from it to query the other APIs. Thus, the entire data that is imported becomes available to the frontend as well.

 


Handling Requests for hasMany Relationships in Open Event Front-end

In Open Event Front-end we use Ember Data and the JSON API spec to integrate the application with the server. Ember Data provides an easy way to handle API requests; however, it does not support a direct POST for saving bulk data, which was the problem we faced while implementing event creation using the API.

In this blog we will take a look at how we implemented POST requests for saving hasMany relationships, using the example of the sessions-speakers route to see how we saved the tracks, microlocations & session-types. Let's see how we did it.

Fetching the data from the server

Ember by default does not support hasMany requests for getting related model data. However, we can use an external add-on which enables hasMany GET requests; we use ember-data-has-many-query, which is a great add-on for querying hasMany relations of a model.

let data = this.modelFor('events.view.edit');
data.tracks = data.event.get('tracks');
data.microlocations = data.event.get('microlocations');
data.sessionTypes = data.event.get('sessionTypes');
return RSVP.hash(data);

In the above example we are querying the tracks, microlocations & sessionTypes, which are hasMany relationships related to the events model. We can then simply do a GET request for the related model.

data.event.get('tracks');

In the above example we are retrieving all the tracks of the event.

Sending a POST request for hasMany relationship
Ember currently does not support bulk data POST requests for hasMany relations. We solved this by doing a POST request for each individual item of the hasMany array.

We start by creating a `promises` array which will contain all the individual requests. We then iterate over all the hasMany relations & concatenate the save promises into the `promises` array. Now each request is an individual promise.

let promises = [];

promises = promises.concat(this.get('model.event.tracks').toArray().map(track => track.save()));
promises = promises.concat(this.get('model.event.sessionTypes').toArray().map(type => type.save()));
promises = promises.concat(this.get('model.event.microlocations').toArray().map(location => location.save()));

Once we have all the promises, we use RSVP to make the POST requests. We make use of the all() method, which takes an array of promises as a parameter and resolves all the promises. If the promises are not resolved successfully, we notify the user using the notify service; else we redirect to the home page.

RSVP.Promise.all(promises)
  .then(() => {
    this.transitionToRoute('index');
  }, () => {
    this.get('notify').error(this.l10n.t('Data did not save. Please try again'));
  });

As a result of this, we can now retrieve & create new tracks, microlocations & sessionTypes on the sessions-speakers route.

Thank you for reading the blog. You can check the source code for the example here.


Export an Event using APIs of Open Event Server

We in FOSSASIA's Open Event Server project allow the organizer, co-organizer and the admins to export all the data related to an event in the form of an archive of JSON files. This way the data can be reused in other places for various purposes. The basic workflow is something like this:

  • Send a POST request to /events/{event_id}/export/json with a payload specifying whether you require the various media files.
  • The POST request starts a celery task in the background to extract the data related to the event and jsonify it.
  • The celery task url is returned as a response. Sending a GET request to this url gives the status of the task. If the status is either FAILED or SUCCESS, the response contains the corresponding error message or the result.
  • Separate JSON files for events, speakers, sessions, micro-locations, tracks, session types and custom forms are created.
  • All these files are then archived and the zip is served from the endpoint /events/{event_id}/exports/{path}.
  • Sending a GET request to the above mentioned endpoint downloads a zip containing all the data related to the event.

Let’s dive into each of these points one-by-one

POST request ( /events/{event_id}/export/json)

For making a POST request you first need JWT authentication, like most of the other API endpoints. You need to send a payload containing the settings specifying whether you want the media files related to the event to be downloaded along with the JSON files. An example payload looks like this:

{
   "image": true,
   "video": true,
   "document": true,
   "audio": true
 }

def export_event(event_id):
    from helpers.tasks import export_event_task

    settings = EXPORT_SETTING
    settings['image'] = request.json.get('image', False)
    settings['video'] = request.json.get('video', False)
    settings['document'] = request.json.get('document', False)
    settings['audio'] = request.json.get('audio', False)
    # queue task
    task = export_event_task.delay(
        current_identity.email, event_id, settings)
    # create Job
    create_export_job(task.id, event_id)

    # in case of testing
    if current_app.config.get('CELERY_ALWAYS_EAGER'):
        # send_export_mail(event_id, task.get())
        TASK_RESULTS[task.id] = {
            'result': task.get(),
            'state': task.state
        }
    return jsonify(
        task_url=url_for('tasks.celery_task', task_id=task.id)
    )


Taking the settings for the media files and the event id, we pass them as parameters to the export event celery task and queue up the task. We then create an entry in the database with the task url, the event id and the user who triggered the export, to keep a record of the activity. After that we return the url for the celery task as a response to the user.

If the celery task is still underway, it shows a response with 'state': 'WAITING'. Once the task is completed, the value of 'state' is either 'FAILED' or 'SUCCESS'. If it is SUCCESS, the response includes the result of the task, in this case the download url for the zip.
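
For instance, while the task is still running, polling the task url would return something like this (the exact shape is assumed from the description above):

{
    "state": "WAITING"
}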

Celery Task to Export Event

Exporting an event is a very time consuming process and we don't want it to come in the way of user interaction with other services. So we needed a queueing system that would queue the tasks and execute them in the background, without blocking the main worker from serving other user requests. We have used celery to queue tasks in the background and execute them without disturbing the other user requests.

We have created a celery task, namely "export.event", which calls event_export_task_base(), which in turn calls export_event_json(), where all the jsonification is carried out. To start the celery task all we do is call export_event_task.delay(email, event_id, settings), and it returns a celery task object with a task id that can be used to check the status of the task.

@celery.task(base=RequestContextTask, name='export.event', bind=True)
def export_event_task(self, email, event_id, settings):
    event = safe_query(db, Event, 'id', event_id, 'event_id')
    try:
        logging.info('Exporting started')
        path = event_export_task_base(event_id, settings)
        # task_id = self.request.id.__str__()  # str(async result)
        download_url = path

        result = {
            'download_url': download_url
        }
        logging.info('Exporting done.. sending email')
        send_export_mail(email=email, event_name=event.name, download_url=download_url)
    except Exception as e:
        print(traceback.format_exc())
        result = {'__error': True, 'result': str(e)}
        logging.info('Error in exporting.. sending email')
        send_export_mail(email=email, event_name=event.name, error_text=str(e))

    return result


After exporting, a path to the export zip is returned. We then get the download endpoint and return it as the result of the celery task. In case there is an error in the celery task, we print the entire traceback in the celery worker and return the error as the result.

Make the Exported Zip Ready

We have a separate export_helpers.py file in the helpers module of the API for performing the various tasks related to exporting all the data of an event. The most important function in this file is export_event_json(), which accepts the event_id and the settings dictionary. In the export helpers we have global constant dictionaries which contain the order in which the fields are to appear in the JSON files created while exporting.

Firstly, we create the directory for storing the exported JSON files and, finally, the archive of all of them. Then we have a global structure named EXPORTS which contains all the tables and their corresponding Models which we want to extract from the database and store as JSON. From EXPORTS we get the Model names, and we use these Models to make queries with the given event_id and retrieve the data from the database. After retrieving the data, we use another helper function named _order_json, which jsonifies the sqlalchemy data in the order that is mentioned in the dictionary. After this we download the media data, i.e. the slides, images, videos etc. related to that particular Model, depending on the settings.
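
The function below iterates over EXPORTS as a sequence of (filename, Model) pairs. A hedged sketch of what it might contain follows; the exact file names and Model classes are assumptions based on the files listed earlier.

# Assumed shape of EXPORTS: each entry maps an output file name to the
# sqlalchemy Model whose rows should be dumped into that file.
EXPORTS = [
    ('event', Event),
    ('microlocations', Microlocation),
    ('sessions', Session),
    ('speakers', Speaker),
    ('tracks', Track),
    ('session_types', SessionType),
    ('forms', CustomForms),
]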

def export_event_json(event_id, settings):
    """
    Exports the event as a zip on the server and return its path
    """
    # make directory
    exports_dir = app.config['BASE_DIR'] + '/static/uploads/exports/'
    if not os.path.isdir(exports_dir):
        os.mkdir(exports_dir)
    dir_path = exports_dir + 'event%d' % int(event_id)
    if os.path.isdir(dir_path):
        shutil.rmtree(dir_path, ignore_errors=True)
    os.mkdir(dir_path)
    # save to directory
    for e in EXPORTS:
        if e[0] == 'event':
            query_obj = db.session.query(e[1]).filter(
                e[1].id == event_id).first()
            data = _order_json(dict(query_obj.__dict__), e)
            _download_media(data, 'event', dir_path, settings)
        else:
            query_objs = db.session.query(e[1]).filter(
                e[1].event_id == event_id).all()
            data = [_order_json(dict(query_obj.__dict__), e) for query_obj in query_objs]
            for count in range(len(data)):
                data[count] = _order_json(data[count], e)
                _download_media(data[count], e[0], dir_path, settings)
        data_str = json.dumps(data, indent=4, ensure_ascii=False).encode('utf-8')
        fp = open(dir_path + '/' + e[0], 'w')
        fp.write(data_str)
        fp.close()
    # add meta
    data_str = json.dumps(
        _generate_meta(), sort_keys=True,
        indent=4, ensure_ascii=False
    ).encode('utf-8')
    fp = open(dir_path + '/meta', 'w')
    fp.write(data_str)
    fp.close()
    # make zip
    shutil.make_archive(dir_path, 'zip', dir_path)
    dir_path = dir_path + ".zip"

    storage_path = UPLOAD_PATHS['exports']['zip'].format(
        event_id=event_id
    )
    uploaded_file = UploadedFile(dir_path, dir_path.rsplit('/', 1)[1])
    storage_url = upload(uploaded_file, storage_path)

    return storage_url


After we receive the json data from the _order_json() function, we create a dump of the json using json.dumps, with an indentation of 4 spaces and utf-8 encoding. Then we save this dump in a file named after the model from which the data was retrieved. This process is repeated for all the models mentioned in the EXPORTS dictionary. After all the JSON files are created and all the media is downloaded, we make a zip of the folder.

To do this we use shutil.make_archive, which creates the zip. The zip is then uploaded to the storage service used by the server, such as S3, Google storage, etc., and the url through which it can be accessed is returned.

Apart from this function, the other major function in this file creates an export job entry in the database, so that we can keep track of which user started a task related to which event, helping us in debugging and for security purposes.

Downloading the Zip File

After the exporting is completed, if you send a GET request to the task url, you get a response similar to this:

{
   "result": {
     "download_url": "http://localhost:5000/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
   },
   "state": "SUCCESS"
 }

So on opening the download url in the browser or using any other tool, you can download the zip file.

One big question remains: the workflow is fine, but how do you know, after sending the POST request, that the task is completed and ready to be downloaded? One way of solving this problem is a technique known as polling, where we send a GET request repeatedly after every fixed interval of time. From the POST request we get the URL for the export task, and we keep polling this task URL until the state is either “FAILED” or “SUCCESS”. If it is a SUCCESS, you place the download URL somewhere on your website, which can then be clicked to download the archived export of the event.
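For illustration, here is a minimal client-side polling sketch in Python (the requests library, the function name and the 5-second interval are our own choices, not part of the server API):

import time
import requests

def poll_export_task(task_url, interval=5):
    # keep asking the server for the task state until it finishes
    while True:
        response = requests.get(task_url).json()
        state = response.get('state')
        if state == 'SUCCESS':
            return response['result']['download_url']
        if state == 'FAILED':
            raise RuntimeError('Export task failed')
        time.sleep(interval)  # wait a fixed interval before polling again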

 


Continue ReadingExport an Event using APIs of Open Event Server

Uploading Files via APIs in the Open Event Server

There are two file upload endpoints: one for image upload and the other for all other files. The latter endpoint is used for uploading files such as slides, videos and other presentation materials for a session. So, in FOSSASIA’s Orga Server project, when we need to upload a file, we make an API request to this endpoint, which in turn uploads the file to the server and returns the URL for the uploaded file. We then store this URL in the database with the corresponding row entry.

Sending Data

The endpoint /upload/file accepts a POST request containing a multipart/form-data payload. If a single file is uploaded, it is sent under the key “file”; otherwise an array of files is sent under the key “files[]”.

A typical single file upload cURL request would look like this:

curl -H "Authorization: JWT <key>" -F file=@file.pdf -X POST http://localhost:5000/v1/upload/file

A typical multi-file upload cURL request would look something like this:

curl -H "Authorization: JWT <key>" -F "files[]=@file1.pdf" -F "files[]=@file2.pdf" -X POST http://localhost:5000/v1/upload/file

Thus, unlike other endpoints in the Open Event Orga Server project, we don’t send a JSON-encoded request; instead, it is a form-data request.
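The same requests can also be made from Python; here is a hedged sketch using the requests library (the library choice and file names are ours, while the endpoint, keys and force_local parameter are the ones described in this post):

import requests

headers = {'Authorization': 'JWT <key>'}  # replace <key> with a real token

# single file upload under the key "file"
with open('file.pdf', 'rb') as f:
    response = requests.post(
        'http://localhost:5000/v1/upload/file',
        headers=headers,
        files={'file': f}  # sent as multipart/form-data
    )
print(response.json())  # e.g. {"url": "..."}

# multiple files upload under the key "files[]",
# optionally forcing the server to store them locally
files = [('files[]', open('file1.pdf', 'rb')),
         ('files[]', open('file2.pdf', 'rb'))]
response = requests.post(
    'http://localhost:5000/v1/upload/file',
    headers=headers,
    files=files,
    params={'force_local': 'true'}
)
print(response.json())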

Saving Files

We use different services such as S3, Google Cloud Storage and so on for storing the files, depending on the admin settings of the project. One can even ask to save the files locally by passing the GET parameter force_local=true. So, in the backend we have two cases to tackle: single file upload and multiple files upload.

Single File Upload

if 'file' in request.files:
    files = request.files['file']
    file_uploaded = uploaded_file(files=files)
    if force_local == 'true':
        files_url = upload_local(
            file_uploaded,
            UPLOAD_PATHS['temp']['event'].format(uuid=uuid.uuid4())
        )
    else:
        files_url = upload(
            file_uploaded,
            UPLOAD_PATHS['temp']['event'].format(uuid=uuid.uuid4())
        )


We get the file that is to be uploaded using request.files['file'], with the key 'file' which was used in the payload. Then we use the uploaded_file() helper function to convert the file data received as payload into a proper file and store it in temporary storage. After this, if force_local is set to true, we use the upload_local helper function to upload it to the local storage, i.e. the server where the application is hosted; otherwise we use whatever service is set by the admin in the admin settings.

In the uploaded_file() function of the helpers module, we extract the filename and the extension of the file from the form-data payload. Then we check whether a suitable directory already exists. If it doesn’t, we create a new directory and then save the file in it:

extension = files.filename.split('.')[1]
filename = get_file_name() + '.' + extension
filedir = current_app.config.get('BASE_DIR') + '/static/uploads/'
if not os.path.isdir(filedir):
    os.makedirs(filedir)
file_path = filedir + filename
files.save(file_path)


After that, the upload() function reads the settings key for either S3 or Google Storage and then uses the corresponding functions to upload this temporary file to that storage.
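A minimal sketch of what that dispatch could look like (the setting names and backend helper functions are assumptions for illustration, not the server's exact implementation):

def upload(uploaded_file, key):
    # hypothetical sketch: pick a storage backend based on admin settings
    settings = get_settings()  # get_settings assumed to return the admin settings dict
    if settings.get('aws_bucket_name'):  # S3 credentials configured
        return upload_to_s3(uploaded_file, key)  # upload_to_s3 assumed
    elif settings.get('gs_bucket_name'):  # Google Storage configured
        return upload_to_gs(uploaded_file, key)  # upload_to_gs assumed
    return upload_local(uploaded_file, key)  # fall back to local storage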

Multiple File Upload

elif 'files[]' in request.files:
    files = request.files.getlist('files[]')
    files_uploaded = uploaded_file(files=files, multiple=True)
    files_url = []
    for file_uploaded in files_uploaded:
        if force_local == 'true':
            files_url.append(upload_local(
                file_uploaded,
                UPLOAD_PATHS['temp']['event'].format(uuid=uuid.uuid4())
            ))
        else:
            files_url.append(upload(
                file_uploaded,
                UPLOAD_PATHS['temp']['event'].format(uuid=uuid.uuid4())
            ))


In the case of multiple files upload, we get a list of files instead of a single file, so we retrieve them from the form data using request.files.getlist('files[]'). Here 'files' is the key that is used, and since it is an array of file content, it is written as files[]. We again use the uploaded_file() function to get back a list of temporary files from the content that has been uploaded as form-data. After that we loop over all the temporary files that are stored in the variable files_uploaded in the above code. Then, for every file in the list of temporary files, we use the upload() helper function to save these files in the storage system of the application.

In the uploaded_file() function of the helpers module, things work differently this time since multiple files and their contents are sent. We loop over all the files that are received, and for each of them we find the filename and extension. Then we create the directory to save these files in and save each file's content under the corresponding filename and extension. After a file has been saved, we append it to a list, and finally we return the entire list of uploaded files.

if multiple:
    files_uploaded = []
    for file in files:
        extension = file.filename.split('.')[1]
        filename = get_file_name() + '.' + extension
        filedir = current_app.config.get('BASE_DIR') + '/static/uploads/'
        if not os.path.isdir(filedir):
            os.makedirs(filedir)
        file_path = filedir + filename
        file.save(file_path)
        files_uploaded.append(UploadedFile(file_path, filename))


The upload() function then finally returns the URLs for the files after saving them.

API Response

The file upload endpoint returns either a single URL or a list of URLs, depending on whether one file or multiple files were uploaded. The URL for a file depends on the storage system that has been used. After the URL or list of URLs is obtained, we jsonify it so that we can send a proper JSON response that can be parsed in the frontend and used for saving the corresponding information to the database via the other API services.
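A minimal sketch of how the endpoint could build that response with Flask's jsonify (the function name is ours; files_url is the variable populated in the snippets above):

from flask import jsonify

def build_upload_response(files_url):
    # files_url is either a single URL string or a list of URL strings
    return jsonify({'url': files_url})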

A typical single file upload response looks like this:

{
     "url": "https://xyz.storage.com/asd/fgh/hjk/12332233.docx"
 }

Multiple file upload response looks like this:

{
     "url": [
         "https://xyz.storage.com/asd/fgh/hjk/12332233.docx",
         "https://xyz.storage.com/asd/fgh/hjk/66777777.ppt"
     ]
 }

You can find the related documentation and example payloads on how to use this endpoint to upload files here: http://open-event-api.herokuapp.com/#upload-file-upload.

 


Continue ReadingUploading Files via APIs in the Open Event Server