Open Event Server – Export Attendees as CSV File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Also, once organizers make their event public, others can view it and buy tickets if interested.

The organizer can see all the attendees in a detailed view in the event management dashboard, along with the status of each attendee. The possible statuses are completed, placed, pending, expired and canceled; the organizer can also see whether each attendee is checked in or not, and can take actions such as checking in an attendee.

If the organizer wants to download the list of all the attendees as a CSV file, he or she can do so by simply clicking on Export As and then on CSV.

Let us see how this is done on the server.

Server side – generating the Attendees CSV file

Here we will be using the csv package provided by Python for writing the CSV file.

import csv
  • We define a method export_attendees_csv which takes the attendees to be exported as a CSV file as the argument.
  • Next, we define the headers of the CSV file. It is the first row of the CSV file.
def export_attendees_csv(attendees):
   headers = ['Order#', 'Order Date', 'Status', 'First Name', 'Last Name', 'Email',
              'Country', 'Payment Type', 'Ticket Name', 'Ticket Price', 'Ticket Type']
  • A list is defined called rows. This contains the rows of the CSV file. As mentioned earlier, headers is the first row.
rows = [headers]
  • We iterate over each attendee in attendees and form a row for that attendee as a list of column values. Here, every row is one attendee.
  • The newly formed row is added to the rows list.
for attendee in attendees:
   column = [str(attendee.order.get_invoice_number()) if attendee.order else '-',
             str(attendee.order.created_at) if attendee.order and attendee.order.created_at else '-',
             str(attendee.order.status) if attendee.order and attendee.order.status else '-',
             str(attendee.firstname) if attendee.firstname else '',
             str(attendee.lastname) if attendee.lastname else '',
             str(attendee.email) if attendee.email else '',
             str(attendee.country) if attendee.country else '',
             str(attendee.order.payment_mode) if attendee.order and attendee.order.payment_mode else '',
             str(attendee.ticket.name) if attendee.ticket and attendee.ticket.name else '',
             str(attendee.ticket.price) if attendee.ticket and attendee.ticket.price else '0',
             str(attendee.ticket.type) if attendee.ticket and attendee.ticket.type else '']

   rows.append(column)
  • rows contains the contents of the CSV file and hence it is returned.
return rows
  • We iterate over each item of rows and write it to the CSV file using the methods provided by the csv package.
from app.api.helpers.csv_jobs_util import export_attendees_csv

content = export_attendees_csv(attendees)
writer = csv.writer(temp_file)  # temp_file is an already open file object for the export
for row in content:
   writer.writerow(row)
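
For context, the pieces above could be tied together end to end roughly as follows (a sketch; the temporary-file handling, file name and wrapper function are assumptions for illustration, not the server's actual export task):

import csv
import os
import tempfile

from app.api.helpers.csv_jobs_util import export_attendees_csv


def write_attendees_csv(attendees, filename='attendees.csv'):
    """Hypothetical wrapper: write the exported rows to a temporary CSV file and return its path."""
    file_path = os.path.join(tempfile.mkdtemp(), filename)
    with open(file_path, 'w') as temp_file:
        writer = csv.writer(temp_file)
        for row in export_attendees_csv(attendees):
            writer.writerow(row)
    return file_path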

Obtaining the Attendees CSV file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/attendees/csv

Here, event_identifier is the unique ID of the event. This endpoint starts a celery task on the server to export the attendees of the event as a CSV file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of his/her Celery task. If the task completed successfully, he/she will get the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.
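
As an illustration, a client could poll the task endpoint until the export finishes (a hypothetical sketch using the requests library; the base URL and polling interval are assumptions):

import time

import requests

SERVER = 'http://localhost:5000'  # assumed base URL of an Open Event Server instance


def wait_for_export(task_url, interval=2):
    """Poll the Celery task endpoint until it reports SUCCESS, then return the download URL."""
    while True:
        task = requests.get(SERVER + task_url).json()
        if task.get('state') == 'SUCCESS':
            return task['result']['download_url']
        if task.get('state') == 'FAILURE':
            raise RuntimeError('Export task failed')
        time.sleep(interval)


# Example usage with the task_url returned by the export endpoint:
# download_url = wait_for_export('/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a')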


Open Event Server – Export Orders as CSV File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Also, once organizers make their event public, others can view it and buy tickets if interested.

The organizer can see all the orders in a detailed view in the event management dashboard, along with the status of each order. The possible statuses are completed, placed, pending, expired and canceled.

If the organizer wants to download the list of all the orders as a CSV file, he or she can do so by simply clicking on Export As and then on CSV.

Let us see how this is done on the server.

Server side – generating the Orders CSV file

Here we will be using the csv package provided by Python for writing the CSV file.

import csv
  • We define a method export_orders_csv which takes the orders to be exported as a CSV file as the argument.
  • Next, we define the headers of the CSV file. It is the first row of the CSV file.
def export_orders_csv(orders):
   headers = ['Order#', 'Order Date', 'Status', 'Payment Type', 'Total Amount', 'Quantity',
              'Discount Code', 'First Name', 'Last Name', 'Email']
  • A list is defined called rows. This contains the rows of the CSV file. As mentioned earlier, headers is the first row.
rows = [headers]
  • We iterate over each order in orders and form a row for that order as a list of column values. Here, every row is one order.
  • The newly formed row is added to the rows list.
for order in orders:
   if order.status != "deleted":
       column = [str(order.get_invoice_number()), str(order.created_at) if order.created_at else '',
                 str(order.status) if order.status else '', str(order.paid_via) if order.paid_via else '',
                 str(order.amount) if order.amount else '', str(order.get_tickets_count()),
                 str(order.discount_code.code) if order.discount_code else '',
                 str(order.user.first_name)
                 if order.user and order.user.first_name else '',
                 str(order.user.last_name)
                 if order.user and order.user.last_name else '',
                 str(order.user.email) if order.user and order.user.email else '']
       rows.append(column)
  • rows contains the contents of the CSV file and hence it is returned.
return rows
  • We iterate over each item of rows and write it to the CSV file using the methods provided by the csv package.
from app.api.helpers.csv_jobs_util import export_orders_csv

content = export_orders_csv(orders)
writer = csv.writer(temp_file)  # temp_file is an already open file object for the export
for row in content:
   writer.writerow(row)

Obtaining the Orders CSV file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/orders/csv

Here, event_identifier is the unique ID of the event. This endpoint starts a celery task on the server to export the orders of the event as a CSV file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of his/her Celery task. If the task completed successfully, he/she will get the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.


Implementing Tax Endpoint in Open Event Server

The Open Event Server enables organizers to manage events from concerts to conferences and meetups. It offers features for events with several tracks and venues. Event organizers may want to charge taxes on the event tickets, so the Open Event Server provides a Tax endpoint to support this. This blog goes over its implementation details in the project.

Model

First up, we will discuss which fields are stored in the database for the Tax endpoint. The most important fields are as follows:

  • The tax rate charged in percentage
  • The id for the Tax
  • The registered company
  • The country
  • The address of the event organiser
  • The additional message to be included as the invoice footer

We also store a field to specify whether the tax should be included in the ticket price or not. Each event can have only one associated tax entry. You can check out the full model for reference here.
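
Put together, the model could look roughly like this (an illustrative sketch based on the fields listed above; the exact column names and types are assumptions, the linked model is authoritative):

from app.models import db


class Tax(db.Model):
    """Sketch of the Tax model (column names are assumptions)."""
    __tablename__ = 'taxes'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String, nullable=False)
    rate = db.Column(db.Float, nullable=False)            # tax rate charged, in percent
    tax_id = db.Column(db.String, nullable=False)         # registered tax ID
    registered_company = db.Column(db.String)
    country = db.Column(db.String)
    address = db.Column(db.String)                        # address of the event organiser
    invoice_footer = db.Column(db.String)                  # additional message for the invoice footer
    is_tax_included_in_price = db.Column(db.Boolean, default=False)
    event_id = db.Column(db.Integer, db.ForeignKey('events.id', ondelete='CASCADE'))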

Schema

We have defined two schemas for the Tax endpoint. This is because a few fields contain sensitive information and should only be shown to the event organizer or the admin, while the others can be shown to the public. Fields like name and rate aren't sensitive and can be disclosed to the public. They have been defined in the TaxSchemaPublic class. Sensitive information like the tax id, address and registered company have been included in the TaxSchema class, which inherits from the TaxSchemaPublic class. You can check out the full schema for reference here.
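
The split can be sketched as follows (an illustrative example only; see the linked schema for the actual definition):

from marshmallow_jsonapi import fields
from marshmallow_jsonapi.flask import Schema


class TaxSchemaPublic(Schema):
    """Non-sensitive fields, visible to everyone."""
    class Meta:
        type_ = 'tax'
        self_view = 'v1.tax_detail'
        self_view_kwargs = {'id': '<id>'}

    id = fields.Str(dump_only=True)
    name = fields.Str(required=True)
    rate = fields.Float(required=True)


class TaxSchema(TaxSchemaPublic):
    """Adds the sensitive fields, shown only to the event organizer or admin."""
    class Meta(TaxSchemaPublic.Meta):
        pass

    tax_id = fields.Str(required=True)
    registered_company = fields.Str(allow_none=True)
    address = fields.Str(allow_none=True)
    invoice_footer = fields.Str(allow_none=True)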

Resources

The endpoint supports all the CRUD operations i.e. Create, Read, Update and Delete.

Create and Update

The Tax entry for an event can be created using a POST request to the /taxes endpoint. We check whether the posted data contains a related event id or identifier, which is necessary since every tax entry must be related to an event. We also check whether a tax entry already exists for the event, since an event should have only one tax entry. An error is raised if either check fails; otherwise the tax entry is created and saved in the database. An existing entry can be updated using the same endpoint by making a PATCH request.
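
The checks described above can be sketched like this (the helper names, exception class and import paths are assumptions used for illustration, not the project's exact code):

from flask_rest_jsonapi import ResourceList

from app.api.helpers.exceptions import ConflictException  # assumed helper
from app.api.helpers.utilities import require_relationship  # assumed helper
from app.models.tax import Tax  # assumed import path


class TaxList(ResourceList):
    """Sketch of the list resource for /taxes (schema and data layer omitted)."""

    def before_post(self, args, kwargs, data):
        # Every tax entry must be related to an event.
        require_relationship(['event'], data)
        # An event may have at most one tax entry.
        if Tax.query.filter_by(event_id=data['event']).first():
            raise ConflictException({'pointer': '/data/relationships/event'},
                                    "Tax already exists for this event")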

Read

A Tax entry can be fetched using a GET request to the /taxes/{tax_id} endpoint with the id of the tax entry. The entry for an event can also be fetched from the /events/{event_id}/tax endpoint.

Delete

An existing Tax entry can be deleted by making a DELETE request to the /taxes/{tax_id} endpoint with the id of the entry. We make sure the tax entry exists; an error is raised if it does not, otherwise it is deleted from the database.


Open Event Server – Export Event as a Pentabarf XML File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Also, once organizers make their event public, others can view it and buy tickets if interested.

To make event promotion easier, we also allow the event organizer to export the event as a Pentabarf XML file. Pentabarf XML stores events/conferences in a format which most scheduling applications can read, so that the particular event/conference can be added to the user's schedule.

Server side – generating the Pentabarf XML file

Here we will be using the pentabarf package for Python for parsing and creating the file.

from pentabarf.Conference import Conference
from pentabarf.Day import Day
from pentabarf.Event import Event
from pentabarf.Person import Person
from pentabarf.Room import Room
  • We define a class PentabarfExporter which has a static method export(event_id).
  • Query the event using the event_id passed and start forming the event in the required format:
event = EventModel.query.get(event_id)
diff = (event.ends_at - event.starts_at)

conference = Conference(title=event.name, start=event.starts_at, end=event.ends_at,
                       days=diff.days if diff.days > 0 else 1,
                       day_change="00:00", timeslot_duration="00:15",
                       venue=event.location_name)
dates = (db.session.query(cast(Session.starts_at, DATE))
        .filter_by(event_id=event_id)
        .filter_by(state='accepted')
        .filter(Session.deleted_at.is_(None))
        .order_by(asc(Session.starts_at)).distinct().all())
  • We have queried for the dates of the event and saved them in dates.
  • We will now iterate over each date and query the microlocations that have a session on that particular date.
for date in dates:
   date = date[0]
   day = Day(date=date)
   microlocation_ids = list(db.session.query(Session.microlocation_id)
                            .filter(func.date(Session.starts_at) == date)
                            .filter_by(state='accepted')
                            .filter(Session.deleted_at.is_(None))
                            .order_by(asc(Session.microlocation_id)).distinct())
  • For each microlocation thus obtained, we will query for accepted sessions to be held at those microlocations.
  • We will also initialize a Room for each microlocation.
for microlocation_id in microlocation_ids:
   microlocation_id = microlocation_id[0]
   microlocation = Microlocation.query.get(microlocation_id)
   sessions = Session.query.filter_by(microlocation_id=microlocation_id) \
       .filter(func.date(Session.starts_at) == date) \
       .filter_by(state='accepted') \
       .filter(Session.deleted_at.is_(None)) \
       .order_by(asc(Session.starts_at)).all()

   room = Room(name=microlocation.name)
  • We will now iterate over the sessions obtained above and instantiate an Event for each session.
  • Then we will iterate over all the speakers of that session and instantiate a Person for each speaker.
  • Finally, we will add that Event to the Room we created earlier.
for session in sessions:

   session_event = Event(id=session.id,
                         date=session.starts_at,
                         start=session.starts_at,
                         duration=str(session.ends_at - session.starts_at) + "00:00",
                         track=session.track.name,
                         abstract=session.short_abstract,
                         title=session.title,
                         type='Talk',
                         description=session.long_abstract,
                         conf_url=url_for('event_detail.display_event_detail_home',
                                          identifier=event.identifier),
                         full_conf_url=url_for('event_detail.display_event_detail_home',
                                               identifier=event.identifier, _external=True),
                         released="True" if event.schedule_published_on else "False")

   for speaker in session.speakers:
       person = Person(id=speaker.id, name=speaker.name)
       session_event.add_person(person)

   room.add_event(session_event)
  • Then we will add the room to the day and then add each day to the conference.
day.add_room(room)
conference.add_day(day)
  • Finally, we will call the generate method of the conference to generate the XML file. This can be directly written to the file.
return conference.generate("Generated by " + get_settings()['app_name'])
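
Tying it together, the exporter could be used like this (a hypothetical usage sketch; the output file name is arbitrary):

# Hypothetical usage: generate the Pentabarf XML for an event and save it to a file.
xml_content = PentabarfExporter.export(event_id)
with open('event-schedule.pentabarf.xml', 'w') as xml_file:
    xml_file.write(xml_content)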

Obtaining the Pentabarf XML file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/pentabarf

Here, event_identifier is the unique ID of the event. This endpoint starts a Celery task on the server to export the event as a Pentabarf XML file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of the Celery task. If the task completed successfully, he/she will get the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.

Hence, now the event can be added to any scheduling app which recognizes the Pentabarf XML format.


Open Event Server – Export Event as xCalendar File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Also, once organizers make their event public, others can view it and buy tickets if interested.

To make event promotion easier, we also allow the event organizer to export the event as an xCalendar file. xCal is an XML representation of the iCalendar standard. xCal is neither an alternative to nor the next generation of iCalendar; it represents iCalendar components, properties, and parameters as defined in iCalendar. This format was selected to ease its translation back to the iCalendar format using an XSLT transform.

Server side – generating the xCal file

Here we will be using the xml.etree.ElementTree package for Python for parsing and creating XML data.

from xml.etree.ElementTree import Element, SubElement, tostring
  • We define a class XCalExporter which has a static method export(event_id).
  • Query the event using the event_id passed and start forming the calendar:
event = Event.query.get(event_id)

tz = event.timezone or 'UTC'
tz = pytz.timezone(tz)

i_calendar_node = Element('iCalendar')
i_calendar_node.set('xmlns:xCal', 'urn:ietf:params:xml:ns:xcal')
v_calendar_node = SubElement(i_calendar_node, 'vcalendar')
version_node = SubElement(v_calendar_node, 'version')
version_node.text = '2.0'
prod_id_node = SubElement(v_calendar_node, 'prodid')
prod_id_node.text = '-//fossasia//open-event//EN'
cal_desc_node = SubElement(v_calendar_node, 'x-wr-caldesc')
cal_desc_node.text = "Schedule for sessions at " + event.name
cal_name_node = SubElement(v_calendar_node, 'x-wr-calname')
cal_name_node.text = event.name
  • We query for the accepted sessions of the event and store them in sessions.
sessions = Session.query \
   .filter_by(event_id=event_id) \
   .filter_by(state='accepted') \
   .filter(Session.deleted_at.is_(None)) \
   .order_by(asc(Session.starts_at)).all()
  • We then iterate through all the sessions in sessions.
  • If it is a valid session, we instantiate a SubElement and store required details
v_event_node = SubElement(v_calendar_node, 'vevent')

method_node = SubElement(v_event_node, 'method')
method_node.text = 'PUBLISH'

uid_node = SubElement(v_event_node, 'uid')
uid_node.text = str(session.id) + "-" + event.identifier

dtstart_node = SubElement(v_event_node, 'dtstart')
dtstart_node.text = tz.localize(session.starts_at).isoformat()

… and so on for the remaining properties.
  • We then loop through all the speakers in that particular session and add it to the xCal calendar node object as well.
for speaker in session.speakers:
   attendee_node = SubElement(v_event_node, 'attendee')
   attendee_node.text = speaker.name
  • And finally, the string of the calendar node is returned. This is the xCalendar file contents. This can be directly written to a file.
return tostring(i_calendar_node)

Obtaining the xCal file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/xcal

Here, event_identifier is the unique ID of the event. This endpoint starts a celery task on the server to export the event as an xCal file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of the Celery task. If the task completed successfully, he/she will get the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.

Hence, now the event can be added to any scheduling app which recognizes the xCal format.


Open Event Server – Export Event as an iCalendar File

FOSSASIA's Open Event Server is the REST API backend for the event management platform Open Event. Here, event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Also, once organizers make their event public, others can view it and buy tickets if interested.

To make event promotion easier, we also allow the event organizer to export the event as an iCalendar file. Going by the Wikipedia definition, iCalendar is a computer file format which allows Internet users to send meeting requests and tasks to other Internet users by sharing or sending files in this format through various methods. The files usually have an extension of .ics. With supporting software, such as an email reader or calendar application, recipients of an iCalendar data file can respond to the sender easily or counter-propose another meeting date/time. The file format is specified in a proposed internet standard (RFC 5545) for calendar data exchange.

Server side – generating the iCal file

Here we will be using the icalendar package for Python as the file writer.

import icalendar
from icalendar import Calendar, vCalAddress, vText
  • We define a class ICalExporter which has a static method export(event_id).
  • Query the event using the event_id passed and start forming the calendar:
event = EventModel.query.get(event_id)

cal = Calendar()
cal.add('prodid', '-//fossasia//open-event//EN')
cal.add('version', '2.0')
cal.add('x-wr-calname', event.name)
cal.add('x-wr-caldesc', "Schedule for sessions at " + event.name)
  • We query for the accepted sessions of the event and store them in sessions.
sessions = Session.query \
   .filter_by(event_id=event_id) \
   .filter_by(state='accepted') \
   .filter(Session.deleted_at.is_(None)) \
   .order_by(asc(Session.starts_at)).all()
  • We then iterate through all the sessions in sessions.
  • If it is a valid session, we instantiate an icalendar event and store required details.
event_component = icalendar.Event()
event_component.add('summary', session.title)
event_component.add('uid', str(session.id) + "-" + event.identifier)
event_component.add('geo', (event.latitude, event.longitude))
# parentheses ensure the empty-string fallback applies to the microlocation name only
event_component.add('location', (session.microlocation.name or '') + " " + event.location_name)
event_component.add('dtstart', tz.localize(session.starts_at))
event_component.add('dtend', tz.localize(session.ends_at))
event_component.add('email', event.email)
event_component.add('description', session.short_abstract)
event_component.add('url', url_for('event_detail.display_event_detail_home',
                                  identifier=event.identifier, _external=True))
  • We then loop through all the speakers in that particular session and add it to the iCal Event object as well.
for speaker in session.speakers:
   # Ref: http://icalendar.readthedocs.io/en/latest/usage.html#file-structure
   # can use speaker.email below but privacy reasons
   attendee = vCalAddress('MAILTO:' + event.email if event.email else 'undefined@email.com')
   attendee.params['cn'] = vText(speaker.name)
   event_component.add('attendee', attendee)
  • This event_component is then added to the cal object that we created in the beginning.
cal.add_component(event_component)
  • And finally, the cal.to_ical() is returned. This is the iCalendar file contents. This can be directly written to a file.
return cal.to_ical()

Obtaining the iCal file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/ical

Here, event_identifier is the unique ID of the event. This endpoint starts a Celery task on the server to export the event as an iCal file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of the Celery task. If the task completed successfully, he/she will get the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server is:

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.

Hence, now the event can be added to any scheduling app which recognizes the ics format.


Adding Defaults Prior to Schema Validation Elegantly

The Open Event Server offers features for events with several tracks and venues. When we were designing the models for the API, we wanted to add default values for some fields in case they aren't provided by the client. This blog discusses how we have implemented it in the project using Python decorators, in a way that complies with the DRY principle and is easy to test and maintain.

Problem

Let's first discuss the problem at hand in detail. We use Marshmallow extensively in our project. Marshmallow is an ORM/ODM/framework-agnostic library for converting complex data types, such as objects, to and from native Python data types. We use it for validating the input data, deserializing the input data to app-level objects and serializing app-level objects to primitive Python types.

We can define schemas very easily using Marshmallow. It also provides an easy way to declare default values for the fields. Below is a sample schema declaration:

class SampleSchema(Schema):
    """
    Sample Schema declaration
    """

    class Meta:
        """
        Meta class for the Sample Schema
        """
        type_ = 'sample-schema'

    id = fields.Str(dump_only=True)
    field_without_default = fields.Str(required=True)
    field_with_default = fields.Boolean(required=True, default=False)

We have defined an id field for storing the unique ID of the model. We have also defined two other fields. One of them, named "field_with_default", is a Boolean field with a default value of False.

When a client makes a POST/PATCH request to the server, we first validate the input data sent to us by the clients against the defined schema. Marshmallow supports schema validation, but it does not use the declared default values during deserialization of the input data. This meant that whenever the input data had a missing field, the server would throw a validation error even for a field for which a default value is defined. That is clearly wrong: if a default value is defined, we would want that value to be used for the field. Otherwise it defeats the entire purpose of declaring default values in the first place.

So, we would ideally like the following behaviour from the Server:

  1. If the values are defined in the input data, use it during validation.
  2. If the value for a required field is not defined but a default value has been defined in the schema, then use that value.
  3. If no value has been defined for a required field and it doesn’t have any default value specified, then throw an error.

Solution

Marshmallow provides decorators like @pre_load and @post_load for adding pre-processing and post-processing methods. We can use them to add a method in each of the Schema classes which takes in the input data and the schema and adds default values to fields before we validate the input.

The first approach which we took was to add the following method to each of the schema classes defined in the project.

@pre_load
def patch_defaults(schema, in_data):
    data = in_data.get('data')
    if data is None or data.get('attributes') is None:
        return in_data
    attributes = data.get('attributes')
    for name, field in schema.fields.items():
        dasherized_name = dasherize(name)
        attribute = attributes.get(dasherized_name)
        if attribute is None:
            attributes[dasherized_name] = field.default
    return in_data

The method loops over all the fields defined in the schema class using schema.fields.items(). dasherize is a helper function defined in the utils class which converts underscores (_) in the field name to dashes (-). After replacing the underscores with dashes, we check if the value for the attribute is None. If it is None, then we assign it the specified default value.
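
For reference, dasherize does essentially the following (a minimal sketch, not necessarily the project's exact implementation):

def dasherize(text):
    """Convert underscores to dashes, e.g. 'field_with_default' -> 'field-with-default'."""
    return text.replace('_', '-')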

Enhancing the solution

The above solution works, but there is a problem. We have around 50 schemas defined in the project. Copy-pasting this method 50 times would definitely violate the DRY principle. Moreover, if we need to change this method in the future, we would have to do it 50 times.

One way to avoid it would be to add the patch_defaults method in a separate file and add a helper method make_object in each of the schema classes which just calls it.

@pre_load
def make_object(self, in_data):
    return patch_defaults(self, in_data)

We would still be repeating the helper method in 50 different files, but since its sole purpose is to call the patch_defaults method, we won't have to make changes in 50 files.

It certainly works well but we can go a step further and make it even easier. We can define a class decorator which would add the above make_object method to the class.

def use_defaults():
    """
    Decorator added to model classes which have default values specified for one of its fields
    Adds the make_object method defined above to the class.
    :return: wrapper
    """
    def wrapper(k, *args, **kwargs):
        setattr(k, "make_object", eval("make_object", *args, **kwargs))
        return k
    return wrapper

Now we can simply add the use_defaults() decorator on the schema class and it would work.
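
Usage then reduces to decorating the schema class (a sketch, assuming use_defaults and make_object live in an importable helper module):

@use_defaults()
class SampleSchema(Schema):
    """Sample schema with the decorator applied."""
    class Meta:
        type_ = 'sample-schema'

    id = fields.Str(dump_only=True)
    field_without_default = fields.Str(required=True)
    field_with_default = fields.Boolean(required=True, default=False)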


Open Event Server – Change a Column from NULL to NOT NULL

FOSSASIA's Open Event Server uses Alembic migration files to handle all database schema changes. Whenever the database is changed, a corresponding migration Python script is made so that the database will migrate accordingly for other developers as well. But we often forget that the automatically generated script usually just adds/deletes columns or alters the column properties. It does not handle the migration of existing data in that column. This can lead to data loss or errors during migration.

For example:

def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.alter_column('ticket_holders', 'lastname',
                    existing_type=sa.VARCHAR(),
                    nullable=False)
    # ### end Alembic commands ###

Here, the goal was to change the "lastname" column of the "ticket_holders" table from nullable to not nullable. The script that Alembic auto-generated just uses op.alter_column().

It does not account for the already existing data. So, if the column has any entries which are null, this migration will fail with an error saying that the column contains null entries and hence cannot be made "NOT NULL".

How to Handle This?

Before altering the column definition, we can follow these steps:

  1. Look for all the null entries in the column
  2. Give some arbitrary default value to those
  3. Now we can safely alter the column definition

Let’s see how we can achieve this. For connecting with the database we will use SQLAlchemy. First, we get a reference to the table and the corresponding column that we wish to alter.

ticket_holders_table = sa.sql.table('ticket_holders',
                                        sa.Column('lastname', sa.VARCHAR()))

 

Since we need the "lastname" column from the "ticket_holders" table, we specify it in the arguments to sa.sql.table().

Now, we will give an arbitrary default value to all the originally null entries in the column. In this case, I chose to use a space character.

op.execute(ticket_holders_table.update()
               .where(ticket_holders_table.c.lastname.is_(None))
               .values({'lastname': op.inline_literal(' ')}))

op.execute() can execute direct SQL commands as well, but we chose to go with SQLAlchemy, which builds the SQL command from our modular input. One example of a complex SQL command being executed directly is:

op.execute('INSERT INTO event_types(name, slug) SELECT DISTINCT event_type_id, lower(replace(regexp_replace(event_type_id, \'& |,\', \'\', \'g\'), \' \', \'-\')) FROM events where not exists (SELECT 1 FROM event_types where event_types.name=events.event_type_id) and event_type_id is not null;')

Now that we have handled all the null data, it is safe to alter the column definition. So we proceed to execute the final statement –

op.alter_column('ticket_holders', 'lastname',
                    existing_type=sa.VARCHAR(),
                    nullable=False)
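
Putting the pieces together, the upgrade step of the migration looks roughly like this (assembled from the snippets above; the imports are the usual Alembic/SQLAlchemy ones):

import sqlalchemy as sa
from alembic import op


def upgrade():
    ticket_holders_table = sa.sql.table('ticket_holders',
                                        sa.Column('lastname', sa.VARCHAR()))
    # 1. Replace all existing NULL values with an arbitrary default (a space here).
    op.execute(ticket_holders_table.update()
               .where(ticket_holders_table.c.lastname.is_(None))
               .values({'lastname': op.inline_literal(' ')}))
    # 2. Now it is safe to make the column NOT NULL.
    op.alter_column('ticket_holders', 'lastname',
                    existing_type=sa.VARCHAR(),
                    nullable=False)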

Now the entire migration script will run without any error. The final outcome would be –

  1. All the null "lastname" entries would be replaced by a space character
  2. The "lastname" column would now be a NOT NULL column.


Adding multiple email support for users on Open Event Server

The Open Event Server enables organizers to manage events from concerts to conferences and meet-ups. It offers features for events with several tracks and venues. Event managers can create invitation forms for speakers and build schedules in a drag and drop interface. The event information is stored in a database. The system provides API endpoints to fetch the data, and to modify and update it.

The Open Event Server is based on the JSON:API 1.0 specification and hence built on top of flask-rest-jsonapi (for building REST APIs) and Marshmallow (for schemas).

In this blog, we will talk about how to add support of multiple emails for a user in Open Event Server. The focus is on model and schema creation for this support.

Model Creation

For the UserEmail, we’ll make our model as follows

from app.models import db


class UserEmail(db.Model):
    """user email model class"""
    __tablename__ = 'user_emails'
    id = db.Column(db.Integer, primary_key=True)
    email = db.Column(db.String(120), unique=True, nullable=False)
    verified = db.Column(db.Boolean, default=False)
    user_id = db.Column(db.Integer, db.ForeignKey('users.id', ondelete='CASCADE'))
    user = db.relationship("User", backref="emails", foreign_keys=[user_id])

    def __init__(self, email=None, user_id=None):
        self.email = email
        self.user_id = user_id

    def __str__(self):
        return 'User:' + unicode(self.user_id).encode('utf-8') + ' email: ' + unicode(self.email).encode('utf-8')

    def __unicode__(self):
        return unicode(self.id)

Now, let’s try to understand the attributes of this model.

  1. id is the most important column, required in every model; it is the primary key and uniquely identifies a UserEmail object.
  2. email is the attribute this model exists for, hence it is unique and non-nullable.
  3. verified is used to check whether an email is verified or not (thus it is a Boolean).
  4. user_id specifies the id of the user whose email is contained in the UserEmail object.
  5. Finally, a relationship with the user of id user_id is defined; the emails associated with User.id == user_id are available through the emails backref on the User model.
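
As an illustration, attaching an additional email to an existing user would look roughly like this (a sketch; the import path and the pre-loaded user object are assumptions):

from app.models import db
from app.models.user_email import UserEmail  # import path is an assumption

# `user` is assumed to be an already-loaded User instance.
alternate = UserEmail(email='alternate@example.com', user_id=user.id)
db.session.add(alternate)
db.session.commit()

# The backref defined on the model exposes all of the user's emails.
print([ue.email for ue in user.emails])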

Schema Creation

For the model UserEmail, we’ll make our schema UserEmailSchema as follows

from marshmallow_jsonapi import fields
from marshmallow_jsonapi.flask import Schema, Relationship

from app.api.helpers.utilities import dasherize


class UserEmailSchema(Schema):
    """API Schema for user email Model"""

    class Meta:
        """Meta class for user email API schema"""
        type_ = 'user-emails'
        self_view = 'v1.user_emails_detail'
        self_view_kwargs = {'id': '<id>'}
        inflect = dasherize

    id = fields.Str(dump_only=True)
    email = fields.Email(allow_none=False)
    user_id = fields.Integer(allow_none=False)
    user = Relationship(attribute='user',
                        self_view='v1.user_email',
                        self_view_kwargs={'id': '<id>'},
                        related_view='v1.user_detail',
                        related_view_kwargs={'user_id': '<id>'},
                        schema='UserSchema',
                        type_='user')

  • Marshmallow-jsonapi provides a simple way to produce JSON API-compliant data in any Python web framework.

Now, let’s try to understand the schema UserEmailSchema

  1. id: same as in the model, id uniquely identifies a UserEmail object.
  2. email: same as in the model, email is required, thus allow_none is set to False.
  3. user_id: the id of the user whose email is contained in a UserEmailSchema object.
  4. user: a relationship that exposes the attributes of the user to whom this email belongs.

So, we saw how to add multiple email support for users on Open Event Server. We just needed to create a model and its schema to add this feature. Similarly, to add support for any database model in the project, we need to create a model and a schema with all the required attributes. The schema creation follows the guidelines of the JSON:API 1.0 specification using Marshmallow.


How Errors from Server Side are Handled On Open Event Frontend

This blog article will illustrate how the various error or status codes are handled in Open Event Frontend, and how the appropriate response is generated for those error codes. Open Event Frontend relies on Open Event Server for all server operations. Open Event Server exposes a well-documented, JSON:API-compliant REST API. Clients using the API primarily interact with it using GET, POST, PATCH and DELETE requests, and for each request the API returns the corresponding data as a response along with its status code.

For instance, whenever the app opens, all the events for the landing page are fetched by making a GET request to the endpoint v1/events. If the request is successful and events data is returned, the status code is 200, which stands for OK in the HTTP status code registry maintained by IANA.

Fig 1: Screenshot of the Google Chrome developer console's network tab while making a request.

Since Open Event Server is compliant with the JSON:API spec, to quote its official documentation, "Error objects MUST be returned as an array keyed by errors in the top level of a JSON API document." Thus whenever there is an error, or the request is unsuccessful for a variety of reasons, the server has a predefined format to convey the information to the frontend.

The process is illustrated by the reset password form on Open Event Frontend. When users forget their password, they have the option to reset it using their email address. Thus the form just takes in the email address of the user and makes a POST request to the reset-password API endpoint of the server.

Once the request is made, there are three possibilities (check references for error code significance):

  • The request is successful and a status code of 200 is returned.
  • The email address the user entered doesn't exist and no record is found in the database. A 422 status code should be returned.
  • The server is down, or the request is invalid (something unexpected has occurred). In all such scenarios an error code of 404 should be returned.

this.get('loader')
    .post('auth/reset-password', payload)
    .then(() => {
      this.set('successMessage', this.l10n.t('Please go to the link sent to your email to reset your password'));
    })
    .catch(reason => {
      if (reason && reason.hasOwnProperty('errors') && reason.errors[0].status === 422) {
        this.set('errorMessage', this.l10n.t('No account is registered with this email address.'));
      } else {
        this.set('errorMessage', this.l10n.t('An unexpected error occurred.'));
      }
    })
    .finally(() => {
      this.set('isLoading', false);
    });
Figure 2: The reset password UI

Thus, as mentioned in the JSON:API docs, the errors property is expected to contain the status code and an optional error message, which Ember handles via the catch block. The catch block is executed whenever the response from the request is not successful. The contents of the response are present in the reason property. If the status of the error is 422, the corresponding message is stored inside the errorMessage property of the component, which is further used to display the alert by rendering an error block on the forgot password form.

In case there is no error, the errorMessage is undefined, and the error block is not rendered at all. In case of any other unexpected error, the standard text is displayed by initialising the errorMessage property to it.

Resources
