Implementing Notifications in Open Event Server

In FOSSASIA’s Open Event Server project, almost all actions trigger user notifications alongside emails. So, when a new session is created or a session is accepted by the event organisers, a user notification is sent along with the email. Though displaying the notification is mainly implemented in the frontend, the content to be shown and the actions on which it is shown are strictly decided by the server project. A notification helps a user get the necessary information while staying on the platform itself, without having to check his/her email for every action performed. So unlike email, which acts as a backup of the information, a notification is more of an instant thing.

The API

The Notifications API is mostly like all other JSON API endpoints in the Open Event project. However, in the Notifications API we do not allow anyone to send a POST request. The admin of the server can send a GET request to view all the notifications in the system, while a user can only view his/her own notifications. As for PATCH, we only allow a user to edit his/her own notification to mark it as read or unread. Following is the schema for the API:

    class NotificationSchema(Schema):
        """
        API Schema for Notification Model
        """

        class Meta:
            """
            Meta class for Notification API schema
            """
            type_ = 'notification'
            self_view = 'v1.notification_detail'
            self_view_kwargs = {'id': '<id>'}
            self_view_many = 'v1.microlocation_list_post'
            inflect = dasherize

        id = fields.Str(dump_only=True)
        title = fields.Str(allow_none=True, dump_only=True)
        message = fields.Str(allow_none=True, dump_only=True)
        received_at = fields.DateTime(dump_only=True)
        accept = fields.Str(allow_none=True, dump_only=True)
        is_read = fields.Boolean()
        user = Relationship(attribute='user',
                            self_view='v1.notification_user',
                            self_view_kwargs={'id': '<id>'},
                            related_view='v1.user_detail',
                            related_view_kwargs={'notification_id': '<id>'},
                            schema='UserSchema',
                            type_='user')

The main things shown in the notification on the frontend are the title and the message. The title is the text shown without expanding the entire notification; it gives an overview of the message in case you don’t want to read the whole thing. The message provides the entire detail associated with the action performed. The user relationship stores which user the particular notification belongs to: one notification belongs to exactly one user, while one user can have multiple notifications. Another important attribute is is_read. This is the only attribute that is allowed to be changed. By default, when we make an entry in the database, is_read is set to FALSE. Once a user has read the notification, a request is sent from the frontend to change is_read to TRUE. The different actions for which we send notifications are stored in the models/notification.py file as global variables:

    USER_CHANGE_EMAIL = 'User email'
    NEW_SESSION = 'New Session Proposal'
    PASSWORD_CHANGE = 'Change Password'
    EVENT_ROLE = 'Event Role Invitation'
    TICKET_PURCHASED = 'Ticket(s) Purchased'
    TICKET_PURCHASED_ATTENDEE = 'Ticket(s) purchased to Attendee '
    EVENT_EXPORTED = 'Event Exported'
    EVENT_EXPORT_FAIL = 'Event Export Failed'
    EVENT_IMPORTED = 'Event Imported'

HTML Templates

The notification title and message that is stored in the…
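Coming back to the is_read attribute: since it is the only writable field, marking a notification as read from a client boils down to a single JSON:API PATCH request. Below is a minimal client-side sketch; the /v1/notifications/<id> path is an assumption based on the v1.notification_detail view name, and the dasherized attribute name is-read follows from the inflect = dasherize setting in the schema.

    import requests

    # Placeholder values for illustration only
    SERVER = 'http://localhost:5000'
    notification_id = 12
    headers = {
        'Authorization': 'JWT <access token>',
        'Content-Type': 'application/vnd.api+json',
    }

    # JSON:API payload; only is-read is writable, everything else is dump_only
    payload = {
        'data': {
            'type': 'notification',
            'id': str(notification_id),
            'attributes': {'is-read': True}
        }
    }

    response = requests.patch(
        '{}/v1/notifications/{}'.format(SERVER, notification_id),
        json=payload, headers=headers)
    print(response.status_code)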


Implement Email in Open Event Server

In FOSSASIA’s Open Event Server project, we send out emails when various actions are performed using the API. For example, when a new user is created, he/she receives an email welcoming him/her to the server as well as an email verification email. Users get role invites from event organisers as emails, and when someone buys a ticket he/she gets a PDF link to the ticket as an email. So all the important information that the user needs to be notified about is sent as an email to the user, and sometimes to the organizer as well. In FOSSASIA, we use Sendgrid’s API or an SMTP server, depending on the admin settings, for sending emails. You can read more about how we use Sendgrid’s API to send emails in FOSSASIA here. Now let’s dive into the modules that we have for sending the emails. The three main parts in the entire email sending are:

- Model - storing the various actions
- Templates - storing the HTML templates for the emails
- Email Functions - individual functions for the various actions

Let’s go through each of these modules one by one.

Model

    USER_REGISTER = 'User Registration'
    USER_CONFIRM = 'User Confirmation'
    USER_CHANGE_EMAIL = 'User email'
    INVITE_PAPERS = 'Invitation For Papers'
    NEXT_EVENT = 'Next Event'
    NEW_SESSION = 'New Session Proposal'
    PASSWORD_RESET = 'Reset Password'
    PASSWORD_CHANGE = 'Change Password'
    EVENT_ROLE = 'Event Role Invitation'
    SESSION_ACCEPT_REJECT = 'Session Accept or Reject'
    SESSION_SCHEDULE = 'Session Schedule Change'
    EVENT_PUBLISH = 'Event Published'
    AFTER_EVENT = 'After Event'
    USER_REGISTER_WITH_PASSWORD = 'User Registration during Payment'
    TICKET_PURCHASED = 'Ticket(s) Purchased'

In the model file, named mail.py, we first declare the various actions for which we send emails. These actions are used globally as keys in the other modules of the email sending service. Here, we define global variables with the name of the action as strings. These are all constants, which means their values remain the same throughout and never change. For example, USER_REGISTER has the value 'User Registration', which essentially means that anything related to the USER_REGISTER key is executed when the User Registration action occurs. In other words, whenever a user registers into the system by signing up or a new user is created through the API, he/she receives the corresponding emails.

Apart from this, we have the model class which defines a table in the database. We use this model class to store the actions performed while sending emails. So we store the action, the time at which the email was sent, the recipient and the sender. That way we have a record of all the emails that were sent out via our server.

    class Mail(db.Model):
        __tablename__ = 'mails'
        id = db.Column(db.Integer, primary_key=True)
        recipient = db.Column(db.String)
        time = db.Column(db.DateTime(timezone=True))
        action = db.Column(db.String)
        subject = db.Column(db.String)
        message = db.Column(db.String)

        def __init__(self, recipient=None, time=None, action=None, subject=None, message=None):
            self.recipient = recipient
            self.time = time
            if self.time is None:
                self.time = datetime.now(pytz.utc)
            self.action = action
            self.subject = subject
…
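Putting the constants and the Mail model together: whenever an email goes out, a record is kept in the mails table. The sketch below shows the kind of bookkeeping this model enables; record_mail is a hypothetical helper name, not the project’s actual function, and it assumes the Mail model, the db session and the USER_REGISTER constant shown above.

    # Hypothetical helper for illustration; assumes the Mail model, db session
    # and action constants from mail.py shown above.
    def record_mail(recipient, action, subject, message):
        mail = Mail(recipient=recipient, action=action,
                    subject=subject, message=message)  # time defaults to current UTC in __init__
        db.session.add(mail)
        db.session.commit()
        return mail

    # Example: logging the welcome email sent on user registration
    record_mail('newuser@example.com', USER_REGISTER,
                'Welcome to Open Event', '<p>Thanks for signing up!</p>')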


Create Event by Importing JSON files in Open Event Server

Apart from the usual way of creating an event in FOSSASIA’s Orga Server project by using POST requests on the Events API, another way of creating events is to import a zip file which is an archive of multiple JSON files. This way you can create a large event like FOSSASIA, with lots of data related to sessions, speakers, microlocations and sponsors, just by uploading JSON files to the system. Sample JSON files can be found in the open-event project of FOSSASIA. The basic workflow of importing an event is as follows:

- The first step is similar to uploading files to the server. We send a POST request with multipart form data containing the zipped archive of JSON files.
- The POST request starts a celery task to import the data from the JSON files and store it in the database.
- The celery task URL is returned as a response to the POST request. You can poll this celery task to get its status. If the status is FAILURE, we get the error text along with it. If the status is SUCCESS, we get the resulting event data.
- In the celery task, each JSON file is read separately and the data is stored in the database with the proper relations.
- Sending a GET request to the above mentioned celery task, after the task has been completed, returns the event id along with the event URL.

Let’s see how each of these points works in the background.

Uploading a ZIP containing JSON files

For uploading a zip archive, instead of sending JSON data in the POST request we send multipart form data. The multipart/form-data format allows an entire file to be sent as data in the POST request along with the relevant file information. One can read about the various form content types here. An example cURL request looks something like this:

    curl -H "Authorization: JWT <access token>" -X POST -F 'file=@event1.zip' http://localhost:5000/v1/events/import/json

The above cURL request uploads the file event1.zip from your current directory, with the key 'file', to the endpoint /v1/events/import/json. The user uploading the files needs to have a JWT authentication key, or in other words be logged in to the system, since that is necessary to create an event.

    @import_routes.route('/events/import/<string:source_type>', methods=['POST'])
    @jwt_required()
    def import_event(source_type):
        if source_type == 'json':
            file_path = get_file_from_request(['zip'])
        else:
            file_path = None
            abort(404)
        from helpers.tasks import import_event_task
        task = import_event_task.delay(email=current_identity.email, file=file_path,
                                       source_type=source_type, creator_id=current_identity.id)
        # create import job
        create_import_job(task.id)

        # if testing
        if current_app.config.get('CELERY_ALWAYS_EAGER'):
            TASK_RESULTS[task.id] = {
                'result': task.get(),
                'state': task.state
            }

        return jsonify(
            task_url=url_for('tasks.celery_task', task_id=task.id)
        )

After the request is received, we check whether a file exists under the 'file' key of the form data. If it is there, we save the file and get the path to the saved file. Then we send this path over to the celery task and run the task with celery’s .delay() function. After the celery task is started, the corresponding data about the import job is…
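Once the task_url comes back, a client can simply poll it until the import finishes. Below is a rough client-side sketch of such a loop; it assumes the task endpoint returns a JSON body with a 'state' field as described above, while the 'result' and 'error' keys and the relative task_url path are assumptions for illustration.

    import time
    import requests

    SERVER = 'http://localhost:5000'
    headers = {'Authorization': 'JWT <access token>'}

    # Upload the archive and grab the celery task URL from the response
    with open('event1.zip', 'rb') as archive:
        resp = requests.post(SERVER + '/v1/events/import/json',
                             headers=headers, files={'file': archive})
    task_url = resp.json()['task_url']  # assumed to be a relative path

    # Poll until the import either succeeds or fails
    while True:
        status = requests.get(SERVER + task_url, headers=headers).json()
        if status.get('state') == 'SUCCESS':
            print('Imported event:', status.get('result'))
            break
        if status.get('state') == 'FAILURE':
            print('Import failed:', status.get('error'))
            break
        time.sleep(2)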


Export an Event using APIs of Open Event Server

In FOSSASIA’s Open Event Server project, we allow the organizer, co-organizer and the admins to export all the data related to an event in the form of an archive of JSON files. This way the data can be reused elsewhere for various purposes. The basic workflow is something like this:

- Send a POST request to /events/{event_id}/export/json with a payload specifying whether you require the various media files.
- The POST request starts a celery task in the background to extract the data related to the event and convert it to JSON.
- The celery task URL is returned as a response. Sending a GET request to this URL gives the status of the task. If the status is either FAILED or SUCCESS, there is a corresponding error message or result.
- Separate JSON files for the event, speakers, sessions, microlocations, tracks, session types and custom forms are created.
- All these files are then archived and the zip is served on the endpoint /events/{event_id}/exports/{path}.
- Sending a GET request to the above mentioned endpoint downloads a zip containing all the data related to the event.

Let’s dive into each of these points one by one.

POST request (/events/{event_id}/export/json)

For making a POST request you first need JWT authentication, as with most of the other API endpoints. You need to send a payload containing the settings for whether you want the media files related to the event to be downloaded along with the JSON files. An example payload looks like this:

    {
      "image": true,
      "video": true,
      "document": true,
      "audio": true
    }

    def export_event(event_id):
        from helpers.tasks import export_event_task

        settings = EXPORT_SETTING
        settings['image'] = request.json.get('image', False)
        settings['video'] = request.json.get('video', False)
        settings['document'] = request.json.get('document', False)
        settings['audio'] = request.json.get('audio', False)
        # queue task
        task = export_event_task.delay(
            current_identity.email, event_id, settings)
        # create Job
        create_export_job(task.id, event_id)

        # in case of testing
        if current_app.config.get('CELERY_ALWAYS_EAGER'):
            # send_export_mail(event_id, task.get())
            TASK_RESULTS[task.id] = {
                'result': task.get(),
                'state': task.state
            }
        return jsonify(
            task_url=url_for('tasks.celery_task', task_id=task.id)
        )

Taking the settings about the media files and the event id, we pass them as parameters to the export event celery task and queue up the task. We then create an entry in the database with the task URL, the event id and the user who triggered the export, to keep a record of the activity. After that we return the URL for the celery task as the response to the user. If the celery task is still underway, it shows a response with 'state': 'WAITING'. Once the task is completed, the value of 'state' is either 'FAILED' or 'SUCCESS'. If it is SUCCESS, it returns the result of the task, in this case the download URL for the zip.

Celery Task to Export Event

Exporting an event is a very time consuming process and we don’t want it to get in the way of the user’s interaction with other services. So we need a queueing system that queues the tasks and executes them in the background, without disturbing the main worker from executing the other user…
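To make the archiving step above concrete, here is a short, generic sketch using Python’s json and zipfile modules. The data and file names are placeholders and the database queries are omitted; it only illustrates the idea of one JSON file per model zipped together, and is not the project’s actual task code.

    import json
    import zipfile

    # Placeholder data; in the real task these would come from database queries
    # for the event, its speakers, sessions, microlocations, tracks, etc.
    export_data = {
        'event': {'name': 'FOSSASIA Summit'},
        'speakers': [{'name': 'Jane Doe'}],
        'sessions': [{'title': 'Opening Keynote'}],
    }

    with zipfile.ZipFile('event_export.zip', 'w', zipfile.ZIP_DEFLATED) as archive:
        for name, data in export_data.items():
            # one JSON file per model, e.g. event.json, speakers.json, sessions.json
            archive.writestr(name + '.json', json.dumps(data, indent=2))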


Uploading Files via APIs in the Open Event Server

There are two file upload endpoints. One is the endpoint for image uploads and the other is for all other files being uploaded. The latter endpoint is used for uploading files such as slides, videos and other presentation materials for a session. So, in FOSSASIA’s Orga Server project, when we need to upload a file, we make an API request to this endpoint, which in turn uploads the file to the server and returns the URL for the uploaded file. We then store this URL in the database with the corresponding row entry.

Sending Data

The endpoint /upload/file accepts a POST request containing a multipart/form-data payload. If a single file is uploaded, it is sent under the key "file", else an array of files is sent under the key "files". A typical single file upload cURL request would look like this:

    curl -H "Authorization: JWT <key>" -F file=@file.pdf -X POST http://localhost:5000/v1/upload/file

A typical multi-file upload cURL request would look something like this:

    curl -H "Authorization: JWT <key>" -F files=@file1.pdf -F files=@file2.pdf -X POST http://localhost:5000/v1/upload/file

Thus, unlike other endpoints in the Open Event Orga Server project, we don’t send a JSON encoded request. Instead, it is a form data request.

Saving Files

We use different services such as S3, Google Cloud Storage and so on for storing the files, depending on the settings decided by the admin of the project. One can even ask to save the files locally by passing a GET parameter force_local=true. So, in the backend we have two cases to tackle: single file upload and multiple file upload.

Single File Upload

    if 'file' in request.files:
        files = request.files['file']
        file_uploaded = uploaded_file(files=files)
        if force_local == 'true':
            files_url = upload_local(
                file_uploaded,
                UPLOAD_PATHS['temp']['event'].format(uuid=uuid.uuid4())
            )
        else:
            files_url = upload(
                file_uploaded,
                UPLOAD_PATHS['temp']['event'].format(uuid=uuid.uuid4())
            )

We get the file that is to be uploaded using request.files['file'], with the key 'file' which was used in the payload. Then we use the uploaded_file() helper function to convert the file data received as payload into a proper file and store it in a temporary location. After this, if force_local is set to true, we use the upload_local helper function to upload it to local storage, i.e. the server where the application is hosted; otherwise we use whatever service is set by the admin in the admin settings.

In the uploaded_file() function of the helpers module, we extract the filename and the extension of the file from the form-data payload. Then we check whether the suitable directory already exists. If it doesn’t exist, we create a new directory and then save the file in it.

    extension = files.filename.split('.')[1]
    filename = get_file_name() + '.' + extension
    filedir = current_app.config.get('BASE_DIR') + '/static/uploads/'
    if not os.path.isdir(filedir):
        os.makedirs(filedir)
    file_path = filedir + filename
    files.save(file_path)

After that, the upload function gets the settings key for either S3 or Google Cloud Storage and then uses the corresponding functions to upload this temporary file to the storage.

Multiple File Upload

    elif 'files[]' in request.files:
…
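The same multipart uploads can also be made from a Python client with the requests library instead of cURL. This is only a hedged client-side sketch; the token, file names and the exact shape of the response body are placeholders based on the description above.

    import requests

    SERVER = 'http://localhost:5000'
    headers = {'Authorization': 'JWT <key>'}

    # Single file: sent under the key 'file'
    with open('slides.pdf', 'rb') as f:
        resp = requests.post(SERVER + '/v1/upload/file',
                             headers=headers, files={'file': f})
    print(resp.json())  # expected to contain the URL of the uploaded file

    # Multiple files: each sent under the key 'files'
    with open('file1.pdf', 'rb') as f1, open('file2.pdf', 'rb') as f2:
        resp = requests.post(SERVER + '/v1/upload/file', headers=headers,
                             files=[('files', f1), ('files', f2)])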


How User Event Roles relationship is handled in Open Event Server

Users and Events are the most important parts of FOSSASIA’s Open Event Server. Through the advent and upgradation of the project, the way of implementing user event roles has gone through many changes. When the Open Event Organizer Server was first decoupled to serve as an API server, the user event roles, like all other models, were served as a separate API to provide a data layer above the database for making changes to the entries. Whenever a new role invite was accepted, a POST request was made to the User Event Roles table to insert the new entry. Whenever there was a change in the role of a user for a particular event, a PATCH request was made. Permissions were set up such that a user could insert only his/her own user id and not someone else’s entry.

    def before_create_object(self, data, view_kwargs):
        """
        method to create object before post
        :param data:
        :param view_kwargs:
        :return:
        """
        if view_kwargs.get('event_id'):
            event = safe_query(self, Event, 'id', view_kwargs['event_id'], 'event_id')
            data['event_id'] = event.id
        elif view_kwargs.get('event_identifier'):
            event = safe_query(self, Event, 'identifier', view_kwargs['event_identifier'], 'event_identifier')
            data['event_id'] = event.id
        email = safe_query(self, User, 'id', data['user'], 'user_id').email
        invite = self.session.query(RoleInvite).filter_by(email=email).filter_by(role_id=data['role'])\
            .filter_by(event_id=data['event_id']).one_or_none()
        if not invite:
            raise ObjectNotFound({'parameter': 'invite'}, "Object: not found")

    def after_create_object(self, obj, data, view_kwargs):
        """
        method to create object after post
        :param data:
        :param view_kwargs:
        :return:
        """
        email = safe_query(self, User, 'id', data['user'], 'user_id').email
        invite = self.session.query(RoleInvite).filter_by(email=email).filter_by(role_id=data['role'])\
            .filter_by(event_id=data['event_id']).one_or_none()
        if invite:
            invite.status = "accepted"
            save_to_db(invite)
        else:
            raise ObjectNotFound({'parameter': 'invite'}, "Object: not found")

Initially, when a POST request was sent to the User Event Roles API endpoint, we would first check whether a role invite from the organizer existed for that particular combination of user, event and role. Only if it existed would we make an entry in the database; otherwise we would raise an “Object: not found” error. After the entry was made in the database, we would update the role_invites table to change the status of the role invite.

Later it was decided that we did not need a separate API endpoint. Since API endpoints are all user accessible and may cause problems with permissions, it was decided that user event roles would be handled entirely through the model instead of a separate API. Also, the workflow wasn’t very clear for a user. So we decided on a workflow where the role_invites table is first updated with the particular status, and after the update has been made, we make an entry in the user_event_roles table with the data that we get from the role_invites table.

When a role invite is accepted, sqlalchemy’s add() and commit() are used to insert a new entry into the table. When a role is changed for a particular user, we make a query, update the values and save it back into the table. So the entire process is handled at the data layer level rather than the API level. The code implementation is as follows:

    def before_update_object(self, role_invite, data, view_kwargs):
        """
        Method to edit object
…
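As a rough illustration of the accept-invite flow described above (this is not the project’s before_update_object, which the excerpt cuts off), the data-layer insert performed with sqlalchemy could look like the sketch below; the UsersEventsRoles model name and the helper name accept_role_invite are assumptions, while RoleInvite, its status/event_id/role_id columns and save_to_db come from the code shown above.

    # Hedged sketch of the data-layer insert done when a role invite is accepted;
    # model and helper names are assumptions for illustration only.
    def accept_role_invite(invite, user):
        invite.status = 'accepted'
        save_to_db(invite)

        # create the corresponding user-event-role entry directly in the model layer
        user_event_role = UsersEventsRoles(user_id=user.id,
                                           event_id=invite.event_id,
                                           role_id=invite.role_id)
        db.session.add(user_event_role)
        db.session.commit()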


Testing Errors and Exceptions Using Unittest in Open Event Server

Like all other helper functions in FOSSASIA’s Open Event Server, we also need to test the exception and error helper functions and classes. The error helper classes are mainly used to create error handler responses for known errors. For example, we know error 403 is Access Forbidden, but we want to send a proper source message along with a proper error message to help identify and handle the error; hence we use the error classes. To ensure that future commits do not mismatch the errors, we implemented unit tests for them. There are mainly two kinds of error classes: HTTP status errors and exceptions. Depending on the type of error we get in the try-except block for a particular API, we raise that particular exception or error.

Unit Test for Exception

Exceptions are written in this form:

    @validates_schema
    def validate_quantity(self, data):
        if 'max_order' in data and 'min_order' in data:
            if data['max_order'] < data['min_order']:
                raise UnprocessableEntity({'pointer': '/data/attributes/max-order'},
                                          "max-order should be greater than min-order")

This error is raised wherever the data sent as POST or PATCH is unprocessable. For example, this is how we raise this error:

    raise UnprocessableEntity({'pointer': '/data/attributes/min-quantity'},
                              "min-quantity should be less than max-quantity")

This exception is raised due to an error in the validation of data, where the maximum quantity should be more than the minimum quantity. To test that the above line indeed raises an exception of UnprocessableEntity with status 422, we use the assertRaises() function. Following is the code:

    def test_exceptions(self):
        # Unprocessable Entity Exception
        with self.assertRaises(UnprocessableEntity):
            raise UnprocessableEntity({'pointer': '/data/attributes/min-quantity'},
                                      "min-quantity should be less than max-quantity")

In the above code, with self.assertRaises() creates a context of exception type, so that when the next line raises an exception, it asserts that the exception it was expecting is the same as the exception raised, and hence ensures that the correct exception is being raised.

Unit Test for Error

In the error helper classes, for known HTTP status codes we return a response that is user readable and understandable. So this is how we raise an error:

    ForbiddenError({'source': ''}, 'Super admin access is required')

This is basically the 403: Access Denied error, but with the “Super admin access is required” message it becomes far more clear. However, we need to ensure that the status code returned when this error message is shown still stays 403 and isn’t modified unwantedly in the future. Here, errors and exceptions work a little differently. When we declare a custom error class, we don’t really raise that error. Instead we show that error as a response. So we can’t use the assertRaises() function. However, what we can do is compare the status code and ensure that the error raised is the same as the expected one. So we do this:

    def test_errors(self):
        with app.test_request_context():
            # Forbidden Error
            forbidden_error = ForbiddenError({'source': ''}, 'Super admin access is required')
            self.assertEqual(forbidden_error.status, 403)

            # Not Found Error
            not_found_error = NotFoundError({'source': ''}, 'Object not found.')
            self.assertEqual(not_found_error.status, 404)

Here we firstly create an…


Open Event Server: Testing Image Resize Using PIL and Unittest

FOSSASIA’s Open Event Server project uses a set of functions to resize an image from its original size to, for example, a thumbnail, an icon or a larger image. How do we test this image resizing functionality in the Open Event Server project? To test it, we need to verify that the dimensions of the resized image are the same as the dimensions we asked for. For example, in this function, we provide the URL for the image that we received, and it creates a resized image and saves the resized version.

    def create_save_resized_image(image_file, basewidth, maintain_aspect, height_size, upload_path,
                                  ext='jpg', remove_after_upload=False, resize=True):
        """
        Create and Save the resized version of the background image
        :param resize:
        :param upload_path:
        :param ext:
        :param remove_after_upload:
        :param height_size:
        :param maintain_aspect:
        :param basewidth:
        :param image_file:
        :return:
        """
        filename = '{filename}.{ext}'.format(filename=get_file_name(), ext=ext)
        image_file = cStringIO.StringIO(urllib.urlopen(image_file).read())
        im = Image.open(image_file)

        # Convert to jpeg for lower file size.
        if im.format is not 'JPEG':
            img = im.convert('RGB')
        else:
            img = im

        if resize:
            if maintain_aspect:
                width_percent = (basewidth / float(img.size[0]))
                height_size = int((float(img.size[1]) * float(width_percent)))

            img = img.resize((basewidth, height_size), PIL.Image.ANTIALIAS)

        temp_file_relative_path = 'static/media/temp/' + generate_hash(str(image_file)) + get_file_name() + '.jpg'
        temp_file_path = app.config['BASE_DIR'] + '/' + temp_file_relative_path
        dir_path = temp_file_path.rsplit('/', 1)[0]

        # create dirs if not present
        if not os.path.isdir(dir_path):
            os.makedirs(dir_path)

        img.save(temp_file_path)
        upfile = UploadedFile(file_path=temp_file_path, filename=filename)

        if remove_after_upload:
            os.remove(image_file)

        uploaded_url = upload(upfile, upload_path)
        os.remove(temp_file_path)

        return uploaded_url

In this function, we send the image URL, the width and height to be resized to, and the aspect ratio as either True or False, along with the folder to be saved to. For this blog, we are going to assume the aspect ratio is False, which means that we don’t maintain the aspect ratio while resizing. So, given the above parameters, we get the URL for the resized image that is saved.

To test whether it has been resized to the correct dimensions, we use Pillow, or as it is popularly known, PIL. So we write a separate function named getsizes() which takes the image file as a parameter. Then, using the Image module of PIL, we open the file as a JpegImageFile object. The JpegImageFile object has an attribute size which returns (width, height). So from this function we return the size attribute. Following is the code:

    def getsizes(self, file):
        # get file size *and* image size (None if not known)
        im = Image.open(file)
        return im.size

Now that we have this function, it’s time to look into the unit testing function. In the unit test we set a dummy width and height that we want to resize to, and set the aspect ratio as False as discussed above. This helps us to test that both width and height are properly resized. We are using a Creative Commons licensed image for resizing.
This is the code:

    def test_create_save_resized_image(self):
        with app.test_request_context():
            image_url_test = 'https://cdn.pixabay.com/photo/2014/09/08/17/08/hot-air-balloons-439331_960_720.jpg'
            width = 500
            height = 200
            aspect_ratio = False
            upload_path = 'test'
            resized_image_url = create_save_resized_image(image_url_test, width, aspect_ratio,
                                                          height, upload_path, ext='png')
            resized_image_file = app.config.get('BASE_DIR') + resized_image_url.split('/localhost')[1]
            resized_width, resized_height = self.getsizes(resized_image_file)

In the above code from create_save_resized_image, we receive the url…
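The excerpt trails off before the final assertions, but the check it sets up (the resized dimensions must equal the requested width and height) comes down to comparing the tuple PIL reports against the expected values, e.g. with self.assertEqual(resized_width, width) and self.assertEqual(resized_height, height) inside the test. As a standalone, hedged sketch of the same idea outside unittest (the helper name here is made up for illustration):

    from PIL import Image

    def assert_resized(image_path, expected_width, expected_height):
        # PIL reports the actual dimensions of the saved file as (width, height)
        width, height = Image.open(image_path).size
        assert (width, height) == (expected_width, expected_height), \
            'got {}x{}, expected {}x{}'.format(width, height,
                                               expected_width, expected_height)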


Creating Unit Tests for File Upload Functions in Open Event Server with Python Unittest Library

In FOSSASIA’s Open Event Server, we use the Python unittest library for unit testing various modules of the API code. The unittest library provides us with various assertion functions to assert between the actual and the expected values returned by a function or a module. In normal modules, we simply use these assertions to compare the result, since the parameters mostly take normal data types as input. However, one very important area for unit testing is file uploading. We cannot simply send a particular file or any such payload to the function to unit test it properly, since it expects request.files-like data, which is obtained only when a file is uploaded or sent as a request to an endpoint. For example, in this function:

    def uploaded_file(files, multiple=False):
        if multiple:
            files_uploaded = []
            for file in files:
                extension = file.filename.split('.')[1]
                filename = get_file_name() + '.' + extension
                filedir = current_app.config.get('BASE_DIR') + '/static/uploads/'
                if not os.path.isdir(filedir):
                    os.makedirs(filedir)
                file_path = filedir + filename
                file.save(file_path)
                files_uploaded.append(UploadedFile(file_path, filename))
        else:
            extension = files.filename.split('.')[1]
            filename = get_file_name() + '.' + extension
            filedir = current_app.config.get('BASE_DIR') + '/static/uploads/'
            if not os.path.isdir(filedir):
                os.makedirs(filedir)
            file_path = filedir + filename
            files.save(file_path)
            files_uploaded = UploadedFile(file_path, filename)
        return files_uploaded

So, we need to create a mock uploading system to replicate this behaviour. Inside the unit testing function we create an API route for this particular scope to accept a file as a request. Following is the code:

    @app.route("/test_upload", methods=['POST'])
    def upload():
        files = request.files['file']
        file_uploaded = uploaded_file(files=files)
        return jsonify(
            {'path': file_uploaded.file_path,
             'name': file_uploaded.filename})

The above code creates an app route with the endpoint test_upload. It accepts a request.files object, sends this object to the uploaded_file function (the function to be unit tested), gets the result of the function, and returns the result in JSON format. With this we have the endpoint to mock a file upload ready.

Next, we need to send a request with a file object. We cannot send normal data, which would be treated as a normal request.form; we want it to arrive in request.files. So we create two classes inheriting from other classes:

    def test_upload_single_file(self):

        class FileObj(StringIO):

            def close(self):
                pass

        class MyRequest(Request):
            def _get_file_stream(*args, **kwargs):
                return FileObj()

        app.request_class = MyRequest

The MyRequest class inherits the Request class of the Flask framework. We define the file stream of the Request class as the FileObj. Then, we set the request_class attribute of the Flask app to this new MyRequest class.

After we have it all set up, we need to send the request and see whether the uploaded file is being saved properly. For this purpose we take the help of the StringIO library. StringIO creates a file-like class which can then be used to replicate a file uploading system. So we send the data as {'file': (StringIO('1,2,3,4'), 'test_file.csv')}. We send this as data to the /test_upload endpoint that we created previously. As a result, the endpoint receives the file, saves it, and returns the filename and file_path for the stored file.

    with app.test_request_context():
        client = app.test_client()
        resp = client.post('/test_upload',…
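For reference, the general Flask test-client pattern for posting such an in-memory file is shown below as a self-contained Python 3 sketch (the original post targets Python 2, hence StringIO there). The route body is deliberately simplified and is not the project’s code; only the tuple-of-(stream, filename) convention is the point being illustrated.

    from io import BytesIO
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route('/test_upload', methods=['POST'])
    def upload():
        uploaded = request.files['file']
        return jsonify({'name': uploaded.filename, 'size': len(uploaded.read())})

    client = app.test_client()
    # A tuple of (file-like object, filename) posted as multipart/form-data is
    # treated as a file upload by Flask's test client and lands in request.files.
    resp = client.post('/test_upload',
                       data={'file': (BytesIO(b'1,2,3,4'), 'test_file.csv')},
                       content_type='multipart/form-data')
    print(resp.get_json())  # e.g. {'name': 'test_file.csv', 'size': 7}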


Rendering Open Event Server’s API-Blueprint document

After writing FOSSASIA’s Open Event Server API Blueprint document manually, we wanted to know how we could render the document, how to check it in an HTML, client-friendly format, and how to change its look as we go. For that, we found two ways of rendering it:

1) The apiary editor: This editor helps us render API Blueprints and print them in a user-readable, documented API format. When we create the API Blueprint manually, we always follow the pattern for writing an API Blueprint, i.e. the name and metadata, followed by resource groups and actions, which was already discussed in the last blog. To use the apiary editor, we start off by creating our first project. On our first use of the editor, we get a default “polls and votes” example API project. This is a template we can use as a guide, and it can be opened in the editor mode. Apiary has facilities to test an API, document an API, inspect an API or simply edit an API. We first start off by creating a project “open-event-api”. Next, in the editor mode of apiary, we add the contents of our API Blueprint document. As an example of how the Users API is rendered: if we get our request and response right, clicking List All Users gives a good 200 response in the editor, whereas going off format with the API Blueprint produces an invalid error. The final rendering, with each API’s request and response, can be seen in the document mode. This rendered doc can be viewed publicly with the link obtained in the document mode. Similarly, we test the rest of the API in the editor. This is a simple way to render your API Blueprint.

2) The aglio renderer: Since the API Blueprint is written in the .apib format, the downside is that it is not easily viewable by readers. Even though we use apiary to view the rendered docs and get a shareable link, we would also like the docs for our API server to be hosted on our own server. We use Aglio for exactly that. It is an API Blueprint renderer which supports multiple themes. It converts the .apib file into user-readable formats such as PDF, HTML, etc. Since we want to host it as a webpage, we render it as HTML. It outputs static HTML that can be served by any web host. Since API Blueprint is a Markdown-based document format, this lets us write API descriptions and documentation in a simple and straightforward way. Aglio can render the document in a three-column format. The best thing about Aglio is not…
