Implement Email in Open Event Server

In FOSSASIA’s Open Event Server project, we send out emails when various actions are performed through the API. For example, when a new user is created, they receive a welcome email as well as an email verification email. Users get role invites from event organisers as emails, and when someone buys a ticket they get a link to the ticket PDF by email. In short, all the important information that the user needs to be notified about is sent to them by email, and sometimes to the organizer as well. In FOSSASIA, we use Sendgrid’s API or an SMTP server, depending on the admin settings, for sending emails. You can read more about how we use Sendgrid’s API to send emails in FOSSASIA here.

Now let’s dive into the modules that we have for sending the emails. The three main parts of the email sending flow are:

- Model - storing the various actions
- Templates - storing the HTML templates for the emails
- Email Functions - individual functions for the various actions

Let’s go through each of these modules one by one.

Model

USER_REGISTER = 'User Registration'
USER_CONFIRM = 'User Confirmation'
USER_CHANGE_EMAIL = "User email"
INVITE_PAPERS = 'Invitation For Papers'
NEXT_EVENT = 'Next Event'
NEW_SESSION = 'New Session Proposal'
PASSWORD_RESET = 'Reset Password'
PASSWORD_CHANGE = 'Change Password'
EVENT_ROLE = 'Event Role Invitation'
SESSION_ACCEPT_REJECT = 'Session Accept or Reject'
SESSION_SCHEDULE = 'Session Schedule Change'
EVENT_PUBLISH = 'Event Published'
AFTER_EVENT = 'After Event'
USER_REGISTER_WITH_PASSWORD = 'User Registration during Payment'
TICKET_PURCHASED = 'Ticket(s) Purchased'

In the model file, named mail.py, we first declare the various actions for which we send emails out. These actions are used globally as keys in the other modules of the email sending service. Here, we define global variables holding the name of each action as a string. These are all constants, meaning their values never change. For example, USER_REGISTER has the value 'User Registration', which essentially means that anything related to the USER_REGISTER key is executed when the User Registration action occurs. In other words, whenever a user registers in the system by signing up or by being created through the API, they receive the corresponding emails.

Apart from this, we have the model class which defines a table in the database. We use this model class to store a record of the emails sent: the action, the time at which the email was sent, the recipient and the sender. That way we have a record of all the emails that were sent out via our server.

class Mail(db.Model):
    __tablename__ = 'mails'
    id = db.Column(db.Integer, primary_key=True)
    recipient = db.Column(db.String)
    time = db.Column(db.DateTime(timezone=True))
    action = db.Column(db.String)
    subject = db.Column(db.String)
    message = db.Column(db.String)

    def __init__(self, recipient=None, time=None, action=None, subject=None, message=None):
        self.recipient = recipient
        self.time = time
        if self.time is None:
            self.time = datetime.now(pytz.utc)
        self.action = action
        self.subject = subject…
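As a quick illustration of how the constants and the Mail model above fit together, here is a minimal, hypothetical sketch of an email helper that dispatches a message and then records it in the mails table. The send_email name, its signature and the dispatch step are assumptions for illustration, not the project’s actual function.

# Hypothetical sketch only: record every outgoing email in the mails table.
# Mail, db and USER_REGISTER come from the model file described above;
# the dispatch step (Sendgrid API or SMTP) is omitted.
from datetime import datetime
import pytz

def send_email(recipient, action, subject, message):
    # ... dispatch via Sendgrid's API or an SMTP server, per admin settings ...
    mail = Mail(recipient=recipient,
                time=datetime.now(pytz.utc),
                action=action,
                subject=subject,
                message=message)
    db.session.add(mail)
    db.session.commit()

# e.g. when the User Registration action occurs:
# send_email('new.user@example.com', USER_REGISTER,
#            'Welcome to Open Event', '<p>Please verify your email.</p>')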


Create Event by Importing JSON files in Open Event Server

Apart from the usual way of creating an event in FOSSASIA’s Orga Server project by using POST requests on the Events API, another way of creating events is importing a zip file which is an archive of multiple JSON files. This way you can create a large event like FOSSASIA, with lots of data related to sessions, speakers, microlocations and sponsors, just by uploading JSON files to the system. Sample JSON files can be found in the open-event project of FOSSASIA.

The basic workflow of importing an event is as follows:

- The first step is similar to uploading files to the server: we send a POST request with multipart form data containing the zipped archive of JSON files.
- The POST request starts a celery task that imports the data from the JSON files and stores it in the database.
- The celery task URL is returned as a response to the POST request. You can use this URL for polling to get the status. If the status is FAILURE, we get the error text along with it; if the status is SUCCESS, we get the resulting event data.
- In the celery task, each JSON file is read separately and the data is stored in the database with the proper relations.
- Sending a GET request to the above mentioned celery task URL, after the task has been completed, returns the event id along with the event URL.

Let’s see how each of these points works in the background.

Uploading ZIP containing JSON Files

For uploading a zip archive, instead of sending JSON data in the POST request we send multipart form data. The multipart/form-data format allows an entire file to be sent as data in the POST request along with the relevant file information. You can read about the various form content types here. An example cURL request looks something like this:

curl -H "Authorization: JWT <access token>" -X POST -F 'file=@event1.zip' http://localhost:5000/v1/events/import/json

The above cURL request uploads the file event1.zip from your current directory, under the key 'file', to the endpoint /v1/events/import/json. The user uploading the file needs to have a JWT authentication key, or in other words be logged in to the system, since that is necessary to create an event.

@import_routes.route('/events/import/<string:source_type>', methods=['POST'])
@jwt_required()
def import_event(source_type):
    if source_type == 'json':
        file_path = get_file_from_request(['zip'])
    else:
        file_path = None
        abort(404)
    from helpers.tasks import import_event_task
    task = import_event_task.delay(email=current_identity.email, file=file_path,
                                   source_type=source_type, creator_id=current_identity.id)
    # create import job
    create_import_job(task.id)
    # if testing
    if current_app.config.get('CELERY_ALWAYS_EAGER'):
        TASK_RESULTS[task.id] = {
            'result': task.get(),
            'state': task.state
        }
    return jsonify(
        task_url=url_for('tasks.celery_task', task_id=task.id)
    )

After the request is received, we check whether a file exists under the key 'file' of the form data. If it is there, we save the file and get the path to the saved file. Then we send this path over to the celery task and run the task with celery’s .delay() function. After the celery task is started, the corresponding data about the import job is…
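For completeness, here is a small client-side sketch of this workflow, assuming the python-requests library, a placeholder JWT token and the response fields described above; the exact response shape of the real API may differ.

# Hypothetical sketch: upload the archive and poll the returned task URL.
# The host, token and response field names are assumptions for illustration.
import time
import requests

BASE = 'http://localhost:5000'
headers = {'Authorization': 'JWT <access token>'}

with open('event1.zip', 'rb') as archive:
    resp = requests.post(BASE + '/v1/events/import/json',
                         headers=headers, files={'file': archive})
task_url = resp.json()['task_url']

# poll the celery task URL until it reports SUCCESS or FAILURE
while True:
    status = requests.get(BASE + task_url, headers=headers).json()
    if status.get('state') in ('SUCCESS', 'FAILURE'):
        print(status)  # event id and URL on success, error text on failure
        break
    time.sleep(2)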


Export an Event using APIs of Open Event Server

In FOSSASIA’s Open Event Server project, we allow the organizer, co-organizer and the admins to export all the data related to an event in the form of an archive of JSON files. This way the data can be reused elsewhere for various purposes.

The basic workflow is something like this:

- Send a POST request to /events/{event_id}/export/json with a payload specifying whether you require the various media files.
- The POST request starts a celery task in the background that extracts the data related to the event and serializes it as JSON.
- The celery task URL is returned as a response. Sending a GET request to this URL gives the status of the task. If the status is either FAILED or SUCCESS, there is the corresponding error message or result.
- Separate JSON files are created for the event, speakers, sessions, micro-locations, tracks, session types and custom forms.
- All these files are then archived, and the zip is served at the endpoint /events/{event_id}/exports/{path}.
- Sending a GET request to the above mentioned endpoint downloads a zip containing all the data related to the event.

Let’s dive into each of these points one by one.

POST request (/events/{event_id}/export/json)

For making a POST request you first need JWT authentication, as with most of the other API endpoints. You need to send a payload containing the settings for whether you want the media files related to the event to be downloaded along with the JSON files. An example payload looks like this:

{
  "image": true,
  "video": true,
  "document": true,
  "audio": true
}

def export_event(event_id):
    from helpers.tasks import export_event_task

    settings = EXPORT_SETTING
    settings['image'] = request.json.get('image', False)
    settings['video'] = request.json.get('video', False)
    settings['document'] = request.json.get('document', False)
    settings['audio'] = request.json.get('audio', False)
    # queue task
    task = export_event_task.delay(
        current_identity.email, event_id, settings)
    # create Job
    create_export_job(task.id, event_id)
    # in case of testing
    if current_app.config.get('CELERY_ALWAYS_EAGER'):
        # send_export_mail(event_id, task.get())
        TASK_RESULTS[task.id] = {
            'result': task.get(),
            'state': task.state
        }
    return jsonify(
        task_url=url_for('tasks.celery_task', task_id=task.id)
    )

Taking the settings for the media files and the event id, we pass them as parameters to the export event celery task and queue up the task. We then create an entry in the database with the task URL, the event id and the user who triggered the export, to keep a record of the activity. After that we return the URL for the celery task as the response to the user. While the celery task is still underway, it shows a response with 'state': 'WAITING'. Once the task is completed, the value of 'state' is either 'FAILED' or 'SUCCESS'. If it is SUCCESS, it returns the result of the task, in this case the download URL for the zip.

Celery Task to Export Event

Exporting an event is a very time consuming process and we don’t want this process to come in the way of user interaction with other services. So we needed a queueing system that would queue the tasks and execute them in the background without disturbing the main worker from executing the other user…
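Putting these pieces together, a small client-side sketch of triggering an export and fetching the resulting archive might look like the following; the /v1 prefix, the event id and the result field names are assumptions for illustration and may differ from the real API.

# Hypothetical sketch: request an export, wait for the task, download the zip.
# Host, token, URL prefix and result field names are illustrative assumptions.
import requests

BASE = 'http://localhost:5000'
headers = {'Authorization': 'JWT <access token>'}
payload = {"image": True, "video": True, "document": True, "audio": True}

resp = requests.post(BASE + '/v1/events/1/export/json', json=payload, headers=headers)
task_url = resp.json()['task_url']

# ...poll task_url as in the import example until 'state' becomes SUCCESS...
status = requests.get(BASE + task_url, headers=headers).json()
if status.get('state') == 'SUCCESS':
    download_url = status['result']['download_url']  # assumed shape of the result
    archive = requests.get(BASE + download_url, headers=headers)
    with open('event_export.zip', 'wb') as f:
        f.write(archive.content)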


DetachedInstanceError: Dealing with Celery, Flask’s app context and SQLAlchemy in the Open Event Server

In the Open Event Server project, we had chosen to go with celery for async background tasks. From the official website:

What is celery? Celery is an asynchronous task queue/job queue based on distributed message passing.

What are tasks? The execution units, called tasks, are executed concurrently on one or more worker servers using multiprocessing.

After the tasks had been set up, an error constantly came up whenever a task was called. The error was:

DetachedInstanceError: Instance <User at 0x7f358a4e9550> is not bound to a Session; attribute refresh operation cannot proceed

The above error usually occurs when you try to access an object bound to a session after that session has been closed. It may have been closed by an explicit session.close() call or after committing the session with session.commit(). The celery tasks in question were performing some database operations, so the first thought was that maybe these operations were causing the error. To test this theory, the celery task was changed to:

@celery.task(name='lorem.ipsum')
def lorem_ipsum():
    pass

But sadly, the error still remained. This proved that the celery task was fine and that the session was being closed whenever the celery task was called. The method in which the celery task was being called was of the following form:

def restore_session(session_id):
    session = DataGetter.get_session(session_id)
    session.deleted_at = None
    lorem_ipsum.delay()
    save_to_db(session, "Session restored from Trash")
    update_version(session.event_id, False, 'sessions_ver')

In our app, the app context was not being passed whenever a celery task was initiated. Thus the celery task, whenever called, closed the previous app context, eventually closing the session along with it. The solution to this error is to follow the pattern suggested at http://flask.pocoo.org/docs/0.12/patterns/celery/:

def make_celery(app):
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    task_base = celery.Task

    class ContextTask(task_base):
        abstract = True

        def __call__(self, *args, **kwargs):
            if current_app.config['TESTING']:
                with app.test_request_context():
                    return task_base.__call__(self, *args, **kwargs)
            with app.app_context():
                return task_base.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery

celery = make_celery(current_app)

The __call__ method ensures that the celery task is provided with a proper app context to work with.
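With this pattern in place, a task body runs inside app.app_context(), so it can touch the database without hitting a DetachedInstanceError. A minimal sketch, assuming a hypothetical task, a User model and a notify helper that are not part of the actual project code:

# Hypothetical sketch: a task defined against the ContextTask base above.
# The task name, the User model and the notify helper are illustrative assumptions.
@celery.task(name='user.send.welcome')
def send_welcome_email(user_id):
    # re-fetch the object inside the task instead of passing an ORM instance around
    user = User.query.get(user_id)
    if user is not None:
        notify(user.email, 'Welcome to Open Event!')

# called from a request handler:
# send_welcome_email.delay(current_identity.id)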


Setting up Celery with Flask

In this article, I will explain how to use Celery with a Flask application. Celery requires a broker to run, and one of the most popular brokers is Redis. So to start using Celery with Flask, first we have to set up the Redis broker.

Redis can be downloaded from its site, http://redis.io. I wrote a script that simplifies downloading, building and running the redis server:

#!/bin/bash
# This script downloads and runs redis-server.
# If redis has already been downloaded, it just runs it
if [ ! -d redis-3.2.1/src ]; then
    wget http://download.redis.io/releases/redis-3.2.1.tar.gz
    tar xzf redis-3.2.1.tar.gz
    rm redis-3.2.1.tar.gz
    cd redis-3.2.1
    make
else
    cd redis-3.2.1
fi
src/redis-server

When the above script is run for the first time, the redis folder doesn’t exist, so the script downloads Redis, builds it and then runs the server. In subsequent runs it skips the downloading and building part and just runs the server.

Now that the redis server is running, we have to install its Python counterpart:

pip install redis

After the redis broker is set up, it’s time to set up the celery extension. First install celery with pip install celery. Then we need to set up celery in the flask app definition:

# in app.py
def make_celery(app):
    # set redis url vars
    app.config['CELERY_BROKER_URL'] = environ.get('REDIS_URL', 'redis://localhost:6379/0')
    app.config['CELERY_RESULT_BACKEND'] = app.config['CELERY_BROKER_URL']
    # create context tasks in celery
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery

celery = make_celery(current_app)

Now that Celery is set up in our project, let’s define a sample task:

@app.route('/task')
def view():
    background_task.delay(*args, **kwargs)
    return 'OK'

@celery.task
def background_task(*args, **kwargs):
    # code
    # more code
    pass

Now, to run the celery workers, execute:

celery worker -A app.celery

That should be all. To run our little project, we can execute the following script:

bash run_redis.sh &  # to run redis
celery worker -A app.celery &  # to run celery workers
python app.py

If you are wondering how to run the same on Heroku, just use the free heroku-redis add-on. It will start the redis server on Heroku. Then, to run the workers and the app, set the Procfile as:

web: sh heroku.sh

and heroku.sh as:

#!/bin/bash
celery worker -A app.celery &
gunicorn app:app

That’s a basic guide on how to run a Flask app with Celery and Redis. If you want more information on this topic, please see my post Ideas on Using Celery in Flask for background tasks.
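Because CELERY_RESULT_BACKEND is configured in the setup above, you can also look a task up later by its id and inspect its state and result. A small sketch, assuming the make_celery setup above and a hypothetical add task:

# Hypothetical sketch: queueing a task and checking on it later by id,
# assuming make_celery() above with Redis as both broker and result backend.
@celery.task
def add(x, y):
    return x + y

@app.route('/add')
def add_view():
    result = add.delay(2, 3)  # queue the task, get an AsyncResult handle back
    return result.id          # hand the id to the client for later polling

@app.route('/add/<task_id>')
def add_status(task_id):
    result = add.AsyncResult(task_id)  # look the task up again by id
    if result.ready():
        return str(result.get())       # '5' once the worker has finished
    return result.state                # e.g. 'PENDING' while still queued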


Ideas on using Celery with Flask for background tasks

Simply put, Celery is a background task runner. It can run time-intensive tasks in the background so that your application can focus on the stuff that matters the most. In the context of a Flask application, the stuff that matters the most is listening to HTTP requests and returning responses. By default, Flask runs on a single thread. Now if a request takes several seconds to execute, it blocks all other incoming requests, which makes for a very bad experience for the user of the product. So here we can use Celery to move the time-hogging part of that request to the background.

I would like to point out that by “background”, Celery means another process. Celery starts worker processes for the running application and these workers receive work from the main application. Celery requires a broker to be used. The broker is nothing but a database that stores the results of a celery task and provides a shared interface between the main process and the worker processes. The output of the work done by the workers is stored in the broker, and the main application can then access these results from it.

Using Celery to set up background tasks in your application is as simple as follows:

@celery.task
def background_task(*args, **kwargs):
    # do stuff
    # more stuff
    pass

Now the function background_task can be run as a background task. To execute it as one, run:

task = background_task.delay(*args, **kwargs)
print task.state  # task's current state (PENDING, SUCCESS, FAILURE)

So far this may look nice and easy, but it can cause lots of problems. This is because the background tasks run in different processes than the main application, so the state of the worker application differs from that of the real application. One common problem because of this is the lack of request context. Since a celery task runs in a different process, the request context is not available; therefore the request headers, cookies and everything else are not available when the task actually runs. I too faced this problem and solved it using an excellent snippet I found on the Internet.

"""
Celery task wrapper to set request context vars and global
vars when a task is executed
Based on http://xion.io/post/code/celery-include-flask-request-context.html
"""
from celery import Task
from flask import has_request_context, make_response, request, g

from app import app  # the flask app

__all__ = ['RequestContextTask']


class RequestContextTask(Task):
    """Base class for tasks that originate from Flask request handlers
    and carry over most of the request context data.

    This has an advantage of being able to access all the usual information
    that the HTTP request has and use them within the task. Potential use
    cases include e.g. formatting URLs for external use in emails sent by tasks.
    """
    abstract = True

    #: Name of the additional parameter passed to tasks
    #: that contains information about the original Flask request context.
    CONTEXT_ARG_NAME = '_flask_request_context'
    GLOBALS_ARG_NAME = '_flask_global_proxy'
    GLOBAL_KEYS = ['user']

    def __call__(self, *args, **kwargs):
        """Execute task code with given…
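To illustrate what the snippet enables, here is a hypothetical sketch of a task that uses RequestContextTask as its base and reads data from the HTTP request that queued it; the task body and names are illustrative assumptions rather than code from the snippet itself.

# Hypothetical sketch: a task based on RequestContextTask can read data from the
# original HTTP request (host, headers, cookies) that was carried over for it.
from flask import request

@celery.task(base=RequestContextTask)
def send_confirmation_email(user_email):
    # the request context was captured when the handler queued this task,
    # so the original host is available for building absolute links
    confirm_link = request.url_root + 'confirm'
    print('Would email %s a link to %s' % (user_email, confirm_link))

# inside a Flask request handler:
# send_confirmation_email.delay('someone@example.com')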
