Open Event Server – Export Orders as CSV File

FOSSASIA's Open Event Server is the REST API backend for the event management platform, Open Event. Here, the event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Also, once he/she makes the event public, others can view it and buy tickets if interested.

The organizer can see all the orders in a very detailed view in the event management dashboard. He can see the statuses of all the orders. The possible statuses are completed, placed, pending, expired and canceled.

If the organizer wants to download the list of all the orders as a CSV file, he or she can do it very easily by simply clicking on Export As and then on CSV.

Let us see how this is done on the server.

Server side – generating the Orders CSV file

Here we will be using the csv module provided by Python for writing the CSV file.

import csv
  • We define a method export_orders_csv which takes the orders to be exported as a CSV file as the argument.
  • Next, we define the headers of the CSV file. It is the first row of the CSV file.
def export_orders_csv(orders):
   headers = ['Order#', 'Order Date', 'Status', 'Payment Type', 'Total Amount', 'Quantity',
              'Discount Code', 'First Name', 'Last Name', 'Email']
  • A list is defined called rows. This contains the rows of the CSV file. As mentioned earlier, headers is the first row.
rows = [headers]
  • We iterate over each order in orders and form a row for that order as a list of its column values. Here, every row is one order.
  • The newly formed row is added to the rows list.
for order in orders:
   if order.status != "deleted":
       column = [str(order.get_invoice_number()), str(order.created_at) if order.created_at else '',
                 str(order.status) if order.status else '', str(order.paid_via) if order.paid_via else '',
                 str(order.amount) if order.amount else '', str(order.get_tickets_count()),
                 str(order.discount_code.code) if order.discount_code else '',
                 str(order.user.first_name)
                 if order.user and order.user.first_name else '',
                 str(order.user.last_name)
                 if order.user and order.user.last_name else '',
                 str(order.user.email) if order.user and order.user.email else '']
       rows.append(column)
  • rows contains the contents of the CSV file and hence it is returned.
return rows
  • We iterate over each item of rows and write it to the CSV file using the methods provided by the csv module; a self-contained sketch of this step follows the snippet below.
from app.api.helpers.csv_jobs_util import export_orders_csv

content = export_orders_csv(orders)
writer = csv.writer(temp_file)
for row in content:
    writer.writerow(row)
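
For reference, here is a minimal, self-contained sketch of this step. The temp_file object in the snippet above comes from the surrounding export task; the tempfile-based helper below is only an illustration, not the server's exact implementation.

import csv
import tempfile

from app.api.helpers.csv_jobs_util import export_orders_csv

def write_orders_csv(orders):
    # Build the rows (headers + one row per order) using the helper above.
    rows = export_orders_csv(orders)
    # Write them to a named temporary file so the path can be passed on
    # to the storage/upload step of the export task.
    temp_file = tempfile.NamedTemporaryFile(mode='w', suffix='.csv',
                                            newline='', delete=False)
    with temp_file:
        writer = csv.writer(temp_file)
        for row in rows:
            writer.writerow(row)
    return temp_file.name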

Obtaining the Orders CSV file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/orders/csv

Here, event_identifier is the unique ID of the event. This endpoint starts a celery task on the server to export the orders of the event as a CSV file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of his/her Celery task. If the task completed successfully, he/she will get the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server –

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.
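
For illustration, a client could trigger the export and poll the task status roughly as follows. This is only a sketch: the base URL and the JWT authorization header are assumptions, not part of the documented API above.

import time
import requests

BASE_URL = 'https://api.eventyay.com'               # assumed server base URL
HEADERS = {'Authorization': 'JWT <access token>'}   # assumed auth header

# Start the export task for event 1.
task = requests.get(BASE_URL + '/v1/events/1/export/orders/csv', headers=HEADERS).json()
status_url = BASE_URL + task['task_url']

# Poll the task URL until the export finishes.
while True:
    status = requests.get(status_url, headers=HEADERS).json()
    if status.get('state') == 'SUCCESS':
        print('Download from:', status['result']['download_url'])
        break
    time.sleep(2)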


How api.susi.ai responds to a query and sends a response

A direct way to access raw data for a query is https://api.susi.ai. It is an API of the SUSI server which sends the response to a user's query in the form of a JSON (JavaScript Object Notation) object (more on JSON here). This JSON object is the raw form of any response to a query, which is a bunch of key (attribute) value pairs. This data is then sent from the server to various clients like chat.susi.ai, the SUSI bots, and the Android and iOS apps of SUSI.AI.

Whenever a user is using a client, for example chat.susi.ai, the user sends a query to that client. The query is then sent by the client as a request to the SUSI server. The server processes the request and sends the answer to the query in the form of JSON data as a response to the request. This response is then used by the client to display the answer after applying styling to the JSON data.
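
For example, the raw JSON for a query can be fetched directly with a plain HTTP request; the chat endpoint and parameter below are based on the public api.susi.ai API and are shown only as an illustration.

import requests

# Ask the SUSI server a question and print the raw JSON that clients
# such as chat.susi.ai style and display.
response = requests.get('https://api.susi.ai/susi/chat.json', params={'q': 'hello'})
print(response.json())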


How to integrate SUSI.AI with Google Sign In API

Google Sign-In manages the OAuth 2.0 flow and token lifecycle, simplifying our integration with Google APIs. A user always has the option to revoke access to an application at any time. Users can see a list of all the apps which they have given permission to access their account details and which are using the login API. In this blog post, I will discuss how to integrate this API with the SUSI server API to enhance our AAA system. More on the Google API here.


Implementing Tax Endpoint in Open Event Server

The Open Event Server enables organizers to manage events from concerts to conferences and meetups. It offers features for events with several tracks and venues. Event organizers may want to charge taxes on the event tickets, so the Open Event Server has a Tax endpoint to support this. This blog goes over its implementation details in the project.

Model

First up, we will discuss what fields have been stored in the database for Tax endpoint. The most important fields are as follows:

  • The tax rate charged in percentage
  • The id for the Tax
  • The registered company
  • The country
  • The address of the event organiser
  • The additional message to be included as the invoice footer

We also store a field to specify whether the tax should be included in the ticket price or not. Each Event can have only one associated Tax entry. You can check out the full model for reference here; a simplified sketch is given below.
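
The sketch below only illustrates the fields listed above; the column names and the import path are assumptions, and the actual model in the repository is the reference.

from app.models import db  # assumed import path

class Tax(db.Model):
    # Simplified sketch; column names are illustrative, not the exact schema.
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String, nullable=False)
    rate = db.Column(db.Float, nullable=False)       # tax rate charged in percentage
    tax_id = db.Column(db.String)                    # the registered tax id
    registered_company = db.Column(db.String)
    country = db.Column(db.String)
    address = db.Column(db.String)                   # address of the event organiser
    invoice_footer = db.Column(db.String)            # additional message for the invoice footer
    is_tax_included_in_price = db.Column(db.Boolean, default=False)
    event_id = db.Column(db.Integer, db.ForeignKey('events.id'))  # one tax entry per event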

Schema

We have defined two schemas for the Tax endpoint. This is because a few fields contain sensitive information and should only be shown to the event organizer or the admin, while the others can be shown to the public. Fields like name and rate aren't sensitive and can be disclosed to the public. They have been defined in the TaxSchemaPublic class. Sensitive information like the tax id, address and registered company has been included in the TaxSchema class, which inherits from the TaxSchemaPublic class. You can check out the full schema for reference here; a simplified sketch is given below.
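
A rough sketch of the two schema classes could look like the following. It uses marshmallow-jsonapi, and the Meta view names are placeholders, not the exact definitions from the project.

from marshmallow_jsonapi import fields
from marshmallow_jsonapi.flask import Schema

class TaxSchemaPublic(Schema):
    # Public fields that can be disclosed to everyone.
    class Meta:
        type_ = 'tax'
        self_view = 'v1.tax_detail'        # placeholder view name
        self_view_kwargs = {'id': '<id>'}

    id = fields.Str(dump_only=True)
    name = fields.Str(required=True)
    rate = fields.Float(required=True)

class TaxSchema(TaxSchemaPublic):
    # Sensitive fields, shown only to the organizer or the admin.
    class Meta(TaxSchemaPublic.Meta):
        pass

    tax_id = fields.Str()
    registered_company = fields.Str()
    address = fields.Str()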

Resources

The endpoint supports all the CRUD operations i.e. Create, Read, Update and Delete.

Create and Update

The Tax entry for an Event can be created using a POST request to the /taxes endpoint. We check that the posted data contains a related event id or identifier, which is necessary since every tax entry is supposed to be related to an event. We also check whether a tax entry already exists for the event, since an event should have only one tax entry. An error is raised if either check fails; otherwise the tax entry is created and saved in the database. An existing entry can be updated using the same endpoint by making a PATCH request. An illustrative create request is sketched below.
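
As a rough illustration, a create request could be made like this with Python's requests library. The attribute names, host and authorization header are assumptions; the authoritative format is defined by the schema above.

import requests

payload = {
    'data': {
        'type': 'tax',
        'attributes': {
            'name': 'VAT',
            'rate': 18.0,                         # percentage
            'is-tax-included-in-price': False     # illustrative attribute name
        },
        'relationships': {
            'event': {'data': {'type': 'event', 'id': '1'}}
        }
    }
}

response = requests.post('https://api.eventyay.com/v1/taxes',
                         json=payload,
                         headers={'Content-Type': 'application/vnd.api+json',
                                  'Authorization': 'JWT <access token>'})
print(response.status_code, response.json())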

Read

A Tax entry can be fetched using a GET request to the /taxes/{tax_id} endpoint with the id of the tax entry. The entry for an Event can also be fetched from the /events/{event_id}/tax endpoint.

Delete

An existing Tax entry can be deleted by making a DELETE request to the /taxes/{tax_id} endpoint with the id of the entry. We make sure the tax entry exists; an error is raised if it does not, otherwise we delete it from the database.


Skill Ratings Over Time

The SUSI SKill CMS provides an option to rate and review a skill. These feedbacks help the skill creators to improve the skills. Also, the ratings and reviews can be updated by the reviewer. But the CMS only provides the current rating of a skill. What if a user or a developer wants to see how that skill has performed over time? Are there any improvements in the skill or not?

For that, we need the skill ratings over time!

Server side implementation

Create a ratingsOverTime.json file to store the monthly average rating of the skills and make a JsonTray object for it in the src/ai/susi/DAO.java file. The JSON file contains, for every month, a timestamp, the average rating of the skill in that month and the total number of ratings in that month.

public static JsonTray ratingsOverTime;

Path ratingsOverTime_per = susi_skill_rating_dir.resolve("ratingsOverTime.json");
Path ratingsOverTime_vol = susi_skill_rating_dir.resolve("ratingsOverTime_session.json");
ratingsOverTime = new JsonTray(ratingsOverTime_per.toFile(), ratingsOverTime_vol.toFile(), 1000000);
OS.protectPath(ratingsOverTime_per);
OS.protectPath(ratingsOverTime_vol);

Now whenever a user rates a skill, the data in ratingsOverTime.json needs to be updated. For this, fetch the overall rating data of the current month and multiply the average rating with the total number of ratings (count) of that month.

sum = average_rating × number_of_ratings

Then add the rating given by the current user to this sum and divide by count + 1 to get the new average rating. Also increment the total number of ratings by 1.

new_sum = sum + rating_by_user

new_avg = new_sum / (count + 1)

number_of_ratings = number_of_ratings + 1

float totalRating = skillRating * ratingCount;
float newAvgRating = (totalRating + skill_stars)/(ratingCount + 1);
ratingObject.put("rating", newAvgRating);
ratingObject.put("count", ratingCount + 1);

Now we have got the ratings over time stored in the ratingsOverTime.json file. An API to access this data is also required, so create an API, GetRatingsOverTime.java, that returns the ratings over time of a particular skill. The API has the following attributes:

Endpoint : /cms/getRatingsOverTime.json

Minimum user role : anonymous

Parameters : model, group, language and skill

JSONArray skillRatings = languageName.getJSONArray(skill_name);
result.put("skill_name", skill_name);
result.put("ratings_over_time", skillRatings);
return new ServiceResponse(result);

It fetches the data corresponding to the skill from ratingsOverTime.json and returns it to the CMS.
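
For instance, the endpoint can be queried with a plain HTTP GET; the parameter values below are purely illustrative.

import requests

params = {
    'model': 'general',
    'group': 'Knowledge',
    'language': 'en',
    'skill': 'news'
}
response = requests.get('https://api.susi.ai/cms/getRatingsOverTime.json', params=params)
print(response.json().get('ratings_over_time'))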

Add this API to SusiServer.java

//Skill ratings over time
GetRatingsOverTime.class


Creating Feedback Logs for Analysis

The thumbs up and thumbs down feedback on the clients is meant for the improvement of the skills in SUSI.AI. So we need to scope the feedback system to a particular interaction rather than the skill as a whole. The feedback logs can be used for various kinds of analysis and machine learning.

Server side implementation

Components of Feedback Log:

  • User ID – For identification of a feedback given by a particular user. For consistency in data, the user should not be able to change the feedback over the same interaction.
  • Interaction:
    • User query
    • SUSI Reply
  • Client location – The response of a skill may not be interesting for the users of a particular country. That means the skill should give localised results.
  • Skill path – The path on the server where the skill is stored.

Create a feedbackLogs.json file to store the logs of feedback given by the users and make a JsonTray object for it in the src/ai/susi/DAO.java file. The JSON file contains the above-mentioned components.

public static JsonTray feedbackLogs; 

Path feedbackLogs_per = susi_skill_rating_dir.resolve("feedbackLogs.json");
Path feedbackLogs_vol = susi_skill_rating_dir.resolve("feedbackLogs_session.json");
feedbackLogs = new JsonTray(feedbackLogs_per.toFile(), feedbackLogs_vol.toFile(), 1000000);
OS.protectPath(feedbackLogs_per);
OS.protectPath(feedbackLogs_vol);

Create a FeedbackLogService.java file that acts as an API to create the feedback logs. The API accepts the feedback data from the client and stores it in the JSON file using the DAO object. The user should be logged in to give feedback on an interaction, so keep the minimum user role as USER to access the API.

JSONObject feedbackLogObject = new JSONObject();
feedbackLogObject.put("timestamp", timestamp);
feedbackLogObject.put("uuid", idvalue);
feedbackLogObject.put("feedback", skill_rate);
feedbackLogObject.put("user_query", user_query);
feedbackLogObject.put("susi_reply", susi_reply);
feedbackLogObject.put("country_name", country_name);
feedbackLogObject.put("country_code", country_code);
feedbackLogObject.put("skill_path", skill_path);

The API is accessible at /cms/feedbackLog.json endpoint.

Send feedback log from Web Client

The feedback API should be called only if the user is logged in. When the user presses one of the feedback buttons, fetch the required data for the log (access token, user query, SUSI response, country and user feedback) and POST it to the feedbackLog.json API.

let rateEndPoint =   BASE_URL + '/cms/feedbackLog.json?model=' + skill.model + '&group=' + skill.group + '&language=' + skill.language + '&skill=' + skill.skill + '&rating=' + rating + '&access_token=' + accessToken + '&user_query=' + interaction.userQuery + '&susi_reply=' + interaction.susiReply + '&country_name=' + country.countryName + '&country_code=' + country.countryCode ;

$.ajax({
  url: rateEndPoint,
  success: function(response) {
      console.log('Skill rated successfully');
  }
})


Showing only those languages for which skills are available

SUSI.AI is available for almost all the internationally recognised languages of the world. An author is allowed to create a skill in any of these languages. But there are some languages for which skills have not been created yet. So only those languages for which skills are available should be shown in the SUSI Skill CMS. The approach is that all the languages must be listed while creating a skill, but only non-empty languages must be listed while filtering skills on the CMS category page.

Updating the get languages API

  1. Add an API parameter in GetAllLanguages.java to fetch the group name. It is used to fetch the list of languages for which skills are available in that particular group. If no group is passed, it means that all the languages are to be listed. For that, we can use any group, say “Knowledge”, and show all the languages in it.

String group_name = call.get("group", null);
if (group_name == null) {
    File group = new File(model, "Knowledge");
}

 

  2. Now check if the file inside the group folder is a directory. If yes, then add it to the list of languages to be returned.

String[] languages = group.list((current, name) -> new File(current, name).isDirectory());

 

  3. If the languages corresponding to a particular category are to be fetched, first check whether the group is “All” or a specific group. Since the “All” category is not stored as such, we need to iterate over all the groups present in the parent directory, i.e. the model directory.

String[] group_names = model.list((current, name) -> new File(current, name).isDirectory());

 

  4. Now iterate over all the groups present in the group_names array and list the files present in each of them. Apply a filter to the list that accepts a file only if it is a directory and not empty, i.e. contains at least one language. Add that file to the list of languages.

group.list(new FilenameFilter() {
    @Override
    public boolean accept(File file, String s) {
        boolean accepted = new File(file, s).list().length > 1;
        if (accepted && !languages.contains(s)) {
            languages.add(s);
        }
        return accepted;
    }
});

 

  5. The processing for getting the languages of a particular group is the same; only the iteration over the model directory is not required.


Open Event Server – Export Event as a Pentabarf XML File

FOSSASIA's Open Event Server is the REST API backend for the event management platform, Open Event. Here, the event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Also, once he/she makes the event public, others can view it and buy tickets if interested.

To make event promotion easier, we also allow the event organizer to export his/her event as a Pentabarf XML file. Pentabarf XML is used to store events/conferences in a format which most of the scheduling applications can read and add that particular event/conference to the user's schedule.

Server side – generating the Pentabarf XML file

Here we will be using the pentabarf package for Python for parsing and creating the file.

from pentabarf.Conference import Conference
from pentabarf.Day import Day
from pentabarf.Event import Event
from pentabarf.Person import Person
from pentabarf.Room import Room
  • We define a class PentabarfExporter which has a static method export(event_id).
  • Query the event using the event_id passed and start forming the event in the required format:
event = EventModel.query.get(event_id)
diff = (event.ends_at - event.starts_at)

conference = Conference(title=event.name, start=event.starts_at, end=event.ends_at,
                       days=diff.days if diff.days > 0 else 1,
                       day_change="00:00", timeslot_duration="00:15",
                       venue=event.location_name)
dates = (db.session.query(cast(Session.starts_at, DATE))
        .filter_by(event_id=event_id)
        .filter_by(state='accepted')
        .filter(Session.deleted_at.is_(None))
        .order_by(asc(Session.starts_at)).distinct().all())
  • We have queried for the dates of the event and saved them in dates.
  • We will now iterate over each date and query the microlocations which have a session on that particular date.
for date in dates:
   date = date[0]
   day = Day(date=date)
   microlocation_ids = list(db.session.query(Session.microlocation_id)
                            .filter(func.date(Session.starts_at) == date)
                            .filter_by(state='accepted')
                            .filter(Session.deleted_at.is_(None))
                            .order_by(asc(Session.microlocation_id)).distinct())
  • For each microlocation thus obtained, we will query for accepted sessions to be held at those microlocations.
  • We will also initialize a Room for each microlocation.
for microlocation_id in microlocation_ids:
   microlocation_id = microlocation_id[0]
   microlocation = Microlocation.query.get(microlocation_id)
   sessions = Session.query.filter_by(microlocation_id=microlocation_id) \
       .filter(func.date(Session.starts_at) == date) \
       .filter_by(state='accepted') \
       .filter(Session.deleted_at.is_(None)) \
       .order_by(asc(Session.starts_at)).all()

   room = Room(name=microlocation.name)
  • We will now iterate over the above-obtained sessions and instantiate an Event for each session.
  • Then we will iterate over all the speakers of that session and instantiate a Person for each speaker.
  • Finally, we will add that Event to the Room we created earlier.
for session in sessions:

   session_event = Event(id=session.id,
                         date=session.starts_at,
                         start=session.starts_at,
                         duration=str(session.ends_at - session.starts_at) + "00:00",
                         track=session.track.name,
                         abstract=session.short_abstract,
                         title=session.title,
                         type='Talk',
                         description=session.long_abstract,
                         conf_url=url_for('event_detail.display_event_detail_home',
                                          identifier=event.identifier),
                         full_conf_url=url_for('event_detail.display_event_detail_home',
                                               identifier=event.identifier, _external=True),
                         released="True" if event.schedule_published_on else "False")

   for speaker in session.speakers:
       person = Person(id=speaker.id, name=speaker.name)
       session_event.add_person(person)

   room.add_event(session_event)
  • Then we will add the room to the day and then add each day to the conference.
day.add_room(room)
conference.add_day(day)
  • Finally, we will call the generate method of the conference to generate the XML file. This can be directly written to a file; a minimal usage sketch follows the snippet below.
return conference.generate("Generated by " + get_settings()['app_name'])
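
Putting it together, a minimal usage sketch might look like this; the file path is only illustrative, and the real export task also uploads the generated file and records the job.

# Illustrative usage of the exporter described above.
event_id = 1  # illustrative event id
xml_content = PentabarfExporter.export(event_id)

with open('/tmp/event_pentabarf.xml', 'w') as export_file:  # path is an assumption
    export_file.write(xml_content)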

Obtaining the Pentabarf XML file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/pentabarf

Here, event_identifier is the unique ID of the event. This endpoint starts a celery task on the server to export the event as a Pentabarf XML file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of his Celery task. If the task completed successfully, he will get the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server –

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.

Hence, now the event can be added to any scheduling app which recognizes the Pentabarf XML format.


Open Event Server – Export Event as xCalendar File

FOSSASIA's Open Event Server is the REST API backend for the event management platform, Open Event. Here, the event organizers can create their events, add tickets for them and manage all aspects from the schedule to the speakers. Also, once he/she makes the event public, others can view it and buy tickets if interested.

To make event promotion easier, we also allow the event organizer to export his/her event as an xCalendar file. xCal is an XML representation of the iCalendar standard. xCal is neither an alternative to nor the next generation of iCalendar; it represents iCalendar components, properties, and parameters as defined in iCalendar. This format was selected to ease its translation back to the iCalendar format using an XSLT transform.

Server side – generating the xCal file

Here we will be using the xml.etree.ElementTree package for Python for parsing and creating XML data.

from xml.etree.ElementTree import Element, SubElement, tostring
  • We define a class XCalExporter which has a static method export(event_id).
  • Query the event using the event_id passed and start forming the calendar:
event = Event.query.get(event_id)

tz = event.timezone or 'UTC'
tz = pytz.timezone(tz)

i_calendar_node = Element('iCalendar')
i_calendar_node.set('xmlns:xCal', 'urn:ietf:params:xml:ns:xcal')
v_calendar_node = SubElement(i_calendar_node, 'vcalendar')
version_node = SubElement(v_calendar_node, 'version')
version_node.text = '2.0'
prod_id_node = SubElement(v_calendar_node, 'prodid')
prod_id_node.text = '-//fossasia//open-event//EN'
cal_desc_node = SubElement(v_calendar_node, 'x-wr-caldesc')
cal_desc_node.text = "Schedule for sessions at " + event.name
cal_name_node = SubElement(v_calendar_node, 'x-wr-calname')
cal_name_node.text = event.name
  • We query for the accepted sessions of the event and store them in sessions.
sessions = Session.query \
   .filter_by(event_id=event_id) \
   .filter_by(state='accepted') \
   .filter(Session.deleted_at.is_(None)) \
   .order_by(asc(Session.starts_at)).all()
  • We then iterate through all the sessions in sessions.
  • If it is a valid session, we instantiate a SubElement and store required details
v_event_node = SubElement(v_calendar_node, 'vevent')

method_node = SubElement(v_event_node, 'method')
method_node.text = 'PUBLISH'

uid_node = SubElement(v_event_node, 'uid')
uid_node.text = str(session.id) + "-" + event.identifier

dtstart_node = SubElement(v_event_node, 'dtstart')
dtstart_node.text = tz.localize(session.starts_at).isoformat()

… and so on.
  • We then loop through all the speakers in that particular session and add it to the xCal calendar node object as well.
for speaker in session.speakers:
   attendee_node = SubElement(v_event_node, 'attendee')
   attendee_node.text = speaker.name
  • And finally, the string of the calendar node is returned. This is the xCalendar file contents. This can be directly written to a file.
return tostring(i_calendar_node)

Obtaining the xCal file:

Firstly, we have an API endpoint which starts the task on the server.

GET - /v1/events/{event_identifier}/export/xcal

Here, event_identifier is the unique ID of the event. This endpoint starts a celery task on the server to export the event as an xCal file. It returns the URL of the task to get the status of the export task. A sample response is as follows:

{
  "task_url": "/v1/tasks/b7ca7088-876e-4c29-a0ee-b8029a64849a"
}

The user can go to the above-returned URL and check the status of his Celery task. If the task completed successfully, he will get the download URL. The endpoint to check the status of the task is:

GET - /v1/tasks/{task_id}

and the corresponding response from the server –

{
  "result": {
    "download_url": "/v1/events/1/exports/http://localhost/static/media/exports/1/zip/OGpMM0w2RH/event1.zip"
  },
  "state": "SUCCESS"
}

The file can be downloaded from the above-mentioned URL.

Hence, now the event can be added to any scheduling app which recognizes the xCal (xcs) format.


Integrating Stripe OAuth in Open Event Frontend

Why is Stripe OAuth needed in the frontend? Open Event allows organizers to add tickets and accept payments for tickets through various modes, for example credit card, debit card, net banking and offline payments. Stripe allows users to accept payments into their linked accounts on various online platforms after they provide their client secret and publishable key. So, to enable online payments in Open Event, organizers are required to authenticate their Stripe account. This is done through Stripe OAuth.

Flow of OAuth

To allow organizers to link their Stripe account, the admin has to enable Stripe under payment gateways in the admin settings. The admin provides his client ID and secret key, and also sets the redirect URL for his app on the Stripe dashboard. After enabling these settings, the organizer will see an option to link his/her Stripe account to Open Event when creating an event with paid tickets.

Here is what Open Event Frontend does when we click the Connect to Stripe button:

  1. Opens a popup to allow the organizer to fill in his/her Stripe credentials and authorize the Open Event app to access the secret and publishable key.
  2. Once the organizer fills in his/her credentials and authorizes the Open Event app, Open Event Frontend fetches the organizer's auth code and saves it to the server.
  3. On receiving the auth code from the frontend, the server makes a request to Stripe using the auth code to retrieve the publishable key and secret key.
  4. Once these are fetched, the server saves this information against the event so that all payments for that event go to the linked Stripe account.

Implementing the Frontend portion:

  • Choosing the library:

After looking at various libraries that support OAuth for Ember applications, we decided to use Torii. Torii is a library that allows the addition of OAuth for various apps such as Facebook, Google and Stripe. It also allows writing a custom provider for OAuth in case we do not want to use the clients that Torii supports by default.

  • Implementing Stripe Provider:

The default provider for Stripe given by Torii fetches the client ID and redirect URL from the environment.js file. But since in Open Event we have already saved the admin's client ID in our database, we will extend the default Stripe provider and modify it so that it fetches the client ID from the server. The code for extending the default provider is given here:

import stripeConnect from 'torii/providers/stripe-connect';
import { alias } from '@ember/object/computed';
import { inject } from '@ember/service';
import { configurable } from 'torii/configuration';

function currentUrl() {
 let url = [window.location.protocol,
   '//',
   window.location.host].join('');
 if (url.substr(-1) !== '/') {
   url += '/';
 }
 return url;
}

export default stripeConnect.extend({

 settings: inject(),

 clientId: alias('settings.stripeClientId'),

 redirectUri: configurable('redirectUri', function() {
   return `${currentUrl()}torii/redirect.html`;
 })

});

 

We have fetched clientId from our settings service as alias('settings.stripeClientId').

We have already defined settings in our services, so we just need to inject the service here to be able to use it.

By default, Torii provides the redirect URL as {currentUrl}/torii/redirect.html. But in Open Event Frontend we allow organizers to edit information on two routes, and Torii suggests in its docs to use {baseUrl}/torii/redirect.html as the redirect URL to avoid a potential vulnerability. So we also modified the default redirect URL building method.

Saving information to server

Once we get the authorization token from Stripe, we send it to the server and save it to the stripe-authorization model. The logic for the same is given below:

connectStripe() {
     this.get('data.event.stripeAuthorization.content') ? '' : this.set('data.event.stripeAuthorization', this.store.createRecord('stripe-authorization'));
     this.get('torii').open('stripe')
       .then(authorization => {
         this.set('data.event.stripeAuthorization.stripeAuthCode', authorization.authorizationCode);
       })
       .catch(error => {
         this.get('notify').error(this.get('l10n').t(`${error.message}. Please try again`));
       });
   },

 

This action gets called when we click on the Connect to Stripe button. It calls the Stripe provider and opens a popup to enable the organizer to authenticate his/her Stripe account.
Full code for this can be seen here.

In this way, we connect the Stripe service to Open Event to allow the organizer to receive payments for his/her events.

Resources
  • Stripe: Documentation on Stripe-Connect: Link
  • Torii: Library to implement OAuth: Link
  • Implementation: Link to PR showing its implementation: Link