Setting up Codecov in Badgeyay

 

BadgeYaY already has Travis CI and Codacy to check code quality on every Pull Request, but there was no support for measuring code coverage in the repository against every Pull Request. So I decided to set up Codecov to track code coverage.

In this blog post, I’ll be discussing how I set up Codecov in BadgeYaY in my Pull Request.

First, let’s understand what Codecov is and why we need it. For that, we first have to understand what code coverage is, and then we will move on to adding Codecov with the help of Travis CI.

Let’s get started and understand it step by step.

What is Code Coverage?

Code coverage is a measurement used to express which lines of code were executed by a test suite. We use three primary terms to describe the execution status of each line.

  • hit indicates that the source code was executed by the test suite.
  • partial indicates that the source code was not fully executed by the test suite; there are remaining branches that were not executed.
  • miss indicates that the source code was not executed by the test suite.

Coverage is the ratio of hit / (hit + partial + miss). A code base that has 5 lines executed by tests out of 12 total lines will receive a coverage ratio of 41% (the value is rounded down). In BadgeYaY, code coverage is 100%.
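To make the arithmetic concrete, here is a tiny Python sketch of that formula (just an illustration of the ratio above, not Codecov’s actual implementation):

def coverage_ratio(hit, partial, miss):
    # fully executed lines divided by all relevant lines
    return hit / (hit + partial + miss)

# 5 executed lines out of 12 relevant lines, rounded down -> 41
print(int(coverage_ratio(hit=5, partial=0, miss=7) * 100))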

How does Codecov help with Code Coverage?

Codecov focuses on integration and on promoting healthy pull requests. Codecov delivers coverage metrics directly into the modern workflow to promote more code coverage, especially in pull requests where new features and bug fixes commonly occur.

Codecov provides a number of useful features for tracking coverage on pull requests.

We can change how Codecov processes reports and expresses coverage information through its configuration. Let’s see how we configure it for BadgeYaY by integrating it with Travis CI.

Generally, Codecov works well with Travis CI. With the one line

 bash <(curl -s https://codecov.io/bash)

 

the code coverage can now be easily reported.

Add a script for testing:

"scripts": {
   - nosetests app/tests/test.py -v --with-coverage
}

Here is a particular example of .travis.yml from the project repository of BadgeYaY:

script:
- python app/main.py >> log.txt 2>&1  &
- nosetests app/tests/test.py -v --with-coverage
- python3 -m pyflakes

after_success:
- bash <(curl -s https://codecov.io/bash)

 

Let’s have a look at codecov.yml to check the exact configuration that I have used for BadgeYaY.

codecov:
  # yes: will delay sending notifications until all ci is finished
  notify:
    require_ci_to_pass: yes

coverage:
  # how many decimal places to display in the UI: 0 <= value <= 4
  precision: 2
  # how coverage is rounded: down/up/nearest
  round: down
  # custom range of coverage colors from red -> yellow -> green
  range: "70...100"

  status:
    # measuring the overall project coverage
    project: yes
    # pull requests only: this commit status will measure the
    # entire pull request's coverage diff, checking if the lines
    # adjusted are covered at least X%.
    patch: yes
    # if there are any unexpected changes in coverage
    changes: no

comment:
  layout: "reach, diff, flags, files, footer"
  behavior: default
  require_changes: no

 

Now when anyone makes a Pull Request to BadgeYaY, Codecov will analyze the Pull Request according to the above configuration and generate a report showing the code coverage of that Pull Request.

 

Below is a screenshot of all tests passing in the BadgeYaY repository.

This is how we set up Codecov in the BadgeYaY repository. It can be set up in other repositories in a similar way.

The related PR for this work is https://github.com/fossasia/badgeyay/pull/400.

Resources:

  • Codecov Documentation

Setting up Codecov in Susper repository hosted on Github

In this blog post, I’ll be discussing how we set up Codecov in Susper.

  • What is Codecov and in which FOSSASIA projects is it being used?

Codecov is a popular code coverage tool. It can be easily integrated with services like Travis CI. Codecov also provides more features with services like Docker.

FOSSASIA projects like Open Event Orga Server, Loklak search, and Open Event Web App use Codecov. Recently, the code coverage tool has also been configured in the Susper project.

  • How did we set up Codecov in our project repository hosted on Github?

The simplest way to set up Codecov in a project repository is by installing codecov.io using the terminal command:

npm install --save-dev codecov.io

Susper is built on Angular 2 (we have recently upgraded it to Angular v4.1.3). Angular comes with Karma and Jasmine for testing purposes. There are many FOSSASIA repositories in which Codecov has been configured like this, but with Angular the case is a little bit tricky. So, using only:

bash <(curl -s https://codecov.io/bash)

won’t generate code coverage because of the presence of Karma and Jasmine. It requires two packages: istanbul as the coverage reporter and jasmine as the HTML reporter. I have discussed them below.

Install these two packages:

  • karma-coverage-istanbul-reporter
    npm install karma-coverage-istanbul-reporter --save-dev
  • karma-jasmine-html-reporter
    npm install karma-jasmine-html-reporter --save-dev

After installing codecov.io, the package.json will be updated as follows:

  • "devDependencies": {
      "codecov": "^2.2.0",
      "karma-coverage-istanbul-reporter": "^1.3.0",
      "karma-jasmine-html-reporter": "^0.2.2",
    }

    Add a script for testing:

  • "scripts": {
       "test": "ng test --single-run --code-coverage --reporters=coverage-istanbul"
    }

Now generally, Codecov works well with Travis CI. With the one line bash <(curl -s https://codecov.io/bash) the code coverage can now be easily reported.

Here is a particular example of .travis.yml from the project repository of Susper:

script:
 - ng test --single-run --code-coverage --reporters=coverage-istanbul
 - ng lint
 
after_success:
 - bash <(curl -s https://codecov.io/bash)
 - bash ./deploy.sh

Update karma.conf.js as well:

module.exports = function (config) {
  config.set({
    plugins: [
      require('karma-jasmine-html-reporter'),
      require('karma-coverage-istanbul-reporter')
    ],
    preprocessors: {
      'src/app/**/*.js': ['coverage']
    },
    client: {
      clearContext: false
    },
    coverageIstanbulReporter: {
      reports: ['html', 'lcovonly'],
      fixWebpackSourcePaths: true
    },
    reporters: config.angularCli && config.angularCli.codeCoverage
      ? ['progress', 'coverage-istanbul']
      : ['progress', 'kjhtml'],
  })
}

This karma.conf.js is an example from the Susper project. Find out more here: https://github.com/fossasia/susper.com/pull/420
This is how we set up Codecov in the Susper repository. It can be set up in a similar way in other repositories that use Angular 2 or 4 as their tech stack.


Code Quality in the knittingpattern Python Library

In our Google Summer of Code project, a part of our work is to bring knitting into the digital age. We are Kirstin Heidler and Nicco Kunzmann. Our knittingpattern library aims at being the exchange and conversion format between different types of knit work representations: hand knitting instructions, machine commands for different machines, and SVG schemata.

The generated schema from the knittingpattern library.
The original pattern schema Cafe.

The image above was generated by this Python code:

import knittingpattern, webbrowser
example = knittingpattern.load_from().example("Cafe.json")  # load the bundled example pattern
webbrowser.open(example.to_svg(25).temporary_path(".svg"))  # render it to an SVG file and open it in the browser

So much for the context. Now about the quality tools we use:


Continuous integration

We use Travis CI [FOSSASIA] to upload packages for a specific git tag automatically. The Travis build runs under Python 3.3 to 3.5. It first builds the package and then installs it with its dependencies. To upload tags automatically, one can configure Travis, preferably with the command line interface, to save the username and password for the Python Package Index (PyPI). [TravisDocs] Our process of releasing a new version is the following:

  1. Increase the version in the knitting pattern library and create a new pull request for it.
  2. Merge the pull request after the tests passed.
  3. Pull and create a new release with a git tag using
    setup.py tag_and_deploy

Travis then builds the new tag and uploads it to PyPI.
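For illustration, here is a minimal sketch of what such a tag_and_deploy command could look like in setup.py; this is an assumption about its shape, and the actual command in the repository may differ:

from setuptools import Command
import subprocess

class TagAndDeploy(Command):
    """Create a git tag for the current version and push it so Travis builds it."""
    description = "tag the current version and push the tag"
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        version = self.distribution.get_version()  # the version passed to setup()
        tag = "v" + version                        # hypothetical tag naming scheme
        subprocess.check_call(["git", "tag", tag])
        subprocess.check_call(["git", "push", "origin", tag])

# registered in setup() via cmdclass={"tag_and_deploy": TagAndDeploy}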

With this we have a basic quality assurance. Pull requests need to pass all tests before they can be merged. Travis can be configured to automatically reject a request with errors.

Documentation Driven Development

As mentioned in a blog post, documentation-driven development was something worth checking out. In our case that means writing the documentation first, then the tests, and then the code.

Writing the documentation first means thinking in the space of the mental model you have for the code. It defines the interfaces you would be happy to use. A lot of edge cases can be thought of at this point.

When writing the tests, they are often split up and no longer represent the flow of thought you had when thinking about your wishes. Tests can be seen as the glue between the code and the documentation. As with writing code to pass the tests, in the conversation between the tests and the documentation I find out things I have forgotten.

When writing the code in a test-driven way, another conversation starts. I call implementing the tests a conversation because the tests tell the code that it should be different, and the code tells the tests about their inconsistencies, like misspellings and bloated interfaces.

By writing documentation first, we have the chance to have two conversations about our code: one in spoken language and one in code. I like it when the code hears my wishes, so I prefer to talk a bit more.

Testing the Documentation

Our documentation is hosted on Read the Docs. It should have these properties:

  1. Every module is documented.
  2. Everything that is public is documented.
  3. The documentation is syntactically correct.

These are qualities that can be tested, so they are tested. The code cannot be deployed if it does not meet these standards. We use Sphinx for building the docs. That makes it possible to test these properties in this way:

  1. For every module there exists a .rst file which automatically documents the module with autodoc.
  2. A Sphinx build outputs a list of objects that should be covered by documentation but are not.
  3. Sphinx outputs warnings throughout the build.

Testing our documentation allows us to keep it at a higher quality. Many more tests could be imagined, but the basic ones already help.
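As a rough illustration, the second and third properties could be checked with tests along these lines; the docs/ paths and the exact assertions are assumptions, not the precise tests we run:

import subprocess
from pathlib import Path

def test_the_documentation_builds_without_warnings():
    # -W turns Sphinx warnings into errors, so a build with warnings fails
    subprocess.check_call(
        ["sphinx-build", "-W", "-b", "html", "docs", "docs/_build/html"])

def test_everything_public_is_documented():
    # the sphinx.ext.coverage builder writes undocumented objects to python.txt
    subprocess.check_call(
        ["sphinx-build", "-b", "coverage", "docs", "docs/_build/coverage"])
    report = Path("docs/_build/coverage/python.txt").read_text()
    # if the package name shows up, some of its objects lack documentation
    assert "knittingpattern" not in report, "undocumented objects:\n" + report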

Code Coverage

It is possible to test our code coverage and see how well we are doing using Codeclimate.com. It gives us the files we need to work on when we want to improve the quality of the package.

Landscape

Landscape is also free for open source projects. It can give hints about where to improve next. It is also possible to fail pull requests if the quality decreases. It shows code duplication and can run pylint. Currently, most of the style problems arise from undocumented tests.

Summary

When starting with the stricter quality assurance, the question arose whether it would only slow us down. Now, we have learned to write properly styled PEP 8 code and are beginning to do automatically what pylint demands. High test coverage allows us to change the underlying functionality without changing the interface and without fear that we may break something irrecoverably. I feel like a burden has been taken from me by all these tools that are free for open-source software and save me the time of setting up quality assurance myself.

Future Work

In the future we would also like to create a user interface. User interfaces are sometimes hard to test, so we plan not to put it into the package but to build it on top of the package.
