For those of you hosting your code at GitHub, there is a fantastic project called Travis-CI. It will run continuous integration tests on your open source project for free. I have used it for CI testing of my Python projects. It is really easy to get started; just follow the steps below.

Note that the open source version of Travis-CI only supports public repositories, with builds running up to 20 minutes. If you have a private repository you will need to sign up for Travis Pro, which is currently in beta. Check out http://www.travis-ci.com/ for more information.

Sign up for Travis-CI and enable CI tests

First of all, you must connect Travis-CI to your GitHub profile. Just go to travis-ci.org and sign in with your GitHub account. Travis will then find your public repositories.

You can then flick the on/off switch for each of your repos to enable Travis-CI.

Configure Travis-CI

OK, now we need to tell Travis what to test and which Python versions you want to support. Just create a file called .travis.yml in your repo root folder.

.travis.yml
language: python
python:
  - "2.5"
  - "2.6"
  - "2.7"
script: nosetests

Here we tell Travis that this is a Python project supporting Python 2.5, 2.6 and 2.7. Travis will then run your test suite against each of those Python versions.

The script command is important: this is where you define the command that runs your project's test suite. For more details on Nose tests, see the Nose documentation.
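
If your project uses another test runner, just point the script command at it instead. A sketch (the exact command depends on your project layout):

.travis.yml
script: python setup.py test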

Handling Python dependencies

If you have any specific Python dependencies, Travis has got you covered. Travis will test your project inside a virtualenv, so all you need to do is write a requirements.txt and tell Travis to install it. Add the following to your .travis.yml:

.travis.yml
install: "pip install -r requirements.txt --use-mirrors"

Build only specific branches

By default, Travis builds every time you push to any branch on GitHub. That behavior can be annoying if you are committing often to your development branches. You can restrict which branches Travis monitors for changes in the .travis.yml, either by whitelisting branches or blacklisting them:

.travis.yml
# Whitelisting example
branches:
  only:
    - master

# Blacklisting example
branches:
  except:
    - develop
    - feature/add-travis-support

Add a nice build status icon to GitHub README

A cool bonus is the build status icon that you can add to your GitHub README file. It shows at a glance whether the latest build of your project passed or failed.
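
The badge follows Travis' URL scheme; a sketch in Markdown, where <user> and <repo> are placeholders for your own GitHub username and repository name:

[![Build Status](https://travis-ci.org/<user>/<repo>.png?branch=master)](https://travis-ci.org/<user>/<repo>)

GitHub will then render the current build status directly in your README.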

Summary

This is just a simple Travis-CI setup. You can of course do much more with Travis; just dig into the docs. Happy testing!

This is a simple tutorial to get started with Django and the asynchronous task queuing system called Celery. We will implement a model that stores log entries, and we will write those entries to the database asynchronously, so that we don't slow our website down with tasks that the user doesn't care about.

Let’s get started!

Basic setup of the Django project

I’m assuming you already have installed Django. If you haven’t, please check out the Django installation documentation.

Start by creating your new Django project and add an application called core, which we will use for demonstration purposes in this tutorial.

django-admin.py startproject celerytest
cd celerytest
./manage.py startapp core

Install celery and the Django helper app django-celery.

pip install celery django-celery

You will need to add djcelery, kombu.transport.django and core to your Django settings.py.

import djcelery
djcelery.setup_loader()

[...]

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'core',                     # Add our core app
    'djcelery',                 # Add Django Celery
    'kombu.transport.django',   # Add support for the django:// broker
)

[...]

BROKER_URL = 'django://'

We are using the Django database as our broker for the time being. In production you will probably want to use a dedicated broker like RabbitMQ or Redis. Read more about BROKER_URL in the documentation.
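
For reference, broker URLs for RabbitMQ and Redis look something like this (hosts, ports and credentials below are placeholders):

BROKER_URL = 'amqp://guest:guest@localhost:5672//'  # RabbitMQ
BROKER_URL = 'redis://localhost:6379/0'             # Redis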

Now, let’s create a model for our test system (core/models.py).

from django.db import models


class LogEntry(models.Model):
    """
    Definition of a log entry
    """
    timestamp = models.DateTimeField(auto_now_add=True)
    severity = models.CharField(blank=False, max_length=10)
    message = models.TextField(blank=False)

This says that each log entry has a timestamp, a severity level and a message. To reflect the new model in the database, sync the schema with:

./manage.py syncdb
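
If you like, you can verify that the model works from the Django shell (the count below is from my run and will differ on yours):

./manage.py shell
>>> from core.models import LogEntry
>>> LogEntry(severity='INFO', message='Hello from the shell').save()
>>> LogEntry.objects.count()
1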

Next we need to write a simple index view (core/views.py).

"""
Views for the Celery test project
"""
from core.models import LogEntry
from django.http import HttpResponse


def index(request):
    """
    My index page
    """
    log = LogEntry(severity='INFO', message='Rendering the index page')
    log.save()
    return HttpResponse('Hello!')

As you can see we are now writing a log message object to the database every time the index page is requested.

You will also need to update your urls.py to look like this (celerytest/urls.py):

from django.conf.urls import patterns, include, url

urlpatterns = patterns('',
    url(r'^$', 'core.views.index', name='index'),
)

Alright, we're all set. Now fire up your development server and point your browser at http://localhost:8000/.

./manage.py runserver

You should see the Hello! response in your browser.

So, what happens now is that every time the user loads the index page, a log entry is written to the database. But what would happen if the database suddenly became slow, or even worse, died? Then the web page that was meant to be so simple, just returning a Hello!, would become slow because of a database request that the user doesn't even care about.

The answer to this problem, of course, is asynchronous calls. In the next section of this tutorial we will convert our synchronous logger into an asynchronous version using Celery.

Writing your first Celery task

Okay. Now we have a working Django project and some basic configuration of Celery. We will now:

  • Add a Celery task
  • Update the view to write log entries to the database asynchronously

Start by creating a new file called core/tasks.py. It will contain all your Celery tasks.

"""
Celery tasks
"""

from celery import task
from core.models import LogEntry


@task()
def write_log_entry(severity, message):
    """
    Write a log entry to the database
    """
    log = LogEntry(severity=severity, message=message)
    log.save()

What we do here is pretty straightforward: we create a new LogEntry object, give it a severity and a message, and save it to the database. The magic comes from the @task decorator, which turns the function into a Celery task and gives it access to all the Celery task methods.

Next thing we need to do is to update the view to make use of our new task (core/views.py).

"""
Views for the Celery test project
"""

from core.tasks import write_log_entry
from django.http import HttpResponse


def index(request):
    """
    My index page
    """
    write_log_entry.delay(severity='INFO', message='Rendering the index page')
    return HttpResponse('Hello!')

As you can see, we simply import the task we just wrote and pass a severity and a message to it. But note that we call delay(), one of the methods provided by Celery. It tells Celery that this request should be handled asynchronously.
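
delay() is actually a shortcut for the more general apply_async() method, which accepts extra options. A sketch, using the standard countdown option to postpone execution:

# Equivalent to calling .delay(), but the worker waits ten seconds
# before executing the task:
write_log_entry.apply_async(
    kwargs={'severity': 'INFO', 'message': 'Rendering the index page'},
    countdown=10,
)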

If you now point your browser to http://localhost:8000/ (hit it a few times to create some messages), the log entries are no longer written directly to the database. Each log message is instead sent to Celery for processing. But we do not have any Celery worker yet, so no one processes those messages for us.

You can see that by opening the Django dbshell.

./manage.py dbshell
sqlite> select count(*) from core_logentry;
2

There are only a few rows in my core_logentry table. If we now start celeryd, it will process the messages that are waiting and add them to core_logentry. Start celeryd with info-level logging:

./manage.py celeryd -l info

You will see messages like these:

[2012-11-13 15:08:07,098: INFO/MainProcess] Got task from broker: core.tasks.write_log_entry[fcc23783-c4c0-4a29-a3db-a7c159335c9f]
[2012-11-13 15:08:07,439: INFO/MainProcess] Task core.tasks.write_log_entry[d6ca47b7-fe5a-4e39-8655-2a8689172d32] succeeded in 0.03084897995s: None

This indicates that Celery received the tasks we sent earlier and handled them. You can now check the count with dbshell again.

./manage.py dbshell
sqlite> select count(*) from core_logentry;
8

In my case there are now 8 lines of logs in the database. So we can see that if the Celery worker (celeryd) is not running, no messages are processed, which in turn shows that we now actually have an asynchronous log system.

We're done, but this is just scratching the surface of what Celery can do (and this is not exactly how it should be set up in a production environment). See the Celery documentation for more details.

When you are developing web applications you will sooner or later need to optimize the application's performance. I often develop using Django or Flask, and there are good debug plugins for both frameworks. In this article we will look closer at the Django Debug Toolbar. The toolbar has been ported to Flask as well.

Installing the debug toolbar

First off you need to install the toolbar:

pip install django-debug-toolbar

If you're not using pip, you can always fetch the code from the project's GitHub page.

Configure Django

We start by adding a new middleware class in settings.py. This is the standard MIDDLEWARE_CLASSES tuple from Django; the observant reader will notice that we have added a new class, debug_toolbar.middleware.DebugToolbarMiddleware.

MIDDLEWARE_CLASSES = (
    'debug_toolbar.middleware.DebugToolbarMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    # Uncomment the next line for simple clickjacking protection:
    # 'django.middleware.clickjacking.XFrameOptionsMiddleware',
)

If you haven't already, you must make sure that your internal IP is added to the INTERNAL_IPS setting:

INTERNAL_IPS = ('127.0.0.1',)

You might not have INTERNAL_IPS defined by default; in that case, just add it.

Finally, you must add debug_toolbar to your INSTALLED_APPS:

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'django.contrib.admin',
    'debug_toolbar',
)

Debugging with the debug toolbar

You will now have a toolbar on the right side of your web page. Of course, you should configure it to be invisible when you are in production.
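
One way to do that is to only enable the toolbar when DEBUG is on. A minimal sketch for settings.py, instead of adding the middleware and app unconditionally as above:

if DEBUG:
    # Enable the toolbar only in development
    MIDDLEWARE_CLASSES = (
        'debug_toolbar.middleware.DebugToolbarMiddleware',
    ) + MIDDLEWARE_CLASSES
    INSTALLED_APPS += ('debug_toolbar',)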

There are a bunch of interesting features here, but the one I have found most useful is the SQL query analyzer, especially in cases where you issue many queries or fetch objects via foreign keys from your models.
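
For example, the SQL panel makes N+1 query patterns easy to spot. A sketch of the kind of fix it tends to suggest (Entry and its author foreign key are hypothetical models, not from this article):

# One query for the entries, plus one extra query per entry (N+1):
for entry in Entry.objects.all():
    print(entry.author.name)

# A single joined query instead, thanks to select_related():
for entry in Entry.objects.select_related('author'):
    print(entry.author.name)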

As you might know, I love well-formatted code. PEP 8 helps us Python developers create beautiful code.

I have just released a Git pre-commit hook that checks the code quality of Python files about to be committed. The hook looks for files ending with .py or with python in the shebang, then passes each file to pylint for quality assurance. If the file is too ugly you won't be able to commit it.
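
To give a rough idea of how such a hook works, here is a minimal sketch. My actual hook on GitHub does more (shebang detection, nicer output), and the score threshold below is made up:

#!/usr/bin/env python
"""Minimal pre-commit sketch: abort the commit if pylint scores too low."""
import re
import subprocess
import sys

THRESHOLD = 8.0  # made-up threshold for this sketch

# Files staged for this commit (added, copied or modified)
staged = subprocess.check_output(
    ['git', 'diff', '--cached', '--name-only', '--diff-filter=ACM']
).decode().split()

failed = False
for path in (p for p in staged if p.endswith('.py')):
    # pylint exits non-zero when it finds issues, so ignore the return code
    proc = subprocess.Popen(['pylint', path], stdout=subprocess.PIPE)
    out, _ = proc.communicate()
    match = re.search(r'rated at ([-\d.]+)/10', out.decode())
    if match and float(match.group(1)) < THRESHOLD:
        print('%s rated %s/10 (threshold %s), aborting commit'
              % (path, match.group(1), THRESHOLD))
        failed = True

sys.exit(1 if failed else 0)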

It is super easy to install the hook. Just download it from my GitHub page, save it as .git/hooks/pre-commit in your Git repository, and make sure the file is executable. That's it.

The hook can be found at GitHub.

When you commit a Python file, the hook prints the pylint verdict right in your terminal.

This blog is now built using both Jekyll and Octopress instead of just Jekyll. It took some time and effort to get it right with Octopress, mainly because I got lost in the Ruby mines.

Octopress comes with a bunch of features that take a little while to implement in plain Jekyll, like comments and GitHub and Twitter integration. It also comes with preintegrated support for various social networks.

And, best of all, because it’s all Jekyll it also runs on GitHub Pages!