PyOAE renamed DjOAE

2 05 2012

I’ve been talking to several folks since my last post on PyOAE, and it has become clear that the name doesn’t convey the right message. The questions often centre on whether a native Python webapp can be used in production, or on the complexity of writing your own framework from scratch. To address this I have renamed PyOAE to DjOAE to reflect its true nature.

It is a Django web application, and I chose Django because I didn’t want to write yet another framework. I could have chosen any framework, even a Java framework if such a thing existed, but I chose Django because it has good production experience on some large sites, a vibrant community, and has already solved most of the problems that a framework should solve.

The latest of those already-solved problems that I have needed is data and schema migration. DjOAE is intended to be deployed in a DevOps-like way, with hourly deployments if needed. To make that viable the code base has to address schema and data migrations as they happen. I have started to use South, which not only provides a framework for doing this, but automates roll forward and roll back of database schema and data (where possible). For the deployer the command is ever so simple.

python manage.py migrate

This queries the database to work out where it is relative to the code, and then upgrades it to match the code.
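For context, the South workflow around that one deploy command looks roughly like this (the app label `myapp` is a placeholder):

```shell
# After changing models, have South generate a migration automatically:
python manage.py schemamigration myapp --auto

# Apply all outstanding migrations -- the command a deployer runs:
python manage.py migrate

# Roll back by migrating to an earlier numbered migration if needed:
python manage.py migrate myapp 0003
```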

This formalizes the process that has been used for years in Sakai CLE into a third-party component used by thousands, and avoids the nightmare scenario where all data migration has to be worked out at the moment a release is performed.

I have to apologise to anyone upstream for the name change, as it will cause some disruption, but better now than later. Fortunately clones are simple to adjust, as git only cares about the commit SHA-1s, so a simple edit to .git/config changing

url = ssh://
url = ssh://

should be enough.
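The same change can be made without editing .git/config by hand. A sketch, using placeholder URLs and shown in a throwaway repository (in a real clone only the `set-url` line is needed):

```shell
# Throwaway repository purely to demonstrate the command:
tmp="$(mktemp -d)" && cd "$tmp"
git init -q
git remote add origin ssh://git@old.example.org/pyoae.git

# Equivalent to editing the url = ... line in .git/config:
git remote set-url origin ssh://git@new.example.org/djoae.git

# Confirm the change:
git config remote.origin.url
```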

If you are using the standard settings you will need to rename your database. I did this with pgAdmin III without dropping the database.
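If you prefer the command line, the same rename can be done from psql. The database names here are guesses based on the project names, and nothing may be connected to the database while it is renamed:

```shell
# Hypothetical database names; a rename avoids a dump/restore cycle.
psql -U postgres -c 'ALTER DATABASE pyoae RENAME TO djoae;'
```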



3 responses

2 05 2012
Charles Severance

This is cool because Django is the framework used by Google App Engine (with special Google tweaks). Since 2009, I have taught roughly 550 University of Michigan graduate students App Engine, so they know WSGI, templates, models, etc. I chose App Engine/Django because I felt it was the single most complete and approachable full-stack web framework, and we teach Python programming in the previous course. The students love it.

While I am not sure how many super-scalable apps run *on* the Google infrastructure, Google sure has made Django far more popular. I think lots of folks cut their teeth on App Engine for free (buying my awesome O’Reilly book) and then, when they want to go large-scale, move to Django. My App Engine book has a web site at and my Python book is at

6 05 2012

Is there any chance of making the repository public?

I’m currently working on a Django-backed app and we’re using Piston for our CRUD-type APIs. I’d be interested to see what you’re using. I have a feeling we might get away with just using plain Django views.

Did you have a look at Sentry for doing error logging? We’ve recently enabled it and it’s ten times better than digging through enormous amounts of log files.

6 05 2012

I am using simplejson with custom encoders to produce the JSON, mainly because it’s quick and uses C code when it can. Those are wired in using standard Django views, nothing special. I thought about using Piston, but found simplejson, encoders and views simple and easy, so I never got around to trying it.

I am generally not doing any error logging, so I use the standard logging API, and when I have everything in an area working I remove all the routine logging so that all that appears in the log file is one line per request. That seems to be quite effective in avoiding loads of lines in the log files.

I’ll have a look at Sentry though, it looks useful, as does the engineering blog at Instagram, where they run 100+ Django instances on EC2 using a PostgreSQL + Redis backend. Interestingly they have chosen Solr rather than ElasticSearch for their search backend.
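The custom-encoder wiring described above can be sketched as follows. simplejson shares its API with the stdlib `json` module, so the sketch uses `json`; the `ModelEncoder` name and the types it handles are illustrative assumptions, not the actual DjOAE code:

```python
import json  # simplejson is a drop-in replacement with the same API
from datetime import datetime


class ModelEncoder(json.JSONEncoder):
    """Hypothetical custom encoder for types json can't serialise natively."""

    def default(self, o):
        if isinstance(o, datetime):
            return o.isoformat()      # dates as ISO 8601 strings
        if isinstance(o, set):
            return sorted(o)          # sets as stable, sorted lists
        return super().default(o)     # anything else: raise TypeError


# A Django view would wrap this in an HttpResponse with
# content_type="application/json"; here we just show the encoding step:
payload = {"updated": datetime(2012, 5, 2, 12, 0), "tags": {"oae", "django"}}
print(json.dumps(payload, cls=ModelEncoder, sort_keys=True))
# → {"tags": ["django", "oae"], "updated": "2012-05-02T12:00:00"}
```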
