If you are writing a Python package that is meant to be used in more than one project, you have probably stumbled upon the problem of having to support different versions of Python, Django, or some other dependency. Your legacy code uses Django 1.4, but you still want your package to work in new projects based on Django 1.8? No problem, you can do that easily and still keep sane.
Doing feature detection
Let's assume you noticed that a feature you are using in your package changed between two Django versions. For example, Django 1.6 and newer have a Model options attribute called model_name. It lets you get the lowercase name of a model like this:
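A minimal runnable sketch of that attribute access. The stand-in classes below only emulate Django's model options object so the snippet runs without Django installed; ``MyModel`` is a hypothetical model named for illustration:

```python
# Stand-in emulating what Django's Model._meta provides, so this
# snippet runs without Django. ``MyModel`` is a hypothetical model.
class FakeOptions:
    model_name = 'mymodel'    # lowercase name, exposed by Django 1.6+
    object_name = 'MyModel'   # the original class name

class MyModel:
    _meta = FakeOptions()

# On Django 1.6 and newer you can read the lowercase name directly:
print(MyModel._meta.model_name)  # -> mymodel
```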
This attribute didn't exist in Django 1.5 and earlier. So what if you still need to support older versions? You can simply throw in a check if you can use the feature you want to access, and provide an alternative path for older Django versions (something like a polyfill):
```python
if hasattr(MyModel._meta, 'model_name'):
    model_name = MyModel._meta.model_name
else:
    # Equivalent to the newer model_name attribute
    model_name = MyModel._meta.object_name.lower()
```
OK, that was easy. However, keep your future you in mind. In about seven days you will have forgotten the exact reason why you are actually testing for the existence of this strange attribute … What was I doing back then? Why do I need that else case? These look like two redundant code paths …
And future you is right. So you will refactor this and keep only the first code path, since that one works for you. And then someone will complain that there is an issue with your package when running Django 1.4 :)
So do yourself a favour and put a comment in front of this code block:
```python
# ``model_name`` was introduced with Django 1.6
if hasattr(MyModel._meta, 'model_name'):
    model_name = MyModel._meta.model_name
else:
    # Keep Django 1.5 support
    model_name = MyModel._meta.object_name.lower()
```
Use a compat.py module
So supporting multiple versions is fairly easy now. Yes, you are allowed to take different code paths; that's kind of your only option when something changed fundamentally in your dependencies. However, you don't want to sprinkle your code base with these if/else statements in every place where you use the newer feature.
So I recommend the most natural thing to do when discovering code duplication: abstract it away into a function. For example:
```python
def get_model_name(model):
    # ``model_name`` was introduced with Django 1.6
    if hasattr(model._meta, 'model_name'):
        return model._meta.model_name
    # Keep Django 1.5 support
    return model._meta.object_name.lower()

# Somewhere else in the code:
model_name = get_model_name(MyModel)
```
No code duplication, no added complexity in all the places where you want to use the model_name attribute. Cool.
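As a quick sanity check that both code paths behave the same, here is a sketch that exercises get_model_name() without Django installed, using hypothetical stand-in classes for the old and new style of the _meta options object:

```python
def get_model_name(model):
    # ``model_name`` was introduced with Django 1.6
    if hasattr(model._meta, 'model_name'):
        return model._meta.model_name
    # Keep Django 1.5 support
    return model._meta.object_name.lower()

class NewMeta:               # emulates Django >= 1.6 options
    model_name = 'article'

class OldMeta:               # emulates Django <= 1.5 options
    object_name = 'Article'

class NewModel:
    _meta = NewMeta()

class OldModel:
    _meta = OldMeta()

print(get_model_name(NewModel))  # -> article
print(get_model_name(OldModel))  # -> article
```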
Now one final piece of advice. Put all of those functions that exist only to support the differing versions of your dependencies into one module. I would suggest calling it compat.py. The reason for this is that you have one single place where you can look for all the special logic that your app requires to support those odd deps. It also has the benefit that you don't even need to think about where to put this code. It's not really suitable for utils.py and shouldn't litter any other business logic of your app. So keep it separate and you gain a little more overview.
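A minimal sketch of what such a compat.py might collect, combining the shims discussed in this article (the exact contents of your module will of course depend on which dependencies you support):

```python
# compat.py -- every version shim lives here, so the rest of the
# code base imports from compat instead of branching inline.

try:
    import json  # standard library since Python 2.6
except ImportError:  # Python 2.5
    from django.utils import simplejson as json


def get_model_name(model):
    # ``model_name`` was introduced with Django 1.6
    if hasattr(model._meta, 'model_name'):
        return model._meta.model_name
    # Keep Django 1.5 support
    return model._meta.object_name.lower()
```

Elsewhere in the package you would then write `from .compat import json, get_model_name` and never repeat the version checks.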
Testing with multiple versions
There are quite a few other guides online on how to test your package against multiple versions of your dependencies. I just want to say this much: do it! And do it in an automated fashion.
If you don't have a test suite yet, go write one. My own experience tells me that supporting multiple configurations of the same package is physically painful if you don't have at least some minimal test coverage. I recommend using tox for running your tests against multiple dependency versions. You don't have to start with 100% coverage. Any amount is good for a start, as long as the most important classes and functions run at least once.
Have a look at the tox.ini file of my django-superform project. You can then run a specific combination of dependencies using tox's -e option:
```shell
tox -e py27-14  # Test against Django 1.4 on Python 2.7
```
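For reference, a minimal tox.ini for such a matrix might look roughly like this; the factor names, version pins, and test command are assumptions for illustration, not copied from django-superform:

```ini
[tox]
envlist = py27-{14,18}, py34-18

[testenv]
deps =
    14: Django>=1.4,<1.5
    18: Django>=1.8,<1.9
commands = python -m pytest tests/
```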
Dropping support for old versions
Finally, there will come a point where you no longer want to keep support for an ancient version of a dependency around, let's say Python 2.5. You start scanning through the code base in search of all the switches you introduced, like if sys.version_info < (2, 5). However, that quickly gets messy. What about the places where you don't use sys.version_info or django.VERSION, but do some kind of feature detection, like:
```python
try:
    import json
except ImportError:
    from django.utils import simplejson as json
```
The json module is only available in Python 2.6 and higher, so when removing Python 2.5 support, we no longer need the except ImportError case. However, a quick grep through the code base won't reveal this snippet, so you will most likely miss quite a few places that you actually wanted to refactor.
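Once Python 2.5 support is dropped, the whole shim collapses to a plain standard library import inside compat.py, while call sites that import json from your compat module keep working unchanged:

```python
# compat.py, after dropping Python 2.5 support -- the try/except
# fallback is gone, only the plain import remains:
import json
```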
Anyway, you have a compat.py module in place, haven't you? So you are ready to simply scan this one file for anything that seems Python 2.5 related and remove the unnecessary code. Now delete the test environments for the old Python version from the tox.ini file and you are done. Congratulations! One less dependency option to support.
Inspired by django-rest-framework
I first stumbled upon a compat.py module in django-rest-framework and have been applying this pattern ever since. I use a compat.py module (or something named similarly in other programming languages) in nearly every package I have to maintain. It has helped me quite a bit in the past to keep sane and move forward quickly when new Django versions are released and support for older ones is dropped.
I do recommend reading through bits and pieces of the django-rest-framework source code. It's absolutely lean and full of good coding practices. You'll learn something new from it, no matter your skill level.