14. Useful Design Patterns – Expert Python Programming

Chapter 14. Useful Design Patterns

A design pattern is a reusable, somewhat language-specific solution to a common problem in software design. The most popular book on this topic is Design Patterns: Elements of Reusable Object-Oriented Software, written by Gamma, Helm, Johnson, and Vlissides, a.k.a. the Gang of Four or GoF. It is considered a seminal work in this area, and provides a catalog of 23 design patterns with examples in Smalltalk and C++.

When designing an application, these patterns are good, well-known references. They ring a bell for most developers, since they describe proven development paradigms. But they should be studied with the target language in mind, since some of them do not make sense in some languages or are already built in.

This chapter describes the patterns that are most useful in Python, or that are interesting to discuss, with toy implementation examples. The following three sections correspond to the design pattern categories defined by the GoF:

  • Creational patterns: Patterns that are used to generate objects with specific behaviors

  • Structural patterns: Patterns that help in structuring the code for specific use cases

  • Behavioral patterns: Patterns that help in structuring processes

Creational Patterns

A creational pattern provides a particular instantiation mechanism. It can be a particular object factory or even a class factory.

This is an important family of patterns in compiled languages, since it is harder to generate types on demand at run time.

But this capability is built into Python; for instance, the type built-in lets you define a new class by code:

>>> MyType = type('MyType', (object,), {'a': 1})
>>> ob = MyType()
>>> type(ob)
<class '__main__.MyType'>
>>> ob.a
1
>>> isinstance(ob, object)
True

Classes and types are built-in factories, and you can interact with class and object generation using metaclasses, for instance (see Chapter 3). These features are the basis for implementing the Factory design pattern, and we won't describe it further in this section.
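For instance, a class factory is only a few lines away. The sketch below (the named_record_factory name is invented for this example) builds record-like classes at run time with type:

```python
def named_record_factory(name, fields):
    """Build a new class at run time with ``type``.

    ``name`` is the class name; ``fields`` is a tuple of
    attribute names stored by ``__init__``.
    """
    def __init__(self, *values):
        if len(values) != len(fields):
            raise TypeError('expected %d arguments' % len(fields))
        for field, value in zip(fields, values):
            setattr(self, field, value)

    # type(name, bases, namespace) is the same machinery the
    # class statement uses under the hood
    return type(name, (object,), {'__init__': __init__,
                                  '_fields': fields})

Point = named_record_factory('Point', ('x', 'y'))
p = Point(1, 2)
print(p.x)  # prints: 1
```

Each call produces a genuinely new class, so factories like this can tailor classes to run-time configuration, something a compiled language has to work much harder for.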

Besides Factory, the only other creational design pattern from the GoF that is interesting to describe in Python is Singleton.



Singleton restricts instantiation of a class to one object.

The Singleton pattern makes sure that a given class has always only one living instance in the application. This can be used, for example, when you want to restrict access to a resource to one and only one memory context in the process. For instance, a database connector class can be a Singleton that deals with synchronization and manages its data in memory. It makes the assumption that no other instance is interacting with the database in the meantime.

This pattern can greatly simplify the way concurrency is handled in an application. Utilities that provide application-wide functions are often declared as Singletons. For instance, in web applications, a class that is in charge of reserving a unique document ID would benefit from the Singleton pattern. There should be one and only one utility doing this job.

Implementing the Singleton pattern is straightforward with the __new__ method:

>>> class Singleton(object):
...     def __new__(cls, *args, **kw):
...         if not hasattr(cls, '_instance'):
...             orig = super(Singleton, cls)
...             cls._instance = orig.__new__(cls, *args, **kw)
...         return cls._instance
>>> class MyClass(Singleton):
...     a = 1
>>> one = MyClass()
>>> two = MyClass()
>>> two.a = 3
>>> one.a
3

However, the problem with this pattern is subclassing: all instances will be instances of MyClass, no matter what the method resolution order (__mro__) says:

>>> class MyOtherClass(MyClass):
...     b = 2
>>> three = MyOtherClass()
>>> three.b
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
AttributeError: 'MyClass' object has no attribute 'b'

To avoid this limitation, Alex Martelli proposed an alternative implementation based on shared state called Borg.

The idea is quite simple. What really matters in the Singleton pattern is not the number of living instances a class has, but rather the fact that they all share the same state at all times. So Alex Martelli came up with a class that makes all instances of the class share the same __dict__:

>>> class Borg(object):
...     _state = {}
...     def __new__(cls, *args, **kw):
...         ob = super(Borg, cls).__new__(cls, *args, **kw)
...         ob.__dict__ = cls._state
...         return ob
>>> class MyClass(Borg):
...     a = 1
>>> one = MyClass()
>>> two = MyClass()
>>> two.a = 3
>>> one.a
3
>>> class MyOtherClass(MyClass):
...     b = 2
>>> three = MyOtherClass()
>>> three.b
2
>>> three.a
3
>>> three.a = 2
>>> one.a
2

This fixes the subclassing issue, but the pattern still depends on how the subclass code works. For instance, if __getattr__ is overridden, the pattern can be broken.

Nevertheless, Singletons should not have several levels of inheritance. A class that is marked as a Singleton is already specific.

That said, many developers consider this pattern a heavy way to deal with uniqueness in an application. If a Singleton is needed, why not use a module with functions instead, since a Python module is a singleton?


The Singleton pattern is an implicit way of dealing with uniqueness in your application. You can live without it. Unless you are working in a framework à la Java that requires such a pattern, use a module instead of a class.

Structural Patterns

Structural patterns are really important in big applications. They decide how the code is organized and give developers recipes on how to interact with each part of the application.

The most well-known implementation of structural patterns in the Python world is the Zope Component Architecture (ZCA, see http://wiki.zope.org/zope3/ComponentArchitectureOverview). It implements most of the patterns described in this section and provides a rich set of tools to work with them. The ZCA is intended to run not only in the Zope framework, but also in other frameworks such as Twisted. It provides an implementation of interfaces and adapters among other things.

So it should be considered before re-writing such patterns from scratch, even if doing so is not a big effort.

Many structural patterns are derived from the seven GoF originals.

Python provides a Decorator-like pattern, for instance, that allows decorating a function with the @decorator syntax at definition time. This will be extended to classes in a future version of the language (see http://www.python.org/dev/peps/pep-3129).

Other popular patterns are:

  • Adapter

  • Proxy

  • Facade



Adapter wraps a class or an object A so that it works in a context intended for a class or an object B.

When some code is intended to work with a given class, it is fine to feed it with objects from another class as long as they provide the methods and attributes the code uses. This is the basis of the duck-typing philosophy in Python.


If it walks like a duck and talks like a duck, then it's a duck!

Of course, this assumes that the code isn't calling isinstance to verify that the instance is of a specific class.


I said that it was a duck; there's no need to check its DNA!

The Adapter pattern is based on this philosophy and defines a wrapping mechanism, where a class or an object is wrapped in order to make it work in a context that was not primarily intended for it. StringIO is a typical example, as it adapts the str type so it can be used as a file-like object:

>>> from StringIO import StringIO
>>> my_file = StringIO(u'some content')
>>> my_file.read()
u'some content'
>>> my_file.seek(0)
>>> my_file.read(1)
u's'

Let's take another example. A DublinCoreInfo class knows how to display Dublin Core information (see http://dublincore.org/) for a given document. It reads a few fields, such as the author's name or the title, and prints them. To be able to display Dublin Core for a file, it has to be adapted, the same way StringIO adapts str. The figure gives a UML-like diagram of the pattern.

DublinCoreAdapter wraps a file instance and provides metadata access over it:

>>> from os.path import split, splitext
>>> class DublinCoreAdapter(object):
...     def __init__(self, filename):
...         self._filename = filename
...     def title(self):
...         return splitext(split(self._filename)[-1])[0]
...     def creator(self):
...         return 'Unknown'    # we could get it for real
...     def languages(self):
...         return ('en',)
>>> class DublinCoreInfo(object):
...     def summary(self, dc_ob):
...         print 'Title: %s' % dc_ob.title()
...         print 'Creator: %s' % dc_ob.creator()
...         print 'Languages: %s' % \
...                   ', '.join(dc_ob.languages())
>>> adapted = DublinCoreAdapter('example.txt')
>>> infos = DublinCoreInfo()
>>> infos.summary(adapted)
Title: example
Creator: Unknown
Languages: en

Besides allowing substitution, the Adapter pattern can also change the way developers work. Adapting an object to work in a specific context assumes that the class of the object does not matter at all. What matters is that the class implements what DublinCoreInfo expects, and this behavior is fixed or completed by an adapter. So the code simply needs to know, somehow, whether it is compatible with objects that implement a specific behavior. This can be expressed by interfaces.


An interface is a definition of an API. It describes a list of methods and attributes that a class should implement, with the desired behavior. This description does not implement any code; it just defines an explicit contract for any class that wishes to implement the interface. Any class can then implement one or several interfaces in whichever way it wants.

While Python prefers duck typing over explicit interface definitions, it may sometimes be better to use them. For instance, an explicit interface definition makes it easier for a framework to define functionality over interfaces.

The benefit is that classes are loosely coupled, which is considered as a good practice. For example, to perform a given process, a class A does not depend on a class B, but rather on an interface I. Class B implements I, but it could be any other class.

This technique is built into Java, for instance, where the code can deal with objects that implement a given interface, no matter what class they come from. It is an explicit duck-typing behavior: Java uses interfaces to verify type safety at compile time, rather than using duck typing to tie things together at run time.

Many developers request that interfaces be added to Python as a core language feature. Currently, those who wish to make use of explicit interfaces are forced to use Zope Interfaces (see http://pypi.python.org/pypi/zope.interface) or PyProtocols (see http://peak.telecommunity.com/PyProtocols.html).

Previously, Guido rejected requests to add interfaces to the language, since they don't fit Python's dynamic, duck-typing nature. However, interface systems have proven their worth in certain situations, so Python 3000 will come with a feature called optional type annotations, which can be used as syntax for third-party interface libraries.


Abstract Base Class (ABC) support was recently added to Python 3000 (see http://www.python.org/dev/peps/pep-3119). An extract from the PEP:

"ABCs are simply Python classes that are added into an object's inheritance tree to signal certain features of that object to an external inspector"


Adapter is perfect to loosely couple a class and an execution context.

But if you use Adapter like a programming philosophy rather than a quick fix to force an object in a specific process, you should also consider using interfaces.



Proxy provides indirect access to an expensive or a distant resource.

A Proxy sits between a Client and a Subject, as shown in the figure. It is intended to optimize accesses to the Subject when they are expensive. For instance, the memoize decorator described in the previous chapter can be considered a Proxy.

A Proxy can also be used to provide smart access to a subject. For instance, big video files can be wrapped into proxies to avoid loading them into memory when the user just asks for their titles.

An example is given by the urllib2 module: urlopen is a proxy for the content located at a remote URL. When it is created, headers can be retrieved independently of the content itself:

>>> from urllib2 import urlopen
>>> class Url(object):
...     def __init__(self, location):
...         self._url = urlopen(location)
...     def headers(self):
...         return dict(self._url.headers.items())
...     def get(self):
...         return self._url.read()
>>> python_org = Url('http://python.org')
>>> python_org.headers()
{'content-length': '16399', 'accept-ranges': 'bytes', 'server': 'Apache/2.2.3 (Debian) DAV/2 SVN/1.4.2 mod_ssl/2.2.3 OpenSSL/0.9.8c', 'last-modified': 'Mon, 09 Jun 2008 15:36:07 GMT', 'connection': 'close', 'etag': '"6008a-400f-91f207c0"', 'date': 'Tue, 10 Jun 2008 22:17:19 GMT', 'content-type': 'text/html'}

This can be used to decide whether the page has been changed before getting its body to update a local copy, by looking at the last-modified header. Let's take an example with a big file:

>>> ubuntu_iso = Url('http://ubuntu.mirrors.proxad.net/hardy/ubuntu-8.04-desktop-i386.iso')
>>> ubuntu_iso.headers()['last-modified']
'Wed, 23 Apr 2008 01:03:34 GMT'

Another use case of proxies is data uniqueness.

For example, let's consider a website that presents the same document in several locations. Extra fields specific to each location are appended to the document, such as a hit counter and a few permission settings. A proxy can be used in that case to deal with location-specific matters, and also to point to the original document instead of copying it. So a given document can have many proxies and if its content changes, all locations will benefit from it without having to deal with version synchronization.
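A minimal sketch of such location proxies (the Document and LocationProxy names are invented for the example):

```python
class Document(object):
    """The single original document."""
    def __init__(self, text):
        self.text = text

class LocationProxy(object):
    """Per-location state plus a pointer to the shared document."""
    def __init__(self, document):
        self._document = document
        self.hits = 0                   # location-specific field
    @property
    def text(self):
        self.hits += 1
        return self._document.text      # always the original content

original = Document('hello')
home = LocationProxy(original)
archive = LocationProxy(original)
original.text = 'hello world'           # one change...
print(archive.text)                     # ...visible everywhere
```

Each location keeps its own hit counter, but the content is never copied, so a change to the original is immediately visible from every proxy.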


Use Proxy as a local handle of something that may live somewhere else to:

  • Make the process faster.

  • Avoid external resource access.

  • Reduce memory load.

  • Ensure data uniqueness.



Facade provides a high-level, simpler access to a subsystem.

A Facade is nothing but a shortcut to use a functionality of the application, without having to deal with the underlying complexity of a subsystem. This can be done, for instance, by providing high-level functions at the package level.


See Tracking Verbosity in Chapter 4 for examples.

Facade is usually applied to existing systems, where a package's frequent usage is synthesized into high-level functions. Usually, no classes are needed to provide this pattern, and simple functions in the __init__.py module are sufficient.
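Sketched below is what such a package-level facade can look like; imagine a billing package whose __init__.py exposes one function and hides the subsystem classes (all names here are invented for the example):

```python
# billing/_engine.py equivalent -- the "complex" subsystem
class TaxRules(object):
    def rate(self, country):
        return {'fr': 0.20, 'us': 0.07}.get(country, 0.0)

class InvoiceBuilder(object):
    def __init__(self, rules):
        self._rules = rules
    def build(self, amount, country):
        tax = amount * self._rules.rate(country)
        return {'net': amount, 'tax': tax, 'total': amount + tax}

# billing/__init__.py equivalent -- the facade: one call,
# no knowledge of TaxRules or InvoiceBuilder required
def make_invoice(amount, country):
    return InvoiceBuilder(TaxRules()).build(amount, country)

print(make_invoice(100.0, 'fr')['total'])  # prints: 120.0
```

Client code only ever imports make_invoice; the subsystem classes can be reorganized freely without touching the callers.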


Facade simplifies the usage of your packages. Facades are usually added after a few iterations with usage feedback.

Behavioral Patterns

Behavioral patterns are intended to simplify the interactions between classes by structuring the processes with which they interact.

This section provides three examples:

  • Observer

  • Visitor

  • Template



Observer is used to notify a list of objects of a state change.

Observer allows adding features in an application in a pluggable way by de-coupling the new functionality from the existing code base. An event framework is a typical implementation of the Observer pattern and is described in the figure that follows. Every time an event occurs, all observers for this event are notified with the subject that has triggered this event.

An event occurs when something happens. In graphical user interface applications, event-driven programming (see http://en.wikipedia.org/wiki/Event-driven_programming) is often used to link the code to user actions. For instance, a function can be linked to the MouseMove event, so it is called every time the mouse moves over the window. In that case, de-coupling the code from the window management internals simplifies the work a lot: Functions are written separately and then registered as event observers. This approach has existed since the earliest versions of Microsoft's MFC framework (see http://en.wikipedia.org/wiki/Microsoft_Foundation_Class_Library), and exists in all GUI development tools such as Delphi.

But the code can also generate events. For instance, in an application that stores documents in a database, DocumentCreated, DocumentModified, and DocumentDeleted can be three events provided by the code.

A new feature that works on documents can register itself as an observer to get notified every time a document is created, modified, or deleted, and do the appropriate work. A document indexer could be added that way in an application. Of course, this requires that all the code in charge of creating, modifying, or deleting documents is triggering events. But this is rather easier than adding indexing hooks all over the application code base!

An Event class can be implemented for registration of observers in Python by working at the class level:

>>> class Event(object):
...     _observers = []
...     def __init__(self, subject):
...         self.subject = subject
...     @classmethod
...     def register(cls, observer):
...         if observer not in cls._observers:
...             cls._observers.append(observer)
...     @classmethod
...     def unregister(cls, observer):
...         if observer in cls._observers:
...             cls._observers.remove(observer)
...     @classmethod
...     def notify(cls, subject):
...         event = cls(subject)
...         for observer in cls._observers:
...             observer(event)

The idea is that observers register themselves using the Event class method and get notified with Event instances that carry the subject that triggered them:

>>> class WriteEvent(Event):
...     def __repr__(self):
...         return 'WriteEvent'
>>> def log(event):
...     print '%s was written' % event.subject
>>> WriteEvent.register(log)
>>> class AnotherObserver(object):
...     def __call__(self, event):
...         print 'Yeah %s told me !' % event
>>> WriteEvent.register(AnotherObserver())
>>> WriteEvent.notify('a given file')
a given file was written
Yeah WriteEvent told me !

This implementation could be enhanced by:

  • Allowing the developer to change the order in which observers are notified

  • Making the event object hold more information than just the subject
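For instance, the first enhancement can be sketched by registering each observer with a priority. This variant of the chapter's Event class notifies lower numbers first (the priority parameter is an addition, not part of the original code):

```python
class Event(object):
    _observers = []   # (priority, callable) pairs

    def __init__(self, subject):
        self.subject = subject

    @classmethod
    def register(cls, observer, priority=50):
        if observer not in [ob for _, ob in cls._observers]:
            cls._observers.append((priority, observer))
            # keep observers sorted so notify() walks them in order
            cls._observers.sort(key=lambda pair: pair[0])

    @classmethod
    def notify(cls, subject):
        event = cls(subject)
        for _, observer in cls._observers:
            observer(event)

calls = []
Event.register(lambda e: calls.append('logger'), priority=90)
Event.register(lambda e: calls.append('indexer'), priority=10)
Event.notify('a file')
print(calls)  # prints: ['indexer', 'logger']
```

The indexer registered last but runs first, because ordering is now controlled by priority rather than by registration order.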


De-coupling your code is fun and Observer is the right pattern to do it. It componentizes your application and makes it extensible.

If you want to use an existing tool, try Pydispatch. It provides a nice multi-consumer and multi-producer dispatch mechanism. See http://www.sqlobject.org/module-sqlobject.include.pydispatch.html.



Visitor helps in separating algorithms from data structures.

Visitor has a similar goal to that of Observer: It allows extending the functionalities of a given class without changing its code. But Visitor goes a bit further by defining a class that is responsible for holding data, and pushing the algorithms into other classes called Visitors. Each visitor is specialized in one algorithm and can apply it to the data. This behavior is quite similar to the MVC paradigm (see http://en.wikipedia.org/wiki/Model-view-controller), where documents are passive containers pushed to views through controllers, and where models contain data that is altered by a controller.

Visitor is implemented by providing an entry point in the data class that can be visited by all kinds of visitors. A generic description is a Visitable class that accepts Visitor instances and calls them, as shown in the figure below:

The Visitable class decides how it calls the Visitor class, for instance, by deciding which method is called. A visitor in charge of printing built-in type content can implement visit_TYPENAME methods, and each of these types can call the corresponding method in its accept method:

>>> class vlist(list):
...     def accept(self, visitor):
...         visitor.visit_list(self)
>>> class vdict(dict):
...     def accept(self, visitor):
...         visitor.visit_dict(self)
>>> class Printer(object):
...     def visit_list(self, ob):
...         print 'list content :'
...         print str(ob)
...     def visit_dict(self, ob):
...         print 'dict keys: %s' % ','.join(ob.keys())
>>> a_list = vlist([1, 2, 5])
>>> a_list.accept(Printer())
list content :
[1, 2, 5]
>>> a_dict = vdict({'one': 1, 'two': 2, 'three': 3})
>>> a_dict.accept(Printer())
dict keys: one,three,two

But this pattern means that each visited class needs to have an accept method to be visited, which is quite painful.

Since Python allows code introspection, a better idea is to link visitors and visited classes automatically:

>>> def visit(visited, visitor):
...     cls = visited.__class__.__name__
...     method = getattr(visitor, 'visit_%s' % cls, None)
...     if method is not None:
...         method(visited)
>>> visit([1, 2, 5], Printer())
list content :
[1, 2, 5]
>>> visit({'one': 1, 'two': 2, 'three': 3}, Printer())
dict keys: three,two,one

This pattern is used in this way in the compiler.visitor module, for instance, by the ASTVisitor class, which calls the visitor with each node of the compiled code tree. This is because Python doesn't have a match operator like the one in Haskell.

Another example is a directory walker that calls Visitor methods depending on the file extension:

>>> import os
>>> def visit(directory, visitor):
...     for root, dirs, files in os.walk(directory):
...         for file in files:
...             # foo.txt → .txt
...             ext = os.path.splitext(file)[-1][1:]
...             if hasattr(visitor, ext):
...                 getattr(visitor, ext)(file)
>>> class FileReader(object):
...     def pdf(self, file):
...         print 'processing %s' % file
>>> visit('/Users/tarek/Desktop', FileReader())
processing slides.pdf
processing sholl23.pdf


If your application has data structures that are visited by more than one algorithm, the Visitor pattern will help in separating concerns: It is better for a data container to focus only on providing access to data and holding them, and nothing else.

A good practice is to create data structures that do not have any method, like a struct in C would be.



Template helps in designing a generic algorithm by defining abstract steps, which are implemented in subclasses.

Template uses the Liskov substitution principle, which says:

"If S is a subtype of T, then objects of type T in a program may be replaced with objects of type S without altering any of the desirable properties of that program." (Wikipedia)

In other words, an abstract class can define how an algorithm works through steps that are implemented in concrete classes. The abstract class can also give a basic or partial implementation of the algorithm, and let developers override its parts. For instance, some methods of the Queue class in the Queue module can be overridden to make its behavior vary.
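Queue is itself a good illustration of this: put and get implement the locking once and delegate the storage steps to overridable _init, _put, _get, and _qsize hooks. The sketch below builds a LIFO variant by overriding only those hooks, which mirrors what the standard library's LifoQueue does (note that the module is named queue in Python 3):

```python
try:                       # the module was renamed in Python 3
    import Queue as queue
except ImportError:
    import queue

class LifoQueue(queue.Queue):
    """Override only the storage hooks; put()/get() keep the
    locking logic from the base class (the template)."""
    def _init(self, maxsize):
        self.stack = []
    def _qsize(self):
        return len(self.stack)
    def _put(self, item):
        self.stack.append(item)
    def _get(self):
        return self.stack.pop()

q = LifoQueue()
q.put(1)
q.put(2)
print(q.get())  # prints: 2
```

The subclass never touches the synchronization code, yet completely changes the queue discipline, which is exactly the Template idea: the abstract steps vary, the overall algorithm does not.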

Let's implement an example shown in the figure that follows. Indexer is an indexer class that processes a text in five steps, which are common steps no matter what indexing technique is used:

  • Text normalization

  • Text split

  • Stop words removal

  • Stem words

  • Frequency

An Indexer provides a partial implementation for the process algorithm, but requires _remove_stop_words and _stem_words to be implemented in a subclass. BasicIndexer implements the strict minimum, while LocalIndex uses a stop word file and a stem words database. FastIndexer implements all steps and could be based on a fast indexer such as Xapian or Lucene.

A toy implementation can be:

>>> class Indexer(object):
...     def process(self, text):
...         text = self._normalize_text(text)
...         words = self._split_text(text)
...         words = self._remove_stop_words(words)
...         stemmed_words = self._stem_words(words)
...         return self._frequency(stemmed_words)
...     def _normalize_text(self, text):
...         return text.lower().strip()
...     def _split_text(self, text):
...         return text.split()
...     def _remove_stop_words(self, words):
...         raise NotImplementedError
...     def _stem_words(self, words):
...         raise NotImplementedError
...     def _frequency(self, words):
...         counts = {}
...         for word in words:
...             counts[word] = counts.get(word, 0) + 1
...         return counts

From there, a BasicIndexer implementation can be:

>>> class BasicIndexer(Indexer):
...     _stop_words = ('he', 'she', 'is', 'and', 'or')
...     def _remove_stop_words(self, words):
...         return (word for word in words
...                 if word not in self._stop_words)
...     def _stem_words(self, words):
...         return ((len(word) > 2 and word.rstrip('aeiouy')
...                  or word)
...                 for word in words)
...     def _frequency(self, words):
...         freq = {}
...         for word in words:
...             freq[word] = freq.get(word, 0) + 1
...         return freq
>>> indexer = BasicIndexer()
>>> indexer.process(('My Tailor is rich and he is also '
...                  'my friend'))
{'tailor': 1, 'rich': 1, 'my': 2, 'als': 1, 'friend': 1}


Template should be considered for an algorithm that may vary and can be expressed into isolated sub-steps.

This is probably the most used pattern in Python.


Design patterns are reusable, somewhat language-specific solutions to common problems in software design. They are a part of the culture of all developers, no matter what language they use.

So having implementation examples of the most used patterns for a given language is a great way to document it. There are Python implementations of each of the GoF design patterns on various websites, when they make sense. The Python Cookbook at http://aspn.activestate.com/ASPN/Python/Cookbook/ in particular is a good place to look.

Python provides built-in support for some of these patterns, and this chapter has shown how to implement a few others:

  • Singleton

  • Adapter

  • Proxy

  • Facade

  • Observer

  • Visitor

  • Template


For more information on Design Patterns:

Watch Alex Martelli's talk: http://www.youtube.com/watch?v=0vJJlVBVTFg.

A nice pattern write-up by Shannon Behrens is at: http://www.linuxjournal.com/article/8747