As you know, Django opens a new database connection for each request. This works well initially. However, as the load on the server increases, creating and destroying database connections starts taking a significant amount of time. You will find many questions about using some kind of connection pooling for Django on sites like StackOverflow; for example, Django persistent database connection.
At BootStrapToday we use SQLAlchemy's connection pooling mechanism with Django to pool database connections. We use a variation of the approach by Igor Katson described in http://dumpz.org/67550/. Igor's approach requires patching Django, which we wanted to avoid. Hence we created a small function that we import in an __init__.py (or models.py), i.e. some file which gets imported early during application startup.
import logging

import sqlalchemy.pool as pool
from django.conf import settings
from django.db.utils import load_backend

pool_initialized = False

def init_pool():
    if not globals().get('pool_initialized', False):
        global pool_initialized
        pool_initialized = True
        try:
            backendname = settings.DATABASES['default']['ENGINE']
            backend = load_backend(backendname)
            # Replace the DB-API module object with a pooling proxy.
            backend.Database = pool.manage(backend.Database)
            backend.DatabaseError = backend.Database.DatabaseError
            backend.IntegrityError = backend.Database.IntegrityError
            logging.info("Connection Pool initialized")
        except Exception:
            logging.exception("Connection Pool initialization error")

# Now call init_pool() to initialize the connection pool.
# No change is required in the Django code.
init_pool()
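Wiring this in is just an import. A minimal sketch, assuming the snippet above is saved as a module named dbpool.py somewhere on the project path (the module name is hypothetical):

# some_app/models.py  -- any module that Django imports early at startup will do
import dbpool  # importing the module runs init_pool() once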
So far this seems to be working quite well.
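If the default pool settings do not suit your traffic, pool.manage() also accepts the keyword arguments of the underlying SQLAlchemy QueuePool. The snippet below is a minimal sketch of such tuning; the values are illustrative only, and it applies to SQLAlchemy versions that still ship pool.manage (the function was removed in later releases).

# Illustrative only: forward QueuePool options through pool.manage().
backend.Database = pool.manage(
    backend.Database,
    pool_size=5,      # connections kept open per distinct set of connect() arguments
    max_overflow=10,  # extra connections allowed during bursts
    recycle=300,      # close and reopen connections older than 300 seconds
)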