Limiting Memory Use in a *Large* Django QuerySet

Chris W. · Jan 31, 2011 · Viewed 10.9k times

I have a task which needs to run on 'most' of the objects in my database every so often (once a day, once a week, whatever). Basically this means I have a query that looks like this running in its own thread:

for model_instance in SomeModel.objects.all():
    do_something(model_instance)

(Note that it's actually a filter(), not all(), but nonetheless I still end up selecting a very large set of objects.)

The problem I'm running into is that after running for a while, the thread is killed by my hosting provider because I'm using too much memory. I'm assuming this is because, even though the QuerySet returned by my query initially has a very small memory footprint, it grows as the QuerySet caches each model_instance while I iterate through them.

My question is, "what is the best way to iterate through almost every SomeModel in my database in a memory-efficient way?" or perhaps my question is "how do I 'un-cache' model instances from a Django queryset?"

EDIT: I'm actually using the results of the queryset to build a series of new objects. As such, I don't end up updating the queried-for objects at all.
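
For reference, the "un-cache" behavior asked about here is what Django's QuerySet.iterator() provides: it streams results from the database cursor without populating the queryset cache. A minimal sketch, reusing the do_something() placeholder from above (the chunk_size argument assumes Django 2.0+; on older versions, call iterator() with no arguments):

from djangoapp.models import SomeModel

# iterator() streams rows instead of caching every instance on the
# QuerySet, so memory use stays roughly flat across the whole table.
for model_instance in SomeModel.objects.all().iterator(chunk_size=1000):
    do_something(model_instance)  # placeholder from the question above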

Answer

mpaf · Nov 29, 2013

What about using Django core's Paginator and Page objects, documented here:

https://docs.djangoproject.com/en/dev/topics/pagination/

Something like this:

from django.core.paginator import Paginator
from djangoapp.models import SomeModel

paginator = Paginator(SomeModel.objects.all(), 1000)  # chunks of 1000

for page_idx in paginator.page_range:  # covers every page, 1 through num_pages
    for row in paginator.page(page_idx).object_list:
        do_something(row)  # here you can do what you want with the row
    print("done processing page %s" % page_idx)