I have a Django application running under lighttpd via FastCGI. The FCGI launch script looks like this:
python manage.py runfcgi socket=<path>/main.socket \
    method=prefork \
    pidfile=<path>/server.pid \
    minspare=5 maxspare=10 maxchildren=10 maxrequests=500
I use SQLite, so I have 10 processes that all work with the same DB. Then I have two views:
def view1(request):
    ...
    obj, created = MyModel.objects.get_or_create(id=1)
    obj.param1 = <some value>
    obj.save()
def view2(request):
    ...
    obj, created = MyModel.objects.get_or_create(id=1)
    obj.param2 = <some value>
    obj.save()
And if these views are executed in two different processes, I sometimes end up with a MyModel instance in the DB with id=1 where either param1 or param2 is updated, but not both; it depends on which process saved first. (Of course, in real life the id changes, but sometimes two processes do execute these two views with the same id.)
The question is: what should I do to end up with an instance where both param1 and param2 are updated? I need some way of merging the changes made by different processes.
One option is to create an interprocess lock object, but then the views would execute sequentially instead of simultaneously, so I'm asking for help.
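As a side note (this pattern is not in the original post): the lost update happens because each view saves the whole row after a read, so the later save overwrites columns it never touched. Writing only the changed column sidesteps this; in Django that is `MyModel.objects.filter(id=1).update(param1=...)`, which emits a single-column SQL `UPDATE`. A minimal sketch with the standard-library `sqlite3` module, using a hypothetical `mymodel` table, shows why per-column updates merge cleanly:

```python
import sqlite3

# In-memory stand-in for the MyModel table (id, param1, param2).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mymodel (id INTEGER PRIMARY KEY, param1 TEXT, param2 TEXT)")
conn.execute("INSERT INTO mymodel (id, param1, param2) VALUES (1, NULL, NULL)")

# view1-style write: touch only param1 (what QuerySet.update(param1=...) emits).
conn.execute("UPDATE mymodel SET param1 = ? WHERE id = 1", ("a",))
# view2-style write: touch only param2 -- it cannot clobber param1.
conn.execute("UPDATE mymodel SET param2 = ? WHERE id = 1", ("b",))

row = conn.execute("SELECT param1, param2 FROM mymodel WHERE id = 1").fetchone()
print(row)  # ('a', 'b') -- both writes survive
```

The trade-off is that `update()` bypasses the model's `save()` method and signals, so it only fits writes that don't depend on that machinery.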
DUPE OF Django: How can I protect against concurrent modification of database entries
SQLite is not a good choice if you need this kind of concurrent access to the database. I suggest switching to another RDBMS, such as MySQL or PostgreSQL. Also take into account the fragility of get_or_create:
How do I deal with this race condition in django?
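The fragility referred to above is that get_or_create can raise IntegrityError when two processes race on the same key: both SELECT, find nothing, and both INSERT. The usual defense (not spelled out in the original answer) is to attempt the INSERT and fall back to a SELECT if another process won. A self-contained sketch of that pattern using the standard-library `sqlite3` module and a hypothetical `mymodel` table:

```python
import sqlite3

def get_or_create(conn, obj_id):
    """Race-tolerant get-or-create: try the INSERT first; if another
    process already created the row, the PRIMARY KEY constraint raises
    IntegrityError and we simply fetch the existing row instead."""
    try:
        conn.execute("INSERT INTO mymodel (id) VALUES (?)", (obj_id,))
        created = True
    except sqlite3.IntegrityError:
        created = False
    row = conn.execute("SELECT id FROM mymodel WHERE id = ?", (obj_id,)).fetchone()
    return row, created

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mymodel (id INTEGER PRIMARY KEY)")
print(get_or_create(conn, 1))  # first caller creates the row
print(get_or_create(conn, 1))  # second caller just gets it, created=False
```

In Django code the equivalent is wrapping the get_or_create call in a try/except for django.db.IntegrityError and re-fetching with get() in the except branch.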
Regarding the link above, there is also a second solution to that problem: using the READ COMMITTED isolation level instead of REPEATABLE READ. It is less well tested (at least with MySQL), so there may be more bugs/problems with it.
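For what it's worth, newer Django versions (2.0+, well after this post) let you pick the MySQL isolation level directly in settings.py via the backend's `isolation_level` option, so no raw `SET SESSION TRANSACTION ISOLATION LEVEL` statement is needed. A sketch, with placeholder credentials:

```python
# settings.py fragment (assumes Django >= 2.0 with the MySQL backend)
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "mydb",          # placeholder
        "USER": "myuser",        # placeholder
        "PASSWORD": "secret",    # placeholder
        "OPTIONS": {
            # Run connections at READ COMMITTED instead of MySQL's
            # default REPEATABLE READ.
            "isolation_level": "read committed",
        },
    }
}
```

On older Django versions the same effect requires issuing the isolation-level statement yourself, e.g. from a connection-setup signal handler.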