Switching to PostgreSQL fails loading datadump

https://www.devze.com · 2023-01-03 21:14 · Source: web
I'm using sqlite3 with Django and want to switch to PostgreSQL while keeping all data intact. I used ./manage.py dumpdata > dump.json to dump the data, then changed settings to use PostgreSQL.

With an empty database ./manage.py loaddata dump.json resulted in errors about tables not existing, so I ran ./manage.py syncdb and tried again. That results in this error:

Problem installing fixture 'dump.json': Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/django/core/management/commands/loaddata.py", line 163, in handle
    obj.save()
  File "/usr/lib/python2.6/site-packages/django/core/serializers/base.py", line 163, in save
    models.Model.save_base(self.object, raw=True)
  File "/usr/lib/python2.6/site-packages/django/db/models/base.py", line 495, in save_base
    rows = manager.filter(pk=pk_val)._up开发者_如何学运维date(values)
  File "/usr/lib/python2.6/site-packages/django/db/models/query.py", line 448, in _update
    return query.execute_sql(None)
  File "/usr/lib/python2.6/site-packages/django/db/models/sql/subqueries.py", line 124, in execute_sql
    cursor = super(UpdateQuery, self).execute_sql(result_type)
  File "/usr/lib/python2.6/site-packages/django/db/models/sql/query.py", line 2347, in execute_sql
    cursor.execute(sql, params)
  File "/usr/lib/python2.6/site-packages/django/db/backends/util.py", line 19, in execute
    return self.cursor.execute(sql, params)
IntegrityError: duplicate key value violates unique constraint "django_content_type_app_label_key"

Is this not the correct way to move data from one database to another? How should I switch database backend safely?


The problem is simply that you're getting the content types defined twice - once when you do syncdb, and once from the exported data you're trying to import. Since you may well have other items in your database that depend on the original content type definitions, I would recommend keeping those.

So, after running syncdb, do manage.py dbshell and in your database do TRUNCATE django_content_type; to remove all the newly-defined content types. Then you shouldn't get any conflicts - on that part of the process, in any case.


There is a long discussion about this on Django ticket 7052. The recommended approach now is the --natural flag, for example: ./manage.py dumpdata --natural --format=xml --indent=2 > fixture.xml

In order for --natural to work with your models, they must implement natural_key and get_by_natural_key, as described in the Django documentation on natural keys.
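As a rough illustration of that protocol, here is a plain-Python sketch (no Django required, and all names here are hypothetical). In actual Django code, get_by_natural_key lives on the model's custom Manager and natural_key on the model itself:

```python
# Hypothetical sketch of the natural-key protocol. With --natural,
# the serializer stores the tuple from natural_key() instead of the
# integer primary key, and deserialization calls get_by_natural_key
# to resolve that tuple back to a row.
class PersonManager:
    def __init__(self, rows):
        self._rows = rows  # stand-in for the database table

    def get_by_natural_key(self, first, last):
        # Deserialization: look the object up by its identifying
        # fields rather than by a database-specific pk.
        return next(p for p in self._rows
                    if (p.first, p.last) == (first, last))


class Person:
    def __init__(self, first, last):
        self.first, self.last = first, last

    def natural_key(self):
        # Serialization: dumpdata --natural writes this tuple
        # in place of the primary key.
        return (self.first, self.last)


people = [Person("Ada", "Lovelace"), Person("Alan", "Turing")]
manager = PersonManager(people)
print(manager.get_by_natural_key("Alan", "Turing").first)  # → Alan
```

Because references no longer depend on auto-assigned pk values, the fixture survives being loaded into a database whose content-type rows were assigned different ids.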

Having said that, you might still need to edit the data before importing it with ./manage.py loaddata. For instance, if your applications changed, syncdb will have populated the django_content_type table, and you may want to delete the corresponding entries from the XML file before loading it.
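The same cleanup can be scripted for a JSON dump (the answer above edits XML, but the idea is identical): drop every contenttypes entry so loaddata doesn't collide with the rows syncdb already created. A hedged sketch, with hypothetical helper and fixture data:

```python
import json

# Hypothetical helper: filter contenttypes entries out of a dumpdata
# JSON fixture before running loaddata against the fresh database.
def strip_content_types(fixture):
    return [obj for obj in fixture
            if not obj["model"].startswith("contenttypes.")]

# Minimal made-up fixture in dumpdata's JSON shape.
dump = json.loads("""[
  {"model": "contenttypes.contenttype", "pk": 1,
   "fields": {"app_label": "auth", "model": "user"}},
  {"model": "auth.user", "pk": 1, "fields": {"username": "admin"}}
]""")

cleaned = strip_content_types(dump)
print([obj["model"] for obj in cleaned])  # → ['auth.user']
```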


This worked for me. You probably want to stop the server first, so any data written after the dump isn't lost. Dump it:

$ python manage.py dumpdata --exclude auth.permission --exclude contenttypes --natural > db.json

Make sure your models don't have signal receivers (e.g. post_save) or anything else that creates objects as a side effect. If they do, comment them out temporarily.

Edit settings.py to point to the new database and set it up:

$ python manage.py syncdb

$ python manage.py migrate

Load the data:

$ python manage.py loaddata db.json
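Before the final load, it can be worth sanity-checking the dump for the models the --exclude flags above are meant to keep out (contenttypes and auth.permission), since those are the usual source of IntegrityError collisions. A hypothetical helper, shown with made-up fixture data:

```python
import json

# Model labels that syncdb recreates itself; if they appear in the
# fixture, loaddata will collide with the freshly created rows.
COLLIDING_PREFIXES = ("contenttypes.", "auth.permission")

def find_collisions(fixture):
    """Return the model labels that would clash with syncdb's rows."""
    return [obj["model"] for obj in fixture
            if obj["model"].startswith(COLLIDING_PREFIXES)]

# In practice you would load the real dump:
#   fixture = json.load(open("db.json"))
fixture = [
    {"model": "auth.user", "pk": 1, "fields": {"username": "admin"}},
    {"model": "auth.permission", "pk": 3,
     "fields": {"codename": "add_user"}},
]
print(find_collisions(fixture))  # → ['auth.permission']
```

An empty result means the dump was made with the exclusions in place; otherwise, re-run dumpdata with the --exclude flags.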


I used pgloader; it took just a few seconds to migrate successfully:

$ pgloader project.load

where project.load contains:

load database
  from sqlite:////path/to/dev.db
  into postgresql://user:pwd@localhost/db_name

with include drop, create tables, create indexes, reset sequences

 set work_mem to '16MB', maintenance_work_mem to '512 MB';
