Django shift to PostgreSQL fails to import fixtures, stating data too long

Source: https://www.devze.com, 2022-12-22 01:21 (from the web)
I am switching from using SQLite3 to PostgreSQL, and hoped that I could populate the database using the fixtures I had been using to populate SQLite3. However, I am getting these errors:

$ python manage.py loaddata fixtures/core.json fixtures/auth.json

Installing json fixture 'fixtures/core' from absolute path.
Problem installing fixture 'fixtures/core.json': Traceback (most recent call last):
  File "/home/mvid/webapps/nihl/nihlapp/django/core/management/commands/loaddata.py", line 153, in handle
    obj.save()
  File "/home/mvid/webapps/nihl/nihlapp/django/core/serializers/base.py", line 163, in save
    models.Model.save_base(self.object, raw=True)
  File "/home/mvid/webapps/nihl/nihlapp/django/db/models/base.py", line 495, in save_base
    result = manager._insert(values, return_id=update_pk)
  File "/home/mvid/webapps/nihl/nihlapp/django/db/models/manager.py", line 177, in _insert
    return insert_query(self.model, values, **kwargs)
  File "/home/mvid/webapps/nihl/nihlapp/django/db/models/query.py", line 1087, in insert_query
    return query.execute_sql(return_id)
  File "/home/mvid/webapps/nihl/nihlapp/django/db/models/sql/subqueries.py", line 320, in execute_sql
    cursor = super(InsertQuery, self).execute_sql(None)
  File "/home/mvid/webapps/nihl/nihlapp/django/db/models/sql/query.py", line 2369, in execute_sql
    cursor.execute(sql, params)
  File "/home/mvid/webapps/nihl/nihlapp/django/db/backends/util.py", line 19, in execute
    return self.cursor.execute(sql, params)
DataError: value too long for type character varying(30)

I never got any data-length errors before, and I have not changed the models between database switches. PostgreSQL is running UTF-8. Is there a way to see exactly which JSON values it fails on, so that I can update the respective models? Any idea why the values worked in SQLite but fail in PostgreSQL?


SQLite does not enforce the declared length of a varchar(n). From the SQLite FAQ:

http://www.sqlite.org/faq.html#q9
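A minimal sketch of the difference (the table and column names here are made up for illustration): SQLite treats the varchar(30) length as advisory and stores the over-length string intact, whereas PostgreSQL would reject the same insert with "value too long for type character varying(30)".

```python
import sqlite3

# SQLite ignores the declared varchar length: this 40-character
# insert succeeds, where PostgreSQL would raise a DataError.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE team (name varchar(30))")
conn.execute("INSERT INTO team (name) VALUES (?)", ("x" * 40,))
(stored_len,) = conn.execute("SELECT length(name) FROM team").fetchone()
print(stored_len)  # 40 — all 40 characters were stored despite varchar(30)
```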


Check the PostgreSQL log file; it logs the full failed SQL statement, including the table name and the offending string value.


To solve the actual problem, change the column declaration to text. That will allow you to import your data and clean it up; afterwards you can reapply the length constraint.
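In Django terms that means temporarily swapping the bounded field for an unbounded one. This is a sketch only: the model name `Team` and field name `name` are assumptions — substitute whatever field the Postgres log identifies as too long.

```python
from django.db import models


class Team(models.Model):
    # Before: name = models.CharField(max_length=30)
    # TextField maps to Postgres `text`, which has no length limit,
    # so loaddata can import the over-length values.
    name = models.TextField()

# Once the data is imported and trimmed, restore the constraint
# (change back to CharField(max_length=30)) and migrate the schema again.
```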


Check the length of the SQLite data inside fixtures/core.json that you are trying to insert into Postgres. It looks like some value exceeds 30 characters. Find the Django model field that is limited to 30 characters, then check your fixtures for data that exceeds that maximum length.
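Since a Django fixture is just a JSON list of objects, the check above can be scripted. This is a small helper I'd write for the purpose (the function name and the 30-character threshold are my own choices, not from the question):

```python
import json


def find_long_values(fixture_path, max_len=30):
    """Return (model, pk, field, length) for every string field in a
    Django JSON fixture whose value exceeds max_len characters."""
    with open(fixture_path) as f:
        objects = json.load(f)
    offenders = []
    for obj in objects:
        for field, value in obj.get("fields", {}).items():
            if isinstance(value, str) and len(value) > max_len:
                offenders.append((obj["model"], obj.get("pk"), field, len(value)))
    return offenders
```

Running `find_long_values("fixtures/core.json", 30)` lists exactly which objects and fields will trip the varchar(30) constraint, so you can fix the data or widen the model field.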

