Changing a database's location is simple: dump the source database, import the dump into the destination database, redirect the domain and voilà! You can use the same method to migrate your database to a newer version of the database engine. But what can you do if you realize the whole backend must be changed (e.g. from MySQL to PostgreSQL)?
Migrating an SQL dump to a different database dialect is not easy (column types and date formats are the first incompatibilities that come to mind). But you don't have to operate on SQL dumps at all. The simple answer here is: "dumpdata".
Django uses a special manage.py script to handle typical operations: initialising the database, preloading data, dropping tables and so on. The command:
manage.py dumpdata appname
prints to stdout all data contained in appname in a portable JSON format. You can then load the dump you just created with:
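The dump is plain JSON: a list of records, each carrying the model label, the primary key and a dict of field values. As a sketch (the app/model name "appname.entry" and its fields are made up for illustration), you can inspect such a fixture with ordinary shell tools before loading it:

```shell
# Hypothetical example of what a dumpdata fixture looks like; in practice
# you would have created data.json with "manage.py dumpdata appname > data.json".
cat > data.json <<'EOF'
[
  {"model": "appname.entry", "pk": 1,
   "fields": {"title": "Hello", "created": "2008-05-01 12:00:00"}}
]
EOF

# Because the fixture is plain text, a quick grep confirms which models
# it contains before you load it into the new backend.
grep -o '"model": "[^"]*"' data.json
```

Since the format is backend-agnostic, the same file loads into MySQL, PostgreSQL or SQLite alike; Django serialises and deserialises the values itself.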
manage.py sqlreset appname | psql ...
manage.py loaddata filename.json
The database state must be reset before the import; that's what sqlreset is for. On its own, sqlreset prints DROP TABLE (and matching CREATE) statements to stdout, which purge the app's tables when piped to an SQL execution tool such as psql.
Additionally, you can gzip the JSON data to make the transfer between servers (much) faster:
manage.py dumpdata appname | gzip -c | ssh destinationserv 'cat > data.json.gz'
(login to destinationserv ...)
manage.py sqlreset appname | psql ...
gzip -dc data.json.gz | manage.py loaddata -
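Compressing the fixture costs nothing in fidelity: gzip is lossless, so the file decompressed on the destination server is byte-identical to the original dump. A quick local sanity check (the file names and the one-record fixture are illustrative):

```shell
# Create a tiny stand-in fixture, round-trip it through gzip and compare.
printf '[{"model": "appname.entry", "pk": 1}]\n' > data.json
gzip -c data.json > data.json.gz
gzip -dc data.json.gz > roundtrip.json

# cmp exits successfully only if the files are byte-identical.
cmp data.json roundtrip.json && echo "fixtures match"
```

JSON compresses very well (it is repetitive text), so on a large dump the transfer over ssh is typically several times faster than sending the raw file.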