Wednesday, November 17, 2010

Export and Import data in Oracle (including BLOB and XMLTYPE fields)

To copy data from one database schema to another: that's the point.
You never needed it before? You will, no doubt. And at that moment, you'll find the wizard functionality of Toad (or similar software) very handy.
To use it, just right-click on the desired table in the left column, select "Copy data to another schema" from the context menu and follow the wizard.
You will be asked to select a destination connection (i.e. database) and a destination user (i.e. schema), then to set the options you want: commit frequency (default is every 500 rows), append or truncate (delete + insert) mode, "Where" conditions and so on. Then press "Execute" and wait for the script to finish.
Note that you can move data from one user to another on the same database, or between users on different databases.
Great, you moved some data from one table to an identical table in a different place.

But wait... what if you have some BLOB or XMLTYPE fields? These are complex, potentially huge fields, containing tons of bytes, and they are not handled by Toad or by any other wizard I know of (except for some expensive third-party tools that you'd have to buy just for this single operation).

This is the interesting part of the post: Oracle itself provides tools to export and import ANY kind of data without any problem.

These tools are EXP and IMP, two command-line executables installed by the Oracle installer (Client or Server, it doesn't matter) if you choose to install the Oracle Tools too. You can find them in the BIN folder under your Oracle home folder.

Their behavior is well explained in the Oracle Wiki, but let's see how to use them in a few seconds ;)
Remember to write the whole command on a single line; I'm splitting it across lines here only to keep the code readable.

EXP syntax for dummies

EXP username/password@dbInstance
file=dumpPath\dumpFilename
log=logPath\logFilename
tables=table1,table2,...tableN
query=\" put_here_your_optional_query_conditions \"
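If you don't like writing everything on one long line, EXP (and IMP) can also read all of this from a parameter file through the parfile keyword; inside a parameter file the quotes around the query don't need the \" escaping. A minimal sketch, where the file name and paths are just examples:

```
# contents of C:\myexports\bigtables.par (one parameter per line)
file=C:\myexports\dmp\allthebigtables.dmp
log=C:\myexports\logs\allthebigtables.txt
tables=(EMPLOYEES,CUSTOMERS)
query="where age>50"
```

The command itself then shrinks to: EXP MySociety/myPass@my_database parfile=C:\myexports\bigtables.par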


Example n.1: export the hypothetical tables EMPLOYEES, CUSTOMERS and DOCUMENTS from the user "MySociety" (password "myPass") on a database mapped as "my_database" in tnsnames.ora, without any where condition, into the folder "myexports" on the C: drive, with subfolders for dumps and logs.

EXP MySociety/myPass@my_database file=C:\myexports\dmp\allthebigtables.dmp log=C:\myexports\logs\allthebigtables.txt tables=EMPLOYEES,CUSTOMERS,DOCUMENTS



Example n.2: export the same tables EMPLOYEES and CUSTOMERS, specifying that the field AGE in both must be over 50.

EXP MySociety/myPass@my_database file=C:\myexports\dmp\overFifty.dmp log=C:\myexports\logs\overFifty.txt tables=EMPLOYEES,CUSTOMERS query=\" where age>50 \"



Example n.3: export the same tables EMPLOYEES and CUSTOMERS, specifying that the field AGE of EMPLOYEES must be over 50, and the field AGE of CUSTOMERS must be under 25. You will need to write two separate commands and execute them one after the other, because a single "query" condition applies to every table in the list.

EXP MySociety/myPass@my_database file=C:\myexports\dmp\empOverFifty.dmp log=C:\myexports\logs\empOverFifty.txt tables=EMPLOYEES query=\" where age>50 \"

EXP MySociety/myPass@my_database file=C:\myexports\dmp\customerUnderTwentyfive.dmp log=C:\myexports\logs\customerUnderTwentyfive.txt tables=CUSTOMERS query=\" where age<25 \"



This tool is so powerful that it will read and export all the data from all the targeted tables into one single dump file. Now let's see how to import that data into a different database and schema.


IMP Syntax for dummies

IMP username/password@dbInstance
file=dumpPath\dumpFilename
log=logPath\logFilename
tables=table1,table2,...tableN
ignore=Y


Example: we want to import the previously saved "allthebigtables.dmp" file into a database called "external_database", into a schema with username "YourSociety" and password "yourPass".


IMP YourSociety/yourPass@external_database file=C:\myexports\dmp\allthebigtables.dmp log=C:\myexports\logs\allthebigtables_import.txt tables=EMPLOYEES,CUSTOMERS,DOCUMENTS ignore=Y


This will import all the previously exported data into the new database, ignoring object creation errors (such as "table already exists").
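One caveat worth double-checking on your Oracle version: the dump above was created by the user MySociety, but here we import it while connected as YourSociety. With the classic IMP tool, importing one user's tables into a different schema is usually done with the fromuser/touser parameters, typically while connected as a privileged user. A hedged sketch (the SYSTEM credentials below are placeholders):

```
IMP system/systemPass@external_database file=C:\myexports\dmp\allthebigtables.dmp log=C:\myexports\logs\allthebigtables_import.txt fromuser=MySociety touser=YourSociety ignore=Y
```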

Now you are ready to backup or move all your data between all your database instances and schemas.

As a further step, it's a very good idea to create batch files for this, especially if the operation you are performing now will be repeated in the future (like moving data between different environments, for example from Production to Test).

To do this, open a text editor, create a new file, paste in all the commands you've written, and save it with the ".bat" extension.
To avoid confusion, I recommend using at least one file for exports and another one for imports ;)
P.S.: Remember to use the full path before IMP or EXP, so that you can launch the batch file from any place on your system without touching environment variables like PATH.
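For example, an export batch file might look like this (the Oracle home path below is just an assumption: use the BIN folder of your own installation):

```
REM export_bigtables.bat - runs the export with a full path to EXP,
REM so it works from any folder without touching PATH
C:\oracle\product\10.2.0\client_1\BIN\EXP.EXE MySociety/myPass@my_database file=C:\myexports\dmp\allthebigtables.dmp log=C:\myexports\logs\allthebigtables.txt tables=EMPLOYEES,CUSTOMERS,DOCUMENTS
```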

Feel free to comment if you find this post useful :)

Special thanks to Giovanni Galasso for the support.
