Introduction
In order to create a test system for one of our customers I needed to copy some schemas from the live Oracle database. As the main schema contained a lot of objects, I realised that the best tool to use was Data Pump, something I'd never used before.
Oracle Data Pump first appeared in 10g and provided a newer and more flexible alternative to the 'exp' and 'imp' utilities from previous Oracle versions. It's a server-based technology, so you'll need access to the file directory structure on the database server. That means you'll need to be able to connect remotely to both the source and target servers.
Running an Export
We're going to start by running a schema export. It's quite straightforward, but we need to ensure that we have a directory object configured in Oracle. You could go ahead and just add one, but it's worth checking whether there's one already set up that you can use. Log into the source server (in my case Live) and type the following:-
SQL> select directory_name, directory_path from all_directories;
The results should give you a column listing the object name, and a second stating its actual directory path on disk. It makes sense to choose one where the directory actually exists, and where you have file creation rights, but that should go without saying!
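On most installations you'll at least see Oracle's default DATA_PUMP_DIR object, so you can expect output along these lines (the path here is purely illustrative):-

DIRECTORY_NAME       DIRECTORY_PATH
-------------------- ----------------------------------------
DATA_PUMP_DIR        /u01/app/oracle/admin/orcl/dpdump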
If nothing suitable exists then go ahead and create one, and then grant yourself read and write on it.
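Connected as a suitably privileged user, something like this should do the trick (the object name and path below are just examples, adjust them for your server):-

SQL> create directory dp_dir as '/u01/app/oracle/dpdump';
SQL> grant read, write on directory dp_dir to <my user>;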
You run the export from the operating system's command prompt; here's what I used:-
expdp <my user>/<my password> directory=<my directory object> dumpfile=<my export file>.dmp schemas=<schema to export>
When it runs it will output its progress to the screen and may take a number of minutes to complete (depending on the number of objects and the size of the tables). If you'd prefer, the progress can be sent to a file just by including the following parameter:-
logfile=<export log>.log
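Putting it all together, a complete invocation might look something like this (I'm using Oracle's default DATA_PUMP_DIR object here; the user, file names and schema name are just placeholders):-

expdp system/<password> directory=DATA_PUMP_DIR dumpfile=newschema.dmp logfile=newschema_exp.log schemas=newschema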
Now if you go to the directory listed in directory_path you should see your DMP file waiting for you. Dump files can be quite large, but they normally zip quite well, which makes the file transfer quicker.
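For example, on a Linux server you could compress the file and copy it across like this (the file name, user and host are placeholders):-

gzip newschema.dmp
scp newschema.dmp.gz oracle@<target server>:/tmp

Then unzip it again on the target server with gunzip before importing.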
Running an Import
The obvious next step was to copy the dump file over to the target server (in my case the new test system), but don't worry about where to put it just yet. What we need to do again is find an Oracle directory object to use for the import process.
I used the same query as before:-
SQL> select directory_name, directory_path from all_directories;
If a suitable directory object already exists then move your dump file into its path, or (as before) create your own directory object in Oracle.
Before running the import we need to connect as sysdba and create the empty schema in the test system:-
SQL> create user newschema identified by pwd4newschema;
(NB. The "Identified By" parameter is the password.)Finally at the command prompt run the import command:-
impdp <my user>/<my password> DIRECTORY=<my directory object> DUMPFILE=<my export file>.dmp
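As with the export, a fully worked version might look something like this (placeholders again):-

impdp system/<password> directory=DATA_PUMP_DIR dumpfile=newschema.dmp logfile=newschema_imp.log schemas=newschema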
Again the progress will be reported to the screen. Scan through it and check that you don't get any errors. I had some dependency issues because two other schemas referenced by the packaged functions were missing. If this happens to you, copy over the missing references and recompile your packages.
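If you're not sure which packages need attention, you can list the invalid objects and then recompile them one at a time (the schema and package names below are just placeholders):-

SQL> select object_name, object_type from all_objects where owner = 'NEWSCHEMA' and status = 'INVALID';
SQL> alter package newschema.my_pkg compile;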