Oracle DBA – A lifelong learning experience

Using Data Pump from SQL Developer

Posted by John Hallas on November 8, 2012

One problem that we all run into is exporting and importing between different versions of the Data Pump client. The following error is not uncommon:

UDI-00018: Data Pump client is incompatible with database version
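As a purely illustrative example (the connect string, schema and file names here are hypothetical, although the parameters themselves are standard expdp options), the error shows up as soon as a mismatched client is pointed at the database:

    $ expdp system@olddb schemas=HR directory=DATA_PUMP_DIR dumpfile=hr.dmp logfile=hr_exp.log
    UDI-00018: Data Pump client is incompatible with database version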

Coincidentally, while I was having some problems with this, a colleague, Vitaly Kaminsky, worked out a method of getting around it using SQL Developer, and below is his document describing how to do it. Any kudos to me please; any problems, contact him at

A “quick and dirty” way to save time when copying data, DDL, schemas and objects between Oracle databases, using SQL Developer instead of the traditional exp/imp routines.

As every DBA knows, small daily tasks and requests like “will you please copy…” or “please refresh…” can quickly consume a considerable amount of time and effort, leaving you wondering where the day has gone. One of the most convenient ways to satisfy those requests is to use a free tool like SQL Developer (and yes, there are many others, like Toad, but for those you have to pay licence fees).

Most of us have a considerable estate to look after, often consisting of a few large production clusters and dozens, hundreds or even thousands of test, development, integration, UAT and other databases running on VMs or physical boxes.

With traditional expdp/impdp routines, copying data and DDL between those small databases may need more time for the setup than for the actual process of moving the data. I would estimate the time required to check the filesystem and permissions, create directory objects and so on to be in the region of 20 to 30 minutes per request. This is the time you actually save by using the SQL Developer Database Copy feature, because the other bits, like creating the script and specifying schemas, take about the same time either way.
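For comparison, the sort of per-database boilerplate a traditional expdp/impdp round trip needs looks roughly like the sketch below; the directory path, schema and file names are illustrative only, and the same directory setup has to be repeated on the target host before the import:

    -- On the source database: create a directory object and grant access to the exporting user
    CREATE OR REPLACE DIRECTORY dp_copy AS '/u01/app/oracle/dp_copy';
    GRANT READ, WRITE ON DIRECTORY dp_copy TO system;

    -- Export on the source host
    $ expdp system@srcdb schemas=HR directory=DP_COPY dumpfile=hr.dmp logfile=hr_exp.log

    -- Copy the dump file across, then import on the target host
    $ impdp system@tgtdb schemas=HR directory=DP_COPY dumpfile=hr.dmp logfile=hr_imp.log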

Database Copy moves the data by having SQL Developer run DDL and DML on the target system and pump all the I/O through the initiating workstation. This is the limiting factor, and I would suggest using this method only if the volume of data does not exceed 500GB; otherwise the setup time you saved will be lost on I/O.

Using SQL Developer 3.1.06, the process itself is wonderfully simple; you just need to make sure your connections have been set up properly with sufficient privileges (a minimal sketch of a suitably privileged target account is given at the end of this post):

1. Navigate to Database Copy in the Tools menu.

2. Select the source and target databases, and whether you want to copy schemas or objects.

3. Select the required objects (ALL for full schemas).

4. Select the source schema(s).

5. Select the objects to copy (this example shows the selection of system objects, which is sufficient for the demo).

6. On the next step you can walk through the objects and apply any required filtering.

7. Proceed to the summary.

8. Once you press FINISH, the process starts and you can monitor its progress in the log window.

That’s it, all done.

The limitation of the above is that you can’t save the specs for subsequent reuse, but the whole purpose of this exercise is to save time for one-off requests.
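Finally, as promised above, here is a minimal sketch of the kind of target account that can receive a copied schema. The user name, tablespace and exact privilege list are assumptions and should be adapted to your own standards; in practice you may simply connect as the existing schema owner or as a DBA account instead:

    -- Illustrative only: a basic target user able to own the copied objects
    CREATE USER hr_copy IDENTIFIED BY "change_me"
      DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;

    GRANT CREATE SESSION, CREATE TABLE, CREATE VIEW, CREATE SEQUENCE,
          CREATE PROCEDURE, CREATE TRIGGER, CREATE SYNONYM TO hr_copy;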



